Similar Literature
20 similar documents found (search time: 718 ms)
1.
Digital forensic tools are being developed at a brisk pace in response to the ever-increasing variety of forensic targets. Most tools are created for specific tasks – filesystem analysis, memory analysis, network analysis, etc. – and make little effort to interoperate with one another. This makes it difficult and extremely time-consuming for an investigator to build a wider view of the state of the system under investigation. In this work, we present FACE, a framework for automatic evidence discovery and correlation from a variety of forensic targets. Our prototype implementation demonstrates the integrated analysis and correlation of a disk image, memory image, network capture, and configuration log files. The results of this analysis are presented as a coherent view of the state of a target system, allowing investigators to quickly understand it. We also present an advanced open-source memory analysis tool, ramparser, for the automated analysis of Linux systems.

3.
Digital Investigation, 2007, 4(3–4): 119–128
Carving is the term most often used to indicate the act of recovering a file from unstructured digital forensic images. The term unstructured indicates that the original digital image does not contain useful filesystem information which may be used to assist in this recovery. Typically, forensic analysts resort to carving techniques as an avenue of last resort due to the difficulty of current techniques. Most current techniques rely on manual inspection of the file to be recovered and manually reconstructing this file using trial and error. Manual processing is typically impractical for modern disk images, which might contain hundreds of thousands of files. At the same time, the traditional process of recovering deleted files using filesystem information is becoming less practical because most modern filesystems purge critical information for deleted files. As such, the need for automated carving techniques is quickly arising even when a filesystem does exist on the forensic image. This paper explores the theory of carving in a formal way. We then proceed to apply this formal analysis to the carving of PDF and ZIP files based on the internal structure inherent within the file formats themselves. Specifically, this paper deals with carving from the Digital Forensic Research Workshop's (DFRWS) 2007 carving challenge.
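As a minimal illustration of structure-based carving (a sketch, not the paper's implementation), a carver's first pass might scan a raw image for the ZIP local-file-header signature; the length fields and central directory it points to would then be validated in later passes:

```python
# Hedged sketch: locate candidate ZIP entries in an unstructured image by
# scanning for the local-file-header signature (PK\x03\x04). Real carvers
# validate far more than the magic bytes.
ZIP_LOCAL_HEADER = b"PK\x03\x04"

def find_zip_headers(image: bytes) -> list[int]:
    """Return byte offsets of every ZIP local-file-header signature."""
    offsets, pos = [], image.find(ZIP_LOCAL_HEADER)
    while pos != -1:
        offsets.append(pos)
        pos = image.find(ZIP_LOCAL_HEADER, pos + 1)
    return offsets
```

Each reported offset is only a candidate; internal-structure carving of the kind the paper formalizes decides which candidates are genuine files.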

4.
Digital Investigation, 2007, 4(3–4): 129–137
In this paper we discuss how operating system design and implementation influence the methodology for computer forensics investigations, with a focus on the forensic acquisition of memory. In theory, the operating system could support such investigations both in terms of tools for analysis of data and by making the system data readily accessible for analysis. Conventional operating systems such as Windows and UNIX derivatives offer some memory-related tools that are geared towards the analysis of system crashes rather than forensic investigations. In this paper we demonstrate how techniques developed for persistent operating systems, where the lifetime of data is independent of the method of its creation and storage, could support computer forensics investigations, delivering higher efficiency and accuracy. It is proposed that some of the features offered by persistent systems could be built into conventional operating systems to make illicit activities easier to identify and analyse. We further propose a new technique for forensically sound acquisition of memory based on the persistence paradigm.

5.
File carving is the process of reassembling files from disk fragments based on the file content in the absence of file system metadata. By leveraging both file header and footer pairs, traditional file carving mainly focuses on document and image files such as PDF and JPEG. With the vast amount of malware code appearing in the wild daily, recovery of binary executable files becomes an important problem, especially for the case in which malware deletes itself after compromising a computer. However, unlike image files that usually have both a header and footer pair, executable files only have header information, which makes the carving much harder. In this paper, we present Bin-Carver, a first-of-its-kind system to automatically recover executable files with deleted or corrupted metadata. The key idea is to explore the road map information defined in executable file headers and the explicit control flow paths present in the binary code. Our experiment with thousands of binary code files has shown Bin-Carver to be highly accurate, with an identification rate of 96.3% and a recovery rate of 93.1% on average when handling file systems ranging from pristine to chaotic and highly fragmented.
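The "road map" the abstract refers to starts at the executable's header itself. A hedged sketch (assuming 64-bit little-endian ELF binaries, with field offsets per the ELF64 layout) of the kind of first validation step a header-driven carver might perform:

```python
import struct

ELF_MAGIC = b"\x7fELF"

def parse_elf_header(data: bytes):
    """Return (entry, phoff, phnum) from a 64-bit little-endian ELF
    header, or None if the magic is absent or the buffer is too short.
    Offsets: e_entry at 24, e_phoff at 32, e_phnum at 56 (ELF64)."""
    if len(data) < 64 or data[:4] != ELF_MAGIC:
        return None
    entry, phoff = struct.unpack_from("<QQ", data, 24)
    phnum, = struct.unpack_from("<H", data, 56)
    return entry, phoff, phnum
```

The program-header offset and count recovered here are exactly the kind of internal "road map" that tells a carver which disk regions must belong to the file.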

6.
File system forensics is an important part of digital forensics. Investigators of storage media have traditionally focused on the most commonly used file systems such as NTFS, FAT, ExFAT, Ext2-4, HFS+, APFS, etc. NTFS is the current file system used by Windows for the system volume, but this may change in the future. In this paper we will show the structure of the Resilient File System (ReFS), which has been available since Windows Server 2012 and Windows 8. The main purpose of ReFS is to be used on storage spaces in server systems, but it can also be used in Windows 8 or newer. Although ReFS is not the current standard file system in Windows, users have the option to create ReFS file systems, so digital forensic investigators need to be able to investigate the file systems identified on a seized medium. Further, we will focus on remnants of non-allocated metadata structures or attributes. This may allow metadata carving, which means searching for specific attributes that are not allocated. Attributes found can then be used for file recovery. ReFS uses superblocks and checkpoints in addition to a VBR, which is different from other Windows file systems. If the partition is reformatted with another file system, the backup superblocks can be used for partition recovery. Further, it is possible to search for checkpoints in order to recover both metadata and content.

Another concept not seen in other Windows file systems is the sharing of blocks. When a file is copied, both the original and the new file will share the same content blocks. If the user changes the copy, new data runs will be created for the modified content, but unchanged blocks remain shared. This may impact file carving, because part of the blocks previously used by a deleted file might still be in use by another file. The large default cluster size, 64 KiB, in ReFS v1.2 is an advantage when carving for deleted files, since most deleted files are less than 64 KiB and therefore only use a single cluster. For ReFS v3.2 this advantage has decreased because the standard cluster size is 4 KiB.

Preliminary support for ReFS v1.2 has been available in EnCase 7 and 8, but the implementation has not been documented or peer-reviewed. The same is true for Paragon Software, which recently added ReFS support to their forensic product. Our work documents how ReFS v1.2 and ReFS v3.2 are structured at an abstraction level that allows digital forensic investigation of this new file system. At the time of writing, Paragon Software is the only digital forensic tool that supports ReFS v3.x. It is the most recent version of the ReFS file system that is most relevant for digital forensics, as Windows automatically updates the file system to the latest version on mount; this is why we have included information about ReFS v3.2. However, it is possible to change a registry value to avoid updating. The latest ReFS version observed is 3.4, but the information presented about 3.2 is still valid. In any criminal case, the investigator needs to investigate the file system version found.
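The cluster-size point can be made concrete with a trivial calculation (an illustration, not taken from the paper): a file occupies ceil(size / cluster_size) clusters, so a typical small file fits in one 64 KiB ReFS v1.2 cluster but spans many 4 KiB v3.2 clusters:

```python
def clusters_needed(file_size: int, cluster_size: int = 64 * 1024) -> int:
    """Number of clusters a file occupies (ceiling division).
    Default cluster size matches ReFS v1.2; pass 4096 for v3.2."""
    return -(-file_size // cluster_size)
```

A 50 KB deleted file is thus a single contiguous carving target under v1.2 defaults, but thirteen potentially scattered clusters under v3.2.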

7.
In this work, we describe our experiences in developing cloud forensics tools and use them to support three main points. First, we make the argument that cloud forensics is a qualitatively different problem. In the context of SaaS, it is incompatible with long-established acquisition and analysis techniques, and requires a new approach and forensic toolset. We show that client-side techniques, which are an extension of methods used over the last three decades, have inherent limitations that can only be overcome by working directly with the interfaces provided by cloud service providers. Second, we present our results in building forensic tools in the form of three case studies: kumodd, a tool for cloud drive acquisition; kumodocs, a tool for Google Docs acquisition and analysis; and kumofs, a tool for remote preview and screening of cloud drive data. We show that these tools, which work with the public and private APIs of the respective services, provide new capabilities that cannot be achieved by examining client-side artifacts. Finally, we use current IT trends, and our lessons learned, to outline the emerging new forensic landscape and the most likely course of tool development over the next five years.

8.
This paper presents the first deep investigation of the kmem_cache facility in Linux from a forensics perspective. The kmem_cache is used by the Linux kernel to quickly allocate and deallocate kernel structures associated with processes, files, and the network stack. Our focus is on deallocated information that remains in the cache, and the major contribution of this paper is to illustrate what forensically relevant information can be retrieved from the kmem_cache and what information is definitively not retrievable. We show that the kmem_cache contains a wealth of digital evidence, much of which was either previously unavailable or difficult to obtain, requiring ad hoc methods for extraction. Previously executed processes, memory mappings, sent and received network packets, NAT translations, accessed file system inodes, and more can all be recovered through examination of the kmem_cache contents. We also discuss portable methods for erasing this information, to ensure that private data is no longer recoverable.

9.
Current memory forensic tools concentrate mainly on system-related information like processes and sockets. There is a need for more memory forensic techniques to extract user-entered data retained in various Microsoft Windows applications such as the Windows command prompt. The command history is a prime source of evidence in many intrusions and other computer crimes, revealing important details about an offender’s activities on the subject system. This paper dissects the data structures of the command prompt history and gives forensic practitioners a tool for reconstructing the Windows command history from a Windows XP memory capture. At the same time, this paper demonstrates a methodology that can be generalized to extract user-entered data on other versions of Windows.
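For contrast with the structure-aware parsing the paper performs, here is a hedged sketch of the naive alternative: sweeping a memory capture for printable UTF-16LE runs (the encoding Windows console input uses). This recovers candidate command strings but none of the history-structure context the paper's approach provides:

```python
import re

def utf16le_strings(dump: bytes, min_len: int = 4) -> list[str]:
    """Naive fallback: pull printable UTF-16LE runs (ASCII char + NUL
    byte) of at least min_len characters from a raw memory capture."""
    pattern = re.compile(rb"(?:[\x20-\x7e]\x00){%d,}" % min_len)
    return [m.group().decode("utf-16-le") for m in pattern.finditer(dump)]
```

A string sweep like this cannot distinguish a typed command from any other resident text, which is precisely why dissecting the command-history data structures matters.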

10.
“File carving” reconstructs files based on their content, rather than using metadata that points to the content. Carving is widely used for forensics and data recovery, but no file carvers can automatically reassemble fragmented files. We survey files from more than 300 hard drives acquired on the secondary market and show that the ability to reassemble fragmented files is an important requirement for forensic work. Next we analyze the file carving problem, arguing that rapid, accurate carving is best performed by a multi-tier decision problem that seeks to quickly validate or discard candidate byte strings – “objects” – from the media to be carved. Validators for the JPEG, Microsoft OLE (MSOLE) and ZIP file formats are discussed. Finally, we show how high speed validators can be used to reassemble fragmented files.
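A first-tier validator of the kind described can be extremely cheap. This sketch (an illustration only, not the paper's MSOLE or ZIP validators) merely checks the JPEG SOI and EOI markers; a higher tier would decode the entropy-coded stream before accepting the object:

```python
def looks_like_jpeg(candidate: bytes) -> bool:
    """Fast first-tier check: SOI marker (FF D8) at the start and EOI
    marker (FF D9) at the end. Cheap enough to run on every candidate
    byte string; later tiers do the expensive validation."""
    return (len(candidate) >= 4
            and candidate[:2] == b"\xff\xd8"
            and candidate[-2:] == b"\xff\xd9")
```

The multi-tier idea is that millions of candidate objects are discarded by checks this cheap, so only a tiny fraction ever reaches a full decoder.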

11.
The dramatic growth of storage capacity and network bandwidth is making it increasingly difficult for forensic examiners to report what is present on a piece of subject media. Instead, analysts are focusing on what characteristics of the media have changed between two snapshots in time. To date, different algorithms have been implemented for performing differential analysis of computer media, memory, digital documents, network traces, and other kinds of digital evidence. This paper presents an abstract differencing strategy and applies it to all of these problem domains. Use of an abstract strategy allows the lessons gleaned in one problem domain to be directly applied to others.
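At its simplest, a differencing strategy reduces to comparing two feature sets. A minimal sketch, assuming the snapshots have already been reduced to path-to-content-hash maps (one possible instantiation of the abstract strategy, not the paper's definition):

```python
def diff_snapshots(before: dict[str, str], after: dict[str, str]):
    """Report what was added, removed, or changed between two
    path -> content-hash snapshots of the same media."""
    added   = sorted(set(after) - set(before))
    removed = sorted(set(before) - set(after))
    changed = sorted(p for p in set(before) & set(after)
                     if before[p] != after[p])
    return added, removed, changed
```

Swapping the feature extractor (disk paths, memory pages, packets) while keeping this comparison step is what lets one differencing strategy serve several evidence domains.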

12.
This practitioner paper provides an introduction to investigating IPv6 networks and systems. IPv6 addressing, packet structure, and supporting protocols are explained. Collecting information from IPv6 registries and databases such as WHOIS and DNS is demonstrated. Basic concepts and methods relevant for digital forensic investigators are highlighted, including the forensic analysis of IPv6 enabled systems. The enabling of IPv6 capability in a forensics lab is shown, including IPv6 connectivity and the use of IPv6 compatible tools. Collection and analysis of live network evidence from IPv6 networks is discussed, including investigation of remote IPv6 nodes, and promiscuous capture of IPv6 traffic.
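One small but recurring IPv6 task when correlating logs and captures is normalising compressed addresses, since the same node can appear as `2001:db8::1` in one source and fully expanded in another. A sketch using Python's standard `ipaddress` module (an illustration, not a tool from the paper):

```python
import ipaddress

def expand_ipv6(addr: str) -> str:
    """Expand a compressed IPv6 address to its full eight-group form,
    so addresses from different evidence sources compare equal."""
    return ipaddress.IPv6Address(addr).exploded
```

Normalising every address before comparison avoids missed matches caused purely by the `::` zero-compression notation.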

13.
The exhibits obtained in wildlife offence cases quite often present a challenging situation for the forensic expert. The selection of a proper analytical approach is vital for a successful analysis. A generalised forensic analysis approach should proceed from the use of non-destructive techniques (morphological and microscopic examination) to partially destructive and finally destructive techniques (DNA analysis). The findings of non-destructive techniques may sometimes be inconclusive, but they definitely help in steering further forensic analysis in a proper direction. We describe a recent case where a very small dried skin piece (< 0.05 mg) with just one small trimmed guard hair (0.4 cm) on it was received for species identification. The single guard hair was examined microscopically to get an indication of the type of species. We also describe the extraction procedure with a low amount of sample, using an automated extraction method (Qiagen Biorobot EZ1®) and PCR amplification of three mitochondrial genes (16s rRNA, 12s rRNA and cytochrome b) for species identification. Microscopic examination of the single hair indicated a viverrid species, but the initial DNA analysis with 16s rRNA (through NCBI BLAST) showed the highest homology (93%) with a hyaenid species (Hyaena hyaena). However, further DNA analysis based on the 12s rRNA and cytochrome b genes proved that the species was indeed a viverrid, i.e. Viverricula indica (small Indian civet). The highest homology shown with a hyaenid species by the 16s rRNA sequence from the case sample was due to the lack of a 16s rRNA sequence for Viverricula indica in the NCBI database. The case highlights the importance of morphological and microscopic examinations in wildlife offence cases. With respect to DNA extraction technology, we found that the automated extraction method of the Biorobot EZ1® (Qiagen) is quite useful with small amounts of sample (well below the recommended amount).

14.
The sharp rise in consumer computing, electronic and mobile devices and data volumes has resulted in increased workloads for digital forensic investigators and analysts. The number of crimes involving electronic devices is increasing, as is the amount of data for each job. This is becoming unscaleable, and alternate methods to reduce the time trained analysts spend on each job are necessary. This work leverages standardised knowledge representation techniques and automated rule-based systems to encapsulate expert knowledge for forensic data. The implementation of this research can provide high-level analysis based on low-level digital artefacts in a way that allows an understanding of what decisions support the facts. Analysts can quickly make determinations as to which artefacts warrant further investigation and create high-level case data without manually deriving it from the low-level artefacts. Extraction and understanding of users and social networks, and translating the state of file systems to sequences of events, are the first uses for this work. A major goal of this work is to automatically derive ‘events’ from the base forensic artefacts. Events may be system events, representing logins, start-ups and shutdowns, or user events, such as web browsing or sending email. The same information fusion and homogenisation techniques are used to reconstruct social networks. There can be numerous social network data sources on a single computer: the internet cache can contain Facebook, LinkedIn and Google Plus caches; email has address books and copies of emails sent and received; instant messengers have friend lists and call histories. Fusing these into a single graph allows a more complete, less fractured view for an investigator. Both event creation and social network creation are expected to assist investigator-led triage and other fast forensic analysis situations.
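The fusion of per-source contact evidence into one graph can be sketched as follows (a hypothetical data shape for illustration, not the paper's model): each evidence source contributes edges, and each fused edge keeps its provenance so an investigator can see which sources support a relationship:

```python
def fuse_contacts(sources: dict[str, list[tuple[str, str]]]):
    """Fuse per-source contact edges (e.g. browser cache, email, IM)
    into one undirected graph, recording which sources attest each edge.
    Keys are frozensets of the two endpoints; values are source names."""
    graph: dict[frozenset, set] = {}
    for source, edges in sources.items():
        for a, b in edges:
            graph.setdefault(frozenset((a, b)), set()).add(source)
    return graph
```

An edge attested by several independent sources (say, both email and instant messenger) is stronger triage evidence than one seen in a single cache.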

15.
Digital Investigation, 2014, 11(3): 160–174
Immature IT security, increasing network connectivity and unwavering media attention are causing an increase in the number of control system cyber security incidents. For forensic examinations in these environments, knowledge and skills are needed in the fields of hardware, networks and data analysis. For forensic examiners, this paper is meant to be a crash course on control systems and their forensic opportunities, focussing on the differences compared to regular IT systems. Assistance from experienced field engineers during forensic acquisition of control systems seems inevitable in order to guarantee process safety, business continuity and examination efficiency. For people working in the control system community, this paper may be helpful to get an idea about specific forensic issues about which they would normally not bother, but which may be crucial as soon as their systems are under attack or become part of a law enforcement investigation. For analysis of acquired data, existing tools for network security monitoring have useful functionality for forensic applications, but they are designed for real-time acquisition and are often not directly usable for post-mortem analysis of acquired data in a forensically sound way. The constant and predictable way in which control systems normally behave makes forensic application of anomaly-based threat detection an interesting topic for further research.

16.
17.
There is a lack of clear guidelines for project managers, laboratory managers and forensic scientists on strategies for the automation of forensic DNA laboratory processes and the operational implementation of new technologies. This is reflected in the failure rate of projects in the forensic DNA testing environment. We present a set of guidelines and concepts important for forensic laboratory automation. Some case studies from past projects are presented. These consist of partial (or modular) automation (n = 2) and fully automated, robotically integrated systems (n = 2). Technology management principles and concepts are crucial to prevent the failure of projects due to, e.g., early adoption of untried technologies and organizational factors. The future of laboratory automation is modular until such time as new discontinuous technologies replace the need for the traditional manual laboratory configuration entirely.

18.
Memory forensics has gradually moved into the focus of researchers and practitioners alike in recent years. With an increasing effort to extract valuable information from a snapshot of a computer's RAM, the necessity to properly assess the respective solutions rises as well. In this paper, we present an evaluation platform for forensic memory acquisition software. The platform is capable of measuring distinct factors that determine the quality of a generated memory image, specifically its correctness, atomicity, and integrity. Tests are performed for three popular open source applications, win32dd, WinPMEM, and mdd, as well as for different memory sizes.
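A correctness-style metric of the kind the platform measures can be sketched as a page-wise comparison between the acquired image and a concurrently captured reference snapshot (a simplification for illustration; the paper's metrics and methodology are more involved):

```python
def page_correctness(acquired: bytes, truth: bytes, page: int = 4096) -> float:
    """Sketch: fraction of pages in the acquired image that byte-match
    a reference snapshot of the same memory. 1.0 means every compared
    page is identical."""
    total = min(len(acquired), len(truth)) // page
    if total == 0:
        return 0.0
    same = sum(acquired[i * page:(i + 1) * page] == truth[i * page:(i + 1) * page]
               for i in range(total))
    return same / total
```

Atomicity would additionally ask whether mismatching pages stem from memory changing mid-acquisition, which a single pairwise comparison cannot distinguish on its own.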

19.
The role of live forensics in digital forensic investigations has become vital due to the importance of volatile data such as encryption keys, network activity, currently running processes, in-memory-only malware, and other key pieces of data that are lost when a device is powered down. While the technology to perform the first steps of a live investigation, physical memory collection and preservation, is available, the tools for completing the remaining steps remain incomplete. First-generation memory analyzers performed simple string and regular expression operations on the memory dump to locate data such as passwords, credit card numbers, fragments of chat conversations, and social security numbers. A more in-depth analysis can reveal information such as running processes, networking information, open file data, loaded kernel modules, and other critical information that can be used to gain insight into activity occurring on the machine when a memory acquisition occurred. To be useful, tools for performing this in-depth analysis must support a wide range of operating system versions with minimum configuration. Current live forensics tools are generally limited to a single kernel version, a very restricted set of closely related versions, or require substantial manual intervention. This paper describes techniques developed to allow automatic adaptation of memory analysis tools to a wide range of kernel versions. Dynamic reconstruction of kernel data structures is obtained by analyzing the memory dump for the instructions that reference needed kernel structure members. The ability to dynamically recreate C structures used within the kernel allows a large amount of information to be obtained and processed. Currently, this capability is used within a tool called RAMPARSER that is able to simulate commands such as ps and netstat as if an investigator were sitting at the machine at the time of the memory acquisition. Other applications of the developed capabilities include kernel-level malware detection, recovery of process memory and file mappings, and other areas of forensic interest.
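Once member offsets have been recovered, simulating `ps` reduces to walking the kernel's circular task list in the dump. A hedged sketch: all offsets, the flat virtual-to-physical mapping, and the struct layout below are illustrative assumptions, standing in for values a tool like RAMPARSER would discover dynamically:

```python
import struct

def walk_task_list(dump: bytes, init_task: int, tasks_off: int,
                   comm_off: int, virt_base: int, limit: int = 64) -> list[str]:
    """Hypothetical sketch: follow task_struct->tasks.next pointers
    through a raw dump, collecting each task's comm name, until the
    list wraps back to init_task. Assumes 64-bit little-endian pointers
    and a flat virtual-to-physical mapping (phys = virt - virt_base)."""
    names, addr = [], init_task
    for _ in range(limit):
        phys = addr - virt_base
        comm = dump[phys + comm_off: phys + comm_off + 16].split(b"\x00")[0]
        names.append(comm.decode("ascii", "replace"))
        next_ptr, = struct.unpack_from("<Q", dump, phys + tasks_off)
        addr = next_ptr - tasks_off  # list_head is embedded in the struct
        if addr == init_task:
            break
    return names
```

The subtraction of `tasks_off` reflects that the kernel's list pointers address the embedded list_head, not the start of the containing task_struct; getting that offset right is exactly what dynamic structure reconstruction provides.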

20.
In their day-to-day work, carrying out complex tasks, forensic scientists use a combination of explicit, codified standard operating procedures and tacit knowledge developed through their ongoing practice. We show that tacit knowledge is an integral part of the activities of expert forensic science practitioners, who continually add to their knowledge repertoire by engaging other scientists through communities of practice. We wish to shed fresh light on the gaining of tacit knowledge by forensic scientists during their formative apprentice years, termed legitimate peripheral participation. In quantifying tacit knowledge exchanges, we use social network analysis, a methodology for the analysis of social structures, to map relational knowledge flows between forensic scientists within communities of practice at the Forensic Science Laboratory, Ireland. This paper sheds light on the importance of tacit knowledge within the training regime of forensic scientists and its recognition as equal to the part played by explicit knowledge.
