Similar Documents
1.
2.
Digital Investigation, 2007, 4(3-4): 119-128
Carving is the term most often used to indicate the act of recovering a file from unstructured digital forensic images. The term unstructured indicates that the original digital image does not contain useful filesystem information which may be used to assist in this recovery. Typically, forensic analysts resort to carving techniques as an avenue of last resort due to the difficulty of current techniques. Most current techniques rely on manual inspection of the file to be recovered and manually reconstructing this file using trial and error. Manual processing is typically impractical for modern disk images which might contain hundreds of thousands of files. At the same time, the traditional process of recovering deleted files using filesystem information is becoming less practical because most modern filesystems purge critical information for deleted files. As such, the need for automated carving techniques is quickly arising even when a filesystem does exist on the forensic image. This paper explores the theory of carving in a formal way. We then proceed to apply this formal analysis to the carving of PDF and ZIP files based on the internal structure inherent within the file formats themselves. Specifically, this paper deals with carving from the Digital Forensic Research Workshop's (DFRWS) 2007 carving challenge.
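A minimal sketch of the simplest form of this idea, assuming header/footer signature carving for PDF (the %PDF ... %%EOF marker pair). It does not model the internal object and xref structure the paper exploits, and the image path is a placeholder.

```python
# Minimal header/footer carving sketch: scan a raw image for PDF start
# ("%PDF") and end ("%%EOF") markers and cut out candidate files.
# Structure-aware carvers additionally validate the xref table and objects.

def carve_pdfs(image_path, max_size=50 * 1024 * 1024):
    with open(image_path, "rb") as f:
        data = f.read()
    candidates = []
    start = data.find(b"%PDF")
    while start != -1:
        end = data.find(b"%%EOF", start, start + max_size)
        if end != -1:
            candidates.append(data[start:end + len(b"%%EOF")])
        start = data.find(b"%PDF", start + 1)
    return candidates

if __name__ == "__main__":
    for i, blob in enumerate(carve_pdfs("image.dd")):   # hypothetical image path
        with open(f"carved_{i}.pdf", "wb") as out:
            out.write(blob)
```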

3.
The sharp rise in consumer computing, electronic and mobile devices and data volumes has resulted in increased workloads for digital forensic investigators and analysts. The number of crimes involving electronic devices is increasing, as is the amount of data for each job. This is becoming unscalable, and alternative methods to reduce the time trained analysts spend on each job are necessary. This work leverages standardised knowledge representation techniques and automated rule-based systems to encapsulate expert knowledge for forensic data. The implementation of this research can provide high-level analysis based on low-level digital artefacts in a way that allows an understanding of what decisions support the facts. Analysts can quickly make determinations as to which artefacts warrant further investigation and create high-level case data without manually building it from the low-level artefacts. Extraction and understanding of users and social networks, and translating the state of file systems into sequences of events, are the first uses for this work. A major goal of this work is to automatically derive ‘events’ from the base forensic artefacts. Events may be system events, representing logins, start-ups and shutdowns, or user events, such as web browsing and sending email. The same information fusion and homogenisation techniques are used to reconstruct social networks. There can be numerous social network data sources on a single computer: the internet cache can contain Facebook, LinkedIn and Google Plus caches; email has address books and copies of emails sent and received; instant messengers have friend lists and call histories. Fusing these into a single graph allows a more complete, less fractured view for an investigator. Both event creation and social network creation are expected to assist investigator-led triage and other fast forensic analysis situations.
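A minimal sketch of rule-based event derivation of the kind described, not the paper's implementation. The artefact fields and the example Windows event IDs (4624 for a logon, 6006 for the event log service stopping at shutdown) are illustrative assumptions.

```python
# Illustrative rule-based mapping from low-level artefacts to high-level events.
# Artefact fields ("source", "key", "timestamp") are hypothetical.

RULES = [
    # (predicate over an artefact dict, event type it supports)
    (lambda a: a["source"] == "EventLog" and a["key"] == "4624", "user_login"),
    (lambda a: a["source"] == "EventLog" and a["key"] == "6006", "system_shutdown"),
    (lambda a: a["source"] == "BrowserHistory", "web_browsing"),
    (lambda a: a["source"] == "MailClient" and a["key"] == "sent", "email_sent"),
]

def derive_events(artefacts):
    events = []
    for artefact in artefacts:
        for predicate, event_type in RULES:
            if predicate(artefact):
                events.append({"type": event_type,
                               "time": artefact["timestamp"],
                               "evidence": artefact})
    return events
```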

4.
Classification of particles as gunshot residues (GSRs) is conducted using a semiautomatic approach in which the system first classifies particles based on an automatic elemental analysis, and examiners then manually analyze particles having compositions which are characteristic of or consistent with GSRs. Analyzing all the particles in the second stage is time-consuming, with many particles classified by the initial automated system as potential GSRs subsequently excluded by the forensic examiner. In this paper, a new algorithm is developed to improve the initial classification step. The algorithm is based on a binary tree that was trained on almost 16,000 particles from 43 stubs used to sample the hands of suspects. The classification algorithm was tested on 5,900 particles from 23 independent stubs and performed very well in terms of false positive and false negative rates. Routine use of the new algorithm can significantly reduce the analysis time for GSRs.
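A sketch of training a binary classification tree on elemental composition vectors, assuming scikit-learn. The feature files and element columns are placeholders; the paper's actual features and tree are not reproduced.

```python
# Sketch: train a decision tree on per-particle elemental composition vectors.
# Input arrays (element fractions per particle, GSR/non-GSR labels) are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X = np.load("particle_elements.npy")   # shape (n_particles, n_elements)
y = np.load("particle_labels.npy")     # 1 = characteristic of GSR, 0 = not

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=8, class_weight="balanced")
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```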

5.
Computer forensic tools for Apple Mac hardware have traditionally focused on low-level file system details. Mac OS X and common applications on the Mac platform provide an abundance of information about the user's activities in configuration files, caches, and logs. We are developing MEGA, an extensible tool suite for the analysis of files on Mac OS X disk images. MEGA provides simple access to Spotlight metadata maintained by the operating system, yielding efficient file content search and exposing metadata such as digital camera make and model. It can also help investigators to assess FileVault encrypted home directories. MEGA support tools are under development to interpret files written by common Mac OS applications such as Safari, Mail, and iTunes.
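To make the Spotlight-metadata idea concrete, here is a small sketch that reads per-file metadata with the stock macOS `mdls` tool. This is not MEGA itself; it works on a mounted, Spotlight-indexed volume rather than a raw disk image, and the sample path is hypothetical.

```python
# Query Spotlight metadata for a file via the macOS `mdls` command and expose
# attributes such as kMDItemAcquisitionModel (digital camera model).
import subprocess

def spotlight_metadata(path):
    out = subprocess.run(["mdls", path], capture_output=True, text=True, check=True)
    meta = {}
    for line in out.stdout.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            meta[key.strip()] = value.strip()
    return meta

print(spotlight_metadata("/Users/someone/Pictures/IMG_0001.JPG").get("kMDItemAcquisitionModel"))
```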

6.
Digital Investigation, 2014, 11(4): 323-335
The battle between malware developers and security analysts continues, and the number of malware samples and variants keeps increasing every year. Automated malware generation tools and various detection evasion techniques are also developed every year. To keep up with the advance of malware development technologies, malware analysis techniques need to advance to help security analysts. In this paper, we propose a malware analysis method that categorizes malware using dynamic mnemonic frequencies. We also propose a redundancy filtering technique to alleviate the drawbacks of dynamic analysis. Experimental results show that our proposed method can categorize malware and can reduce the storage overheads of dynamic analysis.
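A sketch of the mnemonic-frequency idea: build a frequency vector from a dynamic instruction trace and compare two samples with cosine similarity. The trace file format (one mnemonic leading each line) and file names are assumptions, not the paper's tooling.

```python
# Build a mnemonic-frequency vector from a dynamic instruction trace
# (e.g. lines starting with "mov", "push", "call") and compare samples.
from collections import Counter
import math

def mnemonic_vector(trace_path):
    with open(trace_path) as f:
        return Counter(line.split()[0].lower() for line in f if line.strip())

def cosine(a, b):
    keys = set(a) | set(b)
    dot = sum(a[k] * b[k] for k in keys)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

similarity = cosine(mnemonic_vector("sample1.trace"), mnemonic_vector("sample2.trace"))
```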

7.
Digital Investigation, 2014, 11(3): 187-200
A recent increase in the prevalence of embedded systems has led them to become a primary target of digital forensic investigations. Embedded systems with DVR (Digital Video Recorder) capabilities are able to generate multimedia (video/audio) data, and can act as vital pieces of evidence in the field of digital forensics. To counter anti-forensics, it is necessary to derive systematic forensic techniques that can be used on data fragments in unused (unallocated) areas of files or images. Specifically, the techniques should extract meaningful information from various types of data fragments, such as non-sequential fragmentation and missing fragments overwritten by other data. This paper proposes a new digital forensic system for use on video data fragments related to DVRs. We demonstrate in detail special techniques for the classification, reassembly, and extraction of video data fragments, and introduce an integrated framework for data fragment forensics based on the techniques described in this paper.
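As a rough illustration of fragment classification, the sketch below flags disk sectors that contain H.264 NAL-unit start codes, a signature commonly found in DVR video streams. Real DVR formats are often proprietary, and the paper's classification and reassembly pipeline is considerably more involved.

```python
# Rough fragment classifier: flag 512-byte sectors containing H.264 NAL
# start codes (00 00 01 / 00 00 00 01). Illustrates signature scanning only.

SECTOR = 512
START_CODES = (b"\x00\x00\x01", b"\x00\x00\x00\x01")

def likely_h264_sectors(image_path):
    hits = []
    with open(image_path, "rb") as f:
        offset = 0
        while True:
            sector = f.read(SECTOR)
            if not sector:
                break
            if any(code in sector for code in START_CODES):
                hits.append(offset)
            offset += SECTOR
    return hits
```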

8.
This paper describes research and analysis that were performed to identify a robust and accurate method for identifying and extracting the residual contents of deleted files stored within an HFS+ file system. A survey performed during 2005 of existing tools and techniques for HFS+ deleted file recovery reinforced the need for newer, more accurate techniques. Our research and analysis were based on the premise that a transactional history of file I/O operations is maintained in a Journal on HFS+ file systems, and that this history could be used to reconstruct recent deletions of active files from the file system. Such an approach offered a distinct advantage over other current techniques, including recovery of free/unallocated blocks and “file carving” techniques. If the journal entries contained or referenced file attributes such as the extents that specify which file system blocks were occupied by each file, then a much more accurate identification and recovery of deleted file data would be possible.
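A small sketch of the first step in examining the HFS+ journal: reading the journal header, with the field layout as documented in Apple's Technote TN1150. The byte offset of the journal within the image is assumed to be known already (normally taken from the volume header's journalInfoBlock, which this sketch does not parse).

```python
# Parse the HFS+ journal header (layout per Apple TN1150). Illustrative only;
# walking the block lists and transactions is the harder part.
import struct

FIELDS = "magic endian start end size blhdr_size checksum jhdr_size".split()

def read_journal_header(image_path, journal_offset):
    with open(image_path, "rb") as f:
        f.seek(journal_offset)
        raw = f.read(44)
    for fmt in ("<IIQQQIII", ">IIQQQIII"):      # try little- then big-endian
        values = dict(zip(FIELDS, struct.unpack(fmt, raw)))
        if values["magic"] == 0x4A4E4C78:       # 'JNLx'
            return values
    raise ValueError("no journal header magic found at given offset")
```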

9.
“File carving” reconstructs files based on their content, rather than using metadata that points to the content. Carving is widely used for forensics and data recovery, but no file carvers can automatically reassemble fragmented files. We survey files from more than 300 hard drives acquired on the secondary market and show that the ability to reassemble fragmented files is an important requirement for forensic work. Next we analyze the file carving problem, arguing that rapid, accurate carving is best performed by a multi-tier decision problem that seeks to quickly validate or discard candidate byte strings – “objects” – from the media to be carved. Validators for the JPEG, Microsoft OLE (MSOLE) and ZIP file formats are discussed. Finally, we show how high-speed validators can be used to reassemble fragmented files.
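A sketch of the "validate or discard quickly" idea for JPEG candidates: cheap marker checks first, then an optional full structural decode. It assumes the Pillow library and is not the validators described in the paper.

```python
# Fast JPEG candidate validator: SOI/EOI marker checks, then a Pillow decode.
import io
from PIL import Image

def validate_jpeg(candidate: bytes) -> bool:
    if not (candidate.startswith(b"\xff\xd8\xff")
            and candidate.rstrip(b"\x00").endswith(b"\xff\xd9")):
        return False
    try:
        Image.open(io.BytesIO(candidate)).verify()   # structural decode check
        return True
    except Exception:
        return False
```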

10.
In this paper, we propose two methods to recover damaged audio files using deep neural networks. The presented audio file recovery methods differ from the conventional file-carving-based recovery method because the former restore lost data, which are difficult to recover with the latter method. This research suggests that recovery tasks which are essential yet very difficult or very time-consuming can be automated with the proposed recovery methods using deep neural networks. We apply feed-forward and Long Short-Term Memory neural networks for the tasks. The experimental results show that deep neural networks can distinguish speech signals from non-speech signals, and can also identify the encoding methods of the audio files at the level of bits. This leads to successful recovery of the damaged audio files, which are otherwise difficult to recover using conventional file-carving-based methods.
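A minimal sketch of an LSTM classifier over frame-level audio features, assuming PyTorch. The feature dimensionality, sequence length, and class count are placeholders; the paper's exact architectures and training data are not reproduced.

```python
# Sketch of an LSTM speech / non-speech classifier over frame-level features.
import torch
import torch.nn as nn

class SpeechClassifier(nn.Module):
    def __init__(self, n_features=40, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, frames, n_features)
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])         # logits: (batch, n_classes)

model = SpeechClassifier()
logits = model(torch.randn(8, 100, 40))   # 8 clips, 100 frames, 40 features each
```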

11.
In order to facilitate forensic intelligence efforts in managing large collections of physical feature data pertaining to illicit tablets, we have developed an automated shape classification method. This approach performs categorical shape annotation for the domain of illicit tablets. It is invariant to scale, rotation and translation and operates on digital images of seized tablets. The approach employs two processing levels. The first (coarse) level is based on comparing the contour curvature space of tablet pairs. The second (fine) level is a rule-based approach, implemented as a classification tree, that exploits characteristic similarities of shape categories. Annotation is demonstrated over a collection of 169 tablets selected for their diverse shapes, with an accuracy of 97.6% when 19 shape categories are defined.
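To make the invariance requirement concrete, here is a sketch of a scale-, rotation- and translation-invariant shape descriptor using Hu moments with OpenCV. This is a stand-in descriptor, not the paper's contour-curvature comparison, and the image file names are placeholders.

```python
# Illustrative invariant shape descriptor for tablet images via Hu moments.
import cv2
import numpy as np

def hu_descriptor(image_path):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    hu = cv2.HuMoments(cv2.moments(mask)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)   # log scale for comparability

distance = np.linalg.norm(hu_descriptor("tablet_a.png") - hu_descriptor("tablet_b.png"))
```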

12.
Deployed airbags can be a valuable source of probative forensic materials. During an accident, trace evidence can be deposited on the airbag cover and, in addition, the residue produced by the gas generation system is released into the passenger compartment of the vehicle as the airbag deflates. This residue can be used to associate a suspect with the vehicle at the time of the accident. This study identifies particles containing zirconium, strontium, and/or copper–cobalt along with other elements from the gas generation systems, and aluminum–silicon microfibers from airbag filters, as the probative material which may be produced and deposited on a suspect's hands and/or clothing. Scanning electron microscopy can be used to identify this metallic residue. Modification of the search criterion used for gunshot residue analysis allows for automated analysis of the samples. Proper collection of the airbag standard is essential to identify which materials were produced. Prompt collection of suspect samples allows analysts to make the proper identifications and associations. This analytical technique can be a probative tool in criminal investigations.
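A toy version of the kind of elemental screen implied by the abstract, flagging particles containing zirconium, strontium, or copper together with cobalt. The actual modified SEM/EDX search criteria are more detailed than this.

```python
# Toy particle screen based only on the elements named in the abstract.
def airbag_residue_candidate(elements: set) -> bool:
    return ("Zr" in elements
            or "Sr" in elements
            or {"Cu", "Co"} <= elements)

print(airbag_residue_candidate({"Cu", "Co", "Al"}))   # True
```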

13.
File carving is the process of reassembling files from disk fragments based on the file content in the absence of file system metadata. By leveraging both file header and footer pairs, traditional file carving mainly focuses on document and image files such as PDF and JPEG. With the vast amount of malware code appearing in the wild daily, recovery of binary executable files becomes an important problem, especially for the case in which malware deletes itself after compromising a computer. However, unlike image files that usually have both a header and footer pair, executable files only have header information, which makes the carving much harder. In this paper, we present Bin-Carver, a first-of-its-kind system to automatically recover executable files with deleted or corrupted metadata. The key idea is to explore the road map information defined in executable file headers and the explicit control flow paths present in the binary code. Our experiment with thousands of binary code files has shown our Bin-Carver to be incredibly accurate, with an identification rate of 96.3% and recovery rate of 93.1% on average when handling file systems ranging from pristine to chaotic and highly fragmented.
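A sketch of the header "road map" idea for ELF binaries: the section header table normally sits at the end of the file, so the header fields give the number of bytes a carver should expect to recover. This illustrates only the header part of the approach (not Bin-Carver's control-flow analysis), assumes a little-endian 64-bit ELF, and uses a placeholder file name.

```python
# Estimate expected file size from an ELF64 header:
# expected size = e_shoff + e_shnum * e_shentsize.
import struct

def elf64_expected_size(header: bytes) -> int:
    # 64-bit, little-endian ELF only
    assert header[:4] == b"\x7fELF" and header[4] == 2 and header[5] == 1
    e_shoff, = struct.unpack_from("<Q", header, 0x28)
    e_shentsize, e_shnum = struct.unpack_from("<HH", header, 0x3A)
    return e_shoff + e_shnum * e_shentsize

with open("carved_candidate.bin", "rb") as f:
    print(elf64_expected_size(f.read(64)))
```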

14.
Chan KW, Tan GH, Wong RC. Science & Justice, 2012, 52(3): 136-141
Statistical classification remains the most useful statistical tool for forensic chemists to assess the relationships between samples. Many clustering techniques, such as principal component analysis and hierarchical cluster analysis, have been employed to analyze chemical data for pattern recognition. Because novice drug chemists often have only a weak foundation in this area of statistics, a tetrahedron method was designed to simulate how advanced chemometrics operates. In this paper, the development of the graphical tetrahedron and the computational matrices derived from the possible tetrahedrons are discussed. The tetrahedron method was applied to four selected parameters obtained from nine illicit heroin samples. Pattern analysis and mathematical computation of the differences in areas for assessing the dissimilarity between the nine tetrahedrons were found to be user-convenient and straightforward for novice cluster analysts.
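For reference, here is a sketch of the conventional chemometric baseline the abstract mentions (hierarchical cluster analysis over the per-sample parameter vectors), assuming SciPy; it is not the tetrahedron computation itself, and the data file is a placeholder.

```python
# Hierarchical cluster analysis over nine samples x four parameters.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

X = np.loadtxt("heroin_parameters.csv", delimiter=",")   # hypothetical 9 x 4 matrix
Z = linkage(pdist(X, metric="euclidean"), method="average")
print(fcluster(Z, t=3, criterion="maxclust"))             # assign samples to 3 clusters
```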

15.
This paper demonstrates the feasibility of automating the forensic hair analysis and comparison task using neural network explanation systems (NNESs). Our system takes as input microscopic images of two hairs and produces a classification decision as to whether or not the hairs came from the same person. Hair images were captured using a NEXTDimension video board in a NEXTDimension color turbo computer connected to a video camera. Image processing was done on an SGI Indigo workstation. Each image is segmented into a number of pieces appropriate for classification of different features. A variety of image processing techniques are used to enhance this information. Use of wavelet analysis and the Haralick texture algorithm to pre-process data has allowed us to compress large amounts of data into smaller, yet representative, data. Neural networks are then used for feature classification. Finally, statistical tests determine the degree of match between the resulting collection of hair feature vectors. An important issue in the automation of any task in criminal investigations is the reliability and understandability of the resulting system. To address this concern, we have developed methods to facilitate explanation of the neural network's behavior using a decision tree. The system was able to achieve 83% hair match accuracy using 5 of the 21 morphological characteristics used by experts. This shows promise for the usefulness of a fuller-scale system. While an automated system would not replace the expert, it would make the task easier by providing a means for pre-processing the large amount of data with which the expert must contend.
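A sketch of the Haralick-style texture pre-processing step, assuming a recent scikit-image (older releases spell the functions greycomatrix/greycoprops). The image file is a placeholder, and the paper's wavelet stage and hair segmentation are not reproduced.

```python
# Haralick-style texture features from a grey-level co-occurrence matrix.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from skimage.io import imread

img = imread("hair_segment.png", as_gray=True)        # hypothetical segmented hair image
img = (img * 255).astype(np.uint8)
glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
features = [graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")]
```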

16.
Conventional confirmatory biochemical tests used in the forensic analysis of body fluid traces found at a crime scene are destructive and not universal. Recently, we reported on the application of near-infrared (NIR) Raman microspectroscopy for non-destructive confirmatory identification of pure blood, saliva, semen, vaginal fluid and sweat. Here we expand the method to include dry mixtures of semen and blood. A classification algorithm was developed for differentiating pure body fluids and their mixtures. The classification methodology is based on an effective combination of Support Vector Machine (SVM) regression (data selection) and SVM Discriminant Analysis of preprocessed experimental Raman spectra collected using automatic mapping of the sample. Extensive cross-validation of the obtained results demonstrated that the detection limit for the minor contributor is as low as a few percent. The developed methodology can be further expanded to any binary mixture of complex solutions, including but not limited to mixtures of other body fluids.
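A minimal sketch of SVM discriminant analysis over preprocessed Raman spectra, assuming scikit-learn. The SVM-regression data-selection stage is omitted, and the data files and label encoding are placeholders.

```python
# Cross-validated SVM classification of preprocessed Raman spectra.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X = np.load("raman_spectra.npy")       # (n_spectra, n_wavenumbers)
y = np.load("fluid_labels.npy")        # e.g. 0 = blood, 1 = semen, 2 = mixture
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())
```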

17.
Crime Mapping and the Training Needs of Law Enforcement
This paper explores some of the more recent developments within crime mapping and the broader application of geographical information technology within law enforcement. The information technology (IT) revolution and the reduction in computing costs since the 1980s have brought a range of analytical tools within the budgets of most police services, and one of the most significant changes has been in the way that spatial data are handled. Law enforcement has strong geographic currents at all levels of the organisation, and this paper examines three applications of geographical information systems (GIS) within policing: hotspot mapping, CompStat, and geographic profiling. The paper concludes by discussing future training needs using a simple model of intelligence-led crime reduction. This model suggests that training managers to better understand the analyses presented to them, and to use mapping to further crime prevention and reduction, may be as important as increasing the technical ability of crime analysts. The challenge for the immediate future of crime reduction practice in law enforcement is less to worry about the training of analysts, and more to address the inability of law enforcement management to understand and act on the crime analysis they are given.

18.
Computer-automated scanning electron microscope X-ray microanalysis of Firearms Discharge Residue (FDR) can reveal substantial information about the circumstances of its generation beyond the presence of characteristic gunshot residue (GSR). Indicators of the type of weapon and ammunition used can be obtained from the distribution of GSR particle shapes and from the multi-element analysis of the FDR sample. This is demonstrated for a large database of GSR samples from nine different handguns and over 60 different ammunitions. An example classification scheme is presented for the supporting particles generally found in FDR. When particle type area concentration ratios are normalized to the iron (Fe) particle type, results show it is possible to determine much about the metal used in the weapon's manufacture, whether it was of large or small caliber, whether the bullets were jacketed or plated, and whether the cartridge cases were of aluminum, brass, or nickel-plated brass. Standardization of such analytical schemes would be advantageous.
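A small sketch of the normalization step described in the abstract: dividing each particle-type area concentration by that of the iron (Fe) particle type. The particle-type names and values are invented for illustration.

```python
# Normalize particle-type area concentrations to the Fe particle type.
def normalize_to_fe(area_concentration: dict) -> dict:
    fe = area_concentration["Fe"]
    return {ptype: value / fe for ptype, value in area_concentration.items()}

sample = {"Fe": 12.0, "Cu": 3.0, "PbBaSb": 30.0, "Ni": 1.5}   # made-up values
print(normalize_to_fe(sample))   # ratios relative to Fe
```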
