Similar Literature
20 similar documents retrieved
1.
File carving is a technique whereby data files are extracted from a digital device without the assistance of file tables or other disk metadata. One of the primary challenges in file carving is recovering files that are fragmented. In this paper, we show how detecting the fragmentation point of a file can benefit fragmented file recovery. We then present a sequential hypothesis testing procedure that identifies the fragmentation point by comparing adjacent pairs of blocks, starting from the first block of the file, until the fragmentation point is reached. By using serial analysis we minimize the errors in detecting fragmentation points. Performance results obtained on the fragmented test sets of DFRWS 2006 and 2007 show that the method can be used effectively to recover fragmented files.
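The core idea can be illustrated with a small sketch in the spirit of Wald's sequential probability ratio test: adjacent blocks are compared one pair at a time, evidence for "same file" versus "different file" is accumulated, and a decision is made once a threshold is crossed. The similarity metric (byte-histogram cosine similarity) and the match/mismatch probabilities below are illustrative assumptions, not the parameters used in the paper.

```python
import math
from collections import Counter

BLOCK_SIZE = 512

def block_similarity(a: bytes, b: bytes) -> float:
    """Cosine similarity of byte histograms; a stand-in for the paper's matching metric."""
    ha, hb = Counter(a), Counter(b)
    dot = sum(ha[v] * hb[v] for v in ha)
    na = math.sqrt(sum(c * c for c in ha.values()))
    nb = math.sqrt(sum(c * c for c in hb.values()))
    return dot / (na * nb) if na and nb else 0.0

def find_fragmentation_point(blocks, p_match=0.9, p_mismatch=0.3,
                             threshold=0.5, decision=3.0):
    """Sequentially test adjacent block pairs; return the index of the block after
    which the file appears to fragment, or None if no fragmentation is detected.
    p_match / p_mismatch are assumed probabilities that a pair scores above
    `threshold` when the blocks do / do not belong to the same file."""
    llr = 0.0  # accumulated log-likelihood ratio in favour of "still the same file"
    for i in range(len(blocks) - 1):
        similar = block_similarity(blocks[i], blocks[i + 1]) >= threshold
        if similar:
            llr += math.log(p_match / p_mismatch)
        else:
            llr += math.log((1 - p_match) / (1 - p_mismatch))
        if llr <= -decision:      # strong evidence the file continues elsewhere
            return i              # fragmentation point: last block of this fragment
        if llr >= decision:       # strong evidence of continuity; restart the test
            llr = 0.0
    return None

# Usage: blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
```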

2.
“File carving” reconstructs files based on their content, rather than using metadata that points to the content. Carving is widely used for forensics and data recovery, but no file carvers can automatically reassemble fragmented files. We survey files from more than 300 hard drives acquired on the secondary market and show that the ability to reassemble fragmented files is an important requirement for forensic work. Next we analyze the file carving problem, arguing that rapid, accurate carving is best performed by a multi-tier decision problem that seeks to quickly validate or discard candidate byte strings – “objects” – from the media to be carved. Validators for the JPEG, Microsoft OLE (MSOLE) and ZIP file formats are discussed. Finally, we show how high speed validators can be used to reassemble fragmented files.
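A fast validator of the kind described here can often reject a candidate byte string after inspecting only a few bytes. The sketch below is a deliberately simplified JPEG check (SOI marker at the start, EOI marker at the end, plausible marker segments in between); the validators in the paper perform deeper decoding, so treat this only as an illustration of the "quickly validate or discard" idea.

```python
def quick_jpeg_validate(data: bytes) -> bool:
    """Cheap plausibility check for a candidate JPEG object.
    Returns False as soon as the byte string cannot be a well-formed JPEG."""
    # A JPEG stream starts with the SOI marker FF D8 and ends with the EOI marker FF D9.
    if len(data) < 4 or data[:2] != b"\xff\xd8" or data[-2:] != b"\xff\xd9":
        return False
    # Walk the marker segments that precede the entropy-coded data.
    pos = 2
    while pos + 4 <= len(data):
        if data[pos] != 0xFF:
            return False                      # every marker must begin with FF
        marker = data[pos + 1]
        if marker == 0xD9:                    # EOI reached
            return True
        if marker == 0xDA:                    # SOS: entropy-coded data follows;
            return True                       # stop the cheap structural walk here
        if 0xD0 <= marker <= 0xD7:            # RST markers carry no length field
            pos += 2
            continue
        seg_len = int.from_bytes(data[pos + 2:pos + 4], "big")
        if seg_len < 2 or pos + 2 + seg_len > len(data):
            return False                      # segment length runs past the candidate
        pos += 2 + seg_len
    return False

# Usage:
# with open("candidate.bin", "rb") as f:
#     print(quick_jpeg_validate(f.read()))
```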

3.
File carving is the process of reassembling files from disk fragments based on file content in the absence of file system metadata. By leveraging header and footer pairs, traditional file carving mainly focuses on document and image files such as PDF and JPEG. With the vast amount of malware code appearing in the wild daily, recovery of binary executable files becomes an important problem, especially when malware deletes itself after compromising a computer. However, unlike image files, which usually have both a header and a footer, executable files only have header information, which makes carving much harder. In this paper, we present Bin-Carver, a first-of-its-kind system to automatically recover executable files with deleted or corrupted metadata. The key idea is to exploit the road map information defined in executable file headers together with the explicit control flow paths present in the binary code. Our experiments with thousands of binary files show Bin-Carver to be highly accurate, with an identification rate of 96.3% and a recovery rate of 93.1% on average when handling file systems ranging from pristine to chaotic and highly fragmented.
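The "road map" idea can be sketched for ELF binaries, the executable format on Linux: the header records where the program header table and the section header table live, which bounds how many bytes the carver should expect to follow the header. The field offsets below are the standard ELF64 layout; everything else in Bin-Carver (cluster handling, control-flow checks) is omitted, so this is only a minimal illustration.

```python
import struct

def elf64_expected_size(header: bytes):
    """Return a lower bound on the file size implied by an ELF64 header,
    or None if the bytes do not look like a little-endian ELF64 header.
    The section header table normally sits at the end of the file, so
    e_shoff + e_shnum * e_shentsize is a good estimate of the file extent."""
    if len(header) < 64 or header[:4] != b"\x7fELF":
        return None
    if header[4] != 2:                       # EI_CLASS: 2 means 64-bit
        return None
    if header[5] != 1:                       # EI_DATA: this sketch assumes little-endian
        return None
    e_phoff, = struct.unpack_from("<Q", header, 0x20)
    e_shoff, = struct.unpack_from("<Q", header, 0x28)
    e_phentsize, e_phnum = struct.unpack_from("<HH", header, 0x36)
    e_shentsize, e_shnum = struct.unpack_from("<HH", header, 0x3A)
    candidates = [64,                                     # the header itself
                  e_phoff + e_phentsize * e_phnum,        # end of program header table
                  e_shoff + e_shentsize * e_shnum]        # end of section header table
    return max(candidates)

# Usage:
# with open("carved_candidate", "rb") as f:
#     print(elf64_expected_size(f.read(64)))
```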

4.
《Digital Investigation》2007,4(3-4):119-128
Carving is the term most often used to describe the act of recovering a file from an unstructured digital forensic image. Unstructured means that the original image does not contain useful file system information that could assist in the recovery. Typically, forensic analysts resort to carving techniques as an avenue of last resort because of the difficulty of current techniques: most rely on manual inspection of the file to be recovered and on reconstructing it by trial and error. Manual processing is impractical for modern disk images, which may contain hundreds of thousands of files. At the same time, the traditional process of recovering deleted files using file system information is becoming less practical because most modern file systems purge critical information for deleted files. The need for automated carving techniques is therefore growing, even when a file system does exist on the forensic image. This paper explores the theory of carving in a formal way and then applies this formal analysis to the carving of PDF and ZIP files based on the internal structure inherent in the file formats themselves. Specifically, the paper works with the Digital Forensic Research Workshop's (DFRWS) 2007 carving challenge.
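For ZIP files, the internal structure that makes this kind of carving possible is the End of Central Directory (EOCD) record, which points back at the central directory and thereby bounds the archive. The sketch below only locates the EOCD and derives the expected archive extent; it is a minimal illustration under the assumption of a self-contained, single-volume archive, not the carver described in the paper.

```python
import struct

EOCD_SIG = b"PK\x05\x06"   # End of Central Directory signature

def zip_extent_from_eocd(data: bytes):
    """Locate the EOCD record in a candidate byte string and return the
    (archive_start_offset, archive_end_offset) it implies, or None.
    The EOCD stores the size and offset of the central directory, so the
    archive should start cd_offset bytes before the central directory and
    end just after the EOCD record and its comment."""
    idx = data.rfind(EOCD_SIG)
    if idx < 0 or idx + 22 > len(data):
        return None
    cd_size, cd_offset, comment_len = struct.unpack_from("<IIH", data, idx + 12)
    eocd_end = idx + 22 + comment_len
    # The central directory normally immediately precedes the EOCD record.
    cd_start_in_data = idx - cd_size
    if cd_start_in_data < 0:
        return None
    # In a self-contained archive the first local file header sits
    # cd_offset bytes before the central directory.
    archive_start = cd_start_in_data - cd_offset
    if archive_start < 0 or data[archive_start:archive_start + 4] != b"PK\x03\x04":
        return None
    return archive_start, eocd_end

# Usage:
# with open("image_chunk.bin", "rb") as f:
#     print(zip_extent_from_eocd(f.read()))
```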

5.
This paper applies file carving techniques from digital forensics to AVI files, describing a carving method that detects AVI file fragments and then orders and reassembles the fragments into a complete file. The method is based on the structural characteristics of the AVI file format and draws on established carving ideas such as file-structure keyword matching and bifragment gap carving. It can carve AVI files even when the file system metadata has been lost. Experimental results show that the method improves the carving success rate for fragmented AVI files.
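AVI is a RIFF container, so the structure keywords this kind of carving relies on are easy to check: the file starts with "RIFF", declares its payload size, and identifies itself as "AVI "; chunks such as "LIST", "movi" and "idx1" appear inside. The sketch below detects a candidate AVI header and the size the header claims, which is only the first step of such a carver; fragment ordering and gap carving are not shown.

```python
import struct

def parse_avi_header(data: bytes):
    """Return the total file size declared by a RIFF/AVI header,
    or None if the candidate does not start with an AVI header."""
    if len(data) < 12 or data[:4] != b"RIFF" or data[8:12] != b"AVI ":
        return None
    riff_size, = struct.unpack_from("<I", data, 4)   # payload size after these 8 bytes
    return 8 + riff_size                              # declared size of the whole file

def looks_like_avi_interior(block: bytes) -> bool:
    """Keyword check used to decide whether a stray block plausibly belongs
    to the interior of an AVI file (e.g. when hunting for fragments)."""
    keywords = (b"LIST", b"movi", b"idx1", b"00dc", b"01wb")
    return any(k in block for k in keywords)

# Usage:
# with open("sector0.bin", "rb") as f:
#     first = f.read(512)
# print(parse_avi_header(first), looks_like_avi_interior(first))
```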

6.
Global positioning system (GPS) devices are an increasingly important source of evidence, as more of our devices have built-in GPS capabilities. In this paper, we propose a novel framework to efficiently recover National Marine Electronics Association (NMEA) logs and reconstruct GPS trajectories. Unlike existing approaches that require file system metadata, the proposed algorithm is based on file carving and does not rely on such metadata. By understanding the characteristics and intrinsic structure of the trajectory data in NMEA logs, we show how to pinpoint all data blocks belonging to NMEA logs in an acquired forensic image of a GPS device. A discriminator is then presented to determine whether two data blocks can be merged, and based on this discriminator we design a reassembly algorithm that re-orders and merges the recovered blocks into new logs. Deleted trajectories can then be reconstructed by analyzing the recovered logs. Empirical experiments demonstrate that the algorithm performs well whether or not the system metadata is available, when log files are heavily fragmented, when one or more parts of a log file have been overwritten, and across different file systems with varying cluster sizes.
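NMEA logs are lines of ASCII sentences with per-sentence checksums and embedded timestamps, which is what makes a block-merging discriminator feasible. The sketch below validates sentences by checksum, extracts the UTC time from RMC sentences, and accepts a merge when the first timestamp of the second block follows the last timestamp of the first within a small gap. The gap threshold is an illustrative assumption; the paper's discriminator is more elaborate.

```python
import re
from functools import reduce

RMC = re.compile(rb"\$(GP|GN)RMC,([^*\r\n]*)\*([0-9A-Fa-f]{2})")

def _checksum_ok(body: bytes, declared: bytes) -> bool:
    # NMEA checksum: XOR of every character between '$' and '*'.
    return reduce(lambda a, b: a ^ b, body, 0) == int(declared, 16)

def _to_seconds(hhmmss: float) -> float:
    hh, mm, ss = int(hhmmss // 10000), int(hhmmss // 100) % 100, hhmmss % 100
    return hh * 3600 + mm * 60 + ss

def rmc_times(block: bytes):
    """Seconds-since-midnight of every valid RMC sentence in a block, in order."""
    times = []
    for talker, fields, cksum in RMC.findall(block):
        if not _checksum_ok(talker + b"RMC," + fields, cksum):
            continue
        try:
            times.append(_to_seconds(float(fields.split(b",")[0].decode("ascii"))))
        except ValueError:
            continue          # empty or malformed time field
    return times

def can_merge(block_a: bytes, block_b: bytes, max_gap_seconds: float = 60.0) -> bool:
    """Discriminator: block_b may directly follow block_a if its first fix is not
    earlier than block_a's last fix and the gap is small (assumed threshold).
    Crossing midnight would additionally need the RMC date field (field 9)."""
    ta, tb = rmc_times(block_a), rmc_times(block_b)
    return bool(ta and tb) and 0.0 <= tb[0] - ta[-1] <= max_gap_seconds

# Usage: feed cluster-sized blocks carved from the GPS image into can_merge().
```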

7.
File system forensics is an important part of digital forensics. Investigators of storage media have traditionally focused on the most commonly used file systems such as NTFS, FAT, ExFAT, Ext2-4, HFS+ and APFS. NTFS is the file system currently used by Windows for the system volume, but this may change in the future. In this paper we show the structure of the Resilient File System (ReFS), which has been available since Windows Server 2012 and Windows 8. The main purpose of ReFS is to be used on storage spaces in server systems, but it can also be used in Windows 8 or newer. Although ReFS is not the current standard file system in Windows, users have the option to create ReFS file systems, so digital forensic investigators need to be able to examine any file system identified on seized media. We further focus on remnants of non-allocated metadata structures or attributes. These allow metadata carving, that is, searching for specific attributes that are not allocated; attributes found can then be used for file recovery. ReFS uses superblocks and checkpoints in addition to a VBR, which is different from other Windows file systems. If the partition is reformatted with another file system, the backup superblocks can be used for partition recovery, and it is possible to search for checkpoints in order to recover both metadata and content. Another concept not previously seen in Windows file systems is block sharing: when a file is copied, both the original and the new file share the same content blocks. If the user changes the copy, new data runs are created for the modified content, but unchanged blocks remain shared. This may affect file carving, because some of the blocks previously used by a deleted file might still be in use by another file. The large default cluster size of 64 KiB in ReFS v1.2 is an advantage when carving for deleted files, since most deleted files are smaller than 64 KiB and therefore occupy only a single cluster; for ReFS v3.2 this advantage has decreased because the standard cluster size is 4 KiB. Preliminary support for ReFS v1.2 has been available in EnCase 7 and 8, but the implementation has not been documented or peer-reviewed; the same is true for Paragon Software, which recently added ReFS support to its forensic product. Our work documents how ReFS v1.2 and ReFS v3.2 are structured at an abstraction level that allows digital forensic investigation of this new file system. At the time of writing, Paragon Software is the only digital forensic tool that supports ReFS v3.x. The most recent version of ReFS is the most relevant for digital forensics, because Windows automatically updates the file system to the latest version on mount; this is why we include information about ReFS v3.2. It is, however, possible to change a registry value to avoid the update. The latest ReFS version observed is 3.4, but the information presented about 3.2 remains valid. In any criminal case, the investigator needs to examine the file system version actually found.

8.
This paper describes research and analysis that were performed to identify a robust and accurate method for identifying and extracting the residual contents of deleted files stored within an HFS+ file system. A survey of existing tools and techniques for HFS+ deleted file recovery, performed during 2005, reinforced the need for newer, more accurate techniques. Our research and analysis were based on the premise that a transactional history of file I/O operations is maintained in a journal on HFS+ file systems, and that this history could be used to reconstruct recent deletions of active files from the file system. Such an approach offered a distinct advantage over other current techniques, including recovery of free/unallocated blocks and "file carving" techniques. If the journal entries contained or referenced file attributes, such as the extents that specify which file system blocks were occupied by each file, then a much more accurate identification and recovery of deleted file data would be possible.

10.
In this paper, we propose two methods for recovering damaged audio files using deep neural networks. The presented recovery methods differ from the conventional file-carving-based approach in that they restore lost data that is difficult to recover with the latter. This research suggests that recovery tasks which are essential yet very difficult or time consuming can be automated with the proposed methods. We apply feed-forward and Long Short-Term Memory (LSTM) neural networks to the tasks. The experimental results show that deep neural networks can distinguish speech signals from non-speech signals and can also identify the encoding methods of the audio files at the bit level. This leads to successful recovery of damaged audio files that are otherwise difficult to recover using conventional file-carving-based methods.
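As a rough illustration of the second model type, the sketch below is a small PyTorch LSTM that classifies fixed-length byte sequences (e.g., fragments of an audio file) into categories such as speech vs. non-speech or different encodings. The architecture, sizes and training loop are generic assumptions made for a runnable example; the paper's networks, features and training data are not reproduced here.

```python
import torch
import torch.nn as nn

class FragmentClassifier(nn.Module):
    """LSTM over a sequence of byte values (embedded), followed by a linear classifier."""
    def __init__(self, n_classes: int, embed_dim: int = 16, hidden: int = 64):
        super().__init__()
        self.embed = nn.Embedding(256, embed_dim)        # one embedding per byte value
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                                # x: (batch, seq_len) byte values
        h, _ = self.lstm(self.embed(x))
        return self.head(h[:, -1, :])                    # classify from the last time step

# Minimal training step on dummy data (2 classes, fragments of 256 bytes):
model = FragmentClassifier(n_classes=2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

fragments = torch.randint(0, 256, (8, 256))              # batch of byte fragments
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = loss_fn(model(fragments), labels)
loss.backward()
optimizer.step()
print(float(loss))
```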

11.
When theft of a physical item occurs, it is detectable by the fact that the object is missing; when a digital item is stolen, however, the theft can go unnoticed because exact replicas can be created: the original file is left intact, but valuable information has been absconded with. One of the challenges facing digital forensic examiners is detecting when files have been copied off a computer system in some fashion. While certain copying methods do leave residual evidence behind, CD burning has long been regarded as a copying method that cannot be identified. Through testing of the burning process and close examination of master file table artifacts of the New Technology File System (NTFS) in the various versions of Microsoft Windows, markers have been found that are associated with copying or "burning" files to CD or DVD. Potential evidence that was once overlooked may now be detectable.

12.
This paper explores the use of purpose-built functions and cryptographic hashes of small data blocks for identifying data in sectors, file fragments, and entire files. It introduces and defines the concept of a “distinct” disk sector—a sector that is unlikely to exist elsewhere except as a copy of the original. Techniques are presented for improved detection of JPEG, MPEG and compressed data; for rapidly classifying the forensic contents of a drive using random sampling; and for carving data based on sector hashes.
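Two of the ideas here, hashing small fixed-size blocks and classifying a drive by random sampling, fit in a few lines. The sketch below hashes 4096-byte blocks with SHA-1 and samples random block-aligned offsets to estimate what fraction of a drive image matches a set of known block hashes; the block size, hash algorithm and sample count are illustrative choices, not the paper's exact parameters.

```python
import hashlib
import os
import random

BLOCK = 4096

def block_hashes(path: str):
    """Yield (offset, sha1_hexdigest) for every full block in an image file."""
    with open(path, "rb") as f:
        offset = 0
        while True:
            chunk = f.read(BLOCK)
            if len(chunk) < BLOCK:
                break
            yield offset, hashlib.sha1(chunk).hexdigest()
            offset += BLOCK

def sampled_match_rate(path: str, known_hashes: set, samples: int = 1000) -> float:
    """Estimate the fraction of blocks whose hash appears in `known_hashes`
    by reading `samples` randomly chosen block-aligned offsets."""
    n_blocks = os.path.getsize(path) // BLOCK
    if n_blocks == 0:
        return 0.0
    hits = 0
    with open(path, "rb") as f:
        for _ in range(samples):
            f.seek(random.randrange(n_blocks) * BLOCK)
            if hashlib.sha1(f.read(BLOCK)).hexdigest() in known_hashes:
                hits += 1
    return hits / samples

# Usage:
# known = {h for _, h in block_hashes("reference_files.img")}
# print(sampled_match_rate("suspect_drive.img", known))
```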

13.
Thermally altered skeletal remains can be very fragile and fragmented and are typically further fragmented or even destroyed when handled; recovery of such remains from a scene can therefore be extremely challenging. There are few recommendations and no generally accepted practices for preserving burned bone for recovery and transport. Here, we test whether the application of a gelatin-based consolidant at the scene can preserve thermally altered bone in the condition and relative anatomical position in which it was discovered. A solution of Knox® brand gelatin and water was applied to burned pig mandibles using a spray bottle. Qualitative and quantitative analysis indicates that the application of the consolidant significantly decreased fragmentation as compared to untreated controls (p < 0.05), with most of the treated mandibles remaining completely intact after recovery and transport to a secondary location. In addition to the effectiveness for preservation, the method is also easy to apply, inexpensive, and reversible.

14.
Research on the Production Methods of Questioned Documents in Civil Litigation Cases
From the perspective of forensic examination, questioned documents in civil litigation cases share common features with documentary physical evidence in criminal cases while also having their own characteristics. Starting from the elements that make up a questioned document, this paper studies the production methods and characteristics of such documents in order to broaden the approach to their examination and identification.

15.
《Digital Investigation》2014,11(3):224-233
The allocation algorithm of the Linux FAT32 file system driver positions files on disk in such a way that their relative positions reveal information about the order in which the files were created. This provides an opportunity to enrich information from (carved) file fragments with time information, even when the fragments lack the file system metadata in which time-related information is usually found. Through source code analysis and experiments, the behaviour of the Linux FAT allocator is examined. How an understanding of this allocator can be applied in practice is demonstrated with a case study involving a TomTom GPS car navigation device, in which time information played a crucial role. Large numbers of location records could be carved from the device's flash storage, yielding insight into the locations the device had visited, yet the carved records themselves offered no information on when the device had been at those locations. Still, bounds on the records' time of creation could be inferred from file system timestamps related to neighbouring on-disk positions. Finally, we perform experiments that contrast the Linux behaviour with that of Windows 7, and show that the latter differs subtly, breaking the strong relation between creation order and position.
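The bounding step can be sketched simply: if the allocator hands out clusters in increasing order (the non-wraparound case; the next entry discusses the caveats), then a carved record that lies between two allocated files on disk must have been created between those files' timestamps. The sketch below assumes you already have a list of allocated files with their starting clusters and creation times, e.g. from a file system parser; it is an illustration of the inference, not the paper's implementation.

```python
from bisect import bisect_left
from datetime import datetime

def creation_time_bounds(fragment_cluster: int, allocated):
    """Given the starting cluster of a carved fragment and a list of
    (start_cluster, creation_time) tuples for allocated files, return the
    (earliest, latest) creation times implied by the fragment's position.
    Assumes a monotonic (non-wraparound) allocation pattern."""
    allocated = sorted(allocated)                    # sort by start cluster
    clusters = [c for c, _ in allocated]
    i = bisect_left(clusters, fragment_cluster)
    lower = allocated[i - 1][1] if i > 0 else None            # file just before
    upper = allocated[i][1] if i < len(allocated) else None   # file just after
    return lower, upper

# Usage with made-up values:
files = [(100, datetime(2013, 5, 1, 9, 0)),
         (2400, datetime(2013, 5, 3, 14, 30)),
         (9000, datetime(2013, 6, 10, 8, 15))]
print(creation_time_bounds(5000, files))   # -> bounds between 2013-05-03 and 2013-06-10
```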

16.
Minnaard proposed a novel method that constructs a creation-time bound for files recovered without time information. The method exploits a relationship between the creation order of files and their locations on a storage device managed with the Linux FAT32 file system. This creation-order reconstruction method is valid only in non-wraparound situations, where the file creation time at a former position is earlier than that at a latter position. In this article, we show that if the Linux FAT32 file allocator traverses the storage space more than once, the creation time of a recovered file can be earlier than that of a former file and later than that of a latter file on the Linux FAT32 file system. It is also verified analytically that there are at most n candidates for the creation-time bound of each recovered file, where n is the number of traversals by the file allocator. Our analysis is evaluated by examining the file allocation patterns of two commercial in-car dashboard cameras.

17.
《Digital Investigation》2014,11(2):102-110
Anti-forensic techniques have been developed to hinder digital forensic investigations, and consequently methods for detecting anti-forensic behaviors have been studied in various areas. In the area of user activity analysis, "IconCache.db" files contain icon cache information related to applications, which can yield meaningful information for digital forensic investigations, such as traces of deleted files. A previous study investigated the general artifacts found in the IconCache.db file. In the present study, further features and structures of the IconCache.db file are described. We also propose methods for analyzing anti-forensic behaviors (e.g., time information related to the deletion of files). Finally, we introduce an analytical tool that was developed based on the file structure of IconCache.db. The tool parses strings out of IconCache.db to assist the analyst, making the file considerably easier to examine.
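The string-parsing step of such a tool can be approximated generically: Windows application paths inside binary artifacts are typically stored as UTF-16LE, alongside plain ASCII strings. The sketch below pulls both kinds of printable strings out of an arbitrary binary file; it does not implement the IconCache.db record structure described in the paper, only the generic extraction step.

```python
import re
import sys

ASCII_RUN = re.compile(rb"[\x20-\x7e]{6,}")                 # 6+ printable ASCII chars
UTF16LE_RUN = re.compile(rb"(?:[\x20-\x7e]\x00){6,}")       # 6+ printable UTF-16LE chars

def extract_strings(path: str):
    """Yield (offset, string) for ASCII and UTF-16LE printable runs in a binary file."""
    with open(path, "rb") as f:
        data = f.read()
    for m in ASCII_RUN.finditer(data):
        yield m.start(), m.group().decode("ascii")
    for m in UTF16LE_RUN.finditer(data):
        yield m.start(), m.group().decode("utf-16-le")

if __name__ == "__main__":
    for offset, s in extract_strings(sys.argv[1]):           # e.g. a copy of IconCache.db
        print(f"{offset:#010x}  {s}")
```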

18.
A problem that arises in computer forensics is to determine the type of a file fragment. An extension to the file name indicating the type is stored in the disk directory, but when a file is deleted, the directory entry for the file may be overwritten. The problem is easily solved when the fragment includes the initial header, which contains explicit type-identifying information, but it is more difficult to determine the type of a fragment taken from the middle of a file. We investigate two algorithms for predicting the type of a fragment: one based on Fisher's linear discriminant and the other based on longest common subsequences of the fragment with various sets of test files. We test the ability of the algorithms to predict a variety of common file types. Algorithms of this kind may be useful in designing the next generation of file carvers – programs that reconstruct files when directory information is lost or deleted. These methods may also be useful in designing virus scanners, firewalls and search engines that find files similar to a given file.
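The first of the two algorithms can be sketched with standard tools: represent each fragment by its byte-value histogram and train Fisher's linear discriminant on labelled fragments. The feature choice (raw byte frequencies) and the use of scikit-learn are assumptions made for the sake of a runnable example; the paper's exact features and training corpus are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def byte_histogram(fragment: bytes) -> np.ndarray:
    """256-dimensional relative frequency of byte values in a fragment."""
    counts = np.bincount(np.frombuffer(fragment, dtype=np.uint8), minlength=256)
    return counts / max(len(fragment), 1)

def train_classifier(fragments, labels):
    """fragments: list of bytes objects; labels: list of file-type labels."""
    X = np.stack([byte_histogram(f) for f in fragments])
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)
    return clf

# Toy demonstration: ASCII text fragments vs. pseudo-random ("compressed-like") fragments.
rng = np.random.default_rng(0)
text = [bytes(rng.integers(97, 123, 512, dtype=np.uint8)) for _ in range(50)]
rand = [bytes(rng.integers(0, 256, 512, dtype=np.uint8)) for _ in range(50)]
clf = train_classifier(text + rand, ["text"] * 50 + ["random"] * 50)
unknown = bytes(rng.integers(97, 123, 512, dtype=np.uint8))
print(clf.predict([byte_histogram(unknown)]))               # expected: ['text']
```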

19.
Digital video is used as evidence in criminal trials because video content vividly depicts events occurring at a crime scene. However, using sophisticated video editing software, an assailant can easily manipulate visible clues for their own benefit, so the integrity of digital video files acquired or submitted as evidence must be ensured. Forensic analysis of digital video is key to ensuring the integrity of links with individual cameras. In this study, we analyzed whether it is possible to ensure the integrity of MTS video files, and we propose a method to verify the integrity of MTS files encoded with Advanced Video Coding High Definition (AVCHD), which is frequently used for video recording. To verify MTS file integrity, we propose five features: codec information, picture timing, and camera manufacturer/model, adapted from features used to verify AVI and MP4-like video formats, together with group-of-pictures and Universally Unique Identifier patterns developed specifically for MTS streams. We analyzed the features of 44 standard files recorded using all recording options of seven cameras, and checked whether integrity could be validated for unmanipulated videos recorded in various environments. In addition, we considered whether manipulated MTS files edited in video editing software could be detected. Experimental results show that all unmanipulated and manipulated MTS files from known recording devices were discriminated only when all five features were checked. These results show that the proposed method verifies the integrity of MTS files, strengthening the validity of MTS file-based evidence in trials.

20.
The National Software Reference Library (NSRL) is an essential data source for forensic investigators, providing in its Reference Data Set (RDS) the hash values of known software. However, the NSRL RDS had not previously been tested against a broad spectrum of real-world data. The current work did this using a corpus of 36 million files on 2337 drives from 21 countries. These experiments answered a number of important questions about the NSRL RDS, including what fraction of files of different types it recognizes. NSRL coverage by vendor/product was also tested: 51% of the vendor/product names in our corpus had no hash values at all in NSRL. It is shown that coverage, or "recall", of the NSRL can be improved with additions from our corpus, such as frequently occurring files and files whose paths were found previously in NSRL with a different hash value; this provided 937,570 new hash values that should be uncontroversial additions to NSRL. Several additional tests investigated the accuracy of the NSRL data. Experiments testing the hash values saw no evidence of errors, and tests of file sizes showed them to be consistent except in a few cases. On the other hand, the product types assigned by NSRL can be disputed, and NSRL failed to recognize any of a sample of virus-infected files. The file names provided by NSRL had numerous discrepancies with the file names found in the corpus, so the discrepancies were categorized; among other things, there were apparent spelling and punctuation errors. Some file names suggest that NSRL hash values were computed on deleted files, which is not a safe practice. The tests had the secondary benefit of helping identify occasional errors in the metadata obtained from drive imaging of deleted files in our corpus. This research has provided much data useful for improving NSRL and the forensic tools that depend upon it, and it provides a general methodology and software for testing hash sets against corpora.
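Checking a corpus against the RDS reduces to hashing each file and looking the digest up in a set built from the NSRL hash list. The sketch below assumes the hash list has been exported to a plain text file with one SHA-1 digest per line (the real NSRL distribution is a CSV with additional columns, so the loader would need adapting), and the evidence path is a placeholder.

```python
import hashlib
import os

def sha1_of_file(path: str) -> str:
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest().upper()

def load_known_hashes(path: str) -> set:
    """Assumes one SHA-1 hex digest per line (pre-extracted from the RDS)."""
    with open(path) as f:
        return {line.strip().upper() for line in f if line.strip()}

def coverage(root: str, known: set):
    """Walk a directory tree and report how many files have a known hash."""
    total = hits = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            try:
                digest = sha1_of_file(os.path.join(dirpath, name))
            except OSError:
                continue                      # unreadable file; skip it
            total += 1
            hits += digest in known
    return hits, total

# Usage (paths are placeholders):
# known = load_known_hashes("nsrl_sha1.txt")
# hits, total = coverage("/mnt/evidence", known)
# print(f"{hits}/{total} files matched the NSRL RDS")
```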
