Similar Literature
1.
The ever-increasing size of digital media presents a continuous challenge to digital investigators, who must rapidly assess computer media to find and identify evidence. To meet this challenge, methods must continuously be sought to expedite the examination process. This paper investigates using the file ownership property as an analytical tool focusing on activity by the individuals associated with a computer. Research centered on the New Technology File System (NTFS), the default file system of the Microsoft Windows operating system (OS); Microsoft's worldwide market penetration makes Windows and NTFS the most likely OS and file system to be encountered in digital forensic examinations. Significantly, digital forensic software now allows examination of NTFS file attributes and properties, including the ownership property. The paper outlines potential limitations in interpreting ownership findings and suggests areas for further research. Overall, file ownership is seen as a potentially viable new digital forensic tool.
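As a quick illustration of the ownership property discussed above, the sketch below reads a file's owner on a live Windows system through the pywin32 bindings; this is an assumption for demonstration only, since the paper works with forensic software against acquired images rather than the live API.

```python
# Hypothetical sketch: query the owner of a file on a live Windows system.
# Requires the pywin32 package; forensic tools read the equivalent ownership
# information from the security descriptors referenced by the MFT instead.
import win32security

def file_owner(path: str) -> str:
    sd = win32security.GetFileSecurity(path, win32security.OWNER_SECURITY_INFORMATION)
    sid = sd.GetSecurityDescriptorOwner()
    name, domain, _kind = win32security.LookupAccountSid(None, sid)
    return f"{domain}\\{name}"

if __name__ == "__main__":
    print(file_owner(r"C:\Windows\notepad.exe"))
```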

2.
Drawing on file carving techniques from digital forensics, and taking the AVI file as a worked example, this paper describes a carving method that detects AVI file fragments and then orders and reassembles them into a complete file. The method is built on the structural characteristics of the AVI file format and borrows carving ideas such as file-structure keyword matching and bifragment gap carving. It can carve AVI files even when file system metadata has been lost. Experimental results show that the method improves the carving success rate for fragmented AVI files.
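As a rough sketch of the header-detection step (not the paper's implementation), the snippet below locates candidate AVI headers in a raw image by their RIFF signature and reads the declared file length; the fragment ordering and bifragment gap carving stages are omitted.

```python
import struct

def find_avi_headers(image_path: str):
    """Return (offset, expected_file_length) for every RIFF/AVI header found."""
    with open(image_path, "rb") as f:
        data = f.read()  # illustration only; a real carver streams the image
    hits = []
    pos = data.find(b"RIFF")
    while pos != -1:
        if data[pos + 8:pos + 12] == b"AVI ":
            declared = struct.unpack("<I", data[pos + 4:pos + 8])[0]
            hits.append((pos, declared + 8))  # RIFF size field excludes the 8-byte header
        pos = data.find(b"RIFF", pos + 1)
    return hits
```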

3.
Global Positioning System (GPS) devices are an increasingly important source of evidence, as more of our devices have built-in GPS capabilities. In this paper, we propose a novel framework to efficiently recover National Marine Electronics Association (NMEA) logs and reconstruct GPS trajectories. Unlike existing approaches that require file system metadata, our proposed algorithm is designed around the file carving technique and does not rely on system metadata. By understanding the characteristics and intrinsic structure of trajectory data in NMEA logs, we demonstrate how to pinpoint all data blocks belonging to the NMEA logs in the acquired forensic image of a GPS device. A discriminator is then presented to determine whether two data blocks can be merged, and based on this discriminator we design a reassembly algorithm to re-order and merge the obtained data blocks into new logs. Deleted trajectories can then be reconstructed by analyzing the recovered logs. Empirical experiments demonstrate that our proposed algorithm performs well whether or not system metadata is available, when log files are heavily fragmented, when one or more parts of the log files are overwritten, and across file systems with different cluster sizes.
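A minimal sketch of the kind of block discriminator described above, assuming NMEA 0183 text logs: a data block is scored by how much of it is covered by sentences whose XOR checksum verifies. The regular expression and scoring rule are illustrative, not the authors' parameters.

```python
import re

NMEA_RE = re.compile(rb"\$(GP|GN|GL)[A-Z]{3},[^\r\n*]*\*([0-9A-Fa-f]{2})")

def valid_nmea(sentence: bytes) -> bool:
    """Check the XOR checksum of a single NMEA 0183 sentence ($...*hh)."""
    try:
        body, checksum = sentence[1:].split(b"*", 1)
    except ValueError:
        return False
    calc = 0
    for b in body:
        calc ^= b
    return calc == int(checksum[:2], 16)

def nmea_block_score(block: bytes) -> float:
    """Fraction of a data block covered by valid NMEA sentences --
    a crude stand-in for the paper's block discriminator."""
    covered = sum(m.end() - m.start()
                  for m in NMEA_RE.finditer(block)
                  if valid_nmea(block[m.start():m.end()]))
    return covered / max(len(block), 1)
```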

4.
“File carving” reconstructs files based on their content, rather than using metadata that points to the content. Carving is widely used for forensics and data recovery, but no file carvers can automatically reassemble fragmented files. We survey files from more than 300 hard drives acquired on the secondary market and show that the ability to reassemble fragmented files is an important requirement for forensic work. Next we analyze the file carving problem, arguing that rapid, accurate carving is best performed by a multi-tier decision problem that seeks to quickly validate or discard candidate byte strings – “objects” – from the media to be carved. Validators for the JPEG, Microsoft OLE (MSOLE) and ZIP file formats are discussed. Finally, we show how high speed validators can be used to reassemble fragmented files.
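In the spirit of the fast validators described above (and only as a simplified sketch, not the authors' JPEG validator), the function below accepts a candidate byte string if its JPEG marker structure is plausible and rejects it as soon as an impossible marker or segment length appears.

```python
def looks_like_jpeg(data: bytes) -> bool:
    """Cheap structural check: verify the SOI marker, then walk marker
    segments until the scan data, rejecting impossible candidates early."""
    if len(data) < 4 or data[0:2] != b"\xff\xd8":
        return False                       # no Start Of Image
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            return False                   # markers must begin with 0xFF
        marker = data[i + 1]
        if marker == 0xDA:                 # Start Of Scan: entropy-coded data follows
            return b"\xff\xd9" in data[i:]  # accept if an EOI marker appears later
        seg_len = int.from_bytes(data[i + 2:i + 4], "big")
        if seg_len < 2:
            return False
        i += 2 + seg_len
    return False
```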

5.
《Digital Investigation》2007,4(3-4):119-128
Carving is the term most often used to indicate the act of recovering a file from unstructured digital forensic images. The term unstructured indicates that the original digital image does not contain useful filesystem information which may be used to assist in this recovery. Typically, forensic analysts resort to carving techniques as an avenue of last resort due to the difficulty of current techniques. Most current techniques rely on manual inspection of the file to be recovered and manually reconstructing this file using trial and error. Manual processing is typically impractical for modern disk images which might contain hundreds of thousands of files. At the same time, the traditional process of recovering deleted files using filesystem information is becoming less practical because most modern filesystems purge critical information for deleted files. As such, the need for automated carving techniques is quickly arising even when a filesystem does exist on the forensic image. This paper explores the theory of carving in a formal way. We then proceed to apply this formal analysis to the carving of PDF and ZIP files based on the internal structure inherent within the file formats themselves. Specifically, this paper deals with carving from the Digital Forensic Research Workshop's (DFRWS) 2007 carving challenge.
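As an illustrative sketch of structure-based ZIP carving (not the DFRWS solution itself), the generator below anchors on the End of Central Directory record and accepts a candidate only if the central-directory size and offset it declares point back to a plausible local file header.

```python
import struct

EOCD_SIG = b"PK\x05\x06"   # End of Central Directory record
LFH_SIG  = b"PK\x03\x04"   # Local File Header

def zip_candidates(data: bytes):
    """Yield (start, end) offsets of ZIP archives whose internal structure
    is self-consistent within the carved region."""
    pos = data.find(EOCD_SIG)
    while pos != -1:
        cd_size, cd_offset = struct.unpack("<II", data[pos + 12:pos + 20])
        comment_len = struct.unpack("<H", data[pos + 20:pos + 22])[0]
        start = pos - cd_size - cd_offset       # archive start implied by the EOCD
        end = pos + 22 + comment_len
        if start >= 0 and data[start:start + 4] == LFH_SIG:
            yield start, end
        pos = data.find(EOCD_SIG, pos + 1)
```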

6.
《Digital Investigation》2007,4(3-4):116-118
The NTFS file system underlying modern Windows versions provides the user with a number of novel ways in which to configure data storage and data paths within the NTFS environment. This article seeks to explain two of these, Volume Mount Points and Directory Junctions, so that when they are encountered the forensic examiner will have some information as to their use and structure.
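Both features sit on top of NTFS reparse points, so one quick way to spot them on a live system is to flag directories carrying the reparse-point attribute; the sketch below assumes a live Windows machine and Python 3.5+, and is only a triage aid, not a substitute for parsing the reparse data itself.

```python
import os
import stat

def reparse_points(root: str):
    """Yield directories under root that are reparse points
    (junctions, volume mount points, symbolic links)."""
    for dirpath, dirnames, _files in os.walk(root):
        for d in dirnames:
            full = os.path.join(dirpath, d)
            try:
                attrs = os.lstat(full).st_file_attributes  # Windows-only field
            except OSError:
                continue
            if attrs & stat.FILE_ATTRIBUTE_REPARSE_POINT:
                yield full

if __name__ == "__main__":
    for p in reparse_points("C:\\"):
        print(p)
```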

7.
This paper describes research and analysis that were performed to identify a robust and accurate method for identifying and extracting the residual contents of deleted files stored within an HFS+ file system. A survey performed during 2005 of existing tools and techniques for HFS+ deleted file recovery reinforced the need for newer, more accurate techniques. Our research and analysis were based on the premise that a transactional history of file I/O operations is maintained in a Journal on HFS+ file systems, and that this history could be used to reconstruct recent deletions of active files from the file system. Such an approach offered a distinct advantage over other current techniques, including recovery of free/unallocated blocks and “file carving” techniques. If the journal entries contained or referenced file attributes such as the extents that specify which file system blocks were occupied by each file, then a much more accurate identification and recovery of deleted file data would be possible.

8.
File carving is the process of reassembling files from disk fragments based on the file content in the absence of file system metadata. By leveraging both file header and footer pairs, traditional file carving mainly focuses on document and image files such as PDF and JPEG. With the vast amount of malware code appearing in the wild daily, recovery of binary executable files becomes an important problem, especially for the case in which malware deletes itself after compromising a computer. However, unlike image files that usually have both a header and footer pair, executable files only have header information, which makes the carving much harder. In this paper, we present Bin-Carver, a first-of-its-kind system to automatically recover executable files with deleted or corrupted metadata. The key idea is to explore the road map information defined in executable file headers and the explicit control flow paths present in the binary code. Our experiment with thousands of binary code files has shown our Bin-Carver to be incredibly accurate, with an identification rate of 96.3% and recovery rate of 93.1% on average when handling file systems ranging from pristine to chaotic and highly fragmented.
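As a small sketch of the "road map" idea for the common case of ELF binaries (the specific format and the little-endian, section-table-last assumptions are mine, not necessarily Bin-Carver's), the header alone already bounds how long the on-disk file should be.

```python
import struct

def elf64_expected_size(header: bytes):
    """Given at least the first 64 bytes of a candidate little-endian ELF64
    file, return the file length implied by the section header table, or None."""
    if len(header) < 64 or header[:4] != b"\x7fELF" or header[4] != 2:  # ELFCLASS64
        return None
    e_shoff = struct.unpack_from("<Q", header, 0x28)[0]          # section table offset
    e_shentsize, e_shnum = struct.unpack_from("<HH", header, 0x3A)
    if e_shoff == 0 or e_shnum == 0:
        return None
    return e_shoff + e_shentsize * e_shnum                        # expected on-disk size
```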

9.
When theft of a physical item occurs, it is detectable by the fact that the object is missing; however, when the theft of a digital item occurs, it can go unnoticed as exact replicas can be created. The original file is left intact but valuable information has been absconded with. One of the challenges facing digital forensic examiners is detecting when files have been copied off a computer system in some fashion. While certain methods do leave residual evidence behind, CD burning has long been held to be a copying method that cannot be identified. Through testing of the burning process and close examination of New Technology File System (NTFS) master file table artifacts in the various versions of Microsoft Windows, markers have been found that are associated with copying or "burning" files to CD or DVD. Potential evidence that was once overlooked may now be detectable.

10.
Objective: In digital forensic work, the encryption and decryption of data is often a focus for examiners. The Data Protection API (DPAPI) provided by Windows is widely used to protect encrypted data. Its defining characteristic is that encryption and decryption must be performed on the same computer: key generation, use, and management are handled internally by Windows, and DPAPI-encrypted data cannot be decrypted if it is moved to another machine. By analysing the DPAPI encryption mechanism, this study aims to decrypt DPAPI-protected data held in Windows system storage offline. Methods: The DPAPI encryption and decryption workflows of Windows XP, Windows 7 and Windows 10 were analysed in depth, which showed that offline decryption of system-store data depends chiefly on the system's registry hive files and master key files. Results: Using the reconstructed decryption workflow and algorithms, together with the registry hives and master key files, DPAPI-encrypted data could be decrypted successfully. Conclusion: The method achieves offline decryption of DPAPI-encrypted data in the Windows system store.
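For orientation only, the sketch below calls the documented DPAPI entry point CryptUnprotectData through ctypes. It succeeds only on the machine (and normally the user account) that produced the blob, which is exactly the restriction the offline method in this paper works around by parsing the registry hives and master key files directly.

```python
import ctypes
import ctypes.wintypes as wt

class DATA_BLOB(ctypes.Structure):
    _fields_ = [("cbData", wt.DWORD), ("pbData", ctypes.POINTER(ctypes.c_char))]

def dpapi_unprotect(blob: bytes) -> bytes:
    """Decrypt a DPAPI blob on the same machine/account that encrypted it."""
    buf_in = ctypes.create_string_buffer(blob, len(blob))
    blob_in = DATA_BLOB(len(blob), ctypes.cast(buf_in, ctypes.POINTER(ctypes.c_char)))
    blob_out = DATA_BLOB()
    if not ctypes.windll.crypt32.CryptUnprotectData(
            ctypes.byref(blob_in), None, None, None, None, 0, ctypes.byref(blob_out)):
        raise OSError("CryptUnprotectData failed")
    try:
        return ctypes.string_at(blob_out.pbData, blob_out.cbData)
    finally:
        ctypes.windll.kernel32.LocalFree(blob_out.pbData)
```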

11.
Document forensics remains an important field of digital forensics. To date, existing methods have focused on the last saved version of the document file stored on the PC; the drawback of this approach is that it provides no indication as to how the contents have been modified. This paper provides a novel method for document forensics based on tracking the revision history of a Microsoft Word file. The proposed method concentrates on the TMP file created when the author saves the file and the ASD file created periodically by Microsoft Word during editing. A process whereby the revision history lists are generated based on metadata of the Word, TMP, and ASD files is presented. Furthermore, we describe a technique developed to link the revision history lists based on similarity. These outcomes can provide considerable assistance to a forensic investigator trying to establish the extent to which document file contents have been changed and when the file was created, modified, deleted, and copied.
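A minimal sketch of the similarity-based linking step, assuming the Word/TMP/ASD metadata has already been reduced to timestamped text snapshots (the threshold and the use of difflib are illustrative choices, not the paper's):

```python
from difflib import SequenceMatcher

def link_revisions(snapshots, threshold=0.6):
    """snapshots: list of (timestamp, text); returns chains of related revisions."""
    snapshots = sorted(snapshots, key=lambda s: s[0])
    chains = []
    for ts, text in snapshots:
        for chain in chains:
            # attach to an existing chain if the latest snapshot is similar enough
            if SequenceMatcher(None, chain[-1][1], text).ratio() >= threshold:
                chain.append((ts, text))
                break
        else:
            chains.append([(ts, text)])   # otherwise start a new revision history
    return chains
```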

12.
The Windows registry serves as a primary storage location for system configurations and as such provides a wealth of information to investigators. Numerous researchers have worked to interpret the information stored in the registry from a digital forensic standpoint, but no definitive resource is yet available which describes how Windows deletes registry data structures under NT-based systems. This paper explores this topic and provides an algorithm for recovering deleted keys, values, and other structures in the context of the registry as a whole.
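A rough sketch of the recovery idea, with the hive-format offsets stated as assumptions rather than definitive values: registry cells carry a signed size that is negative while allocated, so free cells whose payload still begins with the "nk" key-node signature are candidates for deleted keys.

```python
import struct

def deleted_key_candidates(hive_path: str):
    """Yield (cell_offset, key_name) for free cells that still look like key nodes."""
    with open(hive_path, "rb") as f:
        data = f.read()
    pos = 0x1000                                   # hive bins start after the 4 KiB header
    while pos + 4 <= len(data) and data[pos:pos + 4] == b"hbin":
        bin_size = struct.unpack_from("<I", data, pos + 8)[0]
        if bin_size == 0:
            break
        cell = pos + 0x20                          # first cell after the hbin header
        while cell < pos + bin_size - 4:
            size = struct.unpack_from("<i", data, cell)[0]
            if size == 0:
                break
            if size > 0 and data[cell + 4:cell + 6] == b"nk":   # free cell, key node
                name_len = struct.unpack_from("<H", data, cell + 4 + 0x48)[0]
                name = data[cell + 4 + 0x4C:cell + 4 + 0x4C + name_len]
                yield cell, name.decode("latin-1", "replace")
            cell += abs(size)
        pos += bin_size
```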

13.
Objective: To analyse how different factors affect file timestamp attributes in Windows systems and to summarise the rules governing timestamp changes. Methods: On both FAT32 and NTFS file systems, various operations were performed on files and folders, the resulting changes in their timestamp attributes were recorded, and the patterns and influencing factors were analysed. Results: Timestamp updates depend on factors such as the system environment, the operation performed and the file type, and updates occur with a specific granularity. Conclusion: Changes to file timestamps in Windows follow definite rules but are also affected by other factors, which examiners should bear in mind.
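A tiny sketch in the spirit of these experiments: capture a file's timestamps before and after an operation and report which ones were updated (on Windows st_ctime is the creation time; on POSIX systems it is the metadata-change time).

```python
import os
import tempfile
import time
from datetime import datetime

def timestamps(path):
    st = os.stat(path)
    return {"created": st.st_ctime, "modified": st.st_mtime, "accessed": st.st_atime}

path = os.path.join(tempfile.gettempdir(), "ts_demo.txt")
with open(path, "w") as f:
    f.write("original")
before = timestamps(path)
time.sleep(2)
with open(path, "a") as f:          # the operation under test (here: an edit)
    f.write(" edited")
after = timestamps(path)
for k in before:
    if before[k] != after[k]:
        print(k, "changed:", datetime.fromtimestamp(before[k]),
              "->", datetime.fromtimestamp(after[k]))
```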

14.
When digital forensics started in the mid-1980s, most of the software used for analysis came from tools written for software development and debugging. Amongst these tools was the UNIX utility ‘dd’, which was used to create an image of an entire storage device. In the next decade the practice of creating and using ‘an image’ became established as a fundamental base of what we call ‘sound forensic practice’. By virtue of its structure, every file within the media was an integrated part of the image, and so we were assured that it was a wholesome representation of the digital crime scene. In an age of terabyte media, ‘the image’ is becoming increasingly cumbersome to process, simply because of its size. One solution to this lies in the use of distributed systems. However, the data assurance inherent in a single media image file is lost when data is stored in separate files distributed across a system. In this paper we assess current assurance practices and provide some solutions to the need to have assurance within a distributed system.
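One simple way to retain assurance once an image is split across a distributed store (a generic sketch, not the paper's proposal) is to record a manifest of per-segment hashes alongside the whole-image hash, so each segment can later be verified independently:

```python
import hashlib

def image_manifest(image_path: str, segment_size: int = 64 * 1024 * 1024):
    """Return per-segment SHA-256 digests plus the whole-image digest."""
    overall = hashlib.sha256()
    segments = []
    with open(image_path, "rb") as f:
        while True:
            chunk = f.read(segment_size)
            if not chunk:
                break
            overall.update(chunk)
            segments.append(hashlib.sha256(chunk).hexdigest())
    return {"segments": segments, "image_sha256": overall.hexdigest()}
```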

15.
Using validated carving techniques, we show that popular operating systems (e.g. Windows, Linux, and OSX) frequently have residual IP packets, Ethernet frames, and associated data structures present in system memory from long-terminated network traffic. Such information is useful for many forensic purposes, including establishment of prior connection activity and services used; identification of other systems present on the system’s LAN or WLAN; geolocation of the host computer system; and cross-drive analysis. We show that network structures can also be recovered from memory that is persisted onto a mass storage medium during the course of system swapping or hibernation. We present our network carving techniques, algorithms and tools, and validate these against both purpose-built memory images and readily available forensic corpora. These techniques are valuable both to forensic tasks, particularly in analyzing mobile devices, and to cyber-security objectives such as malware analysis.
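A minimal sketch of the packet-carving idea (not the authors' tooling): scan a memory image for byte sequences that parse as a plausible IPv4 header, i.e. version 4, a sane header length, and a valid header checksum, and report the endpoint addresses found.

```python
import ipaddress
import struct

def ip_checksum_ok(header: bytes) -> bool:
    """One's-complement sum over a correct header (checksum included) is 0xFFFF."""
    total = sum(struct.unpack(f">{len(header)//2}H", header))
    while total >> 16:
        total = (total & 0xFFFF) + (total >> 16)
    return total == 0xFFFF

def carve_ipv4_headers(memory_image: bytes):
    for i in range(len(memory_image) - 20):
        first = memory_image[i]
        if first >> 4 != 4:                       # IP version must be 4
            continue
        ihl = (first & 0x0F) * 4
        if ihl < 20 or i + ihl > len(memory_image):
            continue
        header = memory_image[i:i + ihl]
        total_len = struct.unpack(">H", header[2:4])[0]
        if total_len < ihl or not ip_checksum_ok(header):
            continue
        yield i, ipaddress.ip_address(header[12:16]), ipaddress.ip_address(header[16:20])
```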

16.
《Digital Investigation》2007,4(3-4):138-145
Pidgin, formerly known as Gaim, is a multi-protocol instant messaging (IM) client that supports communication on most of the popular IM networks. Pidgin is chiefly popular under Linux, and is available for Windows, BSD and other UNIX versions. This article presents a number of traces that are left behind after the use of Pidgin on Linux, enabling digital investigators to search for and interpret instant messaging activities, including online conversations and file transfers. Specifically, the contents and structures of user settings, log files, contact files and the swap partition are discussed. In addition to looking for such information in active files on a computer, forensic examiners can recover deleted items by searching a hard drive for file signatures and known file structures detailed in this article.
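As a triage-level sketch (the ~/.purple layout is the Pidgin 2.x default and is assumed here, not taken from the article), the snippet walks the per-contact conversation logs and reports those containing a keyword:

```python
import glob
import os

def search_pidgin_logs(keyword: str, purple_dir=os.path.expanduser("~/.purple")):
    """Yield Pidgin log files (logs/<protocol>/<account>/<contact>/*) matching keyword."""
    pattern = os.path.join(purple_dir, "logs", "*", "*", "*", "*")
    for log in glob.glob(pattern):
        try:
            with open(log, "r", errors="replace") as f:
                if keyword.lower() in f.read().lower():
                    yield log
        except OSError:
            continue

for hit in search_pidgin_logs("transfer"):
    print(hit)
```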

17.
《Digital Investigation》2007,4(3-4):146-157
Post-event timeline reconstruction plays a critical role in forensic investigation and serves as a means of identifying evidence of the digital crime. We present an artificial neural network-based approach for post-event timeline reconstruction using file system activities. A variety of digital forensic tools have been developed during the past two decades to assist computer forensic investigators undertaking digital timeline analysis, but most of the tools cannot handle large volumes of data efficiently. This paper looks at the effectiveness of employing neural network methodology for computer forensic analysis by preparing a timeline of relevant events occurring on a computing machine through tracing previous file system activities. Our approach consists of monitoring file system manipulations, capturing file system snapshots at discrete intervals of time to characterise the use of different software applications, and then using this captured data to train a neural network to recognise execution patterns of the application programs. The trained version of the network may then be used to generate a post-event timeline of a seized hard disk to verify the execution of different applications at different time intervals and assist in the identification of available evidence.
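A toy sketch of the idea, not the authors' network or feature set: synthetic per-snapshot activity counts (invented here purely for illustration) are used to train a small scikit-learn multilayer perceptron to label which application produced each snapshot.

```python
from sklearn.neural_network import MLPClassifier

# each row: [files_created, files_modified, config_writes, temp_files] per snapshot
snapshots = [
    [12, 30, 5, 9],   # word-processor session
    [2,  4,  0, 1],   # idle / background activity
    [25, 10, 2, 15],  # browser session
    [11, 28, 6, 8],
    [1,  3,  1, 0],
    [22, 12, 3, 14],
]
labels = ["wordproc", "idle", "browser", "wordproc", "idle", "browser"]

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(snapshots, labels)
print(clf.predict([[13, 29, 4, 10]]))   # should resemble the word-processor pattern
```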

18.
Computer forensic tools for Apple Mac hardware have traditionally focused on low-level file system details. Mac OS X and common applications on the Mac platform provide an abundance of information about the user's activities in configuration files, caches, and logs. We are developing MEGA, an extensible tool suite for the analysis of files on Mac OS X disk images. MEGA provides simple access to Spotlight metadata maintained by the operating system, yielding efficient file content search and exposing metadata such as digital camera make and model. It can also help investigators to assess FileVault encrypted home directories. MEGA support tools are under development to interpret files written by common Mac OS applications such as Safari, Mail, and iTunes.
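The same Spotlight metadata can be sampled with the stock macOS command-line tools rather than MEGA itself; the query string and attribute names below (kMDItemAcquisitionMake/Model) are assumptions based on the standard Spotlight schema, for illustration only.

```python
import subprocess

def camera_images(folder: str):
    """List files under folder that carry camera acquisition metadata."""
    query = 'kMDItemAcquisitionMake == "*"'
    out = subprocess.run(["mdfind", "-onlyin", folder, query],
                         capture_output=True, text=True, check=True).stdout
    for path in filter(None, out.splitlines()):
        attrs = subprocess.run(
            ["mdls", "-name", "kMDItemAcquisitionMake",
                     "-name", "kMDItemAcquisitionModel", path],
            capture_output=True, text=True).stdout
        print(path, attrs, sep="\n")

camera_images("/Users")
```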
