Similar Documents
20 similar documents found (search time: 421 ms)
1.
Using existing tools and methods for tampering with Microsoft Office documents, this paper designs several experimental schemes for tampering with the content and properties of Microsoft Office files, uses software tools to examine and analyze the properties of such files, and summarizes methods for the forensic examination of Microsoft Office documents.

3.
Document examination is an important branch of criminal science and technology: a specialized technical discipline that applies modern scientific theories and methods to provide scientific evidence for exposing and proving crimes. This paper briefly discusses the characteristics of document examination cases and related issues.

4.
Identifying color laser printed and copied documents by hidden-mark features   Cited by: 1 (self-citations: 0, others: 1)
Objective: To investigate the hidden-mark (tracking dot) features of laser printers and establish an effective method for identifying color laser printed and copied documents. Methods: Based on the light-reflection characteristics of different substances and the complementarity of blue and yellow, documents produced by several brands of color laser printers and copiers were examined under blue light of about 420 nm, and the dot-matrix pattern features, individual dot shapes, positions of occurrence, and arrangement semantics of the hidden marks were summarized. Results: The method is simple and effective for both class identification and individual identification of color laser printers and copiers.

5.
Determining when a document was produced is an important way of judging its authenticity. Document dating can be approached through three technical channels: the age of the ink (including seal impressions), the age of the paper, and systematic analysis of the document's content. Of these, dating the ink writing is the most common, so this paper focuses on ink-dating techniques. It systematically reviews the many ink-dating methods that have emerged over more than half a century, and summarizes and briefly evaluates the specific methods. On the basis of this review of past and current work, the paper proposes directions for the future development of document dating.

6.
The newly amended Administrative Litigation Law establishes a system of incidental review of normative documents, whose importance is beyond doubt. However, existing research falls far short on how this system should be designed and constructed. The design of the incidental review system should proceed from the conditions for requesting review; the scope, standards, and methods of review; and the handling of results after review, so as to build a scientific and effectively functioning system.

7.
A preliminary discussion of technical standards for document examination   Cited by: 1 (self-citations: 0, others: 1)
Technical standards for document examination are not a new topic; procedures and methods began to be standardized in the 1950s. Since the beginning of the 21st century, a number of forensic science and judicial expertise institutions have obtained national laboratory accreditation from the China National Accreditation Service for Conformity Assessment, greatly advancing the standardization of document examination techniques. This paper discusses the significance and prerequisites of developing such standards, the content the standards should cover, and related issues, offering guidance to those who draft them.

8.
Through an actual case, this paper examines the problem of making amendments based on cited documents, discusses the possible ways of amending, and offers a preferred opinion with supporting analysis. On this basis, it analyzes problems in the current examination standards governing amendments based on cited documents, and proposes targeted improvements and suggestions.

9.
By analyzing the working principles and performance of fax machines, this paper determines how the trace characteristics of faxed documents are formed, systematically studies methods and techniques for examining faxed documents, summarizes the key points of such examination, and argues for the necessity and feasibility of establishing standards for faxed-document examination.

10.
This paper briefly reviews the development of forensic document examination and summarizes current hot topics, including automated handwriting examination, document dating, non-destructive examination of questioned documents, comprehensive examination, quantitative examination, the application of computer technology to document examination, and future trends. It also describes problems in current document examination work and proposes solutions to them.

11.
Digital Investigation, 2007, 4(3-4): 119-128
Carving is the term most often used to indicate the act of recovering a file from unstructured digital forensic images. The term unstructured indicates that the original digital image does not contain useful filesystem information which may be used to assist in this recovery. Typically, forensic analysts resort to carving techniques as an avenue of last resort due to the difficulty of current techniques. Most current techniques rely on manual inspection of the file to be recovered and manual reconstruction using trial and error. Manual processing is typically impractical for modern disk images, which might contain hundreds of thousands of files. At the same time, the traditional process of recovering deleted files using filesystem information is becoming less practical because most modern filesystems purge critical information for deleted files. As such, the need for automated carving techniques is quickly arising, even when a filesystem does exist on the forensic image. This paper explores the theory of carving in a formal way. We then proceed to apply this formal analysis to the carving of PDF and ZIP files based on the internal structure inherent within the file formats themselves. Specifically, this paper deals with carving from the Digital Forensic Research Workshop's (DFRWS) 2007 carving challenge.

12.
File carving is the process of reassembling files from disk fragments based on the file content in the absence of file system metadata. By leveraging both file header and footer pairs, traditional file carving mainly focuses on document and image files such as PDF and JPEG. With the vast amount of malware code appearing in the wild daily, recovery of binary executable files becomes an important problem, especially for the case in which malware deletes itself after compromising a computer. However, unlike image files that usually have both a header and footer pair, executable files only have header information, which makes the carving much harder. In this paper, we present Bin-Carver, a first-of-its-kind system to automatically recover executable files with deleted or corrupted metadata. The key idea is to explore the road map information defined in executable file headers and the explicit control flow paths present in the binary code. Our experiment with thousands of binary code files has shown our Bin-Carver to be incredibly accurate, with an identification rate of 96.3% and recovery rate of 93.1% on average when handling file systems ranging from pristine to chaotic and highly fragmented.
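The "road map" idea can be sketched for ELF64, a common executable format: the section header table normally sits at the end of the file, so its offset, entry count, and entry size from the first 64 bytes predict the file's on-disk extent even though there is no footer. This is my own illustrative stand-in, not Bin-Carver's actual algorithm, and it assumes a well-formed little-endian ELF64 header.

```python
import struct

def elf64_expected_size(header: bytes) -> int:
    """Estimate the on-disk size of an ELF64 file from its first 64 bytes.

    Sketch of the 'road map' idea: the section header table usually
    ends the file, so e_shoff + e_shnum * e_shentsize approximates
    the total file length a carver should extract.
    """
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    # e_shoff: 8-byte offset at byte 40; e_shentsize and e_shnum: 2-byte fields at 58 and 60
    (e_shoff,) = struct.unpack_from("<Q", header, 40)
    e_shentsize, e_shnum = struct.unpack_from("<HH", header, 58)
    return e_shoff + e_shentsize * e_shnum
```

A carver that knows the expected length from the header alone can stop collecting clusters at the right point, which is exactly what a footer would otherwise provide.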

13.
This paper explores the use of purpose-built functions and cryptographic hashes of small data blocks for identifying data in sectors, file fragments, and entire files. It introduces and defines the concept of a “distinct” disk sector—a sector that is unlikely to exist elsewhere except as a copy of the original. Techniques are presented for improved detection of JPEG, MPEG and compressed data; for rapidly classifying the forensic contents of a drive using random sampling; and for carving data based on sector hashes.
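A minimal sketch of the sector-hashing-plus-sampling idea: hash every aligned 512-byte sector of a known file into a set, then randomly sample aligned sectors from a drive image and count how many land in that set. Function names and parameters are mine; the paper's actual functions and sampling analysis are more elaborate.

```python
import hashlib
import random

SECTOR = 512

def sector_hashes(data: bytes) -> set:
    """Hash every full, sector-aligned 512-byte block of a known file."""
    return {
        hashlib.sha256(data[i:i + SECTOR]).digest()
        for i in range(0, len(data) - SECTOR + 1, SECTOR)
    }

def sampled_hits(image: bytes, known: set, n_samples: int, seed: int = 0) -> int:
    """Randomly sample aligned sectors from an image and count known hits.

    Sketch of random-sampling triage: a small sample reliably detects
    content occupying a noticeable fraction of the drive.
    """
    rng = random.Random(seed)
    n_sectors = len(image) // SECTOR
    hits = 0
    for _ in range(n_samples):
        i = rng.randrange(n_sectors)
        if hashlib.sha256(image[i * SECTOR:(i + 1) * SECTOR]).digest() in known:
            hits += 1
    return hits
```

In practice the sampled sectors would also be filtered for "distinct" content (all-zero and other common sectors occur everywhere and carry no evidential weight).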

14.
File carving is a technique whereby data files are extracted from a digital device without the assistance of file tables or other disk meta-data. One of the primary challenges in file carving can be found in attempting to recover files that are fragmented. In this paper, we show how detecting the point of fragmentation of a file can benefit fragmented file recovery. We then present a sequential hypothesis testing procedure to identify the fragmentation point of a file by sequentially comparing adjacent pairs of blocks from the starting block of a file until the fragmentation point is reached. By utilizing serial analysis we are able to minimize the errors in detecting the fragmentation points. The performance results obtained from the fragmented test-sets of DFRWS 2006 and 2007 show that the method can be effectively used in recovery of fragmented files.

16.
File system forensics is an important part of digital forensics. Investigators of storage media have traditionally focused on the most commonly used file systems such as NTFS, FAT, ExFAT, Ext2-4, HFS+, and APFS. NTFS is the current file system used by Windows for the system volume, but this may change in the future. In this paper we show the structure of the Resilient File System (ReFS), which has been available since Windows Server 2012 and Windows 8. The main purpose of ReFS is to be used on storage spaces in server systems, but it can also be used in Windows 8 or newer. Although ReFS is not the current standard file system in Windows, users have the option to create ReFS file systems, so digital forensic investigators need to be able to investigate any file system identified on seized media. Further, we focus on remnants of non-allocated metadata structures or attributes. This may allow metadata carving, i.e., searching for specific attributes that are not allocated; attributes found can then be used for file recovery. ReFS uses superblocks and checkpoints in addition to a VBR, which is different from other Windows file systems. If the partition is reformatted with another file system, the backup superblocks can be used for partition recovery. Further, it is possible to search for checkpoints in order to recover both metadata and content.
Another concept not seen in other Windows file systems is block sharing. When a file is copied, both the original and the new file share the same content blocks. If the user changes the copy, new data runs are created for the modified content, but unchanged blocks remain shared. This may impact file carving, because some of the blocks previously used by a deleted file might still be in use by another file. The large default cluster size in ReFS v1.2, 64 KiB, is an advantage when carving for deleted files, since most deleted files are smaller than 64 KiB and therefore occupy only a single cluster. For ReFS v3.2 this advantage has decreased because the standard cluster size is 4 KiB.
Preliminary support for ReFS v1.2 has been available in EnCase 7 and 8, but the implementation has not been documented or peer-reviewed. The same is true for Paragon Software, which recently added ReFS support to their forensic product. Our work documents how ReFS v1.2 and ReFS v3.2 are structured, at an abstraction level that allows digital forensic investigation of this new file system. At the time of writing, Paragon Software is the only digital forensic tool that supports ReFS v3.x. The most recent version of ReFS is the most relevant for digital forensics, as Windows automatically updates the file system to the latest version on mount; this is why we have included information about ReFS v3.2. (It is, however, possible to change a registry value to avoid the update.) The latest ReFS version observed is 3.4, but the information presented about 3.2 remains valid. In any criminal case, the investigator needs to examine the file system version actually found.

17.
Video file format standards define only a limited number of mandatory features and leave room for interpretation. Design decisions of device manufacturers and software vendors are thus a fruitful resource for forensic video authentication. This paper explores AVI and MP4-like video streams of mobile phones and digital cameras in detail. We use customized parsers to extract all file format structures of videos from overall 19 digital camera models, 14 mobile phone models, and 6 video editing toolboxes. We report considerable differences in the choice of container formats, audio and video compression algorithms, acquisition parameters, and internal file structure. In combination, such characteristics can help to authenticate digital video files in forensic settings by distinguishing between original and post-processed videos, verifying the purported source of a file, or identifying the true acquisition device model or the processing software used.

18.
This paper describes research and analysis that were performed to identify a robust and accurate method for identifying and extracting the residual contents of deleted files stored within an HFS+ file system. A survey performed during 2005 of existing tools and techniques for HFS+ deleted file recovery reinforced the need for newer, more accurate techniques. Our research and analysis were based on the premise that a transactional history of file I/O operations is maintained in a Journal on HFS+ file systems, and that this history could be used to reconstruct recent deletions of active files from the file system. Such an approach offered a distinct advantage over other current techniques, including recovery of free/unallocated blocks and “file carving” techniques. If the journal entries contained or referenced file attributes such as the extents that specify which file system blocks were occupied by each file, then a much more accurate identification and recovery of deleted file data would be possible.

20.
“File carving” reconstructs files based on their content, rather than using metadata that points to the content. Carving is widely used for forensics and data recovery, but no file carvers can automatically reassemble fragmented files. We survey files from more than 300 hard drives acquired on the secondary market and show that the ability to reassemble fragmented files is an important requirement for forensic work. Next we analyze the file carving problem, arguing that rapid, accurate carving is best performed by a multi-tier decision problem that seeks to quickly validate or discard candidate byte strings – “objects” – from the media to be carved. Validators for the JPEG, Microsoft OLE (MSOLE) and ZIP file formats are discussed. Finally, we show how high speed validators can be used to reassemble fragmented files.
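A fast validator of the kind described above can be sketched for JPEG: check the SOI marker, then walk the marker segments via their big-endian length fields, rejecting as soon as the structure is inconsistent. This is my own scaled-down illustration, not the paper's validator; real validators also decode scan data to catch deeper corruption.

```python
import struct

def validate_jpeg(candidate: bytes) -> bool:
    """Cheap structural validator for a JPEG candidate byte string.

    Sketch of the multi-tier validator idea: walk marker segments via
    their length fields and reject quickly on any inconsistency.
    """
    if candidate[:2] != b"\xff\xd8":            # must start with SOI
        return False
    i = 2
    while i + 4 <= len(candidate):
        if candidate[i] != 0xFF:                # every segment starts with 0xFF
            return False
        marker = candidate[i + 1]
        if marker == 0xD9:                      # EOI before scan data: accept
            return True
        if marker == 0xDA:                      # SOS: entropy-coded data follows
            # accept if the candidate ends with EOI (ignoring zero padding
            # from cluster slack)
            return candidate.rstrip(b"\x00").endswith(b"\xff\xd9")
        (seg_len,) = struct.unpack_from(">H", candidate, i + 2)
        if seg_len < 2:                         # length includes its own 2 bytes
            return False
        i += 2 + seg_len                        # jump to the next marker
    return False
```

Because most random byte strings fail within the first few comparisons, a validator like this can discard candidates at near-I/O speed, which is what makes exhaustive reassembly of fragments tractable.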


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号