20 similar documents found (search time: 0 ms)
3.
Speed has recently become one of the hottest issues in digital forensics. Thanks to advances in storage technology, larger hard drives are available today at lower prices than ever before. For forensic investigators, unfortunately, this means tremendous time and effort throughout the process of creating forensic images, searching them, and analyzing them. To address this problem, several methods have been proposed for improving the performance of forensic tools. One approach attracting attention is hardware acceleration. So far, however, such hardware has been limited to evidence cloning and password cracking, and is rarely used for searching and analyzing digital evidence. In this paper, we design and implement a high-speed search engine using a Tarari content processor. We demonstrate the feasibility of our approach by comparing its performance and features against those of a popular forensic tool currently on the market.
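The abstract gives no implementation details, but the workload it targets is easy to picture. Below is a minimal, hypothetical Python sketch of the multi-pattern keyword scan that such hardware accelerates in silicon: it streams a raw image in chunks and keeps an overlap buffer so hits that straddle chunk boundaries are not missed. The image path and keyword list are illustrative, and a real tool would also handle file systems and text encodings.

```python
# Minimal software sketch of the multi-pattern keyword scan that content
# processors accelerate in hardware. The image path and keywords are
# hypothetical; a real tool would also decode file systems and encodings.
from typing import Iterator

KEYWORDS = [b"invoice", b"bitcoin", b"passwd"]  # hypothetical search terms
CHUNK = 64 * 1024 * 1024                        # 64 MiB read granularity

def scan_image(path: str, keywords: list[bytes]) -> Iterator[tuple[bytes, int]]:
    overlap = max(len(k) for k in keywords) - 1  # so no hit straddles a chunk
    consumed = 0      # bytes of the image read before the current chunk
    tail = b""
    with open(path, "rb") as img:
        while True:
            chunk = img.read(CHUNK)
            if not chunk:
                break
            buf = tail + chunk
            base = consumed - len(tail)          # absolute offset of buf[0]
            for kw in keywords:
                pos = buf.find(kw)
                while pos != -1:
                    if pos + len(kw) > len(tail):  # skip hits already reported
                        yield kw, base + pos
                    pos = buf.find(kw, pos + 1)
            tail = buf[-overlap:] if overlap else b""
            consumed += len(chunk)

for kw, off in scan_image("evidence.dd", KEYWORDS):
    print(f"{kw!r} at byte offset {off}")
```

A content processor effectively performs the inner matching loop in hardware, which is where the speedup over a purely software scan comes from.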
4.
The continuing decline in the cost per megabyte of hard disk storage has inevitably led to a ballooning volume of data that needs to be reviewed in digital investigations. The result: case backlogs that commonly stretch for months at forensic labs, and per-case processing that occupies days or weeks of analytical effort. Yet speed is critical in situations where delay may render the evidence useless or endanger personal safety, such as when a suspect may flee, a victim is at risk, or criminal tactics or control infrastructure may change. In these and other cases, investigators need tools that enable quick triage of computer evidence in order to answer urgent questions, maintain the pace of an investigation, and assess the likelihood of acquiring pertinent information from the device. This paper details the design and application of a tool, OpenLV, that not only meets the need for speedy initial triage but can also facilitate the review of digital evidence at later stages of an investigation. With OpenLV, an investigator can quickly and safely interact with collected evidence, much as if they had sat down at the computer at the time the evidence was collected. Since OpenLV works without modifying the evidence, its use in triage does not preclude subsequent in-depth forensic analysis. Unlike many popular forensics tools, OpenLV requires little training and facilitates an unprecedented level of interaction with the evidence.
5.
Progress in computer forensics research has been limited by the lack of standardized data sets—corpora—available for research purposes. We explain why corpora are needed to further forensic research, present a taxonomy for describing corpora, and announce the availability of several forensic data sets.
7.
Writing digital forensics (DF) tools is difficult because of the diversity of data types that need to be processed, the need for high performance, the skill set of most users, and the requirement that the software run without crashing. Developing this software is dramatically easier when one possesses a few hundred disks of other people's data for testing purposes. This paper presents some of the lessons learned by the author over the past 14 years developing DF tools and maintaining several research corpora that currently total roughly 30 TB.
8.
The current generation of Graphics Processing Units (GPUs) contains a large number of general-purpose processors, in sharp contrast to previous-generation designs, where special-purpose hardware units (such as texture and vertex shaders) were commonly used. This fact, combined with the prevalence of multicore general-purpose CPUs in modern workstations, suggests that performance-critical software such as digital forensics tools should be “massively” threaded to take advantage of all available computational resources. Several trends in digital forensics make the availability of more processing power very important. These trends include a large increase in the average size (measured in bytes) of forensic targets, an increase in the number of digital forensics cases, and the development of “next-generation” tools that require more computational resources. This paper presents the results of a number of experiments that evaluate the effectiveness of offloading processing common to digital forensics tools to a GPU, using “massive” numbers of threads to parallelize the computation. These results are compared to the speedups obtainable with simple threading schemes appropriate for multicore CPUs. Our results indicate that in many cases the use of GPUs can substantially increase the performance of digital forensics tools.
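As a point of reference, here is a hedged Python sketch of the kind of simple multicore scheme such experiments use as the CPU baseline, with block hashing standing in for the forensic workload; the image path and block size are hypothetical. A GPU version would launch thousands of lightweight threads over the same blocks instead of one worker per core.

```python
# Sketch of a simple multicore baseline: hash fixed-size blocks of a disk
# image in a process pool. Block hashing stands in for the forensic
# workload; the image path is hypothetical.
import hashlib
from multiprocessing import Pool

BLOCK = 4096
IMAGE = "evidence.dd"  # hypothetical raw image

def hash_block(args):
    offset, data = args
    return offset, hashlib.sha1(data).hexdigest()

def read_blocks(path):
    with open(path, "rb") as f:
        offset = 0
        while True:
            data = f.read(BLOCK)
            if not data:
                break
            yield offset, data
            offset += len(data)

if __name__ == "__main__":
    with Pool() as pool:  # one worker per CPU core by default
        results = pool.imap_unordered(hash_block, read_blocks(IMAGE), chunksize=256)
        for offset, digest in results:
            pass  # store (offset, digest) in a block-hash database
```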
9.
There is an alarming increase in the number of cybercrime incidents carried out through anonymous e-mails. The problem of e-mail authorship attribution is to identify the most plausible author of an anonymous e-mail from a group of potential suspects. Most previous contributions employed a traditional classification approach, such as decision trees or Support Vector Machines (SVM), to identify the author, and studied the effects of different writing-style features on classification accuracy. However, little attention has been given to ensuring the quality of the evidence. In this paper, we introduce a data mining method that captures the write-print of each suspect and models it as combinations of features that occur frequently in the suspect's e-mails. This notion, called a frequent pattern, has proven effective in many data mining applications, but this is its first application to the problem of authorship attribution. Unlike the traditional approach, the write-print extracted by our method is unique among the suspects and therefore provides convincing and credible evidence for presentation in a court of law. Experiments on real-life e-mails suggest that the proposed method can effectively identify the author, and the results are supported by strong evidence.
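To make the frequent-pattern idea concrete, here is an illustrative Python sketch, not the authors' implementation: each e-mail is reduced to a set of stylometric feature IDs, feature combinations that recur across a suspect's e-mails are mined, and only combinations absent from every other suspect's frequent set are kept as the write-print. The feature encoding, support threshold, and toy data are all hypothetical.

```python
# Illustrative sketch of the frequent-pattern write-print idea. Feature
# extraction and thresholds are hypothetical simplifications.
from itertools import combinations
from collections import Counter

def frequent_patterns(emails, min_support=2, max_size=3):
    counts = Counter()
    for features in emails:  # each e-mail: a frozenset of feature IDs
        for k in range(1, max_size + 1):
            for combo in combinations(sorted(features), k):
                counts[combo] += 1
    return {p for p, c in counts.items() if c >= min_support}

def write_print(suspect, corpora):
    """Patterns frequent for `suspect` and absent from every other suspect."""
    own = frequent_patterns(corpora[suspect])
    others = set().union(*(frequent_patterns(mails)
                           for name, mails in corpora.items() if name != suspect))
    return own - others

corpora = {  # toy data: feature IDs could encode n-grams, punctuation habits, etc.
    "alice": [frozenset({1, 2, 5}), frozenset({1, 2, 7}), frozenset({2, 5})],
    "bob":   [frozenset({2, 3}), frozenset({2, 3, 4}), frozenset({3, 9})],
}
print(write_print("alice", corpora))  # e.g. {(1,), (5,), (1, 2), (2, 5)}
```

Because the surviving combinations occur in no other suspect's frequent set, they can be presented as a discriminating signature rather than an opaque classifier score.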
11.
Digital Investigation, 2014, 11(3): 234-248
Interpretation of traces found on Android devices is an important aspect of mobile forensics. This is especially true for timestamps encountered on the device under investigation. In the presence of both naive and UTC timestamps, some form of timestamp normalisation is required. In addition, the investigator needs to gain some understanding of any clock skew that may exist, especially when evidence from the device under investigation has to be correlated with real-world events or evidence from other devices. A case study is presented in which the time zone on the Android device was set incorrectly, while the clock was set to correspond to the time zone where the device was actually located. Initially, the fact that both time zones enforced daylight saving time (DST) during different periods was expected to complicate timestamp normalisation. However, it was found that the version of the Time Zone Database on the device was outdated and did not correspond to the actual time zone rules for the given period. After the case study, the results of experiments on a broader range of devices are presented. Among other things, these results demonstrate a method for detecting clock skew based on the mmssms.db database. However, it was also found that the applicability of this method depends heavily on specific implementation choices made by different vendors.
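The normalisation problem in the case study can be illustrated with Python's zoneinfo; the zone names and timestamp below are invented, not taken from the paper. Note that zoneinfo draws on an installed Time Zone Database, so, exactly as the paper observes for the device itself, an outdated tzdata copy yields wrong offsets.

```python
# Sketch of the normalisation pitfall: the device clock showed correct
# local wall time, but the configured zone was wrong, so naive timestamps
# must be reinterpreted. Zone names and the timestamp are illustrative.
# zoneinfo needs Python 3.9+ and an installed tzdata database.
from datetime import datetime
from zoneinfo import ZoneInfo

CONFIGURED = ZoneInfo("Europe/London")     # zone set on the device (wrong)
ACTUAL = ZoneInfo("Europe/Amsterdam")      # zone where the device really was

naive = datetime(2014, 3, 30, 10, 15)      # naive local timestamp from an app database

# Wrong: trust the configured zone when converting to UTC.
wrong_utc = naive.replace(tzinfo=CONFIGURED).astimezone(ZoneInfo("UTC"))
# Right for this case: the wall time tracked the actual location.
right_utc = naive.replace(tzinfo=ACTUAL).astimezone(ZoneInfo("UTC"))

print(wrong_utc, right_utc)  # differ by the zone offset, DST rules included
```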
13.
Computer forensic tools for Apple Mac hardware have traditionally focused on low-level file system details. Mac OS X and common applications on the Mac platform provide an abundance of information about the user's activities in configuration files, caches, and logs. We are developing MEGA, an extensible tool suite for the analysis of files on Mac OS X disk images. MEGA provides simple access to Spotlight metadata maintained by the operating system, yielding efficient file content search and exposing metadata such as digital camera make and model. It can also help investigators to assess FileVault-encrypted home directories. MEGA support tools are under development to interpret files written by common Mac OS applications such as Safari, Mail, and iTunes.
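On a live macOS system, the kind of metadata MEGA exposes can be previewed with the stock mdls utility; the sketch below queries the standard Spotlight keys for camera make and model. The photo path is hypothetical, and MEGA itself reads the Spotlight store from a disk image rather than querying the running OS.

```python
# Hypothetical sketch: query Spotlight metadata via the stock macOS `mdls`
# tool. kMDItemAcquisitionMake/Model are standard Spotlight attribute keys;
# the photo path is made up for illustration.
import subprocess

def spotlight_attrs(path, attrs):
    cmd = ["mdls"]
    for a in attrs:
        cmd += ["-name", a]
    cmd.append(path)
    return subprocess.run(cmd, capture_output=True, text=True).stdout

print(spotlight_attrs("photo.jpg",
                      ["kMDItemAcquisitionMake", "kMDItemAcquisitionModel"]))
```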
15.
Today’s Golden Age of computer forensics is quickly coming to an end. Without a clear strategy for enabling research efforts that build upon one another, forensic research will fall behind the market, tools will become increasingly obsolete, and law enforcement, military and other users of computer forensics products will be unable to rely on the results of forensic analysis. This article summarizes current forensic research directions and argues that to move forward the community needs to adopt standardized, modular approaches for data representation and forensic processing.
16.
The big data era is having a major impact on forensic data analysis. Work is being done both to speed up the processing of large amounts of data and to enrich this processing with new techniques. Forensics calls for specific design considerations, since the data being processed is highly sensitive. In this paper we explore the impact of forensic drivers and major design principles such as security, privacy, and transparency on the design and implementation of a centralized digital forensics service.
17.
International regulations on the safety of ships at sea require every modern vessel to be equipped with a Voyage Data Recorder (VDR) to assist investigations in the event of an accident. These devices are therefore the primary means of acquiring reliable data about an accident involving a ship, and so they must be among the first targets in an investigation. Although the regulations describe the sources and amount of data to be recorded, they say nothing about the format of the recording. Because of this, investigators are currently forced to rely solely on the help of the system's builder, which provides proprietary software to “replay” the voyage recordings. This paper examines the data found in the VDR from the actual Costa Concordia accident in 2012 and describes the recovery of information useful to the investigation, both by deduction and by reverse engineering of the data, some of which was not even shown by the official replay software.
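One step in this kind of examination can be sketched in Python: carving candidate NMEA 0183 sentences (the standard encoding for bridge instruments such as GPS) out of an undocumented binary dump and validating their checksums. The dump file name is hypothetical and real VDR layouts vary by manufacturer; this illustrates the reverse-engineering approach, not the paper's actual parser.

```python
# Illustrative sketch: carve ASCII NMEA 0183 sentences out of an
# undocumented binary dump and validate their XOR checksums. The dump
# file name is hypothetical; real VDR layouts vary by manufacturer.
import re

# $ + 2-char talker ID + 3-char sentence type + fields + '*' + hex checksum
NMEA = re.compile(rb"\$[A-Z]{2}[A-Z]{3},[^*\r\n]*\*[0-9A-F]{2}")

def checksum_ok(sentence: bytes) -> bool:
    body, _, given = sentence[1:].partition(b"*")
    calc = 0
    for byte in body:   # checksum is XOR of all bytes between '$' and '*'
        calc ^= byte
    return calc == int(given, 16)

with open("vdr_dump.bin", "rb") as f:
    blob = f.read()

for m in NMEA.finditer(blob):
    if checksum_ok(m.group()):
        print(m.start(), m.group().decode("ascii"))
```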
18.
Matthew B. Robinson, American Journal of Criminal Justice, 2000, 24(2): 169-179
This paper isolates crime prevention policy implications which stem from a series of interrelated environmental studies of residential burglaries. A number of crime prevention strategies are developed using a systems approach. It is argued that changes made to the environments of individuals, groups, communities, organizations, and society can achieve lower risks of residential burglary victimization.
19.
LT B. A. Bayma Jr., The Journal of Technology Transfer, 1979, 3(2): 43-51
Technology and the dissemination of information have become a public policy issue in the United States. This article examines the concept of technology transfer and the issue of exporting technological innovation, providing a historical overview of the process and of private- and public-sector attitudes toward the transfer of technology.
20.
Dynamic malware analysis aims to reveal malware's runtime behavior. To evade analysis, advanced malware can detect the underlying analysis tool (e.g., one based on emulation). On the other hand, existing malware-transparent analysis tools incur significant performance overhead, making them unsuitable for live malware monitoring and forensics. In this paper, we present IntroLib, a practical tool that traces user-level library calls made by malware with low overhead and high transparency. IntroLib is based on hardware virtualization and resides outside the guest virtual machine where the malware runs. Our evaluation of an IntroLib prototype with 93 real-world malware samples shows that IntroLib is immune to emulation and API-hooking detection by malware, uncovers more semantic information about malware behavior than system call tracing, and incurs low overhead (<15% in all but one test case) in performance benchmark testing.
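The semantic gain over system-call tracing comes from attributing an intercepted instruction pointer to a specific library. The hypothetical Python sketch below shows that one resolution step in isolation: given the address ranges of loaded modules (which a tool like IntroLib must reconstruct from outside the VM), map a guest instruction pointer to "library + offset". The map entries and addresses are invented for illustration.

```python
# Conceptual sketch: resolve an intercepted guest instruction pointer to
# "library + offset" using module address ranges. Entries are invented;
# a VMI tool reconstructs the real map from guest memory.
import bisect

MAPS = sorted([  # (start, end, name) for each loaded module
    (0x7f3a1c000000, 0x7f3a1c1c0000, "libc.so.6"),
    (0x7f3a1c400000, 0x7f3a1c420000, "libcrypt.so.1"),
    (0x560a2d000000, 0x560a2d050000, "malware_sample"),
])
STARTS = [m[0] for m in MAPS]

def resolve(rip: int) -> str:
    i = bisect.bisect_right(STARTS, rip) - 1  # last range starting at or below rip
    if i >= 0 and rip < MAPS[i][1]:
        start, _, name = MAPS[i]
        return f"{name}+{rip - start:#x}"
    return "unmapped"

print(resolve(0x7f3a1c0012f0))  # -> libc.so.6+0x12f0
```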