Similar Documents
20 similar documents retrieved.
1.
Abstract

Various terms have been used to describe the intersection between computing technology and violations of the law, including computer crime, electronic crime, and cybercrime. While there remains little agreement on terminology, most experts agree that the use of electronic devices to commit crime has increased dramatically and is now commonplace. It is the role of the digital investigator to bring cybercriminals to justice. Cybercrime, however, differs from traditional crime and presents a variety of unique challenges, including the variety of electronic devices available, the amount of data produced by these devices, the absence of standard practices and guidelines for analyzing that data, the lack of qualified personnel to perform investigations, and the lack of resources to provide ongoing training. This paper examines these challenges.

2.
Science & Justice, 2021, 61(5): 477-492
Software invisibly permeates our everyday lives: operating devices in our physical world (traffic lights and cars), effecting our business transactions, and powering the vast World Wide Web. We have come to rely on such software to work correctly and efficiently. The generally accepted narrative is that any software errors that do occur can be traced back to a human operator's mistakes. Software engineers know that this is merely a comforting illusion. Software sometimes has bugs, which might lead to erratic performance: intermittently generating errors. The software, hardware, and communication infrastructure can all introduce errors, which are often challenging to isolate and correct. Anomalies that manifest are certainly not always due to an operator's actions. When the general public and the courts believe the opposite, that errors are usually attributable to some human operator's actions, it is entirely possible for a hapless innocent individual to be blamed for anomalies and discrepancies whose actual source is a software malfunction. This is what occurred in the Post Office Horizon IT case, where unquestioning belief in the veracity of software-generated evidence led to a decade of wrongful convictions. We use this case as a vehicle to demonstrate the ways biases can influence investigations, and to motivate the development of a framework to guide objective digital forensics investigations. This framework, if used, could go some way towards neutralising biases and preventing similar miscarriages of justice in the future.

3.
4.
5.
Smart cities are comprised of diverse and interconnected components constantly exchanging data and facilitating improved living for a nation's population. Our view of a typical smart city consists of four key components, namely Smart Grids, Building Automation Systems (BAS), Unmanned Aerial Vehicles (UAVs), and Smart Vehicles, together with enabling Internet of Things (IoT) sensors and the Cloud platform. The adversarial threats and criminal misuses in a smart city are increasingly heterogeneous and significant, and provisioning resilient, end-to-end security is a daunting task. When a cyber incident involving critical components of the smart city infrastructure occurs, appropriate measures can be taken to identify and enumerate concrete evidence to facilitate the forensic investigation process. Forensic preparedness and lessons learned from past forensic analysis can help protect the smart city against future incidents. This paper presents a holistic view of the security landscape of a smart city, identifying security threats and providing deep insight into digital investigation in the context of the smart city.

6.
7.
“Speed” has recently become one of the hot issues in digital forensics. Thanks to advances in technology, we can now buy larger hard disk drives at a lower price than before. Unfortunately, for forensic investigators this means a tremendous amount of time and effort across the process of creating forensic images, searching them, and analyzing them. To address this problem, several methods have been proposed to improve the performance of forensic tools. One approach attracting attention is hardware acceleration. However, such hardware has so far been limited to evidence cloning or password cracking, and is rarely used for searching and analyzing digital evidence. In this paper, we design and implement a high-speed search engine using a Tarari content processor. Furthermore, we show the feasibility of our approach by comparing its performance and features to those of a popular forensic tool currently on the market.
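The underlying task here is multi-pattern keyword search over raw images, which is exactly what a content processor accelerates in hardware. As a point of reference, a minimal software-only sketch of that task follows, using only the Python standard library; the image path and keyword list are hypothetical, and a production tool would use an automaton-based matcher rather than repeated find() calls.

```python
# Minimal multi-pattern search over a raw disk image: the software
# baseline for the scan that a hardware content processor accelerates.
KEYWORDS = [b"password", b"invoice", b"confidential"]  # illustrative terms
CHUNK = 64 * 1024 * 1024                               # 64 MiB per read

def search_image(path, keywords):
    """Report (absolute offset, keyword) for every match in the image."""
    overlap = max(len(k) for k in keywords) - 1
    hits, tail, pos0 = [], b"", 0      # pos0 = offset of (tail + chunk)[0]
    with open(path, "rb") as img:
        while chunk := img.read(CHUNK):
            buf = tail + chunk
            for kw in keywords:
                i = buf.find(kw)
                while i != -1:
                    # Matches wholly inside `tail` were already reported
                    # during the previous iteration; skip them.
                    if i + len(kw) > len(tail):
                        hits.append((pos0 + i, kw))
                    i = buf.find(kw, i + 1)
            tail = buf[-overlap:] if overlap else b""
            pos0 += len(buf) - len(tail)
    return hits

if __name__ == "__main__":
    for offset, kw in search_image("evidence.dd", KEYWORDS):
        print(f"{kw.decode()} at byte offset {offset:#x}")
```

The overlap buffer keeps patterns that straddle a chunk boundary from being missed, while the skip condition prevents the same match from being counted twice.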

8.
Digital forensics has recently become increasingly important, as it is used by investigative agencies as well as the corporate and private sectors. To satisfy the requirements for the admissibility of evidence and its recognition in court, it is essential to establish an environment that ensures the integrity of the entire process, from collecting and analyzing digital evidence to submitting it to court. In this study, common elements were extracted by comparing and analyzing the ISO/IEC 17025 and 27001 standards and the Interpol and Council of Europe (CoE) guidelines to derive the components necessary for building a digital forensic laboratory. Subsequently, a Delphi survey and verifications were conducted in three rounds with 21 digital forensic experts in the field. As a result, 40 components in seven areas were derived. The results address the establishment, operation, management, and accreditation of a digital forensic laboratory suited to the domestic (Korean) environment, with credibility added through the collected opinions of 21 experts in the field of digital forensics in Korea. This study can be referred to when establishing digital forensic laboratories in national, public, and private organizations, and can serve as competency measurement criteria for courts evaluating the reliability of analysis results.
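The abstract does not state which consensus criterion the three Delphi rounds used, but Lawshe's content validity ratio (CVR) is a common choice for validating candidate components with an expert panel of this size. A hypothetical illustration with the panel of 21 experts:

```python
def content_validity_ratio(essential_votes, panel_size=21):
    """Lawshe's CVR: +1 if every expert rates the item essential,
    0 if exactly half do, -1 if none do."""
    half = panel_size / 2
    return (essential_votes - half) / half

# Hypothetical round: 17 of 21 experts rate a candidate component
# (say, an evidence-intake procedure) as essential.
print(f"CVR = {content_validity_ratio(17):.2f}")  # 0.62
```

Components whose CVR falls below the critical value for the panel size would be dropped or revised before the next round.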

9.
The continuing decline in the cost per megabyte of hard disk storage has inevitably led to a ballooning volume of data that needs to be reviewed in digital investigations. The result: case backlogs that commonly stretch for months at forensic labs, and per-case processing that occupies days or weeks of analytical effort. Yet speed is critical in situations where delay may render the evidence useless or endanger personal safety, such as when a suspect may flee, a victim is at risk, or criminal tactics or control infrastructure may change. In these and other cases, investigators need tools that enable quick triage of computer evidence in order to answer urgent questions, maintain the pace of an investigation, and assess the likelihood of acquiring pertinent information from the device. This paper details the design and application of a tool, OpenLV, that not only meets the need for speedy initial triage but can also facilitate the review of digital evidence at later stages of an investigation. With OpenLV, an investigator can quickly and safely interact with collected evidence, much as if they had sat down at the computer at the time the evidence was collected. Since OpenLV works without modifying the evidence, its use in triage does not preclude subsequent in-depth forensic analysis. Unlike many popular forensics tools, OpenLV requires little training and facilitates an unprecedented level of interaction with the evidence.
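The non-modifying interaction that the abstract describes is generally achieved by booting the evidence behind a copy-on-write overlay, so that all writes land in a scratch file. A sketch of that general mechanism using the standard qemu-img and QEMU tools follows; the file names are hypothetical, and this illustrates the technique rather than OpenLV's actual implementation.

```python
import subprocess

EVIDENCE = "suspect_disk.dd"       # raw image, opened read-only throughout
OVERLAY = "triage_overlay.qcow2"   # scratch file that absorbs all writes

# Create a copy-on-write overlay backed by the read-only evidence image.
subprocess.run(
    ["qemu-img", "create", "-f", "qcow2",
     "-b", EVIDENCE, "-F", "raw", OVERLAY],
    check=True,
)

# Boot the overlay: the examiner interacts with a live system while the
# underlying evidence file remains bit-for-bit unchanged.
subprocess.run(
    ["qemu-system-x86_64", "-m", "2048",
     "-drive", f"file={OVERLAY},format=qcow2"],
    check=True,
)
```

Discarding the overlay restores a pristine state, which is what makes triage repeatable without compromising later in-depth analysis.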

10.
Progress in computer forensics research has been limited by the lack of standardized data sets (corpora) available for research purposes. We explain why corpora are needed to further forensic research, present a taxonomy for describing corpora, and announce the availability of several forensic data sets.

11.

12.
13.
Writing digital forensics (DF) tools is difficult because of the diversity of data types that need to be processed, the need for high performance, the skill set of most users, and the requirement that the software run without crashing. Developing this software is dramatically easier when one possesses a few hundred disks of other people's data for testing purposes. This paper presents some of the lessons learned by the author over the past 14 years developing DF tools and maintaining several research corpora that currently total roughly 30 TB.

14.
The current generation of Graphics Processing Units (GPUs) contains a large number of general-purpose processors, in sharp contrast to previous-generation designs, where special-purpose hardware units (such as texture and vertex shaders) were commonly used. This fact, combined with the prevalence of multicore general-purpose CPUs in modern workstations, suggests that performance-critical software such as digital forensics tools be “massively” threaded to take advantage of all available computational resources. Several trends in digital forensics make the availability of more processing power very important. These trends include a large increase in the average size (measured in bytes) of forensic targets, an increase in the number of digital forensics cases, and the development of “next-generation” tools that require more computational resources. This paper presents the results of a number of experiments that evaluate the effectiveness of offloading processing common to digital forensics tools to a GPU, using “massive” numbers of threads to parallelize the computation. These results are compared to speedups obtainable by simple threading schemes appropriate for multicore CPUs. Our results indicate that in many cases, the use of GPUs can substantially increase the performance of digital forensics tools.
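As a point of reference for the “simple threading schemes appropriate for multicore CPUs” used as the baseline, the sketch below spreads per-chunk hashing of a forensic target across all cores; the file name and chunk size are assumptions, and a GPU port would offload the same independent per-chunk work to thousands of device threads.

```python
import hashlib
from concurrent.futures import ProcessPoolExecutor

IMAGE = "evidence.dd"        # hypothetical forensic target
CHUNK = 16 * 1024 * 1024     # independent 16 MiB work units

def read_chunks(path):
    """Yield (offset, data) work units from the target."""
    with open(path, "rb") as f:
        offset = 0
        while data := f.read(CHUNK):
            yield offset, data
            offset += len(data)

def hash_chunk(job):
    """Per-chunk work: the kind of computation a GPU port would offload."""
    offset, data = job
    return offset, hashlib.sha256(data).hexdigest()

if __name__ == "__main__":
    # One worker process per core; chunks are hashed independently.
    with ProcessPoolExecutor() as pool:
        for offset, digest in pool.map(hash_chunk, read_chunks(IMAGE)):
            print(f"{offset:#014x}  {digest}")
```

Because each chunk is independent, the same decomposition maps directly onto either a handful of CPU workers or a massively threaded GPU kernel.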

15.
There is an alarming increase in the number of cybercrime incidents carried out through anonymous e-mails. The problem of e-mail authorship attribution is to identify the most plausible author of an anonymous e-mail from a group of potential suspects. Most previous contributions employed a traditional classification approach, such as decision trees or Support Vector Machines (SVM), to identify the author, and studied the effects of different writing-style features on classification accuracy. However, little attention has been given to ensuring the quality of the evidence. In this paper, we introduce a data mining method that captures the write-print of every suspect and models it as combinations of features that occur frequently in the suspect's e-mails. This notion, the frequent pattern, has proven effective in many data mining applications, but this is the first time it has been applied to the problem of authorship attribution. Unlike the traditional approach, the write-print extracted by our method is unique among the suspects and therefore provides convincing and credible evidence for presentation in a court of law. Experiments on real-life e-mails suggest that the proposed method can effectively identify the author, and the results are supported by strong evidence.
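A minimal sketch of the approach as the abstract describes it: mine feature combinations that occur frequently in one suspect's e-mails, then discard any combination that is also frequent for another suspect, leaving patterns that act as a distinguishing write-print. The style features, support threshold, and toy corpus below are all illustrative assumptions.

```python
from itertools import combinations

MIN_SUPPORT = 0.6  # a pattern must appear in >= 60% of a suspect's e-mails

def frequent_patterns(emails):
    """Return feature combinations frequent in one suspect's e-mails.

    Each e-mail is represented as a set of discretized style features,
    e.g. {"avg_word_len=short", "greeting=hi", "sig=none"}.
    """
    patterns = set()
    threshold = MIN_SUPPORT * len(emails)
    features = set().union(*emails)
    for size in (1, 2, 3):                       # bounded for the sketch
        for combo in combinations(sorted(features), size):
            support = sum(1 for e in emails if set(combo) <= e)
            if support >= threshold:
                patterns.add(combo)
    return patterns

def write_print(suspect, corpus):
    """Keep only patterns frequent for `suspect` and for no one else."""
    own = frequent_patterns(corpus[suspect])
    for other, emails in corpus.items():
        if other != suspect:
            own -= frequent_patterns(emails)
    return own

# Toy corpus: two suspects, three e-mails each (features are illustrative).
corpus = {
    "alice": [{"greeting=hi", "sig=none", "avg_word_len=short"},
              {"greeting=hi", "sig=none", "emoticons=yes"},
              {"greeting=hi", "sig=none", "avg_word_len=short"}],
    "bob":   [{"greeting=dear", "sig=full", "avg_word_len=long"},
              {"greeting=dear", "sig=full", "emoticons=no"},
              {"greeting=hi", "sig=full", "avg_word_len=long"}],
}
print(write_print("alice", corpus))  # patterns unique to alice's style
```

The surviving patterns are, by construction, frequent for exactly one suspect, which is what makes the write-print presentable as distinguishing evidence.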

16.

17.
Digital Investigation, 2014, 11(2): 102-110
Anti-forensic techniques have been developed to frustrate digital forensic investigations; consequently, methods for detecting anti-forensic behavior have been studied in various areas. In the area of user-activity analysis, “IconCache.db” files contain icon cache information related to applications, which can yield meaningful information for digital forensic investigations, such as traces of deleted files. A previous study investigated the general artifacts found in the IconCache.db file. In the present study, further features and structures of the IconCache.db file are described. We also propose methods for analyzing anti-forensic behaviors (e.g., time information related to the deletion of files). Finally, we introduce an analytical tool, developed based on the file structure of IconCache.db, that parses strings out of the file so that an analyst can examine it more easily.
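The string-parsing step the abstract mentions can be sketched generically: Windows stores path names as UTF-16LE, so candidate strings can be pulled from the binary file with a simple scan. The file name and minimum length are assumptions, and the actual tool walks IconCache.db's record structure rather than scanning blindly.

```python
import re

MIN_CHARS = 6  # ignore very short, likely accidental matches

def utf16le_strings(path, min_chars=MIN_CHARS):
    """Extract printable UTF-16LE strings from a binary file.

    Windows stores most path names as UTF-16LE, so runs of
    (printable ASCII byte, 0x00) pairs are good string candidates.
    """
    data = open(path, "rb").read()
    pattern = re.compile(rb"(?:[\x20-\x7e]\x00){%d,}" % min_chars)
    for match in pattern.finditer(data):
        yield match.start(), match.group().decode("utf-16-le")

if __name__ == "__main__":
    for offset, text in utf16le_strings("IconCache.db"):
        print(f"{offset:#010x}  {text}")
```

Recovered path strings can then be compared against the live file system to flag applications whose executables have since been deleted.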

18.
19.
Digital Investigation, 2014, 11(3): 234-248
Interpretation of traces found on Android devices is an important aspect of mobile forensics. This is especially true for timestamps encountered on the device under investigation. In the presence of both naive and UTC timestamps, some form of timestamp normalisation is required. In addition, the investigator needs to gain some understanding of potential clock skew that may exist, especially when evidence from the device under investigation has to be correlated with real-world events or evidence from other devices. A case study is presented in which the time zone on the Android device was set incorrectly, while the clock was set to correspond to the time zone where the device was actually located. Initially, the fact that both time zones enforced daylight saving time (DST) during different periods was expected to complicate timestamp normalisation. However, it was found that the version of the Time Zone Database on the device was outdated and did not correspond to the actual time zone rules for the given period. After the case study, the results of experiments on a broader range of devices are presented. Among other things, these results demonstrate a method to detect clock skew based on the mmssms.db database. However, it was also found that the applicability of this method is highly dependent on specific implementation choices made by different vendors.
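The mmssms.db method can be sketched as follows: in the common Android schema, each received SMS stores both a device-clock timestamp (date) and the network's SMSC timestamp (date_sent), so their difference gives a per-message skew estimate. The column semantics are an assumption to verify per device, which is exactly the vendor dependence the abstract warns about.

```python
import sqlite3

def skew_estimates(db_path):
    """Estimate clock skew (device minus network, in seconds) per SMS.

    Assumes the stock-Android schema where `date` is the device clock
    at receipt and `date_sent` is the SMSC timestamp, both in ms;
    type = 1 marks received messages.
    """
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT date, date_sent FROM sms "
        "WHERE type = 1 AND date_sent > 0"
    ).fetchall()
    conn.close()
    return [(device - network) / 1000.0 for device, network in rows]

if __name__ == "__main__":
    skews = skew_estimates("mmssms.db")
    if skews:
        skews.sort()
        print(f"median skew: {skews[len(skews) // 2]:+.1f} s "
              f"over {len(skews)} messages")
```

A consistent non-zero median suggests the device clock was skewed relative to network time, which is the correction needed before correlating with external evidence.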

20.
This paper discusses the challenges of performing a forensic investigation of a multi-node Hadoop cluster and proposes a methodology for examiners to use in such situations. The procedure's aim of minimising disruption to the data centre during the acquisition process is achieved through the use of RAM forensics. This affords initial cluster reconnaissance, which in turn facilitates targeted data acquisition on the identified DataNodes. To evaluate the methodology's feasibility, a small Hadoop Distributed File System (HDFS) was configured and forensic artefacts were simulated upon it by deleting data originally stored in the cluster. RAM acquisition and analysis were then performed on the NameNode in order to test the validity of the suggested methodology. The results are cautiously positive in establishing that RAM analysis of the NameNode can be used to pinpoint the data blocks affected by the attack, allowing a targeted approach to the acquisition of data from the DataNodes, provided that the physical locations can be determined. A full forensic analysis of the DataNodes was beyond the scope of this project.
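HDFS names blocks with identifiers of the form blk_<id>, so a first approximation of the "pinpoint the data blocks" step is to carve such identifiers out of the NameNode RAM dump. A hedged sketch follows; the dump file name is an assumption, and a real analysis would parse the NameNode's in-memory data structures rather than pattern-match.

```python
import mmap
import re
from collections import Counter

BLOCK_ID = re.compile(rb"blk_-?\d{4,}")  # HDFS block identifier pattern

def carve_block_ids(ram_dump):
    """Scan a raw memory image for HDFS block identifiers."""
    seen = Counter()
    with open(ram_dump, "rb") as dump:
        with mmap.mmap(dump.fileno(), 0, access=mmap.ACCESS_READ) as mem:
            for match in BLOCK_ID.finditer(mem):
                seen[match.group().decode()] += 1
    return seen

if __name__ == "__main__":
    # Dump file name is illustrative; frequently referenced IDs are
    # candidates for targeted acquisition on the DataNodes.
    for block_id, count in carve_block_ids("namenode_ram.lime").most_common(20):
        print(f"{count:6d}  {block_id}")
```

Mapping the carved block IDs to physical DataNode locations is what enables the targeted acquisition the methodology proposes.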
