Similar articles
 20 similar articles found (search time: 15 ms)
1.
《Science & justice》2021,61(5):477-492
Software invisibly permeates our everyday lives: operating devices in our physical world (traffic lights and cars), effecting our business transactions and powering the vast World Wide Web. We have come to rely on such software to work correctly and efficiently. The generally accepted narrative is that any software errors that do occur can be traced back to a human operator’s mistakes. Software engineers know that this is merely a comforting illusion. Software sometimes has bugs, which might lead to erratic performance: intermittently generating errors. The software, hardware and communication infrastructure can all introduce errors, which are often challenging to isolate and correct. Anomalies that manifest are certainly not always due to an operator’s actions. When the general public and the courts believe the opposite, that errors are usually attributable to some human operator’s actions, it is entirely possible for some hapless innocent individual to be blamed for anomalies and discrepancies whose actual source is a software malfunction. This is what occurred in the Post Office Horizon IT case, where unquestioning belief in the veracity of software-generated evidence led to a decade of wrongful convictions. We will use this case as a vehicle to demonstrate the way biases can influence investigations, and to inform the development of a framework to guide and inform objective digital forensics investigations. This framework, if used, could go some way towards neutralising biases and preventing similar miscarriages of justice in the future.

2.
Abstract

Various terms have been used to describe the intersection between computing technology and violations of the law, including computer crime, electronic crime, and cybercrime. While there remains little agreement on terminology, most experts agree that the use of electronic devices to commit crime has increased dramatically and is now commonplace. It is the role of the digital investigator to bring cybercriminals to justice. Cybercrime, however, differs from traditional crime and presents a variety of unique challenges, including the variety of electronic devices available, the amount of data produced by these devices, the absence of standard practices and guidelines for analyzing that data, the lack of qualified personnel to perform investigations, and the lack of resources to provide ongoing training. This paper examines these challenges.

3.
In this paper we present an approach to digital forensics specification based on forensic policy definition. Our methodology borrows from computer security policy specification, which has accumulated a significant body of research over the past 30 years. We first define the process of specifying forensics properties through a forensics policy and then present an example application of the process. This approach lends itself to formal policy specification and verification, which would allow for more clarity and less ambiguity in the specification process.
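The policy-based approach can be illustrated with a toy example. The sketch below is not the authors' formalism; it simply models a forensic policy as data (each action's prerequisites, e.g. an image must be hashed before analysis) and checks an action log against it. All action names are hypothetical.

```python
# A toy forensic policy: each action lists its prerequisites.
# A log of actions complies iff every prerequisite occurred earlier.
policy = {
    "acquire": [],
    "hash":    ["acquire"],
    "analyze": ["acquire", "hash"],   # never analyze an unhashed image
}

def complies(actions, policy):
    """Return True iff every action's prerequisites precede it in the log."""
    done = set()
    for a in actions:
        if any(pre not in done for pre in policy[a]):
            return False
        done.add(a)
    return True

print(complies(["acquire", "hash", "analyze"], policy))  # True
print(complies(["acquire", "analyze"], policy))          # False
```

Because the policy is explicit data rather than informal practice, a compliance check like this can be automated and, in a richer formalism, verified.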

4.
5.
6.
Smart cities comprise diverse, interconnected components that constantly exchange data and facilitate improved living for a nation's population. Our view of a typical smart city consists of four key components, namely Smart Grids, Building Automation Systems (BAS), Unmanned Aerial Vehicles (UAVs), and Smart Vehicles, together with enabling Internet of Things (IoT) sensors and the Cloud platform. The adversarial threats and criminal misuses in a smart city are increasingly heterogeneous and significant, making the provisioning of resilient, end-to-end security a daunting task. When a cyber incident involving critical components of the smart city infrastructure occurs, appropriate measures can be taken to identify and enumerate concrete evidence to facilitate the forensic investigation process. Forensic preparedness and lessons learned from past forensic analysis can help protect the smart city against future incidents. This paper presents a holistic view of the security landscape of a smart city, identifying security threats and providing deep insight into digital investigation in the context of the smart city.

7.
《Digital Investigation》2014,11(2):102-110
Anti-forensic techniques have been developed to hinder digital forensic investigations, and methods for detecting anti-forensic behavior have accordingly been studied in various areas. In the area of user activity analysis, “IconCache.db” files contain icon cache information related to applications, which can yield meaningful information for digital forensic investigations, such as traces of deleted files. A previous study investigated the general artifacts found in the IconCache.db file. In the present study, further features and structures of the IconCache.db file are described. We also propose methods for analyzing anti-forensic behaviors (e.g., time information related to the deletion of files). Finally, we introduce an analytical tool, developed on the basis of the IconCache.db file structure, that parses strings out of the file so that an analyst can examine it more easily.
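The paper's tool itself is not shown here, but the string-parsing idea can be sketched generically. The snippet below carves printable ASCII and UTF-16LE strings from an arbitrary binary blob, the basic operation a parser for a file like IconCache.db (whose records embed application paths) performs; the blob and the file path in it are invented for the demo.

```python
import re

def carve_strings(data: bytes, min_len: int = 4):
    """Carve printable ASCII and UTF-16LE strings from a binary blob."""
    ascii_pat = re.compile(rb"[\x20-\x7e]{%d,}" % min_len)
    utf16_pat = re.compile(rb"(?:[\x20-\x7e]\x00){%d,}" % min_len)
    found = [m.group().decode("ascii") for m in ascii_pat.finditer(data)]
    found += [m.group().decode("utf-16-le") for m in utf16_pat.finditer(data)]
    return found

# Synthetic blob standing in for icon-cache content: the path of a
# deleted executable may linger here after the file itself is gone.
blob = (b"\x00\x01" + "C:\\Tools\\wiper.exe".encode("utf-16-le")
        + b"\xff\xfe" + b"notepad.exe" + b"\x00")
print(carve_strings(blob))
```

A real parser would walk the file's record structure rather than carve blindly, but even this crude pass recovers application names and paths.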

8.
Digital forensics has recently become increasingly important, as it is used by investigative agencies as well as the corporate and private sectors. For digital evidence to overcome its capacity limitations and be recognized in court, it is essential to establish an environment that ensures the integrity of the entire process, from collecting and analyzing digital evidence to submitting it to court. In this study, common elements were extracted by comparing and analyzing the ISO/IEC 17025 and 27001 standards and the Interpol and Council of Europe (CoE) guidelines to derive the components necessary for building a digital forensic laboratory. Subsequently, a three-round Delphi survey with verification was conducted with 21 digital forensics experts in the field. As a result, 40 components across seven areas were derived. The results address the establishment, operation, management, and accreditation of a digital forensics laboratory suited to the domestic (Korean) environment, with credibility added through the collected opinions of the 21 experts. This study can be referred to when establishing digital forensic laboratories in national, public, and private organizations, and can serve as competency measurement criteria for courts evaluating the reliability of analysis results.

9.
10.
Speed has recently become one of the key issues in digital forensics. Thanks to advances in storage technology, we can now buy larger hard disk drives at lower prices than before. For forensic investigators, however, this means that tremendous time and effort are required throughout the process of creating forensic images, searching them, and analyzing them. To address this problem, several methods have been proposed to improve the performance of forensic tools. One approach attracting attention is hardware acceleration; so far, however, it has been limited to evidence cloning and password cracking, and is rarely used for searching and analyzing digital evidence. In this paper, we design and implement a high-speed search engine using a Tarari content processor. Furthermore, we demonstrate the feasibility of our approach by comparing its performance and features to those of a popular forensic tool currently on the market.
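Hardware acceleration aside, the core task of such a search engine is streaming an image and matching several keywords at once without missing hits that straddle buffer boundaries. The sketch below is a minimal software illustration of that idea, not the paper's Tarari-based implementation; real engines use pattern automata or dedicated hardware rather than repeated `find` calls.

```python
import io

def scan_image(read_chunk, patterns, chunk_size=1 << 20):
    """Stream a disk image and report absolute offsets of each keyword.
    Chunks overlap by (longest pattern - 1) bytes so that hits spanning
    a chunk boundary are not missed."""
    overlap = max(len(p) for p in patterns) - 1
    hits = {p: [] for p in patterns}
    buf, base = b"", 0
    while True:
        chunk = read_chunk(chunk_size)
        if not chunk:
            break
        buf += chunk
        for p in patterns:
            start = 0
            while (i := buf.find(p, start)) != -1:
                off = base + i
                if off not in hits[p]:   # overlap region may be rescanned
                    hits[p].append(off)
                start = i + 1
        if len(buf) > overlap:           # keep only the overlap tail
            keep = len(buf) - overlap
            base += keep
            buf = buf[keep:]
    return hits

# Demo on an in-memory "image" with a deliberately small chunk size.
image = io.BytesIO(b"A" * 100 + b"secret" + b"B" * 50 + b"evidence")
print(scan_image(image.read, [b"secret", b"evidence"], chunk_size=64))
```

Here `read_chunk` is any callable returning successive byte blocks, so the same routine works on a file object opened over a multi-terabyte image.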

11.
Progress in computer forensics research has been limited by the lack of standardized data sets (corpora) available for research purposes. We explain why corpora are needed to further forensic research, present a taxonomy for describing corpora, and announce the availability of several forensic data sets.

12.
Speed has recently become one of the key issues in digital forensics. Thanks to advances in storage technology, we can now buy larger hard disk drives at lower prices than before. For forensic investigators, however, this means that tremendous time and effort are required throughout the process of creating forensic images, searching them, and analyzing them. To address this problem, several methods have been proposed to improve the performance of forensic tools. One approach attracting attention is hardware acceleration; so far, however, it has been limited to evidence cloning and password cracking, and is rarely used for searching and analyzing digital evidence. In this paper, we design and implement a high-speed search engine using a Tarari content processor. Furthermore, we demonstrate the feasibility of our approach by comparing its performance and features to those of a popular forensic tool currently on the market.

13.
As electronic documents become more important and valuable in the modern era, attempts are invariably made to gain undue advantage by tampering with them. Tampering with the modification, access, and creation date and time stamps (MAC DTS) of digital documents poses a great threat and is a major handicap in digital forensic investigation. Authentic date and time stamps (ADTS) can provide crucial evidence linking crime to criminal in cases of Computer Fraud and Cyber Crimes (CFCC) through reliable timelining of digital evidence. But the ease with which the MAC DTS of stored digital documents can be changed raises serious questions about the integrity and admissibility of digital evidence, potentially leading to its rejection in a court of law. The MAC DTS procedures of popular operating systems are inherently flawed: they were created for the sake of convenience, not with security and digital forensic aspects in mind. This paper explores these issues in the context of the Ext2 file system and proposes one solution for systems with preinstalled plug-ins in the form of Loadable Kernel Modules, which provide the capability to preserve ADTS.
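The ease of tampering the paper describes is simple to demonstrate: on most operating systems a single unprivileged call rewrites a file's access and modification times. The snippet below backdates a temporary file's timestamps (creation time is not generally settable through this portable interface).

```python
import os, tempfile, time

# Create a file, then backdate its access/modification timestamps.
fd, path = tempfile.mkstemp()
os.close(fd)

before = os.stat(path).st_mtime
fake = time.mktime((2001, 1, 1, 0, 0, 0, 0, 0, -1))  # 1 Jan 2001, local time
os.utime(path, (fake, fake))          # one unprivileged call is enough
after = os.stat(path).st_mtime

print(before != after, time.localtime(after).tm_year)  # True 2001
os.remove(path)
```

Nothing in the file system records that the change happened, which is exactly why the paper argues for a kernel-level mechanism to preserve authentic timestamps.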

14.
Writing digital forensics (DF) tools is difficult because of the diversity of data types that need to be processed, the need for high performance, the skill set of most users, and the requirement that the software run without crashing. Developing this software is dramatically easier when one possesses a few hundred disks of other people's data for testing purposes. This paper presents some of the lessons learned by the author over the past 14 years developing DF tools and maintaining several research corpora that currently total roughly 30 TB.

15.
The continuing decline in the cost-per-megabyte of hard disk storage has inevitably led to a ballooning volume of data that needs to be reviewed in digital investigations. The result: case backlogs that commonly stretch for months at forensic labs, and per-case processing that occupies days or weeks of analytical effort. Yet speed is critical in situations where delay may render the evidence useless or endanger personal safety, such as when a suspect may flee, a victim is at risk, or criminal tactics or control infrastructure may change. In these and other cases, investigators need tools that enable quick triage of computer evidence in order to answer urgent questions, maintain the pace of an investigation, and assess the likelihood of acquiring pertinent information from the device.

This paper details the design and application of a tool, OpenLV, that not only meets the need for speedy initial triage but can also facilitate the review of digital evidence at later stages of an investigation. With OpenLV, an investigator can quickly and safely interact with collected evidence, much as if they had sat down at the computer at the time the evidence was collected. Since OpenLV works without modifying the evidence, its use in triage does not preclude subsequent in-depth forensic analysis. Unlike many popular forensics tools, OpenLV requires little training and facilitates an unprecedented level of interaction with the evidence.

16.
The current generation of Graphics Processing Units (GPUs) contains a large number of general-purpose processors, in sharp contrast to previous-generation designs, where special-purpose hardware units (such as texture and vertex shaders) were commonly used. This fact, combined with the prevalence of multicore general-purpose CPUs in modern workstations, suggests that performance-critical software such as digital forensics tools be “massively” threaded to take advantage of all available computational resources.

Several trends in digital forensics make the availability of more processing power very important. These trends include a large increase in the average size (measured in bytes) of forensic targets, an increase in the number of digital forensics cases, and the development of “next-generation” tools that require more computational resources. This paper presents the results of a number of experiments that evaluate the effectiveness of offloading processing common to digital forensics tools to a GPU, using “massive” numbers of threads to parallelize the computation. These results are compared to the speedups obtainable with simple threading schemes appropriate for multicore CPUs. Our results indicate that in many cases, the use of GPUs can substantially increase the performance of digital forensics tools.
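As a point of comparison, a simple CPU threading scheme of the kind such experiments benchmark against can be sketched as follows: block-level hashing of an image, a typical embarrassingly parallel forensic task. The block size and data are illustrative; note that in CPython, `hashlib` releases the GIL while digesting buffers larger than about 2 KiB, so plain threads genuinely use multiple cores here.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

BLOCK = 4096  # large enough that hashlib releases the GIL per digest

def hash_block(block: bytes) -> str:
    """Hash one block; block hashes support sub-file evidence matching."""
    return hashlib.md5(block).hexdigest()

def hash_image(data: bytes, workers: int = 4):
    """Split an image into fixed-size blocks and hash them on a thread pool.
    A GPU version of the same idea maps one block to one GPU thread."""
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(hash_block, blocks))

digests = hash_image(bytes(32768))        # 32 KiB of zeros -> 8 blocks
print(len(digests), len(set(digests)))    # 8 1
```

The same decomposition, scaled from a handful of CPU threads to thousands of GPU threads, is what makes the offloading evaluated in the paper attractive.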

17.
18.
Due to the democratisation of new technologies, computer forensics investigators have to deal with volumes of data that are becoming increasingly large and heterogeneous. Indeed, on a single machine, hundreds of events occur per minute, produced and logged by the operating system and various software. The identification of evidence, and more generally the reconstruction of past events, is therefore a tedious and time-consuming task for investigators. Our work aims at automatically reconstructing and analysing the events related to a digital incident, while respecting legal requirements. To tackle the three main problems (volume, heterogeneity, and legal requirements), we identify seven criteria that an efficient reconstruction tool must meet. This paper introduces an approach based on a three-layered ontology, called ORD2I, to represent any digital event. ORD2I is associated with a set of operators to analyse the resulting timeline and to ensure the reproducibility of the investigation.
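Although ORD2I itself is an ontology, the heterogeneity problem it targets can be illustrated with a minimal sketch: events from differently formatted sources are normalised to UTC datetimes and merged into one timeline. The two sources and their records below are invented for the example.

```python
from datetime import datetime, timezone

# Heterogeneous sources record time differently; normalising everything
# to timezone-aware UTC datetimes is the first step of reconstruction.
syslog  = [("2023-05-01 10:00:02", "sshd login")]       # text timestamps
browser = [(1682935205, "visited example.org")]         # epoch seconds

events = [(datetime.strptime(t, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc), msg)
          for t, msg in syslog]
events += [(datetime.fromtimestamp(t, tz=timezone.utc), msg)
           for t, msg in browser]

timeline = sorted(events)                               # one unified timeline
print([msg for _, msg in timeline])
```

An ontology-based representation goes further, attaching typed entities and provenance to each event so that analysis operators can query the timeline reproducibly.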

19.
20.
The ability to reconstruct the data stored in a database at an earlier time is an important aspect of database forensics. Past research shows that the log file in a database can be useful for reconstruction. However, in many database systems there are various options that control which information is included in the logs. This paper introduces the notion of the ideal log setting necessary for an effective reconstruction process in database forensics. The paper provides a survey of the default logging preferences in some of the popular database management systems and identifies the information that a database log should contain in order to be useful for reconstruction. The challenges that may be encountered in storing the information as well as ways of overcoming the challenges are discussed. Possible logging preferences that may be considered as the ideal log setting for the popular database systems are also proposed. In addition, the paper relates the identified requirements to the three dimensions of reconstruction in database forensics and points out the additional requirements and/or techniques that may be required in the different dimensions.
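The reconstruction idea can be illustrated with a toy model (not any specific DBMS's log format): if each log record stores the row's pre-image, the state at an earlier time is recovered by undoing, in reverse order, every change logged after that time.

```python
# Each record stores the row's pre-image ("old"), the kind of information
# a log must contain for state reconstruction to be possible at all.
log = [
    {"t": 1, "row": "acct42", "old": None, "new": 100},   # INSERT
    {"t": 2, "row": "acct42", "old": 100,  "new": 250},   # UPDATE
    {"t": 3, "row": "acct42", "old": 250,  "new": None},  # DELETE
]

def state_at(log, t, current=None):
    """Undo, in reverse time order, every change logged after time t."""
    state = dict(current or {})
    for rec in sorted(log, key=lambda r: r["t"], reverse=True):
        if rec["t"] <= t:
            break
        if rec["old"] is None:
            state.pop(rec["row"], None)     # undo an INSERT
        else:
            state[rec["row"]] = rec["old"]  # restore the pre-image
    return state

print(state_at(log, 2))  # {'acct42': 250}
print(state_at(log, 0))  # {}
```

A log that records only post-images (or nothing at all for some operations) cannot support this undo pass, which is why default logging preferences matter so much for forensics.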
