
Showing papers in "International Journal of Digital Evidence in 2002"


Journal Article
TL;DR: This paper explores the development of the digital forensics process, compares and contrasts four particular forensic methodologies, and finally proposes an integrated methodology that encompasses the forensic analysis of all genres of digital crime scene investigations.
Abstract: Law enforcement is in a perpetual race with criminals in the application of digital technologies, and requires the development of tools to systematically search digital devices for pertinent evidence. Another part of this race, and perhaps more crucial, is the development of a methodology in digital forensics that encompasses the forensic analysis of all genres of digital crime scene investigations. This paper explores the development of the digital forensics process, compares and contrasts four particular forensic methodologies, and finally proposes an integrated methodology that encompasses the forensic analysis of all genres of digital crime scene investigations.

487 citations


Journal Article
TL;DR: In this paper, the authors discuss inherent uncertainties in network related evidence that can be compounded by data corruption, loss, tampering, or errors in interpretation and analysis, and introduce methods of estimating and categorizing uncertainty in digital data.
Abstract: Despite the potentially grave ramifications of relying on faulty information in the investigative or probative stages, the uncertainty in digital evidence is not being evaluated at present, thus making it difficult to assess the reliability of evidence stored on and transmitted using computer networks. As scientists, forensic examiners have a responsibility to reverse this trend and address formally the uncertainty in any evidence they rely on to reach conclusions. This paper discusses inherent uncertainties in network-related evidence that can be compounded by data corruption, loss, tampering, or errors in interpretation and analysis. Methods of estimating and categorizing uncertainty in digital data are introduced and examples are presented.
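
The paper's own scale for categorizing uncertainty is not reproduced in this abstract; the following is only a minimal sketch, assuming a hypothetical four-level certainty scale and a simple corroboration count, of what such a categorization step could look like in code:

```python
from dataclasses import dataclass

# Hypothetical certainty levels for illustration only; the paper defines its
# own categories, which are not reproduced here.
LEVELS = ["erroneous/tampered", "possible", "probable", "almost certain"]

@dataclass
class EvidenceItem:
    description: str
    independent_sources: int   # how many independent sources corroborate it
    tamper_indicators: bool    # signs of corruption or tampering

def categorize(item: EvidenceItem) -> str:
    """Assign a coarse certainty category from corroboration and tampering signs."""
    if item.tamper_indicators:
        return LEVELS[0]
    if item.independent_sources >= 3:
        return LEVELS[3]
    if item.independent_sources == 2:
        return LEVELS[2]
    return LEVELS[1]

print(categorize(EvidenceItem("Firewall log places host on LAN", 2, False)))  # probable
```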

138 citations


Journal Article
TL;DR: The dramatic move from paper to bits combined with the ability and necessity to bring digital data to court, however, creates a critical question: how do the authors prove the integrity of this new form of information known as “digital evidence”?
Abstract: During the latter half of the 20th century, a dramatic move from paper to bits occurred. Our use of digital communication methods such as the world-wide-web and e-mail has dramatically increased the amount of information that is routinely stored in only a digital form. On October 1, 2000 the Electronic Signatures in Global and National Commerce Act was enacted, allowing transactions signed electronically to be enforceable in a court of law. (Longley) The dramatic move from paper to bits combined with the ability and necessity to bring digital data to court, however, creates a critical question. How do we prove the integrity of this new form of information known as “digital evidence”?
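
The abstract poses the question rather than prescribing an answer; one common building block for demonstrating integrity is a cryptographic hash computed at acquisition and re-verified later. A minimal sketch follows (the file name evidence.img is hypothetical):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Record the digest at acquisition time; recomputing it later and comparing
# the two values demonstrates that the copy has not been altered.
acquired = sha256_of("evidence.img")   # hypothetical file name
later = sha256_of("evidence.img")
assert acquired == later, "Evidence image has changed since acquisition"
```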

52 citations


Journal Article
TL;DR: The knowledge base processor provides the essential functions for symbolic reasoning, and establishes a framework for building the knowledge system which permits application program developers to exploit the best available conventional data processing capabilities.
Abstract: A knowledge base processor is callable by an application program to access a knowledge base and to govern the execution or interpretation of the knowledge base to find the values of selected objects or expressions defined in the knowledge base. The application program is written in a conventional computer language which specifies control by the ordering of program steps. The application program provides a user interface for input/output and provides top level control for calling the knowledge base processor to find values for goal expressions. During its search for the values of goal expressions, the knowledge base processor calls the application program to determine values of expressions which are not concluded by the knowledge base, and to signal important events during the execution of the knowledge base. Preferably the knowledge base processor and the application program each include a library of subroutines which are linked-loaded to provide a complete knowledge system for a specific application or task. Therefore, the knowledge base processor provides the essential functions for symbolic reasoning, and establishes a framework for building the knowledge system which permits application program developers to exploit the best available conventional data processing capabilities. The application programmer is free to exercise his or her knowledge and skill regarding the use of conventional programming languages and their support facilities such as utility libraries, optimizing compilers and user interfaces.
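
As a rough sketch of the control flow described above, with names and the rule format invented here rather than taken from the patent, the application program keeps top-level control and supplies a callback that the knowledge base processor invokes for any value the knowledge base cannot conclude on its own:

```python
# Minimal sketch: the application program keeps top-level control and I/O,
# while the knowledge base processor does the symbolic reasoning and calls
# back for values it cannot conclude itself.

class KnowledgeBaseProcessor:
    def __init__(self, rules, ask_application):
        self.rules = rules                      # {goal: (premise, value_if_true)}
        self.ask_application = ask_application  # callback into the application

    def find(self, goal):
        """Find the value of a goal expression, backward-chaining through rules."""
        if goal in self.rules:
            premise, value = self.rules[goal]
            return value if self.find(premise) else None
        # Not concluded by the knowledge base: ask the application program.
        return self.ask_application(goal)

# The application program: conventional, step-ordered control and user I/O.
def ask_user(expression):
    return input(f"What is the value of '{expression}'? ").strip().lower() == "yes"

rules = {"grant_access": ("credentials_valid", "access granted")}
kbp = KnowledgeBaseProcessor(rules, ask_user)
print(kbp.find("grant_access"))
```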

41 citations


Journal Article
TL;DR: Encryption can also delay investigations, increase their costs, and necessitate the use of investigative methods which are more dangerous or invasive of privacy.
Abstract: The threat [of encryption] is manifest in four ways: failure to get evidence needed for convictions, failure to get intelligence vital to criminal investigations, failure to avert catastrophic or harmful attacks, and failure to get foreign intelligence vital to national security. Encryption can also delay investigations, increase their costs, and necessitate the use of investigative methods which are more dangerous or invasive of privacy. (Denning, Baugh, 1997a)

36 citations


Journal Article
TL;DR: This paper discusses some of the unique military requirements and challenges in Cyber Forensics, and how these technologies and capabilities are transferable to civilian law enforcement, critical infrastructure protection, and industry.
Abstract: This paper discusses some of the unique military requirements and challenges in Cyber Forensics. A definition of Cyber Forensics is presented in a military context. Capabilities needed to perform cyber forensic analysis in a networked environment are discussed, along with a list of current shortcomings in providing these capabilities and a technology needs list. Finally, it is shown how these technologies and capabilities are transferable to civilian law enforcement, critical infrastructure protection, and industry.

23 citations


Journal Article
TL;DR: Analysis of the registry indicated the system time zone was set to Pacific Standard Time and the MAC times of the files involved in the intrusion indicated that SUBJECT intruded on VICTIM systems on December 5, 1998.
Abstract:
• The MAC times of the files involved in the intrusion indicated that SUBJECT intruded on VICTIM systems on December 5, 1998, 1:00am (CST).
• Analysis of the registry indicated the system time zone was set to Pacific Standard Time.
• Analysis of the Internet History file corroborated the MAC times by indicating that the intrusion occurred on December 4, 1998, 11:00pm (PST) or December 5, 1998, 1:00am (CST).
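
The time-zone arithmetic behind these findings can be checked directly; for example, using Python's zoneinfo database, 1:00am CST on December 5, 1998 is the same instant as 11:00pm PST on December 4, 1998:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# The intrusion time as recorded in Central Standard Time.
central = datetime(1998, 12, 5, 1, 0, tzinfo=ZoneInfo("America/Chicago"))

# The same instant expressed in Pacific Standard Time (the system's time zone).
pacific = central.astimezone(ZoneInfo("America/Los_Angeles"))

print(central)  # 1998-12-05 01:00:00-06:00
print(pacific)  # 1998-12-04 23:00:00-08:00
```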

17 citations


Journal Article
TL;DR: A web-based Lessons Learned Repository (LLR) is described that facilitates contribution of Lessons, and their subsequent retrieval in the Law Enforcement community.
Abstract: The Law Enforcement community possesses a large, but informal, community memory with respect to digital forensics. Large, because the experiences of every forensics technician and investigator contribute to the whole. Informal, because there is seldom an explicit mechanism for disseminating this wisdom except “over the water cooler”. As a consequence, the same problems and mistakes continue to resurface and the same solutions are re-invented. In order to better exploit this informal collection of wisdom, the key points of each experience can be placed into a Repository for later dissemination. We describe a web-based Lessons Learned Repository (LLR) that facilitates contribution of Lessons, and their subsequent retrieval.
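
Purely as an illustrative sketch, with the data structure and field names invented here rather than taken from the paper, contribution and keyword-based retrieval of lessons might be modeled like this:

```python
from dataclasses import dataclass, field

@dataclass
class Lesson:
    title: str
    problem: str
    solution: str
    keywords: set = field(default_factory=set)

class LessonsLearnedRepository:
    """Toy in-memory stand-in for the web-based LLR described in the paper."""
    def __init__(self):
        self.lessons = []

    def contribute(self, lesson: Lesson):
        self.lessons.append(lesson)

    def retrieve(self, *keywords):
        """Return lessons whose keyword sets overlap the query terms."""
        query = {k.lower() for k in keywords}
        return [l for l in self.lessons if l.keywords & query]

repo = LessonsLearnedRepository()
repo.contribute(Lesson("Write-blocker firmware quirk",
                       "Drive not recognized during imaging",
                       "Update write-blocker firmware before acquisition",
                       {"imaging", "write-blocker"}))
print([l.title for l in repo.retrieve("imaging")])
```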

14 citations


Journal Article
TL;DR: The invention provides the significant advantages of decaying small image features, such as speckle noise, at a significantly faster rate than large image features, such as target returns.
Abstract: A method of operating a computing machine for reducing speckle noise in video images, particularly radar images, utilizes a complementary hulling technique on vertical pixel grids of the array. The vertical pixel contours which are subjected to the complementary hulling are derived from intersections of vertical grids with conceptual superposed gray-scale surfaces which have front end values corresponding to the gray-scale pixel values. The invention provides the significant advantages of decaying small image features, such as speckle noise, at a significantly faster rate than large image features, such as target returns.

13 citations


Journal Article
TL;DR: The objective of Onion Routing is to make it impossible for third parties to perform traffic analysis by applying cryptographic techniques to networking; the result is, in effect, a sort of “no cache” system.
Abstract: The objective of Onion Routing is to make it completely impossible for third parties to perform traffic analysis. This goal is achieved by applying cryptographic techniques to networking. The packets transiting the chain of onion routers thus appear anonymous. Yes, we are talking about a chain. Practically speaking, there is a group of onion routers distributed around the public network, each of which has the task of encrypting the socket connections and acting in turn as a proxy. Experiments with Onion Routing have already been carried out on Sun Solaris 2.4 using proxies for http (www) and RLOGIN. At the moment, proxy operations are planned for e-mail (SMTP), FTP and a slew of other protocols. Let’s imagine we have to make an http transaction. This is how it works:
1) The application does not connect directly to the destination Web server, but rather to a socket connection with an Onion Routing proxy;
2) The Onion Routing proxy establishes a direct anonymous connection with its nearest sister. To guarantee the impossibility of interceptions, the first Onion Routing proxy makes another connection with others of its ilk to complete the chain. To avoid hijacking and man-in-the-middle phenomena, the communication between onion routers is forced. Practically speaking, each onion router is only able to identify and dialog with its adjacent kin included in the route. Each packet can currently make a maximum of 11 hops, after which it must reach its destination;
3) Each time an onion router handles a transaction, it strips away a layer of encryption with respect to the preceding hop. This means that at the end of the route the packet arrives in cleartext. This is one of the first problems an investigator may encounter. Practically speaking, both because of the encryption and because at each hop the link to the preceding routing point is literally stripped away, traceback becomes impossible. The only way to carry out an effective investigation is to implement a logging function at the proxy level, as we will describe in greater detail below;
4) In addition, the encryption and transmission of data through the links of the chain is carried out randomly in such a way as to render impossible any sort of “sequence prediction”. Furthermore, whenever the connection is interrupted, for any reason, all information relating to a given transaction is deleted from the rest of the chain. It is basically a sort of “no cache” system.
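
A minimal sketch of the layered ("onion") encryption described in steps 2 and 3, using the third-party cryptography package's Fernet primitive; the three-hop route and the key handling are simplified for illustration:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# One symmetric key per onion router on the route (key handling simplified).
route_keys = [Fernet.generate_key() for _ in range(3)]

def build_onion(payload: bytes, keys) -> bytes:
    """Wrap the payload in one encryption layer per router, last hop innermost."""
    onion = payload
    for key in reversed(keys):          # encrypt for the last hop first
        onion = Fernet(key).encrypt(onion)
    return onion

def peel_layer(onion: bytes, key: bytes) -> bytes:
    """What each onion router does: strip exactly one layer of encryption."""
    return Fernet(key).decrypt(onion)

message = b"GET / HTTP/1.0"
onion = build_onion(message, route_keys)

# Each router in turn removes its layer; the last hop sees cleartext.
for key in route_keys:
    onion = peel_layer(onion, key)

assert onion == message
```

In this model each router holds only its own key and sees only its adjacent hops, which is why, as the abstract notes, traceback without proxy-level logging becomes impossible.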

9 citations


Journal Article
TL;DR: The authors of this initial edition of the International Journal of Digital Evidence (IJDE) have an opportunity to identify, prioritize, and focus upon some of the most important aspects of digital evidence, free of irrelevant influences.
Abstract: Digital Evidence: The Moral Challenge. Tom Talleur, Managing Director, KPMG LLP’s Forensic Practice. My colleagues, co-founders, and I are fortunate to have this opportunity to characterize a framework for discourse on the topic of digital evidence in this initial edition of the International Journal of Digital Evidence (IJDE). In this respect, we have an opportunity to identify, prioritize, and focus upon some of the most important aspects of this issue, free of irrelevant influences.

Journal Article
TL;DR: It is asserted that a constructive formalization of peripheral input and output for a computer can address lack of verifiability, as well as several other concerns.
Abstract: Currently, it is not practical for any single software system to perform forensically acceptable verification of the contents of all possible file systems on a disk, let alone the contents of more esoteric peripherals. Recent court decisions that require judges to restrict testimony based on their understanding of the validity of the science behind it will only make such verification even more difficult. This problem, critical to forensic examiners, is actually symptomatic of a larger problem, which lies partly in the domain of digital forensics and partly in the domain of pure computer science. Lack of verifiability, along with a host of other problems, points to inadequate formal description of file systems and I/O methodology. A review of the literature finds, in fact, that little effort has been put into such formalization. We assert that a constructive formalization of peripheral input and output for a computer can address this and several other concerns.
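
The paper argues for such a formalization rather than presenting one in this abstract; purely as an invented toy illustration, a block device can be modeled as a total mapping from sector numbers to fixed-size blocks, against which a property such as read-after-write can be checked:

```python
# Toy formal model of a block device: a total mapping from sector numbers to
# fixed-size blocks. Invented for illustration; not the formalization the
# paper proposes.

SECTOR_SIZE = 512

class BlockDevice:
    def __init__(self, sectors: int):
        self.state = {i: bytes(SECTOR_SIZE) for i in range(sectors)}

    def read(self, sector: int) -> bytes:
        return self.state[sector]

    def write(self, sector: int, data: bytes) -> None:
        assert len(data) == SECTOR_SIZE, "writes are whole sectors"
        self.state[sector] = data

# A property a verification effort might check: reading a sector returns the
# last value written to it, and writes do not disturb other sectors.
dev = BlockDevice(sectors=8)
payload = b"A" * SECTOR_SIZE
before = dev.read(3)
dev.write(5, payload)
assert dev.read(5) == payload
assert dev.read(3) == before
```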