
Showing papers on "Digital forensics" published in 2004


Book ChapterDOI
23 May 2004
TL;DR: This work describes several statistical techniques for detecting traces of digital tampering in the absence of any digital watermark or signature, and quantifies statistical correlations that result from specific forms of digital tampering.
Abstract: A digitally altered photograph, often leaving no visual clues of having been tampered with, can be indistinguishable from an authentic photograph. As a result, photographs no longer hold the unique stature as a definitive recording of events. We describe several statistical techniques for detecting traces of digital tampering in the absence of any digital watermark or signature. In particular, we quantify statistical correlations that result from specific forms of digital tampering, and devise detection schemes to reveal these correlations.
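As a hedged illustration of this correlation-based approach (not the paper's own detectors, which target subtler artifacts such as resampling correlations and recompression), the sketch below flags exact copy-move duplication by hashing overlapping pixel blocks: a cloned region yields identical hashes at two different offsets. Function and variable names are illustrative.

    import hashlib
    from collections import defaultdict
    import numpy as np

    def find_duplicated_blocks(gray, block=16):
        # Hash every overlapping block x block patch of a grayscale image
        # and report groups of offsets whose pixels match exactly. Exact
        # matching is a simplification; real detectors quantify statistical
        # correlations that survive smoothing and recompression.
        seen = defaultdict(list)
        h, w = gray.shape
        for y in range(h - block + 1):
            for x in range(w - block + 1):
                digest = hashlib.sha1(gray[y:y+block, x:x+block].tobytes()).digest()
                seen[digest].append((y, x))
        return [locs for locs in seen.values() if len(locs) > 1]

    # Synthetic example: clone one region of a random image onto another.
    img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    img[40:56, 40:56] = img[8:24, 8:24]      # simulated copy-move tampering
    print(find_duplicated_blocks(img))       # e.g. [[(8, 8), (40, 40)]]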

467 citations


Journal Article
TL;DR: A framework for digital forensics that includes an investigation process model based on physical crime scene procedures; the focus of the investigation is on the reconstruction of events using evidence so that hypotheses can be developed and tested.

260 citations


Journal Article
TL;DR: A brief overview of forensic models is presented and a new model based on the Integrated Digital Investigation Model is proposed, motivated by the concern that, under a flawed investigation process, a computer crime culprit may walk scot-free or an innocent suspect may be wrongly implicated.

246 citations


Journal ArticleDOI
TL;DR: The results indicated that education/training and certification were the most reported issues and lack of funding was the least reported, which further supports the criticism that there is a disproportional focus on the applied aspects of computer forensics, at the expense of the development of fundamental theories.

157 citations


Book
01 Jan 2004
TL;DR: This book presents digital forensics from a unique perspective because it examines the systems that create digital evidence in addition to the techniques used to find it, and introduces a powerful approach that can often recover evidence considered lost forever.
Abstract: "Don't look now, but your fingerprints are all over the cover of this book. Simply picking it up off the shelf to read the cover has left a trail of evidence that you were here."If you think book covers are bad, computers are worse. Every time you use a computer, you leave elephant-sized tracks all over it. As Dan and Wietse show, even people trying to be sneaky leave evidence all over, sometimes in surprising places."This book is about computer archeology. It's about finding out what might have been based on what is left behind. So pick up a tool and dig in. There's plenty to learn from these masters of computer security."--Gary McGraw, Ph.D., CTO, Cigital, coauthor of Exploiting Software and Building Secure Software "A wonderful book. Beyond its obvious uses, it also teaches a great deal about operating system internals."--Steve Bellovin, coauthor of Firewalls and Internet Security, Second Edition, and Columbia University professor "A must-have reference book for anyone doing computer forensics. Dan and Wietse have done an excellent job of taking the guesswork out of a difficult topic."--Brad Powell, chief security architect, Sun Microsystems, Inc. "Farmer and Venema provide the essential guide to 'fossil' data. Not only do they clearly describe what you can find during a forensic investigation, they also provide research found nowhere else about how long data remains on disk and in memory. If you ever expect to look at an exploited system, I highly recommend reading this book."--Rik Farrow, Consultant, author of Internet Security for Home and Office "Farmer and Venema do for digital archaeology what Indiana Jones did for historical archaeology. Forensic Discovery unearths hidden treasures in enlightening and entertaining ways, showing how a time-centric approach to computer forensics reveals even the cleverest intruder."--Richard Bejtlich, technical director, ManTech CFIA, and author of The Tao of Network Security Monitoring "Farmer and Venema are 'hackers' of the old school: They delight in understanding computers at every level and finding new ways to apply existing information and tools to the solution of complex problems."--Muffy Barkocy, Senior Web Developer, Shopping.com "This book presents digital forensics from a unique perspective because it examines the systems that create digital evidence in addition to the techniques used to find it. I would recommend this book to anyone interested in learning more about digital evidence from UNIX systems."--Brian Carrier, digital forensics researcher, and author of File System Forensic AnalysisThe Definitive Guide to Computer Forensics: Theory and Hands-On Practice Computer forensics--the art and science of gathering and analyzing digital evidence, reconstructing data and attacks, and tracking perpetrators--is becoming ever more important as IT and law enforcement professionals face an epidemic in computer crime. In Forensic Discovery, two internationally recognized experts present a thorough and realistic guide to the subject. Dan Farmer and Wietse Venema cover both theory and hands-on practice, introducing a powerful approach that can often recover evidence considered lost forever. The authors draw on their extensive firsthand experience to cover everything from file systems, to memory and kernel hacks, to malware. They expose a wide variety of computer forensics myths that often stand in the way of success. 
Readers will find extensive examples from Solaris, FreeBSD, Linux, and Microsoft Windows, as well as practical guidance for writing one's own forensic tools. The authors are singularly well-qualified to write this book: They personally created some of the most popular security tools ever written, from the legendary SATAN network scanner to the powerful Coroner's Toolkit for analyzing UNIX break-ins. After reading this book you will be able to Understand essential forensics concepts: volatility, layering, and trust Gather the maximum amount of reliable evidence from a running system Recover partially destroyed information--and make sense of it Timeline your system: understand what really happened when Uncover secret changes to everything from system utilities to kernel modules Avoid cover-ups and evidence traps set by intruders Identify the digital footprints associated with suspicious activity Understand file systems from a forensic analyst's point of view Analyze malware--without giving it a chance to escape Capture and examine the contents of main memory on running systems Walk through the unraveling of an intrusion, one step at a time The book's companion Web site contains complete source and binary code for open source software discussed in the book, plus additional computer forensics case studies and resource links.

152 citations


Journal ArticleDOI
TL;DR: The role that file system metadata play in digital forensics is examined and what kind of information is desirable for different types of forensic investigations is analyzed.

110 citations



Journal ArticleDOI
TL;DR: This opening issue of "Digital Investigation" explores some possible approaches to specialization in digital forensic practice, sources of tools, techniques and emerging research to support such specialization, and why specialization may well be the future of digital forensic practice.

90 citations


Journal Article
TL;DR: A formal model for analyzing and constructing forensic procedures, showing the advantages of formalization, is proposed and applied in a real-world scenario with focus on Linux and OS X.
Abstract: Forensic investigative procedures are used in the case of an intrusion into a networked computer system to detect the scope or nature of the attack. In many cases, the forensic procedures employed are constructed in an informal manner that can impede the effectiveness or integrity of the investigation. We propose a formal model for analyzing and constructing forensic procedures, showing the advantages of formalization. A mathematical description of the model will be presented demonstrating the construction of the elements and their relationships. The model highlights definitions and updating of forensic procedures, identification of attack coverage, and portability across different platforms. The forensic model is applied in a real-world scenario with focus on Linux and OS X.

68 citations


Journal ArticleDOI
TL;DR: This work begins the process of formulating a framework for digital forensics research by identifying fundamental properties and abstractions.

59 citations


Journal ArticleDOI
TL;DR: A clock model is presented that can account for the factors that affect clock accuracy, simulate the behaviour of each independent clock, and be used to remove the predicted clock errors from time stamps, giving a more realistic indication of the actual time at which the events occurred.
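The abstract does not give the model's form, but a minimal linear clock model illustrates the idea: if a clock's offset and drift rate relative to true time can be estimated, the predicted error can be subtracted from each recorded time stamp. The parameter names below are assumptions, not the paper's notation.

    from datetime import datetime, timedelta

    def correct_timestamp(recorded, offset_s, drift_ppm, last_sync):
        # Remove the predicted clock error under a simple linear model:
        # error(t) = offset + drift * elapsed. drift_ppm is the drift rate
        # in parts per million; last_sync is when the offset was measured.
        elapsed = (recorded - last_sync).total_seconds()
        error = offset_s + drift_ppm * 1e-6 * elapsed
        return recorded - timedelta(seconds=error)

    # A clock that was 120 s fast on 1 Jan and gains 50 microseconds/second:
    ts = datetime(2004, 6, 1, 12, 0, 0)
    print(correct_timestamp(ts, 120.0, 50.0, datetime(2004, 1, 1)))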

Journal Article
TL;DR: This study used four scenarios to test the ability to determine whether contraband images located on a system running Windows XP were intentionally downloaded or downloaded without the user's consent or knowledge, and found that a model consisting of two characteristics was the best model for discriminating the intentional action.
Abstract: The current study was exploratory and represents a first attempt at a standardized method for digital forensics event reconstruction based on statistical significance at a given error rate (α = .01). The study used four scenarios to test the ability to determine whether contraband images located on a system running Windows XP were intentionally downloaded or downloaded without the user's consent or knowledge. Seven characteristics or system variables were identified for comparison; using a stepwise discriminant analysis, the seven characteristics were reduced to four. It was determined that a model consisting of two characteristics -- the average of the difference between file creation times and the median of the difference between file creation times -- was the best model for discriminating the intentional action at α = .01. The implications of this finding and suggestions for future research are discussed.
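The two retained characteristics are straightforward to compute once file creation times have been extracted from file system metadata. A minimal sketch, under one plausible reading of the abstract (differences between consecutive creation times); the discriminant coefficients themselves would come from training data such as the study's scenarios, so none are shown.

    from statistics import mean, median

    def creation_time_features(creation_times):
        # The two features the study retained: the average and the median
        # of the differences between file creation times, read here as
        # consecutive-file deltas. The study found these discriminated
        # intentional from unintentional downloads at alpha = .01.
        times = sorted(creation_times)               # seconds since epoch
        deltas = [b - a for a, b in zip(times, times[1:])]
        return mean(deltas), median(deltas)

    # Example: five files created within two seconds of one another.
    print(creation_time_features([0.0, 0.4, 0.9, 1.1, 1.6]))  # (0.4, 0.45)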

Journal Article
TL;DR: This paper is the result of an investigation into applying statistical tools and methodologies to the discovery of digital evidence and contains practical examples using modified Sleuthkit tools containing the proposed statistical measurements.
Abstract: This paper is the result of an investigation into applying statistical tools and methodologies to the discovery of digital evidence. Multiple statistical methods were reviewed; the two most useful are presented here. It is important to note that this paper represents an inquiry into the value of applied mathematical analysis to digital forensics investigations. Readers are encouraged to explore the concepts and make use of the tools presented here, in the hope that a synergy can be developed and concepts can be expanded to meet future challenges. In addition, this paper contains practical examples using modified Sleuthkit tools containing the proposed statistical measurements.
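The abstract does not name the two measurements, so the following is only a representative example of the genre rather than the paper's method: per-block byte entropy, a common statistic for spotting compressed, encrypted, or wiped regions in a raw image. Function names and the block size are assumptions.

    import math
    from collections import Counter

    def block_entropy(data):
        # Shannon entropy in bits per byte: values near 8 suggest
        # compressed or encrypted content; values near 0 suggest
        # padding or wiped space.
        n = len(data)
        counts = Counter(data)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    def scan_image(path, block_size=4096):
        # Yield (offset, entropy) for each block of a raw disk image.
        with open(path, "rb") as f:
            offset = 0
            while chunk := f.read(block_size):
                yield offset, block_entropy(chunk)
                offset += len(chunk)

    # for off, ent in scan_image("disk.dd"): print(hex(off), round(ent, 2))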

Book
01 Jan 2004
TL;DR: What Is Computer Forensics and Why Is It of Vital Interest to You?
Abstract: What Is Computer Forensics and Why Is It of Vital Interest to You? Exactly Where Is Potentially Incriminating Data Stored in One's Computer and How Did It Get There. Specialized Forensics Techniques. Practical Means of Protecting Proprietary and Other Confidential Information in Computers. Protection from Online Computer Forensics and from Privacy Compromises: Legal Issues. Security Aspects of Evolving Technologies.

Journal ArticleDOI
TL;DR: Forensics is the use of science and technology to investigate and establish facts in criminal or civil courts of law; in the computing context, it means using the evidence remaining after an attack to determine how the attack was carried out and what the attacker did.
Abstract: The dictionary defines forensics as “the use of science and technology to investigate and establish facts in criminal or civil courts of law.” I am more interested, however, in the usage common in the computer world: using evidence remaining after an attack on a computer to determine how the attack was carried out and what the attacker did. The standard approach to forensics is to see what can be retrieved after an attack has been made, but this leaves a lot to be desired. The first and most obvious problem is that successful attackers often go to great lengths to ensure that they cover their trails. The second is that unsuccessful attacks often go unnoticed, and even when they are noticed, little information is available to assist with diagnosis.

Proceedings ArticleDOI
05 Apr 2004
TL;DR: This work proposes (1) using the Neyman-Pearson Lemma to proactively build online forensics tests with the best possible critical regions for hypothesis testing, and (2) using classical stopping rules for sequential hypothesis testing to determine which users are deviating from standard usage behavior and should be the focus of more investigative resources.
Abstract: We examine principles and approaches for proactive computer-system forensics. Proactive computer-system forensics is the design, construction and configuring of systems to make them most amenable to digital forensics analyses in the future. The primary goals of proactive computer-system forensics are system structuring and augmentation for automated data discovery, lead formation, and efficient data preservation. We propose: (1) using the Neyman-Pearson Lemma to proactively build online forensics tests with the best possible critical regions for hypothesis testing, and (2) using classical stopping rules for sequential hypothesis testing to determine which users are deviating from standard usage behavior and should be the focus of more investigative resources. Here the focus is on security breaches by the employees or stakeholders of an organization. The main measurements are event-driven logs of program executions.
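As a sketch of the second proposal: Wald's sequential probability ratio test is the classical stopping rule for deciding, as events stream in, whether a user's behavior fits a baseline model H0 or a deviant model H1. The Bernoulli event model and the probabilities below are illustrative placeholders; in the paper's setting they would be estimated from event-driven logs of program executions.

    import math

    def sprt(events, p0, p1, alpha=0.01, beta=0.01):
        # Wald's SPRT on a stream of binary events (1 = flagged action).
        # H0: P(flag) = p0 (normal usage); H1: P(flag) = p1 (deviant).
        # alpha/beta are the target false-positive/false-negative rates.
        upper = math.log((1 - beta) / alpha)    # accept H1 at or above this
        lower = math.log(beta / (1 - alpha))    # accept H0 at or below this
        llr, n = 0.0, 0
        for n, x in enumerate(events, 1):
            llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
            if llr >= upper:
                return "H1", n                  # focus resources on this user
            if llr <= lower:
                return "H0", n
        return "undecided", n

    # A user whose actions are flagged far more often than the baseline:
    print(sprt([1, 0, 1, 1, 0, 1, 1, 1], p0=0.05, p1=0.40))  # ('H1', 4)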

Journal ArticleDOI
TL;DR: The strengths and shortcomings of ProDiscover IR and EnCase Enterprise Edition are discussed and several enhancements are proposed for tools used to process digital evidence on remote, live systems.

Book
26 Nov 2004
TL;DR: This new edition presents you with a completely updated overview of the basic skills that are required as a computer forensics professional, introduces the latest software and tools, and reviews the available certifications in this growing segment of IT that can help take your career to a new level.
Abstract: Essential reading for launching a career in computer forensics.

Internet crime is on the rise, catapulting the need for computer forensics specialists. This new edition presents you with a completely updated overview of the basic skills that are required as a computer forensics professional. The author team of technology security veterans introduces the latest software and tools that exist, and they review the available certifications in this growing segment of IT that can help take your career to a new level. A variety of real-world practices takes you behind the scenes to look at the root causes of security attacks and provides you with a unique perspective as you launch a career in this fast-growing field.

- Explores the profession of computer forensics, which is more in demand than ever due to the rise of Internet crime
- Details the ways to conduct a computer forensics investigation
- Highlights tips and techniques for finding hidden data, capturing images, documenting your case, and presenting evidence in court as an expert witness
- Walks you through identifying, collecting, and preserving computer evidence
- Explains how to understand encryption and examine encryption files

Computer Forensics JumpStart is the resource you need to launch a career in computer forensics.

Book ChapterDOI
14 May 2004
TL;DR: This paper develops a fuzzy logic based expert system for network forensics that can analyze computer crimes in networked environments and produce digital evidence automatically, reducing the time and cost of forensic analysis.
Abstract: The field of digital forensic science emerged as a response to the growth of computer crime. Digital forensics is the art of discovering and retrieving information about a crime in such a way as to make the digital evidence admissible in court. Network forensics, in particular, is digital forensic science applied in networked environments. The more network traffic there is, the harder the analysis becomes, so an effective and automated analysis system is needed for network forensics. In this paper, we develop a fuzzy logic based expert system for network forensics that can analyze computer crimes in networked environments and produce digital evidence automatically. This system can provide analyzed information to forensic experts and reduce the time and cost of forensic analysis.
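The rule base is not given in the abstract, so the fragment below is only a hedged sketch of the approach: two traffic features are fuzzified with shoulder membership functions and combined by one rule of the kind such a system might contain. All feature names, breakpoints, and the rule itself are assumptions.

    def ramp(x, a, b):
        # Increasing shoulder membership function: 0 below a, 1 above b.
        if x <= a:
            return 0.0
        if x >= b:
            return 1.0
        return (x - a) / (b - a)

    def suspicion(conn_rate, fail_ratio):
        # One illustrative rule: IF connection rate is HIGH AND the
        # failed-connection ratio is HIGH THEN suspicion is HIGH.
        # Fuzzy AND is taken as min; the firing strength in [0, 1] is
        # returned as the suspicion degree.
        rate_high = ramp(conn_rate, 50, 200)      # connections per minute
        fail_high = ramp(fail_ratio, 0.3, 0.8)    # failed / total
        return min(rate_high, fail_high)

    print(suspicion(conn_rate=120, fail_ratio=0.6))  # ~0.47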

Journal ArticleDOI
TL;DR: The Session Token Protocol (STOP) is a new protocol that can assist in the forensic analysis of a computer involved in malicious network activity and utilizes the Identification Protocol infrastructure, improving both its capabilities and user privacy.
Abstract: In this paper we present the Session Token Protocol (STOP), a new protocol that can assist in the forensic analysis of a computer involved in malicious network activity. It has been designed to help automate the process of tracing attackers who log on to a series of hosts to hide their identity. STOP utilizes the Identification Protocol infrastructure, improving both its capabilities and user privacy. On request, the STOP protocol saves user-level and application-level data associated with a particular TCP connection and returns a random token specifically related to that session. The saved data are not revealed to the requester unless the token is returned to the local administrator, who verifies the legitimacy of the need for the release of information. The protocol supports recursive traceback requests to gather information about the entire path of a connection. This allows an incident investigator to trace attackers to their home systems, but does not violate the privacy of normal users. This paper details the new protocol and presents implementation and performance results.
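A minimal sketch of the token mechanism as described above (not of the wire protocol, which extends the Identification Protocol): session data are stored keyed by a random token, and nothing is revealed unless the token is later surrendered and the administrator approves the release. Class and field names are illustrative.

    import secrets

    class TokenStore:
        # Save user- and application-level data for a TCP connection and
        # hand back only a random token; the data are released solely when
        # the token is returned and the administrator verifies the request.
        def __init__(self):
            self._vault = {}

        def save_session(self, conn_4tuple, user, app_data):
            token = secrets.token_hex(16)          # random, unguessable
            self._vault[token] = {"conn": conn_4tuple, "user": user,
                                  "app": app_data}
            return token                           # all the requester sees

        def redeem(self, token):
            # Administrator-side release; returns None for unknown tokens.
            return self._vault.get(token)

    store = TokenStore()
    t = store.save_session(("10.0.0.5", 44321, "10.0.0.9", 22),
                           user="alice", app_data="sshd login")
    print(t, store.redeem(t)["user"])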

Proceedings ArticleDOI
25 Jul 2004
TL;DR: This work proposes a fuzzy logic based expert system for network forensics that can analyze computer crimes in networked environments and produce digital evidence automatically, reducing the time and cost of forensic analysis.
Abstract: The field of digital forensic science emerged as a response to the growth of computer crime. Digital forensics is the art of discovering and retrieving information about a crime in such a way as to make the digital evidence admissible in court. Network forensics is digital forensics in networked environments. However, the amount of network traffic is huge and might crash the capture system if left unattended, and not all the information captured or recorded is useful for analysis or evidence. The more network traffic there is, the harder the analysis becomes; an effective and automated analysis system is therefore needed for network forensics. We propose a fuzzy logic based expert system for network forensics that can analyze computer crimes in networked environments and produce digital evidence automatically. This system can provide analyzed information to forensic experts and reduce the time and cost of forensic analysis.

01 Jan 2004
TL;DR: Computer forensics is the scientific collection, recovery, preservation, legal analysis and presentation of data held on or retrieved from computer storage media in such a way that the information can be used as evidence in a court of law.
Abstract: Computer forensics is the scientific collection, recovery, preservation, legal analysis and presentation of data held on or retrieved from computer storage media in such a way that the information can be used as evidence in a court of law. Forensics specialists must consider the legal and ethical parameters of evidence collection so that critical elements are not corrupted.

Journal ArticleDOI
Bruce J. Nikkel1
TL;DR: This paper presents a systematic approach to investigating a complex Internet presence, including collecting, time-stamping, packaging, preserving, and presenting evidence, geared towards the network forensics practitioner.
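A minimal sketch of the collect-and-time-stamp step for a single piece of remote evidence; the paper's systematic procedure is broader, also covering packaging, preserving, and presenting the evidence. The URL and file names are placeholders.

    import hashlib, json, urllib.request
    from datetime import datetime, timezone

    def collect_url(url, out="evidence.bin", log="evidence.json"):
        # Fetch a remote resource, record a UTC time stamp, and seal the
        # content with a SHA-256 digest so later tampering is detectable.
        # (Write-once storage and independent witnessing are omitted.)
        ts = datetime.now(timezone.utc).isoformat()
        data = urllib.request.urlopen(url).read()
        with open(out, "wb") as f:
            f.write(data)
        record = {"url": url, "collected_utc": ts, "size": len(data),
                  "sha256": hashlib.sha256(data).hexdigest()}
        with open(log, "w") as f:
            json.dump(record, f, indent=2)
        return record

    print(collect_url("http://example.com/"))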

Journal Article
TL;DR: In this paper, the authors describe several statistical techniques for detecting traces of digital tampering in the absence of any digital watermark or signature, and devise detection schemes to reveal these correlations.
Abstract: A digitally altered photograph, often leaving no visual clues of having been tampered with, can be indistinguishable from an authentic photograph. As a result, photographs no longer hold the unique stature as a definitive recording of events. We describe several statistical techniques for detecting traces of digital tampering in the absence of any digital watermark or signature. In particular, we quantify statistical correlations that result from specific forms of digital tampering, and devise detection schemes to reveal these correlations.

Journal Article
TL;DR: This paper argues for the need for on-the-spot digital forensics tools that supplement lab methods and discusses the specific user and software engineering requirements for such tools.
Abstract: Traditional digital forensics methods are based on the in-depth examination of computer systems in a lab setting. Such methods are standard practice in acquiring digital evidence and are indispensable as an investigative approach. However, they are also relatively heavyweight and expensive and require significant expertise on the part of the investigator. Thus, they cannot be applied on a wider scale and, in particular, they cannot be used as a tool by regular law enforcement officers in their daily work. This paper argues for the need for on-the-spot digital forensics tools that supplement lab methods and discusses the specific user and software engineering requirements for such tools. The authors present the Bluepipe architecture for on-the-spot investigation and the Bluepipe remote forensics protocol that they have developed and relate them to a set of requirements. They also discuss some of the details of their ongoing prototype implementation.

Book
11 Nov 2004
TL;DR: The second edition has been updated to offer more detailed how-to guidance on protecting the confidentiality of data stored on computers and specific information on the vulnerabilities of commonly used ancillary computing devices, such as PDAs, cellular telephones, smart cards, GPS devices, telephone calling cards, fax machines, and photocopiers as discussed by the authors.
Abstract: This thoroughly revised edition of an Artech House bestseller goes far beyond the typical computer forensics books on the market, emphasizing how to protect one's privacy from data theft and hostile computer forensics. The second edition has been updated to offer more detailed how-to guidance on protecting the confidentiality of data stored on computers, and specific information on the vulnerabilities of commonly used ancillary computing devices, such as PDAs, cellular telephones, smart cards, GPS devices, telephone calling cards, fax machines, and photocopiers. This cutting-edge book helps you identify the specific areas where sensitive and potentially incriminating data is hiding in personal computers and consumer electronics, and explains how to go about truly removing this data, because mere deletion -- or even overwriting -- does not accomplish this. You get a systematic process for installing operating systems and application software that will help to minimize the possibility of security compromises, and numerous specific steps that need to be taken to prevent the hostile exploitation of one's computer. This unique resource provides a method for ensuring that computers that are connected to the Internet are protected from malicious mobile code (code that can allow a remote hacker to read, remove, damage, or even add data to one's computer), the new fashion of adware/spyware, and Web bugs. Moreover, you learn how to detect whether advanced investigative tools, such as keystroke storing and relaying hardware and software, are in use in a computer. You also learn of commercially available capabilities for intercepting radio signals unintentionally radiated by a computer. Other critical topics include the pitfalls of encryption and how to use it effectively, the practical aspects of online anonymity, and the current legal issues that pertain to the use of computers.

Journal ArticleDOI
TL;DR: This paper presents strengths and shortcomings of WinHex Specialist Edition (version 11.25 SR-7) in the context of the overall digital forensics process, focusing on its ability to preserve and examine data on storage media.

Patent
20 Dec 2004
TL;DR: In this article, the authors propose a digital forensics method that can efficiently identify an illicit actor while allowing the evidential property of the findings to be verified and keeping the reliability of the identification largely free from human factors.
Abstract: PROBLEM TO BE SOLVED: To provide digital forensics capable of efficiently identifying an illicit actor, using a method whose evidential property can be verified and whose reliability of identification is hardly influenced by human elements. SOLUTION: Continuous monitoring (1) is performed in the network forensics stage (12) and filtered (2) on a predetermined condition to detect abnormality. When an abnormality (4) occurs, log analysis (5) is performed to narrow down the outline of the abnormality and the target terminal. The terminal to be examined (6), identified by the network forensics stage (12), is then handed over to the computer forensics stage (13), in which evidence on the narrowed-down terminal is preserved (7) and the preserved data are analyzed (8). In the analysis (8), examination priority is determined with reference to the log analysis result (5) so that the examination proceeds efficiently. Finally, an evidence report (9) is created for the facts obtained by the analysis (8). COPYRIGHT: (C)2006,JPO&NCIPI

Journal ArticleDOI
TL;DR: Some forensic analysts enhance the effectiveness of their work by using extremely complex and powerful tools such as GREP (Global Regular Expression Print), developed in the early 1970s to search for words or word fragments anywhere on the disk.
Abstract: Forensic investigations focus on searches of files or portions of files. These portions may come from active or deleted files, slack space, or non-allocated space. Things may be even more complicated with distributed file systems or large hard disks, which can create further and often unjustifiable demands on processing power. Some forensic analysts enhance the effectiveness of their work by using extremely complex and powerful tools such as GREP (Global Regular Expression Print), developed in the early 1970s to search for words or word fragments anywhere on the disk. GREP expressions are so effective that even automated tools such as EnCase and FTK make broad use of them, although their power depends strongly on the technical expertise of the user.
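The regular-expression idea applies directly to a raw disk image, which is why it reaches deleted files, slack space, and non-allocated space: the pattern is matched against bytes, not against the file system. A hedged sketch (the pattern and chunking are illustrative; tools such as EnCase and FTK handle encodings and edge cases far more carefully):

    import re

    EMAIL = re.compile(rb"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

    def grep_image(path, pattern=EMAIL, chunk=1 << 20, overlap=64):
        # Scan a raw disk image for a byte-level regex, keeping a small
        # overlap between chunks so hits straddling a chunk boundary are
        # not lost (hits wholly inside the overlap may appear twice).
        hits, offset, tail = [], 0, b""
        with open(path, "rb") as f:
            while data := f.read(chunk):
                buf = tail + data
                for m in pattern.finditer(buf):
                    hits.append((offset - len(tail) + m.start(), m.group()))
                tail = buf[-overlap:]
                offset += len(data)
        return hits

    # for off, text in grep_image("disk.dd"): print(hex(off), text)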

Book ChapterDOI
TL;DR: This chapter will address the identification, extraction, and presentation of evidence from electronic media as it is typically performed within law enforcement agencies, describe the current state of the practice, and discuss opportunities for new technologies.
Abstract: The use of computers to either directly or indirectly store evidence by criminals has become more prevalent as society has become increasingly computerized. It is now routine to find calendars, e-mails, financial account information, detailed plans of crimes, and other artifacts that can be used as evidence in a criminal case stored on a computer's hard drive. Computer forensics is rapidly becoming an essential part of the investigative process, at both local law enforcement levels and federal levels. It is estimated that half of all federal criminal cases require a computer forensics examination. This chapter will address the identification, extraction, and presentation of evidence from electronic media as it is typically performed within law enforcement agencies, describe the current state of the practice, as well as discuss opportunities for new technologies.