
Showing papers in "International Journal of Digital Evidence in 2004"


Journal Article
TL;DR: This paper proposes a ten step process for an organisation to implement forensic readiness, which aims to maximise its potential to use digital evidence whilst minimising the costs of an investigation.
Abstract: A forensic investigation of digital evidence is commonly employed as a post-event response to a serious information security incident. In fact, there are many circumstances where an organisation may benefit from an ability to gather and preserve digital evidence before an incident occurs. Forensic readiness is defined as the ability of an organisation to maximise its potential to use digital evidence whilst minimising the costs of an investigation. The costs and benefits of such an approach are outlined. Preparation to use digital evidence may involve enhanced system and staff monitoring, technical, physical and procedural means to secure data to evidential standards of admissibility, processes and procedures to ensure that staff recognise the importance and legal sensitivities of evidence, and appropriate legal advice and interfacing with law enforcement. This paper proposes a ten step process for an organisation to implement forensic readiness.

272 citations


Journal Article
TL;DR: A model of investigations is presented which combines the existing models, generalises them, and extends them by explicitly addressing certain activities not included in them and captures the full scope of an investigation, rather than only the processing of evidence.
Abstract: A comprehensive model of cybercrime investigations is important for standardising terminology, defining requirements, and supporting the development of new techniques and tools for investigators. In this paper a model of investigations is presented which combines the existing models, generalises them, and extends them by explicitly addressing certain activities not included in them. Unlike previous models, this model explicitly represents the information flows in an investigation and captures the full scope of an investigation, rather than only the processing of evidence. The results of an evaluation of the model by practicing cybercrime investigators are presented. This new model is compared to some important existing models and applied to a real investigation.

232 citations


Journal Article
TL;DR: In an automatic transmission control system, it is determined that the gear-shifting is completed when the rotational speed of the output shaft of the torque converter falls within the predetermined rotational speed range and the rate of change in rotational speed is reduced below a predetermined value.
Abstract: In an automatic transmission control system, completion of gear-shifting is detected on the basis of both the rate of change in rotational speed of the output shaft of the torque converter and the rotational speed at which the output shaft of the torque converter is expected to rotate. That is, a rotational speed range within which the rotational speed of the output shaft of the torque converter is expected to fall upon completion of the gear-shifting is determined taking into account the rotational speed of the output shaft of the torque converter upon initiation of the gear-shifting and the gear ratio, and it is determined that the gear-shifting is completed when the rotational speed of the output shaft of the torque converter falls within the predetermined rotational speed range and the rate of change in rotational speed of the output shaft of the torque converter is reduced below a predetermined value.
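The completion test described in the abstract can be sketched as a pair of small functions. This is a hedged illustration, not the patent's implementation; the function names, the band width, and the exact ratio relation are assumptions.

```python
def predicted_output_speed(speed_at_shift_start, old_ratio, new_ratio):
    # Assumed relation: with vehicle speed roughly constant through the
    # shift, the torque-converter output speed scales with the gear-ratio
    # change (an illustrative simplification, not the patent's formula).
    return speed_at_shift_start * new_ratio / old_ratio

def shift_complete(speed_rpm, predicted_rpm, band_rpm, rate_rpm_s, rate_limit):
    # Shift is deemed complete when the measured speed falls inside the
    # predicted band AND its rate of change has settled below a limit.
    in_band = abs(speed_rpm - predicted_rpm) <= band_rpm
    settled = abs(rate_rpm_s) < rate_limit
    return in_band and settled
```

Requiring both conditions avoids declaring completion while the speed merely passes through the band during the transient.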

108 citations


Journal Article
TL;DR: A key obstacle in any Child Pornography (CP) investigation is determining whether the pictures in question have been altered; EXIF metadata can help the investigator verify a picture’s authenticity.
Abstract: An obstacle in any Child Pornography (CP) investigation is the investigator’s ability to determine whether the pictures in question have been altered. Because of the court ruling in Ashcroft v. Free Speech Coalition, many agents are asked on the stand whether they can prove the pictures they recovered have not been altered in any way. If a picture doesn’t match any known CP hashes, then it can be very difficult to prove it is untouched. One way an investigator may be able to determine whether a picture is authentic is through extraction of metadata. Digital pictures may contain EXIF headers that can help the investigator verify the authenticity of a picture.
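The known-hash comparison mentioned above can be sketched with standard cryptographic digests. This is an illustrative sketch only; `known_hashes` stands in for a vetted hash database, which is an assumption here.

```python
import hashlib

def file_digest(data: bytes) -> str:
    # SHA-256 of the raw image bytes; any alteration changes the digest.
    return hashlib.sha256(data).hexdigest()

def matches_known_set(data: bytes, known_hashes: set) -> bool:
    # A hit means the picture is byte-for-byte identical to a catalogued
    # file, i.e. it has not been modified since it was hashed.
    return file_digest(data) in known_hashes
```

When no hash matches, the investigator falls back on metadata such as EXIF headers, as the abstract notes.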

93 citations


Journal Article
TL;DR: A formal model for analyzing and constructing forensic procedures, showing the advantages of formalization, is proposed and applied in a real-world scenario with focus on Linux and OS X.
Abstract: Forensic investigative procedures are used in the case of an intrusion into a networked computer system to detect the scope or nature of the attack. In many cases, the forensic procedures employed are constructed in an informal manner that can impede the effectiveness or integrity of the investigation. We propose a formal model for analyzing and constructing forensic procedures, showing the advantages of formalization. A mathematical description of the model will be presented demonstrating the construction of the elements and their relationships. The model highlights definitions and updating of forensic procedures, identification of attack coverage, and portability across different platforms. The forensic model is applied in a real-world scenario with focus on Linux and OS X.

68 citations


Journal Article
TL;DR: This study used four scenarios to test the ability to determine whether contraband images located on a system running Windows XP were intentionally downloaded or downloaded without the user’s consent or knowledge; a model consisting of two characteristics proved best for discriminating the intentional action.
Abstract: The current study was exploratory and represents a first attempt at a standardized method for digital forensics event reconstruction based on statistical significance at a given error rate (α = .01). The study used four scenarios to test the ability to determine whether contraband images located on a system running Windows XP were intentionally downloaded or downloaded without the user’s consent or knowledge. Seven characteristics or system variables were identified for comparison; using a stepwise discriminant analysis, the seven characteristics were reduced to four. It was determined that a model consisting of two characteristics -- the average of the difference between file creation times and the median of the difference between file creation times -- was the best model for discriminating the intentional action at α = .01. The implications of this finding and suggestions for future research are discussed.
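The two winning characteristics can be computed with a short sketch, assuming file creation times are available as numeric timestamps. This is an illustration, not the study's discriminant-analysis procedure.

```python
from statistics import mean, median

def creation_time_features(creation_times):
    """Compute the two discriminating characteristics reported above:
    the average and the median of the differences between successive
    file creation times (timestamps in seconds)."""
    ts = sorted(creation_times)
    diffs = [b - a for a, b in zip(ts, ts[1:])]
    return mean(diffs), median(diffs)
```

Intuitively, a burst of intentional downloads and a background drive-by download produce different inter-file timing patterns, which these two statistics summarize.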

36 citations


Journal Article
TL;DR: This paper is the result of an investigation into applying statistical tools and methodologies to the discovery of digital evidence and contains practical examples using modified Sleuthkit tools containing the proposed statistical measurements.
Abstract: This paper is the result of an investigation into applying statistical tools and methodologies to the discovery of digital evidence. Multiple statistical methods were reviewed; the two most useful are presented here. It is important to note that this paper represents an inquiry into the value of applied mathematical analysis to digital forensics investigations. Readers are encouraged to explore the concepts and make use of the tools presented here, in the hope that a synergy can be developed and concepts can be expanded to meet future challenges. In addition, this paper contains practical examples using modified Sleuthkit tools containing the proposed statistical measurements.

36 citations


Journal Article
TL;DR: To enable digital forensic analysis of e-mails, this work proposes behavioral-biometric-based authentication, which is analogous to a signature in paper documents; in the proposed system, if someone other than the genuine user tries to authenticate, detection and fixing are possible.
Abstract: E-mail has revolutionized business, academic, and personal communication. The advantages of e-mail include speedy delivery, ease of communication, cost effectiveness, geographical independence, and the portability of mailboxes. The last two are the biggest advantages over snail mail. However, with e-mail comes the threat of a genuine user being compromised through key loggers, social engineering, shoulder surfing, password guessing and other similar, though less technical, methods. This passive espionage can have a direct impact on the genuine user in terms of denial of information, loss of money, loss of time, mental harassment and an attack on personal privacy. To enable digital forensic analysis of e-mails, we propose behavioral-biometric-based authentication, which is analogous to a signature in paper documents. In the proposed system, if someone other than a genuine user tries to authenticate himself, then detection and fixing are possible.
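The abstract does not name the specific behavioral biometric, so the sketch below assumes keystroke dynamics (inter-key timing) purely as an illustration; the threshold and function names are hypothetical.

```python
def rhythm_distance(sample, profile):
    # Mean absolute deviation between a typing sample's inter-key
    # intervals (milliseconds) and the enrolled user's profile for
    # the same phrase.
    assert len(sample) == len(profile)
    return sum(abs(s - p) for s, p in zip(sample, profile)) / len(sample)

def authenticate(sample, profile, threshold=25.0):
    # Accept only if the sample's typing rhythm is close enough to the
    # enrolled profile; a rejected attempt can itself be logged as
    # forensic evidence of an impersonation attempt.
    return rhythm_distance(sample, profile) <= threshold
```

Unlike a stolen password, an impostor's typing rhythm differs from the genuine user's, which is what makes such a biometric useful as evidence.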

21 citations


Journal Article
TL;DR: This paper proposes a methodology based upon formal modeling of the security processes in an enterprise under attack, showing how to analyze the state changes that occurred as a result of the incident and how to insert appropriate safeguards and countermeasures to prevent future occurrences of the same type of incident.
Abstract: Numerous current regulations and standards mandate incident response for virtually all segments of the private sector. According to most incident response experts there is the need to perform a root cause analysis (or “incident post mortem”) following recovery from such incidents. To date there has not been a structured, formal approach to conducting this type of post incident analysis. This paper proposes a methodology based upon formal modeling of the security processes in an enterprise under attack. The enterprise is segmented into manageable and security-relevant policy domains and the interactions of those domains including both pre- and post-incident states are modeled. The paper then shows how to analyze the nature of the state changes that occurred as a result of the incident and, finally, how to insert appropriate safeguards and countermeasures to prevent future occurrences of the same type of incident. This methodology is based upon an ongoing research project, field testing, and other peer-reviewed papers. The formalism selected is Colored Petri Nets.
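A Colored Petri Net attaches typed data to tokens; the ordinary place/transition net below is a deliberately simplified sketch, with hypothetical domain names, of how pre- and post-incident states can be modeled as markings changed by transition firings.

```python
def enabled(marking, pre):
    # A transition is enabled when every input place holds enough tokens.
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    # Consume tokens named in `pre`, produce tokens named in `post`.
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical "compromise" transition: a vulnerable host plus an active
# attacker yields a compromised host -- the post-incident marking.
pre = {"host_vulnerable": 1, "attacker_active": 1}
post = {"host_compromised": 1, "attacker_active": 1}
marking = {"host_vulnerable": 1, "attacker_active": 1}
if enabled(marking, pre):
    marking = fire(marking, pre, post)
```

Comparing the pre- and post-incident markings identifies exactly which state changed, which is where a safeguard (an extra input condition on the transition) would be inserted.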

19 citations


Journal Article
TL;DR: An intensive six-month investigation into encryption technologies conducted at the Computer Forensic Research & Development Center (CFRDC) at Utica College resulted in a roadmap for the identification of the unique characteristics of encrypted file formats.
Abstract: This paper is the result of an intensive six-month investigation into encryption technologies conducted at the Computer Forensic Research & Development Center (CFRDC) at Utica College. A significant number of encryption applications were collected and cataloged. A roadmap for the identification of the unique characteristics of encrypted file formats was created. A number of avenues were explored and the results documented. The actual process is not outlined comprehensively due to proprietary needs; however, the following briefly details the process and the significance of our findings.
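The paper deliberately withholds its full process, so the following is only a commonly used stand-in: byte-level Shannon entropy, one non-conclusive characteristic often used to flag encrypted or compressed file content.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte, in the range 0.0..8.0. Well-encrypted data is
    statistically close to uniform, so entropy near 8 is one indicator
    (shared with compressed data) that a file may be encrypted."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())
```

In practice entropy would be combined with other characteristics, such as known headers or magic bytes of specific encryption applications, since high entropy alone cannot distinguish encryption from compression.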

17 citations


Journal Article
TL;DR: This paper argues for the need for on-the-spot digital forensics tools that supplement lab methods and discusses the specific user and software engineering requirements for such tools.
Abstract: Traditional digital forensics methods are based on the in-depth examination of computer systems in a lab setting. Such methods are standard practice in acquiring digital evidence and are indispensable as an investigative approach. However, they are also relatively heavyweight and expensive and require significant expertise on the part of the investigator. Thus, they cannot be applied on a wider scale and, in particular, they cannot be used as a tool by regular law enforcement officers in their daily work. This paper argues for the need for on-the-spot digital forensics tools that supplement lab methods and discusses the specific user and software engineering requirements for such tools. The authors present the Bluepipe architecture for on-the-spot investigation and the Bluepipe remote forensics protocol that they have developed and relate them to a set of requirements. They also discuss some of the details of their ongoing prototype implementation.

Journal Article
TL;DR: Decoy Systems, also known as deception systems, honey-pots or tar-pits, are phony components set up to entice unauthorized users by presenting numerous system vulnerabilities, while attempting to restrict unauthorized access to network information systems.
Abstract: Interconnectivity on the Internet is growing, as more and more organizations, private companies and governmental institutions connect for critical information processing. This interconnectivity allows for better productivity, faster communication capabilities and immeasurable personal conveniences. It also opens the door to many unforeseeable risks, such as individuals gaining unauthorized access to critical enterprise information infrastructure. These organizations are discovering that traditional means of preventing and detecting network infringements with firewalls, router access control lists (ACLs), anti-virus software and intrusion detection systems (IDS) are not enough. Hackers are able to obtain easy-to-use tools to scan various networks on the Internet for system vulnerabilities, then use the information gathered from the scans to launch their attacks with readily available script-kiddie tools. A solution that has been catching on in the network security and computer incident response environment is to employ “Decoy Systems.” Decoy Systems, also known as deception systems, honey-pots or tar-pits, are phony components set up to entice unauthorized users by presenting numerous system vulnerabilities, while attempting to restrict unauthorized access to network information systems.

Journal Article
TL;DR: An evaluation methodology that can be used to assess the performance of intelligent techniques at detecting, as well as predicting, unauthorised activities in networks is discussed.
Abstract: This paper discusses an evaluation methodology that can be used to assess the performance of intelligent techniques at detecting, as well as predicting, unauthorised activities in networks. The effectiveness and the performance of any developed intrusion detection model will be determined by means of evaluation and validation. The evaluation and the learning prediction performance for this task will be discussed, together with a description of validation procedures. The performance of developed detection models that incorporate intelligent elements can be evaluated using well known standard methods, such as confusion matrices, ROC curves and Lift charts. In this paper these methods, as well as other useful evaluation approaches, are discussed.
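The confusion-matrix and ROC measures named above reduce to a few counts; here is a minimal sketch for binary intrusion labels (1 = intrusion, 0 = normal).

```python
def confusion_matrix(actual, predicted):
    # Tally the four cells of the binary confusion matrix.
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    return tp, fp, fn, tn

def rates(tp, fp, fn, tn):
    # True-positive rate (detection rate) and false-positive rate:
    # together they give one operating point on a ROC curve.
    return tp / (tp + fn), fp / (fp + tn)
```

Sweeping a detector's decision threshold and plotting the resulting (FPR, TPR) pairs traces out the full ROC curve.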

Journal Article
TL;DR: The author repeats an earlier experiment to verify the issue in version 2.4 of the kernel and then shows that the issue has been resolved in version 2.6.
Abstract: No official version of the Linux kernel, up through and including version 2.4, allowed a user land process to access the last sector of a hard disk or hard disk partition with an odd number of sectors. Although the inability to access this last sector did not affect normal operation of the system, it did prevent the complete forensic acquisition of such a disk. The author repeats an earlier experiment to verify the issue in version 2.4 of the kernel and then shows that the issue has been resolved in version 2.6. Systems using version 2.6 of the Linux kernel can completely forensically acquire disks or partitions with an odd number of sectors.
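The completeness check implied by the experiment can be sketched as a simple size comparison. This is an illustration only; a real acquisition would use a tool such as dd together with the device's reported sector count.

```python
def acquisition_complete(device_sectors, image_bytes, sector_size=512):
    """True when the acquired image covers every sector of the source.
    Under the pre-2.6 behaviour described above, a device with an odd
    number of sectors would yield an image one sector (512 bytes) short."""
    return image_bytes == device_sectors * sector_size
```

For a 7-sector device, a 2.4 kernel would deliver only 6 sectors (3072 bytes) to a user-land acquisition tool, so the check fails; a 2.6 kernel delivers all 3584 bytes.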

Journal Article
TL;DR: The authors make the case that an accurate and reliable checkpointing tool could create a new source of evidence for the forensic investigator when considering process migration, fault tolerance, or load balancing.
Abstract: The goal of this paper is to introduce a new area of computer forensics: process forensics. Process forensics involves extracting information from a process’s address space for the purpose of finding digital evidence pertaining to a computer crime. The challenge of this sub-field is that the address space of a given process is usually lost long before the forensic investigator is analyzing the hard disk and file system of a computer. Therefore, the authors make the case that an accurate and reliable checkpointing tool could create a new source of evidence for the forensic investigator. The technology of checkpointing is nothing new when considering process migration, fault tolerance, or load balancing. However, with respect to computer forensics, the gains from checkpointing have yet to be explored.
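A real checkpointing tool captures the full address space (registers, stack, heap); the sketch below is only a loose Python analogy, persisting explicitly chosen in-memory state so it survives process exit and can be examined later. The function names and file format are assumptions for illustration.

```python
import os
import pickle
import tempfile

def checkpoint(state: dict) -> str:
    # Simplified stand-in for a checkpointing tool: snapshot selected
    # in-memory state to disk so it remains available to an investigator
    # after the process's address space is gone.
    fd, path = tempfile.mkstemp(suffix=".ckpt")
    with os.fdopen(fd, "wb") as f:
        pickle.dump(state, f)
    return path

def restore(path: str) -> dict:
    # Reload a snapshot for post-mortem examination.
    with open(path, "rb") as f:
        return pickle.load(f)
```

The forensic value the authors argue for comes precisely from this persistence: evidence that existed only in a process's address space becomes part of the on-disk record.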