
Showing papers on "Digital forensics published in 2017"


Journal ArticleDOI
TL;DR: In this article, a proof-of-concept tool, CIFT, is proposed to support identification, acquisition and analysis of both native artifacts from the cloud and client-centric artifacts from local devices (mobile applications and web browsers).

124 citations


Journal ArticleDOI
01 Nov 2017
TL;DR: The future of digital forensics is explored, with an emphasis on these challenges and the advancements needed to effectively protect modern societies and pursue cybercriminals.
Abstract: Today’s huge volumes of data, heterogeneous information and communication technologies, and borderless cyberinfrastructures create new challenges for security experts and law enforcement agencies investigating cybercrimes. The future of digital forensics is explored, with an emphasis on these challenges and the advancements needed to effectively protect modern societies and pursue cybercriminals.

91 citations


Journal ArticleDOI
TL;DR: Many datasets are available for use, but finding them can be challenging; the article stresses the importance of sharing datasets so that researchers can replicate results and improve the state of the art.

88 citations


Journal ArticleDOI
TL;DR: The state of the art in memory forensics is surveyed, a critical analysis of current-generation techniques is provided, important changes in operating system design that impact memory forensics are described, and important areas for further research are sketched.

79 citations


Proceedings ArticleDOI
26 Apr 2017
TL;DR: In this paper, a revision of the IoT digital evidence acquisition procedure is provided, and an improved theoretical framework for an IoT forensic model that copes with evidence acquisition issues is proposed and discussed.
Abstract: Digital evidence plays a vital role in determining legal case admissibility in electronic- and cyber-oriented crimes. Given the complexity of Internet of Things (IoT) technology, the required forensic investigation will certainly face a number of challenges and obstacles, especially in the digital evidence acquisition and analysis phases. With the currently available network forensic methods and tools, IoT forensics will produce a deteriorated digital evidence trail due to the sophisticated nature of IoT connectivity and data exchange via the “things”. In this paper, a revision of the IoT digital evidence acquisition procedure is provided. In addition, an improved theoretical framework for an IoT forensic model that copes with evidence acquisition issues is proposed and discussed.

62 citations


Proceedings ArticleDOI
29 Aug 2017
TL;DR: This paper argues that, besides traditional digital forensics practices, application-specific forensics is needed to ensure that evidence is collected in the context of specific IoT applications, and introduces a model that covers not just the traditional digital forensics process but also application-specific forensics.
Abstract: Besides its enormous benefits to industry and the community, the Internet of Things (IoT) has introduced unique security challenges to its enablers and adopters. As the trend in cybersecurity threats continues to grow, it is likely to influence IoT deployments. Therefore it is imperative that, besides strengthening the security of IoT systems, we develop effective digital forensics techniques so that, when breaches occur, we can track the sources of attacks and bring perpetrators to due process with reliable digital evidence. The biggest challenge in this regard is the heterogeneous nature of devices in IoT systems and the lack of unified standards. In this paper we investigate digital forensics from an IoT perspective. We argue that, besides traditional digital forensics practices, it is important to have application-specific forensics in place to ensure the collection of evidence in the context of specific IoT applications. We consider the top three IoT applications and introduce a model that covers not just the traditional digital forensics process but also application-specific forensics. We believe that the proposed model will enable collection, examination, analysis and reporting of forensically sound evidence in an IoT application-specific digital forensics investigation.

59 citations


Journal ArticleDOI
TL;DR: A new framework is proposed to mitigate the challenges faced by forensic investigators, and a new model for collecting forensic evidence outside the cloud environment is developed.

59 citations


Journal ArticleDOI
TL;DR: The utility of the Digital Forensic Intelligence Analysis Cycle (DFIAC) is demonstrated in locating information across an increasing volume of forensically extracted data from mobile devices, giving a greater understanding of developing trends in mobile device forensic analysis.

55 citations


Journal ArticleDOI
TL;DR: This paper describes several common questions that arise in forensic examinations of Android WeChat and provides corresponding technical methods that are useful to address these questions.

53 citations


Proceedings ArticleDOI
25 Jun 2017
TL;DR: This article proposes Trust-IoV, a digital forensic framework for IoV systems that provides mechanisms to collect and store trustworthy evidence from the distributed infrastructure and maintains a secure provenance of the evidence to ensure the integrity of the stored evidence.
Abstract: The Internet of Vehicles (IoV) is a complex and dynamic mobile network system that enables information sharing between vehicles, their surrounding sensors, and clouds. While IoV opens new opportunities in various applications and services to provide safety on the road, it introduces new challenges in the field of digital forensics investigations. The existing tools and procedures of digital forensics cannot meet the highly distributed, decentralized, dynamic, and mobile infrastructures of the IoV. Forensic investigators will face challenges while identifying necessary pieces of evidence from the IoV environment, and collecting and analyzing the evidence. In this article, we propose Trust-IoV, a digital forensic framework for IoV systems that provides mechanisms to collect and store trustworthy evidence from the distributed infrastructure. Trust-IoV maintains a secure provenance of the evidence to ensure the integrity of the stored evidence and allows investigators to verify the integrity of the evidence during an investigation. Our experimental results on a simulated environment suggest that Trust-IoV can operate with minimal overhead while ensuring the trustworthiness of evidence in a strong adversarial scenario.
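The paper does not reproduce its implementation here; as a rough sketch of the property it relies on (evidence records whose integrity and ordering can be verified later), the snippet below chains HMAC-authenticated records. All names, fields and the shared-key setup are illustrative assumptions, not Trust-IoV's actual design.

```python
import hashlib
import hmac
import json
import time

# Hypothetical illustration: each evidence record is bound to the previous one
# and authenticated with a key held by the investigation authority, so any
# later modification of a stored record (or of its order) is detectable.

AUTHORITY_KEY = b"demo-key-not-for-production"

def add_record(chain, evidence_bytes, collector_id):
    prev_tag = chain[-1]["tag"] if chain else "GENESIS"
    record = {
        "seq": len(chain),
        "collector": collector_id,
        "collected_at": time.time(),
        "evidence_sha256": hashlib.sha256(evidence_bytes).hexdigest(),
        "prev_tag": prev_tag,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["tag"] = hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).hexdigest()
    chain.append(record)
    return record

def verify_chain(chain):
    prev_tag = "GENESIS"
    for record in chain:
        body = {k: v for k, v in record.items() if k != "tag"}
        if body["prev_tag"] != prev_tag:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        expected = hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, record["tag"]):
            return False
        prev_tag = record["tag"]
    return True

if __name__ == "__main__":
    chain = []
    add_record(chain, b"CAN bus capture 09:41", "roadside-unit-17")
    add_record(chain, b"GPS trace segment", "vehicle-gateway-03")
    print("chain valid:", verify_chain(chain))          # True
    chain[0]["collector"] = "tampered"
    print("after tampering:", verify_chain(chain))      # False
```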

47 citations


Journal ArticleDOI
TL;DR: A methodical evaluation of 21 state-of-the-art PRNU estimation/enhancement techniques proposed in the literature is presented; the techniques are classified by their role in the PRNU estimation procedure and their performance is extensively compared in a large-scale experiment.
Abstract: Extracting a fingerprint of a digital camera has fertile applications in image forensics, such as source camera identification and image authentication. In the last decade, photo response non-uniformity (PRNU) has been well established as a reliable unique fingerprint of digital imaging devices. The PRNU noise appears in every image as a very weak signal, and its reliable estimation is crucial for the success rate of the forensic application. In this paper, we present a novel methodical evaluation of 21 state-of-the-art PRNU estimation/enhancement techniques that have been proposed in the literature in various frameworks. The techniques are classified and systematically compared based on their role/stage in the PRNU estimation procedure, manifesting their intrinsic impacts. The performance of each technique is extensively demonstrated over a large-scale experiment to conclude this case-sensitive study. The experiments have been conducted on our own database and a public image database, the “Dresden image database.”
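For readers unfamiliar with the basic workflow that the 21 surveyed techniques refine, a minimal sketch follows: a noise residual is extracted from each image with a denoising filter, the residuals are aggregated into a fingerprint estimate, and a query image is attributed to the camera by correlating its residual with that fingerprint. The Gaussian filter, the maximum-likelihood-style aggregation and the synthetic data are simplifying assumptions standing in for the wavelet denoiser and the specific estimators evaluated in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Minimal sketch of the generic PRNU workflow (not any specific surveyed
# technique): average noise residuals of several images from one camera into
# a fingerprint, then attribute a query image by correlation.

def noise_residual(img, sigma=1.5):
    """Residual W = I - F(I), where F is a denoising filter (Gaussian here)."""
    return img - gaussian_filter(img, sigma)

def estimate_fingerprint(images):
    """Maximum-likelihood-style estimate: K ~ sum(W_i * I_i) / sum(I_i ** 2)."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img in images:
        img = img.astype(np.float64)
        num += noise_residual(img) * img
        den += img ** 2
    return num / np.maximum(den, 1e-8)

def correlation(query_img, fingerprint):
    """Normalised correlation between the query residual and I * K."""
    query_img = query_img.astype(np.float64)
    w = noise_residual(query_img).ravel()
    s = (query_img * fingerprint).ravel()
    w -= w.mean(); s -= s.mean()
    return float(np.dot(w, s) / (np.linalg.norm(w) * np.linalg.norm(s) + 1e-12))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prnu = rng.normal(0, 0.02, (128, 128))               # simulated sensor pattern
    shots = [200 * (1 + prnu) + rng.normal(0, 2, prnu.shape) for _ in range(20)]
    k = estimate_fingerprint(shots)
    same_cam = 180 * (1 + prnu) + rng.normal(0, 2, prnu.shape)
    other_cam = 180 + rng.normal(0, 2, prnu.shape)
    print("same camera :", round(correlation(same_cam, k), 3))   # clearly positive
    print("other camera:", round(correlation(other_cam, k), 3))  # near zero
```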

Journal ArticleDOI
TL;DR: A methodology of Digital Forensic Quick Analysis is outlined, which describes a method to review Digital Forensic Data Reduction subsets to pinpoint relevant evidence and intelligence from heterogeneous distributed systems in a timely manner.
Abstract: The growth in the data volume and number of evidential data from heterogeneous distributed systems in smart cities, such as cloud and fog computing systems and Internet-of-Things devices (e.g. IP-based CCTVs), has led to increased collection, processing and analysis times, potentially resulting in vulnerable persons (e.g. victims of terrorism incidents) being at risk. A process of Digital Forensic Data Reduction of source multimedia and forensic images has provided a method to reduce the collection time and volume of data. In this paper, a methodology of Digital Forensic Quick Analysis is outlined, which describes a method to review Digital Forensic Data Reduction subsets to pinpoint relevant evidence and intelligence from heterogeneous distributed systems in a timely manner. Applying the proposed methodology to real-world data from an Australian police agency highlighted the timeliness of the process, resulting in significant improvements in processing times in comparison with processing a full forensic image. The Quick Analysis methodology, combined with Digital Forensic Data Reduction, has potential to locate evidence and intelligence in a timely manner. Copyright © 2016 John Wiley & Sons, Ltd.
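The paper describes the methodology at the process level rather than as code; purely as an illustration of what a data-reduction pass can look like in practice, the sketch below copies only small files of commonly relevant types from a mounted source into a review subset and records their hashes. The extension list, size cap and paths are arbitrary assumptions, not the authors' selection criteria.

```python
import hashlib
import shutil
from pathlib import Path

# Illustrative data-reduction pass (not the authors' tooling): instead of
# imaging an entire drive, copy only small, commonly relevant file types into
# a review subset and record their hashes for later verification.

INTERESTING = {".doc", ".docx", ".pdf", ".jpg", ".png", ".sqlite", ".db",
               ".plist", ".log", ".eml", ".csv", ".lnk"}
MAX_BYTES = 10 * 1024 * 1024     # skip very large files in the quick pass

def build_subset(source_root, subset_root):
    source_root, subset_root = Path(source_root), Path(subset_root)
    manifest = []
    for path in source_root.rglob("*"):
        if not path.is_file() or path.suffix.lower() not in INTERESTING:
            continue
        if path.stat().st_size > MAX_BYTES:
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        dest = subset_root / path.relative_to(source_root)
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(path, dest)              # preserves timestamps
        manifest.append((str(path), digest))
    return manifest

if __name__ == "__main__":
    # "/mnt/evidence" is a placeholder for a read-only mounted source.
    for original, sha256 in build_subset("/mnt/evidence", "./subset"):
        print(sha256, original)
```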

Proceedings ArticleDOI
01 Aug 2017
TL;DR: A new model for IoT forensics that takes privacy into consideration is proposed, incorporating the requirements of ISO/IEC 29100:2011 throughout the investigation life cycle to lay the groundwork for the voluntary cooperation of individuals in cyber crime investigations.
Abstract: The Internet of Things (IoT) brings new challenges to digital forensics. Given the number and heterogeneity of devices in such scenarios, it becomes extremely difficult to carry out investigations without the cooperation of individuals. Even if they are not directly involved in the offense, their devices can yield digital evidence that might provide useful clarification in an investigation. However, when providing such evidence they may leak sensitive personal information. This paper proposes PRoFIT, a new model for IoT forensics that takes privacy into consideration by incorporating the requirements of ISO/IEC 29100:2011 throughout the investigation life cycle. PRoFIT is intended to lay the groundwork for the voluntary cooperation of individuals in cyber crime investigations.

Proceedings ArticleDOI
24 Apr 2017
TL;DR: Two approaches to IoT forensic investigation are proposed, emphasizing the pre-investigation phase and implementing real-time investigation to ensure that data and potential evidence are collected and preserved throughout the investigation.
Abstract: Smart devices are used in most major domains, such as healthcare, transportation, smart homes, smart cities and more. However, this technology is exposed to many vulnerabilities, which may lead to cybercrime through the devices. Given IoT constraints and the weak security mechanisms applied, the devices can easily be attacked, threatened and exploited by cybercriminals, and may then provide wrong data that leads to wrong interpretation and actuation for legitimate users. To accommodate IoT characteristics, two approaches to IoT forensic investigation are proposed, emphasizing the pre-investigation phase and implementing real-time investigation to ensure that data and potential evidence are collected and preserved throughout the investigation.

Journal ArticleDOI
TL;DR: This paper features a proof-of-concept implementation using the open source forensic framework named plaso to export data to CASE, a rational evolution of the Digital Forensic Analysis eXpression (DFAX) for representing digital forensic information and provenance.
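CASE is a JSON-LD-based specification whose vocabulary is defined by the CASE/UCO ontologies; the snippet below is only a shape illustration of serialising a single timeline event as JSON-LD, using simplified stand-in property names rather than the normative CASE terms or plaso's actual output module.

```python
import json
import uuid
from datetime import datetime, timezone

# Shape-only illustration of exporting a timeline event as a JSON-LD object
# in the spirit of CASE. The property names below are simplified stand-ins,
# NOT the normative CASE/UCO vocabulary, which is defined by the ontology.

def export_event(source_file, event_time, message):
    return {
        "@context": {"ex": "http://example.org/illustrative-vocab#"},
        "@id": f"urn:uuid:{uuid.uuid4()}",
        "@type": "ex:TimelineEvent",
        "ex:sourceFile": source_file,
        "ex:eventTime": event_time.isoformat(),
        "ex:description": message,
    }

if __name__ == "__main__":
    event = export_event(
        source_file="C:/Windows/System32/config/SYSTEM",
        event_time=datetime(2017, 3, 4, 12, 30, tzinfo=timezone.utc),
        message="Registry key last written",
    )
    print(json.dumps(event, indent=2))
```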

Proceedings ArticleDOI
01 Jan 2017
TL;DR: Unifying security and forensics by being forensically ready and including digital forensics aspects in security mechanisms would enhance the security level in cloud computing, increase forensic capabilities and prepare organizations for any potential attack.
Abstract: The rapid increase in the use of cloud computing has led it to become a new arena for cybercrime. Since cloud environments are, to some extent, a new field for digital forensics, a number of technical, legal and organisational challenges have been raised. Although security and digital forensics share the same concerns when an attack occurs, the fields of security and digital forensics are considered different disciplines. This paper argues that cloud security and digital forensics in cloud environments are converging fields. As a result, unifying security and forensics by being forensically ready and including digital forensics aspects in security mechanisms would enhance the security level in cloud computing, increase forensic capabilities and prepare organizations for any potential attack.

Proceedings ArticleDOI
23 Jun 2017
TL;DR: A novel digital forensics model serving smart city automated vehicles has been developed for investigating AAV (Autonomous Automated Vehicle) cases; the proposed development was reported to Big Data 2017 and an update is provided here for discussion.
Abstract: In the modern world, cyber societies are full of complications. The Internet has brought many convenient services to our society, but it is also a minefield. With mass surveillance from smartphones to PCs, from automated cars to smart televisions, any online device can seemingly be turned into a privacy-breach toolkit. In order to follow the GDPR (General Data Protection Regulation) and protect private data, including PII (Personally Identifiable Information), against cyberstalking and many other cybercrime challenges, a novel digital forensics model serving smart city automated vehicles has been developed for investigating AAV (Autonomous Automated Vehicle) cases. The proposed development was reported to Big Data 2017; here, we provide an update for discussion.

Journal ArticleDOI
TL;DR: An in-depth frequency-domain analysis of the second-order statistics of geometrically transformed images is presented, together with an effective method for distinguishing between four different chains of successive geometric transformations.
Abstract: Geometric transformations, such as resizing and rotation, are almost always needed when two or more images are spliced together to create convincing image forgeries. In recent years, researchers have developed many digital forensic techniques to identify these operations. Most previous works in this area focus on the analysis of images that have undergone single geometric transformations, e.g., resizing or rotation. In several recent works, researchers have addressed yet another practical and realistic situation: successive geometric transformations, e.g., repeated resizing, resizing-rotation, rotation-resizing, and repeated rotation. We will also concentrate on this topic in this paper. Specifically, we present an in-depth analysis in the frequency domain of the second-order statistics of the geometrically transformed images. We give an exact formulation of how the parameters of the first and second geometric transformations influence the appearance of periodic artifacts. The expected positions of characteristic resampling peaks are analytically derived. The theory developed here helps to address the gap left by previous works on this topic and is useful for image security and authentication, in particular, the forensics of geometric transformations in digital images. As an application of the developed theory, we present an effective method that allows one to distinguish between the aforementioned four different processing chains. The proposed method can further estimate all the geometric transformation parameters. This may provide useful clues for image forgery detection.
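The paper's contribution is an exact analytical formulation; as a toy illustration of the underlying phenomenon, the sketch below computes a high-pass residual, takes the spectrum of its magnitude (a simple second-order statistic) and compares the strongest off-centre peak for an original versus a resampled image. The residual choice, the peak-ratio score and the synthetic data are assumptions for demonstration only, not the authors' estimator.

```python
import numpy as np
from scipy.ndimage import zoom

# Generic illustration (not the paper's analytical model): resampling
# introduces periodic inter-pixel correlations, so the energy of a high-pass
# residual is periodically modulated and shows up as peaks in the spectrum
# of the residual's magnitude.

def residual_spectrum(img):
    # Laplacian-style second-difference residual, then spectrum of |residual|.
    r = (img[2:, 1:-1] + img[:-2, 1:-1] + img[1:-1, 2:] + img[1:-1, :-2]
         - 4.0 * img[1:-1, 1:-1])
    e = np.abs(r)
    e -= e.mean()
    spec = np.abs(np.fft.fftshift(np.fft.fft2(e)))
    return spec / (spec.max() + 1e-12)

def peak_strength(spec):
    # Ratio of the strongest off-centre coefficient to the median level.
    h, w = spec.shape
    mask = np.ones_like(spec, dtype=bool)
    mask[h // 2 - 4:h // 2 + 5, w // 2 - 4:w // 2 + 5] = False  # ignore DC area
    return float(spec[mask].max() / (np.median(spec[mask]) + 1e-12))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    original = rng.normal(128, 30, (256, 256))
    upscaled = zoom(original, 1.3, order=1)[:256, :256]   # bilinearly resampled
    print("original peak ratio :", round(peak_strength(residual_spectrum(original)), 1))
    print("resampled peak ratio:", round(peak_strength(residual_spectrum(upscaled)), 1))
```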

Journal ArticleDOI
TL;DR: This work presents a container-based software framework, SCARF, which applies a scale-out approach to forensic computations and shows that for several types of processing tasks, such as hashing, indexing and bulk processing, performance scales almost linearly with the addition of hardware resources.
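SCARF itself distributes work across containers; the near-linear scaling reported for hashing, indexing and bulk processing follows from those tasks being embarrassingly parallel. The sketch below shows the same principle on a local process pool (it is not SCARF's architecture), hashing a directory tree with an increasing number of workers; the evidence path is a placeholder.

```python
import hashlib
import time
from multiprocessing import Pool
from pathlib import Path

# Stand-alone illustration of why hashing-style forensic workloads scale
# almost linearly: each file is independent, so the work distributes cleanly
# over workers. Local process pool only, not SCARF's container architecture.

def sha256_file(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return str(path), h.hexdigest()

def hash_tree(root, workers):
    files = [p for p in Path(root).rglob("*") if p.is_file()]
    start = time.perf_counter()
    with Pool(workers) as pool:
        results = pool.map(sha256_file, files)
    return results, time.perf_counter() - start

if __name__ == "__main__":
    # "/mnt/evidence" is a placeholder path for a mounted evidence copy.
    for n in (1, 2, 4, 8):
        _, elapsed = hash_tree("/mnt/evidence", n)
        print(f"{n} workers: {elapsed:.1f} s")
```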

Journal ArticleDOI
TL;DR: Existing methods in digital forensic tools that can be used to mount collision attacks on digital evidence are reviewed.

Journal ArticleDOI
TL;DR: The authors present a three-way classification of state-of-the-art digital forensic techniques, along with a complete survey of their operating principles, and evaluate and compare their performances in terms of a proposed set of parameters, which may be used as a standard benchmark for evaluating the efficiency of any general copy-move forgery detection technique for digital images.
Abstract: Copy-move forgery is one of the most preliminary and prevalent forms of modification attack on digital images. In this form of forgery, region(s) of an image is(are) copied and pasted onto itself, and subsequently the forged image is processed appropriately to hide the effects of forgery. State-of-the-art copy-move forgery detection techniques for digital images are primarily motivated toward finding duplicate regions in an image. The last decade has seen a lot of research advancement in the area of digital image forensics, whereby the investigation for possible forgeries is solely based on post-processing of images. In this study, the authors present a three-way classification of state-of-the-art digital forensic techniques, along with a complete survey of their operating principles. In addition, they analyse the schemes and evaluate and compare their performances in terms of a proposed set of parameters, which may be used as a standard benchmark for evaluating the efficiency of any general copy-move forgery detection technique for digital images. The comparison results would help a user select the most suitable forgery detection technique, depending on the user's requirements.
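As a concrete instance of the most classical family the survey covers, block matching, the sketch below slides overlapping windows over an image, sorts quantised block features lexicographically and lets matching neighbours vote for a shift vector; a dominant vector indicates a duplicated region. The coarse block-mean feature and all thresholds are simplifying assumptions rather than any specific published method.

```python
import numpy as np
from collections import Counter

# Classical block-matching sketch of copy-move detection: overlapping blocks
# are described by a quantised coarse feature (a 4x4 grid of block means here,
# a cheap stand-in for the DCT features used in classical methods), sorted
# lexicographically, and neighbouring rows with identical features vote for a
# shift vector; a dominant vector indicates a duplicated region.

BLOCK, STEP, QUANT, MIN_SHIFT, MIN_VOTES = 16, 4, 8.0, 16, 20

def block_feature(block):
    coarse = block.reshape(4, BLOCK // 4, 4, BLOCK // 4).mean(axis=(1, 3))
    return tuple(np.round(coarse / QUANT).astype(int).ravel())

def detect_copy_move(img):
    entries = []
    for y in range(0, img.shape[0] - BLOCK + 1, STEP):
        for x in range(0, img.shape[1] - BLOCK + 1, STEP):
            entries.append((block_feature(img[y:y + BLOCK, x:x + BLOCK]), (y, x)))
    entries.sort(key=lambda e: e[0])
    votes = Counter()
    for (f1, p1), (f2, p2) in zip(entries, entries[1:]):
        if f1 != f2:
            continue
        dy, dx = p2[0] - p1[0], p2[1] - p1[1]
        if abs(dy) + abs(dx) >= MIN_SHIFT:
            votes[(dy, dx)] += 1
    return [shift for shift, n in votes.items() if n >= MIN_VOTES]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    img = rng.normal(128, 25, (256, 256))
    img[150:214, 150:214] = img[30:94, 30:94]   # simulated copy-move forgery
    print("suspicious shift vectors:", detect_copy_move(img))   # [(120, 120)]
```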

Journal ArticleDOI
TL;DR: Experimental results obtained on widely used databases show that the proposed method achieves outstanding performance in distinguishing median-filtered images from original images or images that have undergone other types of manipulations, demonstrating its suitability for real-time processing of big multimedia data.
Abstract: Tampering detection has been increasingly attracting attention in the field of digital forensics. As a popular nonlinear smoothing filter, median filtering is often used as a post-processing operation after image forgeries such as copy-paste forgery (including copy-move and image splicing), which is of particular interest to researchers. To implement the blind detection of median filtering, this paper proposes a novel approach based on a frequency-domain feature coined the annular accumulated points (AAP). Experimental results obtained on widely used databases, which consist of various real-world photos, show that the proposed method achieves outstanding performance in distinguishing median-filtered images from original images or images that have undergone other types of manipulations, especially in the scenarios of low resolution and JPEG compression with a low quality factor. Moreover, our approach remains reliable even when the feature dimension decreases to 5, which significantly reduces the computing time required for classification and demonstrates its suitability for real-time processing of big multimedia data.
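The AAP feature is defined precisely in the paper; the following toy version only illustrates the general idea of accumulating spectral energy of a median-filter residual over annuli of the frequency plane, which a downstream classifier could then use. It is an assumption-laden reading of the approach, not a reimplementation of the published feature.

```python
import numpy as np
from scipy.ndimage import median_filter

# Loose illustration of a frequency-domain feature for median-filtering
# detection: accumulate spectral energy of the median-filter residual over
# annuli. This is NOT the paper's AAP definition, just the general idea.

def annular_feature(img, n_rings=5):
    residual = img - median_filter(img, size=3)
    spec = np.abs(np.fft.fftshift(np.fft.fft2(residual)))
    h, w = spec.shape
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - h / 2, xx - w / 2)
    edges = np.linspace(0, radius.max(), n_rings + 1)
    total = spec.sum() + 1e-12
    return np.array([spec[(radius >= lo) & (radius < hi)].sum() / total
                     for lo, hi in zip(edges[:-1], edges[1:])])

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    original = rng.normal(128, 30, (128, 128))
    filtered = median_filter(original, size=3)
    print("original  :", np.round(annular_feature(original), 3))
    print("median 3x3:", np.round(annular_feature(filtered), 3))
```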

01 Jan 2017
TL;DR: This dissertation proposes computational methods that aim to support the exploration and sense-making process of large collections of textual digital traces, e.g., emails and social media streams, with the goal of predicting people’s future activity, by leveraging their historic digital traces.
Abstract: In the era of big data, we continuously — and at times unknowingly — leave behind digital traces, by browsing, sharing, posting, liking, searching, watching, and listening to online content. Aggregated, these digital traces can provide powerful insights into the behavior, preferences, activities, and traits of people. While many have raised privacy concerns around the use of aggregated digital traces, it has undisputedly brought us many advances, from the search engines that enable our access to unforeseen amounts of data, knowledge, and information, to, e.g., the discovery of previously unknown adverse drug reactions from search engine logs. Whether in online services, journalism, digital forensics, law, or research, we increasingly set out to exploring large amounts of digital traces to discover new information. Consider for instance, the Enron scandal, Hillary Clinton’s email controversy, or the Panama Papers: cases that revolve around analyzing, searching, investigating, exploring, and turning upside down large amounts of digital traces to gain new insights, knowledge, and information. This discovery task is at its core about “finding evidence of activity in the real world.” This dissertation revolves around discovery in digital traces. We propose computational methods that aim to support the exploration and sense-making process of large collections of textual digital traces, e.g., emails and social media streams. We address methods for analyzing the textual content of digital traces, and the contexts in which they are created, with the goal of predicting people’s future activity, by leveraging their historic digital traces.

11 Dec 2017
TL;DR: This research proposes the use of blockchain for forensic applications, in particular to bring integrity and tamper resistance to the digital forensics chain of custody.
Abstract: Digital evidence plays an important role in cyber crime investigation, as it is used to link persons with criminal activities. Thus it is of extreme importance to guarantee the integrity, authenticity, and auditability of digital evidence as it moves along different levels of the hierarchy in the chain of custody during a cyber crime investigation. Blockchain technology's capability of providing a comprehensive view of transactions (events/actions) back to their origination holds enormous promise for the forensic community. In this research we propose the use of blockchain for forensic applications, in particular to bring integrity and tamper resistance to the digital forensics chain of custody.
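The proposal is architectural; the core property it builds on, tamper evidence through hash chaining of custody events, can be shown in a few lines. The sketch below is a single-writer toy, not a distributed ledger: every entry commits to the hash of the previous one, so altering history is detectable.

```python
import hashlib
import json
import time

# Toy single-writer hash chain illustrating the property a blockchain-backed
# chain of custody relies on: altering any historical custody event breaks
# every later link. A real deployment would anchor these hashes on a
# distributed ledger rather than keep them in a local list.

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_event(chain, evidence_id, action, actor):
    block = {
        "index": len(chain),
        "timestamp": time.time(),
        "evidence_id": evidence_id,
        "action": action,                 # e.g. "seized", "imaged", "transferred"
        "actor": actor,
        "prev_hash": block_hash(chain[-1]) if chain else "0" * 64,
    }
    chain.append(block)
    return block

def chain_is_intact(chain):
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

if __name__ == "__main__":
    custody = []
    append_event(custody, "HDD-0042", "seized", "officer.a")
    append_event(custody, "HDD-0042", "imaged", "examiner.b")
    append_event(custody, "HDD-0042", "transferred", "lab.c")
    print("intact:", chain_is_intact(custody))             # True
    custody[1]["actor"] = "someone.else"                    # tamper with history
    print("after tampering:", chain_is_intact(custody))     # False
```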

Journal ArticleDOI
TL;DR: A cryptographic key protection scheme for Android devices prevents FROST from acquiring the AES key by changing the key's storage location in memory: the key is moved to the fixed position where command line parameters are stored when Android boots, so it is overwritten on reboot.
Abstract: With the flourishing of applications based on the Internet of Things and cloud computing, privacy issues have been attracting a lot of attention. Although the increasing use of full disk encryption (FDE) significantly hampers privacy leakage and digital forensics, cold boot attacks have thwarted FDE since forensic recovery of scrambled telephones (FROST), a forensic tool, was proposed. Cryptographic keys stored on mobile devices are liable to be obtained by FROST. Recent research results have shown that CPU-bound encryption methods can resist FROST. However, these methods perform AES encryption solely on CPU registers, and this advantage comes at the cost of encryption speed. This paper, therefore, presents a cryptographic key protection scheme for Android devices which prevents FROST from acquiring the AES key by changing the storage location of the key in memory. The storage location of the key is switched to the fixed position where command line parameters are stored when Android boots. Therefore, the key is overwritten by command line parameters when the system reboots, which prevents FROST from obtaining the key. Compared with the popular CPU-bound encryption methods, our method has less impact on encryption efficiency and employs no additional storage resources.

Posted Content
TL;DR: In this article, the authors evaluate the applicability of existing digital forensic process models and analyse how each of these might apply to a cloud-based evidence processing paradigm and propose a Digital Forensics as a Service (DFaaS) model.
Abstract: Digital forensic science is very much still in its infancy, but is becoming increasingly invaluable to investigators. A popular area for research is seeking a standard methodology to make the digital forensic process accurate, robust, and efficient. The first digital forensic process model proposed contains four steps: Acquisition, Identification, Evaluation and Admission. Since then, numerous process models have been proposed to explain the steps of identifying, acquiring, analysing, storing, and reporting on the evidence obtained from various digital devices. In recent years, an increasing number of more sophisticated process models have been proposed. These models attempt to speed up the entire investigative process or solve various problems commonly encountered in forensic investigations. In the last decade, cloud computing has emerged as a disruptive technological concept, and most leading enterprises such as IBM, Amazon, Google, and Microsoft have set up their own cloud-based services. In the field of digital forensic investigation, moving to a cloud-based evidence processing model would be extremely beneficial, and preliminary attempts have been made in its implementation. Moving towards a Digital Forensics as a Service model would not only expedite the investigative process, but can also result in significant cost savings, freeing up digital forensic experts and law enforcement personnel to progress their caseload. This paper aims to evaluate the applicability of existing digital forensic process models and analyse how each of these might apply to a cloud-based evidence processing paradigm.

Journal ArticleDOI
01 Nov 2017
TL;DR: The proliferation of mobile devices has led to advanced cybercriminal activities that exploit their ubiquity and there are research opportunities that must be explored to enable more efficient mobile forensic techniques and technologies.
Abstract: The proliferation of mobile devices has led to advanced cybercriminal activities that exploit their ubiquity. Contemporary mobile forensics techniques and the challenges facing forensic investigators are discussed. Also identified are research opportunities that must be explored to enable more efficient mobile forensic techniques and technologies.

Proceedings ArticleDOI
26 Oct 2017
TL;DR: This survey reviews several tools and methods in the literature that extract pieces of evidence from a system and analyze them, and discusses the challenges in collecting and analyzing low-level data from a compromised system.
Abstract: The growth of digital technologies results in the growth of digital crimes. Digital forensics aims to collect crime-related evidence from various digital media and analyze it. This survey reviews several tools and methods in the literature which extract pieces of evidence from the system and analyze them. It also discusses the challenges during the collection and analysis of low-level data from the compromised system.

Journal ArticleDOI
TL;DR: In this research, CloudMe, a popular cloud storage service, is studied, and the types and locations of the artefacts relating to the installation and uninstallation of the CloudMe client application, logging in and out, and file synchronization events from the computer desktop and mobile clients are described.
Abstract: The significant increase in the volume, variety and velocity of data complicates cloud forensic efforts, as such big data will, at some point, become computationally expensive to be fully extracted and analyzed in a timely manner. Thus, it is important for a digital forensic practitioner to have a well-rounded knowledge about the most relevant data artefacts that could be forensically recovered from the cloud product under investigation. In this paper, CloudMe, a popular cloud storage service, is studied. The types and locations of the artefacts relating to the installation and uninstallation of the client application, logging in and out, and file synchronization events from the computer desktop and mobile clients are described. Findings from this research will pave the way towards the development of tools and techniques (e.g. data mining techniques) for cloud-enabled big data endpoint forensics investigation.
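The specific artefact locations are reported in the paper and are not repeated here; as an illustration of how a practitioner might script the triage once those locations are known, the sketch below probes a list of candidate directories for SQLite databases and lists their tables. The directory names are placeholders, not CloudMe's actual paths.

```python
import sqlite3
from pathlib import Path

# Illustration of scripted triage once artefact locations are known. The
# directories below are PLACEHOLDERS, not the CloudMe paths reported in the
# paper: substitute the locations from the study or from a test installation.

CANDIDATE_DIRS = [
    Path.home() / "PLACEHOLDER_cloudme_client_dir",
    Path("./test_extraction"),
]

def list_sqlite_artefacts(directories):
    """Return (path, table names) for every readable SQLite file found."""
    findings = []
    for directory in directories:
        if not directory.exists():
            continue
        for path in directory.rglob("*"):
            if not path.is_file():
                continue
            con = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
            try:
                tables = [row[0] for row in con.execute(
                    "SELECT name FROM sqlite_master WHERE type='table'")]
                findings.append((path, tables))
            except sqlite3.DatabaseError:
                pass                       # not an SQLite database, skip
            finally:
                con.close()
    return findings

if __name__ == "__main__":
    for db_path, tables in list_sqlite_artefacts(CANDIDATE_DIRS):
        print(db_path, "->", ", ".join(tables) or "(no tables)")
```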

Journal ArticleDOI
TL;DR: This paper presents a novel approach to the identification of users from network traffic using only the meta-data of the traffic and the creation of application-level user interactions, which are proven to provide a far richer discriminatory feature set to enable more reliable identity verification.