Author

Shams Zawoad

Bio: Shams Zawoad is an academic researcher from the University of Alabama at Birmingham. The author has contributed to research in the topics of cloud computing and digital forensics. The author has an h-index of 16 and has co-authored 37 publications receiving 933 citations.

Papers
Proceedings ArticleDOI
27 Jun 2015
TL;DR: The first working definition of IoT forensics is given, and a Forensics-aware IoT (FAIoT) model is proposed for supporting reliable forensics investigations in the IoT environment.
Abstract: The Internet of Things (IoT) involves numerous connected smart things with different technologies and communication standards. While IoT opens new opportunities in various fields, it introduces new challenges in the field of digital forensics investigations. The existing tools and procedures of digital forensics cannot cope with the highly distributed and heterogeneous infrastructure of the IoT. Forensics investigators will face challenges in identifying necessary pieces of evidence in the IoT environment and in collecting and analyzing that evidence. In this article, we propose the first working definition of IoT forensics and systematically analyze the IoT forensics domain to explore the challenges and issues in this special branch of digital forensics. We propose a Forensics-aware IoT (FAIoT) model for supporting reliable forensics investigations in the IoT environment.

140 citations

Posted Content
TL;DR: A systematic approach towards understanding the nature and challenges of cloud forensics enables the examination of possible secure solution approaches, leading to increased trust in and adoption of cloud computing, especially in business, healthcare, and national security, and in turn to lower cost and long-term benefit to society as a whole.
Abstract: In recent years, cloud computing has become popular as a cost-effective and efficient computing paradigm. Unfortunately, today's cloud computing architectures are not designed for security and forensics. To date, very little research has been done to develop the theory and practice of cloud forensics. Many factors complicate forensic investigations in a cloud environment. First, the storage system is no longer local. Therefore, even with a subpoena, law enforcement agents cannot confiscate the suspect's computer and get access to the suspect's files. Second, each cloud server contains files from many users. Hence, it is not feasible to seize servers from a data center without violating the privacy of many other users. Third, even if the data belonging to a particular suspect is identified, separating it from other users' data is difficult. Moreover, other than the cloud provider's word, there is usually no evidence that links a given data file to a particular suspect. Because of such challenges, clouds cannot be used to store healthcare, business, or national security related data, which require audit and regulatory compliance. In this paper, we systematically examine the cloud forensics problem and explore the challenges and issues in cloud forensics. We then discuss existing research projects and finally highlight the open problems and future directions in the cloud forensics research area. We posit that our systematic approach towards understanding the nature and challenges of cloud forensics will allow us to examine possible secure solution approaches, leading to increased trust in and adoption of cloud computing, especially in business, healthcare, and national security. This in turn will lead to lower cost and long-term benefit to our society as a whole.

124 citations

Posted Content
TL;DR: This paper introduces Secure-Logging-as-a-Service (SecLaaS), which stores virtual machines' logs and provides access to forensic investigators while ensuring the confidentiality of cloud users, and which preserves proofs of past logs, thus protecting the integrity of the logs from dishonest investigators or cloud providers.
Abstract: Cloud computing has emerged as a popular computing paradigm in recent years. However, today's cloud computing architectures often lack support for computer forensic investigations. Analyzing various logs (e.g., process logs, network logs) plays a vital role in computer forensics. Unfortunately, collecting logs from a cloud is very hard given the black-box nature of clouds and the multi-tenant cloud models, where many users share the same processing and network resources. Researchers have proposed using log APIs or the cloud management console to mitigate the challenges of collecting logs from cloud infrastructure. However, there has been no concrete work that shows how to provide cloud logs to investigators while preserving users' privacy and the integrity of the logs. In this paper, we introduce Secure-Logging-as-a-Service (SecLaaS), which stores virtual machines' logs and provides access to forensic investigators while ensuring the confidentiality of the cloud users. Additionally, SecLaaS preserves proofs of past logs and thus protects the integrity of the logs from dishonest investigators or cloud providers. Finally, we evaluate the feasibility of the scheme by implementing SecLaaS for network access logs in OpenStack, a popular open source cloud platform.

124 citations

Proceedings ArticleDOI
08 May 2013
TL;DR: In this paper, the authors proposed a secure logging-as-a-service (SecLaaS) scheme to store virtual machines' logs and provide access to forensic investigators ensuring the confidentiality of the cloud users.
Abstract: Cloud computing has emerged as a popular computing paradigm in recent years. However, today's cloud computing architectures often lack support for computer forensic investigations. Analyzing various logs (e.g., process logs, network logs) plays a vital role in computer forensics. Unfortunately, collecting logs from a cloud is very hard given the black-box nature of clouds and the multi-tenant cloud models, where many users share the same processing and network resources. Researchers have proposed using log APIs or the cloud management console to mitigate the challenges of collecting logs from cloud infrastructure. However, there has been no concrete work that shows how to provide cloud logs to investigators while preserving users' privacy and the integrity of the logs. In this paper, we introduce Secure-Logging-as-a-Service (SecLaaS), which stores virtual machines' logs and provides access to forensic investigators while ensuring the confidentiality of the cloud users. Additionally, SecLaaS preserves proofs of past logs and thus protects the integrity of the logs from dishonest investigators or cloud providers. Finally, we evaluate the feasibility of the scheme by implementing SecLaaS for network access logs in OpenStack, a popular open source cloud platform.

111 citations
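The tamper-evidence idea behind SecLaaS's log handling can be sketched with a simple hash chain, where each record commits to the hash of its predecessor so that any later alteration breaks the chain. This is an illustrative toy, not the authors' implementation; the function names and the JSON encoding are assumptions.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first record (assumption)

def chain_logs(entries, prev_hash=GENESIS):
    """Build a hash chain over log entries: each record stores the hash
    of its predecessor, so altering any earlier entry breaks the chain."""
    chained = []
    for entry in entries:
        record = {"log": entry, "prev": prev_hash}
        # Canonical JSON encoding so the hash is reproducible by a verifier.
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        record["hash"] = digest
        chained.append(record)
        prev_hash = digest
    return chained

def verify_chain(chained, prev_hash=GENESIS):
    """Recompute every hash; return False on any mismatch."""
    for record in chained:
        expected = hashlib.sha256(
            json.dumps({"log": record["log"], "prev": prev_hash},
                       sort_keys=True).encode()).hexdigest()
        if record["prev"] != prev_hash or record["hash"] != expected:
            return False
        prev_hash = record["hash"]
    return True
```

An investigator holding only the final hash can detect modification, insertion, or deletion of any earlier entry, which is the property the paper relies on against a dishonest logger.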

Journal ArticleDOI
TL;DR: This paper analyzes the threats to cloud users' activity logs, considering collusion between cloud users, providers, and investigators, and proposes Secure-Logging-as-a-Service (SecLaaS), which preserves various logs generated for the activity of virtual machines running in clouds and ensures the confidentiality and integrity of such logs.
Abstract: Collection and analysis of various logs (e.g., process logs, network logs) are fundamental activities in computer forensics. Ensuring the security of the activity logs is therefore crucial for reliable forensics investigations. However, because of the black-box nature of clouds and the volatility and co-mingling of cloud data, providing the cloud logs to investigators while preserving users' privacy and the integrity of the logs is challenging. The current secure logging schemes, which consider the logger as trusted, cannot be applied in clouds since there is a chance that cloud providers (the logger) collude with malicious users or investigators to alter the logs. In this paper, we analyze the threats to cloud users' activity logs considering the collusion between cloud users, providers, and investigators. Based on the threat model, we propose Secure-Logging-as-a-Service (SecLaaS), which preserves various logs generated for the activity of virtual machines running in clouds and ensures the confidentiality and integrity of such logs. Investigators or the court authority can access these logs only through the RESTful APIs provided by SecLaaS, which ensures confidentiality of the logs. The integrity of the logs is ensured by a hash-chain scheme and by proofs of past logs published periodically by the cloud providers. In prior research, we used two accumulator schemes, Bloom filter and RSA accumulator, to build the proofs of past logs. In this paper, we propose a new accumulator scheme, Bloom-Tree, which performs better than the other two accumulators in terms of time and space requirements.

81 citations
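The "proofs of past logs" accumulator can be illustrated with a minimal Bloom filter: the provider inserts a digest of every log entry and periodically publishes the bit array, and a verifier can later check that a disclosed entry was accumulated (no false negatives, a small false-positive rate). This is a sketch under assumed parameters; the RSA accumulator and the paper's Bloom-Tree variant are not shown.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter used as a one-way accumulator over log digests.
    Sizes and hash counts here are arbitrary illustration values."""

    def __init__(self, size=1024, hashes=4):
        self.size = size
        self.hashes = hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive k independent positions by salting SHA-256 with an index.
        for i in range(self.hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = True

    def __contains__(self, item):
        # All k bits set => probably accumulated; any bit clear => definitely not.
        return all(self.bits[p] for p in self._positions(item))
```

Publishing the bit array commits the provider to the set of accumulated entries: an entry missing from the published filter can be proven absent, which is what prevents a provider from silently repudiating logs after the fact.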


Cited by
Journal ArticleDOI
TL;DR: The main goal of this study is to holistically analyze the security threats, challenges, and mechanisms inherent in all edge paradigms, while highlighting potential synergies and venues of collaboration.

1,045 citations

Journal Article
TL;DR: DES is based on a block cipher ("Lucifer") developed by Horst Feistel at IBM with a key length of 128 bits; its reduced 56-bit key length has become a security risk, and in 1998 a special-purpose machine built by the Electronic Frontier Foundation (EFF) with 1,800 parallel custom crypto processors recovered a DES key in record time.
Abstract: In 1977, the "Data Encryption Algorithm" (DEA) was declared the American encryption standard for federal agencies by the "National Bureau of Standards" (NBS, later the "National Institute of Standards and Technology", NIST) [NBS_77]. In 1981, the DEA specification was adopted as the ANSI standard "DES" [ANSI_81]. The recommendation of DES as the standard encryption method was limited to five years and was extended for a further five years in 1983, 1988, and 1993. A revised version of the NIST standard [NIST_99] permits DES on a transitional basis for another five years but recommends the use of Triple-DES: a threefold application of DES with three different keys (effective key length: 168 bits) [NIST_99]. DES is based on a block cipher ("Lucifer") developed by Horst Feistel at IBM with a key length of 128 bits. Because the American "National Security Agency" (NSA) ensured that DES received a key length of only 64 bits, of which only 56 bits are relevant, as well as special substitution boxes (the "cryptographic core" of the method) whose design criteria the NSA did not publish, the method was controversial from the start. Critics suspected a secret "trapdoor" in the algorithm that would allow the NSA to decrypt traffic online even without knowledge of the key. This suspicion was never substantiated, but both the growth of computing power and the parallelization of search algorithms now make a 56-bit key length a security risk. Most recently, in 1998, a special-purpose machine developed by the "Electronic Frontier Foundation" (EFF), with 1,800 custom crypto processors working in parallel, found a DES key in a record time of 2.5 days.
To find a successor to DES, NIST announced the search for an "Advanced Encryption Standard" (AES) on January 2, 1997. The goal of this initiative was to find, in close cooperation with research and industry, a symmetric encryption method suitable for effectively encrypting American government data well into the 21st century. To that end, an official "Call for Algorithms" was issued on September 12, 1997. The proposed symmetric encryption algorithms had to meet the following requirements: unclassified and published, available royalty-free worldwide, efficiently implementable in hardware and software, and block ciphers with a block length of 128 bits supporting key lengths of 128, 192, and 256 bits. At the first "AES Candidate Conference" (AES1) on August 20, 1998, NIST published a list of 15 proposed algorithms and invited the expert community to analyze them. The results were presented at the second "AES Candidate Conference" (March 22-23, 1999, in Rome, AES2) and discussed among international cryptologists. The comment period ended on April 15, 1999. Based on the comments and analyses received, NIST selected five candidates, announced publicly on August 9, 1999: MARS (IBM), RC6 (RSA Laboratories), Rijndael (Daemen, Rijmen), Serpent (Anderson, Biham, Knudsen), and Twofish (Schneier, Kelsey, Whiting, Wagner, Hall, Ferguson).

624 citations

Journal ArticleDOI
TL;DR: The purpose of this paper is to identify and discuss the main issues involved in the complex process of IoT-based investigations, particularly all legal, privacy and cloud security challenges, as well as some promising cross-cutting data reduction and forensics intelligence techniques.
Abstract: Today is the era of the Internet of Things (IoT). Recent advances in hardware and information technology have accelerated the deployment of billions of interconnected, smart, and adaptive devices in critical infrastructures like health, transportation, environmental control, and home automation. Transferring data over a network without requiring any kind of human-to-computer or human-to-human interaction brings reliability and convenience to consumers, but also opens a new world of opportunity for intruders and introduces a whole set of unique and complicated questions to the field of digital forensics. Although IoT data could be a rich source of evidence, forensics professionals cope with diverse problems, ranging from the huge variety of IoT devices and non-standard formats to the multi-tenant cloud infrastructure and the resulting multi-jurisdictional litigation. A further challenge is end-to-end encryption, which represents a trade-off between users' right to privacy and the success of the forensics investigation. Due to its volatile nature, digital evidence has to be acquired and analyzed using validated tools and techniques that ensure the maintenance of the chain of custody. Therefore, the purpose of this paper is to identify and discuss the main issues involved in the complex process of IoT-based investigations, particularly all legal, privacy, and cloud security challenges. Furthermore, this work provides an overview of past and current theoretical models in digital forensics science. Special attention is paid to frameworks that aim to extract data in a privacy-preserving manner or to secure the evidence integrity using decentralized blockchain-based solutions. In addition, the present paper addresses the ongoing Forensics-as-a-Service (FaaS) paradigm, as well as some promising cross-cutting data reduction and forensics intelligence techniques.
Finally, several other research trends and open issues are presented, with emphasis on the need for proactive forensics readiness strategies and generally agreed-upon standards.

440 citations

Journal ArticleDOI
TL;DR: This paper provides an overview of existing security and privacy concerns, particularly for the fog computing, and highlights ongoing research effort, open challenges, and research trends in privacy and security issues for fog computing.
Abstract: The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing toward the edge of the network while offloading the cloud data centers and reducing service latency for end users. However, the characteristics of fog computing give rise to new security and privacy challenges. The existing security and privacy measures for cloud computing cannot be directly applied to fog computing due to its features, such as mobility, heterogeneity, and large-scale geo-distribution. This paper provides an overview of existing security and privacy concerns, particularly for fog computing. Afterward, this survey highlights ongoing research efforts, open challenges, and research trends in privacy and security issues for fog computing.

414 citations

Journal ArticleDOI
TL;DR: IoT’s novel factors affecting traditional computer forensics are explored, including its strengths and weaknesses, and several indispensable open research challenges are identified as future research directions.

232 citations