
Showing papers in "Journal of Information Security in 2017"


Journal ArticleDOI
TL;DR: This study presents a new set of Non-Linear Statistical Models that can be used in estimating the probability of being exploited as a function of time and uses available data sources in a probabilistic foundation to enhance the reliability.
Abstract: Obtaining complete information regarding discovered vulnerabilities is extremely difficult. Yet, developing statistical models requires a great deal of such complete information about the vulnerabilities. In our previous studies, we introduced a new concept of the “Risk Factor” of a vulnerability, calculated as a function of time. We introduced the use of a Markovian approach to estimate the probability of a particular vulnerability being at a particular “state” of the vulnerability life cycle. In this study, we further develop our models, use available data sources within a probabilistic foundation to enhance reliability, and introduce some useful new modeling strategies for vulnerability risk estimation. Finally, we present a new set of Non-Linear Statistical Models that can be used to estimate the probability of being exploited as a function of time. Our study is based on the typical security system and vulnerability data that are available. However, our methodology and system structure can be applied to a specific security system by any software engineer, using their own vulnerabilities to obtain the probability of being exploited as a function of time. This information is very important to a company’s security system in its strategic plan to monitor and improve its processes so as to avoid being exploited.
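
To make the Markovian idea concrete, here is a minimal illustrative sketch in Python; the life-cycle states and transition probabilities below are hypothetical placeholders, not the fitted values from the study.

    # Hypothetical vulnerability life-cycle Markov chain (illustrative values only).
    import numpy as np

    # States: 0 = discovered, 1 = disclosed, 2 = patched (absorbing), 3 = exploited (absorbing)
    P = np.array([
        [0.70, 0.20, 0.05, 0.05],
        [0.00, 0.60, 0.25, 0.15],
        [0.00, 0.00, 1.00, 0.00],
        [0.00, 0.00, 0.00, 1.00],
    ])

    state = np.array([1.0, 0.0, 0.0, 0.0])   # the vulnerability starts as "discovered"
    for t in range(1, 13):                    # e.g. 12 monthly time steps
        state = state @ P
        print(f"t={t:2d}  P(exploited) = {state[3]:.3f}")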

21 citations


Journal ArticleDOI
TL;DR: An extensive and well-constructed overview of using machine learning for the problem of detecting anomalies in streaming datasets and shows how a combination of techniques from different compositions can solve a prominent problem, anomaly detection.
Abstract: This survey aims to deliver an extensive and well-constructed overview of using machine learning for the problem of detecting anomalies in streaming datasets. The objective is to demonstrate the effectiveness of Hoeffding Trees as a machine learning algorithm for detecting anomalies in streaming cyber datasets. In this survey we categorize the existing research works on Hoeffding Trees that are feasible for this type of study into the following: surveying distributed Hoeffding Trees, surveying ensembles of Hoeffding Trees and surveying existing techniques using Hoeffding Trees for anomaly detection. These categories are referred to as compositions within this paper and were selected based on their relation to streaming data and the flexibility of their techniques for use within different domains of streaming data. We discuss how combining the techniques of the proposed research works within these compositions can be used to address the anomaly detection problem in streaming cyber datasets. The goal is to show how a combination of techniques from different compositions can solve a prominent problem, anomaly detection.
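
Hoeffding Trees decide whether a leaf has seen enough streaming examples to split by comparing the gain gap between candidate attributes with the Hoeffding bound. The sketch below is a generic illustration of that bound, not code from any of the surveyed works.

    # Hoeffding bound used by Hoeffding Trees to decide when a split is statistically safe.
    import math

    def hoeffding_bound(value_range, delta, n):
        """epsilon such that the true mean lies within epsilon of the observed
        mean with probability 1 - delta after n independent observations."""
        return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

    # Split when the gain gap between the two best attributes exceeds epsilon.
    gain_best, gain_second = 0.42, 0.30
    epsilon = hoeffding_bound(value_range=1.0, delta=1e-7, n=500)
    print(epsilon, gain_best - gain_second > epsilon)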

20 citations


Journal ArticleDOI
TL;DR: An effective gray image cryptosystem containing Arnold cat map for pixel permutation and an improved Logistic map for the generation of encryption keys to be used for pixel modification is proposed and the proposed algorithm has superior security and effectively encrypts and decrypts the gray images.
Abstract: In this paper, we propose an effective gray image cryptosystem that uses the Arnold cat map for pixel permutation and an improved Logistic map for the generation of encryption keys used for pixel modification. Firstly, a new chaotic map is designed to show better performance than the standard one in terms of key space range, complexity and uniformity. The generated secret key is not only sensitive to the control parameters and initial condition of the improved map but also depends strongly on the plain image characteristics, which provides effective resistance against statistical and differential attacks. Additionally, to increase the encryption strength of the cryptosystem, both confusion and diffusion processes are performed with different keys in every iteration. Theoretical analysis and simulation results confirm that the proposed algorithm has superior security and effectively encrypts and decrypts gray images.
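
For readers unfamiliar with the two building blocks, the sketch below permutes a square gray image with the standard Arnold cat map and diffuses it with a logistic-map keystream. It uses the standard maps and arbitrary parameters, not the paper's improved Logistic map or its plain-image-dependent key schedule.

    # Illustrative confusion/diffusion round: Arnold cat map + logistic-map keystream XOR.
    import numpy as np

    def arnold_cat(img, iterations=1):
        n = img.shape[0]                           # assumes a square N x N image
        x, y = np.indices((n, n))
        out = img.copy()
        for _ in range(iterations):
            scrambled = np.empty_like(out)
            scrambled[(x + y) % n, (x + 2 * y) % n] = out   # (x, y) -> (x + y, x + 2y) mod n
            out = scrambled
        return out

    def logistic_keystream(length, x0=0.618, r=3.99):
        ks, x = [], x0
        for _ in range(length):
            x = r * x * (1.0 - x)                  # logistic map iteration
            ks.append(int(x * 256) % 256)
        return np.array(ks, dtype=np.uint8)

    img = (np.arange(64 * 64) % 256).astype(np.uint8).reshape(64, 64)
    permuted = arnold_cat(img, iterations=5)                          # confusion
    cipher = permuted ^ logistic_keystream(64 * 64).reshape(64, 64)   # diffusion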

15 citations


Journal ArticleDOI
TL;DR: Various steganalysis methods, different filtering based preprocessing methods, feature extraction methods, and machine learning based classification methods are surveyed for the proper identification of steganography in the image.
Abstract: Steganography is the process of hiding data in a public digital medium for secret communication. The image in which the secret data is hidden is termed the stego image. The detection of hidden embedded data in an image is the foundation of blind image steganalysis. The appropriate selection of cover file type and composition contributes to successful embedding. A large number of steganalysis techniques are available for the detection of steganography in images. The performance of a steganalysis technique depends on its ability to extract discriminative features for the identification of statistical changes in the image caused by the embedded data. The issue encountered in blind image steganalysis is the non-availability of knowledge about the steganography technique applied to the image. This paper surveys various steganalysis methods, different filtering-based preprocessing methods, feature extraction methods, and machine learning based classification methods for the proper identification of steganography in images.

15 citations


Journal ArticleDOI
TL;DR: This study has developed a predictive analytic model for three popular Desktop Operating Systems, namely, Windows 7, Mac OS X, and Linux Kernel by using their reported vulnerabilities on the National Vulnerability Database (NVD).
Abstract: Vulnerability forecasting models help us to predict the number of vulnerabilities that may occur in the future for a given Operating System (OS). The few existing models focus on quantifying future vulnerabilities without consideration of trend, level, seasonality and non-linear components of vulnerabilities. Unlike traditional ones, we propose a vulnerability analytic prediction model based on linear and non-linear approaches via time series analysis. We have developed models based on Auto Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN), and Support Vector Machine (SVM) settings. The model that provides the minimum error rate is selected for prediction of future vulnerabilities. Utilizing a time series approach, this study has developed a predictive analytic model for three popular Desktop Operating Systems, namely, Windows 7, Mac OS X, and Linux Kernel, by using their reported vulnerabilities on the National Vulnerability Database (NVD). Based on these reported vulnerabilities, we predict their future behavior so that OS companies can make strategic and operational decisions such as secure deployment of the OS, backup provisioning, disaster recovery, diversity planning, and maintenance scheduling. Similarly, it also helps in assessing current security risks, estimating the resources needed for handling potential security breaches, and planning future releases of security patches. The proposed non-linear analytic models produce very good prediction results in comparison to linear time series models.
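
As a hedged illustration of the linear (ARIMA) part of such a pipeline, the sketch below fits an ARIMA model to a toy monthly vulnerability-count series and forecasts ahead; it assumes the statsmodels package and invented data, not the NVD series or the model orders selected in the study.

    # Toy ARIMA forecast of monthly vulnerability counts (illustrative data, assumes statsmodels).
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    counts = np.array([30, 28, 35, 40, 38, 45, 50, 47, 55, 60, 58, 66], dtype=float)

    model = ARIMA(counts, order=(1, 1, 1)).fit()   # (p, d, q) would be chosen by error rate in practice
    print(model.forecast(steps=3))                 # predicted counts for the next 3 periods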

12 citations


Journal ArticleDOI
TL;DR: A stochastic model is proposed to quantify the risk associated with the overall network using Markovian process in conjunction with Common Vulnerability Scoring System (CVSS) framework and uses host access graph to represent the network environment.
Abstract: There are several security metrics developed to protect computer networks. In general, common security metrics focus on qualitative and subjective aspects of networks and lack formal statistical models. In the present study, we propose a stochastic model to quantify the risk associated with the overall network using a Markovian process in conjunction with the Common Vulnerability Scoring System (CVSS) framework. The model we developed uses a host access graph to represent the network environment. Utilizing the developed model, one can filter the large amount of information available by making a priority list of the vulnerable nodes existing in the network. Once a priority list is prepared, network administrators can make software patch decisions. Gaining an in-depth understanding of the risk and priority level of each host helps individuals to implement decisions such as deployment of security products and design of network topologies.
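
One plausible reading of such a model can be sketched as follows: treat CVSS-derived exploitability scores on the edges of a host access graph as unnormalized transition weights, row-normalize them into a Markov transition matrix, and rank hosts by their stationary probability. The graph, the scores and this particular formulation are illustrative assumptions, not the paper's exact construction.

    # Hypothetical host access graph with CVSS-derived edge scores -> Markov ranking of hosts.
    import numpy as np

    hosts = ["attacker", "web", "db", "workstation"]
    cvss = np.array([
        [0.0, 8.1, 0.0, 6.5],
        [0.0, 0.0, 9.8, 6.5],
        [0.0, 0.0, 0.0, 4.3],
        [0.0, 8.1, 9.8, 0.0],
    ])

    P = cvss / cvss.sum(axis=1, keepdims=True)      # row-normalize into transition probabilities

    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
    pi = pi / pi.sum()

    for host, risk in sorted(zip(hosts, pi), key=lambda pair: -pair[1]):
        print(f"{host:12s} {risk:.3f}")             # higher value = higher patch priority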

11 citations


Journal ArticleDOI
TL;DR: This architecture sought to outperform the cloud-based architecture and to ensure further enhancements to system performance, especially from the perspective of security, and results indicate that the system achieves faster response and processing times than classical cloud systems.
Abstract: Fog computing is a concept that extends the paradigm of cloud computing to the network edge. The goal of fog computing is to situate resources in the vicinity of end users. As with cloud computing, fog computing provides storage services. The data owners can store their confidential data in many fog nodes, which could cause more challenges for data sharing security. In this paper, we present a novel architecture for data sharing in a fog environment. We explore the benefits of fog computing in addressing one-to-many data sharing applications. This architecture seeks to outperform the cloud-based architecture and to ensure further enhancements to system performance, especially from the perspective of security. We address the security challenges of data sharing, such as fine-grained access control, data confidentiality, collusion resistance, scalability, and the issue of user revocation. Keeping these issues in mind, we secure data sharing in fog computing by combining attribute-based encryption and proxy re-encryption techniques. Findings of this study indicate that our system has faster response and processing times than classical cloud systems. Further, experimental results show that our system has an efficient user revocation mechanism, and that it provides high scalability and sharing of data in real time with low latency.

10 citations


Journal ArticleDOI
TL;DR: This paper investigates the impact of different ICMP-based security attacks on two popular server systems, namely Microsoft’s Windows Server and Apple’s Mac Server OS running on the same hardware platform, and compares their performance under different types of ICMP-based security attacks.
Abstract: There are different types of cyber security attacks that are based on ICMP protocols. Many ICMP protocols are very similar, which may lead security managers to think they have the same impact on victim computer systems or servers. In this paper, we investigate the impact of different ICMP-based security attacks on two popular server systems, namely Microsoft’s Windows Server and Apple’s Mac Server OS running on the same hardware platform, and compare their performance under different types of ICMP-based security attacks.

9 citations


Journal ArticleDOI
TL;DR: The Benford’s law features with support vector machine are proposed for the detection of malicious tampering of JPEG fingerprint images, aimed at protecting against insider attackers and hackers.
Abstract: Tampering of biometric data has attracted a great deal of attention recently. Furthermore, there could be intentional or accidental use of a particular biometric sample instead of another for a particular application. Therefore, there exists a need for a method to detect data tampering, as well as to differentiate biometric samples in cases of intentional or accidental use for a different application. In this paper, fingerprint image tampering is studied. Furthermore, optically acquired fingerprints, synthetically generated fingerprints and contact-less acquired fingerprints are studied for separation purposes using the Benford’s law divergence metric. Benford’s law has been shown in the literature to be very effective in detecting tampering of natural images. In this paper, Benford’s law features with a support vector machine are proposed for the detection of malicious tampering of JPEG fingerprint images. This method is aimed at protecting against insider attackers and hackers. The proposed method detected tampering effectively, with an Equal Error Rate (EER) of 2.08%. The experimental results also illustrate that optically acquired fingerprints, synthetically generated fingerprints and contact-less acquired fingerprints can be separated effectively by the proposed method.
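
A hedged sketch of the feature-extraction step follows: compute the leading-digit histogram of DCT coefficient magnitudes, measure its divergence from Benford's law, and feed the features to an SVM. The transform domain, divergence measure and toy data are assumptions for illustration (relying on scipy and scikit-learn); the paper's exact JPEG-domain features and training data differ.

    # Illustrative Benford's-law features for tamper detection (assumes scipy + scikit-learn).
    import numpy as np
    from scipy.fft import dctn
    from sklearn.svm import SVC

    BENFORD = np.log10(1.0 + 1.0 / np.arange(1, 10))       # expected first-digit distribution

    def first_digit_hist(image):
        coeffs = np.abs(dctn(image.astype(float), norm="ortho")).ravel()
        coeffs = coeffs[coeffs > 1e-6]
        digits = (coeffs / 10 ** np.floor(np.log10(coeffs))).astype(int)   # leading digit 1..9
        hist = np.bincount(digits, minlength=10)[1:10].astype(float)
        return hist / hist.sum()

    def benford_features(image):
        p = first_digit_hist(image)
        divergence = np.sum(p * np.log((p + 1e-12) / BENFORD))             # KL-style divergence
        return np.concatenate([p, [divergence]])

    # Toy training loop with random stand-in images: label 0 = genuine, 1 = tampered.
    rng = np.random.default_rng(0)
    X = np.array([benford_features(rng.integers(0, 256, (64, 64))) for _ in range(20)])
    y = np.array([0] * 10 + [1] * 10)
    clf = SVC(kernel="rbf").fit(X, y)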

7 citations


Journal ArticleDOI
TL;DR: The most common Side Channel Attacks on the implementations of cryptographic algorithms (symmetric: AES and asymmetric: RSA) with the countermeasures against these attacks are presented.
Abstract: Cyber-Physical Systems, or Smart-Embedded Systems, are co-engineered for the integration of physical, computational and networking resources. These resources are used to develop an efficient base for enhancing the quality of services in all areas of life and achieving a classier lifestyle in terms of a required service’s functionality and timing. Cyber-Physical Systems (CPSs) complement the need to have smart products (e.g., homes, hospitals, airports, cities). In other words, they regulate the three kinds of resources available: physical, computational, and networking. This regulation supports communication and interaction between the human world and the digital world to deliver the required intelligence in all scopes of life, including telecommunication, power generation and distribution, and manufacturing. Data security is among the most important issues to be considered in recent technologies. Because Cyber-Physical Systems consist of interacting complex components and middleware, they face real challenges in being secure against cyber-attacks while functioning efficiently and without degrading their performance. This study gives a detailed description of CPSs, their challenges (including cyber-security attacks), characteristics, and related technologies. We also focus on the tradeoff between security and performance in CPSs, and we present the most common Side Channel Attacks on implementations of cryptographic algorithms (symmetric: AES and asymmetric: RSA), together with the countermeasures against these attacks.
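
As a generic example of the kind of side-channel countermeasure discussed above (not specific to the AES/RSA implementations surveyed), a timing-safe comparison avoids the early exit that leaks how many leading bytes of a secret match:

    # Timing side-channel illustration: early-exit comparison vs. constant-time comparison.
    def leaky_compare(secret: bytes, guess: bytes) -> bool:
        if len(secret) != len(guess):
            return False
        for a, b in zip(secret, guess):
            if a != b:            # returns earlier the sooner the guess diverges -> timing leak
                return False
        return True

    def constant_time_compare(secret: bytes, guess: bytes) -> bool:
        if len(secret) != len(guess):
            return False
        diff = 0
        for a, b in zip(secret, guess):
            diff |= a ^ b         # always touches every byte, so timing is independent of the data
        return diff == 0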

7 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide a review of how search engines work in regard to web search queries and user intent, and identify and highlight areas open for further investigative and innovative research in regard to end-user personalized web search privacy.
Abstract: While search engines have become vital tools for searching information on the Internet, privacy issues remain a growing concern due to the technological abilities of search engines to retain user search logs. Although such capabilities might provide enhanced personalized search results, the confidentiality of user intent remains uncertain. Even with web search query obfuscation techniques, another challenge remains: reusing the same obfuscation methods is problematic, given that search engines have enormous computation and storage resources for query disambiguation. A number of web search query privacy procedures involve the cooperation of the search engine, a non-trusted entity in such cases, making query obfuscation even more challenging. In this study, we first provide a review of how search engines work in regard to web search queries and user intent. Secondly, this study reviews the material in a manner accessible to those outside computer science, with the intent to introduce knowledge of web search engines and enable non-computer scientists to approach web search query privacy innovatively. As a contribution, we identify and highlight areas open for further investigative and innovative research in regard to end-user personalized web search privacy, that is, methods that can be executed on the user side without third-party involvement such as search engines. The goal is to motivate future web search obfuscation heuristics that give users control over their personal search privacy.
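
As one toy example of the user-side obfuscation the review motivates, the sketch below pads each real query with dummy queries so the logged stream no longer reflects intent unambiguously; it is purely illustrative and not a scheme from the reviewed literature.

    # Toy client-side query obfuscation: mix the real query with dummy queries.
    import random

    DUMMY_POOL = [
        "weather tomorrow", "banana bread recipe", "local train schedule",
        "python list comprehension", "history of jazz", "used bicycles",
    ]

    def obfuscated_batch(real_query, n_dummies=3):
        batch = random.sample(DUMMY_POOL, n_dummies) + [real_query]
        random.shuffle(batch)      # the search engine sees all queries in a random order
        return batch

    print(obfuscated_batch("knee surgery recovery time"))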

Journal ArticleDOI
TL;DR: The research findings can be used by decision makers and lawmakers to improve existing cyber security laws, and enact laws for data privacy and sharing of open data.
Abstract: This paper presents an innovative Soft Design Science Methodology for improving information systems security using a multi-layered security approach. The study applied Soft Design Science Methodology to address the problematic situation of how information systems security can be improved. In addition, Soft Design Science Methodology was compounded with a mixed research methodology. This holistic approach helped with research methodology triangulation. The study assessed security requirements and developed a framework for improving information systems security. The study carried out a maturity level assessment to determine the security status quo in the education sector in Tanzania. The study identified the security requirements gap (IT security controls, IT security measures) using ISO/IEC 21827: Systems Security Engineering-Capability Maturity Model (SSE-CMM) with a rating scale of 0 - 5. The results of this study show that the maturity level across the security domains is 0.44 out of 5. The findings show that the implementation of IT security controls and security measures for ensuring security goals is lacking or conducted in an ad hoc manner. Thus, to improve the security of information systems, organisations should implement security controls and security measures in each security domain (multi-layer security). This research provides a framework for enhancing information systems security during capturing, processing, storage and transmission of information. This research has several practical contributions. Firstly, it contributes to the body of knowledge of information systems security by providing a set of security requirements for ensuring information systems security. Secondly, it contributes empirical evidence on how information systems security can be improved. Thirdly, it contributes to the applicability of Soft Design Science Methodology for addressing problematic situations in information systems security. The research findings can be used by decision makers and lawmakers to improve existing cyber security laws, and enact laws for data privacy and sharing of open data.

Journal ArticleDOI
TL;DR: It is demonstrated how multiple sequence alignment supplemented with gap penalties leads to viral code signatures that generalize successfully to previously known polymorphic variants of the JS.Cassandra virus.
Abstract: Antiviral software systems (AVSs) have problems in identifying polymorphic variants of viruses without explicit signatures for such variants. Alignment-based techniques from bioinformatics may provide a novel way to generate signatures from consensuses found in polymorphic variant code. We demonstrate how multiple sequence alignment supplemented with gap penalties leads to viral code signatures that generalize successfully to previously known polymorphic variants of the JS.Cassandra virus and to previously unknown polymorphic variants of the W32.CTX/W32.Cholera and W32.Kitti viruses. The implications are that future smart AVSs may be able to generate effective signatures automatically from actual viral code by varying gap penalties to cover both known and unknown polymorphic variants.
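
The alignment machinery involved can be illustrated with a compact Needleman-Wunsch global alignment of two toy "viral code" strings under a tunable gap penalty, from which a consensus signature with wildcard positions is read off. This is generic bioinformatics code with invented sequences, not the authors' pipeline or virus data.

    # Generic global alignment with a gap penalty; consensus keeps matches, wildcards the rest.
    def align(a, b, match=2, mismatch=-1, gap=-2):
        n, m = len(a), len(b)
        score = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            score[i][0] = i * gap
        for j in range(1, m + 1):
            score[0][j] = j * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
        out_a, out_b, i, j = [], [], n, m          # traceback
        while i > 0 or j > 0:
            if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch):
                out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
            elif i > 0 and score[i][j] == score[i - 1][j] + gap:
                out_a.append(a[i - 1]); out_b.append("-"); i -= 1
            else:
                out_a.append("-"); out_b.append(b[j - 1]); j -= 1
        return "".join(reversed(out_a)), "".join(reversed(out_b))

    v1, v2 = "pushmoveaxcallpop", "pushmovebxcallpop"
    a1, a2 = align(v1, v2)
    signature = "".join(x if x == y else "?" for x, y in zip(a1, a2))
    print(a1, a2, signature, sep="\n")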

Journal ArticleDOI
TL;DR: The research rejects the H0 null hypothesis that AHP does not affect the relationship between the information technology analysts’ prioritization of five defense in-depth dependent variables and the independent variables of cost, ease of use, and effectiveness in protecting organizational devices against cyber-attacks.
Abstract: Organizational computing devices are increasingly becoming targets of cyber-attacks, and organizations have become dependent on the safety and security of their computer networks and their organizational computing devices. In response to these threats, the President, legislators, experts, and others have characterized cyber security as a pressing national security issue. Business and government often use defense-in-depth information assurance measures such as firewalls, intrusion detection systems, and password procedures across their enterprises to plan strategically and manage IT security risks. This quantitative study explores whether the analytical hierarchy process (AHP) model can be effectively applied to the prioritization of information assurance defense-in-depth measures. The methods used in this study consisted of emailing study participants a survey requesting that they prioritize five defense-in-depth information assurance measures (anti-virus, intrusion detection, password, smart-cards, and encryption), with responses on a 1 - 5 Likert scale considering standard cost, effectiveness, and perceived ease of use in protecting organizational computing devices. The measures were then weighted based on ranking. A pair-wise comparison of each of the five measures was then made using AHP to determine whether the Likert scale and the AHP model could be effectively applied to the prioritization of information assurance measures to protect organizational computing devices. The findings of the research reject the H0 null hypothesis that AHP does not affect the relationship between the information technology analysts’ prioritization of the five defense-in-depth dependent variables and the independent variables of cost, ease of use, and effectiveness in protecting organizational devices against cyber-attacks.
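
The AHP step can be sketched as follows: a pairwise comparison matrix over the five measures is reduced to a priority vector via its principal eigenvector, with a consistency check. The comparison values below are invented for illustration and are not the survey's results.

    # Illustrative AHP prioritization of five information assurance measures.
    import numpy as np

    measures = ["anti-virus", "intrusion detection", "password", "smart-card", "encryption"]
    A = np.array([
        [1,   3,   2,   5,   1/2],
        [1/3, 1,   1/2, 3,   1/4],
        [1/2, 2,   1,   4,   1/3],
        [1/5, 1/3, 1/4, 1,   1/6],
        [2,   4,   3,   6,   1  ],
    ])

    vals, vecs = np.linalg.eig(A)
    w = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
    w = w / w.sum()                                        # normalized priority weights

    ci = (np.max(np.real(vals)) - len(A)) / (len(A) - 1)   # consistency index
    print(dict(zip(measures, np.round(w, 3))), "CR =", round(ci / 1.12, 3))  # RI for n=5 is ~1.12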

Journal ArticleDOI
TL;DR: The concept of eGovernment is elaborated within the context of the understanding of the lay ICT policy maker and systems users of Ghana, in order to examine the soft or social issues limiting the efforts to design and implement ICT policies in Ghana.
Abstract: It is obvious that no government or business operation can be carried out today without some involvement of ICTs in the process. However, research shows that the policies and implementation of ICTs in delivering public services face several challenges in different countries due to country-level peculiarities and cultures. These challenges are socio-technical, arising from both the soft and hard implications of eGovernment systems. This paper examines the soft, or social, issues limiting the efforts to design and implement ICT policies in Ghana. We identify some of the challenges with respect to policy makers and back- and front-office users of ICT systems. This paper therefore elaborates the concept of eGovernment within the context of the understanding of the lay ICT policy maker and systems users of Ghana. The issues addressed in this article range from the development to the sustainability of eGovernment systems, and cover policies and stages that resonate with the norms, culture, and political and administrative style of Ghana.

Journal ArticleDOI
TL;DR: This work describes security techniques for securing a VCCI and its VMMs, such as Encryption and Key Management (EKM), Access Control Mechanisms (ACMs), Virtual Trusted Platform Module (vTPM), Virtual Firewall (VF), and Trusted Virtual Domains (TVDs).
Abstract: The data and applications in cloud computing reside in cyberspace, allowing users to access data through any connected device; however, when information is transferred over the cloud, the owner loses control of it. There are multiple types of security challenges that must be understood and countered. One of the major security challenges is that the resources of cloud computing infrastructures are provided as services over the Internet, and all data in the cloud reside on networked resources, which enables the data to be accessed through VMs. In this work, we describe security techniques for securing a VCCI and its VMMs, such as Encryption and Key Management (EKM), Access Control Mechanisms (ACMs), Virtual Trusted Platform Module (vTPM), Virtual Firewall (VF), and Trusted Virtual Domains (TVDs). In this paper we focus on the security of virtual resources in a Virtualized Cloud Computing Infrastructure (VCCI) and its Virtual Machine Monitors (VMMs) by describing the types of attacks on a VCCI and the vulnerabilities of VMMs, and we describe the techniques for securing a VCCI.

Journal ArticleDOI
TL;DR: An optimized homomorphic scheme (Op_FHE_SHCR) is proposed which speed up ciphertext (Rc) retrieval and addresses metadata dynamics and authentication through the authors' secure Anonymiser agent and applies an optimized ternary search tries (TST) algorithm in the metadata repository which utilizes Merkle hash tree structure to manage metadata authentication and dynamics.
Abstract: Security assurance has been a paramount cloud-services issue in the most recent decade. Therefore, MapReduce, which is a programming framework for processing and generating huge data collections, should be optimized and securely implemented. However, conventional operations cannot be performed directly on ciphertexts. So there is a foremost need to enable particular sorts of calculations to be done on encrypted data and, additionally, to optimize data processing at the Map stage. Schemes like DGHV and Gen 10 have been presented to address the data privacy issue. However, the private encryption key (DGHV) or the key’s parameters (Gen 10) are sent to an untrusted cloud server, which compromises information security assurance. Therefore, in this paper we propose an optimized homomorphic scheme (Op_FHE_SHCR) which speeds up ciphertext (Rc) retrieval and addresses metadata dynamics and authentication through our secure Anonymiser agent. Additionally, for the efficiency of our proposed scheme regarding computation cost and security investigation, we utilize a scalar homomorphic approach instead of applying a blinding probabilistic polynomial-time calculation, which is computationally expensive. In doing so, we apply an optimized ternary search tries (TST) algorithm in our metadata repository, which utilizes a Merkle hash tree structure to manage metadata authentication and dynamics.
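
The Merkle hash tree component used for metadata authentication can be sketched generically: the root hash authenticates all metadata blocks, so any modification changes the root. This is a standard construction for illustration and is separate from the proposed Op_FHE_SHCR scheme and its TST layout.

    # Generic Merkle hash tree over metadata blocks.
    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(blocks):
        level = [h(b) for b in blocks]
        while len(level) > 1:
            if len(level) % 2:                       # duplicate the last node on odd levels
                level.append(level[-1])
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    metadata = [b"file1:owner=alice", b"file2:owner=bob", b"file3:owner=carol"]
    print(merkle_root(metadata).hex())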

Journal ArticleDOI
TL;DR: An authenticated privacy preserving pairing-based scheme for remote health monitoring systems is proposed, based on the concepts of bilinear pairing, identity-based cryptography and a non-interactive identity-based key agreement protocol, together with an efficient batch signature verification scheme to reduce computation cost during multiple simultaneous signature verifications.
Abstract: The digitization of patient health information has brought many benefits and challenges for both patients and physicians. However, security and privacy preservation have remained important challenges for remote health monitoring systems. Since a patient’s health information is sensitive and the communication channel (i.e. the Internet) is insecure, it is important to protect it against unauthorized entities. Otherwise, failure to do so will not only compromise the patient’s privacy, but will also put his/her life at risk. How to provide confidentiality, patient anonymity and un-traceability, access control to a patient’s health information, and even key exchange between a patient and her physician are critical issues that need to be addressed if wider adoption of remote health monitoring systems is to be realized. This paper proposes an authenticated privacy preserving pairing-based scheme for remote health monitoring systems. The scheme is based on the concepts of bilinear pairing, identity-based cryptography and a non-interactive identity-based key agreement protocol. The scheme also incorporates an efficient batch signature verification scheme to reduce computation cost during multiple simultaneous signature verifications.
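
For context, non-interactive identity-based key agreement is typically realized along the lines of the Sakai-Ohgishi-Kasahara construction: with master secret s, identity hash Q_ID = H(ID) and issued private key S_ID = s·Q_ID, two users compute the same shared key K_AB = e(S_A, Q_B) = e(Q_A, Q_B)^s = e(Q_A, S_B) using the bilinear pairing e, without exchanging any messages. Whether the paper uses exactly this construction is not stated in the abstract.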

Journal ArticleDOI
TL;DR: The paper describes efforts to implement PC-Encoding to harden portable binaries in ELF (Executable and Linkable Format); the technique is simple and intuitive to implement and incurs little overhead.
Abstract: ARM® is the prevalent processor architecture for embedded and mobile applications. For smartphones, it is the processor on which software applications run, whether the platform is Apple’s iOS or Google’s Android. Software operations on these platforms are prone to a semantic gap, which refers to the potential difference between the intended operations described in software and the actual operations performed by the processor. Attacks that compromise program control flows, which result in these semantic gaps, are a major attack type in modern software attacks. Many recent software protection schemes for servers and desktops focus on protecting program control flows, but there are few protection tools available for protecting the program control flows of mobile applications on the ARM processor architecture. This paper uses a program counter (PC) encoding technique (PC-Encoding) to harden program control flows under the ARM processor architecture. PC-Encoding directly encodes control flow target addresses that will be loaded into the PC. It is simple and intuitive to implement and incurs little overhead. Encoding the control flow target addresses can minimize the semantic gap by preventing potential compromises of the control flows. This paper describes our efforts to implement PC-Encoding to harden portable binaries in ELF (Executable and Linkable Format).

Journal ArticleDOI
TL;DR: The results show that the DDPA algorithm satisfies the user’s privacy requirements in social networks while reducing the execution time brought by iterations and reducing the information loss rate of the graph structure.
Abstract: A social network contains the interactions between social members, which constitute the structure and attributes of the network. The interactive relationships in a social network contain a great deal of personal privacy information, so the direct release of social network data will cause the disclosure of private information. Aiming at the dynamic characteristics of social network data release, a new dynamic social network data publishing method based on differential privacy is proposed; it is consistent with differential privacy and is named DDPA (Dynamic Differential Privacy Algorithm). DDPA is an improvement of privacy protection algorithms for static social network data publishing. DDPA adds Laplace-distributed noise to network edge weights, and identifies the edge weight information that changes as the number of iterations increases, adding to the privacy protection budget. Through experiments on real data sets, the results show that the DDPA algorithm satisfies the user’s privacy requirements in social networks. DDPA reduces the execution time brought by iterations and reduces the information loss rate of the graph structure.
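
The Laplace mechanism step that DDPA applies to edge weights can be sketched generically: noise drawn from a Laplace distribution with scale sensitivity/epsilon is added to each weight. The iteration-aware budget tracking described above is not reproduced here, and the edge data are invented.

    # Generic Laplace mechanism on social-network edge weights.
    import numpy as np

    def perturb_edge_weights(weights, epsilon, sensitivity=1.0):
        rng = np.random.default_rng()
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=len(weights))
        return weights + noise

    edges = {("u1", "u2"): 3.0, ("u2", "u3"): 1.0, ("u1", "u3"): 5.0}
    noisy = perturb_edge_weights(np.array(list(edges.values())), epsilon=0.5)
    print(dict(zip(edges, np.round(noisy, 2))))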

Journal ArticleDOI
TL;DR: Performance comparison with leading symmetric algorithms (DES, AES and RC6) demonstrated AMSC’s efficiency in execution time and security analysis showed that AMSC is secure against cipher-text only and known plain-text attacks.
Abstract: This paper introduces and evaluates the performance of a novel cipher scheme, Ambiguous Multi-Symmetric Cryptography (AMSC), which conceals multiple coherent plain-texts in one cipher-text. The cipher-text can be decrypted by different keys to produce different plain-texts. Security analysis showed that AMSC is secure against cipher-text only and known plain-text attacks. AMSC has the following applications: 1) it can send multiple messages for multiple receivers through one cipher-text; 2) it can send one real message and multiple decoys for camouflage; and 3) it can send one real message to one receiver using parallel processing. Performance comparison with leading symmetric algorithms (DES, AES and RC6) demonstrated AMSC’s efficiency in execution time.
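
AMSC's defining property, one ciphertext that decrypts to different plaintexts under different keys, can be demonstrated with a toy Chinese-Remainder-Theorem construction. This is emphatically not the AMSC algorithm from the paper and is not secure; it only illustrates the property.

    # Toy (insecure) demonstration: one ciphertext, several keys, several plaintexts (CRT).
    from math import gcd

    def crt_combine(messages, moduli):
        assert all(gcd(a, b) == 1 for i, a in enumerate(moduli) for b in moduli[i + 1:])
        N = 1
        for m in moduli:
            N *= m
        c = 0
        for msg, m in zip(messages, moduli):
            Ni = N // m
            c += msg * Ni * pow(Ni, -1, m)           # pow(..., -1, m) is the modular inverse
        return c % N

    keys = [257, 263, 269]                            # each receiver's "key" is a modulus
    plaintexts = [65, 120, 200]                       # one small message per receiver
    cipher = crt_combine(plaintexts, keys)
    print([cipher % k for k in keys])                 # each key recovers only its own plaintext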

Journal ArticleDOI
TL;DR: This framework makes use of homomorphic encryption and secure multi-party computation to develop a series of protocols for private integer comparison and (non-) membership testing and provides a cost-efficient and secure solution for physical document verification.
Abstract: Physical document verification is a necessary task in the process of reviewing applications for a variety of services, such as loans, insurance, and mortgages. This process consumes a large amount of time, money, and human resources, which leads to limited business throughput. Furthermore, physical document verification poses a critical risk to clients’ personal information, as they are required to provide sensitive details and documents to verify their information. In this paper, we present a systematic approach to address shortcomings in the current state of the processes used for physical document verification. Our solution leverages a semi-trusted party data source (i.e. a governmental agency) and cryptographic protocols to provide a secure digital service. We make use of homomorphic encryption and secure multi-party computation to develop a series of protocols for private integer comparison and (non-) membership testing. Secure boolean evaluation and secure result aggregation schemes are proposed to combine the results of the evaluation of multiple predicates and produce the final outcome of the verification process. We also discuss possible improvements and other applications of the proposed secure system of protocols. Our framework not only provides a cost-efficient and secure solution for document verification, but also creates space for a new service.
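
The private comparison such protocols build on can be sketched with an additively homomorphic cryptosystem: the verifier learns only whether a blinded difference decrypts to zero. The sketch below assumes the third-party `phe` Paillier library and invented values; it is not the paper's exact protocol.

    # Illustrative private equality test with Paillier encryption (assumes the `phe` package).
    import secrets
    from phe import paillier

    public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

    client_value = 1987          # e.g. a birth year claimed by the applicant
    authority_value = 1987       # the value held by the semi-trusted data source

    enc = public_key.encrypt(client_value)
    r = secrets.randbelow(2 ** 32) + 1
    blinded = (enc - authority_value) * r     # E(r * (client_value - authority_value))

    # The decrypting party learns only "equal / not equal", not the actual difference.
    print("match" if private_key.decrypt(blinded) == 0 else "no match")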

Journal ArticleDOI
TL;DR: The results of this study indicate that the Jordanian digital law requires some enhancements in order to cope with the ever-changing nature of digital crimes.
Abstract: The nature of crime has changed dramatically in the new digital era. It is no longer based on violence but on the criminal’s computer abilities and technical expertise. This paper presents a comprehensive comparison between the Jordanian digital law of 2015 and the Omani information technology digital crime law of 2010. The results of this study indicate that the Jordanian digital law requires some enhancements in order to cope with the ever-changing nature of digital crimes.