Showing papers in "Journal of Information Security in 2012"


Journal ArticleDOI
TL;DR: The EER results of the combined systems prove that the ECG is an excellent source of supplementary information for a multibiometric system, even though it shows only moderate performance in a unimodal framework.
Abstract: This paper presents an evaluation of a new biometric, the electrocardiogram (ECG), for individual authentication. We report the potential of ECG as a biometric and address the research concerns in using an ECG-enabled biometric authentication system across a range of conditions. We present a method to delineate ECG waveforms and their end fiducials from each heartbeat. A new authentication strategy is proposed in this work, which uses the delineated features and decides the identity of an individual with respect to the template database on the basis of match scores. Performance of the system is evaluated in a unimodal framework and in a multibiometric framework where ECG is combined with the face biometric and with the fingerprint biometric. The equal error rate (EER) of the unimodal system is 10.8%, while the EERs of the multibiometric systems are 3.02% and 1.52% for ECG combined with the face biometric and with the fingerprint biometric, respectively. The EER results of the combined systems prove that the ECG is an excellent source of supplementary information for a multibiometric system, even though it shows only moderate performance in a unimodal framework. We critically evaluate the concerns involved in using ECG as a biometric for individual authentication, such as the lack of standardization of signal features and the presence of acquisition variations that make the data representation more difficult. In order to determine large-scale performance, the individuality of ECG remains to be examined.
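
The equal error rate quoted above is the operating point where the false acceptance rate equals the false rejection rate. A minimal sketch of how an EER can be estimated from genuine and impostor match scores is shown below; the synthetic score distributions and the simple threshold sweep are illustrative assumptions, not the authors' delineation or fusion method.

```python
import numpy as np

def estimate_eer(genuine, impostor):
    """Estimate the equal error rate (EER) from match-score samples.

    genuine:  scores of matching (same-person) comparisons
    impostor: scores of non-matching comparisons
    Higher scores are assumed to indicate a better match.
    """
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])  # false accept rate
    frr = np.array([(genuine < t).mean() for t in thresholds])    # false reject rate
    idx = np.argmin(np.abs(far - frr))   # point where FAR is closest to FRR
    return (far[idx] + frr[idx]) / 2.0, thresholds[idx]

# Illustrative score distributions (not data from the paper)
rng = np.random.default_rng(0)
genuine = rng.normal(0.7, 0.1, 1000)
impostor = rng.normal(0.4, 0.1, 1000)
eer, thr = estimate_eer(genuine, impostor)
print(f"EER ~ {eer:.3%} at threshold {thr:.3f}")
```

Score-level fusion of ECG with face or fingerprint can be as simple as a weighted sum of normalized match scores computed before this threshold sweep.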

100 citations


Journal ArticleDOI
TL;DR: The accuracy and time results of a text independent automatic speaker recognition (ASR) system, based on Mel-Frequency Cepstrum Coefficients (MFCC) and Gaussian Mixture Models (GMM), are shown in order to develop a security control access gate.
Abstract: The aim of this paper is to show the accuracy and time results of a text-independent automatic speaker recognition (ASR) system, based on Mel-Frequency Cepstrum Coefficients (MFCC) and Gaussian Mixture Models (GMM), in order to develop a security control access gate. 450 speakers were randomly extracted from the Voxforge.org audio database; their utterances were enhanced using spectral subtraction, then MFCCs were extracted and statistically modeled by GMMs in order to build each speaker profile. For each speaker two different speech files were used: the first one to build the profile database, the second one to test the system performance. The accuracy achieved by the proposed approach is greater than 96%, and the time spent for a single test run, implemented in Matlab, is about 2 seconds on a common PC.
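
As a rough illustration of the MFCC/GMM pipeline described above (not the authors' Matlab code), the following sketch enrolls one speaker and scores a test utterance; it assumes the third-party librosa and scikit-learn packages, and the WAV file names are placeholders.

```python
import librosa
from sklearn.mixture import GaussianMixture

def mfcc_features(wav_path, n_mfcc=13):
    """Load an utterance and return its MFCC frames as (frames, n_mfcc)."""
    signal, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
    return mfcc.T

# Enrollment: fit one GMM per speaker on the first utterance (placeholder file names).
enroll = mfcc_features("speaker01_enroll.wav")
gmm = GaussianMixture(n_components=16, covariance_type="diag", max_iter=200)
gmm.fit(enroll)

# Verification: average per-frame log-likelihood of the second utterance.
test = mfcc_features("speaker01_test.wav")
score = gmm.score(test)
print("average log-likelihood:", score)
# In a closed-set system, the recognized identity is the speaker whose GMM
# yields the highest score over all enrolled models.
```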

49 citations


Journal ArticleDOI
TL;DR: A comprehensive hardware evaluation of the final-round SHA-3 candidates is presented, based on a comparison of the finalists in terms of security level, throughput, clock frequency, area, power consumption, and cost.
Abstract: Secure Hashing Algorithms (SHA) are of significant importance in today's information security applications. The National Institute of Standards and Technology (NIST) held a three-round competition to replace SHA-1 and SHA-2 with the new SHA-3, to ensure the long-term robustness of hash functions. In this paper, we present a comprehensive hardware evaluation of the final-round SHA-3 candidates. The main goal of the hardware evaluation is to find the algorithm among them that best satisfies the new hashing algorithm standards defined by NIST. This is based on a comparison of the finalists in terms of security level, throughput, clock frequency, area, power consumption, and cost. We expect that the achieved results of the comparisons will contribute to choosing the next hashing algorithm (SHA-3) that will support the security requirements of applications in today's ubiquitous and pervasive information infrastructure.

36 citations


Journal ArticleDOI
TL;DR: This paper proposes an unsupervised multi-level non-negative matrix factorization model to extract the hidden data structure and seek the rank of the base matrix, and demonstrates that this approach is able to retrieve the hidden structure of data and determine the correct rank of the base matrix.
Abstract: Rank determination is one of the most significant issues in non-negative matrix factorization (NMF) research. However, the rank determination problem has not received as much emphasis as the sparseness regularization problem. Usually, the rank of the base matrix needs to be assumed. In this paper, we propose an unsupervised multi-level non-negative matrix factorization model to extract the hidden data structure and seek the rank of the base matrix. From a machine learning point of view, the learning result depends on prior knowledge. In our unsupervised multi-level model, we construct a three-level data structure for the non-negative matrix factorization algorithm. Such a construction applies more prior knowledge to the algorithm and obtains a better approximation of the real data structure. The final bases selection is achieved through L2-norm optimization. We carry out our experiments on binary datasets. The results demonstrate that our approach is able to retrieve the hidden structure of the data and thus determine the correct rank of the base matrix.
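
The paper's multi-level construction is not spelled out in the abstract. As a point of reference only, the baseline task of choosing the rank of the base matrix can be illustrated with scikit-learn's standard NMF by sweeping candidate ranks and watching the reconstruction error; this is a hedged sketch on synthetic data, not the authors' algorithm.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(42)
# Synthetic non-negative data with an underlying rank of 4 (illustrative only).
W_true = rng.random((100, 4))
H_true = rng.random((4, 50))
X = W_true @ H_true

errors = {}
for k in range(1, 9):
    model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(X)
    H = model.components_
    errors[k] = np.linalg.norm(X - W @ H)  # Frobenius reconstruction error

for k, err in errors.items():
    print(f"rank {k}: reconstruction error {err:.4f}")
# The error typically drops sharply up to the true rank (4 here) and then flattens,
# which is one simple heuristic for rank determination.
```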

35 citations


Journal ArticleDOI
TL;DR: A new tool is proposed which combines digital forensic investigation and crime data mining, designed to find the motive and pattern of cyber attacks and the number of attacks of each type that occurred during a period.
Abstract: Digital forensics is the science of identifying, extracting, analyzing and presenting digital evidence that has been stored in digital devices. Various digital tools and techniques are used to achieve this. Our paper explains forensic analysis steps for storage media, hidden data analysis in the file system, network forensic methods and cyber crime data mining. This paper proposes a new tool which combines digital forensic investigation and crime data mining. The proposed system is designed to find the motive and pattern of cyber attacks and the number of attacks of each type that occurred during a period. Hence the proposed tool enables system administrators to minimize system vulnerability.

32 citations


Journal ArticleDOI
TL;DR: Experimental results show that feature reduction can improve the detection rate for the category-based detection approach while maintaining the detection accuracy within an acceptable range; the KNN classification method is used for the classification of the attacks.
Abstract: Existing Intrusion Detection Systems (IDS) examine all the network features to detect intrusion or misuse patterns. In feature-based intrusion detection, some selected features may be found to be redundant, useless or less important than the rest. This paper proposes a category-based selection of effective parameters for intrusion detection using Principal Components Analysis (PCA). In this paper, 32 basic features from the TCP/IP header and 116 derived features from TCP dump are selected in a network traffic dataset. Attacks are categorized into four groups: Denial of Service (DoS), Remote to User attack (R2L), User to Root attack (U2R) and Probing attack. TCP dump from the DARPA 1998 dataset is used in the experiments as the selected dataset. The PCA method is used to determine an optimal feature set to make the detection process faster. Experimental results show that feature reduction can improve the detection rate for the category-based detection approach while maintaining the detection accuracy within an acceptable range. In this paper the KNN classification method is used for the classification of the attacks. Experimental results show that feature reduction significantly speeds up the training and testing phases for identification of intrusion attempts.
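
A minimal sketch of the PCA-plus-KNN pipeline the abstract describes is given below using scikit-learn; the synthetic feature matrix, label encoding, and the numbers of components and neighbors are placeholders for illustration, not the DARPA 1998 setup or the paper's category-based selection.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Placeholder feature matrix (e.g., header and derived traffic features) and attack labels.
rng = np.random.default_rng(1)
X = rng.random((2000, 41))
y = rng.integers(0, 5, 2000)   # 0 = normal, 1..4 = DoS, R2L, U2R, Probe (illustrative)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Reduce the feature space with PCA before the KNN classifier to speed up detection.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),
    KNeighborsClassifier(n_neighbors=5),
)
model.fit(X_train, y_train)
print("accuracy:", model.score(X_test, y_test))
```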

25 citations


Journal ArticleDOI
TL;DR: A block diagram is proposed which may guide a database forensic examiner in obtaining evidence in an Oracle database for database tamper detection.
Abstract: The most secure database is the one you know the most. Tamper detection compares the past and present status of the system and produces digital evidence for forensic analysis. Our focus is on different methods for identifying different locations in an Oracle database for collecting digital evidence for database tamper detection. Starting with the basics of Oracle architecture and continuing with the basic steps of forensic analysis, the paper elaborates the extraction of suspicious locations in Oracle. For a forensic examiner, collecting digital evidence in a database is a key factor. A planned and modelled way of examination will lead to valid detection. Based on the literature survey conducted on different aspects of collecting digital evidence for database tamper detection, the paper proposes a block diagram which may guide a database forensic examiner in obtaining the evidence.

24 citations


Journal ArticleDOI
TL;DR: In this paper, the authors compare the performance of a new 4-buffer SHA-256 S-HASH implementation to that of standard serial hashing, and demonstrate that for some usage models, SHA-256 is significantly faster than commonly perceived.
Abstract: We describe a method for efficiently hashing multiple messages of different lengths. Such computations occur in various scenarios, and one of them is when an operating system checks the integrity of its components during boot time. These tasks can gain performance by parallelizing the computations and using SIMD architectures. For such scenarios, we compare the performance of a new 4-buffer SHA-256 S-HASH implementation to that of standard serial hashing. Our results are measured on the 2nd Generation Intel® Core™ Processor, and demonstrate SHA-256 processing at effectively ~5.2 Cycles per Byte, when hashing from any of the three cache levels, or from the system memory. This represents a speedup by a factor of 3.42x compared to OpenSSL (1.0.1), and by 2.25x compared to the recent and faster n-SMS method. For hashing from a disk, we show an effective rate of ~6.73 Cycles/Byte, which is almost 3 times faster than OpenSSL (1.0.1) under the same conditions. These results indicate that for some usage models, SHA-256 is significantly faster than commonly perceived.

24 citations


Journal ArticleDOI
TL;DR: A comprehensive security framework based on Multi-Agent System (MAS) architecture for CDS to facilitate confidentiality, correctness assurance, availability and integrity of users' data in the cloud is proposed.
Abstract: The tremendous growth of cloud computing environments requires new architectures for security services. Cloud computing is the utilization of many servers/data centers or cloud data storages (CDSs) housed in many different locations and interconnected by high speed networks. CDS, like any other emerging technology, is experiencing growing pains: it is immature, fragmented and lacks standardization. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force and we need to provide security mechanisms to ensure its secure adoption. In this paper a comprehensive security framework based on Multi-Agent System (MAS) architecture for CDS is proposed to facilitate confidentiality, correctness assurance, availability and integrity of users' data in the cloud. Our security framework consists of two main layers: the agent layer and the CDS layer. Our proposed MAS architecture includes five main types of agents: Cloud Service Provider Agent (CSPA), Cloud Data Confidentiality Agent (CDConA), Cloud Data Correctness Agent (CDCorA), Cloud Data Availability Agent (CDAA) and Cloud Data Integrity Agent (CDIA). In order to verify our proposed security framework based on MAS architecture, a pilot study is conducted using a questionnaire survey. Rasch Methodology is used to analyze the pilot data. Item reliability is found to be poor, and a few respondents and items are identified as misfits with distorted measurements. As a result, some problematic questions are revised and some predictably easy questions are excluded from the questionnaire. A prototype of the system is implemented using Java. To simulate the agents, Oracle database packages and triggers are used to implement agent functions, and Oracle jobs are utilized to create agents.

22 citations


Journal ArticleDOI
TL;DR: This paper uses homomorphic encryption schemes to build a fully secure system that allows users to benefit from location-based services while preserving the confidentiality and integrity of their data.
Abstract: Homomorphic encryption schemes make it possible to perform arithmetic operations, like additions and multiplications, over encrypted values. This capability provides enhanced protection for data and offers new research directions, including blind data processing. Using homomorphic encryption schemes, a Location-Based Service (LBS) can process encrypted inputs to retrieve encrypted location-related information. The retrieved encrypted data can only be decrypted by the user who requested the data. The technology still faces two main challenges: the processing time incurred and the upper limit imposed on the allowed number of operations. However, the protection of users' privacy achieved through this technology makes it attractive for further research and enhancement. In this paper we use homomorphic encryption schemes to build a fully secure system that allows users to benefit from location-based services while preserving the confidentiality and integrity of their data. Our novel system consists of search circuits that allow an executor (i.e. the LBS server) to receive encrypted inputs/requests and then perform a blind search to retrieve encrypted records that match the selection criterion. A querier can send the user's position and the service type he/she is looking for, in encrypted form, to a server, and the server responds to the request without any knowledge of the contents of the request or the retrieved records. We further propose a prototype that improves the practicality of our system.
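
As a toy illustration of the additively homomorphic operations the abstract relies on (not the authors' search-circuit construction), the sketch below uses the third-party python-paillier (phe) package; the choice of that library and the example query value are assumptions on our part.

```python
from phe import paillier

# The user generates a keypair; only the user can decrypt results.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# The user encrypts a query value (e.g., an encoded position coordinate).
enc_x = public_key.encrypt(1375)

# The LBS server can operate blindly on the ciphertext:
enc_shifted = enc_x + 25          # homomorphic addition of a plaintext constant
enc_scaled = enc_x * 3            # homomorphic multiplication by a plaintext scalar

# Only the querier holding the private key can recover the results.
print(private_key.decrypt(enc_shifted))  # 1400
print(private_key.decrypt(enc_scaled))   # 4125
```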

19 citations


Journal ArticleDOI
TL;DR: A comprehensive overview of the insertion and extraction methods used in different semi-fragile watermarking algorithms is presented; the algorithms are described in terms of image parameters and potential applications, and the focus is on their comparison according to these properties.
Abstract: Technology has no limits today; plenty of software is available on the market with which we can alter any image. People often copy images from the Internet and, after making some changes, claim them as their own property. Ensuring digital image integrity has therefore become a major issue. Over the past few years, watermarking has emerged as the leading candidate to solve problems of ownership and content authentication for digital multimedia documents. To protect the authenticity of images, semi-fragile watermarking has attracted much attention from researchers because of its important role in multimedia content authentication. The aim of this paper is to present a survey and a comparison of emerging techniques for image authentication using semi-fragile watermarking. The paper gives a comprehensive overview of the insertion and extraction methods used in different semi-fragile watermarking algorithms, studied in terms of image parameters and potential applications; the algorithms are described with a focus on their comparison according to the properties cited above, and future directions for developing a better image authentication algorithm are suggested.

Journal ArticleDOI
TL;DR: This study reveals more than 70 threats to HIS, classified into 30 common criteria, which may provide some insights to both researchers and professionals who are interested in conducting research in risk management of HIS security.
Abstract: Security remains a critical issue in the safe operation of Information Systems (IS). Identifying the threats to IS may lead to an effective method for measuring security as the initial stage of risk management. Despite many attempts to classify threats to IS, new threats to Health Information Systems (HIS) remain a continual concern for system developers. The main aim of this paper is to present a research agenda on threats to HIS. A cohesive completeness study on the identification of possible threats to HIS was conducted. This study reveals more than 70 threats to HIS. They are classified into 30 common criteria. The abstraction was carried out using secondary data from various research databases. This work-in-progress study will proceed to the next stage of ranking the security threats for assessing risk in HIS. This classification of threats may provide some insights to both researchers and professionals who are interested in conducting research in risk management of HIS security.

Journal ArticleDOI
TL;DR: Experimental results show the proposed method can recover the watermark pattern from the marked image even if major changes are made to the original digital image.
Abstract: A method for digital image copyright protection is proposed in this paper. The proposed method is based on visual cryptography as defined by Naor and Shamir. It works by selecting random pixels from the original digital image instead of a specific selection of pixels. The proposed method does not require the watermark pattern to be embedded into the original digital image; instead, verification information is generated which is used to verify the ownership of the image. This leaves the marked image identical to the original image. The method is based on the relationship between randomly selected pixels and their 8-neighbor pixels. This relationship keeps the marked image robust against diverse attacks even if the most significant bits of the randomly selected pixels have been changed by an attacker, as we will see later in this paper. Experimental results show the proposed method can recover the watermark pattern from the marked image even if major changes are made to the original digital image.

Journal ArticleDOI
TL;DR: The overall results suggest that the proposed cyber terrorism framework is acceptable to the participants and support the initial research finding that the cyber terrorism conceptual framework constitutes the following components: target, motivation, tools of attack, domain, methods of attack and impact.
Abstract: Focus group discussion is an exploratory research technique used to collect data through group interaction. This technique provides the opportunity to observe interaction among participants on the topic under study. This paper contributes to an understanding of the cyber terrorism conceptual framework through the analysis of a focus group discussion. The proposed cyber terrorism conceptual framework, which was obtained during the authors' earlier qualitative study, was used as the basis for discussion in the focus group. Thirty (30) participants took part in the focus group discussion. The overall results suggest that the proposed cyber terrorism framework is acceptable to the participants. The present study supports our initial research that the cyber terrorism conceptual framework constitutes the following components: target, motivation, tools of attack, domain, methods of attack and impact.

Journal ArticleDOI
TL;DR: This paper proposes an enhancement of the timestamp discrepancy used to validate a signed message, consequently limiting the impact of a replay attack, by estimating approximately the time at which the message is received and validated by the receiving node.
Abstract: Mobile Ad hoc NETworks (MANETs), characterized by the free movement of mobile nodes, are more vulnerable to trivial Denial-of-Service (DoS) attacks such as replay attacks. A replay attacker performs this attack at any time and anywhere in the network by intercepting and retransmitting valid signed messages. Consequently, MANET performance is severely degraded by the overhead produced by the redundant valid messages. In this paper, we propose an enhancement of the timestamp discrepancy used to validate a signed message, consequently limiting the impact of a replay attack. Our proposed timestamp concept estimates approximately the time at which the message is received and validated by the receiving node. This estimation is based on existing parameters defined at the 802.11 MAC layer.
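
The abstract's idea of bounding the acceptable timestamp discrepancy can be illustrated by a generic freshness check such as the sketch below; the discrepancy window and the seen-message cache are illustrative assumptions and not the 802.11-derived estimation the authors propose.

```python
import time

MAX_DISCREPANCY = 0.5       # seconds: assumed tolerated clock skew plus transit time
seen_messages = {}          # message id -> expiry time, to reject duplicates

def accept_signed_message(msg_id, timestamp, now=None):
    """Return True if the message is fresh and not a replayed duplicate."""
    now = time.time() if now is None else now

    # Purge expired entries so the cache stays bounded.
    for mid in [m for m, exp in seen_messages.items() if exp < now]:
        del seen_messages[mid]

    # Reject messages whose timestamp falls outside the tolerated window.
    if abs(now - timestamp) > MAX_DISCREPANCY:
        return False

    # Reject a message already seen inside the window (the actual replay case).
    if msg_id in seen_messages:
        return False

    seen_messages[msg_id] = now + MAX_DISCREPANCY
    return True
```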

Journal ArticleDOI
TL;DR: The purpose of this paper is to describe the application of mixed method in developing a cyber terrorism framework by utilizing qualitative and quantitative techniques within the same study to incorporate the strength of both methodologies and fit together the insights into a workable solution.
Abstract: Mixed method research has become an increasingly popular approach in the disciplines of sociology, psychology, education, health science and social science. The purpose of this paper is to describe the application of mixed methods in developing a cyber terrorism framework. This project has two primary goals: firstly, to discover the theory and then develop a conceptual framework that describes the phenomena; and secondly, to verify the conceptual framework that describes the phenomena. In order to achieve conclusive findings, a mixed method approach is recommended: qualitative data and quantitative data are collected and analyzed in separate phases. The mixed method approach improves the rigor and explanation of the research results, thus bringing conclusive findings to the study outcome. By utilizing qualitative and quantitative techniques within the same study, we are able to incorporate the strengths of both methodologies and fit the insights together into a workable solution.

Journal ArticleDOI
TL;DR: A multi-stage anomaly detection system which is a combination of timeslot-based and flow-based detectors is proposed; it can reduce the number of flows which need to be subjected to flow-based analysis while still exhibiting high detection accuracy.
Abstract: Because of the explosive growth of intrusions, the need for anomaly-based Intrusion Detection Systems (IDSs) capable of detecting novel attacks is increasing. Among those systems, flow-based detection systems, which use a series of packets exchanged between two terminals as the unit of observation, have the advantage of being able to detect anomalies that appear only in some specific sessions. However, in large-scale networks where a large number of communications take place, analyzing every flow is not practical. On the other hand, a timeslot-based detection system does not need to maintain a large number of buffers, although it is difficult to pinpoint anomalous communications. In this paper, we propose a multi-stage anomaly detection system which is a combination of timeslot-based and flow-based detectors. The proposed system reduces the number of flows which need to be subjected to flow-based analysis, yet exhibits high detection accuracy. Through experiments on a data set, we demonstrate the effectiveness of the proposed method.
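
The staging described above, a cheap timeslot detector deciding which traffic deserves the more expensive flow-based analysis, can be sketched generically as below; the z-score rule, the per-slot packet counts and the flow records are illustrative assumptions, not the detectors used in the paper.

```python
import numpy as np

def suspicious_timeslots(packet_counts, z_threshold=3.0):
    """First stage: flag timeslots whose packet count deviates strongly from the mean."""
    counts = np.asarray(packet_counts, dtype=float)
    mean, std = counts.mean(), counts.std() + 1e-9
    z = (counts - mean) / std
    return np.flatnonzero(np.abs(z) > z_threshold)

def flows_for_second_stage(flows, suspicious_slots):
    """Second stage input: only flows that touch a suspicious timeslot are analyzed."""
    slot_set = set(int(s) for s in suspicious_slots)
    return [f for f in flows if f["slot"] in slot_set]

# Illustrative data: 100 timeslots, one bursty slot, and a few flow records.
counts = [50] * 100
counts[42] = 500
flows = [{"id": 1, "slot": 10}, {"id": 2, "slot": 42}, {"id": 3, "slot": 42}]

slots = suspicious_timeslots(counts)
print("suspicious slots:", slots)                    # -> [42]
print("flows to inspect:", flows_for_second_stage(flows, slots))
```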

Journal ArticleDOI
TL;DR: The state of the art of the different string matching algorithms used in network intrusion detection systems is reviewed, along with research on CPU and GPU implementations in this area.
Abstract: String matching algorithms are an important piece of network intrusion detection systems. In these systems, the string matching algorithms account for more than half of the CPU processing time. In recent years, GPU technology has shown superior performance over the CPU on these types of applications. In this article we review the state of the art of the different string matching algorithms used in network intrusion detection systems, as well as research on CPU and GPU implementations in this area.
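
As context for the comparison the article reviews, the core NIDS task is multi-pattern matching: scanning a payload against many signatures at once. A deliberately naive baseline is sketched below; real engines use automaton-based algorithms such as Aho-Corasick or Wu-Manber, and the GPU work surveyed parallelizes this scan. The signatures and payload are placeholders.

```python
def naive_multi_pattern_scan(payload: bytes, signatures):
    """Report every (offset, signature) occurrence.

    Cost grows with len(payload) times the total signature length, which is
    exactly why NIDS engines replace this loop with faster algorithms.
    """
    hits = []
    for sig in signatures:
        start = 0
        while True:
            idx = payload.find(sig, start)
            if idx == -1:
                break
            hits.append((idx, sig))
            start = idx + 1
    return hits

# Placeholder signatures of the kind a NIDS rule set might contain.
signatures = [b"/etc/passwd", b"cmd.exe", b"SELECT * FROM"]
payload = b"GET /index.php?file=../../etc/passwd HTTP/1.1"
print(naive_multi_pattern_scan(payload, signatures))
```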

Journal ArticleDOI
TL;DR: This paper presents a distributed mechanism for improving resource protection in a Digital Ecosystem environment that can be used not only for any secure and reliable transaction, but also for encouraging collaborative efforts by the Digital Ecosystem community members to play a major role in securing the environment.
Abstract: The dynamic interaction and collaboration of loosely coupled entities play a pivotal role in the successful implementation of a Digital Ecosystem environment. However, such interaction and collaboration can only be promoted when information and resources are effortlessly shared, accessed, and utilized by the interacting entities. A major requirement to promote an intensive sharing of resources is the ability to secure and uphold the confidentiality, integrity and non-repudiation of resources. This requirement is extremely important in particular when interactions with unfamiliar entities occur frequently. In this paper, we present a distributed mechanism for improving resource protection in a Digital Ecosystem environment. This mechanism can be used not only for any secure and reliable transaction, but also for encouraging collaborative efforts by the Digital Ecosystem community members to play a major role in securing the environment. Public Key Infrastructure is also employed to provide strong protection for its access workflows.

Journal ArticleDOI
TL;DR: The security policy creation and management process proposed in this paper is based on the Six Sigma model and presents a method to adapt security goals and risk management of a computing service.
Abstract: This paper presents a management process for creating adaptive, real-time security policies within the Six Sigma (6σ) framework. A key challenge for the creation of a management process is the integration with models of known industrial processes. One of the most widely used industrial process models is Six Sigma, a business management model wherein customer-centric needs are put in perspective with business data to create an efficient system. The security policy creation and management process proposed in this paper is based on the Six Sigma model and presents a method to adapt security goals and risk management of a computing service. By formalizing a security policy management process within an industrial process model, the adaptability of this model to existing industrial tools is seamless and offers a clear risk-based policy decision framework. In particular, this paper presents the necessary tools and procedures to map the Six Sigma DMAIC (Define-Measure-Analyze-Improve-Control) methodology to security policy management.

Journal ArticleDOI
TL;DR: The technical possibilities of countering eavesdropping threats, in which standard fiber optic communications are used as an illicit measuring network, by monitoring the optical radiation are discussed.
Abstract: Information leakage through regular fiber optic communications is possible in the form of eavesdropping on conversations, using the standard fiber optic link as an illicit measuring network. The threat of audio information leakage can be created by any kind of irregular light emission, as well as by regular light beams modulated at acoustic frequencies. Sound insulation, filtering and noise masking can be used for information protection. This paper discusses the technical possibilities of countering these threats by monitoring the optical radiation to detect eavesdropping.

Journal ArticleDOI
TL;DR: The scheme is implemented and it is shown that the credential signing process can be completed within reasonable time when the parameters are properly set and the scheme is secure in terms of authentication and privacy-preserving.
Abstract: Electric vehicles have attracted more and more attention all around the world in recent years because of their many advantages, such as low pollution to the environment. However, due to the limitations of current technology, charging remains an important issue. In this paper, we study the problem of finding and making reservations at charging stations via a vehicular ad hoc network (VANET). Our focus is on the privacy concern, as drivers would not like to be traced by knowing which charging stations they have visited. Technically, we make use of the property of blind signatures to achieve this goal. In brief, an electric vehicle first generates a set of anonymous credentials on its own. A trusted authority then blindly signs them after verifying the identity of the vehicle. After that, the vehicle can make charging station searching queries and reservations by presenting those signed anonymous credentials. We implemented the scheme and show that the credential signing process (expected to be the most time-consuming step) can be completed within reasonable time when the parameters are properly set. In particular, the process can be completed in 5 minutes when a 1024-bit RSA signing key is used. Moreover, we show that our scheme is secure in terms of authentication and privacy preservation.
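
The blind-signature step that lets the trusted authority sign anonymous credentials without seeing them can be illustrated with textbook RSA blinding. The tiny key and the bare modular arithmetic below are purely illustrative (a real deployment would use a 1024-bit or larger key and proper padding, as the paper's timing figures assume), and this is not the authors' full credential protocol.

```python
import math
import secrets

# Toy RSA key for illustration only (real keys are 1024+ bits).
p, q = 1009, 1013
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))

def blind_sign_demo(message: int) -> int:
    # Vehicle: blind the credential with a random factor r coprime to n.
    while True:
        r = secrets.randbelow(n - 2) + 2
        if math.gcd(r, n) == 1:
            break
    blinded = (message * pow(r, e, n)) % n

    # Trusted authority: signs the blinded value without learning `message`.
    blinded_sig = pow(blinded, d, n)

    # Vehicle: unblind to obtain a valid signature on the original credential.
    return (blinded_sig * pow(r, -1, n)) % n

m = 4242                      # placeholder credential value, m < n
s = blind_sign_demo(m)
assert pow(s, e, n) == m      # the signature verifies against the public key
print("blind signature verified")
```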

Journal ArticleDOI
TL;DR: This paper presents a novel dynamic identity based authentication protocol for multi-server architecture using smart cards that resolves the aforementioned flaws, while keeping the merits of Liao and Wang's protocol.
Abstract: Most password-based authentication protocols make use of a single authentication server for user authentication. The user's verifier information stored on the single server is a main point of susceptibility and remains an attractive target for the attacker. On the other hand, authentication protocols based on a multi-server architecture make it difficult for the attacker to find out any significant authentication information related to the legitimate users. In 2009, Liao and Wang proposed a dynamic identity based remote user authentication protocol for the multi-server environment. However, we found that Liao and Wang's protocol is susceptible to malicious server attack and malicious user attack. This paper presents a novel dynamic identity based authentication protocol for multi-server architecture using smart cards that resolves the aforementioned flaws, while keeping the merits of Liao and Wang's protocol. It uses a two-server paradigm by imposing different levels of trust upon the two servers, and the user's verifier information is distributed between these two servers, known as the service provider server and the control server. The proposed protocol is practical and computationally efficient because only nonces, one-way hash functions and XOR operations are used in its implementation. It provides a secure method to change the user's password without the server's help. In e-commerce, the number of servers providing services to the user is usually more than one, and hence secure authentication protocols for multi-server environments are required.

Journal ArticleDOI
TL;DR: StreamPreDeCon is an extension of the preference subspace clustering algorithm PreDeCon designed to resolve some of the challenges associated with anomalous packet detection, and shows improvement on results found in a previous study as the sensitivity rate in general increased while the false positive rate decreased.
Abstract: As the Internet offers increased connectivity between human beings, it has fallen prey to malicious users who exploit its resources to gain illegal access to critical information. In an effort to protect computer networks from external attacks, two common types of Intrusion Detection Systems (IDSs) are often deployed. The first type is signature-based IDSs which can detect intrusions efficiently by scanning network packets and comparing them with human-generated signatures describing previously-observed attacks. The second type is anomaly-based IDSs able to detect new attacks through modeling normal network traffic without the need for a human expert. Despite this advantage, anomaly-based IDSs are limited by a high false-alarm rate and difficulty detecting network attacks attempting to blend in with normal traffic. In this study, we propose a StreamPreDeCon anomaly-based IDS. StreamPreDeCon is an extension of the preference subspace clustering algorithm PreDeCon designed to resolve some of the challenges associated with anomalous packet detection. Using network packets extracted from the first week of the DARPA '99 intrusion detection evaluation dataset combined with Generic Http, Shellcode and CLET attacks, our IDS achieved 94.4% sensitivity and 0.726% false positives in a best case scenario. To measure the overall effectiveness of the IDS, the average sensitivity and false positive rates were calculated for both the maximum sensitivity and the minimum false positive rate. With the maximum sensitivity, the IDS had 80% sensitivity and 9% false positives on average. The IDS also averaged 63% sensitivity with a 0.4% false positive rate when the minimal number of false positives is needed. These rates are an improvement on results found in a previous study as the sensitivity rate in general increased while the false positive rate decreased.

Journal ArticleDOI
TL;DR: This article proposes an efficient mechanism to provide both attributes of randomness and uniqueness at the same time without highly constraining the first one and never violating the second one and proves the postulated properties.
Abstract: When initializing cryptographic systems or running cryptographic protocols, the randomness of critical parameters, like keys or key components, is one of the most crucial aspects. But, randomly chosen parameters come with the intrinsic chance of duplicates, which finally may cause cryptographic systems including RSA, ElGamal and Zero-Knowledge proofs to become insecure. When concerning digital identifiers, we need uniqueness in order to correctly identify a specific action or object. Unfortunately we also need randomness here. Without randomness, actions become linkable to each other or to their initiator’s digital identity. So ideally the employed (cryptographic) parameters should fulfill two potentially conflicting requirements simultaneously: randomness and uniqueness. This article proposes an efficient mechanism to provide both attributes at the same time without highly constraining the first one and never violating the second one. After defining five requirements on random number generators and discussing related work, we will describe the core concept of the generation mechanism. Subsequently we will prove the postulated properties (security, randomness, uniqueness, efficiency and privacy protection) and present some application scenarios including system-wide unique parameters, cryptographic keys and components, identifiers and digital pseudonyms.
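
One standard way to obtain values that are both unique and random-looking (not necessarily the mechanism proposed in the paper) is to apply a keyed pseudorandom permutation to a strictly increasing counter: distinct inputs guarantee distinct outputs, while the cipher hides any structure from outsiders. A hedged sketch using AES from the third-party cryptography package (a recent version that allows omitting the backend argument is assumed):

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

class UniqueRandomIds:
    """Unique, pseudorandom 128-bit identifiers from AES applied to a counter.

    Because AES under a fixed key is a permutation, two different counter
    values can never map to the same output, so uniqueness is structural;
    unpredictability to outsiders rests on the secrecy of the key.
    """

    def __init__(self, key=None):
        self._key = key or os.urandom(16)
        self._counter = 0

    def next_id(self) -> bytes:
        block = self._counter.to_bytes(16, "big")
        self._counter += 1
        # Single-block ECB here is just "raw AES on one block", not bulk encryption.
        encryptor = Cipher(algorithms.AES(self._key), modes.ECB()).encryptor()
        return encryptor.update(block) + encryptor.finalize()

gen = UniqueRandomIds()
print(gen.next_id().hex())
print(gen.next_id().hex())
```

Note that this sketch only illustrates the uniqueness/randomness trade-off discussed above; the paper's own requirements (including privacy protection and efficiency proofs) go beyond it.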

Journal ArticleDOI
TL;DR: A new method, Visual Pixel Detection (VPD), for extracting data from color or grayscale images is presented; results show that the proposed method provides better performance on test images in comparison with existing methods in attacking Steghide, Outguess and F5.
Abstract: Recently, numerous novel algorithms have been proposed in the fields of steganography and visual cryptography with the goals of improving security, reliability, and efficiency. Steganography detection is a technique to tell whether secret messages are hidden in images. The performance of a steganalysis system is mainly determined by the method of feature extraction and the architecture of the classifier. In this paper, we present a new method, Visual Pixel Detection (VPD), for extracting data from a color or grayscale image, because the human eye can recognize the hidden information in the image after this detection is applied. The experimental results show that the proposed method provides better performance on test images in comparison with the existing methods in attacking Steghide, Outguess and F5.

Journal ArticleDOI
TL;DR: It is proposed to apply Case-Based Reasoning technology to the spam filtering problem; continuously updating the base of spam templates against which newly arriving spam messages are compared will raise the efficiency of filtering.
Abstract: Recently the number of undesirable messages arriving by e-mail has increased sharply. As spam is changeable in character, anti-spam systems should be trainable and dynamic. Machine learning technology has long been successfully applied to filtering undesirable messages from e-mail. In this paper we propose to apply Case-Based Reasoning (CBR) technology to the spam filtering problem. Continuously updating the base of spam templates, against which newly arriving spam messages are compared, will raise the efficiency of filtering. By changing the combination of conditions it is possible to construct a flexible filtering system adapted to different users or corporations. This paper also considers a second approach, the application of CRM technology to spam filtering, which has not yet been applied in this area.
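
The case-based matching of a new message against a stored base of spam templates can be sketched as a nearest-case lookup over TF-IDF vectors using scikit-learn; the tiny template base and the similarity threshold below are illustrative assumptions, not the paper's system.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Case base: previously confirmed spam messages (placeholder examples).
spam_cases = [
    "win a free prize claim your reward now",
    "cheap meds online no prescription needed",
    "urgent account verification click this link",
]

vectorizer = TfidfVectorizer()
case_matrix = vectorizer.fit_transform(spam_cases)

def classify(message, threshold=0.35):
    """Flag a message as spam if it is close enough to any stored case."""
    vec = vectorizer.transform([message])
    similarity = cosine_similarity(vec, case_matrix).max()
    return ("spam" if similarity >= threshold else "ham"), similarity

print(classify("claim your free prize now"))
print(classify("meeting moved to 3 pm tomorrow"))

# Continuous updating: when a new spam message is confirmed, append it to
# spam_cases and refit the vectorizer so future messages are compared to it too.
```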

Journal ArticleDOI
TL;DR: A new steganography method for secure data communication on halftone pictures is proposed; using halftone pictures improves security and capacity, and the suggested method improves the quality of the stego image and increases its security.
Abstract: Today steganography has attracted the attention of many researchers. In this paper, we propose a new steganography method for secure data communication on halftone pictures. Using halftone pictures improves security and capacity. In this method, a neighbourhood is defined around every pixel of the picture and the complexity of each pixel is computed within that neighbourhood. Placing data in the monotonous areas of a halftone can reveal the presence of hidden data, so a method is presented that examines the position of every pixel relative to its neighbours and prevents embedding in monotonous areas. If a pixel is complex, a steganography bit is hidden after halftoning; after the whole image has been scanned, the error diffusion process is carried out. Performing the suggested method improves the quality of the stego image and increases its security.

Journal ArticleDOI
TL;DR: Implementing a secure messaging function in OpenSC 0.12.0 that protects the PIN and data exchange between the smartcard and the middleware enables the integration of digital signature functionality into the OpenSC environment.
Abstract: Smartcards are used for a rapidly increasing number of applications including electronic identity, driving licenses, physical access, health care, digital signature, and electronic payments. The use of a specific smartcard in a "closed" environment generally provides a high level of security. In a closed environment no other smartcards are employed and the card use is restricted to the smartcard's own firmware, approved software applications, and approved card reader. However, the same level of security cannot be claimed for open environments where smartcards from different manufacturers might interact with various smartcard applications. The reason is that despite a number of existing standards and certification protocols like Common Criteria and CWA 14169, secure and convenient smartcard interoperability has remained a challenge. Ideally, just one middleware would handle the interactions between various software applications and different smartcards securely and seamlessly. In our ongoing research we investigate the underlying interoperability and security problems specifically for digital signature processes. An important part of such a middleware is a set of utilities and libraries that support cryptographic applications including authentication and digital signatures for a significant number of smartcards. The open-source project OpenSC provides such utilities and libraries. Here we identify some security weaknesses of OpenSC when used as such a middleware. By implementing a secure messaging function in OpenSC 0.12.0 that protects the PIN and data exchange between the smartcard and the middleware, we address one important security weakness. This enables the integration of digital signature functionality into the OpenSC environment.

Journal ArticleDOI
TL;DR: Experimental results are presented to demonstrate that the robustness of a watermark with mixed ECC is much higher than the traditional one just with repetition coding while suffering JPEG lossy compression, salt and pepper noise and center cutting processing.
Abstract: In this paper, we present a novel technique based on a mixed Error Correcting Code (ECC), combining a convolutional code and a repetition code, to enhance the robustness of the embedded watermark. Before embedding, the binary watermark is scanned into a one-dimensional sequence and then input into the (3, 1, 2) convolutional encoder and the (3, 1) repetition encoder frame by frame, which improves the error correcting capability of the decoder. The output code sequence is rearranged into matrices as the new watermark messages. The watermark is embedded in the low-frequency band of the Discrete Wavelet Transform (DWT) and can therefore resist the degradation caused by image processing. Experimental results are presented to demonstrate that the robustness of a watermark with the mixed ECC is much higher than that of the traditional one with repetition coding alone when subjected to JPEG lossy compression, salt-and-pepper noise and center cutting.
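
Of the two codes combined above, the (3, 1) repetition component is simple enough to sketch directly: each watermark bit is repeated three times before embedding and recovered by majority vote, which corrects any single bit flip per group. The sketch below shows only this repetition stage, not the (3, 1, 2) convolutional encoder or the DWT embedding, and the example watermark is a placeholder.

```python
import numpy as np

def repetition_encode(bits, n=3):
    """(n, 1) repetition code: repeat every watermark bit n times."""
    return np.repeat(np.asarray(bits, dtype=np.uint8), n)

def repetition_decode(coded, n=3):
    """Majority-vote decoding: tolerates up to (n - 1) // 2 flipped bits per group."""
    groups = np.asarray(coded, dtype=np.uint8).reshape(-1, n)
    return (groups.sum(axis=1) > n // 2).astype(np.uint8)

watermark = np.array([1, 0, 1, 1, 0], dtype=np.uint8)
coded = repetition_encode(watermark)

# Simulate channel noise caused by image processing: flip one bit in one group.
noisy = coded.copy()
noisy[4] ^= 1

recovered = repetition_decode(noisy)
assert np.array_equal(recovered, watermark)
print("watermark recovered despite the bit error:", recovered)
```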