Showing papers in "Journal of Information Security in 2015"


Journal ArticleDOI
TL;DR: By extending the cyber security investment model of Gordon and Loeb to incorporate externalities, it is shown that the firm’s socially optimal investment in cyber security increases by no more than 37% of the expected externality loss.
Abstract: Cyber security breaches inflict costs on consumers and businesses. The possibility also exists that a cyber security breach may shut down an entire critical infrastructure industry, putting a nation’s whole economy and national defense at risk. Hence, the issue of cyber security investment has risen to the top of the agenda of business and government executives. This paper examines how the existence of well-recognized externalities changes the maximum a firm should, from a social welfare perspective, invest in cyber security activities. By extending the cyber security investment model of Gordon and Loeb [1] to incorporate externalities, we show that the firm’s socially optimal investment in cyber security increases by no more than 37% of the expected externality loss.

68 citations
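
One way to relate the 37% figure to the original model, as a rough sketch only: it parallels the well-known 1/e rule of the Gordon-Loeb model, applied here to the expected externality loss. The notation below (z* for the optimal investment, E[L] for the firm's own expected loss, E[L_ext] for the expected externality loss) is an assumed shorthand, not necessarily the paper's own:

    z^{*}(v) \;\le\; \tfrac{1}{e}\,\mathbb{E}[L],
    \qquad
    z^{*}_{\text{social}}(v) - z^{*}(v) \;\le\; \tfrac{1}{e}\,\mathbb{E}[L_{\text{ext}}] \;\approx\; 0.37\,\mathbb{E}[L_{\text{ext}}]

since 1/e ≈ 0.368, i.e. roughly 37%.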


Journal ArticleDOI
Arif Sari
TL;DR: This review paper exposes and focuses on different IDS in cloud networks through different categorizations and conducts a comparative study of the security measures of Dropbox, Google Drive and iCloud to illuminate their strengths and weaknesses in terms of security.
Abstract: Cloud computing has become one of the most prominent terms in the IT world due to its design for providing computing service as a utility. The typical use of cloud computing as a resource has changed the landscape of computing. Increased flexibility, better reliability, greater scalability, and decreased costs, together with the pay-per-use model of the cloud environment, have captivated businesses and individuals alike. Cloud computing is a completely internet-dependent technology where client data are stored and maintained in the data center of a cloud provider such as Google, Amazon, Apple Inc. or Microsoft. The Anomaly Detection System is one of the Intrusion Detection techniques; it is an area of the cloud environment being developed to detect unusual activities in cloud networks. Although there is a variety of Intrusion Detection techniques available in the cloud environment, this review paper exposes and focuses on different IDS in cloud networks through different categorizations and conducts a comparative study of the security measures of Dropbox, Google Drive and iCloud to illuminate their strengths and weaknesses in terms of security.

59 citations


Journal ArticleDOI
TL;DR: This paper discusses various phishing attacks using mobile devices, followed by some discussion of countermeasures, to bring more awareness to emerging mobile device-based phishing attacks.
Abstract: Mobile devices have taken an essential role in the portable computer world. Portability, small screen size, and lower cost of production make these devices popular replacements for desktop and laptop computers for many daily tasks, such as surfing on the Internet, playing games, and shopping online. The popularity of mobile devices such as tablets and smart phones has made them a frequent target of traditional web-based attacks, especially phishing. Mobile device-based phishing takes its share of the pie to trick users into entering their credentials in fake websites or fake mobile applications. This paper discusses various phishing attacks using mobile devices followed by some discussion on countermeasures. The discussion is intended to bring more awareness to emerging mobile device-based phishing attacks.

34 citations


Journal ArticleDOI
TL;DR: The experimental results show that the Modified Vector Space Representation technique performs well and helps accurately distinguish process behaviour through system calls.
Abstract: Predicting anomalous behaviour of a running process using its system call trace is a common practice in the security community and is still an active research area. It is a typical pattern recognition problem and can be dealt with using machine learning algorithms. Standard system call datasets were employed to train these algorithms. However, advancements in operating systems have made these datasets outdated and irrelevant. The Australian Defence Force Academy Linux Dataset (ADFA-LD) and the Australian Defence Force Academy Windows Dataset (ADFA-WD) are new-generation system call datasets that contain labelled system call traces for modern exploits and attacks on various applications. In this paper, we evaluate the performance of the Modified Vector Space Representation technique on the ADFA-LD and ADFA-WD datasets using various classification algorithms. Our experimental results show that our method performs well and helps accurately distinguish process behaviour through system calls.

24 citations
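
As a rough illustration of this kind of pipeline, and not the authors' exact Modified Vector Space Representation, a system-call trace can be turned into a frequency vector of call n-grams and fed to a standard classifier. The toy traces, labels and n-gram range below are assumptions for illustration only:

    # Sketch: frequency-vector representation of system-call traces + a standard classifier.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.svm import LinearSVC

    # Toy traces: each string is one trace of system-call numbers (ADFA stores one trace per file).
    train_traces = ["6 174 6 45 33 6", "6 174 45 33 6 6", "5 221 5 108 221 5", "5 221 108 5 221 5"]
    train_labels = [0, 0, 1, 1]          # 0 = normal, 1 = attack (hypothetical labels)

    # Build a vector space of call unigrams and bigrams, then train a linear classifier.
    vectorizer = CountVectorizer(token_pattern=r"\S+", ngram_range=(1, 2))
    X_train = vectorizer.fit_transform(train_traces)
    clf = LinearSVC().fit(X_train, train_labels)

    X_new = vectorizer.transform(["6 174 6 45 33 6"])
    print(clf.predict(X_new))            # -> [0], i.e. classified as normal behaviour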


Journal ArticleDOI
TL;DR: A governing body framework is proposed which aims at solving security and privacy issues in cloud computing by establishing relationships amongst the CSPs, in which data about possible threats can be generated based on previous attacks on other CSPs.
Abstract: Cloud computing is touted as the next big thing in the Information Technology (IT) industry, one that will impact businesses of any size, and yet security issues continue to pose a big threat to it. The security and privacy issues persisting in cloud computing have proved to be an obstacle to its widespread adoption. In this paper, we look at these issues from a business perspective and at how they are damaging the reputation of big companies. We review the literature on the existing issues in cloud computing and how they are being tackled by the Cloud Service Providers (CSP). We propose a governing body framework which aims at solving these issues by establishing relationships amongst the CSPs, in which data about possible threats can be generated based on previous attacks on other CSPs. The Governing Body will be responsible for Data Center control, policy control, legal control, user awareness, performance evaluation, solution architecture and providing motivation for the entities involved.

19 citations


Journal ArticleDOI
TL;DR: The study finds that the email forensic tools are not all alike and offer diverse types of facilities; by combining analysis tools, it may be possible to gain detailed information in the area of email forensics.
Abstract: Over the last decades, email has been the major carrier for transporting spam and malicious content over the network. Email is also the primary source of numerous criminal activities on the Internet. Computer forensics is a systematic process to retain and analyze saved emails for the purpose of legal proceedings and other civil matters. Email analysis is challenging due not only to the various fields that can be forged by hackers or malicious users, but also to the flexibility of composing, editing and deleting emails using offline (e.g., MS Outlook) or online (e.g., Web mail) email applications. Towards this direction, a number of open source forensic tools have been widely used by practitioners. However, these tools have been developed in an isolated manner rather than through a collaborative approach. Email forensic tool users therefore need to understand to what extent a tool would be useful for their circumstances so that they can conduct forensic analysis accordingly. In this paper, we examine a set of common features to compare and contrast five popular open source email forensic tools. The study finds that the tools are not all alike and offer diverse types of facilities. By combining analysis tools, it may be possible to gain detailed information in the area of email forensics.

18 citations


Journal ArticleDOI
TL;DR: An overview of public key infrastructure is presented that includes its various components and operations, some well-known PKIs and their comparison, as well as current implementations, risks and challenges of PKIs.
Abstract: As security is essential in communications through electronic networks, the development of structures providing high levels of security is needed. Public Key Infrastructure (PKI) is a way of providing security measures by implementing the means of key pairs among users. In this paper, an overview of public key infrastructure is presented that includes its various components and operations, some well-known PKIs and their comparison. We also discuss current implementations, risks and challenges of PKIs.

17 citations


Journal ArticleDOI
TL;DR: This paper uses symmetric key-based homomorphic primitives to provide end-to-end privacy and end-to-end integrity of reverse multicast traffic and comparatively evaluates the performance of the proposed protocol to show its efficacy and efficiency in resource-constrained environments.
Abstract: In wireless sensor networks, secure data aggregation protocols target two major objectives, namely, security and en route aggregation. Although en route aggregation of reverse multicast traffic improves energy efficiency, it becomes a hindrance to end-to-end security. Concealed data aggregation protocols aim to preserve the end-to-end privacy of sensor readings while performing en route aggregation. However, the use of inherently malleable privacy homomorphism makes these protocols vulnerable to active attackers. In this paper, we propose an integrity- and privacy-preserving end-to-end secure data aggregation protocol. We use symmetric key-based homomorphic primitives to provide end-to-end privacy and end-to-end integrity of reverse multicast traffic. As a sensor network has a non-replenishable energy supply, the use of symmetric key-based homomorphic primitives improves energy efficiency and increases the sensor network’s lifetime. We comparatively evaluate the performance of the proposed protocol to show its efficacy and efficiency in resource-constrained environments.

13 citations
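
For intuition only, the sketch below shows one well-known additively homomorphic symmetric primitive in the CMT style (ciphertext = reading + keystream mod M), which lets an en-route aggregator add ciphertexts without decrypting them; it is not necessarily the paper's exact primitive, and the end-to-end integrity (MAC) part is omitted:

    # Sketch of additive symmetric homomorphic encryption (CMT-style), for intuition only.
    M = 2 ** 32                      # modulus large enough for the aggregate sum

    def encrypt(reading, keystream):
        return (reading + keystream) % M

    def aggregate(ciphertexts):      # en-route aggregation: just add ciphertexts mod M
        return sum(ciphertexts) % M

    def decrypt(agg_ct, keystreams): # sink knows every sensor's keystream for this epoch
        return (agg_ct - sum(keystreams)) % M

    readings   = [21, 22, 23]                 # hypothetical sensor readings
    keystreams = [910, 11321, 777]            # per-sensor pseudorandom pads shared with the sink
    cts = [encrypt(r, k) for r, k in zip(readings, keystreams)]
    agg = aggregate(cts)
    print(decrypt(agg, keystreams))           # -> 66, the sum of the readings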


Journal ArticleDOI
TL;DR: The analysis shows that despite the necessity of implementing biometric technology, the absence of legal and regulatory requirements remains a challenge to implementation of the proposed biometric solution.
Abstract: Biometric authentication systems are believed to be effective compared to traditional authentication systems. The introduction of biometrics into smart cards is said to result in a biometric-based smart ID card with enhanced security. This paper discusses the biometric-based smart ID card with particular emphasis on the security and privacy implications in the environment of Rwandan universities. It highlights the security and implementation issues. The analysis shows that despite the necessity of implementing biometric technology, the absence of legal and regulatory requirements remains a challenge to implementation of the proposed biometric solution. The paper is intended to engage a broad audience from Rwandan universities planning to introduce biometric-based smart ID cards to verify students and staff for authentication purposes.

10 citations


Journal ArticleDOI
TL;DR: It was found that the built-in security features which are available by default on Microsoft’s Windows servers were not sufficient in defending against the TCP/SYN attacks even at low intensity attack traffic.
Abstract: Distributed Denial of Service (DDoS) is known to compromise availability of Information Systems today. Widely deployed Microsoft’s Windows 2003 & 2008 servers provide some built-in protection against common Distributed Denial of Service (DDoS) attacks, such as TCP/SYN attack. In this paper, we evaluate the performance of built-in protection capabilities of Windows servers 2003 & 2008 against a special case of TCP/SYN based DDoS attack. Based on our measurements, it was found that the built-in security features which are available by default on Microsoft’s Windows servers were not sufficient in defending against the TCP/SYN attacks even at low intensity attack traffic. Under TCP/SYN attack traffic, the Microsoft 2003 server was found to crash due to processor resource exhaustion, whereas the 2008 server was found to crash due to its memory resource depletion even at low intensity attack traffic.

8 citations


Journal ArticleDOI
TL;DR: The results for the practical scenario defined formally in this paper show the Round Trip Time (RTT) for an agent to travel in the system, measured as the time required for an agent to travel around different numbers of cloud users before and after implementing FCDAC.
Abstract: With the development of cloud computing, mutual understandability among distributed data access controls has become an important issue in the security field of cloud computing. To ensure the security, confidentiality and fine-grained data access control of the Cloud Data Storage (CDS) environment, we propose a Multi-Agent System (MAS) architecture. This architecture consists of two agents: the Cloud Service Provider Agent (CSPA) and the Cloud Data Confidentiality Agent (CDConA). CSPA provides a graphical interface to the cloud user that facilitates access to the services offered by the system. CDConA provides each cloud user with the definition and enforcement of an expressive and flexible access structure as a logic formula over cloud data file attributes. This new access control is named Formula-Based Cloud Data Access Control (FCDAC). Our proposed FCDAC based on the MAS architecture consists of four layers: an interface layer, an existing access control layer, the proposed FCDAC layer and a CDS layer, as well as four types of entities: Cloud Service Provider (CSP), cloud users, knowledge base and confidentiality policy roles. FCDAC is an access policy determined by our MAS architecture, not by the CSPs. A prototype of our proposed FCDAC scheme is implemented using the Java Agent Development Framework Security (JADE-S). Our results for the practical scenario defined formally in this paper show the Round Trip Time (RTT) for an agent to travel in our system, measured as the time required for an agent to travel around different numbers of cloud users before and after implementing FCDAC.
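
To make the idea of an access structure expressed as a logic formula over file attributes concrete, here is a minimal sketch; the attribute names and the example policy are invented for illustration, and the paper's actual FCDAC layers and JADE-S agents are not modelled:

    # Sketch: a formula-based access decision over cloud data file attributes.
    # The attribute names and the example policy are hypothetical.
    def fcdac_allows(user_attrs, policy):
        """policy is a boolean function over the user's attribute dictionary."""
        return policy(user_attrs)

    # Example policy: (department == "finance" AND clearance >= 2) OR role == "auditor"
    policy = lambda a: (a.get("department") == "finance" and a.get("clearance", 0) >= 2) \
                       or a.get("role") == "auditor"

    print(fcdac_allows({"department": "finance", "clearance": 3}, policy))   # True
    print(fcdac_allows({"department": "sales", "role": "auditor"}, policy))  # True
    print(fcdac_allows({"department": "sales", "clearance": 1}, policy))     # False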

Journal ArticleDOI
TL;DR: By comparing performance when using the same keyboard and when using different keyboards, the dependencies on keyboards are evaluated, and it is found that users whose typing skill reaches a high level of about 900 or more letters in 5 minutes do not need to worry about keyboard differences.
Abstract: We have proposed methods for feature extraction and identification that enable identification of individuals through long-text input, an important topic in keystroke dynamics research. Regarding robustness in practical circumstances, a question in keystroke dynamics is how much recognition accuracy is influenced by a change of keyboard. By comparing performance when using the same keyboard and when using different keyboards, the dependencies on keyboards are evaluated through three experiments with subjects. As a result, it is found that keyboard differences are not a concern for users whose typing skill reaches a high level of about 900 or more letters in 5 minutes; only for the remaining users would it be necessary to register profile data for each keyboard they use in order to avoid degradation of recognition accuracy.

Journal ArticleDOI
TL;DR: This paper presents an algebraic method for personal authentication and identification using internal contactless palm vein images, using the MATLAB image processing toolbox to enhance the palm vein images and employing the coset decomposition concept to store and identify the encoded palm vein feature vectors.
Abstract: Palm vein authentication technology is extremely safe, accurate and reliable, as it uses the vascular patterns contained within the body to confirm personal identification. The pattern of veins in the palm is complex and unique to each individual. Its contactless operation gives it a hygienic advantage over other biometric technologies. This paper presents an algebraic method for personal authentication and identification using internal contactless palm vein images. We use the MATLAB image processing toolbox to enhance the palm vein images and employ the coset decomposition concept to store and identify the encoded palm vein feature vectors. Experimental evidence shows the validity and effectiveness of the proposed approach.

Journal ArticleDOI
TL;DR: The authors apply triangulation to find out user and service provider viewpoints about Shibboleth; the University of Bedfordshire controls its various services, including the student portal Breo, Learning Resources, Student Email Access and others, through this user authentication system.
Abstract: The notion of this project is derived from our practical use of the user authentication system Shibboleth at the University of Bedfordshire. The University of Bedfordshire controls its various services, including the student portal Breo, Learning Resources, Student Email Access and others, through Shibboleth. Like the University of Bedfordshire, other universities in the UK are also implementing Shibboleth in their access management control. Therefore, the researchers of this project have found it important to evaluate the efficiency and effectiveness of Shibboleth from different perspectives. The first part of this paper explains the features of Shibboleth as an SSO service and compares it with other SSO services such as Athens and Kerberos. In the middle section, the authors go through the steps of installation and configuration of Shibboleth. At the end of the paper, based on a survey of real users of Shibboleth at the University of Bedfordshire, the authors give their insights on the effectiveness of Shibboleth as an SSO service. Throughout this investigation, the authors applied triangulation to find out user and service provider viewpoints about Shibboleth. Although some problems persisted, the authors implemented the Shibboleth system successfully in order to examine its problems, efficiency and effectiveness. Recommendations and a conclusion are provided at the end of the project.

Journal ArticleDOI
TL;DR: A new emerging trend in modern symmetric encryption, a development of the Advanced Encryption Standard (AES) algorithm abbreviated as Quantum-AES (QAES), is presented; it exploits a specific selected secret key generated from the QKD cipher using two different modes (online and off-line).
Abstract: With the rapid evolution of data exchange in network environments, information security has become the most important process for data storage and communication. In order to provide such information security, confidentiality, data integrity and data origin authentication must be verified based on cryptographic encryption algorithms. This paper presents a new emerging trend in modern symmetric encryption: a development of the Advanced Encryption Standard (AES) algorithm. The new development focuses on the integration between Quantum Key Distribution (QKD) and an enhanced version of AES. A new quantum symmetric encryption algorithm, abbreviated as Quantum-AES (QAES), is the output of this integration. QAES depends on the generation of dynamic quantum S-Boxes (DQS-Boxes) based on the quantum cipher key, instead of the ordinarily used static S-Boxes. Furthermore, QAES exploits a specific selected secret key generated from the QKD cipher using two different modes (online and off-line).

Journal ArticleDOI
TL;DR: This paper presents a comprehensive study of conventional digital signature schemes based on RSA, DSA and ECDSA (Elliptic Curve Digital Signature Algorithm) and the improved versions of these schemes.
Abstract: Due to the rapid growth of online transactions on the Internet, authentication, non-repudiation and integrity are essential security requirements for a secure transaction. To achieve these security goals, the digital signature is the most efficient cryptographic primitive. Many authors have proposed such schemes, proved their security and evaluated their efficiency. In this paper, we present a comprehensive study of conventional digital signature schemes based on RSA, DSA and ECDSA (Elliptic Curve Digital Signature Algorithm) and the improved versions of these schemes.
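
As a companion to the survey, the snippet below signs and verifies a message with ECDSA using the widely used Python cryptography package; it only illustrates the standard primitive being surveyed, not any particular improved scheme from the paper:

    # Sketch: ECDSA sign/verify with the `cryptography` package (pip install cryptography).
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.exceptions import InvalidSignature

    private_key = ec.generate_private_key(ec.SECP256R1())   # P-256 curve
    public_key = private_key.public_key()

    message = b"transaction #42: pay 100 to Bob"
    signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

    try:
        public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
        print("signature valid")            # integrity, authentication, non-repudiation hold
    except InvalidSignature:
        print("signature invalid")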

Journal ArticleDOI
TL;DR: An enhanced privacy technique is proposed that combines several anonymity techniques to maintain both privacy and data utility by considering the sensitivity values of attributes in queries, using sensitivity weights that take utility-based anonymization into account.
Abstract: Privacy preserving data mining (PPDM) has become more and more important because it allows sharing of privacy-sensitive data for analytical purposes. A large number of privacy techniques have been developed, most of which use the k-anonymity property, which has many shortcomings, so other privacy techniques were introduced (l-diversity, p-sensitive k-anonymity, (α, k)-anonymity, t-closeness, etc.). While they differ in their methods and the quality of their results, they all focus first on masking the data and then on protecting its quality. This paper is concerned with providing an enhanced privacy technique that combines several anonymity techniques to maintain both privacy and data utility by considering the sensitivity values of attributes in queries, using sensitivity weights that take utility-based anonymization into account; only queries with sensitive attributes whose values exceed a threshold are changed using generalization boundaries. The threshold value is calculated from the different weights assigned to individual attributes, which take into account the utility of each attribute; those attributes whose total weights exceed the threshold are changed using generalization boundaries, and the other queries can be published directly. Experimental results using the UT Dallas Anonymization Toolbox on the real Adult data set from the UCI Machine Learning Repository show that although the proposed technique preserves privacy, it can also maintain the utility of the published data.
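
A toy sketch of the weighting-and-threshold idea described above; the weights, threshold and generalization map are invented, and the paper's generalization boundaries and the UT Dallas toolbox are not reproduced:

    # Sketch: decide per query whether sensitive attributes must be generalized.
    # Weights, threshold and the generalization map below are hypothetical.
    WEIGHTS = {"disease": 0.6, "salary": 0.3, "zipcode": 0.1}   # utility-aware sensitivity weights
    THRESHOLD = 0.5

    GENERALIZE = {"zipcode": lambda v: v[:3] + "**",            # stand-in generalization boundaries
                  "salary":  lambda v: "30k-60k",
                  "disease": lambda v: "chronic illness"}

    def anonymize_query(record, query_attrs):
        total_weight = sum(WEIGHTS.get(a, 0.0) for a in query_attrs)
        if total_weight <= THRESHOLD:                 # low total sensitivity: publish directly
            return {a: record[a] for a in query_attrs}
        return {a: GENERALIZE.get(a, lambda v: v)(record[a]) for a in query_attrs}

    rec = {"disease": "diabetes", "salary": "45k", "zipcode": "75080"}
    print(anonymize_query(rec, ["zipcode"]))             # below threshold: published as-is
    print(anonymize_query(rec, ["disease", "zipcode"]))  # above threshold: generalized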

Journal ArticleDOI
TL;DR: It is proved that the enhanced-Dragonfly protocol is secure against off-line dictionary attacks by analyzing its security properties using the Scyther tool.
Abstract: Dragonfly is a Password Authenticated Key Exchange protocol that uses a shared session key to authenticate parties based on a pre-shared secret password. It was claimed that this protocol was secure against off-line dictionary attack, but new research has proved its vulnerability to off-line dictionary attack, and a fix was applied in a “Patched Protocol” based on public key validation. Unfortunately, this step caused a rise in the computation cost, which made the protocol less appealing than its competitors. We propose an alternative enhancement, known as “Enhanced Dragonfly”, that keeps the protocol secure without any extra computation cost. This solution is based on two pre-shared secret passwords instead of one, and the rounds between parties are compressed into two instead of four. We prove that the Enhanced Dragonfly protocol is secure against off-line dictionary attacks by analyzing its security properties using the Scyther tool. A simulation was developed to measure the execution time of the enhanced protocol, which was found to be much less than the execution time of patched Dragonfly. The off-line dictionary attack takes a few days if the dictionary size is 10,000. Accordingly, the use of Enhanced Dragonfly is more efficient than patched Dragonfly.

Journal ArticleDOI
TL;DR: Commonly held intuitions on malware evolution are verified quantitatively for the first time, showing from the archaeological record that over 80% of the time, classes of higher malware complexity emerged later than classes of lower complexity.
Abstract: Dynamic analysis of malware allows us to examine malware samples and then group those samples into families based on observed behavior. Using Boolean variables to represent the presence or absence of a range of malware behaviors, we create a bitstring that represents each malware sample behaviorally, and then group samples into the same class if they exhibit the same behavior. Combining class definitions with malware discovery dates, we can construct a timeline showing the emergence date of each class, in order to examine the prevalence, complexity, and longevity of each class. We find that certain behavior classes are more prevalent than others, following a frequency power law. Some classes have had lower longevity, indicating that their attack profile is no longer manifested by new variants of malware, while others, of greater longevity, continue to affect new computer systems. We verify for the first time commonly held intuitions on malware evolution, showing quantitatively from the archaeological record that over 80% of the time, classes of higher malware complexity emerged later than classes of lower complexity. In addition to providing historical perspective on malware evolution, the methods described in this paper may aid malware detection through classification, leading to new proactive methods to identify malicious software.
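
A minimal sketch of the bitstring-and-class idea described in the abstract; the behaviour names, samples and dates below are made up, whereas the study itself uses dynamic-analysis reports and real discovery dates:

    # Sketch: behaviour bitstrings -> behaviour classes -> emergence timeline.
    from collections import defaultdict

    BEHAVIOURS = ["writes_registry", "opens_socket", "injects_code", "deletes_files"]

    def bitstring(report):                       # report: set of observed behaviours
        return "".join("1" if b in report else "0" for b in BEHAVIOURS)

    samples = [                                  # (sample id, observed behaviours, discovery date)
        ("s1", {"opens_socket"}, "2003-05"),
        ("s2", {"opens_socket", "injects_code"}, "2006-11"),
        ("s3", {"opens_socket"}, "2004-02"),
    ]

    classes = defaultdict(list)
    for sid, report, date in samples:
        classes[bitstring(report)].append(date)

    # Emergence date = earliest discovery in the class; popcount of the bitstring
    # can serve as a crude complexity measure.
    for bits, dates in classes.items():
        print(bits, "emerged", min(dates), "prevalence", len(dates))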

Journal ArticleDOI
TL;DR: Different models and techniques used to enhance the security of e-mail systems are introduced, and each one is evaluated from the viewpoint of security.
Abstract: E-mail security has become a critical issue for the research community in the field of information security. Several solutions and standards have been fashioned according to recent security requirements in order to enhance e-mail security. Some of the existing enhancements focus on keeping the exchange of data via e-mail confidential and integral, while others focus on authenticating the sender and proving that he cannot repudiate his message. This paper surveys various e-mail security solutions. We introduce different models and techniques used to enhance the security of e-mail systems and evaluate each one from the viewpoint of security.

Journal ArticleDOI
TL;DR: Hiding results show that, as expected, using the 1st bit in the LSB method for embedding data is much better than using the other bits; file size strongly affects the effectiveness of the embedding process, while the hiding starting position does not affect the variation of the adopted statistical estimators, regardless of which bit is used.
Abstract: The weakness of the Human Auditory System (HAS) allows the audio steganography process to be used to hide data in digital sound. Audio steganography is implemented here by using the Least Significant Bit (LSB) algorithm to hide a message in multiple audio files, with hiding performed at the 1st, 2nd, 3rd, and 4th bits. In comparison with the other bits used, the hiding results show that, as expected, using the 1st bit in the LSB method for embedding data is much better. In addition, according to the results, file size strongly affects the effectiveness of the embedding process, while the hiding starting position does not affect the variation of the adopted statistical estimators, regardless of which bit is used. Among the statistical estimators adopted here, the Mean Absolute Error (MAE) seems to be the best one for testing the hiding process.
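
For illustration, here is a minimal LSB embed/extract over raw 8-bit audio sample bytes; the paper works with complete audio files, several bit positions and statistical estimators, and this sketch only shows embedding in a chosen bit of each sample:

    # Sketch: hide a message in the k-th least significant bit of 8-bit audio samples.
    def embed(samples, message, bit=0):          # bit=0 is the 1st LSB, bit=1 the 2nd, ...
        bits = [(byte >> i) & 1 for byte in message for i in range(8)]
        out = bytearray(samples)
        for pos, b in enumerate(bits):
            out[pos] = (out[pos] & ~(1 << bit)) | (b << bit)
        return bytes(out)

    def extract(samples, length, bit=0):
        bits = [(samples[pos] >> bit) & 1 for pos in range(length * 8)]
        return bytes(sum(bits[i + j] << j for j in range(8)) for i in range(0, len(bits), 8))

    cover = bytes(range(256))                    # stand-in for raw audio sample bytes
    stego = embed(cover, b"hi", bit=0)
    print(extract(stego, 2, bit=0))              # -> b'hi'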

Journal ArticleDOI
TL;DR: This article reviews some authenticated Key Agreement Protocols, presents a comparative study, and describes the design principles, security requirements and various attacks on Key Agreement Protocols.
Abstract: One of the most important and challenging cryptographic primitives in Public Key Cryptography is the Key Agreement Protocol, in which two or more parties share secret values and establish a session key. Many authors have proposed key agreement protocols. In this article, we review some authenticated Key Agreement Protocols and present a comparative study. We also describe the design principles, security requirements and various attacks on Key Agreement Protocols.
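
To ground the survey, here is the textbook (unauthenticated) Diffie-Hellman exchange that most of the surveyed protocols build on, with a deliberately small 32-bit prime for readability; authenticated key agreement adds signatures, MACs or passwords on top of this skeleton:

    # Sketch: textbook Diffie-Hellman key agreement (toy parameters, no authentication).
    import secrets

    p = 0xFFFFFFFB          # toy 32-bit prime; real deployments use 2048-bit groups or elliptic curves
    g = 5

    a = secrets.randbelow(p - 2) + 1          # Alice's private value
    b = secrets.randbelow(p - 2) + 1          # Bob's private value

    A = pow(g, a, p)                          # Alice -> Bob
    B = pow(g, b, p)                          # Bob -> Alice

    k_alice = pow(B, a, p)                    # both sides derive the same session key
    k_bob   = pow(A, b, p)
    assert k_alice == k_bob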

Journal ArticleDOI
TL;DR: New profile generation methods, profile-updating and profile-combining, are proposed to reinforce the robustness of keystroke authentication, and their effectiveness is shown through three examinations with experimental data.
Abstract: We have investigated several characteristics of keystroke authentication in Japanese free-text typing, and our methods have provided high recognition accuracy for high typing-skill users who can type 700 or more letters per 5 minutes. There are, however, some situations that decrease the accuracy, such as the passage of a long period after registering each user’s profile documents and the existence of lower typing-skill users who can type only about 500 - 600 letters per 5 minutes. In this paper, we propose new profile generation methods, profile-updating and profile-combining, to reinforce the robustness of keystroke authentication and show their effectiveness through three examinations with experimental data.

Journal ArticleDOI
TL;DR: This paper proposes to improve the poor contrast of classical VSS schemes for text or alphanumeric secret messages and low-entropy images, and proposes a method that allows the size of the structuring element to change according to the contrast and the size of a stacked image.
Abstract: Visual secret sharing (VSS) is one of the cryptographic techniques of image secret sharing schemes (ISSS); it encodes a secret message image (text or picture) into noise-like black and white images, called shares. The shares are stacked together and the secret message image is decoded using the human visual system. One of the major drawbacks of this scheme is the poor contrast of the recovered image, which can be improved if a computational device is available during decoding. In this paper, we propose to improve the poor contrast of classical VSS schemes for text or alphanumeric secret messages and low-entropy images. Initially, the stacked image is binarized using a dynamic threshold value. A mathematical morphological operation is then applied to the stacked image to enhance the contrast of the reconstructed image. Moreover, a method is proposed that allows the size of the structuring element to change according to the contrast and the size of a stacked image. We perform experiments for different types of VSS schemes, different share patterns, different share types (rectangle and circle), and low-entropy images. Experimental results demonstrate the efficacy of the proposed scheme.
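
A rough sketch of the post-processing step described above, using a standard morphological closing from scipy on a binarized stacked image; the mean-based threshold and fixed structuring-element size are placeholders, not the paper's adaptive rules:

    # Sketch: binarize a stacked VSS image, then apply a morphological operation
    # to improve the contrast/legibility of the recovered secret.
    import numpy as np
    from scipy import ndimage

    def enhance_stacked_image(stacked, threshold=None, struct_size=3):
        if threshold is None:
            threshold = stacked.mean()                      # placeholder "dynamic" threshold
        binary = stacked > threshold                        # True = white, False = black
        structure = np.ones((struct_size, struct_size), dtype=bool)
        # Closing fills small speckle holes left by the share noise.
        return ndimage.binary_closing(binary, structure=structure)

    stacked = np.random.randint(0, 256, size=(64, 64))      # stand-in for a stacked share image
    print(enhance_stacked_image(stacked).shape)             # (64, 64) boolean image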

Journal ArticleDOI
TL;DR: The article makes the case for improving the legislation of the Republic of Kazakhstan to strengthen the informational security of individuals, society and the state, and for measures to prevent the destructive impact of harmful information.
Abstract: In this article, the theory of information security is presented in the context of national security. The article is devoted to the pressing problem of legal support for information security in the Republic of Kazakhstan. The author analyzes modern problems and threats to information security in the conditions of globalization and considers various aspects of information security. The article focuses on the spread of harmful information, which negatively affects the psyche, behavior, health and society and destabilizes government administration. It makes the case for improving the legislation of the Republic of Kazakhstan to strengthen the informational security of individuals, society and the state, and for measures to prevent the destructive impact of harmful information.

Journal ArticleDOI
TL;DR: Two proposed modulation techniques to enhance the performance of WSN are introduced; they merge the image and the audio into one signal and improve the energy consumption, the data rate and the security level.
Abstract: The trade-off between energy consumption and the quality of the received image should be considered a main point in technique design for Wireless Sensor Networks (WSN). This paper analyzes the performance of multiple image encryption algorithms with different approaches. It also introduces two proposed modulation techniques to enhance the performance of WSN. These two techniques merge the image and the audio into one signal. The merging process improves the energy consumption and the data rate. In addition, it completely removes the effect of jamming from both the reconstructed image and the reconstructed audio signal at the receiver, so the receiver reconstructs the image without jamming effects. The paper also introduces a proposed audio encryption algorithm. The use of encryption algorithms for both image and audio signals, together with the merging process, enhances the security level. Popular metrics are used to compare these image encryption algorithms and to show the benefits of these enhancements. The results show that one of these image encryption algorithms is preferable to the others, and that the merging process raises the bit rate to a high level.

Journal ArticleDOI
TL;DR: The normal methods of biometric template matching are replaced by a more powerful, high-quality identification approach based on Grobner bases computations, which gives particularly exact matching.
Abstract: Biometric identification systems are principally related to information security as well as data protection and encryption. The paper proposes a method to integrate biometric data encryption and authentication into error correction techniques. The normal methods of biometric template matching are replaced by a more powerful, high-quality identification approach based on Grobner bases computations. In normal biometric systems, where the data are always noisy, an approximate match is expected; our cryptographic method, however, gives particularly exact matching.