
Showing papers on "Key size" published in 2015


Journal ArticleDOI
TL;DR: A coherent-state network protocol able to achieve remarkably high key rates at metropolitan distances, in fact three orders of magnitude higher than those currently achieved, is designed and proposed.
Abstract: Quantum cryptography achieves a formidable task—the remote distribution of secret keys by exploiting the fundamental laws of physics. Quantum cryptography is now headed towards solving the practical problem of constructing scalable and secure quantum networks. A significant step in this direction has been the introduction of measurement-device independence, where the secret key between two parties is established by the measurement of an untrusted relay. Unfortunately, although qubit-implemented protocols can reach long distances, their key rates are typically very low, unsuitable for the demands of a metropolitan network. Here we show, theoretically and experimentally, that a solution can come from the use of continuous-variable systems. We design a coherent-state network protocol able to achieve remarkably high key rates at metropolitan distances, in fact three orders of magnitude higher than those currently achieved. Our protocol could be employed to build high-rate quantum networks where devices securely connect to nearby access points or proxy servers. An end-to-end continuous-variable quantum key distribution system with an untrusted node is proposed. A proof-of-principle experiment shows that 0.1 secret key bits per relay use are distributed at 4 dB loss, corresponding to 20 km in optical fibre.
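
As a quick sanity check on the operating point quoted above, the sketch below converts the 4 dB loss into a fibre distance assuming the standard attenuation of 0.2 dB/km, and turns the 0.1 bits-per-relay-use figure into a throughput for a purely hypothetical relay clock rate (the paper's actual clock rate is not reproduced here).

```python
# Back-of-the-envelope check of the reported operating point, assuming the
# standard telecom-fibre attenuation of 0.2 dB/km (consistent with the
# abstract's 4 dB <-> 20 km figure). The relay clock rate below is a
# hypothetical illustration, not a number from the paper.
ATTENUATION_DB_PER_KM = 0.2
loss_db = 4.0
bits_per_relay_use = 0.1           # 10^-1 secret key bits per relay use

distance_km = loss_db / ATTENUATION_DB_PER_KM
clock_rate_hz = 25e6               # hypothetical relay repetition rate

print(f"distance        : {distance_km:.0f} km")
print(f"secret key rate : {bits_per_relay_use * clock_rate_hz / 1e6:.1f} Mbit/s "
      f"(at {clock_rate_hz:.0e} relay uses per second)")
```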

420 citations


Journal ArticleDOI
TL;DR: In this article, the authors consider covert communication over a discrete memoryless channel in the presence of a warden who observes the signals through another discrete memoryless channel, and show that if the receiver's channel is better than the warden's channel, in a sense made precise, covert communication is possible without a shared secret key.
Abstract: We consider the situation in which a transmitter attempts to communicate reliably over a discrete memoryless channel while simultaneously ensuring covertness (low probability of detection) with respect to a warden, who observes the signals through another discrete memoryless channel. We develop a coding scheme based on the principle of channel resolvability, which generalizes and extends prior work in several directions. First, it shows that, irrespective of the quality of the channels, it is possible to communicate on the order of $\sqrt{n}$ reliable and covert bits over $n$ channel uses if the transmitter and the receiver share on the order of $\sqrt{n}$ key bits; this improves upon earlier results requiring on the order of $\sqrt{n}\log n$ key bits. Second, it proves that, if the receiver's channel is "better" than the warden's channel in a sense that we make precise, it is possible to communicate on the order of $\sqrt{n}$ reliable and covert bits over $n$ channel uses without a secret key; this generalizes earlier results established for binary symmetric channels. We also identify the fundamental limits of covert and secret communications in terms of the optimal asymptotic scaling of the message size and key size, and we extend the analysis to Gaussian channels. The main technical problem that we address is how to develop concentration inequalities for "low-weight" sequences; the crux of our approach is to define suitably modified typical sets that are amenable to concentration inequalities.
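
The square-root law stated above is easy to illustrate numerically; the snippet below simply evaluates the sqrt(n) and sqrt(n)·log(n) scalings side by side, with all constants set to 1 for illustration, so the absolute numbers are not the paper's.

```python
# Numerical illustration of the square-root law: the number of reliable-and-
# covert bits and the required key bits both scale like sqrt(n), versus the
# sqrt(n)*log(n) key bits required by earlier schemes. Unit constants are
# deliberately set to 1; only the scaling is meaningful.
import math

for n in (10**4, 10**6, 10**8):
    covert_bits = math.sqrt(n)                 # achievable message scaling
    key_new     = math.sqrt(n)                 # key cost of this scheme
    key_old     = math.sqrt(n) * math.log(n)   # key cost of earlier schemes
    print(f"n={n:>9}: covert ~ {covert_bits:9.0f} bits, "
          f"key ~ {key_new:9.0f} vs {key_old:11.0f} bits")
```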

184 citations


Journal ArticleDOI
TL;DR: This paper implements Elliptic Curve Cryptography to encrypt, decrypt and digitally sign the cipher image, providing authenticity and integrity.

81 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a security analysis for quantum key distribution, establishing a rigorous tradeoff between various protocol and security parameters for a class of entanglement-based and prepare-and-measure protocols.
Abstract: In this work we present a security analysis for quantum key distribution, establishing a rigorous tradeoff between various protocol and security parameters for a class of entanglement-based and prepare-and-measure protocols. The goal of this paper is twofold: 1) to review and clarify the state-of-the-art security analysis based on entropic uncertainty relations, and 2) to provide an accessible resource for researchers interested in a security analysis of quantum cryptographic protocols that takes into account finite resource effects. For this purpose we collect and clarify several arguments spread in the literature on the subject with the goal of making this treatment largely self-contained. More precisely, we focus on a class of prepare-and-measure protocols based on the Bennett-Brassard (BB84) protocol as well as a class of entanglement-based protocols similar to the Bennett-Brassard-Mermin (BBM92) protocol. We carefully formalize the different steps in these protocols, including randomization, measurement, parameter estimation, error correction and privacy amplification, allowing us to be mathematically precise throughout the security analysis. We start from an operational definition of what it means for a quantum key distribution protocol to be secure and derive simple conditions that serve as sufficient conditions for secrecy and correctness. We then derive and eventually discuss tradeoff relations between the block length of the classical computation, the noise tolerance, the secret key length and the security parameters for our protocols. Our results significantly improve upon previously reported tradeoffs.
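
For orientation, the sketch below evaluates the textbook asymptotic BB84 key rate r = 1 - h(e_x) - h(e_z); the finite-key bounds established in the paper subtract further correction terms that depend on block length and the security parameters, and those are not reproduced here.

```python
# A minimal sketch of the standard asymptotic BB84 secret-key rate
# r = 1 - h(e_x) - h(e_z) (bits per sifted signal), where h is the binary
# entropy. The paper's finite-key analysis adds corrections depending on the
# block length and security parameters, which are omitted here.
import math

def h(p: float) -> float:
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bb84_asymptotic_rate(e_x: float, e_z: float) -> float:
    return max(0.0, 1.0 - h(e_x) - h(e_z))

for qber in (0.01, 0.05, 0.11, 0.12):
    rate = bb84_asymptotic_rate(qber, qber)
    print(f"QBER = {qber:.2f} -> rate = {rate:.3f} bits per sifted signal")
```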

75 citations


Journal ArticleDOI
TL;DR: A record high bit rate prototype QKD system providing a total of 878 Gbit of secure key data over a 34 day period corresponding to a sustained key rate of around 300 kbit/s is reported.
Abstract: Securing information in communication networks is an important challenge in today's world. Quantum Key Distribution (QKD) can provide unique capabilities towards achieving this security, allowing intrusions to be detected and information leakage avoided. We report here a record high bit rate prototype QKD system providing a total of 878 Gbit of secure key data over a 34 day period corresponding to a sustained key rate of around 300 kbit/s. The system was deployed over a standard 45 km link of an installed metropolitan telecommunication fibre network in central Tokyo. The prototype QKD system is compact, robust and automatically stabilised, enabling key distribution during diverse weather conditions. The security analysis includes an efficient protocol, finite key size effects and decoy states, with a quantified key failure probability of ε = 10⁻¹⁰.
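
A quick arithmetic check that the headline figures above are consistent with each other:

```python
# Consistency check of the quoted figures: 878 Gbit of secure key over a
# 34 day period should correspond to roughly 300 kbit/s sustained.
total_bits = 878e9
seconds = 34 * 24 * 3600
print(f"sustained key rate ~ {total_bits / seconds / 1e3:.0f} kbit/s")
```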

75 citations


Book ChapterDOI
01 Nov 2015
TL;DR: A Dynamic Key Length Based Security Framework (DLSeF) is proposed, based on a shared key derived from synchronized prime numbers; the key is dynamically updated at short intervals to thwart man-in-the-middle and other network attacks.
Abstract: The near real-time processing of continuous data flows in large scale sensor networks has many applications in risk-critical domains ranging from emergency management to industrial control systems. The problem is how to ensure end-to-end security (e.g., integrity and authenticity) of such data streams for risk-critical applications. We refer to this as an online security verification problem. Existing security techniques cannot deal with this problem because they were not designed to handle high-volume, high-velocity data in real time. Furthermore, they are inefficient, as they introduce a significant buffering delay during security verification, resulting in a large buffer size requirement for the stream processing server. To address this problem, we propose a Dynamic Key Length Based Security Framework (DLSeF) based on a shared key derived from synchronized prime numbers; the key is dynamically updated at short intervals to thwart man-in-the-middle and other network attacks. Theoretical analyses and experimental results for the DLSeF framework show that it can significantly improve the efficiency of processing stream data by reducing the security verification time without compromising security.
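
The following is a hypothetical sketch of the interval-based key-update idea: both endpoints re-derive a fresh key from a shared secret and a synchronized interval counter, so an intercepted key quickly becomes useless. The HMAC-based derivation and the 10-second interval are illustrative assumptions, not the DLSeF construction itself (which derives keys from synchronized prime numbers).

```python
# Hypothetical interval-based dynamic key derivation in the spirit of DLSeF.
# Both endpoints hold a shared secret and a synchronized clock; a fresh key is
# re-derived every UPDATE_INTERVAL seconds without any key exchange on the wire.
# The HMAC-SHA256 derivation and interval length are illustrative assumptions.
import hmac, hashlib, time

UPDATE_INTERVAL = 10          # seconds between key updates (assumed)
SHARED_SECRET = b"pre-shared secret established out of band"

def current_key(now=None) -> bytes:
    t = time.time() if now is None else now
    interval_index = int(t // UPDATE_INTERVAL)
    return hmac.new(SHARED_SECRET, str(interval_index).encode(), hashlib.sha256).digest()

# Both ends compute the same key for the same interval without exchanging it.
k_sender = current_key(1_000_000.0)
k_receiver = current_key(1_000_005.0)   # 5 s later, still the same interval
assert k_sender == k_receiver
```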

69 citations


Journal ArticleDOI
TL;DR: A new technique is proposed in this paper in which the classic technique of mapping characters to affine points on the elliptic curve is removed and the corresponding ASCII values of the plain text are paired up instead.

69 citations


Proceedings ArticleDOI
14 Jun 2015
TL;DR: This paper proves the unconditional security of a particular semi-quantum protocol and derives an expression for its key rate in the asymptotic scenario.
Abstract: Semi-quantum key distribution protocols are designed to allow two users to establish a secure secret key when one of the two users is limited to performing certain “classical” operations. There have been several such protocols developed recently, however, due to their reliance on a two-way quantum communication channel (and thus, the attacker's opportunity to interact with the qubit twice), their security analysis is difficult and little is known concerning how secure they are compared to their fully quantum counterparts. In this paper we prove the unconditional security of a particular semi-quantum protocol and derive an expression for its key rate, in the asymptotic scenario.

68 citations


Book ChapterDOI
16 Aug 2015
TL;DR: It is proved that if a small number of plaintexts are encrypted under multiple independent keys, the Even-Mansour construction surprisingly offers similar security as an ideal block cipher with the same block and key size.
Abstract: At ASIACRYPT 1991, Even and Mansour introduced a block cipher construction based on a single permutation. Their construction has since been lauded for its simplicity, yet also criticized for not providing the same security as other block ciphers against generic attacks. In this paper, we prove that if a small number of plaintexts are encrypted under multiple independent keys, the Even-Mansour construction surprisingly offers similar security as an ideal block cipher with the same block and key size. Note that this multi-key setting is of high practical relevance, as real-world implementations often allow frequent rekeying. We hope that the results in this paper will further encourage the use of the Even-Mansour construction, especially when a secure and efficient implementation of a key schedule would result in significant overhead.
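
For reference, the construction under discussion encrypts a block x as E_{k1,k2}(x) = k2 XOR P(k1 XOR x) for a fixed public permutation P. The toy sketch below uses a throwaway 16-bit permutation purely to make the shape of the construction concrete; it is not a secure instantiation.

```python
# Toy sketch of the Even-Mansour construction E_{k1,k2}(x) = k2 XOR P(k1 XOR x)
# built around a fixed, key-less public permutation P. The 16-bit "permutation"
# below is a hypothetical stand-in chosen only to make the example runnable;
# a real instantiation would use a cryptographically strong public permutation.
MASK = 0xFFFF  # 16-bit toy block

def P(x: int) -> int:
    """Fixed public permutation (toy: rotate left by 5, add a constant)."""
    x = ((x << 5) | (x >> 11)) & MASK
    return (x + 0x9E37) & MASK

def P_inv(y: int) -> int:
    y = (y - 0x9E37) & MASK
    return ((y >> 5) | (y << 11)) & MASK

def em_encrypt(x: int, k1: int, k2: int) -> int:
    return k2 ^ P(k1 ^ x)

def em_decrypt(y: int, k1: int, k2: int) -> int:
    return k1 ^ P_inv(k2 ^ y)

k1, k2, m = 0x1234, 0xBEEF, 0x0F0F
c = em_encrypt(m, k1, k2)
assert em_decrypt(c, k1, k2) == m
```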

50 citations


Journal ArticleDOI
TL;DR: A new security algorithm using a combination of both symmetric and asymmetric cryptographic techniques is proposed to provide high security with minimized key maintenance; it guarantees three cryptographic primitives: integrity, confidentiality and authentication.

47 citations


Proceedings ArticleDOI
12 Nov 2015
TL;DR: The enhancement of the AES algorithm is discussed and the process of generating dynamic S-boxes for the Advanced Encryption Standard (AES) is described; the generated S-boxes are dynamic and key-dependent, which makes differential and linear cryptanalysis more difficult.
Abstract: Cryptographic algorithms uniquely define the mathematical steps required to encrypt and decrypt messages in a cryptographic system; in short, they protect data from unauthorized access. Encryption is a crucial technique for protecting important electronic information, allowing two parties to communicate while preventing unauthorized parties from accessing the information. The process of encrypting information needs to be dynamic in nature to ensure protection from the novel and advanced techniques used by cryptanalysts. The substitution box (S-box) is a fundamental component of contemporary symmetric cryptosystems, as it provides nonlinearity and thereby enhances their security. This paper discusses the enhancement of the AES algorithm and describes the process of generating dynamic S-boxes for the Advanced Encryption Standard (AES). The generated S-boxes are dynamic and key-dependent, which makes differential and linear cryptanalysis more difficult. NIST randomness tests and correlation coefficient analysis were conducted on the proposed dynamic AES algorithm, and the results show that it offers better security than the original AES.
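
One common way to realize key-dependent S-boxes, shown here only as an illustration (the paper's own generation procedure may differ), is to shuffle the identity permutation on 0..255 with a generator seeded from the key:

```python
# Hypothetical illustration of a key-dependent S-box: shuffle the identity
# permutation on 0..255 with a PRNG seeded from the key. This shows the general
# idea only; it is not the generation procedure proposed in the paper, and a
# Python PRNG seeded this way is not a cryptographically sound construction.
import hashlib
import random

def key_dependent_sbox(key: bytes) -> list[int]:
    seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
    rng = random.Random(seed)
    sbox = list(range(256))
    rng.shuffle(sbox)
    return sbox

sbox = key_dependent_sbox(b"sixteen byte key")
inv_sbox = [0] * 256
for i, v in enumerate(sbox):
    inv_sbox[v] = i                     # invert the permutation
assert all(inv_sbox[sbox[i]] == i for i in range(256))
```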

Proceedings ArticleDOI
15 May 2015
TL;DR: AES and DES are discussed and compared using MATLAB software, and their results are compared on the basis of avalanche effect, simulation time and memory required.
Abstract: Nowadays, the use of digital data exchange is increasing day by day in every field. Information security plays a very important role in storing and transmitting data. When we transmit multimedia data such as audio, video and images over the network, cryptography provides security. In cryptography, we encode data before sending it and decode it on receiving; for this purpose, many cryptographic algorithms are used. AES and DES are the most commonly used cryptographic algorithms: AES provides encryption to secure data before transmission, and DES likewise provides security. In this paper we discuss AES and DES and their comparison using MATLAB software. After applying AES and DES, we compare their results on the basis of avalanche effect, simulation time and memory required.
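
The avalanche-effect metric used in such comparisons can be measured directly: flip one plaintext bit and count how many ciphertext bits change (ideally around 50% for a good block cipher). A minimal sketch using the pycryptodome AES implementation (an assumption on our part; the paper itself uses MATLAB):

```python
# Minimal avalanche-effect measurement: flip a single plaintext bit and count
# how many ciphertext bits change. Requires the pycryptodome package; the key
# and plaintext below are arbitrary illustrative values.
from Crypto.Cipher import AES   # pip install pycryptodome

def avalanche_bits(key: bytes, block: bytes, bit_to_flip: int) -> int:
    cipher = AES.new(key, AES.MODE_ECB)
    c1 = cipher.encrypt(block)
    flipped = bytearray(block)
    flipped[bit_to_flip // 8] ^= 1 << (bit_to_flip % 8)
    c2 = cipher.encrypt(bytes(flipped))
    return sum(bin(a ^ b).count("1") for a, b in zip(c1, c2))

key = bytes(range(16))
block = b"0123456789abcdef"
changed = avalanche_bits(key, block, bit_to_flip=0)
print(f"{changed}/128 ciphertext bits changed after flipping one plaintext bit")
```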

Journal ArticleDOI
TL;DR: The proposed scheme implements Elliptic Curve Cryptography (ECC) for secure key distribution and data exchange, and provides a better security mechanism for WSNs used with healthcare devices.

Book ChapterDOI
12 Aug 2015
TL;DR: In this paper, a TMD tradeoff attack on the stream cipher Sprout is presented; the internal state size of a stream cipher is supposed to be at least twice the key length to provide resistance against conventional Time-Memory-Data (TMD) tradeoff attacks.
Abstract: The internal state size of a stream cipher is supposed to be at least twice the key length to provide resistance against the conventional Time-Memory-Data (TMD) tradeoff attacks. This well-adopted security criterion seems to be one of the main obstacles in designing, particularly, ultra-lightweight stream ciphers. At FSE 2015, Armknecht and Mikhalev proposed an elegant design philosophy for stream ciphers: fixing the key and dividing the internal states into equivalence classes where any two different keys always produce non-equivalent internal states. The main concern in the design philosophy is to decrease the internal state size without compromising the security against TMD tradeoff attacks. If the number of equivalence classes is more than the cardinality of the key space, then the cipher is expected to be resistant against TMD tradeoff attacks even though the internal state, except the fixed key, is of fairly small length. Moreover, Armknecht and Mikhalev presented a new design, which they call Sprout, to embody their philosophy. In this work, ironically, we mount a TMD tradeoff attack on Sprout within practical limits using $2^d$ output bits in $2^{71-d}$ encryptions of Sprout along with $2^{d}$ table lookups. The memory complexity is $2^{86-d}$ where $d \le 40$. In one instance, it is possible to recover the key in $2^{31}$ encryptions and $2^{40}$ table lookups if we have $2^{40}$ bits of keystream output, by using tables of 770 terabytes in total. The offline phase of preparing the tables consists of solving roughly $2^{41.3}$ systems of linear equations with 20 unknowns and an effort of about $2^{35}$ encryptions. Furthermore, we mount a guess-and-determine attack having a complexity of about $2^{68}$ encryptions with negligible data and memory. We have verified our attacks by conducting several experiments. Our results show that Sprout can be practically broken.
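
The complexities quoted above form a tradeoff curve in the parameter d; evaluating a few points of it directly from the abstract's formulas:

```python
# The attack traces out a tradeoff curve for d <= 40: 2^d keystream bits and
# table lookups, 2^(71-d) encryptions, and memory with exponent 86-d (reported
# here simply as 2^(86-d) units, matching the exponent given in the text).
for d in (20, 30, 40):
    print(f"d={d}: data/lookups = 2^{d}, time = 2^{71 - d} encryptions, "
          f"memory = 2^{86 - d}")
```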

Journal ArticleDOI
TL;DR: Although there are many symmetric key algorithms, this work proposes a content-based algorithm that follows the symmetric key cryptography method, implementing a binary addition operation, a circular bit-shifting operation and a folding method, with particular care taken to make the key secure.

Journal ArticleDOI
TL;DR: In this paper, a detailed security analysis of a d-dimensional quantum key distribution protocol based on two and three mutually unbiased bases (MUBs) both in an asymptotic and finite key length scenario is presented.
Abstract: We present a detailed security analysis of a d-dimensional quantum key distribution protocol based on two and three mutually unbiased bases (MUBs), both in an asymptotic and a finite key length scenario. The finite secret key rates are calculated as a function of the length of the sifted key by (i) generalizing the uncertainty relation-based insight from BB84 to any d-level 2-MUB QKD protocol and (ii) adopting recent advances in the second-order asymptotics for finite block length quantum coding (for both d-level 2- and 3-MUB QKD protocols). Since the finite and asymptotic secret key rates increase with d and the number of MUBs (together with the tolerable threshold), such QKD schemes could in principle offer an important advantage over BB84. We discuss the possibility of an experimental realization of the 3-MUB QKD protocol with the orbital angular momentum degrees of freedom of photons.

Proceedings ArticleDOI
23 Nov 2015
TL;DR: A step-by-step tutorial is given for transforming ECC over the prime field GF(p) from mathematical concept to software implementation, along with several alternatives and tradeoffs between different coordinate systems in the computational process.
Abstract: Over the last decade, the growth of computing power and parallel computing has resulted in a significant need for efficient cryptosystems. Elliptic Curve Cryptography (ECC) offers faster computation and stronger security than other asymmetric cryptosystems such as RSA. ECC can be used for several cryptographic activities: secret key sharing, message encryption, and digital signatures. This paper gives a step-by-step tutorial on transforming ECC over the prime field GF(p) from mathematical concept to software implementation. It also gives several alternatives and tradeoffs between different coordinate systems in the computational process. The implementation result is quite interesting, since several computational costs have been optimized in the latest instruction sets. As a case study, we provide the implementation results in C with the GNU GMP library on an Intel i3 CPU M350 2.27 GHz (1 core, 2 GB RAM, 32-bit architecture).
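
The core of such a tutorial is the affine group law on the curve. Below is a minimal sketch over a toy prime field; the paper's case study is in C with GMP, so this is just the same arithmetic in compact form, on a deliberately tiny curve that is illustrative only.

```python
# Affine-coordinate group law on y^2 = x^3 + ax + b over GF(p), plus
# double-and-add scalar multiplication. The tiny curve parameters are
# illustrative; real deployments use standardized curves and usually
# projective/Jacobian coordinates to avoid the modular inversions done here.
p, a, b = 97, 2, 3                       # toy curve y^2 = x^3 + 2x + 3 over GF(97)
O = None                                 # point at infinity

def ec_add(P, Q):
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O                                           # P + (-P) = O
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p     # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p            # chord slope
    x3 = (s * s - x1 - x2) % p
    y3 = (s * (x1 - x3) - y1) % p
    return (x3, y3)

def ec_mul(k, P):
    """Double-and-add scalar multiplication."""
    R = O
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

G = (3, 6)          # on the curve: 6^2 = 36 and 3^3 + 2*3 + 3 = 36 (mod 97)
print(ec_mul(5, G))
```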

Proceedings ArticleDOI
03 Dec 2015
TL;DR: A fair comparison between RSA and a Modified RSA algorithm, in terms of time and security, is presented by running several encryption and decryption settings to process data of different sizes; the results show that the RSA algorithm is faster than Modified RSA in terms of encryption and decryption speed.
Abstract: Digital signatures have been providing security services to secure electronic transactions over the internet. The Rivest-Shamir-Adleman (RSA) algorithm is the most widely used technique for providing such security. Here we have modified the RSA algorithm to enhance its level of security. This paper presents a fair comparison between RSA and the Modified RSA algorithm, in terms of time and security, by running several encryption and decryption settings to process data of different sizes. The efficiency of these algorithms was considered based on key generation speed and security level. Texts of different sizes were encrypted and decrypted using the RSA and Modified RSA algorithms. The simulation results show that key generation in the Modified RSA algorithm is faster and that it enhances security by two levels, while the RSA algorithm is faster than Modified RSA in terms of encryption and decryption speed.
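
For context, the baseline being compared against is textbook RSA. A deliberately tiny, padding-free sketch of its key generation, encryption and decryption follows (this is not the authors' Modified RSA, whose changes are not reproduced here):

```python
# Textbook RSA with toy primes, shown only to make the keygen / encrypt /
# decrypt steps concrete. Real keys use primes of 1024+ bits and proper
# padding; this padding-free form is insecure and purely illustrative.
from math import gcd

p, q = 61, 53                    # toy primes
n, phi = p * q, (p - 1) * (q - 1)

e = 17                           # small public exponent, must be coprime to phi
while gcd(e, phi) != 1:
    e += 2
d = pow(e, -1, phi)              # private exponent: e*d = 1 (mod phi)

m = 42                           # message encoded as an integer < n
c = pow(m, e, n)                 # encryption
assert pow(c, d, n) == m         # decryption recovers the message
print(f"n={n}, e={e}, d={d}, ciphertext={c}")
```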

Journal ArticleDOI
TL;DR: This paper proposes a generic framework of lightweight key updating that can protect the current cryptographic standards and evaluates the minimum requirements for heuristic SCA-security, and proposes a complete solution to protect the implementation of any standard mode of Advanced Encryption Standard.
Abstract: Side-channel analysis (SCA) exploits the information leaked through unintentional outputs (e.g., power consumption) to reveal the secret key of cryptographic modules. The real threat of SCA lies in the ability to mount attacks over small parts of the key and to aggregate information over different encryptions. The threat of SCA can be thwarted by changing the secret key at every run. Indeed, many contributions in the domain of leakage resilient cryptography tried to achieve this goal. However, the proposed solutions were computationally intensive and were not designed to solve the problem of the current cryptographic schemes. In this paper, we propose a generic framework of lightweight key updating that can protect the current cryptographic standards and evaluate the minimum requirements for heuristic SCA-security. Then, we propose a complete solution to protect the implementation of any standard mode of Advanced Encryption Standard. Our solution maintains the same level of SCA-security (and sometimes better) as the state of the art, at a negligible area overhead while doubling the throughput of the best previous work.
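
The underlying rekeying idea can be illustrated generically: derive a fresh session key for every encryption from an updating secret state, so no single key accumulates enough side-channel traces. The hash-chain sketch below is a common generic construction and is not the specific lightweight framework proposed in the paper.

```python
# Generic rekeying illustration: each message is encrypted under a fresh
# session key derived from a forward-updating secret state, limiting how many
# traces an SCA adversary can collect against any one key. This hash-chain
# update is a standard generic idea, not the paper's framework.
import hashlib

def next_key(state: bytes) -> tuple[bytes, bytes]:
    """Return (new_state, session_key) derived from the current state."""
    new_state = hashlib.sha256(b"update" + state).digest()
    session_key = hashlib.sha256(b"session" + state).digest()[:16]
    return new_state, session_key

state = b"initial shared secret (illustrative)"
for msg_index in range(3):
    state, k = next_key(state)
    print(f"message {msg_index}: session key {k.hex()}")
```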

Journal ArticleDOI
TL;DR: This work presents a protocol that uses session keys derived from those master keys to establish a group key that is information-theoretically secure and compares favorably to multi-party extensions of Diffie-Hellman key exchange.
Abstract: Advances in lattice-based cryptography are enabling the use of public key algorithms (PKAs) in power-constrained ad hoc and sensor network devices. Unfortunately, while many wireless networks are dominated by group communications, PKAs are inherently unicast—i.e., public/private key pairs are generated by data destinations. To fully realize public key cryptography in these networks, lightweight PKAs should be augmented with energy-efficient mechanisms for group key agreement. We consider a setting where master keys are loaded on clients according to an arbitrary distribution. We present a protocol that uses session keys derived from those master keys to establish a group key that is information-theoretically secure. When master keys are distributed randomly, our protocol requires $O(\log_b t)$ multicasts, where $1-1/b$ is the probability that a given client possesses a given master key. The minimum number of public multicast transmissions required for a set of clients to agree on a secret key in our setting was recently characterized. The proposed protocol achieves the best possible approximation to that optimum that is computable in polynomial time. Moreover, the computational requirements of our protocol compare favorably to multi-party extensions of Diffie-Hellman key exchange.

Journal ArticleDOI
TL;DR: All QKD phases are analysed with an emphasis on explaining the process of shortening the initial key, and the results are verified with a large number of tests using a quantum cryptography simulator.
Abstract: Quantum key distribution (QKD) is based on the laws of quantum physics and therefore it can guarantee the highest level of security. It is used to establish the key that is used for further symmetrical encryption. Since QKD consists of several phases in which the key is reduced, it is necessary to define the equation by which the length of the raw key is calculated. In this paper, we analyse all QKD phases with an emphasis on the explanation of the process of shortening the initial key. The results are verified with a large number of tests using a quantum cryptography simulator. DOI: http://dx.doi.org/10.5755/j01.eee.21.6.13768
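
A schematic version of that shortening, using common textbook approximations (50% sifting, error-correction leakage of about 1.2·h(QBER) bits per bit, privacy amplification removing h(QBER) bits per bit) rather than the exact equation derived in the paper:

```python
# Schematic shrinkage of the key through QKD post-processing: sifting, error
# correction, privacy amplification. The 50% sifting factor, the 1.2*h(QBER)
# error-correction leakage and the privacy-amplification term are textbook
# approximations, not the specific equation derived in the paper.
import math

def h(p):
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

raw_bits = 1_000_000
qber = 0.03
sifted = raw_bits * 0.5                    # basis reconciliation keeps ~half
ec_leak = 1.2 * h(qber) * sifted           # bits revealed during error correction
final = sifted * (1 - h(qber)) - ec_leak   # privacy amplification removes Eve's info

print(f"raw {raw_bits} -> sifted {sifted:.0f} -> final ~{final:.0f} bits")
```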

Proceedings ArticleDOI
02 Apr 2015
TL;DR: An efficient many-to-many group key management protocol for distributed group communication is proposed; it is based on Elliptic Curve Cryptography and decreases the key length while providing security at the same level as other cryptosystems.
Abstract: Secure and reliable group communication is an active area of research. Its popularity is fuelled by the growing importance of group-oriented and collaborative applications. The central research challenge is secure and efficient group key management. In this paper, we propose an efficient many-to-many group key management protocol for distributed group communication. This protocol is based on Elliptic Curve Cryptography and decreases the key length while providing security at the same level as other cryptosystems. The main issues in secure group communication are group dynamics and key management. A scalable secure group communication model ensures that whenever there is a membership change, a new group key is computed and distributed to the group members with minimal communication and computation cost. This paper explores the use of batching of group membership changes to reduce the time and key re-distribution operations. The features of the ECC protocol are that no keys are exchanged between existing members at join, and only one key, the group key, is delivered to the remaining members at leave. In the security analysis, our proposed algorithm takes less time when users join or leave the group in comparison to the existing one. In ECC, there is only one key generation and key encryption overhead at join and leave operations. At join, the communication overhead is the key size of one node, and at leave it is (2 log2 n - 2) times the key size of a node.
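
Evaluating the join/leave overhead formulas quoted above for a few group sizes, assuming an illustrative 256-bit per-node key size (the paper states the formula in units of "key size of a node"):

```python
# Communication overhead at join and leave from the abstract's formulas:
# join costs one node key, leave costs (2*log2(n) - 2) node keys.
# The 256-bit key size is an illustrative assumption.
import math

KEY_BITS = 256
for n in (64, 1024, 65536):
    leave_overhead = (2 * math.log2(n) - 2) * KEY_BITS
    print(f"n={n:>6}: join = {KEY_BITS} bits, leave = {leave_overhead:.0f} bits")
```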

Proceedings ArticleDOI
09 Jul 2015
TL;DR: The ECDH crypto library allows the use of public key cryptography for key establishment on microcontrollers with limited resources without adding any additional specialized equipment.
Abstract: In this article, the ECDH crypto library is introduced. This library is designed for the ultra-low-power MSP430 microcontroller and allows time- and memory-consuming cryptographic operations to be implemented on this resource-limited microcontroller. The main part of the article focuses on the ECDH implementation on the MSP430 microcontroller, and some implementation problems are discussed. The practical part of the article focuses on measuring computation times and memory requirements. Our ECDH crypto library allows the use of public key cryptography for key establishment on microcontrollers with limited resources without adding any additional specialized equipment.
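
The protocol such a library implements is standard ECDH: each party generates an ephemeral key pair, exchanges public keys, and derives the same shared secret. The sketch below shows that flow using the desktop `cryptography` package with curve X25519 as a stand-in; it is not the authors' MSP430 code.

```python
# Standard ECDH flow: generate key pairs, exchange public keys, derive the
# same shared secret on both sides, then run it through a KDF. Uses the
# `cryptography` package with X25519 as an illustrative curve choice.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key.
alice_shared = alice_priv.exchange(bob_priv.public_key())
bob_shared = bob_priv.exchange(alice_priv.public_key())
assert alice_shared == bob_shared

session_key = HKDF(algorithm=hashes.SHA256(), length=16,
                   salt=None, info=b"ecdh session").derive(alice_shared)
print(session_key.hex())
```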

Journal ArticleDOI
TL;DR: Performance comparisons show that the 48K-bit design, which is applicable for both RSA and fully homomorphic encryption, outperforms the previous works with respect to throughput and efficiency.
Abstract: This brief presents a novel and efficient design for a Rivest–Shamir–Adleman (RSA) cryptosystem with a very large key size. A new modular multiplier architecture is proposed by combining the fast Fourier transform-based Strassen multiplication algorithm and Montgomery reduction, which is different from the interleaved version of Montgomery multiplications used in traditional RSA designs. A new modular exponentiation algorithm is also proposed for the RSA design. Applying this method, we have implemented 8K/12K-bit and 48K-bit RSA on application-specific integrated circuit designs. The results show that the proposed method gains more advantage as the key size increases, which matches the complexity analysis. Performance comparisons show that the 48K-bit design, which is applicable for both RSA and fully homomorphic encryption, outperforms the previous works with respect to throughput and efficiency.
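
The reduction step at the heart of such multipliers is Montgomery's REDC, which computes T·R⁻¹ mod N without dividing by N. A small sketch follows; plain Python integers stand in for the FFT-based (Strassen) multiplier of the hardware design, so only the reduction idea is shown.

```python
# Montgomery reduction (REDC): returns T * R^(-1) mod N using only
# multiplications and shifts, no division by N. Python big integers stand in
# for the paper's FFT-based multiplier hardware.
def montgomery_setup(N: int, r_bits: int):
    R = 1 << r_bits
    assert N % 2 == 1 and N < R
    n_prime = (-pow(N, -1, R)) % R        # N * N' = -1 (mod R)
    return R, n_prime

def redc(T: int, N: int, R: int, n_prime: int) -> int:
    """Return T * R^(-1) mod N, assuming 0 <= T < R*N."""
    m = (T * n_prime) % R
    t = (T + m * N) >> (R.bit_length() - 1)   # exact division by R (power of two)
    return t - N if t >= N else t

N = 0xE3F5C7A9                                # odd modulus (illustrative)
R, n_prime = montgomery_setup(N, 32)
a, b = 123456789, 987654321
a_bar, b_bar = (a * R) % N, (b * R) % N       # convert to Montgomery form
prod_bar = redc(a_bar * b_bar, N, R, n_prime) # Montgomery product
assert redc(prod_bar, N, R, n_prime) == (a * b) % N
```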

Proceedings ArticleDOI
Pei Luo, Yunsi Fei, Xin Fang, A. Adam Ding, David Kaeli, Miriam Leeser
14 Jun 2015
TL;DR: In this paper, the authors present a side-channel analysis of a hardware implementation of MAC-Keccak on FPGA and compare the attack complexity with other cryptographic algorithms.
Abstract: As Keccak has been selected as the new SHA-3 standard, Message Authentication Code (MAC) (MAC-Keccak) using a secret key will be widely used for integrity checking and authenticity assurance. Recent works have shown the feasibility of side-channel attacks against software implementations of MAC-Keccak to retrieve the key, with the security assessment of hardware implementations remaining an open problem. In this paper, we present a comprehensive and practical side-channel analysis of a hardware implementation of MAC-Keccak on FPGA. Different from previous works, we propose a new attack method targeting the first round output of MAC-Keccak rather than the linear operation θ only. The results on sampled power traces show that the unprotected hardware implementation of MAC-Keccak is vulnerable to side-channel attacks, and attacking the nonlinear operation of MAC-Keccak is very effective. We further discuss countermeasures against side-channel analysis on hardware MAC-Keccak. Finally, we discuss the impact of the key length on side-channel analysis and compare the attack complexity between MAC-Keccak and other cryptographic algorithms.

Proceedings ArticleDOI
14 Apr 2015
TL;DR: This work shows that key generation with PUFs is a practical application of the generic information theoretic problem of secret key agreement with a compound source, and presents an improved secure sketch construction with the new optimal syndrome coding scheme for PUFs, Systematic Low Leakage Coding (SLLC).
Abstract: Physical Unclonable Functions (PUFs) derive unique secrets from internal manufacturing variations in integrated circuits. This work shows that key generation with PUFs is a practical application of the generic information theoretic problem of secret key agreement with a compound source. We present an improved secure sketch construction with our new optimal syndrome coding scheme for PUFs, Systematic Low Leakage Coding (SLLC). Our scheme provides inherent information theoretic security without the need of a hash function or strong extractor, and optimal asymptotic performance concerning maximum key size and minimum helper data size. The secrecy leakage is bounded by a small epsilon that goes to zero for sufficiently good PUFs. The reference implementation for an ASIC application scenario shows that our scheme does not require the 47% hardware overhead for the hash function that is mandatory for the state-of-the-art approaches.
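
The syndrome construction mentioned above can be illustrated with a toy code: publish the syndrome of the PUF response as helper data, then use it to correct a noisy re-measurement. The Hamming(7,4) sketch below corrects a single bit flip and only illustrates the idea; it is not the paper's SLLC scheme or its leakage analysis.

```python
# Toy syndrome-construction secure sketch with a Hamming(7,4) code: enrollment
# publishes only the 3-bit syndrome as helper data; reproduction uses it to
# correct a single bit flip in a noisy 7-bit PUF re-measurement.
H = [  # parity-check matrix of Hamming(7,4); column j is the binary of j+1
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

def syndrome(bits):
    return tuple(sum(h * b for h, b in zip(row, bits)) % 2 for row in H)

def enroll(w):                        # helper data = syndrome of the response
    return syndrome(w)

def reproduce(w_noisy, helper):
    diff = tuple(a ^ b for a, b in zip(syndrome(w_noisy), helper))
    if diff == (0, 0, 0):
        return list(w_noisy)                           # no error
    pos = diff[0] * 4 + diff[1] * 2 + diff[2] - 1      # error position from syndrome
    corrected = list(w_noisy)
    corrected[pos] ^= 1
    return corrected

w = [1, 0, 1, 1, 0, 0, 1]             # enrollment PUF response
helper = enroll(w)
w_noisy = w.copy(); w_noisy[4] ^= 1   # one bit flipped at re-measurement
assert reproduce(w_noisy, helper) == w
```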

Journal ArticleDOI
TL;DR: By exploiting the properties of finite-length polar codes, the authors introduce a physical layer encryption scheme to make secure and efficient communication between a sender and a legitimate receiver against both active and passive attacks, simultaneously.
Abstract: In this study, by exploiting the properties of finite-length polar codes, the authors introduce a physical layer encryption scheme to make secure (from a computational security perspective) and efficient communication between a sender (Alice) and a legitimate receiver (Bob) against both active and passive attacks, simultaneously. To prevent active attacks, two techniques are considered: (i) a novel method is introduced to keep the generator matrix of the polar code secret from an active attacker (Oscar); (ii) a proper joint polar encoding/encryption algorithm based on the hidden generator matrix is introduced. Two additional strategies are considered against passive attacks: (i) a new method is introduced to partition the bit-channels into good and bad bit-channels, and the scrambled information bits are then transmitted over those bit-channels that are good for Bob but bad for a passive attacker (Eve); (ii) secret key cryptography is implemented at the physical layer, such that Eve cannot decode the eavesdropped data without knowledge of the secret key shared between the authorised parties. In addition, this study discusses efficiency analysis results consisting of the key size, error performance and computational complexity of the proposed scheme.

Journal ArticleDOI
TL;DR: This paper proposes a new approach for e-security applications using the concept of genetic algorithms with pseudorandom sequences to encrypt and decrypt data streams, and presents an application of GA in the field of cryptography.
Abstract: Cryptography is a basic tool for protecting and securing data. Security provides safety, reliability and accuracy. The Genetic Algorithm (GA) is typically used to obtain solutions to optimization and search problems. This paper presents an application of GA in the field of cryptography. The selection of a key in public key cryptography is a process in which keys can be categorized on the basis of their fitness function, making GA a good candidate for key generation. We propose a new approach for e-security applications using the concept of genetic algorithms with pseudorandom sequences to encrypt and decrypt data streams. Many different image encryption methods have been proposed to keep images secure. Image encryption algorithms try to convert an image to another image that is hard to understand.

Proceedings ArticleDOI
16 Mar 2015
TL;DR: An enhanced security model of an OTP system using ECC with palm-vein biometrics is suggested, offering better security with a smaller key size than other prevalent public key crypto-models.
Abstract: Security of one-time passwords (OTP) is essential because nowadays most e-commerce transactions are performed with the help of this mechanism. OTP is used to counter replay attacks/eavesdropping. A replay attack, or eavesdropping, is one type of attack on network-connected or isolated computing environments. For achieving a 112-bit security level, the Rivest-Shamir-Adleman (RSA) algorithm needs a key size of 2048 bits, while Elliptic Curve Cryptography (ECC) needs a key size of 224–255 bits. Another issue with most existing implementations of security models is the storage of secret keys. Cryptographic keys are often kept in an unsecured way and can either be guessed/social-engineered or obtained through brute force attacks. This becomes a weak link and leads to integrity issues for sensitive data in a security model. To overcome this problem, biometrics is combined with cryptography to develop a strong security model. This paper suggests an enhanced security model of an OTP system using ECC with palm-vein biometrics. This model also offers better security with a smaller key size than other prevalent public key crypto-models. The cryptographic keys are not required to be memorized or stored anywhere; they are generated as and when needed.