
Showing papers on "Key size published in 2007"


Journal ArticleDOI
TL;DR: A new stream cipher, Grain, is proposed, which targets hardware environments where gate count, power consumption and memory are very limited, and has the additional feature that the speed can be increased at the expense of extra hardware.
Abstract: A new stream cipher, Grain, is proposed. The design targets hardware environments where gate count, power consumption and memory are very limited. It is based on two shift registers and a non-linear output function. The cipher has the additional feature that the speed can be increased at the expense of extra hardware. The key size is 80 bits and no attack faster than exhaustive key search has been identified. The hardware complexity and throughput compare favourably to other hardware-oriented stream ciphers like E0 and A5/1.
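The two-register structure described above can be sketched in miniature. The register sizes, feedback taps, and output function below are invented for illustration and are not Grain's actual specification; only the overall shape (an LFSR and an NFSR feeding a non-linear filter, with the keystream XOR-ed onto the plaintext) follows the abstract.

```python
# Toy sketch of a Grain-style stream cipher: an LFSR and an NFSR feeding a
# non-linear output function. All taps and sizes here are illustrative.

def keystream(key_bits, n_out):
    """Generate n_out keystream bits from 16 key bits (toy sizes)."""
    lfsr = list(key_bits[:8])    # linear feedback shift register
    nfsr = list(key_bits[8:16])  # non-linear feedback shift register
    out = []
    for _ in range(n_out):
        # Non-linear output function mixing both registers (illustrative).
        z = lfsr[0] ^ nfsr[0] ^ (lfsr[3] & nfsr[5])
        out.append(z)
        # Linear feedback (taps chosen arbitrarily for this sketch).
        fb_l = lfsr[0] ^ lfsr[2] ^ lfsr[5]
        # Non-linear feedback also mixes in the LFSR, as in Grain.
        fb_n = nfsr[0] ^ nfsr[1] ^ (nfsr[3] & nfsr[6]) ^ lfsr[0]
        lfsr = lfsr[1:] + [fb_l]
        nfsr = nfsr[1:] + [fb_n]
    return out

key = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1]
ks = keystream(key, 32)
plaintext = [0, 1, 1, 0, 1, 0, 0, 1]
ciphertext = [p ^ k for p, k in zip(plaintext, ks)]
# XOR-ing with the same keystream again decrypts.
recovered = [c ^ k for c, k in zip(ciphertext, ks)]
```

The speed/hardware trade-off mentioned in the abstract corresponds to clocking several such steps per cycle in hardware.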

570 citations


01 Mar 2007
TL;DR: The asymmetric-key-based key agreement schemes in this Recommendation are based on the Diffie-Hellman (DH) and Menezes-Qu-Vanstone (MQV) algorithms, and an asymmetric-key-based key transport scheme is specified.
Abstract: This Recommendation provides the specifications of key establishment schemes that are appropriate for use by the U.S. Federal Government, based on standards developed by the Accredited Standards Committee (ASC) X9, Inc.: American National Standard (ANS) X9.42 Agreement of Symmetric Keys using Discrete Logarithm Cryptography and ANS X9.63 Key Agreement and Key Transport using Elliptic Curve Cryptography. A key establishment scheme can be characterized as either a key agreement scheme or a key transport scheme. The asymmetric-key-based key agreement schemes in this Recommendation are based on the Diffie-Hellman (DH) and Menezes-Qu-Vanstone (MQV) algorithms. In addition, an asymmetric-key-based key transport scheme is specified.
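The DH agreement underlying these schemes can be illustrated with toy numbers; a real deployment uses the large standardized domain parameters, not the tiny `p`, `g`, and secrets below.

```python
# Minimal Diffie-Hellman key agreement. Each party publishes g^x mod p and
# derives the same shared secret g^(ab) mod p. Toy parameters only.
p, g = 23, 5                  # toy group (real DH uses >= 2048-bit moduli)
a_priv, b_priv = 6, 15        # private keys of the two parties
A = pow(g, a_priv, p)         # Alice's public key
B = pow(g, b_priv, p)         # Bob's public key
shared_a = pow(B, a_priv, p)  # Alice computes (g^b)^a
shared_b = pow(A, b_priv, p)  # Bob computes (g^a)^b
assert shared_a == shared_b   # both hold g^(ab) mod p
```

MQV augments this basic exchange with long-term keys to add implicit authentication, which plain DH lacks.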

242 citations


Proceedings ArticleDOI
24 Jun 2007
TL;DR: This work revisits the code-based identification protocol proposed by Stern at Crypto'93, and gives evidence that the size of public keys can be dramatically reduced while preserving a high and well-understood level of security.
Abstract: We revisit the code-based identification protocol proposed by Stern at Crypto'93, and give evidence that the size of public keys can be dramatically reduced while preserving a high and well-understood level of security. More precisely, the public keys can be made even shorter than RSA ones (typically 347 bits), while their size is around 150 Kbits in the original scheme. This is achieved by using matrices which are double circulant, rather than purely random. On the whole, this provides a very practical identification (and possibly signature) scheme which is mostly attractive for light-weight cryptography.

101 citations


Journal Article
TL;DR: In this paper, the authors investigate the strength of DES against attacks that use a limited number of plaintexts and ciphertexts, and find that up to 6-round DES is susceptible to this kind of attack.
Abstract: The Data Encryption Standard (DES) is a 64-bit block cipher. Despite its short key size of 56 bits, DES continues to be used to protect financial transactions valued at billions of Euros. In this paper, we investigate the strength of DES against attacks that use a limited number of plaintexts and ciphertexts. By mounting meet-in-the-middle attacks on reduced-round DES, we find that up to 6-round DES is susceptible to this kind of attack. The results of this paper lead to a better understanding of the way DES can be used.
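The meet-in-the-middle principle behind such attacks can be shown on a toy double encryption. The "cipher" below is an invented 8-bit keyed permutation, not DES; only the attack structure is the point: tabulate the forward half under every k1, then match it against the backward half under every k2.

```python
def enc(k, x):
    """Toy 8-bit 'cipher': a keyed, invertible permutation of bytes."""
    return (((x ^ k) * 5) + k) % 256  # invertible because gcd(5, 256) == 1

def dec(k, y):
    """Inverse of enc."""
    inv5 = pow(5, -1, 256)  # 205, the inverse of 5 modulo 256
    return (((y - k) * inv5) % 256) ^ k

def mitm(pairs):
    """Recover (k1, k2) candidates for ct = enc(k2, enc(k1, pt))."""
    pt0, ct0 = pairs[0]
    forward = {}
    for k1 in range(256):                      # tabulate the forward half
        forward.setdefault(enc(k1, pt0), []).append(k1)
    candidates = []
    for k2 in range(256):                      # meet it from the backward half
        for k1 in forward.get(dec(k2, ct0), []):
            candidates.append((k1, k2))
    # Additional pairs filter the false positives that survive one match.
    return [(k1, k2) for (k1, k2) in candidates
            if all(enc(k2, enc(k1, p)) == c for p, c in pairs[1:])]

k1_true, k2_true = 0x3A, 0xC5
pairs = [(p, enc(k2_true, enc(k1_true, p))) for p in (17, 99, 200)]
found = mitm(pairs)  # ~2*256 half-cipher evaluations instead of ~256^2
```

The attack trades memory (the `forward` table) for time, and needs only the handful of known plaintext/ciphertext pairs used for filtering, which is why such attacks suit the low-data setting the paper studies.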

53 citations


Journal ArticleDOI
TL;DR: In this article, the authors characterize the form of an invertible quantum operation, i.e., a completely positive trace preserving linear transformation (a CPTP map) whose inverse is also a CPTP mapping, and show that these maps correspond to applying a unitary transformation to the state along with an ancilla initialized to a fixed state.
Abstract: In this note, we characterize the form of an invertible quantum operation, i.e., a completely positive trace preserving linear transformation (a CPTP map) whose inverse is also a CPTP map. The precise form of such maps becomes important in contexts such as self-testing and encryption. We show that these maps correspond to applying a unitary transformation to the state along with an ancilla initialized to a fixed state, which may be mixed. The characterization of invertible quantum operations implies that one-way schemes for encrypting quantum states using a classical key may be slightly more general than the "private quantum channels" studied by Ambainis, Mosca, Tapp and de Wolf [1, Section 3]. Nonetheless, we show that their results, most notably a lower bound of 2n bits of key to encrypt n quantum bits, extend in a straightforward manner to the general case.

50 citations


Book ChapterDOI
09 Dec 2007
TL;DR: The strength of DES against attacks that use a limited number of plaintexts and ciphertexts is investigated by mounting meet-in-the-middle attacks on reduced-round DES.
Abstract: The Data Encryption Standard (DES) is a 64-bit block cipher. Despite its short key size of 56 bits, DES continues to be used to protect financial transactions valued at billions of Euros. In this paper, we investigate the strength of DES against attacks that use a limited number of plaintexts and ciphertexts. By mounting meet-in-the-middle attacks on reduced-round DES, we find that up to 6-round DES is susceptible to this kind of attack. The results of this paper lead to a better understanding of the way DES can be used.

44 citations


Book ChapterDOI
21 Feb 2007
TL;DR: In this article, it was shown that if the n-bit source S allows for a secure encryption of b bits, where b > log n, then one can deterministically extract nearly b almost perfect random bits from S.
Abstract: Most cryptographic primitives require randomness (for example, to generate their secret keys). Usually, one assumes that perfect randomness is available, but, conceivably, such primitives might be built under weaker, more realistic assumptions. This is known to be true for many authentication applications, where entropy alone is typically sufficient. In contrast, all known techniques for achieving privacy seem to fundamentally require (nearly) perfect randomness. We ask whether this is just a coincidence, or whether privacy perhaps inherently requires true randomness. We completely resolve this question for the case of (information-theoretic) private-key encryption, where parties wish to encrypt a b-bit value using a shared secret key sampled from some imperfect source of randomness S. Our main result shows that if such an n-bit source S allows for a secure encryption of b bits, where b > log n, then one can deterministically extract nearly b almost perfect random bits from S. Further, the restriction that b > log n is nearly tight: there exist sources S allowing one to perfectly encrypt (log n - loglog n) bits, but not to deterministically extract even a single slightly unbiased bit. Hence, to a large extent, true randomness is inherent for encryption: either the key length must be exponential in the message length b, or one can deterministically extract nearly b almost unbiased random bits from the key. In particular, the one-time pad scheme is essentially "universal". Our technique also extends to related computational primitives which are perfectly-binding, such as perfectly-binding commitment and computationally secure private- or public-key encryption, showing the necessity to efficiently extract almost b pseudorandom bits.

34 citations


Book ChapterDOI
26 Mar 2007
TL;DR: Two reflection attacks on r-round Blowfish, a fast, software-oriented encryption algorithm with a variable key length k, work successfully on approximately 2^(k+32-16r) keys, which are called reflectively weak keys.
Abstract: The reflection attack is a recently discovered self-similarity analysis which is usually mounted on ciphers with many fixed points. In this paper, we describe two reflection attacks on r-round Blowfish, which is a fast, software-oriented encryption algorithm with a variable key length k. The attacks work successfully on approximately 2^(k+32-16r) keys, which we call reflectively weak keys. We give an almost precise characterization of these keys. One interesting result is that 2^34 known plaintexts are enough to determine if the unknown key is a reflectively weak key, for any key length and any number of rounds. Once a reflectively weak key is identified, a large amount of subkey information is revealed at no cost. Then, we recover the key in roughly r · 2^(16r+22) steps. Furthermore, it is possible to improve the attack for some key lengths by using memory to store all reflectively weak keys in a table in advance. The pre-computation phase costs roughly r · 2^(k-11) steps. Then the unknown key can be recovered in 2^(k+32-16r)/64 steps. As an independent result, we improve Vaudenay's analysis of Blowfish for reflectively weak keys. Moreover, we propose a new success criterion for an attack working on some subset of the key space when the key generator is random.

29 citations


01 Jan 2007
TL;DR: An implementation of Elliptic Curve Cryptography-based Threshold Cryptography (ECC-TC) is presented, which explores the three most efficient ECC encryption algorithms and puts forth the possibility of using these ECC-TC algorithms in different scenarios in a MANET.
Abstract: A Mobile Ad hoc Network (MANET) consists of multiple wireless mobile devices that form a network on the fly to allow communication with each other without any infrastructure. Due to its nature, providing security in this network is challenging. Threshold Cryptography (TC) provides a promise of securing this network. In this paper, our purpose is to find the most suitable ECC algorithm compared to RSA. Through our implementation of Elliptic Curve Cryptography-based Threshold Cryptography (ECC-TC), we have explored the three most efficient ECC encryption algorithms and put forth the possibility of using these ECC-TC algorithms in different scenarios in a MANET. We compare all ECC-TC results and suggest an algorithm that would be most suitable for a MANET. Finally, we put forth a new secret-sharing alternative that limits communication overheads for transmitting multiple secrets at the same time.

25 citations


Proceedings ArticleDOI
13 Dec 2007
TL;DR: A new security protocol for on-line transaction can be designed using combination of both symmetric and asymmetric cryptographic techniques, which provides three cryptographic primitives - integrity, confidentiality and authentication.
Abstract: A new security protocol for on-line transactions can be designed using a combination of both symmetric and asymmetric cryptographic techniques. This protocol provides three cryptographic primitives: integrity, confidentiality and authentication. It uses elliptic curve cryptography for encryption, the RSA algorithm for authentication and MD-5 for integrity. Instead of ECC, a symmetric cipher (AES-Rijndael) can be used to encrypt, public key cryptography (RSA) to authenticate and MD-5 to check integrity. Symmetric cryptographic algorithms are fast compared to asymmetric cryptographic algorithms like RSA and elliptic curve cryptography. Communication has a major impact on today's business, and it is desired to communicate data with high security. At present, various types of cryptographic algorithms provide high security to information on controlled networks. These algorithms are required to provide data security and user authenticity. A new security protocol has been designed for better security using a combination of both symmetric and asymmetric cryptographic techniques.
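The three primitives can be sketched together on a toy message. The confidentiality layer below is a placeholder repeating-key XOR standing in for the ECC/AES cipher, and the key and message are invented; only the MD-5 integrity check uses the real primitive named in the abstract (RSA authentication is omitted from the sketch).

```python
import hashlib

def toy_encrypt(key: bytes, msg: bytes) -> bytes:
    # Stand-in for the ECC (or AES-Rijndael) confidentiality layer:
    # repeating-key XOR, which is its own inverse. Illustrative only.
    return bytes(m ^ key[i % len(key)] for i, m in enumerate(msg))

def integrity_tag(msg: bytes) -> str:
    # MD-5 digest used as the integrity check, as in the protocol.
    return hashlib.md5(msg).hexdigest()

key = b"session-key"                     # toy key
msg = b"pay 100 EUR to account 42"      # toy transaction
ct = toy_encrypt(key, msg)
tag = integrity_tag(msg)

# Receiver: decrypt (XOR again) and verify the integrity tag.
pt = toy_encrypt(key, ct)
ok = integrity_tag(pt) == tag
```

Note that MD-5 is no longer considered collision-resistant; it appears here only because the surveyed protocol specifies it.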

24 citations


01 Jan 2007
TL;DR: The new cipher CTC2 is MUCH more secure than CTC against LC, and the key scheduling of CTC has been extended to use any key size, independently of the block size.
Abstract: The cipher CTC (Courtois Toy Cipher) described in [4] has been designed to demonstrate that it is possible to break, on a PC, a block cipher with good diffusion using a very small number of known (or chosen) plaintexts. It has, however, never been designed to withstand all known attacks on block ciphers, and Dunkelman and Keller have shown [13] that a few bits of the key can be recovered by Linear Cryptanalysis (LC), which cannot, however, compromise the security of a large key. This weakness can easily be avoided: in this paper we give a specification of CTC2, a tweaked version of CTC. The new cipher is MUCH more secure than CTC against LC, and the key scheduling of CTC has been extended to use any key size, independently of the block size. Otherwise, there is little difference between CTC and CTC2. We will show that up to 10 rounds of CTC2 can be broken by simple algebraic attacks.

Book ChapterDOI
10 Sep 2007
TL;DR: The aim of this work is to explore the possibilities of dedicated hardware implementing the best known algorithm for generic curves, the parallelized Pollard's ρ method, and to improve the accuracy of the security level offered by a given key size.
Abstract: In this last decade, Elliptic Curve Cryptography (ECC) has gained increasing acceptance in industry and the academic community and has been the subject of several standards. This interest is mainly due to the high level of security provided by ECC with relatively small keys. Indeed, no sub-exponential algorithms are known to solve the underlying hard problem: the Elliptic Curve Discrete Logarithm. The aim of this work is to explore the possibilities of dedicated hardware implementing the best known algorithm for generic curves: the parallelized Pollard's ρ method. This problem has specific constraints and therefore requires new architectures. Four different strategies were investigated with different FPGA families in order to provide the best area-time product, according to the capabilities of the chosen platforms. The approach yielding the best throughput over hardware cost ratio is then fully described and was implemented in order to estimate the cost of an attack. Such results should help to improve the accuracy of the security level offered by a given key size, especially for the shorter parameters proposed for resource-constrained devices.
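For reference, a serial software sketch of the ρ method being parallelized: a pseudo-random walk over elements of the form g^a·h^b with Floyd cycle detection, run here in a toy subgroup of Z*_227 of prime order 113 (an elliptic-curve group is handled identically, with point addition in place of modular multiplication; all parameters are toys).

```python
import random

p = 227           # prime modulus; p = 2*113 + 1
q = 113           # prime order of the subgroup generated by g
g = 4             # generator of the order-113 subgroup of Z*_227
x_secret = 33
h = pow(g, x_secret, p)   # h = g^x; the attack recovers x

def step(y, a, b):
    """Pseudo-random walk on y = g^a * h^b, split into three branches."""
    if y % 3 == 0:
        return (y * g) % p, (a + 1) % q, b
    if y % 3 == 1:
        return (y * y) % p, (2 * a) % q, (2 * b) % q
    return (y * h) % p, a, (b + 1) % q

def rho():
    while True:
        a, b = random.randrange(q), random.randrange(q)
        start = ((pow(g, a, p) * pow(h, b, p)) % p, a, b)
        tort, hare = start, start
        while True:                      # Floyd cycle detection
            tort = step(*tort)
            hare = step(*step(*hare))
            if tort[0] == hare[0]:
                break
        # g^a1 * h^b1 = g^a2 * h^b2  =>  x*(b1 - b2) = a2 - a1 (mod q)
        _, a1, b1 = tort
        _, a2, b2 = hare
        db = (b1 - b2) % q
        if db == 0:
            continue                     # useless collision; fresh restart
        return ((a2 - a1) * pow(db, -1, q)) % q

x_found = rho()
```

The expected cost is about sqrt(q) walk steps; the parallelized variant in the paper runs many such walks at once and collects "distinguished points" to find collisions across machines.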

Journal ArticleDOI
TL;DR: This work evaluates the security performance of the recently proposed "stealth" approach to covert communications over a public fiber-optical network and demonstrates the security advantage of the system by examining the BER/SNR performance as a function of the fidelity of the decoder used by an eavesdropper.
Abstract: We evaluate the security performance of the recently proposed “stealth” approach to covert communications over a public fiber-optical network. We present quantitative security analysis to assess the vulnerability of such systems against different attacks executed by an eavesdropper. We demonstrate the security advantage of the system by examining the BER/SNR performance as a function of the fidelity of the decoder used by an eavesdropper. Effective key length is constructed as a security metric to gauge the level of confidentiality implicit in the secure transmission.

Proceedings ArticleDOI
13 Jul 2007
TL;DR: An optimized encryption method which may be associated with the RSA key generation mechanism is proposed which can be implemented on new generation networks and applies to wireless networks with Bluetooth devices which need an increased security by enlargement of the utilization area.
Abstract: The RSA algorithm proposed by Rivest, Shamir and Adleman as a public key cryptosystem is used in different communication networks in order to ensure data confidentiality. Different weaknesses of this algorithm have been observed and many attacks against it have been developed successfully. This algorithm is improved in this paper in order to ensure higher data security and increased computing speed. We propose an optimized encryption method which may be associated with the RSA key generation mechanism. The proposed method is based on a detailed analysis of algebraic finite fields (AFF). The improved algorithm can be implemented on new-generation networks (second-generation networks and so on) and applies to wireless networks with Bluetooth devices, which need increased security through enlargement of the utilization area. At the same time, we have used a maximum acceptable encryption key length and algorithm complexity, which increases the computing speed and security degree but allows the processor to work properly.

Journal ArticleDOI
01 Jun 2007
TL;DR: This paper finds that these equations exhibit characteristics that satisfy the expected properties of a Message Authentication Code (MAC), and provides a novel approach to generating a MAC with higher security but a smaller key size.
Abstract: Chaos functions are mainly used to develop mathematical models of non-linear systems. They have attracted the attention of many mathematicians owing to their extreme sensitivity to initial conditions and their immense applicability to problems in daily life. In this paper, two widely used chaos functions, the Logistic equation and the Lorenz equation, are taken and analyzed. We found that these equations exhibit characteristics that satisfy the expected properties of a Message Authentication Code (MAC). We also provide a novel approach to generating a MAC with higher security but a smaller key size. This is achieved by designing the algorithm to use variable Initialization Vectors (IVs) instead of constant IVs. A variable IV adds strength to the security of the message. The experimental results show that the approach satisfies the expected characteristics of a Message Authentication Code generation algorithm.
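A toy in the spirit of the described construction: the logistic map x → r·x·(1−x) is keyed by an initial condition derived from the secret key and a variable IV, and each message byte perturbs the trajectory. The constants, derivation, and mixing steps below are illustrative inventions, not the paper's algorithm.

```python
def logistic_mac(key, iv, message, rounds=64):
    """Toy chaos-based 32-bit tag keyed by (key, iv). Illustrative only."""
    # Derive x0 strictly inside (0, 1) from the key and the variable IV.
    seed = (key * 2654435761 + iv) % (2**32)
    x = (seed + 1) / (2**32 + 2)
    r = 3.99                      # map parameter in the chaotic regime
    for byte in message:
        # Fold each message byte into the state, keeping x in (0, 1).
        x = (x + (byte + 1) / 257) % 1.0 or 0.5
        for _ in range(rounds):
            x = r * x * (1 - x)   # iterate the chaotic logistic map
    return int(x * 2**32) & 0xFFFFFFFF

tag1 = logistic_mac(0xDEADBEEF, 42, b"transfer 100 EUR")
tag2 = logistic_mac(0xDEADBEEF, 42, b"transfer 900 EUR")
```

The sensitivity to initial conditions means a one-byte change in the message (or a different IV) yields an unrelated tag, which is the property the paper exploits; note that floating-point chaos toys like this give no real security guarantees.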

01 Jan 2007
TL;DR: A novel Blind Signature Scheme (BSS) has been proposed, which allows a requester to obtain a signature from a signer on any document, in such a way that the signer learns nothing about the message being signed.
Abstract: In this paper, a novel Blind Signature Scheme (BSS) has been proposed. The scheme is based on the Elliptic Curve Discrete Logarithm Problem (ECDLP). It allows a requester to obtain a signature from a signer on any document, in such a way that the signer learns nothing about the message being signed. The scheme utilizes the inherent advantages of the Elliptic Curve Cryptosystem, namely smaller key size and lower computational overhead, compared to counterpart public-key cryptosystems such as RSA and ElGamal. The scheme has been proved to be robust, untraceable and correct. The proposed scheme can be used in various applications like e-voting and digital cash, where anonymity of the requester is required.

01 Jan 2007
TL;DR: In this article, the authors proposed a new cryptographic key exchange protocol based on the Mandelbrot and Julia Fractal sets, which is resistant against attacks, utilizes a small key size and performs comparatively faster than the existing Diffie-Hellman key exchange protocol.
Abstract: In this paper, we propose a new cryptographic key exchange protocol based on the Mandelbrot and Julia Fractal sets. The Fractal-based key exchange protocol is possible because of the intrinsic connection between the Mandelbrot and Julia Fractal sets. In the proposed protocol, the Mandelbrot Fractal function takes the chosen private key as the input parameter and generates the corresponding public key. The Julia Fractal function is then used to calculate the shared key based on the existing private key and the received public key. The proposed protocol is designed to be resistant against attacks, utilizes a small key size and performs comparatively faster than the existing Diffie-Hellman key exchange protocol. The proposed Fractal key exchange protocol is therefore an attractive alternative to traditional number theory based key exchange protocols.

Posted Content
TL;DR: In this article, the authors proposed a differential fault analysis on the AES key schedule and showed how an entire 128-bit AES key can be retrieved using two pairs of correct and faulty ciphertexts.
Abstract: This letter proposes a differential fault analysis on the AES key schedule and shows how an entire 128-bit AES key can be retrieved. In the workshop at FDTC 2007, we presented the DFA mechanism on the AES key schedule and proposed general attack rules. Using our proposed rules, we showed an efficient attack that can retrieve 80 bits of the 128-bit key. Recently, we have found a new attack that can obtain an additional 8 bits compared with our previous attack. As a result, we present the most efficient attack for retrieving 88 bits of the 128-bit key using approximately two pairs of correct and faulty ciphertexts.

01 Jan 2007
TL;DR: Two well-known Public Key Cryptographic Systems, RSA and NTRU, are presented and implemented to verify their performance for text files of different sizes.
Abstract: In many business sectors, secure and efficient data transfer is essential. To ensure security for business applications, these sectors use Public Key Cryptographic Systems (PKCS). The RSA and NTRU systems belong to the category of PKCS. The efficiency of a public key cryptographic system is mainly measured in computational overhead, key size and bandwidth. In particular, the RSA algorithm is used in many applications. Although the security of RSA is beyond doubt, the evolution in computing power has caused a growth in the necessary key length. The fact that most chips on smart cards cannot process keys exceeding 1024 bits shows that there is a need for an alternative. NTRU is such an alternative: a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials. This allows NTRU to achieve high speeds with the use of minimal computing power. NTRU is the first secure public key cryptosystem not based on factorization or discrete logarithm problems, meaning that, even given sufficient computational resources and time, an adversary should not be able to break the key. The performance characteristics of NTRU and RSA are observed by implementing the algorithms and comparing their experimental running times. NTRU's encryption and decryption take O(n log n) operations, compared with RSA's O(n^3). In this paper, we present these two well-known Public Key Cryptographic Systems and implement them to verify their performance for text files of different sizes.

Proceedings ArticleDOI
13 Dec 2007
TL;DR: This paper examines how the modular exponentiation computation of T^e mod n could be sped up by drawing on Indian Vedic Mathematics, compared with the conventional algorithms in existence.
Abstract: The standard techniques for providing privacy and security in data networks include encryption/decryption algorithms such as the Advanced Encryption Standard (AES) (private-key) and RSA (public-key). RSA is one of the safest standard algorithms, based on public keys, for providing security in networks. One of the most time-consuming processes in the RSA encryption/decryption algorithm is the modular exponentiation computation of T^e mod n, where T is the text and (e, n) is the key. This paper examines how this computation could be sped up by drawing on Indian Vedic Mathematics, compared with the conventional algorithms in existence.
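The conventional baseline being compared against is binary (square-and-multiply) modular exponentiation, which processes the exponent bit by bit and can be sketched as:

```python
def mod_exp(t, e, m):
    """Compute t^e mod m by square-and-multiply, scanning e's bits."""
    result = 1
    base = t % m
    while e > 0:
        if e & 1:                  # current exponent bit is 1
            result = (result * base) % m
        base = (base * base) % m   # square the base for the next bit
        e >>= 1
    return result
```

Any speed-up of the underlying multiplications (the paper's Vedic-mathematics angle) accelerates every iteration of this loop, since the loop itself is already logarithmic in the exponent.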

01 Jan 2007
TL;DR: It has been verified that this RSA encryption engine can perform 32-bit, 256-bit and 1024-bit encryption operations in less than 41.585 µs, 531.515 µs and 790.61 µs, respectively.
Abstract: An approach to developing an FPGA implementation of a flexible-key RSA encryption engine that can be used as a standard device in secured communication systems is presented. The VHDL model of this RSA encryption engine has the unique characteristic of supporting multiple key sizes, and thus can easily fit into systems that require different levels of security. Simple nested-loop addition and subtraction have been used to implement the RSA operation. This has made the processing time faster and used a comparatively smaller amount of space in the FPGA. The hardware design is targeted at the Altera STRATIX II family, and it was determined that the flexible-key RSA encryption engine fits best in the device named EP2S30F484C3. The RSA encryption implementation uses 13,779 logic elements and achieves a clock frequency of 17.77 MHz. It has been verified that this RSA encryption engine can perform 32-bit, 256-bit and 1024-bit encryption operations in less than 41.585 µs, 531.515 µs and 790.61 µs, respectively.
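The "nested loop addition and subtraction" approach can be sketched in software: an inner shift-and-add loop with conditional subtraction forms products mod m without any division, and an outer loop performs the exponentiation. This is a sketch of the general technique, not the paper's VHDL design.

```python
def mod_mul(a, b, m):
    """a*b mod m using only additions, shifts, and subtractions."""
    result = 0
    a %= m
    while b > 0:
        if b & 1:
            result += a
            if result >= m:
                result -= m   # one conditional subtraction suffices
        a <<= 1               # next partial product: a * 2^i
        if a >= m:
            a -= m
        b >>= 1
    return result

def rsa_op(text, exp, m):
    """RSA encrypt/decrypt core: text^exp mod m via square-and-multiply."""
    result = 1
    base = text % m
    while exp > 0:
        if exp & 1:
            result = mod_mul(result, base, m)
        base = mod_mul(base, base, m)
        exp >>= 1
    return result
```

Because every intermediate stays below 2m, each step needs at most one subtraction, which maps directly onto small adder/subtractor hardware.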

Proceedings ArticleDOI
08 Oct 2007
TL;DR: The results show that the abilities of AES and its S-boxes to resist CPA attacks are correlated, and an evaluation of the ability of S-boxes to thwart CPA is presented in a quantitative way.
Abstract: Most of today's wireless sensor networks use a symmetric-key algorithm such as AES for security. Cryptographic S-boxes are an integral part of AES; although a rich literature is devoted to their efficient implementation, little attention has been paid to the security aspects of S-box designs. In this paper we conducted a simulation-based CPA attack on AES implementations with different S-box structures. Our results show that the abilities of AES and its S-boxes to resist CPA attacks are correlated, and an evaluation of the ability of S-boxes to thwart CPA is presented in a quantitative way. Exploiting this further, a novel byte-substitution circuit using inhomogeneous S-boxes instead of fixed S-boxes is proposed, and the simulation results show that power consumption becomes randomized and the peak corresponding to the correct key is masked successfully.

Journal ArticleDOI
TL;DR: A dedicated cryptanalysis algorithm is proposed based on a generalized time-memory-data trade-off approach, its main characteristics are derived, and it points out a security weakness of employing a block cipher with block length shorter than the key length in the considered BE schemes.
Abstract: In this letter, a weakness of certain broadcast encryption (BE) schemes is addressed, in which the protected delivery of a session key (SEK) is based on XOR-ing this SEK with the IDs of the keys employed for its encryption. The weakness can be effectively exploited assuming only passive attacks, which, in the case of a malicious legitimate user being the attacker, is a ciphertext-only attack. A dedicated algorithm for cryptanalysis is proposed based on a generalized time-memory-data trade-off approach, and its main characteristics are derived. The developed algorithm points out a security weakness of employing a block cipher with block length shorter than the key length in the considered BE schemes.

01 Jan 2007
TL;DR: A new cryptographic key exchange protocol based on the Mandelbrot and Julia Fractal sets is proposed, designed to be resistant against attacks; it utilizes a small key size and performs comparatively faster than the existing Diffie-Hellman key exchange protocol.
Abstract: In this paper, we propose a new cryptographic key exchange protocol based on the Mandelbrot and Julia Fractal sets. The Fractal-based key exchange protocol is possible because of the intrinsic connection between the Mandelbrot and Julia Fractal sets. In the proposed protocol, the Mandelbrot Fractal function takes the chosen private key as the input parameter and generates the corresponding public key. The Julia Fractal function is then used to calculate the shared key based on the existing private key and the received public key. The proposed protocol is designed to be resistant against attacks, utilizes a small key size and performs comparatively faster than the existing Diffie-Hellman key exchange protocol. The proposed Fractal key exchange protocol is therefore an attractive alternative to traditional number theory based key exchange protocols.

Proceedings Article
01 Jan 2007
TL;DR: Experimental results demonstrate that the FPGA implementation of the elliptic curve point multiplication can speedup the point multiplication by 31.6 times compared to a software based implementation.
Abstract: Elliptic curve cryptography (ECC) is an alternative to traditional techniques for public key cryptography. It offers a smaller key size without sacrificing security level. In a typical elliptic curve cryptosystem, elliptic curve point multiplication is the most computationally expensive component, so it is more attractive to implement this unit in hardware than in software. In this paper, we propose an efficient FPGA implementation of elliptic curve point multiplication in GF(2^m). We have designed and synthesized the elliptic curve point multiplication with a Xilinx FPGA. Experimental results demonstrate that the FPGA implementation can speed up the point multiplication by 31.6 times compared to a software-based implementation.
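Point multiplication itself is typically realized by double-and-add, the elliptic-curve analogue of square-and-multiply. For readability the sketch below uses a textbook toy curve over a small prime field rather than the binary field GF(2^m) targeted by the FPGA design; the curve, base point, and order are standard classroom values.

```python
P_MOD = 17               # field prime (toy size)
A_COEF, B_COEF = 2, 2    # curve y^2 = x^3 + 2x + 2 over GF(17)
G = (5, 1)               # base point; its order on this curve is 19

def point_add(P, Q):
    """Group law; None represents the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                      # P + (-P) = infinity
    if P == Q:                           # doubling: tangent slope
        lam = (3 * x1 * x1 + A_COEF) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                # addition: chord slope
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P=G):
    """Compute k*P by double-and-add, scanning k's bits."""
    result, addend = None, P
    while k > 0:
        if k & 1:
            result = point_add(result, addend)
        addend = point_add(addend, addend)
        k >>= 1
    return result
```

In the GF(2^m) setting the field operations change (polynomial arithmetic instead of integer mod p) but the double-and-add structure, which is what the FPGA accelerates, is the same.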

Proceedings ArticleDOI
22 Apr 2007
TL;DR: This paper surveys some popular quantum cryptographic protocols and finds that the efficient six-state protocol outperforms the others both in the tolerable quantum bit error rate and in the key generation rate when a realistic laser source is used.
Abstract: Communications in secrecy are often required in many commercial and military applications. Unfortunately, many cryptographic schemes in use today, such as public-key cryptography based on the RSA algorithm, would be broken by either unanticipated advances in hardware and algorithms or the advent of quantum computers. Quantum cryptography, on the other hand, has been proven secure even against the most general attack allowed by the laws of physics and is a promising technology poised for widespread adoption in realistic cryptographic applications. Quantum cryptography allows two parties to expand a secret key that they have previously shared. Various quantum cryptographic protocols have been proposed to perform this task. In this paper, we survey some popular quantum cryptographic protocols (including the famous Bennett-Brassard 1984 protocol) and discuss their security. Specifically, we consider their security in two cases: the ideal case, where a perfect single-photon source is used, and the practical case, where a realistic laser source is used. We compare the protocols and find that the efficient six-state protocol outperforms the others both in the tolerable quantum bit error rate and in the key generation rate when a realistic laser source is used.
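The sifting step common to the BB84-family protocols surveyed can be simulated directly: Alice sends random bits in random bases, Bob measures in random bases, and both keep only the positions where their bases agree. With an ideal source, a noiseless channel, and no eavesdropper, the sifted strings must coincide; all sizes here are toys.

```python
import random

random.seed(1)
n = 64
alice_bits  = [random.randrange(2) for _ in range(n)]
alice_bases = [random.randrange(2) for _ in range(n)]  # 0: rectilinear, 1: diagonal
bob_bases   = [random.randrange(2) for _ in range(n)]

bob_results = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    if a_basis == b_basis:
        bob_results.append(bit)                  # same basis: deterministic
    else:
        bob_results.append(random.randrange(2))  # wrong basis: random outcome

# Publicly compare bases (never the bits) and keep matching positions.
sifted_alice = [b for b, x, y in zip(alice_bits, alice_bases, bob_bases) if x == y]
sifted_bob   = [b for b, x, y in zip(bob_results, alice_bases, bob_bases) if x == y]
```

An intercept-resend eavesdropper would disturb roughly a quarter of the sifted bits, which is the error signal the surveyed protocols monitor; the six-state protocol's third basis raises that disturbance, improving its tolerable error rate.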

Proceedings ArticleDOI
13 Dec 2007
TL;DR: An efficient Montgomery modular multiplication technique that employs multi-bit shifting and carry-save addition to perform long-integer arithmetic is proposed, and the resulting Montgomery multiplier and RSA processor performance results are the fastest reported to date in the literature.
Abstract: A new, generic silicon architecture for implementing Montgomery's multiplication algorithm is presented. This paper proposes an efficient Montgomery modular multiplication technique that employs multi-bit shifting and carry-save addition to perform long-integer arithmetic. The gain in data throughput for Montgomery multiplication is approximately 45.49% (for 1024-bit length) and the hardware reduction is 24.27% compared with traditional methods. Hence, the corresponding hardware realization is optimal in terms of area and offers higher data throughput for Montgomery multiplication. The practical application of this approach has been demonstrated by applying it to the design of an RSA processor architecture with 512-bit and 1024-bit key sizes. The RSA processor also offers higher throughput (32.18% for 1024-bit) with a slight increase in area (2.24%). This optimization is also technology independent and thus should suit not only FPGA implementation but also ASIC. The Montgomery design and RSA processor have been evaluated on the Xilinx Virtex-4 series for the practical bit lengths of 512, 1024 and 2048. The resulting Montgomery multiplier and RSA processor performance results presented are the fastest reported to date in the literature.
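The radix-2 (bit-serial) textbook form of Montgomery multiplication, which the architecture above extends with multi-bit shifting and carry-save adders, computes a·b·R^(-1) mod m for R = 2^n using only shifts and additions, never dividing by m:

```python
def mont_mul(a, b, m, n):
    """Return a*b*2^(-n) mod m, for odd m and a, b < m (radix-2 form)."""
    result = 0
    for i in range(n):
        if (a >> i) & 1:
            result += b      # add the i-th partial product
        if result & 1:
            result += m      # make result even: adding m preserves the value mod m
        result >>= 1         # exact halving == multiplying by 2^-1 mod m
    if result >= m:
        result -= m          # final conditional subtraction
    return result

# Example: with m = 1009 and R = 2^10, mont_mul returns a*b*R^-1 mod m.
m, n = 1009, 10
x = mont_mul(123, 456, m, n)
```

To obtain a plain a·b mod m one pre-multiplies the operands by R^2 mod m; the payoff is that every reduction step is a shift or an addition, which is exactly what carry-save hardware handles well.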

01 Jan 2007
TL;DR: A novel fault-tolerant parity prediction method for the AES substitution box based on composite fields that reduces the logic complexity by 70 to 80%, and a novel metric for side channel resistance that shows the relationship between the mean and the standard deviation when achieving 2^80-level security.
Abstract: Elliptic curve cryptography, invented in the 1980s, and the Advanced Encryption Standard (AES), standardized in 2002, have become important research topics in security because advances in modern computing systems have allowed attacks on older cryptographic schemes to be achieved in a shorter period of time. Elliptic curve cryptography is the fundamental mathematical building block for elliptic curve cryptosystems. Elliptic curve cryptosystems are expected to replace the aging RSA cryptosystem which is currently deployed in e-commerce, banking, secure voice over Internet protocol (VoIP), secure sockets layer (SSL), and virtual private network (VPN). Likewise, the AES cryptosystem has replaced the aging Data Encryption Standard (DES) and Triple-DES cryptosystems found in similar applications. In the future, Internet protocol television (IPTV), and secure voice and video over Internet protocol (V2oIP) are expected to require these newer cryptosystems. Due to the nature of a broad range of applications from secure VoIP to e-commerce, each application requires different elliptic curve cryptographic accelerators and AES accelerators to meet its particular design constraints. This thesis considers various VLSI implementation issues of AES and elliptic curve cryptography accelerators for Galois fields defined by large primes p, represented as GF(p), and Galois fields defined by irreducible polynomials of degree m, represented as GF(2^m), and develops novel approaches to improving performance, reducing area, implementing reconfigurable architecture design, and designing for side channel attack resistance. We implemented two reconfigurable elliptic curve cryptography accelerators. One was designed for low power applications while the other was designed for high throughput applications. They achieved speedups of 1.24 and 56, respectively.
Then, we investigated the area-time complexity tradeoff for various implementation approaches to elliptic curve cryptography accelerators. We found that a designer can indeed approximate the required area for a given performance requirement before implementation, due to the O(m^3) area-time product. We investigated high performance reconfigurable systems for changing the reduction polynomial in GF(2^m) and determined that an approach based on systematic reduction performed best (approximately 37% faster than Barrett or Montgomery reduction). We propose a novel greedy dual-base decomposition algorithm for pre-computation of the scalar in the scalar point multiplication method. This method achieves a significant performance improvement (up to 31%) over standard approaches. We propose a novel side channel resistant least significant bit (LSB) invariant scalar point multiplication method that can use pre-computation to reduce the complexity of multiple scalar point multiplications on the same base point. It can achieve almost twice the speed of Montgomery's invariant method. We propose a novel fault-tolerant parity prediction method for the AES substitution box based on composite fields that reduces the logic complexity by 70 to 80%. We propose a novel metric for side channel resistance that shows the relationship between the mean and the standard deviation when achieving 2^80-level security.
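The LSB-first idea rests on right-to-left double-and-add, in which the successive doublings of the base point do not depend on the scalar and can therefore be pre-computed and reused across many scalars on the same base point. A toy sketch on a small textbook curve, y^2 = x^3 + 2x + 2 over GF(17) with generator G = (5, 1) of order 19 (names are illustrative, and no side-channel counter-measures are included):

```python
# Curve parameters for y^2 = x^3 + 2x + 2 over GF(17)
P_MOD, A = 17, 2
O = None  # point at infinity

def ec_add(p, q):
    """Affine point addition (and doubling) on the toy curve."""
    if p is None: return q
    if q is None: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O                       # p + (-p) = infinity
    if p == q:                         # tangent slope for doubling
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                              # chord slope for addition
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mul_lsb(k, p):
    """Right-to-left (LSB-first) double-and-add: the sequence p, 2p, 4p, ...
    is scalar-independent, so it can be pre-computed for a fixed base point."""
    acc = O
    while k:
        if k & 1:
            acc = ec_add(acc, p)
        p = ec_add(p, p)
        k >>= 1
    return acc

G = (5, 1)
```

With the doubled points tabulated once, each further scalar multiplication on the same base reduces to additions only, which is the complexity saving the thesis exploits.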

Journal ArticleDOI
01 Apr 2007
TL;DR: A group key management protocol for hierarchical sensor networks where instead of using pre-deployed keys, each sensor node generates a partial key dynamically using a function that takes partial keys of its children as arguments.
Abstract: In this paper, we describe a group key management protocol for hierarchical sensor networks where, instead of using pre-deployed keys, each sensor node generates a partial key dynamically using a function. The function takes the partial keys of its children as arguments. The design of the protocol is motivated by the fact that traditional cryptographic techniques are impractical in sensor networks because of the associated high energy and computational overheads. The group key management protocol supports the establishment of two types of group keys: one for the nodes within a group (intra-cluster), and the other among a group of cluster heads (inter-cluster). The protocol handles freshness of the group key dynamically, and eliminates the involvement of a trusted third party (TTP). We have experimentally analyzed the time and energy consumption in broadcasting partial keys and the group key under two sensor routing protocols (Tiny-AODV and Tiny-Diffusion) by varying the number of nodes and key sizes. The performance study provides the optimum number of partial keys needed for computing the group key to balance the key size for security requirements and the power consumption. The experimental study also concludes that the energy consumption of SPIN [9] increases rapidly as the number of group members increases in comparison to our protocol. Similarly, the pre-deployed key approach requires more communication time in comparison with this protocol. We have implemented this protocol using MICA2 motes and repeated most of the experiments that were done in simulation, and we found that the obtained results are very close to the observations made using the simulator.
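The abstract leaves the partial-key function unspecified; one plausible instantiation, sketched here with SHA-256 as an assumed one-way function (the tree layout and all names are hypothetical, not taken from the paper), combines the children's partial keys bottom-up:

```python
import hashlib

def partial_key(node, leaf_secret):
    """Bottom-up partial-key computation for one node in the key tree.
    Leaves derive their partial key from their own secret; interior nodes
    hash their children's partial keys together (the protocol's 'function')."""
    if not node["children"]:
        return hashlib.sha256(leaf_secret[node["id"]]).digest()
    h = hashlib.sha256()
    for child in node["children"]:
        h.update(partial_key(child, leaf_secret))
    return h.digest()

# A cluster head with two sensor nodes; the intra-cluster group key is the
# head's partial key.  Replacing any leaf secret refreshes the group key.
tree = {"id": "head", "children": [
    {"id": "s1", "children": []},
    {"id": "s2", "children": []},
]}
secrets = {"s1": b"secret-1", "s2": b"secret-2"}
group_key = partial_key(tree, secrets)
```

Because each node only hashes its children's contributions, no trusted third party is needed, and re-keying after a membership change touches only the path from the changed leaf to the cluster head.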

Book ChapterDOI
27 Aug 2007
TL;DR: Designers must assume that the information theoretic level of leakage from smart cards can be transformed into usable key information by adversaries whatever counter-measures are put in place.
Abstract: Side channel leakage from smart cards has been of concern since their inception and counter-measures are routinely employed. So a number of standard and reasonable assumptions are made here regarding an implementation of RSA in a cryptographic token which may be subjected to non-invasive side-channel cryptanalysis. These include blinding the re-usable secret key, input whitening, and using an exponentiation algorithm whose operation sequence partially obscures the key. The working hypothesis is that there is limited side channel leakage which only distinguishes very imprecisely between squarings and multiplications. For this typical situation, a method is described for recovering the private exponent, and, realistically, it does not require an excessive number of traces. It just requires the modulus to be public and the public exponent not to be too large. The attack is computationally feasible unless parameters are appropriately adjusted. It reveals that longer keys are much more vulnerable than shorter ones unless blinding is proportional to key length. A further key conclusion is that designers must assume that the information theoretic level of leakage from smart cards can be transformed into usable key information by adversaries whatever counter-measures are put in place.
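The distinction between squarings and multiplications matters because, in an unprotected left-to-right square-and-multiply exponentiation, a perfectly resolved operation sequence reveals the exponent bit for bit; the paper's setting assumes only an imprecise version of this leakage. A sketch of the perfect-leakage baseline (names are illustrative):

```python
def modexp_with_trace(base, exp, mod):
    """Left-to-right square-and-multiply; records the operation sequence
    an unprotected device would leak through a side channel."""
    trace, acc = [], 1
    for bit in bin(exp)[2:]:
        acc = acc * acc % mod
        trace.append("S")
        if bit == "1":
            acc = acc * base % mod
            trace.append("M")
    return acc, trace

def bits_from_trace(trace):
    """Adversary's view of a perfect trace: an 'S' followed by 'M' is a
    1-bit, a lone 'S' is a 0-bit."""
    bits, i = [], 0
    while i < len(trace):
        if i + 1 < len(trace) and trace[i + 1] == "M":
            bits.append("1"); i += 2
        else:
            bits.append("0"); i += 1
    return "".join(bits)
```

Counter-measures such as exponent blinding or square-always exponentiation aim to destroy exactly this correspondence; the paper's point is that even a noisy, imprecise trace can still be turned into usable key material.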