
Showing papers on "Cryptography published in 2009"


Book ChapterDOI
16 Apr 2009
TL;DR: Order-preserving symmetric encryption (OPE), a primitive suggested in the database community by Agrawal et al., is given its first cryptographic study in this paper, which shows that a straightforward relaxation of standard security notions for encryption, such as indistinguishability against chosen-plaintext attack (IND-CPA), is unachievable by a practical OPE scheme.
Abstract: We initiate the cryptographic study of order-preserving symmetric encryption (OPE), a primitive suggested in the database community by Agrawal et al. (SIGMOD '04) for allowing efficient range queries on encrypted data. Interestingly, we first show that a straightforward relaxation of standard security notions for encryption such as indistinguishability against chosen-plaintext attack (IND-CPA) is unachievable by a practical OPE scheme. Instead, we propose a security notion in the spirit of pseudorandom functions (PRFs) and related primitives asking that an OPE scheme look "as-random-as-possible" subject to the order-preserving constraint. We then design an efficient OPE scheme and prove its security under our notion based on pseudorandomness of an underlying blockcipher. Our construction is based on a natural relation we uncover between a random order-preserving function and the hypergeometric probability distribution. In particular, it makes black-box use of an efficient sampling algorithm for the latter.
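
The construction described above can be pictured with a toy sketch: a random order-preserving function is sampled lazily by recursively splitting the ciphertext range and drawing, from a hypergeometric distribution, how many plaintexts fall into the lower half. The snippet below is only an illustration of that idea, not the paper's exact algorithm: the SHA-256-seeded `random.Random` standing in for a blockcipher-based PRF, the naive inverse-transform hypergeometric sampler, and the domain and range sizes are all assumptions of this example.

```python
# Toy sketch of lazily sampling a random order-preserving function by recursive
# range splitting with hypergeometric draws (illustrative only, not the paper's scheme).
import hashlib
import math
import random

def hypergeom_sample(rng, npop, ngood, ndraw):
    """Inverse-transform sample: number of 'good' items when ndraw items are
    drawn without replacement from a population of npop containing ngood."""
    u = rng.random()
    cdf = 0.0
    lo, hi = max(0, ndraw - (npop - ngood)), min(ngood, ndraw)
    for k in range(lo, hi + 1):
        cdf += math.comb(ngood, k) * math.comb(npop - ngood, ndraw - k) / math.comb(npop, ndraw)
        if u <= cdf:
            return k
    return hi

def ope_encrypt(key, x, dlo, dhi, rlo, rhi):
    """Order-preserving map of plaintext x in [dlo, dhi] into [rlo, rhi]."""
    m, n = dhi - dlo + 1, rhi - rlo + 1
    # deterministic per-rectangle coins, so every plaintext follows the same function
    seed = hashlib.sha256(f"{key}|{dlo}|{dhi}|{rlo}|{rhi}".encode()).digest()
    rng = random.Random(seed)
    if m == 1:
        return rng.randint(rlo, rhi)            # single point: place it uniformly
    mid = rlo + n // 2 - 1                      # last point of the lower range half
    # how many of the m plaintexts land in the lower half of the range
    below = hypergeom_sample(rng, npop=n, ngood=n // 2, ndraw=m)
    if x <= dlo + below - 1:
        return ope_encrypt(key, x, dlo, dlo + below - 1, rlo, mid)
    return ope_encrypt(key, x, dlo + below, dhi, mid + 1, rhi)

cts = [ope_encrypt("k1", v, 0, 255, 0, 2**16 - 1) for v in (3, 7, 42, 200)]
assert cts == sorted(cts)                       # order of plaintexts is preserved
```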

858 citations


Book ChapterDOI
02 Dec 2009
TL;DR: In this article, an implementation of the two-party case using Yao's garbled circuits is described, and various algorithmic protocol improvements are analysed both theoretically and empirically, using experiments in various adversarial situations.
Abstract: Secure multi-party computation has been considered by the cryptographic community for a number of years. Until recently it has been a purely theoretical area, with few implementations with which to test various ideas. This has led to a number of optimisations being proposed which are quite restricted in their application. In this paper we describe an implementation of the two-party case, using Yao's garbled circuits, and present various algorithmic protocol improvements. These optimisations are analysed both theoretically and empirically, using experiments of various adversarial situations. Our experimental data is provided for reasonably large circuits, including one which performs an AES encryption, a problem which we discuss in the context of various possible applications.
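
For readers unfamiliar with the garbled-circuit machinery such implementations build on, the sketch below garbles and evaluates a single AND gate: each wire gets two random labels, and each truth-table row encrypts the output label under a hash of the corresponding pair of input labels. It is a minimal illustration under simplifying assumptions: SHA-256 as the row "encryption", an all-zero tag for row detection, and none of the point-and-permute, free-XOR or other optimisations the paper discusses.

```python
# Garbling and evaluating one AND gate in the spirit of Yao's construction (toy sketch).
import hashlib
import os
import random

LABEL = 16          # wire-label length in bytes
TAG = bytes(8)      # zero tag appended so the evaluator can recognise the right row

def H(a, b):
    return hashlib.sha256(a + b).digest()[:LABEL + len(TAG)]

def xor(x, y):
    return bytes(i ^ j for i, j in zip(x, y))

def garble_and_gate():
    wires = {w: (os.urandom(LABEL), os.urandom(LABEL)) for w in ("a", "b", "out")}
    table = []
    for bit_a in (0, 1):
        for bit_b in (0, 1):
            out_label = wires["out"][bit_a & bit_b]
            pad = H(wires["a"][bit_a], wires["b"][bit_b])
            table.append(xor(pad, out_label + TAG))
    random.shuffle(table)            # hide which row corresponds to which inputs
    return wires, table

def evaluate(table, label_a, label_b):
    pad = H(label_a, label_b)
    for row in table:
        plain = xor(pad, row)
        if plain.endswith(TAG):      # the correct row decrypts to label || zero tag
            return plain[:LABEL]
    raise ValueError("no row decrypted correctly")

wires, table = garble_and_gate()
out = evaluate(table, wires["a"][1], wires["b"][1])   # evaluate on inputs a=1, b=1
assert out == wires["out"][1]                         # AND(1,1) yields the label for 1
```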

789 citations


Book
27 Nov 2009
TL;DR: The authors move quickly from explaining the foundations to describing practical implementations, including recent topics such as lightweight ciphers for RFIDs and mobile devices, and current key-length recommendations.
Abstract: Cryptography is now ubiquitous. Moving beyond traditional environments such as government communications and banking systems, we see cryptographic techniques realized in Web browsers, e-mail programs, cell phones, manufacturing systems, embedded software, smart buildings, cars, and even medical implants. Today's designers need a comprehensive understanding of applied cryptography. After an introduction to cryptography and data security, the authors explain the main techniques in modern cryptography, with chapters addressing stream ciphers, the Data Encryption Standard (DES) and 3DES, the Advanced Encryption Standard (AES), block ciphers, the RSA cryptosystem, public-key cryptosystems based on the discrete logarithm problem, elliptic-curve cryptography (ECC), digital signatures, hash functions, Message Authentication Codes (MACs), and methods for key establishment, including certificates and public-key infrastructure (PKI). Throughout the book, the authors focus on communicating the essentials and keeping the mathematics to a minimum, and they move quickly from explaining the foundations to describing practical implementations, including recent topics such as lightweight ciphers for RFIDs and mobile devices, and current key-length recommendations. The authors have considerable experience teaching applied cryptography to engineering and computer science students and to professionals, and they make extensive use of examples, problems, and chapter reviews, while the book's website offers slides, projects, and links to further resources. This is a suitable textbook for graduate and advanced undergraduate courses and also for self-study by engineers.

746 citations


Journal ArticleDOI
TL;DR: This paper provides a complete description of Yao's protocol, along with a rigorous proof of security; to the best of the authors' knowledge, this is the first time that an explicit proof of security for the protocol has been published.
Abstract: In the mid 1980s, Yao presented a constant-round protocol for securely computing any two-party functionality in the presence of semi-honest adversaries (FOCS 1986). In this paper, we provide a complete description of Yao’s protocol, along with a rigorous proof of security. Despite the importance of Yao’s protocol to the theory of cryptography and in particular to the field of secure computation, to the best of our knowledge, this is the first time that an explicit proof of security has been published.

704 citations


Journal ArticleDOI
TL;DR: This paper identifies the threats and vulnerabilities to WSNs, summarizes the defense methods based on a networking-protocol-layer analysis, and then gives a holistic overview of security issues.
Abstract: Wireless sensor networks (WSNs) use small nodes with constrained capabilities to sense, collect, and disseminate information in many types of applications. As sensor networks become widespread, security issues become a central concern, especially in mission-critical tasks. In this paper, we first identify the threats and vulnerabilities to WSNs and summarize the defense methods based on an analysis of the networking protocol layers. We then give a holistic overview of security issues. These issues are divided into seven categories: cryptography, key management, attack detection and prevention, secure routing, secure localization, secure data fusion, and other security issues. Along the way we analyze the advantages and disadvantages of current secure schemes in each category. In addition, we also summarize the techniques and methods used in these categories, and point out the open research issues and directions in each area.

611 citations


Book
24 Dec 2009
TL;DR: The Advanced Encryption Standard (AES) is an encryption standard adopted by the U.S. government, as mentioned in this paper; it has a 128-bit block size with key sizes of 128, 192 and 256 bits.
Abstract: In cryptography, the Advanced Encryption Standard (AES) is an encryption standard adopted by the U.S. government. The standard comprises three block ciphers, AES-128, AES-192 and AES-256, adopted from a larger collection originally published as Rijndael. Each AES cipher has a 128-bit block size, with key sizes of 128, 192 and 256 bits, respectively. The AES ciphers have been analyzed extensively and are now used worldwide, as was the case with their predecessor, the Data Encryption Standard (DES). AES was announced by the National Institute of Standards and Technology (NIST) as U.S. FIPS PUB 197 (FIPS 197) on November 26, 2001, after a 5-year standardization process in which fifteen competing designs were presented and evaluated before Rijndael was selected as the most suitable. It became effective as a Federal government standard on May 26, 2002 after approval by the Secretary of Commerce. It is available in many different encryption packages. AES is the first publicly accessible and open cipher approved by the NSA for top secret information.
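
As a small usage illustration of the parameters described above (128-bit block, with 128/192/256-bit keys), the snippet below encrypts and decrypts a short message with AES-128 in GCM mode. The use of the third-party Python `cryptography` package and its AESGCM interface is an assumption of this example, not something drawn from the book.

```python
# Illustrative AES-128 usage via the third-party "cryptography" package;
# GCM mode adds authentication on top of the raw block cipher.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # 128-bit key; 192 or 256 also allowed
nonce = os.urandom(12)                      # 96-bit nonce, never reused with the same key
aesgcm = AESGCM(key)

ciphertext = aesgcm.encrypt(nonce, b"attack at dawn", None)   # third arg: associated data
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"attack at dawn"
```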

593 citations


01 Jan 2009
TL;DR: The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.
Abstract: Quantum computers will break today's most popular public-key cryptographic systems, including RSA, DSA, and ECDSA. This book introduces the reader to the next generation of cryptographic algorithms, the systems that resist quantum-computer attacks: in particular, post-quantum public-key encryption systems and post-quantum public-key signature systems. Leading experts have joined forces for the first time to explain the state of the art in quantum computing, hash-based cryptography, code-based cryptography, lattice-based cryptography, and multivariate cryptography. Mathematical foundations and implementation issues are included. This book is an essential resource for students and researchers who want to contribute to the field of post-quantum cryptography.

587 citations


Book ChapterDOI
20 Feb 2009
TL;DR: The public-key encryption scheme of Regev and the identity-based encryption scheme of Gentry, Peikert and Vaikuntanathan are remarkably robust against memory attacks, where the adversary can measure a large fraction of the bits of the secret key or, more generally, can compute an arbitrary function of the secret key of bounded output length.
Abstract: This paper considers two questions in cryptography. Cryptography Secure Against Memory Attacks. A particularly devastating side-channel attack against cryptosystems, termed the "memory attack", was proposed recently. In this attack, a significant fraction of the bits of a secret key of a cryptographic algorithm can be measured by an adversary if the secret key is ever stored in a part of memory which can be accessed even after power has been turned off for a short amount of time. Such an attack has been shown to completely compromise the security of various cryptosystems in use, including the RSA cryptosystem and AES. We show that the public-key encryption scheme of Regev (STOC 2005), and the identity-based encryption scheme of Gentry, Peikert and Vaikuntanathan (STOC 2008), are remarkably robust against memory attacks where the adversary can measure a large fraction of the bits of the secret key, or more generally, can compute an arbitrary function of the secret key of bounded output length. This is done without increasing the size of the secret key, and without introducing any complication of the natural encryption and decryption routines. Simultaneous Hardcore Bits. We say that a block of bits of x are simultaneously hardcore for a one-way function f(x), if given f(x) they cannot be distinguished from a random string of the same length. Although any candidate one-way function can be shown to hide one hardcore bit and even a logarithmic number of simultaneously hardcore bits, there are few examples of one-way or trapdoor functions for which a linear number of the input bits have been proved simultaneously hardcore; the ones that are known relate the simultaneous security to the difficulty of factoring integers. We show that for a lattice-based (injective) trapdoor function which is a variant of a function proposed earlier by Gentry, Peikert and Vaikuntanathan, N − o(N) of the input bits are simultaneously hardcore, where N is the total length of the input. These two results rely on similar proof techniques.

560 citations


Book ChapterDOI
29 Jun 2009
TL;DR: A new approach to solving cryptographic problems is presented that adapts both the problem description and the solver synchronously instead of tweaking just one of them; using these techniques, the authors were able to solve a well-researched stream cipher 26 times faster than was previously possible.
Abstract: Cryptography ensures the confidentiality and authenticity of information but often relies on unproven assumptions. SAT solvers are a powerful tool to test the hardness of certain problems and have successfully been used to test hardness assumptions. This paper extends a SAT solver to efficiently work on cryptographic problems. The paper further illustrates how SAT solvers process cryptographic functions using automatically generated visualizations, introduces techniques for simplifying the solving process by modifying cipher representations, and demonstrates the feasibility of the approach by solving three stream ciphers. To optimize a SAT solver for cryptographic problems, we extended the solver's input language to support the XOR operation that is common in cryptography. To better understand the inner workings of the adapted solver and to identify bottlenecks, we visualize its execution. Finally, to improve the solving time significantly, we remove these bottlenecks by altering the function representation and by pre-parsing the resulting system of equations. The main contribution of this paper is a new approach to solving cryptographic problems by adapting both the problem description and the solver synchronously instead of tweaking just one of them. Using these techniques, we were able to solve a well-researched stream cipher 26 times faster than was previously possible.
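
To see why native XOR support in the solver matters, the sketch below shows the plain-CNF baseline such an extension avoids: a single XOR constraint over n variables expands into 2^(n-1) ordinary clauses, one for every assignment of the wrong parity. The DIMACS-style integer literals and the helper name are illustrative assumptions of this example.

```python
# Expanding an XOR constraint x1 ^ ... ^ xn == rhs into ordinary CNF clauses.
# Variables are positive integers, negative literals mean negation (DIMACS style),
# so the output can be handed to any off-the-shelf SAT solver.
from itertools import combinations

def xor_to_cnf(variables, rhs=True):
    """Return CNF clauses (lists of ints) equivalent to XOR(variables) == rhs."""
    n = len(variables)
    clauses = []
    for k in range(n + 1):
        # a clause with k negated literals forbids exactly the assignment that sets
        # those k variables to 1 and the rest to 0, i.e. an assignment of parity k mod 2
        if (k % 2 == 0) != rhs:
            continue                  # that parity is allowed, emit no clause for it
        for neg in combinations(range(n), k):
            clauses.append([-variables[i] if i in neg else variables[i] for i in range(n)])
    return clauses

# x1 ^ x2 ^ x3 = 1 expands to four clauses; a dedicated XOR clause keeps it as one.
print(xor_to_cnf([1, 2, 3], rhs=True))
```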

478 citations


Book ChapterDOI
19 Aug 2009
TL;DR: A generic construction of a public-key encryption scheme that is resilient to key leakage is presented, based on any universal hash proof system, and variants of the Cramer-Shoup cryptosystem are proved to be CCA1-secure with any leakage of L/4 bits and CCA2-secure with any leakage of L/6 bits.
Abstract: Most of the work in the analysis of cryptographic schemes is concentrated in abstract adversarial models that do not capture side-channel attacks. Such attacks exploit various forms of unintended information leakage, which is inherent to almost all physical implementations. Inspired by recent side-channel attacks, especially the "cold boot attacks", Akavia, Goldwasser and Vaikuntanathan (TCC '09) formalized a realistic framework for modeling the security of encryption schemes against a wide class of side-channel attacks in which adversarially chosen functions of the secret key are leaked. In the setting of public-key encryption, Akavia et al. showed that Regev's lattice-based scheme (STOC '05) is resilient to any leakage of L / polylog(L) bits, where L is the length of the secret key. In this paper we revisit the above-mentioned framework and our main results are as follows: We present a generic construction of a public-key encryption scheme that is resilient to key leakage from any universal hash proof system. The construction does not rely on additional computational assumptions, and the resulting scheme is as efficient as the underlying proof system. Existing constructions of such proof systems imply that our construction can be based on a variety of number-theoretic assumptions, including the decisional Diffie-Hellman assumption (and its progressively weaker d-Linear variants), the quadratic residuosity assumption, and Paillier's composite residuosity assumption. We construct a new hash proof system based on the decisional Diffie-Hellman assumption (and its d-Linear variants), and show that the resulting scheme is resilient to any leakage of L(1 − o(1)) bits. In addition, we prove that the recent scheme of Boneh et al. (CRYPTO '08), constructed to be a "circular-secure" encryption scheme, is resilient to any leakage of L(1 − o(1)) bits. These two proposed schemes complement each other in terms of efficiency. We extend the framework of key leakage to the setting of chosen-ciphertext attacks. On the theoretical side, we prove that the Naor-Yung paradigm is applicable in this setting as well, and obtain as a corollary encryption schemes that are CCA2-secure with any leakage of L(1 − o(1)) bits. On the practical side, we prove that variants of the Cramer-Shoup cryptosystem (along the lines of our generic construction) are CCA1-secure with any leakage of L/4 bits, and CCA2-secure with any leakage of L/6 bits.

425 citations


Proceedings Article
10 Aug 2009
TL;DR: Vanish is presented, a system that makes archived data self-destruct after a user-specified time through a novel integration of cryptographic techniques with global-scale, P2P distributed hash tables (DHTs), and is shown experimentally and analytically to meet the privacy-preserving goals described in the paper.
Abstract: Today's technical and legal landscape presents formidable challenges to personal data privacy. First, our increasing reliance on Web services causes personal data to be cached, copied, and archived by third parties, often without our knowledge or control. Second, the disclosure of private data has become commonplace due to carelessness, theft, or legal actions. Our research seeks to protect the privacy of past, archived data -- such as copies of emails maintained by an email provider -- against accidental, malicious, and legal attacks. Specifically, we wish to ensure that all copies of certain data become unreadable after a user-specified time, without any specific action on the part of a user, and even if an attacker obtains both a cached copy of that data and the user's cryptographic keys and passwords. This paper presents Vanish, a system that meets this challenge through a novel integration of cryptographic techniques with global-scale, P2P, distributed hash tables (DHTs). We implemented a proof-of-concept Vanish prototype to use both the million-plus-node Vuze BitTorrent DHT and the restricted-membership OpenDHT. We evaluate experimentally and analytically the functionality, security, and performance properties of Vanish, demonstrating that it is practical to use and meets the privacy-preserving goals described above. We also describe two applications that we prototyped on Vanish: a Firefox plugin for Gmail and other Web sites and a Vanishing File application.
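
A key step in Vanish is threshold secret sharing of the data key before the shares are scattered across DHT nodes, so that natural churn erases the key over time. The sketch below shows a minimal Shamir split-and-reconstruct over a prime field; the field size, the share counts, and the function names are illustrative assumptions of this example, and the DHT push/retrieve machinery is omitted.

```python
# Minimal Shamir secret sharing: split a key into n shares, any k of which
# reconstruct it via Lagrange interpolation at x = 0 (toy parameters).
import random

PRIME = 2**127 - 1          # a Mersenne prime comfortably larger than a 16-byte key

def split(secret, n, k):
    """Split `secret` into n shares with reconstruction threshold k."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)            # stands in for the random data key
shares = split(key, n=10, k=7)           # e.g. 10 shares with threshold 7
assert reconstruct(random.sample(shares, 7)) == key
```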

Book
12 Mar 2009
TL;DR: This book serves as a complete resource for the successful design or implementation of cryptographic algorithms or protocols using Boolean functions; provides engineers and scientists with a needed reference for the use of Boolean functions in cryptography; and addresses the issues of cryptographic Boolean functions theory and applications in one concentrated resource.
Abstract: Boolean functions are the building blocks of symmetric cryptographic systems. Symmetric cryptographic algorithms are fundamental tools in the design of all types of digital security systems (i.e. communications, financial and e-commerce). "Cryptographic Boolean Functions and Applications" is a concise reference that shows how Boolean functions are used in cryptography. Currently, practitioners who need to apply Boolean functions in the design of cryptographic algorithms and protocols need to patch together the needed information from a variety of resources (books, journal articles and other sources). This book compiles the essential information in one easy-to-use, step-by-step reference. Beginning with the basics of the necessary theory, the book goes on to examine more technical topics, some of which are at the frontier of current research. The book serves as a complete resource for the successful design or implementation of cryptographic algorithms or protocols using Boolean functions; provides engineers and scientists with a needed reference for the use of Boolean functions in cryptography; and addresses the issues of cryptographic Boolean function theory and applications in one concentrated resource. The book is organized logically to help the reader easily understand the topic.

Journal ArticleDOI
TL;DR: This article proposes a simple and provably secure encryption scheme that allows efficient additive aggregation of encrypted data and constructs an end-to-end aggregate authentication scheme that is secure against outsider-only attacks, based on the indistinguishability property of a pseudorandom function (PRF), a standard cryptographic primitive.
Abstract: Wireless sensor networks (WSNs) are composed of tiny devices with limited computation and battery capacities. For such resource-constrained devices, data transmission is a very energy-consuming operation. To maximize WSN lifetime, it is essential to minimize the number of bits sent and received by each device. One natural approach is to aggregate sensor data along the path from sensors to the sink. Aggregation is especially challenging if end-to-end privacy between sensors and the sink (or aggregate integrity) is required. In this article, we propose a simple and provably secure encryption scheme that allows efficient additive aggregation of encrypted data. Only one modular addition is necessary for ciphertext aggregation. The security of the scheme is based on the indistinguishability property of a pseudorandom function (PRF), a standard cryptographic primitive. We show that aggregation based on this scheme can be used to efficiently compute statistical values, such as mean, variance, and standard deviation of sensed data, while achieving significant bandwidth savings. To protect the integrity of the aggregated data, we construct an end-to-end aggregate authentication scheme that is secure against outsider-only attacks, also based on the indistinguishability property of PRFs.
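
The core of the encryption scheme described in the abstract is an additively homomorphic stream cipher: each sensor adds a PRF-derived pad to its reading modulo M, ciphertexts are combined with a single modular addition per hop, and the sink strips all pads at once. The snippet below is a hedged sketch of that idea; HMAC-SHA256 as the PRF, the modulus, and the function names are assumptions of this example rather than the paper's exact construction.

```python
# Additively homomorphic aggregation of encrypted sensor readings (illustrative sketch).
import hmac
import hashlib

M = 2**32                                     # modulus bounding the aggregate value

def prf(key, nonce):
    return int.from_bytes(hmac.new(key, nonce, hashlib.sha256).digest(), "big") % M

def encrypt(key, nonce, reading):
    return (reading + prf(key, nonce)) % M    # one modular addition per sensor

def aggregate(ciphertexts):
    return sum(ciphertexts) % M               # in-network aggregation, one addition per hop

def sink_decrypt(keys, nonce, aggregate_ct):
    pads = sum(prf(k, nonce) for k in keys) % M
    return (aggregate_ct - pads) % M          # sum of all plaintext readings

keys = [bytes([i]) * 16 for i in range(5)]    # per-sensor keys shared with the sink
nonce = b"epoch-42"
readings = [21, 19, 23, 20, 22]
ct = aggregate(encrypt(k, nonce, r) for k, r in zip(keys, readings))
assert sink_decrypt(keys, nonce, ct) == sum(readings)
```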

Book ChapterDOI
01 Jan 2009
TL;DR: A closer look reveals that there is no justification for the leap from “quantum computers destroy RSA and DSA and ECDSA” to “ quantum computers destroy cryptography.”
Abstract: Imagine that it’s fifteen years from now and someone announces the successful construction of a large quantum computer. The New York Times runs a front-page article reporting that all of the public-key algorithms used to protect the Internet have been broken. Users panic. What exactly will happen to cryptography? Perhaps, after seeing quantum computers destroy RSA and DSA and ECDSA, Internet users will leap to the conclusion that cryptography is dead; that there is no hope of scrambling information to make it incomprehensible to, and unforgeable by, attackers; that securely storing and communicating information means using expensive physical shields to prevent attackers from seeing the information—for example, hiding USB sticks inside a locked briefcase chained to a trusted courier’s wrist. A closer look reveals, however, that there is no justification for the leap from “quantum computers destroy RSA and DSA and ECDSA” to “quantum computers destroy cryptography.” There are many important classes of cryptographic systems beyond RSA and DSA and ECDSA:

Proceedings ArticleDOI
13 Nov 2009
TL;DR: This paper proposes to encrypt every data block with a different key so that flexible cryptography-based access control can be achieved, investigates the overhead and safety of the proposed approach, and studies mechanisms to improve data access efficiency.
Abstract: Providing secure and efficient access to large scale outsourced data is an important component of cloud computing. In this paper, we propose a mechanism to solve this problem in owner-write-users-read applications. We propose to encrypt every data block with a different key so that flexible cryptography-based access control can be achieved. Through the adoption of key derivation methods, the owner needs to maintain only a few secrets. Analysis shows that the key derivation procedure using hash functions will introduce very limited computation overhead. We propose to use over-encryption and/or lazy revocation to prevent revoked users from getting access to updated data blocks. We design mechanisms to handle both updates to outsourced data and changes in user access rights. We investigate the overhead and safety of the proposed approach, and study mechanisms to improve data access efficiency.
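
The per-block key idea can be pictured in a few lines: the owner keeps one master secret and derives each block key with a hash, so handing a user one block key reveals nothing about the others. The derivation formula and names below are illustrative assumptions, not the paper's exact key-derivation hierarchy.

```python
# Hash-based derivation of per-block keys from a single master secret (illustrative).
import hashlib

def block_key(master_secret: bytes, block_id: int) -> bytes:
    # cheap, deterministic, and one-way: a block key does not reveal the master secret
    return hashlib.sha256(master_secret + block_id.to_bytes(8, "big")).digest()

master = b"owner-master-secret"
k_17 = block_key(master, 17)      # key for block 17 only
k_18 = block_key(master, 18)      # knowing k_17 reveals nothing about k_18
```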

Proceedings ArticleDOI
12 Dec 2009
TL;DR: PasS (Privacy as a Service) is the first practical cloud computing privacy solution that utilizes previous research on cryptographic coprocessors to solve the problem of securely processing sensitive data in cloud computing infrastructures.
Abstract: In this paper we present PasS (Privacy as a Service), a set of security protocols for ensuring the privacy and legal compliance of customer data in cloud computing architectures. PasS allows for the secure storage and processing of users’ confidential data by leveraging the tamper-proof capabilities of cryptographic coprocessors. Using tamper-proof facilities provides a secure execution domain in the computing cloud that is physically and logically protected from unauthorized access. The central design goal of PasS is to maximize users’ control in managing the various aspects related to the privacy of sensitive data. This is achieved by implementing user-configurable software protection and data privacy mechanisms. Moreover, PasS provides a privacy feedback process which informs users of the different privacy operations applied to their data and makes them aware of any potential risks that may jeopardize the confidentiality of their sensitive information. To the best of our knowledge, PasS is the first practical cloud computing privacy solution that utilizes previous research on cryptographic coprocessors to solve the problem of securely processing sensitive data in cloud computing infrastructures.

Journal ArticleDOI
19 May 2009
TL;DR: An overview of the potential of free space optical technology in information security, encryption, and authentication is presented and optical authentication techniques applied to ID tags with visible and near infrared imaging are reviewed.
Abstract: This paper presents an overview of the potential of free space optical technology in information security, encryption, and authentication. Optical waveforms possess many degrees of freedom, such as amplitude, phase, polarization, spectral content, and multiplexing, which can be combined in different ways to make the information encoding more secure. This paper reviews optical techniques for encryption and security of two-dimensional and three-dimensional data. Interferometric methods are used to record and retrieve data by either optical or digital holography for security applications. Digital holograms are widely used in recording and processing three-dimensional data, and are attractive for securing three-dimensional data. Also, we review optical authentication techniques applied to ID tags with visible and near infrared imaging. A variety of images and signatures, including biometrics, random codes, and primary images, can be combined in an optical ID tag for security and authentication.

Journal ArticleDOI
TL;DR: Two concrete methods, ShaVe and ShaCK, are presented in which sensing and analysis of shaking movement is combined with cryptographic protocols for secure authentication: ShaVe is based on initial key exchange followed by exchange and comparison of sensor data for verification of key authenticity, while ShaCK constructs a cryptographic key from features extracted from the sensor data.
Abstract: A challenge in facilitating spontaneous mobile interactions is to provide pairing methods that are both intuitive and secure. Simultaneous shaking is proposed as a novel and easy-to-use mechanism for pairing of small mobile devices. The underlying principle is to use common movement as a secret that the involved devices share for mutual authentication. We present two concrete methods, ShaVe and ShaCK, in which sensing and analysis of shaking movement is combined with cryptographic protocols for secure authentication. ShaVe is based on initial key exchange followed by exchange and comparison of sensor data for verification of key authenticity. ShaCK, in contrast, is based on matching features extracted from the sensor data to construct a cryptographic key. The classification algorithms used in our approach are shown to robustly separate simultaneous shaking of two devices from other concurrent movement of a pair of devices, with a false negative rate of under 12 percent. A user study confirms that the method is intuitive and easy to use, as users can shake devices in an arbitrary pattern.

Proceedings ArticleDOI
31 May 2009
TL;DR: CPA/CCA secure symmetric encryption schemes that remain secure with exponentially hard-to-invert auxiliary input are constructed, based on a new cryptographic assumption, Learning Subspace-with-Noise (LSN), which is related to the well-known Learning Parity-with-Noise (LPN) assumption.
Abstract: We study the question of designing cryptographic schemes which are secure even if an arbitrary function f(sk) of the secret key is leaked, as long as the secret key sk is still (exponentially) hard to compute from this auxiliary input. This setting of auxiliary input is more general than the more traditional setting, which assumes that some information about the secret key sk may be leaked, but sk still has high min-entropy left. In particular, we deal with situations where f(sk) information-theoretically determines the entire secret key sk. As our main result, we construct CPA/CCA secure symmetric encryption schemes that remain secure with exponentially hard-to-invert auxiliary input. We give several applications of such schemes. * We construct an average-case obfuscator for the class of point functions, which remains secure with exponentially hard-to-invert auxiliary input, and is reusable. * We construct a reusable and robust extractor that remains secure with exponentially hard-to-invert auxiliary input. Our results rely on a new cryptographic assumption, Learning Subspace-with-Noise (LSN), which is related to the well-known Learning Parity-with-Noise (LPN) assumption.

Book ChapterDOI
02 Dec 2009
TL;DR: Lower bounds are given on the work factor of idealized versions of the known attack algorithms against code-based cryptography, taking into account all possible tweaks which could improve their practical complexity.
Abstract: Code-based cryptography is often viewed as an interesting "Post-Quantum" alternative to the classical number theory cryptography. Unlike many other such alternatives, it has the convenient advantage of having only a few, well identified, attack algorithms. However, improvements to these algorithms have made their effective complexity quite complex to compute. We give here some lower bounds on the work factor of idealized versions of these algorithms, taking into account all possible tweaks which could improve their practical complexity. The aim of this article is to help designers select durably secure parameters.

Journal ArticleDOI
TL;DR: Well-known cryptographic tools for providing integrity and consistency for data stored in clouds are surveyed and recent research in cryptography and distributed computing addressing these problems are discussed.
Abstract: More and more users store data in "clouds" that are accessed remotely over the Internet. We survey well-known cryptographic tools for providing integrity and consistency for data stored in clouds and discuss recent research in cryptography and distributed computing addressing these problems.

Proceedings ArticleDOI
07 Mar 2009
TL;DR: This work presents a novel architecture capable of tracking all information flow within the machine, including all explicit data transfers and all implicit flows (those subtly devious flows caused by not performing conditional operations).
Abstract: For many mission-critical tasks, tight guarantees on the flow of information are desirable, for example, when handling important cryptographic keys or sensitive financial data. We present a novel architecture capable of tracking all information flow within the machine, including all explicit data transfers and all implicit flows (those subtly devious flows caused by not performing conditional operations). While the problem is impossible to solve in the general case, we have created a machine that avoids the general-purpose programmability that leads to this impossibility result, yet is still programmable enough to handle a variety of critical operations such as public-key encryption and authentication. Through the application of our novel gate-level information flow tracking method, we show how all flows of information can be precisely tracked. From this foundation, we then describe how a class of architectures can be constructed, from the gates up, to completely capture all information flows and we measure the impact of doing so on the hardware implementation, the ISA, and the programmer.
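
The gate-level tracking idea can be made concrete for one gate: every data bit carries a taint bit, and shadow logic marks the output of an AND gate tainted only when a tainted input can actually influence it (an untainted 0 on one input forces the output regardless of the secret input). The sketch below is a software illustration of that principle under our own naming, not the paper's hardware design.

```python
# Gate-level information-flow tracking for a single 2-input AND gate (illustrative).
def glift_and(a, ta, b, tb):
    """Return (output bit, output taint) for AND with per-input taint bits ta, tb."""
    out = a & b
    # tainted iff both inputs are tainted, or one tainted input can flip the
    # output because the other (untainted) input is 1
    t_out = (ta & tb) | (ta & b) | (tb & a)
    return out, t_out

# an untainted 0 dominates: the result is untainted even though the other input is secret
assert glift_and(a=0, ta=0, b=1, tb=1) == (0, 0)
# an untainted 1 lets the secret input determine the output: the result is tainted
assert glift_and(a=1, ta=0, b=1, tb=1) == (1, 1)
```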

Journal ArticleDOI
TL;DR: This work proposes a new hybrid authentication mechanism, VANET authentication using signatures and TESLA++ (VAST), that combines the advantages of ECDSA signatures and TESLA++, and introduces a new certificate verification strategy that prevents denial of service attacks while requiring zero additional sender overhead.
Abstract: Although much research has been conducted in the area of authentication in wireless networks, vehicular ad-hoc networks (VANETs) pose unique challenges, such as real-time constraints, processing limitations, memory constraints, frequently changing senders, requirements for interoperability with existing standards, extensibility and flexibility for future requirements, etc. No currently proposed technique addresses all of the requirements for message and entity authentication in VANETs. After analyzing the requirements for viable VANET message authentication, we propose a modified version of TESLA, TESLA++, which provides the same computationally efficient broadcast authentication as TESLA with reduced memory requirements. To address the range of needs within VANETs we propose a new hybrid authentication mechanism, VANET authentication using signatures and TESLA++ (VAST), that combines the advantages of ECDSA signatures and TESLA++. Elliptic curve digital signature algorithm (ECDSA) signatures provide fast authentication and non-repudiation, but are computationally expensive. TESLA++ prevents memory and computation-based denial of service attacks. We analyze the security of our mechanism and simulate VAST in realistic highway conditions under varying network and vehicular traffic scenarios. Simulation results show that VAST outperforms either signatures or TESLA on its own. Even under heavy loads VAST is able to authenticate 100% of the received messages within 107 ms. VANETs use certificates to achieve entity authentication (i.e., validate senders). To reduce certificate bandwidth usage, we use Hu et al.'s strategy of broadcasting certificates at fixed intervals, independent of the arrival of new entities. We propose a new certificate verification strategy that prevents denial of service attacks while requiring zero additional sender overhead. Our analysis shows that these solutions introduce a small delay, but still allow drivers, in a worst-case scenario, over 3 seconds to respond to a dangerous situation.
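
TESLA++, like TESLA, rests on a one-way key chain with delayed key disclosure: keys are generated by repeated hashing and released in reverse order, so a receiver can authenticate a disclosed key by hashing it back to a previously authenticated commitment. The sketch below shows only that underlying mechanism; interval management, TESLA++'s shortened receiver-side MACs, and the ECDSA component of VAST are omitted, and all names are illustrative.

```python
# TESLA-style one-way key chain with delayed key disclosure (toy sketch).
import hashlib
import hmac
import os

def H(x):
    return hashlib.sha256(x).digest()

# sender: build the chain backwards, commit to K_0, use later keys forwards in time
n = 4
chain = [os.urandom(32)]                 # the chain seed K_n
for _ in range(n):
    chain.append(H(chain[-1]))           # K_{i-1} = H(K_i)
chain.reverse()                          # chain[0] = K_0 (commitment) ... chain[n] = seed
commitment = chain[0]

# interval i: broadcast the MAC now, disclose the key in a later interval
i = 2
msg = b"brake hard ahead"
mac = hmac.new(chain[i], msg, hashlib.sha256).digest()

# receiver, after disclosure of chain[i]: hash the key back to the commitment,
# then verify the MAC it stored earlier
disclosed = chain[i]
key_check = disclosed
for _ in range(i):
    key_check = H(key_check)
assert key_check == commitment
assert hmac.compare_digest(hmac.new(disclosed, msg, hashlib.sha256).digest(), mac)
```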

Journal ArticleDOI
TL;DR: In this paper, a new cryptography method based on the central dogma of molecular biology is introduced; since it simulates some critical processes of the central dogma, it is a pseudo-DNA cryptography method.

Journal ArticleDOI
TL;DR: A prototype is described that implements a continuous-variable quantum key distribution protocol based on coherent states and reverse reconciliation; it uses time and polarization multiplexing for optimal transmission and detection of the signal and phase reference, and employs sophisticated error-correction codes for reconciliation.
Abstract: We have designed and realized a prototype that implements a continuous-variable quantum key distribution (QKD) protocol based on coherent states and reverse reconciliation. The system uses time and polarization multiplexing for optimal transmission and detection of the signal and phase reference, and employs sophisticated error-correction codes for reconciliation. The security of the system is guaranteed against general coherent eavesdropping attacks. The performance of the prototype was tested over preinstalled optical fibres as part of a quantum cryptography network combining different QKD technologies. The stable and automatic operation of the prototype over 57 h yielded an average secret key distribution rate of 8 kbit/s over a 3 dB loss optical fibre, including the key extraction process and all quantum and classical communication. This system is therefore ideal for securing communications in metropolitan-size networks with high-speed requirements.

Journal ArticleDOI
TL;DR: This paper proposes an OCML-based colour image encryption scheme with a stream cipher structure, using a 192-bit-long external key to generate the initial conditions and the parameters of the spatiotemporal chaotic system, which is modelled by one-way coupled-map lattices (OCML).
Abstract: The chaos-based cryptographic algorithms have suggested some new ways to develop efficient image-encryption schemes. While most of these schemes are based on low-dimensional chaotic maps, it has been proposed recently to use high-dimensional chaos, namely spatiotemporal chaos, which is modelled by one-way coupled-map lattices (OCML). Owing to their hyperchaotic behaviour, such systems are assumed to enhance the cryptosystem security. In this paper, we propose an OCML-based colour image encryption scheme with a stream cipher structure. We use a 192-bit-long external key to generate the initial conditions and the parameters of the OCML. We have made several tests to check the security of the proposed cryptosystem, namely statistical tests including histogram analysis and calculation of the correlation coefficients of adjacent pixels, security tests against differential attack including calculation of the number of pixels change rate (NPCR) and the unified average changing intensity (UACI), and entropy calculation. The cryptosystem speed is analyzed and tested as well.
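
As a rough picture of the OCML keystream idea: each lattice site iterates a logistic map, is coupled one-way to a neighbouring site, and the evolving state is quantised into bytes that are XORed with the pixel stream. The sketch below is only a generic coupled-map-lattice illustration; the parameters, the key-to-initial-condition mapping, and the quantisation are assumptions of this example and differ from the paper's exact scheme.

```python
# Generic one-way coupled-map-lattice keystream used as a stream cipher (illustrative).
def logistic(x, r=3.99):
    return r * x * (1.0 - x)

def ocml_keystream(key_bytes, length, sites=8, eps=0.3, warmup=200):
    # derive initial lattice values in (0, 1) from a 192-bit external key
    state = [(key_bytes[3 * i % len(key_bytes)] + 1) / 257.0 for i in range(sites)]
    out, step = [], 0
    while len(out) < length:
        new = []
        for j in range(sites):
            left = state[j - 1]          # one-way coupling to the left neighbour (ring)
            new.append((1 - eps) * logistic(state[j]) + eps * logistic(left))
        state = new
        step += 1
        if step > warmup:                # discard transients before emitting bytes
            out.extend(int(x * 256) % 256 for x in state)
    return bytes(out[:length])

key = bytes(range(24))                   # placeholder 192-bit external key
pixels = bytes([10, 200, 33, 77])        # a few image bytes
stream = ocml_keystream(key, len(pixels))
cipher = bytes(p ^ s for p, s in zip(pixels, stream))
plain = bytes(c ^ s for c, s in zip(cipher, stream))
assert plain == pixels
```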


Journal ArticleDOI
TL;DR: This paper designs novel VCRG-n encryption schemes for binary, gray-level and color images, validates their correctness by formal proofs, and demonstrates their feasibility by computer simulations.

Book ChapterDOI
22 Nov 2009
TL;DR: By adopting federated identity management together with hierarchical identity-based cryptography (HIBC), not only the key distribution but also the mutual authentication can be simplified in the cloud.
Abstract: More and more companies are beginning to provide different kinds of cloud computing services for Internet users; at the same time, these services also bring some security problems. Currently the majority of cloud computing systems provide digital identities for users to access their services, which brings some inconvenience for a hybrid cloud that includes multiple private clouds and/or public clouds. Today most cloud computing systems use asymmetric and traditional public key cryptography to provide data security and mutual authentication. Identity-based cryptography has some attractive characteristics that seem to fit well the requirements of cloud computing. In this paper, by adopting federated identity management together with hierarchical identity-based cryptography (HIBC), not only the key distribution but also the mutual authentication can be simplified in the cloud.

Book
14 Aug 2009
TL;DR: Algebraic Cryptanalysis bridges the gap between a course in cryptography, and being able to read the cryptanalytic literature, with a survey of the methods used in practice, including SAT-solvers and the methods of Nicolas Courtois.
Abstract: Algebraic Cryptanalysis bridges the gap between a course in cryptography and being able to read the cryptanalytic literature. This book is divided into three parts: Part One covers the process of turning a cipher into a system of equations; Part Two covers finite field linear algebra; Part Three covers the solution of Polynomial Systems of Equations, with a survey of the methods used in practice, including SAT-solvers and the methods of Nicolas Courtois. Topics include: analytic combinatorics and its application to cryptanalysis; the equicomplexity of linear algebra operations; graph coloring; and factoring integers via the quadratic sieve, with its applications to the cryptanalysis of RSA. Algebraic Cryptanalysis is designed for advanced-level students in computer science and mathematics as a secondary text or reference book for self-guided study. This book is also suitable for researchers in Applied Abstract Algebra or Algebraic Geometry who wish to find more applied topics, or for practitioners working for security and communications companies.