
Showing papers on "Key size" published in 2023



Journal ArticleDOI
TL;DR: In this article, the authors study the security of the Russian authenticated encryption with associated data mode known as MGM and examine the mode's properties under the condition of $\mathcal{O}\left(2^{n/2}\right)$ queries, where n is the state size of the block cipher used.
Abstract: In this work we study the security of the Russian authenticated encryption with associated data mode known as MGM. We examine the mode's properties under the condition that we have $\mathcal{O}\left(2^{n/2}\right)$ queries, where n is the state size of the block cipher used. Two attacks based on the birthday paradox are proposed. One of these attacks does not reuse the nonce and allows an adversary to generate a message with a correct authentication code without knowing the secret key. It should be noted that the amount of information protected under one key with the MGM mode does not exceed $2^{n/2}$ bits.
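A small numeric illustration of the birthday bound that drives attacks of this kind: with roughly $2^{n/2}$ queries against an n-bit state, internal collisions become likely, which is why the data processed under one key is limited. This is a generic sketch, not the paper's concrete attack procedure.

```python
# Birthday-bound illustration: probability of an internal collision among q
# uniformly random n-bit values (standard approximation, not the paper's attack).
from math import exp

def collision_probability(q: int, n: int) -> float:
    """Approximate P(at least one collision) among q random n-bit values."""
    return 1.0 - exp(-q * (q - 1) / 2 ** (n + 1))

if __name__ == "__main__":
    n = 128  # state size of a typical 128-bit block cipher
    for exponent in (32, 48, 64, 66):
        q = 2 ** exponent
        print(f"q = 2^{exponent:<2}  P(collision) ~ {collision_probability(q, n):.6f}")
```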

1 citation


Journal ArticleDOI
TL;DR: In this article, the authors present a cryptographic encryption standard whose model is based on Serpent, designed by Eli Biham, Ross Anderson, and Lars Knudsen; the modification lies in the design of the cipher, which uses a power associative (PA) loop and a group of permutations.
Abstract: This article presents a cryptographic encryption standard whose model is based on Serpent, designed by Eli Biham, Ross Anderson, and Lars Knudsen. The modification lies in the design of the cipher: we use a power associative (PA) loop and a group of permutations. The proposed mathematical structure is superior to a Galois field (GF) in terms of complexity and can create arbitrary randomness due to a larger key space. The proposed method is simple and fast in terms of computation, while affirming higher security and sensitivity. In contrast to GFs, PA-loops are non-isomorphic and have several Cayley table representations. This supports resistance to cryptanalytic attacks, particularly those targeting the underlying mathematical structure. The scheme's full encryption and decryption procedure is specified and rigorously assessed to support its multimedia applications. The observed speed of this technique, which uses a 256-bit key and a 128-bit block size, is comparable to three-key Triple DES.

1 citation


Book ChapterDOI
TL;DR: In this paper, it is shown that the security of this keystream-generator design depends entirely on the choice of the output function, which determines whether a key recovery attack is possible.
Abstract: A common structure in stream ciphers makes use of linear and nonlinear shift registers with a nonlinear output function drawing from both registers. We refer to these as Grain-like keystream generators. A recent development in lightweight ciphers is a modification of this structure to include a non-volatile key register, which allows key bits to be fed into the state update of the nonlinear register. Sprout and Plantlet are examples of this modified structure. The authors of these ciphers argue that including these key bits in the internal state update provides increased security, enabling the use of reduced register sizes below the commonly accepted rule of thumb that the state size should be at least twice the key size. In this paper, we analyse Plantlet and show that the security of this design depends entirely on the choice of the output function. Specifically, the contribution from the nonlinear register to the output function determines whether a key recovery attack is possible. We make a minor modification to Plantlet’s output function which allows the contents of the linear register to be recovered using an algebraic attack during keystream generation. This information then allows partial recovery of the contents of the nonlinear register, after which the key bits and the remaining register contents can be obtained using a guess and check approach, with a complexity significantly lower than exhaustive key search. Note that our attack is not successful on the existing version of Plantlet, though it only requires minor modifications to the filter function in order for the attack to succeed. However, our results clearly demonstrate that including the key in the state update during keystream generation does not increase the security of Plantlet. In fact, this feature was exploited to recover the key during keystream generation without the need to consider the initialisation process. This paper provides design guidelines for choosing both suitable output functions and the register stages used for inputs to these functions in order to resist the attacks we applied.
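To make the structure under discussion concrete, the following is a toy Grain-like keystream generator with a non-volatile key register feeding the nonlinear register's update, in the spirit of Sprout/Plantlet. The register sizes, tap positions, and output/filter functions below are illustrative placeholders, not the real Plantlet specification.

```python
# Toy Grain-like keystream generator with a key register in the NFSR update.
# All sizes and tap positions are illustrative only (NOT Plantlet).
from typing import List

class ToyGrainLike:
    def __init__(self, key_bits: List[int], iv_bits: List[int]):
        self.key = key_bits                          # non-volatile key register
        self.lfsr = iv_bits[:16]                     # toy linear register
        self.nfsr = [b ^ 1 for b in iv_bits[:16]]    # toy nonlinear register
        self.t = 0

    def _lfsr_feedback(self) -> int:
        s = self.lfsr
        return s[0] ^ s[5] ^ s[11] ^ s[15]           # toy linear taps

    def _nfsr_feedback(self) -> int:
        b = self.nfsr
        key_bit = self.key[self.t % len(self.key)]   # key bit enters the state update
        return b[0] ^ b[3] ^ (b[7] & b[12]) ^ self.lfsr[0] ^ key_bit

    def clock(self) -> int:
        # output function draws from both registers (toy filter function)
        z = self.nfsr[1] ^ self.lfsr[3] ^ (self.nfsr[9] & self.lfsr[8])
        self.lfsr = self.lfsr[1:] + [self._lfsr_feedback()]
        self.nfsr = self.nfsr[1:] + [self._nfsr_feedback()]
        self.t += 1
        return z

key = [1, 0, 1, 1, 0, 0, 1, 0] * 10     # 80 toy key bits
iv = [0, 1] * 8                          # 16 toy IV bits
gen = ToyGrainLike(key, iv)
print("".join(str(gen.clock()) for _ in range(32)))
```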

Posted ContentDOI
22 Jun 2023
TL;DR: Wang et al. as discussed by the authors presented a key expansion algorithm based on a high-performance one-dimensional chaotic map, which is assessed based on statistical independence and sensitivity to the initial key.
Abstract: In this paper, we present a key expansion algorithm based on a high-performance one-dimensional chaotic map. Traditional one-dimensional chaotic maps exhibit several limitations, prompting us to construct a new map that overcomes these shortcomings. By analyzing the structural characteristics of classic 1D chaotic maps, we propose a high-performance 1D map that outperforms multidimensional maps introduced by numerous researchers in recent years. In block cryptosystems, the security of round keys is of utmost importance. To ensure the generation of secure round keys, a sufficiently robust key expansion algorithm is required. The security of round keys is assessed based on statistical independence and sensitivity to the initial key. Leveraging the properties of our constructed high-performance chaotic map, we introduce a chaotic key expansion algorithm. Our experimental results validate the robust security of our proposed key expansion algorithm, demonstrating its resilience against various attacks. The algorithm exhibits strong statistical independence and sensitivity to the initial key, further strengthening the security of the generated round keys.
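A minimal sketch of the general idea of chaotic key expansion: iterate a 1D chaotic map seeded from the master key and quantize the trajectory into round keys. The logistic map, the seeding step, and the quantization below are stand-ins; the paper constructs its own higher-performance 1D map.

```python
# Sketch of chaotic key expansion (logistic map used as a generic stand-in).
import hashlib

def logistic(x: float, r: float = 3.99) -> float:
    return r * x * (1.0 - x)

def expand_key(master_key: bytes, rounds: int, round_key_bytes: int = 16) -> list:
    # derive an initial condition in (0, 1) from the master key
    seed = int.from_bytes(hashlib.sha256(master_key).digest(), "big")
    x = (seed % (2 ** 52)) / float(2 ** 52)
    x = min(max(x, 1e-9), 1 - 1e-9)
    round_keys = []
    for _ in range(rounds):
        rk = bytearray()
        for _ in range(round_key_bytes):
            x = logistic(x)
            rk.append(int(x * 256) % 256)    # quantize one byte per iteration
        round_keys.append(bytes(rk))
    return round_keys

for i, rk in enumerate(expand_key(b"example 128-bit master key!!", rounds=4)):
    print(f"round key {i}: {rk.hex()}")
```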

Journal ArticleDOI
TL;DR: In this article, the authors propose a modified version of the Blowfish algorithm that performs high-speed encryption with high throughput and supports a 128-bit block size, enhancing its applicability in various areas.
Abstract: The field of information security has many uses in the modern day and beyond. Encryption is a method used to secure information from unauthorized access. Since symmetric key algorithms can decrypt data much more quickly than asymmetric key algorithms, the former are more popular. Blowfish is an unpatented, freely usable, compact, quick, and efficient symmetric key encryption technique. Additionally, this method has a high level of security. Its 64-bit block size limits its use, though. This paper proposes a modified version of the Blowfish algorithm that performs high-speed encryption with high throughput and supports a 128-bit block size, enhancing its applicability in various areas. The algorithm can be an alternative to the AES algorithm where power consumption is limited. The proposed algorithm is compared with the original Blowfish algorithm in terms of execution speed, throughput, and the avalanche effect. The algorithm's performance is also evaluated on images based on diffusion properties, image histogram, entropy, and correlation coefficient.
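For reference, this is how the avalanche-effect metric used in the comparison can be measured, run here against the original 64-bit-block Blowfish from pycryptodome (the proposed 128-bit-block variant is not publicly available, so the cipher under test is an assumption of this sketch).

```python
# Avalanche-effect measurement against standard Blowfish (requires pycryptodome).
import os
from Crypto.Cipher import Blowfish

def hamming_distance(a: bytes, b: bytes) -> int:
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def avalanche(key: bytes, trials: int = 1000) -> float:
    cipher = Blowfish.new(key, Blowfish.MODE_ECB)
    flipped_bits, total_bits = 0, 0
    for _ in range(trials):
        pt = bytearray(os.urandom(Blowfish.block_size))
        ct1 = cipher.encrypt(bytes(pt))
        pt[0] ^= 0x01                          # flip a single plaintext bit
        ct2 = cipher.encrypt(bytes(pt))
        flipped_bits += hamming_distance(ct1, ct2)
        total_bits += 8 * Blowfish.block_size
    return flipped_bits / total_bits           # ideal value is close to 0.5

print(f"avalanche ratio: {avalanche(os.urandom(16)):.3f}")
```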

Journal ArticleDOI
TL;DR: In this paper, the authors propose an efficient key scheduling algorithm, resistant to cryptanalytic attacks, for the PRESENT lightweight encryption technique; the algorithm is implemented on the ARM Cortex M3-based NXP LPC 1857 and 1768 hardware development platforms.
Abstract: The efficiency of a cryptographic algorithm in terms of security depends on the resistance against cryptanalytic attacks. Besides the complexity of the encryption algorithm, the key plays an essential role in the security against cryptanalytic attacks. The strength and complexity of the encryption algorithm do not suffice and serve the fundamental purpose of security if the key is compromised at any stage. So, apart from the cryptanalytically robust encryption algorithm, a strong key schedule is also essential to thwart possible attacks against a particular algorithm. PRESENT, a lightweight encryption technique with a simple design, is resistant to linear and differential attacks but has a weak key schedule and is susceptible to cryptanalytic attacks. This paper proposes an efficient key scheduling algorithm for PRESENT lightweight encryption technique resistant against cryptanalytic attacks. Statistical tests examine the proposed key schedule’s cryptographic strength on the properties of subkeys produced in the key generation mechanism. The tests prove the efficiency of the proposed key schedule in terms of cryptanalytic attacks. Implementation and comparison of the efficacy of the proposed key schedule in terms of security and implementation costs are illustrated with the PRESENT-80 key schedule and the PRESENT-128 key schedule. The key scheduling algorithm is implemented on the ARM Cortex M3-based NXP LPC 1857 and 1768 hardware development platforms. The results indicate the efficiency of the proposed algorithm in terms of security, performance, and power consumption across the two hardware platforms.
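For context, the following is the publicly specified PRESENT-80 key schedule that the paper identifies as weak and sets out to replace (the proposed schedule itself is not reproduced here).

```python
# Reference PRESENT-80 key schedule (the baseline the paper's proposal replaces).
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def present80_round_keys(key: int, rounds: int = 31):
    """key: 80-bit integer; returns the 64-bit round keys K1..K32
    (31 rounds need 32 round keys, including the final whitening key)."""
    state = key & ((1 << 80) - 1)
    keys = []
    for i in range(1, rounds + 2):
        keys.append(state >> 16)                                       # leftmost 64 bits
        state = ((state << 61) | (state >> 19)) & ((1 << 80) - 1)      # rotate left by 61
        state = (SBOX[state >> 76] << 76) | (state & ((1 << 76) - 1))  # S-box on top nibble
        state ^= i << 15                                               # round counter into bits 19..15
    return keys

rks = present80_round_keys(0x00000000000000000000)
print(f"K1  = {rks[0]:016x}")
print(f"K32 = {rks[-1]:016x}")
```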

Book ChapterDOI
TL;DR: In this article , the authors proposed two methods that combine high error correcting capability with security enhancement to enable cryptographic communication even under high noise, which can be regarded as one type of mode of operation.
Abstract: This paper proposes two methods that combine high error correcting capability with security enhancement to enable cryptographic communication even under high noise. The first method is a combination of symmetric key cryptography and Shortened LDPC, which enables two-way communication. It can be regarded as one type of mode of operation. The second method combines the McEliece method and Shortened QC-MDPC to realize one-way communication. It has the advantage of fast processing speeds compared to general asymmetric key cryptography and the ability to centrally manage key updates for many IoT modules. We performed computer simulations and analysed practical parameterization and security enhancement. Both methods are found to provide sufficient security and are expected to have a wide range of applications.


Journal ArticleDOI
TL;DR: In this paper, the basic idea of elliptic curve cryptography (ECC) as well as the Vigenère symmetric-key cipher is described, and a cryptosystem using elliptic curves and Vigenère cryptography is proposed.
Abstract: This paper describes the basic idea of elliptic curve cryptography (ECC) as well as the Vigenère symmetric-key cipher. Elliptic curve arithmetic can be used to develop elliptic curve coding schemes, including key exchange, encryption, and digital signatures. The main attraction of elliptic curve cryptography compared to Rivest-Shamir-Adleman (RSA) is that it provides equivalent security for a smaller key size, which reduces processing costs. From this theoretical basis, we propose a cryptosystem using elliptic curves and Vigenère cryptography. We implemented our encryption algorithm in the Visual Studio 2019 integrated development environment to design a safe, secure, and effective cryptosystem.
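A toy sketch of the combination described above: an elliptic-curve Diffie-Hellman exchange over a tiny textbook curve produces a shared point whose x-coordinate is then used as a Vigenère key. The curve, the private keys, and the key-derivation step below are illustrative assumptions, not the paper's construction; a real system would use a standardized curve and a proper key-derivation function.

```python
# Toy ECDH over y^2 = x^3 + x + 1 mod 23, shared point drives a Vigenere key.
P, A = 23, 1                       # tiny textbook curve parameters
G = (0, 1)                         # base point on the curve

def ec_add(p1, p2):
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                # point at infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def ec_mul(k, point):              # double-and-add scalar multiplication
    result = None
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result

def vigenere(text, shifts, decrypt=False):
    sign = -1 if decrypt else 1
    return "".join(chr((ord(c) - 65 + sign * shifts[i % len(shifts)]) % 26 + 65)
                   for i, c in enumerate(text))

alice_priv, bob_priv = 5, 9
shared = ec_mul(alice_priv, ec_mul(bob_priv, G))    # same point on both sides
shifts = [int(d) for d in str(shared[0])]           # x-coordinate digits as key
ct = vigenere("ELLIPTICCURVEANDVIGENERE", shifts)
print(ct, "->", vigenere(ct, shifts, decrypt=True))
```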

Proceedings ArticleDOI
23 Jan 2023
TL;DR: In this article, the authors propose the KEY SECURE-KEY (KSKEY) algorithm to get around restricted key bit sizes, noting that simply increasing the key size is not a suitable way to protect data because multi-core processors in modern microprocessors are significantly increasing processing speeds.
Abstract: These days, people use a variety of online software applications to move data from one location to another. To secure personal data from hackers, software and applications use cryptography-based algorithms. The encryption and decryption processes are the foundation of a cryptographic algorithm and are carried out with the help of a key. Some cryptographic algorithms employ automatic key generation that starts with the user's password, whereas others interpret the password directly as the key. A symmetric algorithm uses the same key for both encryption and decryption. For key generation, symmetric algorithms use a second or subsidiary method. This key generation process protects the password from various key attacks. The key generation algorithm functions as an interpreter, transforming a password from a human-readable form to a machine-readable form. An increase in permutations and combinations helps to protect the password, so additional characteristics are added to the algorithm. To get around the restricted key bit size, this paper proposes the KEY SECURE-KEY (KSKEY) algorithm. However, simply increasing the key size is not a suitable way to protect encrypted data, because multi-core processors in modern microprocessors are significantly increasing processing speeds.

Posted ContentDOI
25 May 2023
TL;DR: In this article, the authors attack a Ring-LWE (RLWE) scheme, which can be implemented with a short key length, using a modified LLL basis reduction algorithm, and investigate the trend in the degree of field extension required to generate a secure and small key.
Abstract: Modern information communications use cryptography to keep the contents of communications confidential. RSA (Rivest-Shamir-Adleman) cryptography and elliptic curve cryptography, which are public-key cryptosystems, are widely used cryptographic schemes. However, it is known that these cryptographic schemes can be deciphered in a very short time by Shor's algorithm when a quantum computer is put into practical use. Therefore, several methods have been proposed for quantum computer-resistant cryptosystems that cannot be cracked even by a quantum computer. A simple implementation of LWE-based lattice cryptography based on the LWE (Learning With Errors) problem requires a key length of $O(n^2)$ to ensure the same level of security as existing public-key cryptography schemes such as RSA and elliptic curve cryptography. In this paper, we attacked the Ring-LWE (RLWE) scheme, which can be implemented with a short key length, with a modified LLL (Lenstra-Lenstra-Lov\'asz) basis reduction algorithm and investigated the trend in the degree of field extension required to generate a secure and small key. Results showed that the lattice-based cryptography may be strengthened by employing Cullen or Mersenne prime numbers as the degree of field extension.
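To illustrate why the ring variant needs only $O(n)$ key material where plain LWE needs an $n \times n$ matrix, the following is a toy Ring-LWE sample generation over $\mathbb{Z}_q[x]/(x^N+1)$. The parameters are far too small for real security, and the paper's modified LLL attack is not reproduced here.

```python
# Toy RLWE sample generation (illustrative parameters only).
import numpy as np

N, Q = 16, 12289          # ring R_q = Z_q[x] / (x^N + 1); toy parameters

def poly_mul(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Multiply two degree-<N polynomials modulo x^N + 1 and modulo Q."""
    full = np.convolve(a, b)                 # degree up to 2N-2
    res = full[:N].copy()
    res[: len(full) - N] -= full[N:]         # x^N = -1 wraparound
    return res % Q

rng = np.random.default_rng(0)
s = rng.integers(-1, 2, N)                   # small secret polynomial
a = rng.integers(0, Q, N)                    # uniform public polynomial
e = rng.integers(-2, 3, N)                   # small error polynomial
b = (poly_mul(a, s) + e) % Q                 # RLWE sample (a, b)

print("public key size ~", 2 * N, "coefficients (vs ~ N*N entries for plain LWE)")
print("b =", b)
```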

Proceedings ArticleDOI
11 Apr 2023
TL;DR: In this paper, an algorithm for generating and maintaining keys based on polynomials and interpolation is presented, where the advantage of having a variable-length key is leveraged to mitigate key-size-based attacks.
Abstract: Security is every organization's primary concern. A cryptographic method remains safe only if the key cannot be cracked by an attacker through any kind of attack. A conventional cryptographic algorithm's strength relies on the key size and structure. Cryptographic algorithms can be susceptible to brute-force attacks since their keys have fixed lengths. The advantage of having a variable-length key is leveraged to mitigate key-size-based attacks by using polynomials. This article outlines an algorithm for generating and maintaining keys based on polynomials and interpolation.
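A generic sketch of deriving a variable-length key from a secret polynomial, in the spirit of the interpolation-based scheme described above. The seeding from a passphrase, the field size, and the evaluation points are assumptions of this sketch, not the paper's exact construction.

```python
# Variable-length key derivation from a secret polynomial (illustrative only).
import hashlib

PRIME = 257  # small prime field for polynomial arithmetic

def poly_eval(coeffs, x):
    """Evaluate the polynomial with the given coefficients at x (Horner), mod PRIME."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % PRIME
    return acc

def derive_key(passphrase: str, degree: int, key_len: int) -> bytes:
    # secret coefficients derived deterministically from the passphrase
    digest = hashlib.sha256(passphrase.encode()).digest()
    coeffs = [digest[i % len(digest)] for i in range(degree + 1)]
    # evaluating at successive points yields as many key bytes as needed
    return bytes(poly_eval(coeffs, x) % 256 for x in range(1, key_len + 1))

print(derive_key("correct horse battery staple", degree=7, key_len=24).hex())
```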

Journal ArticleDOI
TL;DR: In this article, the authors provide an overview of ECC, including the ECC process, fundamental protocols, and different ECC systems and applications; comparison tables are included that list the ratio of key sizes between ECC and RSA, along with factors such as overhead, power availability, and required storage.
Abstract: Various cryptography algorithms are used to keep the transmission of data safe from intruders and to secure the connection between sender and receiver. This article provides an overview of ECC, including the algorithm process, fundamental protocols, and different ECC systems and applications. Elliptic curves over finite fields are also illustrated with numerous graphical representations of cryptographic processes. Comparison tables are included that list the ratio of key sizes between ECC and RSA (the method nearest to ECC), along with factors such as overhead, power availability, and required storage; these indicate faster operation with less bandwidth for the elliptic curve method. In addition, some previous works related to the topic are compared according to their properties, measurement methods, and the type of data used, and the results are collected in a traceable manner so as to make them available to researchers and those interested in ECC.
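The widely cited NIST SP 800-57 equivalences behind such comparison tables are summarized below; they show ECC reaching a given symmetric-security level with far smaller keys than RSA. This listing is background context, not the paper's own table.

```python
# NIST SP 800-57 comparable key strengths (symmetric bits : RSA bits, ECC bits).
EQUIVALENT_STRENGTH_BITS = {
    80:  (1024, 160),
    112: (2048, 224),
    128: (3072, 256),
    192: (7680, 384),
    256: (15360, 521),
}
for sym, (rsa, ecc) in EQUIVALENT_STRENGTH_BITS.items():
    print(f"{sym:>3}-bit security: RSA {rsa:>5} bits  vs  ECC {ecc:>3} bits")
```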

Posted ContentDOI
16 Mar 2023
TL;DR: In this paper, the entropy accumulation theorem is applied to prove finite-size security against coherent attacks for a discrete-modulated quantum key distribution protocol involving four coherent states and heterodyne detection.
Abstract: Continuous variable quantum key distribution with discrete modulation has the potential to provide quantum physical security using widely available optical elements and existing telecom infrastructure. While their implementation is significantly simpler than that for protocols based on Gaussian modulation, proving their finite-size security against coherent attacks poses a challenge. In this work we apply the entropy accumulation theorem, a tool that has previously been used in the setting of discrete variables, to prove finite-size security against coherent attacks for a discrete-modulated quantum key distribution protocol involving four coherent states and heterodyne detection. To do so, and contrary to previous approaches, we consider a protocol in which all the information is discretised. We first bound its asymptotic rate under a realistic photon number cutoff assumption. This bound is then upgraded into a finite-size security proof using entropy accumulation. Our analysis provides asymptotic rates in the range of $0.1-10^{-4}$ bits per round for distances up to hundred kilometres, while in the finite case and for realistic parameters, we get of the order of $10$ Gbits of secret key after $n=10^{12}$ rounds and distances of few tens of kilometres.

Proceedings ArticleDOI
17 Apr 2023
TL;DR: The code-based McEliece cryptosystem was originally proposed using Goppa codes in 1978; as discussed in this paper, its improved version, Classic McEliece, has made it as far as the fourth round of the NIST Post-Quantum Cryptography standardization process, which aims to update the standards to include post-quantum cryptography in digital signatures, encryption, and key exchange.
Abstract: Post-quantum cryptography is a growing area since Shor showed that a quantum computer with enough qubits could be used to break the most widely used public-key cryptographic protocols today, such as RSA or those based on the discrete logarithm problem. For this reason, it has become urgent to design cryptosystems that are robust against quantum computer attacks. One of them is the code-based McEliece cryptosystem, which was originally proposed using Goppa codes in 1978. The improved version of the original McEliece cryptosystem, called Classic McEliece, made it as far as the fourth round of the NIST Post-Quantum Cryptography standardization process, launched by the National Institute of Standards and Technology to update the standards and include post-quantum cryptography in digital signatures, encryption, and key exchange. In this work we describe and analyze two variants of the original cryptosystem designed to overcome its main drawbacks, such as its large key size and weakness against known attacks. In addition, we discuss both the recent attack that allows recovery of the private key with limited complexity and the ways in which this attack can be prevented by changing the shape of some constituent arrays in these two new variants.

Proceedings ArticleDOI
26 Jun 2023
TL;DR: Using Nonlinear Congruential Generators (NCGs), the authors propose a key expansion algorithm that helps to design more secure block encryption algorithms or hash functions; experimental results show that the proposed algorithm is feasible and resistant to side-channel attacks.
Abstract: The block cipher stands out among the reliable methods for data security. Key expansion is a crucial step in a block encryption algorithm, and it is thus important to develop secure round keys that are statistically independent and sensitive. Using Nonlinear Congruential Generators (NCGs), we propose a key expansion algorithm that helps to design more secure block encryption algorithms or hash functions. Despite advances in digital technology, NCGs remain an effective method of Pseudorandom Number Generation (PRNG). However, conventional linear congruential generators have difficulties when applied to key expansion: in contrast to a conventional linear congruential generator, the round-constant expansion algorithm requires a random sequence of 0s and 1s that produces random integers within a certain range. To improve the Advanced Encryption Standard (AES) key expansion algorithm, we propose an NCG over the Galois field GF(2^8). Our analysis covers expansion of a 128-bit key; keys of other lengths are handled similarly. The experimental results show that the proposed algorithm is feasible and resistant to side-channel attacks. Our findings can be used to improve existing block cipher algorithms and make them more secure.
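A sketch of what a nonlinear congruential generator over GF(2^8) producing round-key material can look like. The field polynomial, NCG coefficients, seed, and output packing below are illustrative choices, not the paper's parameters.

```python
# Nonlinear congruential generator over GF(2^8) for round-key material (sketch).
def gf_mul(a: int, b: int) -> int:
    """Multiply in GF(2^8) with the AES reduction polynomial x^8+x^4+x^3+x+1."""
    r = 0
    for _ in range(8):
        if b & 1:
            r ^= a
        b >>= 1
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
    return r

def ncg_keystream(seed: int, a: int, b: int, c: int, n_bytes: int) -> bytes:
    """x_{i+1} = a*x_i^2 + b*x_i + c over GF(2^8); emit one byte per step."""
    x, out = seed & 0xFF, bytearray()
    for _ in range(n_bytes):
        x = gf_mul(a, gf_mul(x, x)) ^ gf_mul(b, x) ^ c
        out.append(x)
    return bytes(out)

round_key = ncg_keystream(seed=0x3C, a=0x57, b=0x83, c=0x1D, n_bytes=16)
print(round_key.hex())
```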

Journal ArticleDOI
TL;DR: In this paper , the authors proposed a ring learning with errors (RLWE) reusing errors (ReRLWE), which reduces the size of the evaluation keys by reusing the error that is used when generating an RLWE sample.
Abstract: As cloud computing and AI-as-a-Service become widespread, it is increasingly necessary to deal with privacy-sensitive data. For processing such sensitive data, there are two outsourcing scenarios: i) many clients participate dynamically; ii) many clients are pre-determined. The solutions for protecting sensitive data in both cases are the multi-key homomorphic encryption (MKHE) scheme and the threshold multi-key homomorphic encryption (TMKHE) scheme. However, clients with limited resources may find it difficult to perform MKHE and TMKHE. In addition, due to the large size of the evaluation keys, in particular the multiplication and rotation keys, the communication between the clients and the server providing the outsourcing service increases. The size of the evaluation keys that the server must hold is also tremendous, in particular for the multiplication and rotation keys, which are essential for the bootstrapping operation. In this paper, we propose a variant of MKHE and TMKHE with reduced evaluation keys. To reduce the size of the evaluation keys, we propose a variant of ring learning with errors (RLWE), called RLWE reusing errors (ReRLWE). ReRLWE generates other components by reusing the error that is used when generating an RLWE sample. We prove that RLWE can be reduced to ReRLWE and propose modified evaluation keys under the ReRLWE assumption, namely modified multiplication and rotation keys. For MKHE, the multiplication and rotation keys are reduced by 66% and 25%, respectively; for TMKHE, they are reduced by 50% and 25%, respectively.

Journal ArticleDOI
TL;DR: In this paper, a new algebraic attack on DASTA is proposed, in which the key feed-forward operation, the properties of the nonlinear layer, and the invariance from the linear layer are successfully utilized.
Abstract: As a fully homomorphic encryption friendly symmetric-key primitive, DASTA was introduced by Hebborn at Fast Software Encryption 2020. A new fixed linear layer design concept is introduced in the DASTA stream cipher so that its AND depth and the number of ANDs per encrypted bit are quite small. The security of the DASTA stream cipher has received extensive attention; however, the best-known attack (i.e., an algebraic attack) on DASTA still has a very high data complexity, so reducing the data complexity of attacks on DASTA is an important task. In this article, a new algebraic attack on DASTA is proposed. More specifically, the key feed-forward operation, the properties of the nonlinear layer, and the invariance from the linear layer are successfully utilized in the attack. In particular, the nonlinear relation of internal states in DASTA is linearized effectively. In this case, more secret key bit equations with low algebraic degrees are collected by fixing bits. It is shown that four (r-1)-round instances of the DASTA cipher family are theoretically broken by the attack, where r is the number of iterated rounds. Compared with the results of previous algebraic attacks, our approach achieves a more favorable data complexity.

Journal ArticleDOI
TL;DR: In this article, the Modified Elliptic Curve Cryptography Multi Signature Scheme (MECC-MSS) is proposed for multiple-node accessibility by finding the nearest path for secure transactions.
Abstract: The Internet of Things (IoT) is an emerging technology that moves the world in the direction of smart things. However, IoT security is a complex problem due to its centralized architecture and limited capacity. Blockchain technology has therefore attracted great attention when combined with IoT, due to its decentralized architecture, transparency, immutable records, and cryptographic hash functions. Cryptographic hash algorithms are very important in blockchain technology for secure transmission: they convert variable-size inputs into a fixed-size hash output that is unchangeable. Existing cryptographic hash algorithms with digital signatures have issues of single-node accessibility and support key sizes of only up to 128 bytes; moreover, if an attacker tries to hack the key, the transaction is cancelled. This paper presents the Modified Elliptic Curve Cryptography Multi Signature Scheme (MECC-MSS) for multiple-node accessibility by finding the nearest path for secure transactions. In this work, the input key size can be extended up to 512 bytes to enhance security. The performance of the proposed algorithm is analyzed against other cryptographic hash algorithms such as the Secure Hash Algorithms (SHAs) SHA224, SHA256, SHA384, SHA512, SHA3-224, SHA3-256, SHA3-384, and SHA3-512, and Message Digest 5 (MD5), using a one-way analysis of variance test in terms of accuracy and time complexity. Results show that the MECC-MSS achieves 90.85% accuracy and a time complexity of 1.4 nanoseconds with a significance level of less than 0.05. From the statistical analysis, it is observed that the proposed algorithm is significantly better than the other cryptographic hash algorithms and also has lower time complexity.
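The fixed-output-size property mentioned above can be seen directly for the baseline hash functions the scheme is compared against, using Python's standard hashlib names; this is background illustration, not part of the paper's evaluation.

```python
# Digest sizes of the baseline hash functions (fixed output regardless of input length).
import hashlib

for name in ["md5", "sha224", "sha256", "sha384", "sha512",
             "sha3_224", "sha3_256", "sha3_384", "sha3_512"]:
    h = hashlib.new(name, b"any length of input maps to a fixed-size digest")
    print(f"{name:>9}: {h.digest_size * 8:>3}-bit digest")
```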

Posted ContentDOI
16 May 2023
TL;DR: In this article, the authors propose a new strategy for secure IoT data communication between a satellite link and a terrestrial link that uses the principles of elliptic curve cryptography (ECC) and the NIST P-256 standard for key agreement and encryption when transmitting messages over the satellite communication platform.
Abstract: With the expansion of Internet of Things (IoT) services and the use of satellite communications, given the regional or continental extent of these services, the need for lightweight encryption has increased. In satellite communications, the long distances involved impose limitations on applying security, so heavy encryption algorithms such as RSA cannot be relied on. Elliptic curve cryptography (ECC) provides a lighter alternative by relying on a mathematical problem, the elliptic curve discrete logarithm problem (ECDLP), that cannot be solved in sub-exponential time. Here, we propose a new strategy for secure IoT data communication between a satellite link and a terrestrial link that uses the principles of ECC and the NIST P-256 standard for key agreement and encryption when transmitting messages over the satellite communication platform.
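A minimal sketch of the P-256 key-agreement step such a strategy relies on, using ECDH from the widely used `cryptography` package; the satellite-specific protocol details, the HKDF parameters, and the info label are assumptions of this sketch.

```python
# NIST P-256 (SECP256R1) ECDH key agreement plus session-key derivation (sketch).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# each endpoint (IoT device / ground station) generates a P-256 key pair
device_key = ec.generate_private_key(ec.SECP256R1())
station_key = ec.generate_private_key(ec.SECP256R1())

# each side derives the same shared secret from its private key and the peer's public key
shared_device = device_key.exchange(ec.ECDH(), station_key.public_key())
shared_station = station_key.exchange(ec.ECDH(), device_key.public_key())
assert shared_device == shared_station

# derive a symmetric session key with HKDF for encrypting the actual messages
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"iot-satellite-session").derive(shared_device)
print(session_key.hex())
```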

Journal ArticleDOI
TL;DR: In this paper, the size of the public key is reduced by using non-linear mappings defined as exponentiation operations in finite extended fields represented in the form of finite algebras.
Abstract: Purpose of work: to reduce the size of the public key in public-key algorithms of multivariate cryptography, which are based on the computational difficulty of solving systems of many power equations with many unknowns. Research method: the use of non-linear mappings defined as exponentiation operations in finite extended fields GF(q^m) represented in the form of finite algebras. The latter makes it possible to perform the exponentiation operation in the field GF(q^m) by calculating the values of power polynomials over the field GF(q), which define a hard-to-invert nonlinear mapping of the vector space over GF(q) with a secret trapdoor. Due to the use of nonlinear mappings of this type, the public key in multidimensional cryptography algorithms can be specified as a nonlinear mapping implemented as the calculation of the values of a set of polynomials of the third and sixth degrees. At the same time, due to the use of masking linear mappings that do not increase the number of terms in the polynomials, the size of the public key is reduced in comparison with known analogue algorithms, in which the public key is represented by a set of polynomials of the second and third degrees. The proposed approach potentially expands the areas of practical application of post-quantum algorithms for public encryption and electronic digital signatures related to multidimensional cryptography by significantly reducing the size of the public key. Results of the study: the main provisions of a new approach to the development of multidimensional cryptography algorithms are formulated. Hard-to-invert nonlinear mappings with a secret trapdoor are proposed in the form of exponentiation operations to the second and third powers in finite extended fields GF(q^m), represented in the form of finite algebras. A rationale is given for specifying the public key in a form that includes a superposition of two non-linear mappings performed as the calculation of a set of second- and third-degree polynomials defined over GF(q). Techniques for implementing mappings of this type are proposed, and specific options for specifying the fields GF(q^m) in the form of finite algebras are considered. An estimate of the size of the public key in the algorithms developed within the framework of the new approach is made at a given security level. Practical relevance: the main provisions of a new method for constructing multidimensional cryptography algorithms, based on the computational difficulty of solving systems of many power equations with many unknowns and related to post-quantum cryptoschemes, are developed. The proposed approach expands the areas of practical application of post-quantum algorithms of this type by significantly reducing the size of the public key, which provides the prerequisites for improving performance and reducing the technical resources required for their implementation.

Proceedings ArticleDOI
05 Apr 2023
TL;DR: In this paper, the authors propose three key levels for RSA to overcome the GCD attack on RSA by combining an RSA private and public key with an ECC private and public key using Exclusive OR (XOR) to generate a new hybrid private and public key.
Abstract: Rivest-Shamir-Adleman (RSA) is one of the most important types of asymmetric cryptography. In RSA, the key size is larger than in many other cryptographic algorithms. In this scheme, keys are generated from any two prime numbers. An unauthorized person can easily obtain the keys using a Greatest Common Divisor (GCD) attack, which is one of the main disadvantages of RSA cryptography. Elliptic Curve Cryptography (ECC) is also an asymmetric cryptosystem, with a small key size. The proposed work aims to overcome the disadvantage of the GCD attack on RSA. An RSA private and public key can be combined with an ECC private and public key using Exclusive OR (XOR) to generate a new hybrid private and public key. RSA encryption is then performed using the new hybrid public key and RSA decryption using the new hybrid private key. The advantage of the proposed work is that it creates three key levels to prevent the GCD attack on RSA; the key strength is higher compared to simple RSA key generation.
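For context, the GCD attack the proposal defends against works as follows: when two RSA moduli accidentally share a prime factor, a single gcd computation recovers it instantly. Tiny, well-known Mersenne primes are used here purely for illustration; real RSA primes are 1024+ bits.

```python
# GCD attack on RSA moduli that share a prime factor (toy illustration).
from math import gcd

p, q1, q2 = 524287, 131071, 8191       # small Mersenne primes for illustration
n1, n2 = p * q1, p * q2                # two moduli that share the prime p

shared = gcd(n1, n2)                   # recovers the common prime directly
print("recovered shared prime:", shared)
print("factorizations:", (shared, n1 // shared), (shared, n2 // shared))
```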

Posted ContentDOI
TL;DR: In this paper, the effect of numeric key fingerprint length on comparison time and error rate was investigated in end-to-end encrypted instant messaging, secure email, and device pairing.
Abstract: In applications such as end-to-end encrypted instant messaging, secure email, and device pairing, users need to compare key fingerprints to detect impersonation and adversary-in-the-middle attacks. Key fingerprints are usually computed as truncated hashes of each party's view of the channel keys, encoded as an alphanumeric or numeric string, and compared out-of-band, e.g. manually, to detect any inconsistencies. Previous work has extensively studied the usability of various verification strategies and encoding formats, however, the exact effect of key fingerprint length on the security and usability of key fingerprint verification has not been rigorously investigated. We present a 162-participant study on the effect of numeric key fingerprint length on comparison time and error rate. While the results confirm some widely-held intuitions such as general comparison times and errors increasing significantly with length, a closer look reveals interesting nuances. The significant rise in comparison time only occurs when highly similar fingerprints are compared, and comparison time remains relatively constant otherwise. On errors, our results clearly distinguish between security non-critical errors that remain low irrespective of length and security critical errors that significantly rise, especially at higher fingerprint lengths. A noteworthy implication of this latter result is that Signal/WhatsApp key fingerprints provide a considerably lower level of security than usually assumed.
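A simplified sketch of how a numeric key fingerprint of a chosen length can be derived from key material, namely a truncated hash rendered as decimal digits, in the spirit of the fingerprints studied above. Deployed protocols such as Signal use a more elaborate, iterated construction, so the derivation below is an illustrative assumption.

```python
# Numeric key fingerprints of configurable length from a truncated hash (sketch).
import hashlib

def numeric_fingerprint(public_key: bytes, digits: int) -> str:
    h = int.from_bytes(hashlib.sha256(public_key).digest(), "big")
    return str(h)[:digits].zfill(digits)      # truncate to the desired length

key = b"example public key bytes"
for length in (20, 40, 60):                   # lengths comparable to deployed schemes
    print(f"{length}-digit fingerprint: {numeric_fingerprint(key, length)}")
```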


Book ChapterDOI
TL;DR: In this article, the authors present a new key generation algorithm by introducing a new family of codes called quasi-centrosymmetric Goppa codes, with a moderate key size for storage optimisation.
Abstract: As the development of quantum machines is booming and threatens our standard cryptographic algorithms, a transition period is necessary to protect the data processed by our classical machines, both before the arrival of these machines and after. To get ahead of the curve, the National Institute of Standards and Technology (NIST) launched the Post-Quantum Cryptography Standardization Project in late 2016. Among the finalists, three promising code-based candidates, Classic McEliece, BIKE, and HQC, were sent to the fourth round. In this work, to reduce the Classic McEliece key size without loss of security, we present a new key generation algorithm by introducing a new family of codes called quasi-centrosymmetric Goppa codes, with a moderate key size for storage optimisation. We also characterize these codes in the case where the parity-check matrix is in Cauchy form, giving an algorithm to build them. We conclude with a detailed analysis of security against the best-known structural attacks, giving the new complexities.

Journal ArticleDOI
TL;DR: In this article, the authors propose a new implementation method in which the controlling parameters and the initial conditions of the symmetric chaotic functions are used to produce an ECC-based secret shared key between two parties or among a number of parties participating in a group.
Abstract: The elliptic curve system has received great interest in the field of security systems and offers a great number of advantages. When linked with chaotic systems, it provides broad applicability for establishing common keys between two parties, or in a system with a server that distributes common keys among the participants in a group. The proposed algorithm reduces processor load, reduces power consumption, increases processing speed, enhances storage efficiency, requires smaller certificates, and saves bandwidth. ECC provides high-level arithmetic operations: it is an algebraic structure defined over a large number of points, with any point given in Cartesian coordinates, and over prime numbers generated from a three-dimensional chaotic system using the multiplication and addition algorithms of the elliptic curve system. This paper suggests a new implementation method: the controlling parameters and the initial conditions of the symmetric chaotic functions are used as inputs to produce an ECC-based secret shared key between two parties or among a number of parties participating in a group, with all the shared-key values lying on points of the curve. This key technology supports authentication, confidentiality, and non-repudiation.

Book ChapterDOI
01 Jan 2023
TL;DR: In this paper, the authors implement the 256-bit AES algorithm targeted at Field Programmable Gate Array (FPGA) architectures and compare it with the 128-bit implementation, reporting performance and resource utilization.
Abstract: Digital communication of any form must provide data confidentiality as threats are increasing in today's rapid world. Data privacy and security are crucial factors, as data is considered gold in the modern era. The 128-bit Advanced Encryption Standard algorithm, commonly known as AES, has been implemented in several designs focusing on specific purposes and is used widely. The 256-bit variant uses the same fundamental cipher blocks as the 128-bit version but differs in key size, key expansion function, and the number of cipher rounds. This paper investigates the 256-bit AES algorithm targeted at Field Programmable Gate Array (FPGA) architectures and compares it with the 128-bit implementation, reporting performance and resource utilization. The security offered is also discussed; it is determined by the complexity of recovering the key using cryptanalytic attacks. Both encryption and decryption are handled by this implementation and are tested in the Verilog language using the Xilinx Vivado software on the Xilinx Zynq-7000 (xc7z020-clg484-1) FPGA.
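The functional difference between the two variants compared in the paper can be seen with a software AES for reference: the block size stays 128 bits, while the key length and round count change (10 rounds for AES-128, 14 for AES-256). This pycryptodome sketch is illustrative background, not the paper's hardware design.

```python
# AES-128 vs AES-256: same 128-bit block, different key length and round count.
import os
from Crypto.Cipher import AES

block = os.urandom(16)                        # AES block size is 128 bits for both
for key_bits, rounds in ((128, 10), (256, 14)):
    key = os.urandom(key_bits // 8)
    ct = AES.new(key, AES.MODE_ECB).encrypt(block)
    print(f"AES-{key_bits}: {rounds} rounds, key {key_bits} bits, ciphertext {ct.hex()}")
```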

Posted ContentDOI
23 Mar 2023
TL;DR: In this paper, the performance of the Runge-Kutta (RK) method with ECC is compared with that of the Game Theory (GT) method with ECC, in terms of text encryption and decryption security for Aadhaar card data.
Abstract: In the Aadhaar card, security is a major issue. Since the data must be kept as private as possible, appropriate data-handling strategies are required, and this work is intended to be useful for data security. The advantage of the proposed technique lies in text encryption and decryption security. Asymmetric algorithms include RSA (Rivest-Shamir-Adleman) and ECC (Elliptic Curve Cryptography); both are used to create a pair of keys, a public key and a private key. The performance of the RK (Runge-Kutta) method with ECC and of the GT (Game Theory) method with ECC are compared in this study. Combining the ECC method with the RK and GT methods improves speed and security. The RK-GT-ECC algorithm is proposed to improve the avalanche effect, speed, throughput, and power consumption, and its experimental findings show increased performance. A detailed mathematical justification for the use of the RK-GT-ECC algorithm is also provided. The improved performance of the RK-GT-ECC approach, as well as the experimental findings, are discussed.