Institution
ITM University, Gurgaon, Haryana
About: ITM University, Gurgaon, Haryana is an education organization based in Gurgaon, India. It is known for its research contributions in the topics of Encryption & Cryptosystem. The organization has 749 authors who have published 1159 publications receiving 12997 citations.
Papers published on a yearly basis
Papers
TL;DR: Amorphous thin films of Se80−xTe20Sbx (x = 0, 6, 12) chalcogenide glasses have been deposited onto pre-cleaned glass substrates using the thermal evaporation technique under a vacuum of 10−5 Torr, as discussed by the authors.
14 citations
TL;DR: Random weighted singular value decomposition is based purely on random weights, an isometric matrix and orthogonal triangular decomposition, and all of these components enhance the security of the double random phase encoding cryptosystem.
Abstract: A new asymmetric encryption system for double random phase encoding, based on random weighted singular value decomposition in the fractional Hartley transform domain, has been proposed. Random weighted singular value decomposition is built purely on random weights, an isometric matrix and orthogonal triangular decomposition, and all of these components enhance the security of the double random phase encoding cryptosystem. The random weights and the orthogonal triangular decomposition are the heart of this cryptosystem. The system operates in the fractional Hartley domain, where the fractional orders play a vital role. On the receiver side, the image can be decrypted only if all three components, their multiplication order and the fractional order of the fractional Hartley transform are known. The proposed cryptosystem is compared with singular value decomposition and truncated singular value decomposition. Like both of these, the proposed cryptosystem also yields three components; because of the random weights, however, these components differ greatly from traditional singular value decomposition and truncated singular value decomposition components. Analyses are presented to validate the feasibility of the scheme.
14 citations
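The three-component idea above can be imitated in a few lines of Python. This is a toy real-valued analogue only, not the paper's method: it uses the ordinary (non-fractional) Hartley transform and random ±1 masks in place of fractional-order transforms and phase masks, and all function names are invented for illustration.

```python
import numpy as np

def hartley(x):
    # 2-D discrete Hartley transform via the FFT (cas kernel = cos + sin);
    # applying it twice returns the input scaled by the number of pixels
    X = np.fft.fft2(x)
    return X.real - X.imag

def encrypt(img, mask1, mask2):
    # Double random "phase" encoding, sketched with real +/-1 masks:
    # mask, transform, mask again, transform again
    enc = hartley(hartley(img * mask1) * mask2)
    # Split the ciphertext into three SVD components; all three, multiplied
    # in the right order, are needed to rebuild the ciphertext
    U, s, Vt = np.linalg.svd(enc)
    return U, s, Vt

def decrypt(U, s, Vt, mask1, mask2):
    enc = U @ np.diag(s) @ Vt            # recombine the three components
    n = enc.size                         # Hartley self-inverse scaling
    stage1 = hartley(enc) / n * mask2    # undo second transform and mask
    return hartley(stage1) / n * mask1   # undo first transform and mask

rng = np.random.default_rng(0)
img = rng.random((8, 8))
mask1 = rng.choice([-1.0, 1.0], size=(8, 8))
mask2 = rng.choice([-1.0, 1.0], size=(8, 8))
recovered = decrypt(*encrypt(img, mask1, mask2), mask1, mask2)
assert np.allclose(recovered, img)
```

The ±1 masks make the pipeline self-inverting, which keeps the sketch short; the real scheme's security rests on the fractional orders and randomized decomposition that this toy omits.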
01 Jan 2018
TL;DR: The aim of this research is to assist engineers in requirement prioritization by reducing time and hence minimizing decision-making efforts.
Abstract: Nowadays, software has become a part of great significance in human life. The chief goal in today's environment is to prepare software that can meet all the needs of stakeholders; as the complexity of software increases, the requirements grow too. Not all requirements can be fulfilled in the given time, so some must be considered first to reduce risk and make proper use of the software. Gathering and prioritizing requirements therefore helps the successful development of the software. Requirement prioritization is an essential decision-making process. Its main objective is to extract the relevant facts from the client, describing the requirements by their attributes. The ultimate purpose is to minimize the user's decision-making effort and to increase the accuracy of the final requirements' ranking. A number of requirement prioritization techniques have been proposed to date, but they are not efficient performance-wise. This work presents a new technique that improves performance by using a least-squares-based random genetic algorithm. The aim of this research is to assist engineers in requirement prioritization by reducing time and hence minimizing decision-making effort. In the proposed work, as the number of pairs increases, the distance increases only slightly, and the time is also reduced. The initial population can be selected by analyzing some of the input partial orders, or it can be chosen randomly. In the future, genetic algorithm (GA) and value-oriented prioritization (VOP) can be integrated to take advantage of both for better outputs.
13 citations
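The genetic-algorithm side of such a prioritizer can be sketched as follows. This is a minimal illustration, not the paper's least-squares-based algorithm: it evolves a ranking that satisfies as many pairwise stakeholder preferences as possible, using only swap mutation and elitist selection, and all names are invented.

```python
import random

def fitness(ranking, prefs):
    # Count how many pairwise preferences (a should rank above b) are satisfied
    pos = {req: i for i, req in enumerate(ranking)}
    return sum(pos[a] < pos[b] for a, b in prefs)

def prioritize(reqs, prefs, pop_size=30, generations=300, seed=0):
    rng = random.Random(seed)
    # Initial population: random permutations of the requirements
    pop = [rng.sample(reqs, len(reqs)) for _ in range(pop_size)]
    for _ in range(generations):
        # Elitist selection: keep the better half as survivors
        pop.sort(key=lambda r: -fitness(r, prefs))
        survivors = pop[:pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(child)), 2)   # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda r: fitness(r, prefs))

reqs = ["R1", "R2", "R3", "R4", "R5"]
prefs = [("R1", "R2"), ("R2", "R3"), ("R3", "R4"),
         ("R4", "R5"), ("R1", "R5"), ("R2", "R4")]
best = prioritize(reqs, prefs)
```

A permutation encoding with swap mutation is the usual choice for ranking problems, since crossover on permutations needs repair steps that a short sketch is better off without.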
16 Mar 2016
TL;DR: Insight is given into how big data from healthcare can be analyzed using predictive analytics by exploiting the potential of the Hadoop/MapReduce tool.
Abstract: The healthcare industry has moved towards modernization and has stepped into creating electronic health records, which generate massive data. Data generated by the healthcare industry is huge in volume and highly unstructured in nature; it is thus important to structure the data and leverage its actual potential. One of the major contributors to mortality and morbidity in developing countries like India is a non-communicable disease, chronic kidney failure. Early diagnosis of the disease becomes significantly important and needs more attention. This survey paper gives an overview of the research done in this area. Based on the survey, it is found that not much significant work has been done in this area. This paper also gives an insight into how big data from healthcare can be analyzed using predictive analytics by exploiting the potential of the Hadoop/MapReduce tool. The benefit of implementing this technique would be that the disease is diagnosed at an early stage based on the patient's various symptoms, helping patients to get the right cure and care at the right time, which leads to better health.
13 citations
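The MapReduce pattern the paper alludes to can be imitated locally in plain Python. This is a toy single-machine sketch, not Hadoop itself: the map phase emits (symptom, 1) pairs from hypothetical patient records, and the reduce phase aggregates counts per symptom, the same shape a Hadoop job would use to surface frequent early symptoms across a large record set.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Map: emit a (symptom, 1) pair for every symptom in one patient record
    return [(symptom, 1) for symptom in record["symptoms"]]

def reduce_phase(pairs):
    # Reduce: sum the emitted values per symptom key
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

# Hypothetical electronic health records (illustrative only)
records = [
    {"patient": 1, "symptoms": ["fatigue", "swelling"]},
    {"patient": 2, "symptoms": ["fatigue", "nausea"]},
    {"patient": 3, "symptoms": ["swelling", "fatigue"]},
]
symptom_counts = reduce_phase(
    chain.from_iterable(map_phase(r) for r in records)
)
```

In a real Hadoop deployment, the map calls would run in parallel across data nodes and the framework would shuffle pairs by key before reducing; the local simulation keeps only the programming model.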
TL;DR: This paper studies systems of two or more linear congruences, considering 2t prime numbers to construct t shareholders, splitting the secret S into t parts such that all t shares are needed to reconstruct the secret using the CRT.
Abstract: In recent years, Chinese remainder theorem (CRT)-based function sharing schemes have been proposed in the literature. In this paper, we study systems of two or more linear congruences. When the moduli are pairwise coprime, the main theorem is known as the CRT, because special cases of the theorem were known to the ancient Chinese. In modern algebra the CRT is a powerful tool in a variety of applications, such as cryptography, error control coding, fault-tolerant systems and certain aspects of signal processing. Threshold schemes enable a group of users to share a secret by providing each user with a share. A scheme has threshold t+1 if any subset of t+1 shares enables the secret to be recovered. In this paper, we consider 2t prime numbers to construct t shareholders. Using the t shareholders, we split the secret S into t parts, and all t shares are needed to reconstruct the secret using the CRT.
13 citations
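A minimal t-out-of-t CRT sharing scheme of this shape can be sketched in Python. One assumption here is our reading of the abstract: the 2t primes are paired so that each of the t shareholders holds a residue modulo the product of two primes; the function names are illustrative.

```python
from functools import reduce

def split_secret(secret, prime_pairs):
    # Each shareholder's modulus is the product of one pair of primes,
    # so 2t primes yield t shareholders
    moduli = [p * q for p, q in prime_pairs]
    M = reduce(lambda a, b: a * b, moduli)
    assert secret < M, "secret must be smaller than the product of all moduli"
    shares = [secret % m for m in moduli]   # one residue per shareholder
    return shares, moduli

def reconstruct(shares, moduli):
    # Chinese remainder theorem: combine all t residues; with any share
    # missing, the secret stays undetermined modulo that share's modulus
    M = reduce(lambda a, b: a * b, moduli)
    total = 0
    for s, m in zip(shares, moduli):
        Mi = M // m
        total += s * Mi * pow(Mi, -1, m)    # pow(Mi, -1, m): inverse mod m
    return total % M

shares, moduli = split_secret(123456789, [(101, 103), (107, 109), (113, 127)])
assert reconstruct(shares, moduli) == 123456789
```

Because the primes are distinct, the moduli are automatically pairwise coprime, which is exactly the hypothesis the CRT needs; `pow(x, -1, m)` requires Python 3.8+.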
Authors
Showing all 763 results
Name | H-index | Papers | Citations |
---|---|---|---|
S. K. Maurya | 37 | 121 | 3488 |
Prem Vrat | 33 | 69 | 4894 |
Kehar Singh | 30 | 197 | 4555 |
Stefan Fischer | 30 | 198 | 4477 |
Abhishek Jain | 29 | 120 | 3556 |
Prabhata K. Swamee | 29 | 150 | 3278 |
R. C. Mittal | 28 | 107 | 2456 |
Ram Kumar Sharma | 25 | 129 | 2243 |
Pramila Goyal | 23 | 52 | 1524 |
B. K. Das | 22 | 100 | 1879 |
Divya Agarwal | 22 | 198 | 2020 |
Yugal Kumar | 20 | 70 | 847 |
Sudheer Ch | 20 | 30 | 1336 |
Amparo Borrell | 20 | 87 | 1155 |
Anil Kumar Yadav | 19 | 54 | 1145 |