scispace - formally typeset
Institution

ITM University, Gurgaon, Haryana

Education · Gurgaon, India
About: ITM University, Gurgaon, Haryana is an education organization based in Gurgaon, India. It is known for research contributions in the topics: Encryption & Cryptosystem. The organization has 749 authors who have published 1159 publications receiving 12997 citations.


Papers
Journal ArticleDOI
TL;DR: Amorphous thin films of Se₈₀₋ₓTe₂₀Sbₓ (x = 0, 6, 12) chalcogenide glasses have been deposited onto pre-cleaned glass substrates using the thermal evaporation technique under a vacuum of 10⁻⁵ Torr, as discussed by the authors.

14 citations

Journal ArticleDOI
TL;DR: Random weighted singular value decomposition is based purely on random weights, an isometric matrix and orthogonal triangular decomposition, and all of these fragments enhance the security of the double random phase encoding cryptosystem.
Abstract: A new asymmetric encryption system for double random phase encoding, based on random weighted singular value decomposition and the fractional Hartley transform domain, has been proposed. Random weighted singular value decomposition is based purely on random weights, an isometric matrix and orthogonal triangular decomposition, and all of these fragments enhance the security of the double random phase encoding cryptosystem. Random weights and orthogonal triangular decomposition are considered the heart of this cryptosystem. The system operates in the fractional Hartley domain, where the fractional orders play a vital role. On the receiver side, the image can only be decrypted by someone who knows all three components, their multiplication order, and the fractional order of the fractional Hartley transform. The proposed cryptosystem is compared with singular value decomposition and truncated singular value decomposition. Like those methods, the proposed cryptosystem also yields three components; because of the random weights, however, these components differ greatly from traditional singular value decomposition and truncated singular value decomposition components. Analysis is offered to validate the feasibility of the scheme.
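The decomposition idea at the core of the abstract — that the data splits into three components and can only be recovered by recombining all of them in the right multiplication order — can be illustrated with a plain singular value decomposition. This is only a minimal sketch: it omits the random weighting, the orthogonal triangular step, and the fractional Hartley transform that the paper actually uses.

```python
import numpy as np

# Minimal illustration of the three-component idea: SVD splits an array
# into U, s, Vt, and recovering the original requires all three pieces
# multiplied in the correct order. The 8x8 random "image" is illustrative.

rng = np.random.default_rng(42)
image = rng.random((8, 8))

U, s, Vt = np.linalg.svd(image)       # the three components
reconstructed = U @ np.diag(s) @ Vt   # correct multiplication order

assert np.allclose(reconstructed, image)
```

Missing any one of the three factors, or multiplying them in the wrong order, leaves the original unrecoverable, which is the property the cryptosystem builds on.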

14 citations

Book ChapterDOI
01 Jan 2018
TL;DR: The aim of this research is to assist engineers in requirement prioritization by reducing time and hence minimizing decision-making efforts.
Abstract: Nowadays, software has become a part of greater significance in human life. The chief goal in today's environment is to prepare software that meets all the needs of stakeholders; as the complexity of software increases, the requirements grow too. Not all requirements can be fulfilled in the given time, so some must be considered first to reduce risk and make proper use of the software. In this way, gathering and prioritizing requirements can help in the successful development of the software. Requirement prioritization is one of the essential decision-making processes. Its main objective is to extract suitable facts from the client, describing the requirements by their attributes. The ultimate purpose is to minimize the user's decision-making effort and to increase the accuracy of the final requirements' ranking. A number of requirement prioritization techniques have been proposed to date, but they are not efficient performance-wise. This work presents a new technique that improves performance by using a least-squares-based random genetic algorithm. The aim of this research is to assist engineers in requirement prioritization by reducing time and hence minimizing decision-making effort. In the proposed work, as the number of requirement pairs increases, the distance increases only slightly, and the running time is also reduced. The initial population is selected either by analyzing some of the input partial orders or randomly. In future work, the genetic algorithm (GA) and value-oriented prioritization (VOP) can be integrated to take the benefit of both for better results.
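The paper's least-squares-based random GA is not detailed in the abstract, so the following is only a hedged sketch of the general idea: a genetic algorithm evolving a requirement ranking consistent with stakeholder pairwise preferences. The fitness function, swap-mutation operator, and all parameters below are illustrative assumptions, not the authors' method.

```python
import random

def fitness(order, prefs):
    # Number of satisfied pairwise preferences (a should rank before b).
    pos = {r: i for i, r in enumerate(order)}
    return sum(1 for a, b in prefs if pos[a] < pos[b])

def evolve(reqs, prefs, pop_size=30, gens=200, seed=0):
    rng = random.Random(seed)
    # Initial population: random permutations of the requirement set.
    pop = [rng.sample(reqs, len(reqs)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda o: fitness(o, prefs), reverse=True)
        survivors = pop[: pop_size // 2]            # elitist selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]  # swap mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda o: fitness(o, prefs))

reqs = ["R1", "R2", "R3", "R4"]
prefs = [("R1", "R2"), ("R2", "R3"), ("R3", "R4")]  # partial order from stakeholders
best = evolve(reqs, prefs)  # a ranking consistent with the preferences
```

Seeding the initial population from known partial orders, as the abstract suggests, would simply replace some of the random permutations above with orderings derived from the input preferences.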

13 citations

Proceedings Article
16 Mar 2016
TL;DR: Insight is given into how big data from healthcare can be analyzed using a predictive analytics method by exploiting the potential of the Hadoop/MapReduce tool.
Abstract: The healthcare industry has taken a move towards modernization and has stepped into creating electronic health records, which are generating massive data. Data generated from the healthcare industry is huge in volume and highly unstructured in nature; it is thus important to structure the data and leverage its actual potential. One of the major contributors to mortality and morbidity in developing countries like India is a non-communicable disease, chronic kidney failure. Early diagnosis of the disease becomes significantly important and needs more attention. This survey paper gives an overview of the research done in this area. Based on the survey, it is found that not much significant work has been done in this area. This paper also gives an insight into how big data from healthcare can be analyzed using a predictive analytics method by exploiting the potential of the Hadoop/MapReduce tool. The benefit of implementing this technique would be that the disease could be diagnosed at an early stage based on the various symptoms of the patient, helping patients get the right cure and care at the right time, which will lead to better health.
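The MapReduce pattern the paper points to can be sketched in plain Python rather than on a Hadoop cluster: a mapper emits key-value pairs per record, and a reducer aggregates them per key. The record layout and symptom names below are invented for illustration only.

```python
from collections import defaultdict
from itertools import chain

# Toy patient records standing in for unstructured electronic health data.
records = [
    {"patient": 1, "symptoms": ["fatigue", "swelling"]},
    {"patient": 2, "symptoms": ["fatigue", "nausea"]},
    {"patient": 3, "symptoms": ["swelling"]},
]

def mapper(record):
    # Emit (symptom, 1) pairs, as a Hadoop mapper would.
    return [(s, 1) for s in record["symptoms"]]

def reducer(pairs):
    # Sum the counts per key, as a Hadoop reducer would.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

symptom_counts = reducer(chain.from_iterable(mapper(r) for r in records))
# symptom_counts == {"fatigue": 2, "swelling": 2, "nausea": 1}
```

On a real cluster the mapper and reducer run in parallel over data partitions, which is what makes the pattern suitable for the massive, unstructured health records the abstract describes.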

13 citations

Journal ArticleDOI
TL;DR: This paper studies systems of two or more linear congruences, considering 2t prime numbers to construct t shareholders, splitting the secret S into t parts, with all t shares needed to reconstruct the secret using the CRT.
Abstract: In recent years, Chinese remainder theorem (CRT)-based function sharing schemes have been proposed in the literature. In this paper, we study systems of two or more linear congruences. When the moduli are pairwise coprime, the main theorem is known as the CRT, because special cases of the theorem were known to the ancient Chinese. In modern algebra the CRT is a powerful tool in a variety of applications, such as cryptography, error control coding, fault-tolerant systems and certain aspects of signal processing. Threshold schemes enable a group of users to share a secret by providing each user with a share. A scheme has threshold t+1 if any subset of t+1 shares enables the secret to be recovered. In this paper, we consider 2t prime numbers to construct t shareholders. Using the t shareholders, we split the secret S into t parts; all t shares are needed to reconstruct the secret using the CRT.
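The t-of-t splitting idea described above can be sketched directly from the CRT: each shareholder receives the secret reduced modulo one pairwise-coprime prime, and the secret is recovered by solving the resulting system of congruences. The specific primes and function names below are illustrative assumptions, not the paper's exact construction.

```python
def split_secret(secret, moduli):
    # Each shareholder receives the secret reduced modulo one prime.
    return [secret % m for m in moduli]

def crt_reconstruct(shares, moduli):
    # Recover the unique x < prod(moduli) with x ≡ shares[i] (mod moduli[i]).
    M = 1
    for m in moduli:
        M *= m
    x = 0
    for s, m in zip(shares, moduli):
        Mi = M // m
        x += s * Mi * pow(Mi, -1, m)  # pow(a, -1, m): modular inverse (Python 3.8+)
    return x % M

moduli = [101, 103, 107]   # pairwise coprime; their product must exceed the secret
secret = 123456
shares = split_secret(secret, moduli)
assert crt_reconstruct(shares, moduli) == secret
```

Dropping any single share leaves the secret undetermined modulo that prime, which is why all t shares are required for reconstruction.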

13 citations


Authors

Showing all 763 results

Name                 H-index   Papers   Citations
S. K. Maurya         37        121      3488
Prem Vrat            33        69       4894
Kehar Singh          30        197      4555
Stefan Fischer       30        198      4477
Abhishek Jain        29        120      3556
Prabhata K. Swamee   29        150      3278
R. C. Mittal         28        107      2456
Ram Kumar Sharma     25        129      2243
Pramila Goyal        23        52       1524
B. K. Das            22        100      1879
Divya Agarwal        22        198      2020
Yugal Kumar          20        70       847
Sudheer Ch           20        30       1336
Amparo Borrell       20        87       1155
Anil Kumar Yadav     19        54       1145
Network Information
Related Institutions (5)
Amity University
12.7K papers, 86K citations

87% related

Motilal Nehru National Institute of Technology Allahabad
5K papers, 61.8K citations

85% related

Thapar University
8.5K papers, 130.3K citations

84% related

National Institute of Technology, Karnataka
7K papers, 70.3K citations

84% related

National Institute of Technology, Rourkela
10.7K papers, 150.1K citations

84% related

Performance Metrics
No. of papers from the Institution in previous years
Year   Papers
2023   7
2022   21
2021   115
2020   111
2019   140
2018   130