Author

Rinkaj Goyal

Bio: Rinkaj Goyal is an academic researcher from Guru Gobind Singh Indraprastha University. The author has contributed to research in the topics of Cloud computing and Software quality. The author has an h-index of 6 and has co-authored 31 publications receiving 204 citations.

Papers
Journal ArticleDOI
08 Aug 2016
TL;DR: In this article, the authors use agent-based modeling to analyze five types of interaction (including ideal cases) through an integrated selection process, empirically investigating how control factors such as the interaction procedure, the possibility of mutual exchange, and the timing of information updates shape opinion formation in such a society.
Abstract: Communication and the sharing of opinions play a crucial role in shaping the views of a person in a society. Interactions with other people enable a person to interpret their views and expound his or her own opinion. Ordinarily, people tend to change their opinions in compliance with those having significantly higher expertise, thereby leading to a bipartite society of two intellectual groups, i.e., mavens (highly intellectual and confident people) and laypeople (diffident people with little or no experience and knowledge). However, the sharing of information in a group is influenced by the weight of advice with which people consider the opinions of others and by several control factors, such as the interaction procedure adopted, the possibility of mutual exchange of information, and the time at which information is updated. Moreover, the effects of these factors are observable in both physical and digital societies during opinion formation. This study builds upon the prior work of Moussaïd et al. (PLoS ONE 8:e78433, 2013). In this study, we use agent-based modeling to analyze five types of interaction (including ideal cases) through an integrated selection process to empirically investigate the influence of the aforementioned control factors in such a society. Through the simulations, we identify the minimum number of iterations required to reach an agreement in such a group of people and the critical proportion of the respective group needed to become observable in the opinion formation under different scenarios. We observe that increasing the weight of advice has a positive effect on the quality of the consensus reached as well as on the speed of convergence of the crowd towards an opinion. Furthermore, the interaction procedure adopted plays a dominant role in demarcating the critical proportions of the groups that dominate the consensus.
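To make the weight-of-advice mechanism concrete, the following minimal Python sketch lets a less confident agent move its opinion toward a more confident partner's opinion by a fixed weight of advice. The population split, update rule, and all parameter values are illustrative assumptions, not the exact model used in the paper.

import random

# Illustrative sketch of a weight-of-advice opinion update in a mixed
# population of mavens and laypeople. The update rule and parameters are
# assumptions for demonstration, not the paper's exact model.

class Agent:
    def __init__(self, is_maven):
        self.is_maven = is_maven
        self.opinion = random.uniform(0, 1)         # current opinion in [0, 1]
        self.confidence = 0.9 if is_maven else 0.3  # mavens are more confident

def interact(a, b, weight_of_advice=0.5):
    """One-way update: the less confident agent moves toward the other."""
    listener, speaker = (a, b) if a.confidence < b.confidence else (b, a)
    listener.opinion += weight_of_advice * (speaker.opinion - listener.opinion)

def simulate(n_agents=100, maven_fraction=0.2, iterations=1000):
    agents = [Agent(is_maven=(i < n_agents * maven_fraction))
              for i in range(n_agents)]
    for _ in range(iterations):
        a, b = random.sample(agents, 2)   # random pairwise interaction
        interact(a, b)
    return sum(ag.opinion for ag in agents) / n_agents  # mean opinion

if __name__ == "__main__":
    print("mean opinion after simulation:", simulate())

Sweeping weight_of_advice and maven_fraction in a sketch like this gives a rough feel for how consensus quality and convergence speed respond to such controls.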
Book ChapterDOI
01 Jan 2023
TL;DR: In this paper, an alternative classification of ECA rules using information-theoretic measures is presented for the 88 minimal representative ECA rules, using entropy-time diagrams (the variation of BiEntropy values with time) to compare the behavior of ECAs.
Abstract: This study demonstrates an alternative classification of ECA rules using information-theoretic measures applied to the 88 minimal representatives of the ECA rules. It proposes using entropy-time diagrams (the variation of BiEntropy values with time) to compare the behavior of ECAs, where two configurations may correspond to approximately the same information content despite visual differences in their space-time diagrams. The genotype of an ECA rule, perceived as the amount of information processed, is captured through four proposed measures, i.e., DiffEntropy (DE), SimConfigOrdered (SCO), SimConfigImmediate (SCI), and SimConfigFluctuation (SCF). By clustering the temporal sequences of entropy values of different rules using dynamic time warping (DTW), a Genealogy Interceded Phenotypic Analysis (GIPA) of the 88 ECA rules is proposed. This study is restricted to synchronous ECAs with periodic boundary conditions.
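As a rough illustration of how such an entropy-time series can be produced, the Python sketch below evolves a synchronous ECA with periodic boundary conditions and records an entropy value per time step. Plain Shannon entropy of the cell distribution is used here as a simple stand-in for the BiEntropy measure, and the rule number, lattice width, and step count are arbitrary choices for demonstration.

import math
import random

# Sketch: evolve an elementary cellular automaton (ECA) with periodic
# boundaries and record a per-step entropy value, yielding an entropy-time
# series. Shannon entropy of the cell distribution is a stand-in for the
# BiEntropy measure discussed in the paper.

def eca_step(cells, rule):
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right
        out.append((rule >> neighborhood) & 1)   # look up the rule's output bit
    return out

def shannon_entropy(cells):
    p1 = sum(cells) / len(cells)
    if p1 in (0.0, 1.0):
        return 0.0
    p0 = 1.0 - p1
    return -(p0 * math.log2(p0) + p1 * math.log2(p1))

def entropy_time_series(rule, width=101, steps=200):
    cells = [random.randint(0, 1) for _ in range(width)]
    series = []
    for _ in range(steps):
        series.append(shannon_entropy(cells))
        cells = eca_step(cells, rule)
    return series

if __name__ == "__main__":
    print(entropy_time_series(rule=110, steps=10))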
01 Jan 2015
TL;DR: In this article, the authors investigate the use of a customized discrete firefly algorithm (DFA) for ordering disk requests to minimize the total access time; the procedure simulates the movement of each firefly within a population towards others using a variation of edge-based mutation.
Abstract: This study empirically investigates the use of a customized discrete firefly algorithm (DFA) for ordering disk requests to minimize the total access time. The procedure simulates the movement of each firefly within a population towards others using a variation of edge-based mutation. The algorithm was applied to randomized standard disk sequences with varying lengths of input disk requests. Owing to the greater impact of seek time in the determination of access time for a disk with a sizable number of tracks, seek time has been taken as the primary performance factor in the scheduling of tasks. The analysis of the results obtained establishes the relative advantage of using the firefly optimization method over traditional disk scheduling algorithms.
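The sketch below gives a rough idea of a firefly-style permutation search for this problem: brightness is the negative total seek distance, and dimmer fireflies copy part of the brightest ordering and apply a small swap mutation. The operators and parameters are illustrative assumptions, not the customized edge-based DFA described in the paper.

import random

# Firefly-style search over request orderings. The shortest-seek ordering is
# treated as the brightest firefly; dimmer fireflies adopt a prefix of it and
# mutate slightly. Illustrative only; duplicates in the request list are not handled.

def seek_distance(order, head=50):
    total, pos = 0, head
    for track in order:
        total += abs(track - pos)
        pos = track
    return total

def mutate(order):
    a, b = random.sample(range(len(order)), 2)
    order = order[:]
    order[a], order[b] = order[b], order[a]   # swap two requests
    return order

def firefly_schedule(requests, n_fireflies=20, iterations=200):
    population = [random.sample(requests, len(requests)) for _ in range(n_fireflies)]
    for _ in range(iterations):
        population.sort(key=seek_distance)    # brightest (shortest seek) first
        best = population[0]
        for i in range(1, n_fireflies):
            cut = random.randint(1, len(requests) - 1)
            prefix = best[:cut]
            rest = [r for r in population[i] if r not in prefix]
            population[i] = mutate(prefix + rest)
    return min(population, key=seek_distance)

if __name__ == "__main__":
    reqs = [98, 183, 37, 122, 14, 124, 65, 67]
    order = firefly_schedule(reqs)
    print(order, seek_distance(order))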

Cited by
01 Jan 2012

3,692 citations

Posted Content
TL;DR: This paper defines and explores proofs of retrievability (PORs): a POR scheme enables an archive or back-up service to produce a concise proof that a user can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety.
Abstract: In this paper, we define and explore proofs of retrievability (PORs). A POR scheme enables an archive or back-up service (prover) to produce a concise proof that a user (verifier) can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety. A POR may be viewed as a kind of cryptographic proof of knowledge (POK), but one specially designed to handle a large file (or bitstring) F. We explore POR protocols here in which the communication costs, number of memory accesses for the prover, and storage requirements of the user (verifier) are small parameters essentially independent of the length of F. In addition to proposing new, practical POR constructions, we explore implementation considerations and optimizations that bear on previously explored, related schemes. In a POR, unlike a POK, neither the prover nor the verifier need actually have knowledge of F. PORs give rise to a new and unusual security definition whose formulation is another contribution of our work. We view PORs as an important tool for semi-trusted online archives. Existing cryptographic techniques help users ensure the privacy and integrity of files they retrieve. It is also natural, however, for users to want to verify that archives do not delete or modify files prior to retrieval. The goal of a POR is to accomplish these checks without users having to download the files themselves. A POR can also provide quality-of-service guarantees, i.e., show that a file is retrievable within a certain time bound.
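To convey the spot-checking intuition behind such schemes (the paper's actual constructions are more sophisticated and keep verifier state far smaller), the following toy Python sketch has the verifier keep keyed MACs of a few randomly sampled file blocks and later challenge the prover to return exactly those blocks. The block size, sample count, and all function names are illustrative assumptions.

import hashlib
import hmac
import os
import random

# Toy spot-check in the spirit of PORs: the verifier remembers keyed MACs of a
# few sampled blocks and later challenges the prover on those indices.
# Not the cryptographic POR constructions analyzed in the paper.

BLOCK_SIZE = 4096

def split_blocks(data):
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def prepare(data, key, n_checks=8):
    """Verifier side: store MACs of randomly sampled block indices."""
    blocks = split_blocks(data)
    indices = random.sample(range(len(blocks)), min(n_checks, len(blocks)))
    return {i: hmac.new(key, blocks[i], hashlib.sha256).digest() for i in indices}

def respond(data, challenge_indices):
    """Prover side: return the requested blocks from stored data."""
    blocks = split_blocks(data)
    return {i: blocks[i] for i in challenge_indices}

def verify(response, tags, key):
    return all(hmac.compare_digest(hmac.new(key, block, hashlib.sha256).digest(), tags[i])
               for i, block in response.items())

if __name__ == "__main__":
    key = os.urandom(32)
    original = os.urandom(10 * BLOCK_SIZE)
    tags = prepare(original, key)
    assert verify(respond(original, tags.keys()), tags, key)   # intact file passes
    i = next(iter(tags))                        # corrupt one of the checked blocks
    corrupted = bytearray(original)
    corrupted[i * BLOCK_SIZE] ^= 1
    print("corrupted file passes:", verify(respond(bytes(corrupted), tags.keys()), tags, key))

Detection in such a spot check is probabilistic: corruption is caught only if it touches a challenged block, which is why real POR schemes add encoding and careful sampling arguments.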

1,783 citations

01 Jun 2008
TL;DR: This work discusses designing and developing agent-based models, building the Collectivities model step by step, and reports on advances in agent-based modeling.
Abstract (table of contents): Series Editor's Introduction. Preface. Acknowledgments.
1. The Idea of Agent-Based Modeling: 1.1 Agent-Based Modeling; 1.2 Some Examples; 1.3 The Features of Agent-Based Modeling; 1.4 Other Related Modeling Approaches
2. Agents, Environments, and Timescales: 2.1 Agents; 2.2 Environments; 2.3 Randomness; 2.4 Time
3. Using Agent-Based Models in Social Science Research: 3.1 An Example of Developing an Agent-Based Model; 3.2 Verification: Getting Rid of the Bugs; 3.3 Validation; 3.4 Techniques for Validation; 3.5 Summary
4. Designing and Developing Agent-Based Models: 4.1 Modeling Toolkits, Libraries, Languages, Frameworks, and Environments; 4.2 Using NetLogo to Build Models; 4.3 Building the Collectivities Model Step by Step; 4.4 Planning an Agent-Based Model Project; 4.5 Reporting Agent-Based Model Research; 4.6 Summary
5. Advances in Agent-Based Modeling: 5.1 Geographical Information Systems; 5.2 Learning; 5.3 Simulating Language
Resources. Glossary. References. Index. About the Author.

473 citations

Journal ArticleDOI
TL;DR: The purpose of this paper is to identify and discuss the main issues involved in the complex process of IoT-based investigations, particularly the legal, privacy, and cloud security challenges, as well as some promising cross-cutting data reduction and forensics intelligence techniques.
Abstract: Today is the era of the Internet of Things (IoT). Recent advances in hardware and information technology have accelerated the deployment of billions of interconnected, smart, and adaptive devices in critical infrastructures such as health, transportation, environmental control, and home automation. Transferring data over a network without requiring any kind of human-to-computer or human-to-human interaction brings reliability and convenience to consumers, but it also opens a new world of opportunity for intruders and introduces a whole set of unique and complicated questions to the field of Digital Forensics. Although IoT data could be a rich source of evidence, forensics professionals must cope with diverse problems, ranging from the huge variety of IoT devices and non-standard data formats to multi-tenant cloud infrastructure and the resulting multi-jurisdictional litigation. A further challenge is end-to-end encryption, which represents a trade-off between users' right to privacy and the success of a forensics investigation. Due to its volatile nature, digital evidence has to be acquired and analyzed using validated tools and techniques that ensure the maintenance of the Chain of Custody. Therefore, the purpose of this paper is to identify and discuss the main issues involved in the complex process of IoT-based investigations, particularly the legal, privacy, and cloud security challenges. Furthermore, this work provides an overview of past and current theoretical models in digital forensics. Special attention is paid to frameworks that aim to extract data in a privacy-preserving manner or to secure evidence integrity using decentralized blockchain-based solutions. In addition, the present paper addresses the ongoing Forensics-as-a-Service (FaaS) paradigm, as well as some promising cross-cutting data reduction and forensics intelligence techniques. Finally, several other research trends and open issues are presented, with emphasis on the need for proactive Forensics Readiness strategies and generally agreed-upon standards.
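As a rough, hypothetical illustration of the blockchain-inspired evidence-integrity idea mentioned above, the Python sketch below hash-chains chain-of-custody records so that any later modification of an entry is detectable. All record fields and function names are assumptions made for the example, not a scheme from the surveyed literature.

import hashlib
import json
import time

# Toy tamper-evident chain of custody: each acquisition record includes the
# hash of the previous record, so modifying any entry breaks the chain.
# A minimal stand-in for the blockchain-based approaches surveyed in the paper.

def record_entry(chain, evidence_bytes, examiner):
    prev_hash = chain[-1]["entry_hash"] if chain else "0" * 64
    entry = {
        "timestamp": time.time(),
        "examiner": examiner,
        "evidence_sha256": hashlib.sha256(evidence_bytes).hexdigest(),
        "prev_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    chain.append(entry)
    return entry

def chain_is_intact(chain):
    prev_hash = "0" * 64
    for entry in chain:
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        if entry["prev_hash"] != prev_hash:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["entry_hash"]:
            return False
        prev_hash = entry["entry_hash"]
    return True

if __name__ == "__main__":
    chain = []
    record_entry(chain, b"disk image from IoT hub", examiner="analyst-1")
    record_entry(chain, b"network capture", examiner="analyst-2")
    print("chain intact:", chain_is_intact(chain))
    chain[0]["examiner"] = "someone-else"       # tampering is detected
    print("chain intact after tampering:", chain_is_intact(chain))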

440 citations