scispace - formally typeset
Author

Rourab Paul

Bio: Rourab Paul is an academic researcher from Siksha O Anusandhan University. The author has contributed to research in topics: Field-programmable gate array & Encryption. The author has an h-index of 6 and has co-authored 43 publications receiving 155 citations. Previous affiliations of Rourab Paul include Indian Institute of Technology Kanpur & Information Technology University.

Papers published on a yearly basis

Papers
Journal ArticleDOI
TL;DR: An overview of the CRU architecture in ALICE is given, and the different interfaces are discussed, along with the firmware design and implementation of the CRU on the LHCb PCIe40 board.
Abstract: The ALICE experiment at the CERN Large Hadron Collider (LHC) is presently undergoing a major upgrade in order to fully exploit the scientific potential of the upcoming high-luminosity run, scheduled to start in the year 2021. The high interaction rate and the large event size will result in an experimental data flow of about 1 TB/s from the detectors, which needs to be processed before being sent to the online computing system and data storage. This processing is done in a dedicated Common Readout Unit (CRU), proposed for data aggregation, trigger and timing distribution, and control moderation. It acts as a common interface between sub-detector electronic systems, the computing system, and trigger processors. The interface links include GBT, TTC-PON and PCIe. GBT (Gigabit Transceiver) is used for detector data payload transmission and as a fixed-latency path for trigger distribution between the CRU and detector readout electronics. TTC-PON (Timing, Trigger and Control via Passive Optical Network) is employed for time-multiplexed trigger distribution between the CRU and the Central Trigger Processor (CTP). PCIe (Peripheral Component Interconnect Express) is the high-speed serial computer expansion bus standard used for bulk data transport between CRU boards and processors. In this article, we give an overview of the CRU architecture in ALICE and discuss the different interfaces, along with the firmware design and implementation of the CRU on the LHCb PCIe40 board.

30 citations

Proceedings ArticleDOI
25 May 2019
TL;DR: In this work, some machine learning approaches that have proved their mettle in outlier detection are discussed, along with their importance in the Internet of Things framework and Wireless Sensor Networks.
Abstract: Outlier or anomaly detection in the sensed data of Internet of Things frameworks and Wireless Sensor Networks is a growing trend among researchers. Wireless Sensor Networks form the basis of the Internet of Things framework, in which the sensors sense a huge amount of data on the basis of which certain actions or decisions are taken. So the quality of the data must be thoroughly checked, as any kind of outlier may degrade its quality and hence affect the final decision. Thus, it becomes imperative to maintain the quality of the data. In this work, some machine learning approaches are discussed which have proved their mettle in outlier detection.
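Among the simplest of such approaches is deviation-based detection. The following is a minimal, hypothetical sketch (the readings and the z-score threshold are illustrative and not taken from the paper):

```python
# Minimal z-score outlier detector for a batch of sensor readings.
# Readings more than `threshold` standard deviations from the mean are flagged.

def detect_outliers(readings, threshold=2.0):
    """Return the readings whose z-score exceeds the threshold."""
    n = len(readings)
    mean = sum(readings) / n
    var = sum((x - mean) ** 2 for x in readings) / n
    std = var ** 0.5
    if std == 0:
        return []  # all readings identical: nothing to flag
    return [x for x in readings if abs(x - mean) / std > threshold]

# Illustrative temperature stream with one faulty spike.
temps = [21.3, 21.5, 21.4, 21.6, 21.2, 85.0, 21.5, 21.4]
print(detect_outliers(temps))  # → [85.0]
```

With small windows a single extreme value inflates the standard deviation, which is why a lower threshold (2.0 rather than the textbook 3.0) is used here; production systems typically use robust statistics or learned models instead.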

23 citations

Journal ArticleDOI
TL;DR: This work adopts the Dempster–Shafer theory of evidence, a popular learning method, to collate the information from sensors and reach a decision regarding the faulty status of a sensor node; simulations verify the validity of the proposed method.
Abstract: Fault detection in sensor nodes is a pertinent issue that has been an important area of research for a very long time, but it has not yet been explored much in the context of the Internet of Things. The Internet of Things works with a massive amount of data, so the responsibility for guaranteeing the accuracy of that data also lies with it. Moreover, a lot of important and critical decisions are made based on these data, so ensuring their correctness and accuracy is very important. Also, the detection needs to be as precise as possible to avoid false alerts. For this purpose, this work adopts the Dempster–Shafer theory of evidence to collate the information from sensors and reach a decision regarding the faulty status of a sensor node from a data-centric perspective. To verify the validity of the proposed method, simulations have been performed on a benchmark data set and on data collected through a test bed in a laboratory set-up. For the different types of faults, the proposed method shows very high accuracy on both the benchmark (99.8%) and laboratory (99.9%) data sets when compared to other state-of-the-art machine learning techniques.
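The core of Dempster–Shafer evidence fusion is Dempster's rule of combination, which merges the basic mass assignments of independent sources and renormalizes away conflicting mass. A minimal sketch over a two-hypothesis frame (the mass values are illustrative, not the paper's):

```python
# Dempster's rule of combination over the frame {"faulty", "healthy"}.
# Masses may also sit on the whole frame, representing ignorance.

from itertools import product

def combine(m1, m2):
    """Combine two basic mass assignments (dicts of frozenset -> mass)."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # incompatible evidence
    # Renormalize by the non-conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

F = frozenset({"faulty"})
H = frozenset({"healthy"})
E = F | H  # total ignorance

# Illustrative evidence from two sensors observing the same node.
sensor1 = {F: 0.7, H: 0.1, E: 0.2}
sensor2 = {F: 0.6, H: 0.2, E: 0.2}
fused = combine(sensor1, sensor2)
print(round(fused[F], 3))  # combined belief that the node is faulty
```

Note that two weakly conflicting sources reinforce each other: the fused mass on "faulty" (0.85) exceeds either sensor's individual mass.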

22 citations

Proceedings ArticleDOI
01 Sep 2018
TL;DR: An IoT-based smart-city architecture that adopts blockchain technology while preserving the required cryptographic security properties is proposed; a comparison of all security parameters with the existing literature shows that the architecture is reasonably efficient in terms of security.
Abstract: Standard security protocols are heavyweight in terms of memory footprint, which makes them unfit for budgeted platforms such as the Internet of Things (IoT). The blockchain (BC) is a very efficient architecture for preserving five basic cryptographic properties: authenticity, integrity, confidentiality, availability and non-repudiation. Conventional adoption of blockchain in IoT causes significant energy consumption, delay, and computational overhead, which is not suitable for various resource-constrained IoT devices. In our submission we change the basic architecture of the blockchain and make it more efficient for IoT applications. The article proposes an IoT-based smart-city architecture which adopts blockchain technology while preserving all the cryptographic security properties. The adoption of blockchain causes very minimal overhead on the IoT platform. Comparison of all security parameters with the existing literature shows that the architecture is reasonably efficient in terms of security.
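The integrity guarantee blockchain brings to IoT data comes from hash-linking each block to its predecessor, so any tampering with stored data invalidates the chain. A generic minimal sketch (this is the textbook structure, not the modified architecture proposed in the paper):

```python
# Minimal hash-linked chain: each block stores the SHA-256 hash of the
# previous block, so altering any stored reading breaks validation.

import hashlib
import json

def block_hash(block):
    """SHA-256 over the block's canonical JSON form."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(index, data, prev_hash):
    return {"index": index, "data": data, "prev_hash": prev_hash}

def valid_chain(chain):
    """Each block must reference the hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

genesis = make_block(0, "genesis", "0" * 64)
chain = [genesis]
chain.append(make_block(1, {"sensor": "s1", "temp": 21.4}, block_hash(chain[-1])))
chain.append(make_block(2, {"sensor": "s2", "temp": 21.6}, block_hash(chain[-1])))

print(valid_chain(chain))        # True
chain[1]["data"]["temp"] = 99.9  # tamper with a stored reading
print(valid_chain(chain))        # False
```

The heavyweight parts the paper targets (consensus, mining) sit on top of this linking; the link structure itself costs only one hash per block, which is why it survives even in lightweight IoT adaptations.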

18 citations

Posted Content
TL;DR: The principal strategy of the NIST Statistical Test Suite is to judge the statistical randomness of random bit generating algorithms.
Abstract: The NIST Statistical Test Suite comprises 15 tests. Its principal strategy is to judge the statistical randomness of random bit generating algorithms. Based on 300 to 500 different keys, the algorithm under test generates a series (an even number) of different long random sequences of n bits, with n varying between 1.3 and 1.5 million, each of which is subjected to the 15 tests. Each test computes a specific statistic for bit sequences under the assumption of randomness and calculates the deviation of that statistic across the series of tested bit sequences.
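The first of the suite's 15 tests, the frequency (monobit) test from NIST SP 800-22, illustrates the pattern: compute a statistic under the randomness assumption and derive a p-value, where p >= 0.01 is the usual pass level.

```python
# NIST SP 800-22 frequency (monobit) test: under the randomness
# hypothesis, the numbers of ones and zeros should be nearly equal.

import math

def monobit_p_value(bits):
    """Return the monobit test p-value for a sequence of 0/1 bits."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)   # +1 for each 1, -1 for each 0
    s_obs = abs(s) / math.sqrt(n)           # normalized partial sum
    return math.erfc(s_obs / math.sqrt(2))  # two-sided p-value

balanced = [0, 1] * 500          # perfectly balanced 1000-bit sequence
biased = [1] * 900 + [0] * 100   # heavily biased sequence

print(monobit_p_value(balanced) >= 0.01)  # True  (passes)
print(monobit_p_value(biased) >= 0.01)    # False (fails)
```

The other 14 tests follow the same shape with different statistics (runs, longest run, spectral, template matches, and so on), each reducing a bit sequence to a p-value.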

15 citations


Cited by
Journal ArticleDOI


08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

01 Apr 1997
TL;DR: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind, emphasizing the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity.
Abstract: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind. The emphasis is on the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity. Topics covered include an introduction to the concepts of cryptography, attacks against cryptographic systems, key use and handling, random bit generation, encryption modes, and message authentication codes. Recommendations on algorithms and further reading are given at the end of the paper. This paper should enable the reader to build, understand and evaluate system descriptions and designs based on the cryptographic components described in the paper.
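As an example of the message authentication codes mentioned, a MAC lets a receiver with a shared key verify both the integrity and the authenticity of a message. A minimal sketch using HMAC-SHA256 from the Python standard library (the key and message are illustrative):

```python
# HMAC-SHA256: a keyed tag that detects both accidental corruption
# (integrity) and forgery by anyone without the key (authenticity).

import hashlib
import hmac

key = b"shared-secret-key"                       # illustrative key
message = b"transfer 100 units to account 42"    # illustrative message

# Sender computes the tag and transmits (message, tag).
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key, message, tag):
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))                          # True
print(verify(key, b"transfer 999 units to acct 42", tag)) # False
```

`hmac.compare_digest` is used instead of `==` to avoid timing side channels during tag comparison.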

2,188 citations

Journal Article
TL;DR: DES is based on a block cipher ("Lucipher") developed by Horst Feistel at IBM with a key length of 128 bits; the effective 56-bit key length has become a security risk, and in 1998 a special-purpose machine developed by the "Electronic Frontier Foundation" (EFF), with 1,800 purpose-built crypto processors working in parallel, recovered a DES key in record time.
Abstract: In 1977 the "Data Encryption Algorithm" (DEA) was declared the American encryption standard for federal agencies by the "National Bureau of Standards" (NBS, later the "National Institute of Standards and Technology", NIST) [NBS_77]. In 1981 the DEA specification was adopted as the ANSI standard "DES" [ANSI_81]. The recommendation of DES as the standard encryption method was limited to five years and was extended for a further five years in each of 1983, 1988 and 1993. A revised version of the NIST standard is currently available [NIST_99], under which DES is to remain provisionally approved for another five years, but the use of Triple-DES is recommended: a threefold application of DES with three different keys (effective key length: 168 bits) [NIST_99]. DES is based on a block cipher ("Lucipher") with a key length of 128 bits, developed by Horst Feistel at IBM. Because the American "National Security Agency" (NSA) had ensured that DES received a key length of only 64 bits, of which only 56 bits are relevant, as well as special substitution boxes (the "cryptographic core" of the method) whose design criteria the NSA did not publish, the method was controversial from the start. Critics assumed there was a secret "trapdoor" in the method that would allow the NSA to decrypt traffic online even without knowledge of the key. Although this suspicion could not be substantiated, both the growth of computing power and the parallelization of search algorithms today make a key length of 56 bits a security risk. Most recently, in 1998, a DES key was found in a record time of 2.5 days by a special-purpose machine developed by the "Electronic Frontier Foundation" (EFF) with 1,800 purpose-built crypto processors working in parallel. To find a successor to DES, NIST announced the search for an "Advanced Encryption Standard" (AES) on 2 January 1997. The goal of this initiative is to find, in close cooperation with research and industry, a symmetric encryption method suitable for effectively encrypting American government data well into the 21st century. To this end, an official "Call for Algorithms" was issued on 12 September 1997. The proposed symmetric encryption algorithms had to meet the following requirements: unclassified and published, available royalty-free worldwide, efficiently implementable in hardware and software, and block ciphers with a block length of 128 bits supporting key lengths of 128, 192 and 256 bits. At the first "AES Candidate Conference" (AES1), NIST published a list of 15 proposed algorithms on 20 August 1998 and invited the expert community to analyze them. The results were presented at the second "AES Candidate Conference" (22-23 March 1999 in Rome, AES2) and discussed among international cryptologists. The comment period ended on 15 April 1999. On the basis of the comments and analyses received, NIST selected five candidates, which it announced publicly on 9 August 1999: MARS (IBM), RC6 (RSA Lab.), Rijndael (Daemen, Rijmen), Serpent (Anderson, Biham, Knudsen) and Twofish (Schneier, Kelsey, Whiting, Wagner, Hall, Ferguson).
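A back-of-the-envelope check of why 56 bits became a security risk: assuming the EFF machine searched, on average, half the key space during its 2.5-day record, its aggregate search rate follows directly (this is a hypothetical calculation consistent with the figures above, not data from the text):

```python
# Arithmetic behind the 56-bit DES brute-force: key-space size and the
# aggregate search rate implied by the EFF machine's 2.5-day record,
# assuming half the key space is tried before a hit on average.

keyspace = 2 ** 56                 # possible DES keys
seconds = 2.5 * 24 * 3600          # 2.5 days in seconds
rate = (keyspace / 2) / seconds    # implied keys tested per second

print(f"{keyspace:,} possible keys")
print(f"~{rate:.2e} keys/s aggregate search rate")
```

At roughly 10^11 keys per second across 1,800 processors, each chip only needed on the order of 10^8 trial decryptions per second, well within late-1990s ASIC capability.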

624 citations

Journal ArticleDOI
TL;DR: A use case of fully autonomous driving is presented to show how 6G supports massive IoT, and some breakthrough technologies in 6G, such as machine learning and blockchain, are introduced, with the motivations, applications, and open issues of these technologies for massive IoT summarized.
Abstract: Nowadays, many disruptive Internet-of-Things (IoT) applications emerge, such as augmented/virtual reality online games, autonomous driving, and smart everything, which are massive in number, data intensive, computation intensive, and delay sensitive. Due to the mismatch between the fifth generation (5G) and the requirements of such massive IoT-enabled applications, there is a need for technological advancements and evolutions for wireless communications and networking toward the sixth-generation (6G) networks. 6G is expected to deliver extended 5G capabilities at a very high level, such as Tbps data rate, sub-ms latency, cm-level localization, and so on, which will play a significant role in supporting massive IoT devices to operate seamlessly with highly diverse service requirements. Motivated by the aforementioned facts, in this article, we present a comprehensive survey on 6G-enabled massive IoT. First, we present the drivers and requirements by summarizing the emerging IoT-enabled applications and the corresponding requirements, along with the limitations of 5G. Second, visions of 6G are provided in terms of core technical requirements, use cases, and trends. Third, a new network architecture provided by 6G to enable massive IoT is introduced, i.e., space–air–ground–underwater/sea networks enhanced by edge computing. Fourth, some breakthrough technologies, such as machine learning and blockchain, in 6G are introduced, where the motivations, applications, and open issues of these technologies for massive IoT are summarized. Finally, a use case of fully autonomous driving is presented to show 6G supports massive IoT.

263 citations

Journal ArticleDOI
TL;DR: A state-of-the-art survey on the integration of blockchain with 5G networks and beyond, including discussions of the potential of blockchain for enabling key 5G technologies such as cloud/edge computing, Software Defined Networks, Network Function Virtualization, Network Slicing, and D2D communications.

244 citations