Author

Somnath Paul

Bio: Somnath Paul is an academic researcher from Intel. The author has contributed to research in the topics of Reconfigurable Computing and Computing with Memory. The author has an h-index of 21 and has co-authored 104 publications receiving 2,092 citations. Previous affiliations of Somnath Paul include Texas A&M University and Case Western Reserve University.


Papers
Book Chapter
30 Aug 2009
TL;DR: A test pattern generation technique based on multiple excitation of rare logic conditions at internal nodes that maximizes the probability of inserted Trojans getting triggered and detected by logic testing, while drastically reducing the number of vectors compared to weighted random pattern-based test generation.
Abstract: In order to ensure trusted in-field operation of integrated circuits, it is important to develop efficient low-cost techniques to detect malicious tampering (also referred to as Hardware Trojan) that causes undesired change in functional behavior. Conventional post-manufacturing testing, test generation algorithms and test coverage metrics cannot be readily extended to hardware Trojan detection. In this paper, we propose a test pattern generation technique based on multiple excitation of rare logic conditions at internal nodes. Such a statistical approach maximizes the probability of inserted Trojans getting triggered and detected by logic testing, while drastically reducing the number of vectors compared to a weighted random pattern based test generation. Moreover, the proposed test generation approach can be effective towards increasing the sensitivity of Trojan detection in existing side-channel approaches that monitor the impact of a Trojan circuit on power or current signature. Simulation results for a set of ISCAS benchmarks show that the proposed test generation approach can achieve comparable or better Trojan detection coverage with about 85% reduction in test length on average over random patterns.
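The statistical test-generation idea described above can be pictured with a small sketch (a toy illustration, not the authors' implementation): estimate which internal node values are rare under random stimulus, then greedily keep only those random vectors that excite each rare node value several times. The netlist, rareness threshold, and N-detect target below are all made up for the example.

```python
import random

random.seed(0)

# Toy combinational netlist: each internal node is a function of the primary
# inputs.  This stands in for a gate-level netlist; it is not a real benchmark.
NETLIST = {
    "g1": lambda v: v["a"] & v["b"],
    "g2": lambda v: v["c"] | v["d"],
    "g3": lambda v: (v["a"] & v["b"]) & (v["c"] | v["d"]),   # 1 is rare here
    "g4": lambda v: ~(v["a"] | v["b"] | v["c"]) & 1,         # 1 is rare here
}
INPUTS = ["a", "b", "c", "d"]

def simulate(vec):
    """Evaluate every internal node for one input vector."""
    return {g: f(vec) for g, f in NETLIST.items()}

def rare_values(sample_size=4000, theta=0.22):
    """Estimate signal probabilities with random vectors and return the
    (node, value) pairs whose occurrence probability falls below theta."""
    ones = {g: 0 for g in NETLIST}
    for _ in range(sample_size):
        vec = {i: random.randint(0, 1) for i in INPUTS}
        for g, v in simulate(vec).items():
            ones[g] += v
    rare = {}
    for g, count in ones.items():
        p1 = count / sample_size
        if p1 < theta:
            rare[g] = 1          # logic 1 is the rare value at this node
        elif 1 - p1 < theta:
            rare[g] = 0          # logic 0 is the rare value at this node
    return rare

def generate_tests(rare, n_detect=5, candidates=5000):
    """Greedy N-detect style selection: keep a candidate vector only if it
    excites at least one rare (node, value) pair that still needs hits."""
    need = {g: n_detect for g in rare}
    tests = []
    for _ in range(candidates):
        if not any(need.values()):
            break
        vec = {i: random.randint(0, 1) for i in INPUTS}
        out = simulate(vec)
        hits = [g for g, v in rare.items() if out[g] == v and need[g] > 0]
        if hits:
            tests.append(vec)
            for g in hits:
                need[g] -= 1
    return tests

rare = rare_values()
tests = generate_tests(rare)
print("rare node values:", rare)
print(f"selected {len(tests)} vectors to hit {len(rare)} rare node values 5 times each")
```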

411 citations

Journal Article
TL;DR: A novel noninvasive, multiple-parameter side-channel analysis-based Trojan detection approach that uses the intrinsic relationship between dynamic current and maximum operating frequency of a circuit to isolate the effect of a Trojan circuit from process noise.
Abstract: Hardware Trojan attack in the form of malicious modification of a design has emerged as a major security threat. Side-channel analysis has been investigated as an alternative to conventional logic testing to detect the presence of hardware Trojans. However, these techniques suffer from decreased sensitivity toward small Trojans, especially because of the large process variations present in modern nanometer technologies. In this paper, we propose a novel noninvasive, multiple-parameter side-channel analysis-based Trojan detection approach. We use the intrinsic relationship between dynamic current and maximum operating frequency of a circuit to isolate the effect of a Trojan circuit from process noise. We propose a vector generation approach and several design/test techniques to improve the detection sensitivity. Simulation results with two large circuits, a 32-bit integer execution unit (IEU) and a 128-bit advanced encryption standard (AES) cipher, show a detection resolution of 1.12 percent amidst ±20 percent parameter variations. The approach is also validated with experimental results. Finally, the use of a combined side-channel analysis and logic testing approach is shown to provide high overall detection coverage for hardware Trojan circuits of varying types and sizes.
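A minimal sketch of the multiple-parameter idea, under the assumption of purely synthetic measurements: because process variation shifts dynamic current and maximum frequency together, a trend line fitted on golden chips predicts the expected current for a chip's measured frequency, and a Trojan shows up as excess current relative to that prediction. The data model, fit, and threshold below are illustrative, not the paper's calibration flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "golden" chips: process variation shifts I_DDT and F_max together,
# so genuine chips fall on a common trend line.  Numbers are illustrative only.
n_golden = 60
process = rng.normal(0.0, 0.07, n_golden)        # chip-to-chip speed variation
f_max = 1.0 * (1.0 + process)                    # normalized max frequency
i_ddt = 1.0 * (1.0 + 1.3 * process) + rng.normal(0, 0.005, n_golden)  # meas. noise

# Fit the intrinsic I_DDT-vs-F_max relationship from golden measurements.
slope, intercept = np.polyfit(f_max, i_ddt, 1)
residuals = i_ddt - (slope * f_max + intercept)
threshold = 3.0 * residuals.std()                # process/measurement-noise band

def is_suspect(f_meas, i_meas):
    """Flag a chip whose dynamic current sits above the band predicted
    from its own maximum operating frequency."""
    return (i_meas - (slope * f_meas + intercept)) > threshold

# A Trojan adds switching capacitance: extra I_DDT without a matching F_max shift.
golden_chip = (f_max[0], i_ddt[0])
trojan_chip = (f_max[0], i_ddt[0] + 0.03)        # ~3% extra current, same speed

print("golden flagged:", is_suspect(*golden_chip))
print("trojan flagged:", is_suspect(*trojan_chip))
```

The point of the self-calibrating trend line is that a slow-but-genuine chip and a fast-but-genuine chip both land inside the band, while extra Trojan current breaks the correlation regardless of where the chip sits in the process spread.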

207 citations

Journal Article
21 Jun 2017
TL;DR: In this article, an event-driven random back-propagation (eRBP) rule was proposed for learning deep representations in neuromorphic computing hardware using a two-compartment leaky integrate & fire neuron and a membrane-voltage modulated, spike-driven plasticity rule.
Abstract: An ongoing challenge in neuromorphic computing is to devise general and computationally efficient models of inference and learning which are compatible with the spatial and temporal constraints of the brain. The gradient descent back-propagation rule is a powerful algorithm that is ubiquitous in deep learning, but it relies on the immediate availability of network-wide information stored with high-precision memory. However, recent work shows that exact backpropagated weights are not essential for learning deep representations. Here, we demonstrate an event-driven random backpropagation (eRBP) rule that uses an error-modulated synaptic plasticity rule for learning deep representations in neuromorphic computing hardware. The rule is very suitable for implementation in neuromorphic hardware using a two-compartment leaky integrate & fire neuron and a membrane-voltage modulated, spike-driven plasticity rule. Our results show that using eRBP, deep representations are rapidly learned without using backpropagated gradients, achieving nearly identical classification accuracies compared to artificial neural network simulations on GPUs, while being robust to neural and synaptic state quantizations during learning.
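The random-feedback idea that eRBP builds on can be pictured with a short rate-based (non-spiking) sketch: the output error reaches hidden units through a fixed random matrix rather than the transpose of the forward weights. The network sizes, toy task, and learning rate are illustrative; the paper's event-driven, spiking formulation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal rate-based sketch of random backpropagation, the idea eRBP builds on:
# the output error reaches hidden units through a FIXED random matrix B rather
# than the transpose of the forward weights.
n_in, n_hid, n_out = 20, 40, 3
W1 = rng.normal(0, 0.3, (n_hid, n_in))
W2 = rng.normal(0, 0.3, (n_out, n_hid))
B = rng.normal(0, 0.3, (n_hid, n_out))       # fixed random feedback weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy task: which third of the input vector has the largest sum?
X = rng.normal(0, 1, (600, n_in))
labels = np.argmax([X[:, :7].sum(1), X[:, 7:14].sum(1), X[:, 14:].sum(1)], axis=0)
T = np.eye(n_out)[labels]

lr = 0.05
for epoch in range(30):
    for x, t in zip(X, T):
        h = sigmoid(W1 @ x)                  # hidden activity
        y = sigmoid(W2 @ h)                  # output activity
        err = y - t                          # output error
        d_hid = (B @ err) * h * (1 - h)      # error via random feedback, not W2.T
        W2 -= lr * np.outer(err, h)
        W1 -= lr * np.outer(d_hid, x)

pred = np.argmax(sigmoid(W2 @ sigmoid(W1 @ X.T)), axis=0)
print("training accuracy with fixed random feedback:", (pred == labels).mean())
```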

195 citations

Proceedings Article
13 Jun 2010
TL;DR: A novel non-invasive, multiple-parameter side-channel analysis-based Trojan detection approach that is capable of detecting malicious hardware modifications in the presence of large process variation-induced noise.
Abstract: Malicious alterations of integrated circuits during fabrication in untrusted foundries pose major concern in terms of their reliable and trusted field operation. It is extremely difficult to discover such alterations, also referred to as “hardware Trojans”, using conventional structural or functional testing strategies. In this paper, we propose a novel non-invasive, multiple-parameter side-channel analysis-based Trojan detection approach that is capable of detecting malicious hardware modifications in the presence of large process variation induced noise. We exploit the intrinsic relationship between dynamic current (I_DDT) and maximum operating frequency (F_max) of a circuit to distinguish the effect of a Trojan from process-induced fluctuations in I_DDT. We propose a vector generation approach for I_DDT measurement that can improve the Trojan detection sensitivity for arbitrary Trojan instances. Simulation results with two large circuits, a 32-bit integer execution unit (IEU) and a 128-bit Advanced Encryption Standard (AES) cipher, show that a detection resolution of 0.04% can be achieved in the presence of ±20% parameter (V_th) variations. The approach is also validated with experimental results using 120nm FPGA (Xilinx Virtex-II) chips.
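One way to picture why vector selection matters for I_DDT-based detection (a hedged toy heuristic, not necessarily the paper's method): a Trojan's extra switching current is easiest to see when the applied vector exercises the block under suspicion while keeping background switching elsewhere low. The block partitioning and activity numbers below are synthetic.

```python
import random

random.seed(3)

# Hedged toy: a Trojan hiding in one block of the design is easiest to detect
# when the applied vector makes that block switch while the rest of the chip
# stays quiet, so the Trojan's extra current is large relative to background.
N_BLOCKS = 4
N_VECTORS = 200
TROJAN_EXTRA = 5.0   # assumed extra toggles contributed by a triggered Trojan

# switching[v][b]: toggles that vector v causes in block b (synthetic numbers).
switching = [[random.randint(0, 100) for _ in range(N_BLOCKS)]
             for _ in range(N_VECTORS)]

def sensitivity(activity, target_block):
    """Relative current increase if a Trojan in target_block adds TROJAN_EXTRA
    toggles: larger when total background switching is low, zero if the vector
    does not exercise the target block at all."""
    if activity[target_block] == 0:
        return 0.0
    return TROJAN_EXTRA / sum(activity)

# For each block, pick the vector that maximizes the Trojan-to-background ratio.
for block in range(N_BLOCKS):
    best = max(range(N_VECTORS), key=lambda v: sensitivity(switching[v], block))
    print(f"block {block}: vector {best}, activity {switching[best]}, "
          f"sensitivity {sensitivity(switching[best], block):.4f}")
```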

148 citations

Posted Content
16 Dec 2016
TL;DR: An event-driven random BP (eRBP) rule that uses an error-modulated synaptic plasticity for learning deep representations, achieving classification accuracies on permutation invariant datasets comparable to those obtained in artificial neural network simulations on GPUs, while being robust to neural and synaptic state quantizations during learning.
Abstract: An ongoing challenge in neuromorphic computing is to devise general and computationally efficient models of inference and learning which are compatible with the spatial and temporal constraints of the brain. One increasingly popular and successful approach is to take inspiration from inference and learning algorithms used in deep neural networks. However, the workhorse of deep learning, the gradient descent Back Propagation (BP) rule, often relies on the immediate availability of network-wide information stored with high-precision memory, and precise operations that are difficult to realize in neuromorphic hardware. Remarkably, recent work showed that exact backpropagated weights are not essential for learning deep representations. Random BP replaces feedback weights with random ones and encourages the network to adjust its feed-forward weights to learn pseudo-inverses of the (random) feedback weights. Building on these results, we demonstrate an event-driven random BP (eRBP) rule that uses an error-modulated synaptic plasticity for learning deep representations in neuromorphic computing hardware. The rule requires only one addition and two comparisons for each synaptic weight using a two-compartment leaky Integrate & Fire (I&F) neuron, making it very suitable for implementation in digital or mixed-signal neuromorphic hardware. Our results show that using eRBP, deep representations are rapidly learned, achieving nearly identical classification accuracies compared to artificial neural network simulations on GPUs, while being robust to neural and synaptic state quantizations during learning.
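The "one addition and two comparisons for each synaptic weight" can be sketched as a per-synapse update: two comparisons implement a boxcar window on the postsynaptic state that gates plasticity, and one addition applies the error-modulated increment when a presynaptic spike arrives. Constants and variable names below are illustrative, not taken from the paper.

```python
# Toy per-synapse eRBP-style update.  Constants and names are illustrative;
# the error signal reaching this neuron is assumed to arrive through fixed
# random feedback connections, as in random backpropagation.

B_MIN, B_MAX = -1.0, 1.0     # boxcar window on the postsynaptic state
LEARNING_RATE = 0.01

def erbp_weight_update(weight, pre_spike, error_mod, post_state):
    """Event-driven update for a single synapse.

    pre_spike : 1 if the presynaptic neuron fired this time step, else 0
    error_mod : error value delivered to this neuron via random feedback
    post_state: postsynaptic membrane/dendritic state that gates plasticity
    """
    if not pre_spike:
        return weight                        # no presynaptic event, no work
    # Two comparisons: plasticity is enabled only inside the boxcar window.
    if post_state < B_MIN or post_state > B_MAX:
        return weight
    # One addition: apply the (pre-scaled) error-modulated increment.
    return weight + (-LEARNING_RATE * error_mod)

# A spike arrives while the postsynaptic state is inside the window, so the
# weight moves against the sign of the error.
w = erbp_weight_update(0.2, pre_spike=1, error_mod=0.5, post_state=0.3)
print(w)
```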

128 citations


Cited by
Journal Article
TL;DR: Loihi is a 60-mm² chip fabricated in Intel's 14-nm process that advances the state-of-the-art modeling of spiking neural networks in silicon, and can solve LASSO optimization problems with over three orders of magnitude superior energy-delay-product compared to conventional solvers running on a CPU iso-process/voltage/area.
Abstract: Loihi is a 60-mm² chip fabricated in Intel's 14-nm process that advances the state-of-the-art modeling of spiking neural networks in silicon. It integrates a wide range of novel features for the field, such as hierarchical connectivity, dendritic compartments, synaptic delays, and, most importantly, programmable synaptic learning rules. Running a spiking convolutional form of the Locally Competitive Algorithm, Loihi can solve LASSO optimization problems with over three orders of magnitude superior energy-delay-product compared to conventional solvers running on a CPU iso-process/voltage/area. This provides an unambiguous example of spike-based computation, outperforming all known conventional solutions.
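The LASSO demonstration refers to the Locally Competitive Algorithm (LCA); a dense, rate-based LCA sketch (not the spiking, on-chip version) shows the dynamics whose fixed point solves the LASSO objective. Dictionary size, sparsity, regularization, and step size below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Rate-based Locally Competitive Algorithm (LCA): its fixed point solves the
# LASSO problem  min_a 0.5*||s - D a||^2 + lam*||a||_1.  Loihi runs a spiking
# variant; this dense sketch only illustrates the underlying dynamics.
m, n = 32, 64
D = rng.normal(0, 1, (m, n))
D /= np.linalg.norm(D, axis=0)               # unit-norm dictionary atoms

# Sparse ground truth and its noisy observation.
a_true = np.zeros(n)
a_true[rng.choice(n, 4, replace=False)] = rng.normal(0, 1, 4)
s = D @ a_true + rng.normal(0, 0.01, m)

lam, dt, steps = 0.05, 0.05, 2000
b = D.T @ s                                  # constant input drive
G = D.T @ D - np.eye(n)                      # lateral inhibition between neurons

u = np.zeros(n)                              # internal (membrane-like) state
for _ in range(steps):
    a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)   # soft threshold
    u += dt * (b - u - G @ a)                # leaky integration + competition

a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)
print("recovered support:", np.nonzero(np.abs(a) > 1e-3)[0])
print("true support:     ", np.nonzero(a_true)[0])
```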

2,331 citations

Journal Article
TL;DR: A classification of hardware Trojans and a survey of published techniques for Trojan detection are presented.
Abstract: Editor's note: Today's integrated circuits are vulnerable to hardware Trojans, which are malicious alterations to the circuit, either during design or fabrication. This article presents a classification of hardware Trojans and a survey of published techniques for Trojan detection.

1,227 citations

Journal Article
Feng Pan, Song Gao, Chao Chen, Cheng Song, Fei Zeng
TL;DR: A comprehensive review of the recent progress in the so-called resistive random access memories (RRAMs) can be found in this article, where a brief introduction is presented to describe the construction and development of RRAMs, their potential for broad applications in the fields of nonvolatile memory, unconventional computing and logic devices, and the focus of research concerning RRAMs over the past decade.
Abstract: This review article attempts to provide a comprehensive review of the recent progress in the so-called resistive random access memories (RRAMs). First, a brief introduction is presented to describe the construction and development of RRAMs, their potential for broad applications in the fields of nonvolatile memory, unconventional computing and logic devices, and the focus of research concerning RRAMs over the past decade. Second, both inorganic and organic materials used in RRAMs are summarized, and their respective advantages and shortcomings are discussed. Third, the important switching mechanisms are discussed in depth and are classified into ion migration, charge trapping/de-trapping, thermochemical reaction, exclusive mechanisms in inorganics, and exclusive mechanisms in organics. Fourth, attention is given to the application of RRAMs for data storage, including their current performance, methods for performance enhancement, sneak-path issue and possible solutions, and demonstrations of 2-D and 3-D crossbar arrays. Fifth, prospective applications of RRAMs in unconventional computing, as well as logic devices and multi-functionalization of RRAMs, are comprehensively summarized and thoroughly discussed. The present review article ends with a short discussion concerning the challenges and future prospects of the RRAMs.
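The sneak-path issue mentioned for crossbar arrays can be made concrete with a simple lumped resistor estimate: reading one cell in a passive (selector-less) crossbar also draws current through chains of half-selected cells, which erodes the read margin as the array grows. The resistance values and the three-stage lumped model below are illustrative simplifications, not figures from the review.

```python
# Toy worst-case sneak-path estimate for a passive (selector-less) RRAM crossbar.
# Lumped model: every sneak path crosses three half-selected cells (selected row
# -> unselected column -> unselected row -> selected column), with unselected
# lines left floating.  Numbers are illustrative.

R_LRS = 10e3      # low-resistance state, ohms
R_HRS = 1e6       # high-resistance state, ohms

def sneak_resistance(n, r_cell=R_LRS):
    """Equivalent resistance of the sneak network when all other cells are in
    the LRS (worst case): three lumped stages in series."""
    return r_cell / (n - 1) + r_cell / (n - 1) ** 2 + r_cell / (n - 1)

def measured_resistance(r_target, n):
    """What the read circuit actually sees: the target cell in parallel with
    the sneak network."""
    r_sneak = sneak_resistance(n)
    return r_target * r_sneak / (r_target + r_sneak)

for n in (8, 64, 512):
    r_on = measured_resistance(R_LRS, n)
    r_off = measured_resistance(R_HRS, n)
    print(f"{n:>4}x{n:<4} read LRS ~ {r_on/1e3:6.2f} kOhm, "
          f"read HRS ~ {r_off/1e3:6.2f} kOhm, margin ratio {r_off/r_on:5.2f}")
```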

1,129 citations

Journal Article
TL;DR: A survey of techniques for approximate computing (AC), which discusses strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units, processor components, memory technologies, and so forth, as well as programming frameworks for AC.
Abstract: Approximate computing trades off computation quality with effort expended, and as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive, but even imperative. In this article, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units (e.g., CPU, GPU, and FPGA), processor components, memory technologies, and so forth, as well as programming frameworks for AC. We classify these techniques based on several key characteristics to emphasize their similarities and differences. The aim of this article is to provide insights to researchers into working of AC techniques and inspire more efforts in this area to make AC the mainstream computing approach in future systems.
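One of the simplest software-level techniques in the AC toolbox, loop perforation, fits in a few lines: skip a fraction of loop iterations, rescale the result, and accept or reject the output against a quality threshold. The workload (numerical integration) and the error threshold below are illustrative.

```python
import math

# Loop perforation: skip a fraction of loop iterations, rescale the result,
# and monitor output quality.  The workload (midpoint-rule integration of
# sin over [0, pi], exact value 2) and the threshold are illustrative.

def integrate(f, a, b, n=400_000, skip=1):
    """Midpoint-rule integration that evaluates only every `skip`-th interval."""
    h = (b - a) / n
    total = 0.0
    count = 0
    for i in range(0, n, skip):
        total += f(a + (i + 0.5) * h)
        count += 1
    return total * (b - a) / count           # rescale for skipped iterations

exact = integrate(math.sin, 0.0, math.pi, skip=1)
for skip in (2, 4, 8, 16):
    approx = integrate(math.sin, 0.0, math.pi, skip=skip)
    rel_err = abs(approx - exact) / abs(exact)
    verdict = "accept" if rel_err < 1e-3 else "reject"
    print(f"skip={skip:2d}: ~{100 * (1 - 1 / skip):.0f}% fewer iterations, "
          f"relative error {rel_err:.2e} -> {verdict}")
```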

890 citations