
Liu Liu

Researcher at University of California, Santa Barbara

Publications -  17
Citations -  338

Liu Liu is an academic researcher from University of California, Santa Barbara. The author has contributed to research in topics: Artificial neural network & Deep learning. The author has an h-index of 7, co-authored 17 publications receiving 207 citations. Previous affiliations of Liu Liu include University of California.

Papers
Journal ArticleDOI

L1-Norm Batch Normalization for Efficient Training of Deep Neural Networks

TL;DR: A hardware-friendly normalization method that not only surpasses L2-norm batch normalization (L2BN) in speed but also simplifies the design of deep learning accelerators, and promises fully quantized training of DNNs, empowering future artificial intelligence applications on mobile devices.
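The core idea behind an L1-norm batch normalization can be sketched as follows: the L2-based standard deviation in the denominator is replaced with a scaled mean absolute deviation, which avoids the square and square-root operations that are costly in hardware. The scaling factor sqrt(pi/2) makes the L1 statistic match the standard deviation for Gaussian-distributed activations. This is an illustrative NumPy sketch, not the authors' implementation; the function name and interface are assumptions.

```python
import numpy as np

def l1_batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a batch of activations using an L1-based spread estimate.

    x: array of shape (batch, features). The spread is estimated as
    sqrt(pi/2) * E|x - mu|, which equals the standard deviation when
    the activations are Gaussian, but needs no square/sqrt per element.
    """
    mu = x.mean(axis=0)
    # L1 spread estimate in place of the usual standard deviation
    s = np.sqrt(np.pi / 2) * np.abs(x - mu).mean(axis=0)
    return gamma * (x - mu) / (s + eps) + beta
```

For Gaussian inputs the output is approximately zero-mean with unit variance, matching what standard (L2) batch normalization would produce.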
Proceedings ArticleDOI

Leveraging 3D Technologies for Hardware Security: Opportunities and Challenges

TL;DR: A 3D architecture for shielding side-channel information, a split fabrication using active interposers, circuit camouflage on monolithic 3D ICs, and 3D IC-based security processing-in-memory (PIM) are presented, showing that the new designs can improve existing countermeasures against security threats and further provide new security features.
Proceedings ArticleDOI

NVSim-CAM: a circuit-level simulator for emerging nonvolatile memory based content-addressable memory

TL;DR: A circuit-level model and a simulation tool are proposed that help researchers make early design decisions and evaluate device/circuit innovations; a novel 3D vertical ReRAM-based TCAM cell, the 3DvTCAM, is also proposed.
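For readers unfamiliar with the structure being simulated: a ternary content-addressable memory (TCAM) searches all stored entries in parallel for a match against a key, where each stored bit can be 0, 1, or a don't-care. A minimal functional sketch (sequential here, purely for illustration; all names are hypothetical and unrelated to the NVSim-CAM tool itself):

```python
def tcam_lookup(entries, key):
    """Return indices of TCAM entries that match the search key.

    Each entry is a string over {'0', '1', 'x'}, where 'x' is a
    don't-care bit that matches either key bit. Real TCAM hardware
    compares every entry in parallel in a single cycle; this loop
    only models the matching semantics.
    """
    def matches(entry):
        return all(e == 'x' or e == k for e, k in zip(entry, key))
    return [i for i, entry in enumerate(entries) if matches(entry)]
```

For example, `tcam_lookup(['10xx', '1011', '0xxx'], '1010')` returns `[0]`: only the first entry's don't-care bits cover the key.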
Proceedings Article

Dynamic Sparse Graph for Efficient Deep Learning

TL;DR: In this paper, the authors propose executing deep neural networks (DNNs) with a dynamic and sparse graph (DSG) structure to compress memory and accelerate execution during both training and inference.
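The flavor of dynamic sparsification can be illustrated with a simplified stand-in: keep only the k largest-magnitude pre-activations per sample and zero the rest, so downstream compute and storage scale with k rather than the full layer width. Note this is a hedged sketch, not the DSG method itself, which selects important neurons via a dimension-reduced search before the dense computation rather than after it.

```python
import numpy as np

def topk_sparse_forward(x, w, k):
    """One dense layer followed by per-sample top-k sparsification.

    x: (batch, in_features), w: (in_features, out_features).
    Only the k largest-magnitude outputs per row are kept; the rest
    are zeroed, producing a dynamically sparse activation graph.
    """
    z = x @ w                                        # dense pre-activations
    # Indices of the k largest |z| values in each row
    idx = np.argpartition(np.abs(z), -k, axis=1)[:, -k:]
    mask = np.zeros_like(z, dtype=bool)
    np.put_along_axis(mask, idx, True, axis=1)
    return np.where(mask, z, 0.0)
```

Each output row then has exactly k nonzero entries (barring exact-zero pre-activations), which is what makes the subsequent layer's work compressible.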
Proceedings ArticleDOI

Building energy-efficient multi-level cell STT-RAM caches with data compression

TL;DR: This paper proposes two techniques that use data compression to optimize MLC STT-RAM cache design; the second technique increases cache capacity by enabling the remaining hard-bit region to store another compressed cache line, which can improve system performance for memory-intensive workloads.
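The placement decision implied by the summary can be sketched as a simple predicate: in a multi-level cell, each physical cell stores a fast "soft" bit and a slow "hard" bit, so a line that compresses to at most half the physical bits can live entirely in the soft-bit region, freeing the hard-bit region for a second compressed line. Everything below is a hypothetical illustration; the function names, the 512-bit line size, and the toy zero-eliding compressor are assumptions, not the paper's scheme.

```python
def fits_in_soft_bits(line, compress, line_bits=512):
    """Return True if a compressed cache line fits in the soft-bit
    (fast) half of an MLC STT-RAM line, so the hard-bit half can be
    reused for a second compressed line.

    `compress` is any function mapping a line (bytes) to a bit count.
    """
    return compress(line) <= line_bits // 2

def zero_elide_bits(line):
    """Toy compressor: 1 flag bit per byte, plus 8 data bits only for
    nonzero bytes. Purely illustrative, not a real cache compressor."""
    return sum(1 if b == 0 else 9 for b in line)
```

A 64-byte all-zero line compresses to 64 bits and fits in the 256-bit soft region; a line of all-nonzero bytes needs 576 bits and does not.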