Open Access Journal ArticleDOI

Deep learning with coherent nanophotonic circuits

TLDR
A new architecture for a fully optical neural network is demonstrated that enables an enhancement of at least two orders of magnitude in computational speed and three orders of magnitude in power efficiency over state-of-the-art electronics.
Abstract
Artificial neural networks have dramatically improved performance on many machine learning tasks. We demonstrate a new architecture for a fully optical neural network that enables an enhancement of at least two orders of magnitude in computational speed and three orders of magnitude in power efficiency over state-of-the-art electronics.
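
The speed and power advantages come from carrying out each layer's matrix multiplication directly in the optical domain. A minimal NumPy sketch of the underlying idea, assuming the standard singular-value decomposition used to map an arbitrary weight matrix onto unitary interferometer meshes plus diagonal attenuation (real-valued example weights, no loss or noise model for the photonic hardware):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))            # one layer's weight matrix
U, s, Vh = np.linalg.svd(M)            # M = U @ diag(s) @ Vh

x = rng.normal(size=4)                 # input vector (optical field amplitudes)
y = U @ (s * (Vh @ x))                 # two unitary stages plus a diagonal scaling stage
assert np.allclose(y, M @ x)           # reproduces the electronic matrix-vector product
```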


Citations
Proceedings ArticleDOI

Towards Area-Efficient Optical Neural Networks: An FFT-based Architecture

TL;DR: This paper proposes an area-efficient ONN architecture based on structured neural networks, leveraging an optical fast Fourier transform for efficient computation, together with a two-phase software training flow that uses structured pruning to further reduce optical component utilization.
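
For context, the core trick behind FFT-based structured layers is that a circulant weight matrix multiplies a vector in O(n log n) via the convolution theorem, which is the operation an optical FFT stage can evaluate in parallel. A minimal NumPy sketch of that equivalence (my illustration of the general technique, not the paper's exact hardware mapping):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
w = rng.normal(size=n)                                    # first column of the circulant weight matrix
x = rng.normal(size=n)                                    # layer input

W = np.column_stack([np.roll(w, j) for j in range(n)])    # explicit n x n structured matrix
y_dense = W @ x                                           # O(n^2) dense product

y_fft = np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)).real   # O(n log n) via the convolution theorem
assert np.allclose(y_dense, y_fft)
```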
Posted Content

Physics for Neuromorphic Computing

TL;DR: In this paper, the authors make the case that building this new hardware necessitates reinventing electronics, and show that research in physics and materials science will be key to creating artificial nano-neurons and synapses, connecting them together in huge numbers, and organizing them into complex systems.
Journal ArticleDOI

All-in-one silicon photonic polarization processor

TL;DR: In this paper, an all-in-one chip-scale polarization processor based on a linear optical network is presented, whose polarization functions can be configured by tuning an array of phase shifters on the chip.
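
As a rough picture of what configuring a linear optical network with phase shifters means, a single 2x2 Mach-Zehnder-style element parameterized by two phases already acts as a programmable unitary on a polarization (Jones) vector. A NumPy sketch of that element (my illustration, not the paper's device layout):

```python
import numpy as np

def mzi_unitary(theta, phi):
    """2x2 transfer matrix: two 50:50 beamsplitters with internal/external phase shifters."""
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
    return bs @ np.diag([np.exp(1j * theta), 1]) @ bs @ np.diag([np.exp(1j * phi), 1])

jones_in = np.array([1.0, 0.0])                           # horizontally polarized input
jones_out = mzi_unitary(theta=np.pi / 2, phi=0.0) @ jones_in
print(np.abs(jones_out) ** 2)                             # output power split between the two modes
```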
Journal ArticleDOI

Roadmap on material-function mapping for photonic-electronic hybrid neural networks

TL;DR: In this article, the authors provide a roadmap to pave the way for emerging hybridized photonic-electronic neural networks by taking a detailed look at a single-node perceptron and assessing the advantages of using nonlinear optical materials as efficient and instantaneous activation functions.
Journal ArticleDOI

Tutorial: High-speed low-power neuromorphic systems based on magnetic Josephson junctions

TL;DR: In this paper, a tutorial is presented covering the spiking behavior of Josephson junctions; the use of nanoscale magnetic structures to modulate the coupling across the junction; the design, operation, device models, and simulation of magnetic Josephson junction neuromorphic circuits; and potential neuromorphic architectures based on hybrid superconducting/magnetic technology.
References
Proceedings Article

ImageNet Classification with Deep Convolutional Neural Networks

TL;DR: A deep convolutional neural network consisting of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax achieved state-of-the-art performance on ImageNet classification, as discussed by the authors.
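
The layer stack described in this summary can be written down directly. A minimal PyTorch sketch of that topology, assuming a 3x227x227 input; channel counts follow the original paper, while details such as padding should be treated as illustrative:

```python
import torch.nn as nn

# Five conv layers (some followed by max-pooling), three fully-connected layers, 1000-way output.
alexnet_like = nn.Sequential(
    nn.Conv2d(3, 96, kernel_size=11, stride=4), nn.ReLU(), nn.MaxPool2d(3, stride=2),
    nn.Conv2d(96, 256, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(3, stride=2),
    nn.Conv2d(256, 384, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(384, 384, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(3, stride=2),
    nn.Flatten(),
    nn.Linear(256 * 6 * 6, 4096), nn.ReLU(),
    nn.Linear(4096, 4096), nn.ReLU(),
    nn.Linear(4096, 1000),             # 1000-way output; softmax is folded into the cross-entropy loss
)
```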
Journal ArticleDOI

Deep learning

TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Journal ArticleDOI

Human-level control through deep reinforcement learning

TL;DR: This work bridges the divide between high-dimensional sensory inputs and actions, resulting in the first artificial agent that is capable of learning to excel at a diverse array of challenging tasks.
Journal ArticleDOI

Reducing the Dimensionality of Data with Neural Networks

TL;DR: This article describes an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool for reducing the dimensionality of data.
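
The dimensionality reduction itself is an encoder-decoder bottleneck trained to reconstruct its input; the paper's contribution is the layer-wise pretraining that initializes the weights well. A minimal PyTorch sketch of such a bottleneck, with illustrative layer widths sized for 28x28 images flattened to 784 values:

```python
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(784, 256), nn.Sigmoid(),
                        nn.Linear(256, 30))                  # 30-dimensional code
decoder = nn.Sequential(nn.Linear(30, 256), nn.Sigmoid(),
                        nn.Linear(256, 784), nn.Sigmoid())
autoencoder = nn.Sequential(encoder, decoder)                # trained to reconstruct its input
```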
Journal ArticleDOI

Deep learning in neural networks

TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning, and evolutionary computation, as well as indirect search for short programs encoding deep and large networks.