Open Access · Journal Article · DOI

Deep learning with coherent nanophotonic circuits

TL;DR
A new architecture for a fully optical neural network is demonstrated that enables a computational speed enhancement of at least two orders of magnitude, and a power-efficiency improvement of three orders of magnitude, over state-of-the-art electronics.
Abstract
Artificial neural networks have dramatically improved performance on many machine learning tasks. We demonstrate a new architecture for a fully optical neural network that enables a computational speed enhancement of at least two orders of magnitude, and a power-efficiency improvement of three orders of magnitude, over state-of-the-art electronics.
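
The abstract above does not detail the hardware, so the following is only a minimal NumPy sketch of how a fully optical layer is often pictured: each weight matrix is factored by singular-value decomposition into two unitary stages (realizable as interferometer meshes) and a diagonal scaling stage. The layer sizes, random weights, and tanh nonlinearity are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def optical_layer(x, W, nonlinearity=np.tanh):
    """One layer of the sketch: W is factored as U @ diag(S) @ Vh, the form a
    pair of unitary interferometer meshes plus a per-mode scaling stage could
    implement; the nonlinearity here is purely numerical."""
    U, S, Vh = np.linalg.svd(W, full_matrices=False)
    y = Vh @ x          # first mesh (unitary mixing)
    y = S * y           # per-mode scaling (diagonal)
    y = U @ y           # second mesh (unitary mixing)
    return nonlinearity(y)

# Illustrative two-layer forward pass with random weights.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
W1 = rng.normal(size=(4, 4))
W2 = rng.normal(size=(3, 4))
print(optical_layer(optical_layer(x, W1), W2))
```
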


Citations
Journal Article · DOI

Optical Convolutional Neural Network With WDM-Based Optical Patching and Microring Weighting Banks

TL;DR: In this article, an optical convolutional neural network (OCNN) architecture for high-speed and energy-efficient deep learning accelerators is proposed, in which the WDM-based optical patching scheme (WDM-OPS) is adopted as the data-feeding structure for its superior energy efficiency and microring weighting banks are used for large-scale weighting and summing.
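
As a rough functional analogy (not the authors' implementation), the sketch below expresses a 2-D convolution as patch extraction followed by a weight-and-sum over each patch: the patching step stands in for the WDM-based data feeding, and the weighted sum is the operation a microring weighting bank would perform per output pixel. All sizes are illustrative.

```python
import numpy as np

def extract_patches(image, k):
    """Data-feeding step of the sketch: unroll k x k patches of the image,
    analogous to presenting each patch on parallel WDM channels."""
    H, W = image.shape
    patches = [image[i:i + k, j:j + k].ravel()
               for i in range(H - k + 1) for j in range(W - k + 1)]
    return np.array(patches)                 # (num_patches, k*k)

def conv_as_weight_bank(image, kernel):
    """Convolution expressed as one weighted sum per patch -- the multiply-and-
    sum a weighting bank would perform per output pixel."""
    k = kernel.shape[0]
    patches = extract_patches(image, k)
    out = patches @ kernel.ravel()           # weight-and-sum each patch
    side = image.shape[0] - k + 1
    return out.reshape(side, side)

rng = np.random.default_rng(1)
img = rng.normal(size=(6, 6))
ker = rng.normal(size=(3, 3))
print(conv_as_weight_bank(img, ker))
```
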
Proceedings Article · DOI

Artificial Synapse with Mnemonic Functionality using GSST-based Photonic Integrated Memory

TL;DR: In this article, a multi-level discrete-state nonvolatile photonic memory based on an ultra-compact hybrid phase-change-material GSST-silicon Mach-Zehnder modulator, with low insertion loss (3 dB), was presented as a node in a photonic neural network.
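
The entry describes synaptic weights stored in a multi-level, discrete-state memory. As a loose illustration only (the level count and weight range are assumptions), the sketch below snaps continuous weights to a small set of discrete states, the way they would have to be programmed into such a memory cell.

```python
import numpy as np

def quantize_to_levels(weights, num_levels=4, w_min=-1.0, w_max=1.0):
    """Map continuous weights onto a small set of discrete states, as a stand-in
    for storing each synapse in a multi-level nonvolatile memory cell.
    The number of levels and the weight range are illustrative choices."""
    levels = np.linspace(w_min, w_max, num_levels)
    clipped = np.clip(weights, w_min, w_max)
    idx = np.abs(clipped[..., None] - levels).argmin(axis=-1)
    return levels[idx]

w = np.array([-0.9, -0.2, 0.05, 0.4, 0.95])
print(quantize_to_levels(w))        # each weight snapped to the nearest state
```
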
Journal Article · DOI

Continuous and rapid fabrication of photochromic fibers by facilely coating tungsten oxide/polyvinyl alcohol composites

TL;DR: In this article, the continuous fabrication of photochromic fibers in a simple and low-cost way by dip-coating WO3/PVA composites was reported; the fibers showed fast and reversible color switching from light yellow to dark blue upon UV irradiation and infrared heating treatment.
Proceedings Article · DOI

PCNNA: A Photonic Convolutional Neural Network Accelerator

TL;DR: In this article, a photonic convolutional neural network accelerator (PCNNA) is proposed to speed up the convolution operation in CNNs based on the recently introduced silicon photonic microring weight banks, which use the broadcast-and-weight protocol to perform multiply-and-accumulate (MAC) operations and move data through the layers of a neural network.
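
Functionally, a broadcast-and-weight MAC reduces to a dot product: each input occupies its own wavelength, a microring sets the per-wavelength weight, and the detector accumulates the total. The sketch below shows only that numerical equivalence, with made-up channel values, not the photonic signal model.

```python
import numpy as np

def broadcast_and_weight_mac(inputs, weights):
    """Sketch of a broadcast-and-weight MAC: each input rides its own wavelength,
    a per-wavelength transmission acts as the weight, and the total detected
    value gives the accumulated sum. Values are plain numbers, not optical powers."""
    assert len(inputs) == len(weights)
    weighted = np.asarray(inputs) * np.asarray(weights)   # per-wavelength weighting
    return weighted.sum()                                 # detector accumulation

x = [0.2, 0.7, 0.1, 0.9]      # signals on four WDM channels
w = [0.5, -0.3, 0.8, 0.1]     # weight bank settings
print(broadcast_and_weight_mac(x, w))   # equivalent to np.dot(x, w)
```
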
Journal Article · DOI

Wavelength-division-multiplexing (WDM)-based integrated electronic–photonic switching network (EPSN) for high-speed data processing and transportation

TL;DR: A WDM-based electronic–photonic switching network (EPSN) is proposed to realize the functions of the binary decoder and the multiplexer, which are fundamental elements in microprocessors for data transportation and processing.
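
For reference, the logical functions the EPSN is said to realize are just the standard decoder and multiplexer truth tables; the short sketch below shows that behavior in plain Python and says nothing about the photonic implementation.

```python
def binary_decoder(sel_bits):
    """n-bit binary decoder: activate exactly one of 2**n output lines."""
    n = len(sel_bits)
    index = int("".join(str(b) for b in sel_bits), 2)
    return [1 if i == index else 0 for i in range(2 ** n)]

def multiplexer(data_lines, sel_bits):
    """Multiplexer: route the selected data line to the single output."""
    one_hot = binary_decoder(sel_bits)
    return sum(d * s for d, s in zip(data_lines, one_hot))

print(binary_decoder([1, 0]))             # [0, 0, 1, 0] -> line 2 selected
print(multiplexer([5, 7, 9, 3], [1, 0]))  # 9
```
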
References
Proceedings Article

ImageNet Classification with Deep Convolutional Neural Networks

TL;DR: A deep convolutional neural network, consisting of five convolutional layers (some followed by max-pooling layers) and three fully-connected layers with a final 1000-way softmax, is shown by the authors to achieve state-of-the-art classification performance on ImageNet.
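
A PyTorch sketch of the architecture as summarized above (five convolutional layers, interleaved max-pooling, three fully-connected layers, 1000-way output) is given below; the channel counts and kernel sizes follow the commonly cited configuration and should be treated as approximate.

```python
import torch
import torch.nn as nn

class FiveConvThreeFC(nn.Module):
    """Approximate AlexNet-style network: five conv layers, three FC layers."""
    def __init__(self, num_classes=1000):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 96, kernel_size=11, stride=4), nn.ReLU(),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(96, 256, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(256, 384, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(384, 384, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(3, stride=2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(256 * 6 * 6, 4096), nn.ReLU(),
            nn.Linear(4096, 4096), nn.ReLU(),
            nn.Linear(4096, num_classes),   # logits for the final 1000-way softmax
        )

    def forward(self, x):
        return self.classifier(self.features(x))

logits = FiveConvThreeFC()(torch.randn(1, 3, 227, 227))
print(logits.shape)   # torch.Size([1, 1000])
```
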
Journal Article · DOI

Deep learning

TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Journal Article · DOI

Human-level control through deep reinforcement learning

TL;DR: This work bridges the divide between high-dimensional sensory inputs and actions, resulting in the first artificial agent that is capable of learning to excel at a diverse array of challenging tasks.
Journal Article · DOI

Reducing the Dimensionality of Data with Neural Networks

TL;DR: In this article, an effective way of initializing the weights is described that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool for reducing the dimensionality of data.
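
The sketch below shows only the encoder-decoder shape such a deep autoencoder takes, mapping inputs down to a low-dimensional code and back; the layer widths are illustrative, and the paper's central ingredient, layer-by-layer pretraining of the initial weights, is deliberately omitted.

```python
import torch
import torch.nn as nn

class DeepAutoencoder(nn.Module):
    """Deep autoencoder: encode to a low-dimensional code, then reconstruct."""
    def __init__(self, input_dim=784, code_dim=30):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 500), nn.Sigmoid(),
            nn.Linear(500, 250), nn.Sigmoid(),
            nn.Linear(250, code_dim),            # low-dimensional code
        )
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, 250), nn.Sigmoid(),
            nn.Linear(250, 500), nn.Sigmoid(),
            nn.Linear(500, input_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code

model = DeepAutoencoder()
recon, code = model(torch.rand(8, 784))
print(recon.shape, code.shape)   # torch.Size([8, 784]) torch.Size([8, 30])
```
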
Journal Article · DOI

Deep learning in neural networks: An overview

TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.