Deep learning with coherent nanophotonic circuits
Yichen Shen, Nicholas C. Harris, Scott Skirlo, Dirk Englund, Marin Soljačić +4 more
- Vol. 11, Iss. 7, pp. 441–446
TL;DR: A new architecture for a fully optical neural network is demonstrated that enables a computational speed enhancement of at least two orders of magnitude and three orders of magnitude in power efficiency over state-of-the-art electronics.
Abstract: Artificial neural networks have dramatically improved performance for many machine learning tasks. We demonstrate a new architecture for a fully optical neural network that enables a computational speed enhancement of at least two orders of magnitude and three orders of magnitude in power efficiency over state-of-the-art electronics.
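As background for the abstract above: architectures of this kind typically realize each linear layer with a mesh of programmable Mach-Zehnder interferometers (MZIs). As a rough, hedged illustration (not the paper's exact parameterization; the function names and the specific 2×2 matrix convention here are assumptions), a single MZI acts as a programmable 2×2 unitary on two coherent optical modes, so the passive linear part of the network conserves optical power:

```python
import cmath
import math

def mzi(theta, phi):
    """2x2 unitary transfer matrix of an idealized Mach-Zehnder interferometer
    with mixing angle theta and phase shift phi (one common convention;
    real devices and papers use various equivalent parameterizations)."""
    return [
        [cmath.exp(1j * phi) * math.cos(theta), -cmath.exp(1j * phi) * math.sin(theta)],
        [math.sin(theta),                        math.cos(theta)],
    ]

def apply(matrix, vec):
    """Multiply a 2x2 complex matrix by a 2-vector of optical mode amplitudes."""
    return [sum(m * v for m, v in zip(row, vec)) for row in matrix]

def power(vec):
    """Total optical power carried by the mode amplitudes."""
    return sum(abs(a) ** 2 for a in vec)

# Two coherent input modes; the ideal MZI mixes them without loss,
# so total power is conserved -- the passive linear transformation
# itself dissipates (ideally) no energy.
x = [1.0 + 0.0j, 0.5 - 0.5j]
y = apply(mzi(math.pi / 5, math.pi / 7), x)
print(abs(power(y) - power(x)) < 1e-12)  # -> True
```

Cascading many such 2×2 blocks builds up larger unitaries, which is what makes an all-passive optical implementation of matrix multiplication energy-efficient compared with electronic multiply-accumulate hardware.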
Citations
Journal Article (DOI)
Neuromorphic Photonics: 2D or not 2D?
TL;DR: A novel three-dimensional computational unit is proposed; with its compactness, ultrahigh efficiency, and lossless interconnectivity, it is foreseen to enable scalable AI computing chipsets that outperform electronics in computational speed and energy efficiency and shape the future of neuromorphic computing.
Journal Article (DOI)
Low-threshold all-optical nonlinear activation function based on Ge/Si hybrid structure in microring resonator
TL;DR: In this paper, the authors demonstrate that a Ge/Si hybrid structure is a qualified candidate owing to its CMOS compatibility, low nonlinear threshold, and compact footprint, enabled by the strong thermo-optic effect of germanium in conjunction with a microring resonator.
Proceedings Article (DOI)
SqueezeLight: Towards Scalable Optical Neural Networks with Multi-Operand Ring Resonators
TL;DR: In this paper, a nonlinear optical neuron based on multi-operand ring resonators is proposed to achieve neuromorphic computing with a compact footprint, low wavelength usage, learnable neuron balancing, and built-in nonlinearity.
Posted Content (DOI)
All-Optical Computing Based on Convolutional Neural Networks
Kun Liao, Ye Chen, Zhongcheng Yu, Xiaoyong Hu, Xingyuan Wang, Cuicui Lu, Hongtao Lin, Qingyang Du, Juejun Hu, Qihuang Gong +9 more
TL;DR: This work reports a strategy to realize ultrafast and ultralow-energy-consumption all-optical computing based on convolutional neural networks, leveraging entirely linear optical interactions, and paves the way for on-chip all-optical computing.
Journal Article (DOI)
Simultaneous excitatory and inhibitory dynamics in an excitable laser.
Philip Y. Ma, Bhavin J. Shastri, Thomas Ferreira de Lima, Chaoran Huang, Alexander N. Tait, Mitchell A. Nahmias, Hsuan-Tung Peng, Paul R. Prucnal +7 more
TL;DR: This work demonstrates the simultaneous excitatory and inhibitory dynamics in an excitable (i.e., a pulsed) laser neuron, both numerically and experimentally, and introduces inhibition by directly modulating the gain of the laser.
References
Proceedings Article
ImageNet Classification with Deep Convolutional Neural Networks
TL;DR: State-of-the-art image classification performance is achieved by a deep convolutional neural network consisting of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax.
Journal Article (DOI)
Deep learning
TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Journal Article (DOI)
Human-level control through deep reinforcement learning
Volodymyr Mnih, Koray Kavukcuoglu, David Silver, Andrei Rusu, Joel Veness, Marc G. Bellemare, Alex Graves, Martin Riedmiller, Andreas K. Fidjeland, Georg Ostrovski, Stig Petersen, Charles Beattie, Amir Sadik, Ioannis Antonoglou, Helen King, Dharshan Kumaran, Daan Wierstra, Shane Legg, Demis Hassabis +18 more
TL;DR: This work bridges the divide between high-dimensional sensory inputs and actions, resulting in the first artificial agent that is capable of learning to excel at a diverse array of challenging tasks.
Journal Article (DOI)
Reducing the Dimensionality of Data with Neural Networks
TL;DR: In this article, the authors describe an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool for reducing the dimensionality of data.
Journal Article (DOI)
Deep learning in neural networks
TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning, and evolutionary computation, as well as indirect search for short programs encoding deep and large networks.