Journal ArticleDOI

All-optical spiking neurosynaptic networks with self-learning capabilities.

08 May 2019-Nature (Nature Publishing Group)-Vol. 569, Iss: 7755, pp 208-214
TL;DR: An optical version of a brain-inspired neurosynaptic system, using wavelength division multiplexing techniques, is presented that is capable of supervised and unsupervised learning.
Abstract: Software implementations of brain-inspired computing underlie many important computational tasks, from image processing to speech recognition, artificial intelligence and deep learning applications. Yet, unlike real neural tissue, traditional computing architectures physically separate the core computing functions of memory and processing, making fast, efficient and low-energy computing difficult to achieve. To overcome such limitations, an attractive alternative is to design hardware that mimics neurons and synapses. Such hardware, when connected in networks or neuromorphic systems, processes information in a way more analogous to brains. Here we present an all-optical version of such a neurosynaptic system, capable of supervised and unsupervised learning. We exploit wavelength division multiplexing techniques to implement a scalable circuit architecture for photonic neural networks, successfully demonstrating pattern recognition directly in the optical domain. Such photonic neurosynaptic networks promise access to the high speed and high bandwidth inherent to optical systems, thus enabling the direct processing of optical telecommunication and visual data.
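The core idea of the abstract — inputs carried on separate wavelengths, weighted by on-chip attenuating elements, summed by WDM fan-in, and thresholded to produce a spike — can be sketched numerically. This is a hedged toy model, not the paper's implementation; the function name, weight values, and threshold are invented for illustration:

```python
import numpy as np

def photonic_spiking_neuron(input_powers, weights, threshold):
    """Toy numerical model of a WDM-weighted spiking neuron.

    Each input arrives on its own wavelength; an attenuating cell on
    that channel scales it by a weight in [0, 1]; the channels are
    then combined by wavelength-division fan-in (an incoherent power
    sum) and compared against a firing threshold.
    """
    weighted = np.asarray(input_powers) * np.asarray(weights)
    total = weighted.sum()       # WDM fan-in: sum of channel powers
    fires = total >= threshold   # spike if integrated power crosses threshold
    return fires, total

# Two of four inputs active with moderate weights: the neuron fires.
fires, total = photonic_spiking_neuron([1.0, 0.0, 1.0, 0.0],
                                       [0.8, 0.2, 0.7, 0.1],
                                       threshold=1.0)
```

The multiplexing step is what makes the architecture scalable: adding an input means adding a wavelength channel, not a new physical summing element.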


Citations
Journal ArticleDOI
TL;DR: In this paper, the spin degree of freedom of electrons and/or holes, which can also interact with their orbital moments, is described with respect to the spin-generation methods detailed in Sections 2–9.

614 citations

Journal ArticleDOI
TL;DR: In this paper, the authors review recent advances in integrated photonic neuromorphic systems, discuss current and future challenges, and outline the advances in science and technology needed to meet those challenges.
Abstract: Research in photonic computing has flourished due to the proliferation of optoelectronic components on photonic integration platforms. Photonic integrated circuits have enabled ultrafast artificial neural networks, providing a framework for a new class of information processing machines. Algorithms running on such hardware have the potential to address the growing demand for machine learning and artificial intelligence in areas such as medical diagnosis, telecommunications, and high-performance and scientific computing. In parallel, the development of neuromorphic electronics has highlighted challenges in that domain, particularly related to processor latency. Neuromorphic photonics offers sub-nanosecond latencies, providing a complementary opportunity to extend the domain of artificial intelligence. Here, we review recent advances in integrated photonic neuromorphic systems, discuss current and future challenges, and outline the advances in science and technology needed to meet those challenges. Photonics offers an attractive platform for implementing neuromorphic computing due to its low latency, multiplexing capabilities and integrated on-chip technology.

480 citations

Journal ArticleDOI
TL;DR: Recent advances in integrated photonic neuromorphic systems are reviewed, current and future challenges are discussed, and the advances in science and technology needed to meet those challenges are outlined.
Abstract: Research in photonic computing has flourished due to the proliferation of optoelectronic components on photonic integration platforms. Photonic integrated circuits have enabled ultrafast artificial neural networks, providing a framework for a new class of information processing machines. Algorithms running on such hardware have the potential to address the growing demand for machine learning and artificial intelligence, in areas such as medical diagnosis, telecommunications, and high-performance and scientific computing. In parallel, the development of neuromorphic electronics has highlighted challenges in that domain, in particular, related to processor latency. Neuromorphic photonics offers sub-nanosecond latencies, providing a complementary opportunity to extend the domain of artificial intelligence. Here, we review recent advances in integrated photonic neuromorphic systems, discuss current and future challenges, and outline the advances in science and technology needed to meet those challenges.

454 citations


Cites background or methods from "All-optical spiking neurosynaptic n..."

  • ...Tuning methods based on chalcogenide PCMs allow weights to retain their values without further holding power after being set [24, 51]....


  • ...[24, 37] demultiplex the wavelengths, attenuate each channel, and then remultiplex before WDM fan-in....


  • ...4c [24] is feedforward, spiking, with both external and local training....


  • ...Both are based on the use of optically induced changes in chalcogenide materials to control the light propagation in waveguides (the former Si3N4 integrated waveguides [24], the latter metal-sulphide fibers [52])....


  • ...a structural phase transition [24, 65] (Fig....

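The non-volatile weighting that the excerpts above attribute to chalcogenide PCM cells — a weight that keeps its value with zero holding power once set — can be sketched as a toy model. The class name, constants, and linear transmission law below are illustrative assumptions, not measured device behavior:

```python
class PCMWeight:
    """Toy model of a non-volatile phase-change photonic weight.

    The waveguide transmission depends on the crystalline fraction of
    the PCM cell; write pulses move that fraction up or down, and the
    resulting value persists without any holding power.
    """
    T_CRYSTALLINE = 0.2   # crystalline PCM absorbs more of the guided light
    T_AMORPHOUS = 1.0     # amorphous PCM is nearly transparent

    def __init__(self, crystalline_fraction=0.5):
        self.x = crystalline_fraction

    def write(self, delta):
        """Apply a partial crystallization (+) or amorphization (-) pulse."""
        self.x = min(1.0, max(0.0, self.x + delta))

    def transmission(self):
        # Linear interpolation between the two phase states.
        return self.T_AMORPHOUS + self.x * (self.T_CRYSTALLINE - self.T_AMORPHOUS)

w = PCMWeight(0.5)
w.write(+0.25)   # crystallize further -> lower transmission
```

The key design point is that the weight is stored in the material state itself, so reading it (sending light through) costs no tuning energy.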

Journal ArticleDOI
TL;DR: Recent progress in deep-learning-based photonic design is reviewed by providing the historical background, algorithm fundamentals and key applications, with the emphasis on various model architectures for specific photonic tasks.
Abstract: Innovative approaches and tools play an important role in shaping design, characterization and optimization for the field of photonics. As a subset of machine learning that learns multilevel abstraction of data using hierarchically structured layers, deep learning offers an efficient means to design photonic structures, spawning data-driven approaches complementary to conventional physics- and rule-based methods. Here, we review recent progress in deep-learning-based photonic design by providing the historical background, algorithm fundamentals and key applications, with the emphasis on various model architectures for specific photonic tasks. We also comment on the challenges and perspectives of this emerging research direction. The application of deep learning to the design of photonic structures and devices is reviewed, including algorithm fundamentals.

446 citations

Journal ArticleDOI
28 Feb 2020-Science
TL;DR: The results provide an approach that breaks the long-standing trade-off between low energy consumption and high-speed nanophotonics, introducing vortex microlasers that are switchable at terahertz frequencies.
Abstract: The development of classical and quantum information–processing technology calls for on-chip integrated sources of structured light. Although integrated vortex microlasers have been previously demonstrated, they remain static and possess relatively high lasing thresholds, making them unsuitable for high-speed optical communication and computing. We introduce perovskite-based vortex microlasers and demonstrate their application to ultrafast all-optical switching at room temperature. By exploiting both mode symmetry and far-field properties, we reveal that the vortex beam lasing can be switched to linearly polarized beam lasing, or vice versa, with switching times of 1 to 1.5 picoseconds and energy consumption that is orders of magnitude lower than in previously demonstrated all-optical switching. Our results provide an approach that breaks the long-standing trade-off between low energy consumption and high-speed nanophotonics, introducing vortex microlasers that are switchable at terahertz frequencies.

414 citations

References
Journal ArticleDOI
01 Jan 1998
TL;DR: In this article, a graph transformer network (GTN) is proposed for handwritten character recognition, which can be used to synthesize a complex decision surface that can classify high-dimensional patterns, such as handwritten characters.
Abstract: Multilayer neural networks trained with the back-propagation algorithm constitute the best example of a successful gradient-based learning technique. Given an appropriate network architecture, gradient-based learning algorithms can be used to synthesize a complex decision surface that can classify high-dimensional patterns, such as handwritten characters, with minimal preprocessing. This paper reviews various methods applied to handwritten character recognition and compares them on a standard handwritten digit recognition task. Convolutional neural networks, which are specifically designed to deal with the variability of 2D shapes, are shown to outperform all other techniques. Real-life document recognition systems are composed of multiple modules including field extraction, segmentation, recognition, and language modeling. A new learning paradigm, called graph transformer networks (GTN), allows such multimodule systems to be trained globally using gradient-based methods so as to minimize an overall performance measure. Two systems for online handwriting recognition are described. Experiments demonstrate the advantage of global training, and the flexibility of graph transformer networks. A graph transformer network for reading a bank cheque is also described. It uses convolutional neural network character recognizers combined with global training techniques to provide record accuracy on business and personal cheques. It is deployed commercially and reads several million cheques per day.
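The core operation of the convolutional networks described above is a sliding weighted sum over an image. A minimal "valid" 2-D convolution (really a cross-correlation, as implemented in CNN layers) can be written directly; this is an illustrative sketch, not LeNet itself:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide the kernel over the image and take a weighted sum at each
    position; 'valid' means the kernel never overhangs the border."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A horizontal difference kernel responds at the vertical edge of this
# tiny two-tone image.
img = np.array([[0., 0., 1., 1.],
                [0., 0., 1., 1.],
                [0., 0., 1., 1.]])
edge = conv2d_valid(img, np.array([[-1., 1.]]))
```

Because the same kernel is reused at every position, the layer is robust to translation of the pattern, which is the variability-of-2D-shapes property the abstract highlights.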

42,067 citations

Journal ArticleDOI
TL;DR: This review looks at the unique property combination that characterizes phase-change materials, in particular the contrast between the amorphous and crystalline states, and the origin of the fast crystallization kinetics.
Abstract: Phase-change materials are some of the most promising materials for data-storage applications. They are already used in rewriteable optical data storage and offer great potential as an emerging non-volatile electronic memory. This review looks at the unique property combination that characterizes phase-change materials. The crystalline state often shows an octahedral-like atomic arrangement, frequently accompanied by pronounced lattice distortions and huge vacancy concentrations. This can be attributed to the chemical bonding in phase-change alloys, which is promoted by p-orbitals. From this insight, phase-change alloys with desired properties can be designed. This is demonstrated for the optical properties of phase-change alloys, in particular the contrast between the amorphous and crystalline states. The origin of the fast crystallization kinetics is also discussed.

2,985 citations

Journal ArticleDOI
01 Jul 2017
TL;DR: A new architecture for a fully optical neural network is demonstrated that enables a computational speed enhancement of at least two orders of magnitude and three orders of magnitude in power efficiency over state-of-the-art electronics.
Abstract: Artificial Neural Networks have dramatically improved performance for many machine learning tasks. We demonstrate a new architecture for a fully optical neural network that enables a computational speed enhancement of at least two orders of magnitude and three orders of magnitude in power efficiency over state-of-the-art electronics.
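Coherent optical neural networks of the kind described above commonly realize an arbitrary weight matrix through its singular value decomposition, W = U·S·Vh, since the unitary factors map onto meshes of interferometers and S onto per-channel attenuators/amplifiers. The snippet below only checks that factorization numerically; it is a hedged sketch, not an optical simulation:

```python
import numpy as np

# Random weight matrix and input vector for the demonstration.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
x = rng.standard_normal(4)

# SVD: two unitary (interferometer-mesh) stages around a diagonal
# (per-channel gain) stage.
U, s, Vh = np.linalg.svd(W)

y_direct = W @ x
y_optical = U @ (s * (Vh @ x))   # mesh, amplitude stage, mesh
```

The attraction of this layout is that the matrix multiply happens as light propagates through passive components, so its energy cost is largely independent of the matrix size.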

1,955 citations

Posted Content
TL;DR: In this paper, a 9-layered locally connected sparse autoencoder with pooling and local contrast normalization was used to train a face detector without having to label images as containing a face or not.
Abstract: We consider the problem of building high-level, class-specific feature detectors from only unlabeled data. For example, is it possible to learn a face detector using only unlabeled images? To answer this, we train a 9-layered locally connected sparse autoencoder with pooling and local contrast normalization on a large dataset of images (the model has 1 billion connections, the dataset has 10 million 200x200 pixel images downloaded from the Internet). We train this network using model parallelism and asynchronous SGD on a cluster with 1,000 machines (16,000 cores) for three days. Contrary to what appears to be a widely-held intuition, our experimental results reveal that it is possible to train a face detector without having to label images as containing a face or not. Control experiments show that this feature detector is robust not only to translation but also to scaling and out-of-plane rotation. We also find that the same network is sensitive to other high-level concepts such as cat faces and human bodies. Starting with these learned features, we trained our network to obtain 15.8% accuracy in recognizing 20,000 object categories from ImageNet, a leap of 70% relative improvement over the previous state-of-the-art.

1,796 citations

Journal ArticleDOI
TL;DR: Spike timing-dependent modifications, together with selective spread of synaptic changes, provide a set of cellular mechanisms that are likely to be important for the development and functioning of neural networks.
Abstract: Correlated spiking of pre- and postsynaptic neurons can result in strengthening or weakening of synapses, depending on the temporal order of spiking. Recent findings indicate that there are narrow and cell type-specific temporal windows for such synaptic modification and that the generally accepted input- (or synapse-) specific rule for modification appears not to be strictly adhered to. Spike timing-dependent modifications, together with selective spread of synaptic changes, provide a set of cellular mechanisms that are likely to be important for the development and functioning of neural networks. "When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased." Donald Hebb (1949)
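The temporal windows described above are often summarized by the classic pairwise STDP curve: a presynaptic spike shortly before a postsynaptic spike strengthens the synapse, and the reverse order weakens it. The function below is a hedged illustration with invented amplitudes and time constant, not the paper's measured fit:

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pairwise STDP weight change for a spike-time difference
    dt = t_post - t_pre (in ms).  dt > 0 (pre leads post) gives
    potentiation; dt < 0 gives depression; both decay exponentially
    as |dt| grows beyond the time constant tau."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # long-term potentiation
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)   # long-term depression
    return 0.0

ltp = stdp_dw(+10.0)   # pre leads post -> positive weight change
ltd = stdp_dw(-10.0)   # post leads pre -> negative weight change
```

The slight asymmetry (a_minus > a_plus) is a common modeling choice that keeps uncorrelated activity from driving all weights to their maximum.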

1,435 citations