Open Access Journal Article (DOI)

Photonic machine learning implementation for signal recovery in optical communications.

TL;DR
A simplified photonic reservoir computing scheme is introduced for classifying severely distorted optical communication signals after extended fibre transmission; it achieves a two-order-of-magnitude improvement in bit-error rate compared with directly classifying the transmitted signal.
Abstract
Machine learning techniques have proven very efficient in assorted classification tasks. Nevertheless, processing time-dependent high-speed signals can turn into an extremely challenging task, especially when these signals have been nonlinearly distorted. Recently, analogue hardware concepts using nonlinear transient responses have been gaining significant interest for fast information processing. Here, we introduce a simplified photonic reservoir computing scheme for data classification of severely distorted optical communication signals after extended fibre transmission. To this end, we convert the direct bit detection process into a pattern recognition problem. Using an experimental implementation of our photonic reservoir computer, we demonstrate a two-orders-of-magnitude improvement in bit-error rate compared to directly classifying the transmitted signal. This improvement corresponds to an extension of the communication range by over 75%. While we do not yet reach full real-time post-processing at telecom rates, we discuss how future designs might close the gap.
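To make the approach concrete, here is a minimal software sketch of the reservoir computing idea: a nonlinearly distorted bit stream is fed into a fixed random recurrent network, and only a linear readout is trained (by ridge regression) to recognise each bit's pattern. This is an illustrative echo-state analogue, not the authors' photonic hardware; the toy channel model and all parameters are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for fibre distortion: inter-symbol interference,
# a saturating nonlinearity, and additive noise.
n_bits = 2000
bits = rng.integers(0, 2, n_bits)
signal = 2.0 * bits - 1.0
distorted = np.convolve(signal, [0.6, 1.0, 0.6], mode="same")
distorted = np.tanh(distorted) + 0.1 * rng.normal(size=n_bits)

# Fixed random reservoir (software analogue of the photonic reservoir).
n_nodes = 100
W_in = rng.uniform(-1.0, 1.0, n_nodes)
W = rng.normal(0.0, 1.0, (n_nodes, n_nodes))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

states = np.zeros((n_bits, n_nodes))
x = np.zeros(n_nodes)
for t in range(n_bits):
    x = np.tanh(W @ x + W_in * distorted[t])     # nonlinear transient response
    states[t] = x

# Linear readout trained by ridge regression: the only trained element.
split = n_bits // 2
A, y = states[:split], signal[:split]
w_out = np.linalg.solve(A.T @ A + 1e-4 * np.eye(n_nodes), A.T @ y)

pred = np.sign(states[split:] @ w_out)
ber = np.mean(pred != signal[split:])
print(f"test bit-error rate: {ber:.4f}")
```

In the paper this role is played by analogue photonic hardware rather than a simulated network, but the division of labour is the same: a fixed nonlinear dynamical system expands the distorted waveform into a high-dimensional transient state, and a simple trained readout turns bit detection into pattern recognition.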



Citations
Journal Article (DOI)

Novel frontier of photonics for data processing—Photonic accelerator

TL;DR: The bottleneck and the paradigm shift of digital computing are reviewed, along with an array of PAXEL architectures and applications, including artificial neural networks, reservoir computing, pass-gate logic, decision making, and compressed sensing.
Journal Article (DOI)

Tutorial: Photonic neural networks in delay systems

TL;DR: The most relevant aspects of Artificial Neural Networks and delay systems are introduced, and the seminal experimental demonstrations of Reservoir Computing in photonic delay systems, as well as the most recent and advanced realizations, are explained.
Journal Article (DOI)

Reservoir Computing Using Multiple Lasers With Feedback on a Photonic Integrated Circuit

TL;DR: It is found that the scheme using multiple lasers outperforms the one using a single laser with multiple delay times, and that large memory capacity can also be obtained with multiple lasers.
Journal Article (DOI)

Photonic neuromorphic information processing and reservoir computing

TL;DR: This paper reviews some of the exciting work that has been going on in this area and then focuses on one particular technology, namely photonic reservoir computing.
References
Proceedings Article

ImageNet Classification with Deep Convolutional Neural Networks

TL;DR: A deep convolutional neural network, consisting of five convolutional layers (some followed by max-pooling layers) and three fully-connected layers with a final 1000-way softmax, achieved state-of-the-art performance on ImageNet classification.
Journal Article (DOI)

Deep learning

TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Posted Content

Neural Machine Translation by Jointly Learning to Align and Translate

TL;DR: In this paper, the authors propose a model that (soft-)searches for the parts of a source sentence relevant to predicting a target word, without having to explicitly form these parts as a hard segment.
Posted Content

Sequence to Sequence Learning with Neural Networks

TL;DR: This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure, and finds that reversing the order of the words in all source sentences improved the LSTM's performance markedly, because doing so introduced many short term dependencies between the source and the target sentence which made the optimization problem easier.
Journal Article (DOI)

Real-time computing without stable states: a new framework for neural computation based on perturbations

TL;DR: A new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks, based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.