Open Access Posted Content

Predicting ultrafast nonlinear dynamics in fibre optics with a recurrent neural network

TL;DR
In this paper, a recurrent neural network with long short-term memory (LSTM) was used to predict the temporal and spectral evolution of higher-order soliton compression and supercontinuum generation.
Abstract
The propagation of ultrashort pulses in optical fibre displays complex nonlinear dynamics that find important applications in fields such as high-power pulse compression and broadband supercontinuum generation. Such nonlinear evolution, however, depends sensitively on both the input pulse and fibre characteristics, and optimizing propagation for application purposes requires extensive numerical simulations based on generalizations of a nonlinear Schrödinger-type equation. This is computationally demanding and creates a severe bottleneck in using numerical techniques to design and optimize experiments in real time. Here, we present a solution to this problem using a machine-learning-based paradigm to predict complex nonlinear propagation in optical fibres with a recurrent neural network, bypassing the need for direct numerical solution of a governing propagation model. Specifically, we show how a recurrent neural network with long short-term memory accurately predicts the temporal and spectral evolution of higher-order soliton compression and supercontinuum generation, solely from a given transform-limited input pulse intensity profile. Comparison with experiments for the case of soliton compression shows remarkable agreement in both temporal and spectral domains. In optics, our results apply readily to the optimization of pulse compression and broadband light sources, and more generally in physics, they open up new perspectives for studies of all nonlinear Schrödinger-type systems, including Bose-Einstein condensates, plasma physics, and hydrodynamics.
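The core idea of the abstract is a recurrent network that, fed an input pulse intensity profile, predicts the evolution step by step, with the LSTM's internal state carrying memory of earlier propagation. The following is only an illustrative sketch of that recurrence, not the authors' actual architecture: the scalar single-unit gates, random weights, and toy Gaussian "pulse" are all hypothetical, chosen to show the step-by-step prediction loop in self-contained form.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """Minimal scalar LSTM cell: one unit per gate, random untrained weights."""
    def __init__(self, seed=0):
        rng = random.Random(seed)
        # (input weight, recurrent weight, bias) per gate: input, forget, output, candidate.
        self.w = {g: (rng.uniform(-0.5, 0.5), rng.uniform(-0.5, 0.5), 0.0)
                  for g in ("i", "f", "o", "c")}

    def step(self, x, h, c):
        wi, ui, bi = self.w["i"]; wf, uf, bf = self.w["f"]
        wo, uo, bo = self.w["o"]; wc, uc, bc = self.w["c"]
        i = sigmoid(wi * x + ui * h + bi)                     # input gate
        f = sigmoid(wf * x + uf * h + bf)                     # forget gate
        o = sigmoid(wo * x + uo * h + bo)                     # output gate
        c_new = f * c + i * math.tanh(wc * x + uc * h + bc)   # cell state update
        h_new = o * math.tanh(c_new)                          # hidden state / prediction
        return h_new, c_new

# Feed a toy "input pulse intensity profile" sample by sample; the hidden
# state at each step stands in for the predicted evolution at that step.
cell = LSTMCell()
pulse = [math.exp(-(t - 5) ** 2) for t in range(11)]  # Gaussian-like profile
h = c = 0.0
evolution = []
for x in pulse:
    h, c = cell.step(x, h, c)
    evolution.append(h)
print(len(evolution))  # one prediction per propagation step
```

In the paper's actual setting the state and predictions would be vectors (full temporal or spectral intensity profiles) and the weights would be trained on ensembles of numerical simulations; this sketch only shows why a recurrence suits distance-stepped propagation data.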


Citations
Journal ArticleDOI

Machine learning and applications in ultrafast photonics

TL;DR: This review highlights a number of specific areas where the promise of machine learning in ultrafast photonics has already been realized, including the design and operation of pulsed lasers, and the characterization and control of ultrafast propagation dynamics.
Journal ArticleDOI

A memristor-based analogue reservoir computing system for real-time and power-efficient signal processing

TL;DR: In this article, a fully analogue reservoir computing system is presented that uses dynamic memristors for the reservoir layer and non-volatile memristors for the readout layer.
Journal ArticleDOI

Fiber laser development enabled by machine learning: review and prospect

TL;DR: In this paper, the authors highlight recent attractive research that adopted machine learning in the fiber laser field, including design and manipulation for on-demand laser output, prediction and control of nonlinear effects, and reconstruction and evaluation of laser properties.
Journal ArticleDOI

Artificial Intelligence in Classical and Quantum Photonics

TL;DR: In this paper, the authors provide the reader with the fundamental notions of machine learning and neural networks, and present the main AI applications in the fields of spectroscopy and chemometrics, computational imaging (CI), wavefront shaping, and quantum optics.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units.
Book

Nonlinear Fiber Optics

TL;DR: The field of nonlinear fiber optics has advanced enough that an entire book is devoted to it; the book has been translated into Chinese, Japanese, and Russian, attesting to the worldwide activity in the field.
Proceedings ArticleDOI

TensorFlow: a system for large-scale machine learning

TL;DR: TensorFlow is a machine-learning system that operates at large scale and in heterogeneous environments, using dataflow graphs to represent computation, shared state, and the operations that mutate that state.
Journal ArticleDOI

Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations

TL;DR: In this article, the authors introduce physics-informed neural networks, which are trained to solve supervised learning tasks while respecting any given laws of physics described by general nonlinear partial differential equations.
Journal ArticleDOI

Backpropagation through time: what it does and how to do it

TL;DR: This paper first reviews basic backpropagation, a simple method now widely used in areas like pattern recognition and fault diagnosis, and then describes extensions of the method to systems other than neural networks, to systems involving simultaneous equations or true recurrent networks, and to other practical issues that arise with it.