SciSpace (formerly Typeset)
Author

Danish Rafique

Bio: Danish Rafique is an academic researcher from ADVA Optical Networking. The author has contributed to research on topics including quadrature amplitude modulation and transmission (telecommunications). The author has an h-index of 24, has co-authored 113 publications, and has received 2,205 citations. Previous affiliations of Danish Rafique include the Polytechnic University of Catalonia and University College Cork.

Papers published on a yearly basis

Papers
Journal ArticleDOI
TL;DR: This article reports the work on next generation transponders for optical networks carried out within the last few years, highlighting advantages, economics, and complexity.
Abstract: This article reports the work on next generation transponders for optical networks carried out within the last few years. A general architecture supporting super-channels (i.e., optical connections composed of several adjacent subcarriers) and sliceability (i.e., subcarriers grouped in a number of independent super-channels with different destinations) is presented. Several transponder implementations supporting different transmission techniques are considered, highlighting advantages, economics, and complexity. Discussions include electronics, optical components, integration, and programmability. Application use cases are reported.

228 citations

Journal ArticleDOI
TL;DR: This tutorial paper reviews several machine learning concepts tailored to the optical networking industry and discusses algorithm choices, data and model management strategies, and integration into existing network control and management tools.
Abstract: Networks are complex interacting systems involving cloud operations, core and metro transport, and mobile connectivity all the way to video streaming and similar user applications. With localized and highly engineered operational tools, it is typical of these networks to take days to weeks for any changes, upgrades, or service deployments to take effect. Machine learning, a sub-domain of artificial intelligence, is highly suitable for representing complex systems. In this tutorial paper, we review several machine learning concepts tailored to the optical networking industry and discuss algorithm choices, data and model management strategies, and integration into existing network control and management tools. We then describe four networking case studies in detail, covering predictive maintenance, virtual network topology management, capacity optimization, and optical spectral analysis.

201 citations
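One of the paper's four case studies, predictive maintenance, can be illustrated with a minimal hedged sketch: an exponentially weighted moving average (EWMA) of a monitored quantity (here a hypothetical pre-FEC BER series) flags gradual degradation before a hard failure threshold is reached. The smoothing factor, warning level, and readings below are illustrative assumptions, not values from the paper.

```python
# Minimal predictive-maintenance sketch: an exponentially weighted moving
# average (EWMA) over a monitored quantity (e.g. pre-FEC BER) flags drift
# before a hard alarm threshold is crossed. All numbers are illustrative.

def ewma_alarm(samples, alpha=0.2, warn_level=2e-3):
    """Return indices where the EWMA of the samples exceeds warn_level."""
    ewma = samples[0]
    alarms = []
    for i, x in enumerate(samples[1:], start=1):
        ewma = alpha * x + (1 - alpha) * ewma
        if ewma > warn_level:
            alarms.append(i)
    return alarms

# A slowly degrading link: pre-FEC BER creeps upward over time.
readings = [1e-3 + 5e-5 * t for t in range(40)]
print(ewma_alarm(readings)[:3])
```

Because the EWMA lags the raw readings, it crosses the warning level slightly after the raw series does, trading a little latency for robustness to one-off noisy samples.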

Journal ArticleDOI
TL;DR: A digital back-propagation simplification method is investigated to enable computationally-efficient digital nonlinearity compensation for a coherently-detected 112 Gb/s polarization multiplexed quadrature phase shifted keying transmission over a 1,600 km link with no inline compensation.
Abstract: We investigate a digital back-propagation simplification method to enable computationally efficient digital nonlinearity compensation for coherently-detected 112 Gb/s polarization-multiplexed quadrature phase shift keyed transmission over a 1,600 km link (20x80 km) with no inline compensation. Through numerical simulation, we report up to an 80% reduction in the number of back-propagation steps required to perform nonlinear compensation, in comparison to the standard back-propagation algorithm. This method accounts for the correlation between adjacent symbols at a given instant using a weighted-average approach, and optimizes the position of the nonlinear compensator stage to enable practical digital back-propagation.

134 citations
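The core idea, replacing the instantaneous-power nonlinear phase with a weighted average over adjacent samples so that coarser back-propagation steps suffice, can be sketched as follows. This is a single-polarization toy of the nonlinear stage only (the linear dispersion step is omitted), and the weights and effective nonlinear coefficient are illustrative assumptions, not the paper's optimized values.

```python
import cmath

def weighted_nonlinear_stage(samples, gamma_eff, weights=(0.25, 0.5, 0.25)):
    """De-rotate each sample's nonlinear phase using a weighted average of
    adjacent-sample powers (circular indexing) instead of the instantaneous
    power alone; this averaging is what allows coarser step sizes."""
    n = len(samples)
    out = []
    for k in range(n):
        avg_power = sum(w * abs(samples[(k + d) % n]) ** 2
                        for w, d in zip(weights, (-1, 0, 1)))
        out.append(samples[k] * cmath.exp(-1j * gamma_eff * avg_power))
    return out
```

For a constant-modulus input (e.g. unit-power QPSK) the weighted average equals the instantaneous power, so the stage reduces to an ordinary nonlinear phase de-rotation; the averaging only matters once pulse shaping and dispersion spread power across neighboring samples.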

Journal ArticleDOI
TL;DR: This work details the strategies adopted in the European research project IDEALIST to overcome the predicted data plane capacity crunch in optical networks and highlights the novelties stemming from the flex-grid concept.
Abstract: In this work we detail the strategies adopted in the European research project IDEALIST to overcome the predicted data plane capacity crunch in optical networks. In order for core and metropolitan telecommunication systems to be able to catch up with Internet traffic, which keeps growing exponentially, we exploit the elastic optical networks paradigm for its astounding characteristics: flexible bandwidth allocation and reach tailoring through adaptive line rate, modulation formats, and spectral efficiency. We emphasize the novelties stemming from the flex-grid concept and report on the corresponding proposed target network scenarios. Fundamental building blocks, like the bandwidth-variable transponder and complementary node architectures ushering those systems, are detailed focusing on physical layer, monitoring aspects, and node architecture design.

119 citations
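The flex-grid concept can be made concrete with a toy spectrum-assignment routine: the flexible grid divides the spectrum into 12.5 GHz slots, and a bandwidth-variable transponder requests a contiguous run of slots for its super-channel. The first-fit policy and demand values below are illustrative assumptions, not the project's actual allocation algorithm.

```python
import math

SLOT_GHZ = 12.5  # flex-grid slot granularity

def first_fit(occupied, n_slots, demand_ghz):
    """Return the lowest-index contiguous run of free slots covering
    demand_ghz, or None if the link cannot host the super-channel."""
    need = math.ceil(demand_ghz / SLOT_GHZ)
    run = 0
    for i in range(n_slots):
        run = 0 if i in occupied else run + 1
        if run == need:
            return list(range(i - need + 1, i + 1))
    return None

link = {0, 1, 5}                    # slots already in use on this link
print(first_fit(link, 16, 37.5))    # 37.5 GHz -> 3 slots -> [2, 3, 4]
```

The contiguity constraint is what distinguishes elastic allocation from fixed-grid channel assignment: a 37.5 GHz super-channel cannot be split across the free slots on either side of slot 5.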

Journal ArticleDOI
TL;DR: A significant performance constraint is identified, originating from four-wave mixing between signals and amplified spontaneous emission noise which induces a linear increase in the standard deviation of the received field with signal power, and linear dependence on transmission distance.
Abstract: Limitations in the performance of coherent transmission systems employing digital back-propagation due to four-wave mixing impairments are reported for the first time. A significant performance constraint is identified, originating from four-wave mixing between signals and amplified spontaneous emission noise which induces a linear increase in the standard deviation of the received field with signal power, and linear dependence on transmission distance.

116 citations


Cited by
Journal ArticleDOI
TL;DR: This paper gathers recent results regarding the Gaussian-noise model's definition, understanding, relations to other models, validation, limitations, closed-form solutions, and approximations, and, in general, its applications and implications in link analysis and optimization, also within a network environment.
Abstract: Several approximate non-linear fiber propagation models have been proposed over the years. Recent re-consideration and extension of earlier modeling efforts has led to the formalization of the so-called Gaussian-noise (GN) model. The evidence collected so far hints at the GN-model being a relatively simple and, at the same time, sufficiently reliable tool for performance prediction of uncompensated coherent systems, characterized by a favorable accuracy-versus-complexity trade-off. This paper gathers recent results regarding the GN-model's definition, understanding, relations to other models, validation, limitations, closed-form solutions, and approximations, and, in general, its applications and implications in link analysis and optimization, also within a network environment.

618 citations
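A practical consequence of the GN-model worth noting: because nonlinear interference accumulates roughly as the cube of launch power (P_NLI ≈ η·P³), the SNR P/(P_ASE + η·P³) has a closed-form optimum launch power, obtained by setting the derivative to zero. The η and ASE values in the sketch below are placeholders, not figures from the paper.

```python
# GN-model consequence: nonlinear interference grows with the cube of
# launch power, P_NLI = eta * P**3, so SNR = P / (P_ase + eta * P**3)
# is maximized at P_opt = (P_ase / (2 * eta)) ** (1/3), where the NLI
# power equals half the ASE power. eta and P_ase are illustrative.

def snr(p, p_ase, eta):
    return p / (p_ase + eta * p ** 3)

def optimal_launch_power(p_ase, eta):
    return (p_ase / (2 * eta)) ** (1 / 3)

p_ase, eta = 1e-6, 4.0        # watts, 1/W^2 (placeholder values)
p_opt = optimal_launch_power(p_ase, eta)
print(p_opt, snr(p_opt, p_ase, eta))
```

Launching below P_opt leaves the link ASE-limited; launching above it, the cubic NLI term dominates and SNR falls again, which is the familiar "nonlinear Shannon" bell curve.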

Journal ArticleDOI
TL;DR: Significant technological breakthroughs to achieve connectivity goals within 6G include: a network operating at the THz band with much wider spectrum resources, intelligent communication environments that enable a wireless propagation environment with active signal transmission and reception, and pervasive artificial intelligence.
Abstract: 6G and beyond will fulfill the requirements of a fully connected world and provide ubiquitous wireless connectivity for all. Transformative solutions are expected to drive the surge for accommodating a rapidly growing number of intelligent devices and services. Major technological breakthroughs to achieve connectivity goals within 6G include: (i) a network operating at the THz band with much wider spectrum resources, (ii) intelligent communication environments that enable a wireless propagation environment with active signal transmission and reception, (iii) pervasive artificial intelligence, (iv) large-scale network automation, (v) an all-spectrum reconfigurable front-end for dynamic spectrum access, (vi) ambient backscatter communications for energy savings, (vii) the Internet of Space Things enabled by CubeSats and UAVs, and (viii) cell-free massive MIMO communication networks. In this roadmap paper, use cases for these enabling techniques as well as recent advancements on related topics are highlighted, and open problems with possible solutions are discussed, followed by a development timeline outlining the worldwide efforts in the realization of 6G. Going beyond 6G, promising early-stage technologies such as the Internet of NanoThings, the Internet of BioNanoThings, and quantum communications, which are expected to have a far-reaching impact on wireless communications, have also been discussed at length in this paper.

595 citations

Journal ArticleDOI
TL;DR: In this paper, 16 researchers, each a world-leading expert in their respective subfields, contribute a section to this invited review article, summarizing their views on state-of-the-art and future developments in optical communications.
Abstract: Lightwave communications is a necessity for the information age. Optical links provide enormous bandwidth, and the optical fiber is the only medium that can meet modern society's needs for transporting massive amounts of data over long distances. Applications range from global high-capacity networks, which constitute the backbone of the internet, to the massively parallel interconnects that provide data connectivity inside datacenters and supercomputers. Optical communications is a diverse and rapidly changing field, where experts in photonics, communications, electronics, and signal processing work side by side to meet the ever-increasing demands for higher capacity, lower cost, and lower energy consumption, while adapting the system design to novel services and technologies. Due to the interdisciplinary nature of this rich research field, Journal of Optics has invited 16 researchers, each a world-leading expert in their respective subfields, to contribute a section to this invited review article, summarizing their views on state-of-the-art and future developments in optical communications.

477 citations

Journal ArticleDOI
TL;DR: An overview of the application of ML to optical communications and networking is provided, relevant literature is classified and surveyed, and an introductory tutorial on ML is provided for researchers and practitioners interested in this field.
Abstract: Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, machine learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and to enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity faced by optical networks in the last few years. This complexity increase is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) that are enabled by the use of coherent transmission/reception technologies, advanced digital signal processing, and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature on the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude this paper by proposing possible new research directions.

437 citations

Journal ArticleDOI
TL;DR: This paper provides a comprehensive survey on the literature involving machine learning algorithms applied to SDN, from the perspective of traffic classification, routing optimization, quality of service/quality of experience prediction, resource management and security.
Abstract: In recent years, with the rapid development of Internet and mobile communication technologies, the infrastructure, devices, and resources in networking systems have become more complex and heterogeneous. To efficiently organize, manage, maintain, and optimize networking systems, more intelligence needs to be deployed. However, due to the inherently distributed nature of traditional networks, machine learning techniques are hard to apply and deploy to control and operate networks. Software-defined networking (SDN) brings new opportunities to provide intelligence inside networks. The capabilities of SDN (e.g., logically centralized control, a global view of the network, software-based traffic analysis, and dynamic updating of forwarding rules) make it easier to apply machine learning techniques. In this paper, we provide a comprehensive survey of the literature on machine learning algorithms applied to SDN. First, related work and background knowledge are introduced. Then, we present an overview of machine learning algorithms. In addition, we review how machine learning algorithms are applied in the realm of SDN from the perspectives of traffic classification, routing optimization, quality of service/quality of experience prediction, resource management, and security. Finally, challenges and broader perspectives are discussed.

436 citations
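As a toy illustration of one surveyed application, traffic classification, the fragment below runs a k-nearest-neighbour vote over two simple flow features. The features, labels, and flow records are fabricated for illustration only; real SDN classifiers draw on much richer per-flow statistics collected at the controller.

```python
# Toy k-nearest-neighbour traffic classifier over two flow features:
# (mean packet size in bytes, mean inter-arrival time in seconds).
# All flows and labels below are fabricated for illustration.

def classify(flow, training, k=3):
    """Majority vote among the k training flows nearest to `flow`."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(training, key=lambda t: dist(t[0], flow))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

training = [
    ((1400, 0.01), "video"), ((1300, 0.02), "video"), ((1350, 0.015), "video"),
    ((80, 0.5), "dns"), ((90, 0.6), "dns"), ((70, 0.4), "dns"),
]
print(classify((1250, 0.03), training))   # -> video
```

The logically centralized view that SDN provides is what makes gathering such a labeled training set practical in the first place, which is the survey's central argument for pairing ML with SDN.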