
Showing papers in "IEEE Communications Surveys & Tutorials" in 2021


Journal ArticleDOI
TL;DR: In this article, the authors present a survey of the state of the art in satellite communications, while highlighting the most promising open research topics, such as new constellation types, on-board processing capabilities, non-terrestrial networks and space-based data collection/processing.
Abstract: Satellite communications (SatComs) have recently entered a period of renewed interest motivated by technological advances and nurtured through private investment and ventures. The present survey aims at capturing the state of the art in SatComs, while highlighting the most promising open research topics. Firstly, the main innovation drivers are motivated, such as new constellation types, on-board processing capabilities, non-terrestrial networks and space-based data collection/processing. Secondly, the most promising applications are described, i.e., 5G integration, space communications, Earth observation, aeronautical and maritime tracking and communication. Subsequently, an in-depth literature review is provided across five axes: i) system aspects, ii) air interface, iii) medium access, iv) networking, v) testbeds & prototyping. Finally, a number of future challenges and the respective open research topics are described.

475 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an in-depth tutorial of the 3GPP Release 16 5G NR V2X standard for vehicular communications, with a particular focus on the sidelink.
Abstract: The Third Generation Partnership Project (3GPP) has recently published its Release 16, which includes the first Vehicle-to-Everything (V2X) standard based on the 5G New Radio (NR) air interface. 5G NR V2X introduces advanced functionalities on top of the 5G NR air interface to support connected and automated driving use cases with stringent requirements. This article presents an in-depth tutorial of the 3GPP Release 16 5G NR V2X standard for V2X communications, with a particular focus on the sidelink, since it is the most significant part of 5G NR V2X. The main part of the paper is an in-depth treatment of the key aspects of 5G NR V2X: the physical layer, the resource allocation, the quality of service management, the enhancements introduced to the Uu interface and the mobility management for V2N (Vehicle-to-Network) communications, as well as the co-existence mechanisms between 5G NR V2X and LTE V2X. We also review the use cases and the system architecture, and describe the evaluation methodology and simulation assumptions for 5G NR V2X. Finally, we provide an outlook on possible 5G NR V2X enhancements, including those identified within Release 17.

350 citations


Journal ArticleDOI
TL;DR: A comprehensive overview of the state-of-the-art on RISs, with focus on their operating principles, performance evaluation, beamforming design and resource management, applications of machine learning to RIS-enhanced wireless networks, as well as the integration of RISs with other emerging technologies.
Abstract: Reconfigurable intelligent surfaces (RISs), also known as intelligent reflecting surfaces (IRSs) or large intelligent surfaces (LISs), have received significant attention for their potential to enhance the capacity and coverage of wireless networks by smartly reconfiguring the wireless propagation environment. Therefore, RISs are considered a promising technology for the sixth generation (6G) of communication networks. In this context, we provide a comprehensive overview of the state-of-the-art on RISs, with a focus on their operating principles, performance evaluation, beamforming design and resource management, applications of machine learning to RIS-enhanced wireless networks, as well as the integration of RISs with other emerging technologies. We describe the basic principles of RISs both from physics and communications perspectives, based on which we present performance evaluation of multiantenna assisted RIS systems. In addition, we systematically survey existing designs for RIS-enhanced wireless networks encompassing performance analysis, information theory, and performance optimization perspectives. Furthermore, we survey existing research contributions that apply machine learning for tackling challenges in dynamic scenarios, such as random fluctuations of wireless channels and user mobility in RIS-enhanced wireless networks. Last but not least, we identify major issues and research opportunities associated with the integration of RISs and other emerging technologies for applications to next-generation networks. (Footnote: without loss of generality, we use the name RIS in the remainder of this paper.)

343 citations
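The reconfiguration principle behind RISs, choosing reflection phases so the cascaded base-station-to-RIS-to-user paths add coherently at the receiver, can be sketched numerically. This is an idealized single-antenna model with continuous phase shifts, not the design of any surveyed paper; the element count and random channels are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
M = 64  # number of RIS elements (illustrative)

# Random complex channels: base station -> RIS (h) and RIS -> user (g).
h = (rng.normal(size=M) + 1j * rng.normal(size=M)) / np.sqrt(2)
g = (rng.normal(size=M) + 1j * rng.normal(size=M)) / np.sqrt(2)

# Uncontrolled surface: arbitrary reflection phases, paths add incoherently.
random_phases = rng.uniform(0, 2 * np.pi, M)
gain_random = np.abs(np.sum(h * np.exp(1j * random_phases) * g)) ** 2

# Optimal continuous phase shifts cancel each cascaded path's phase,
# so all M reflected paths add coherently at the user:
# gain becomes (sum of |h_m| * |g_m|)^2, the coherent upper bound.
theta = -np.angle(h * g)
gain_ris = np.abs(np.sum(h * np.exp(1j * theta) * g)) ** 2

print(f"incoherent gain: {gain_random:.1f}, coherent gain: {gain_ris:.1f}")
```

The coherent gain grows quadratically with the number of elements M, which is the usual argument for why large passive surfaces can be competitive with active relays.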


Journal ArticleDOI
TL;DR: A comprehensive survey on RA in HetNets for 5G communications is provided, along with two potential structures for 6G communications: a learning-based RA structure and a control-based RA structure.
Abstract: In the fifth-generation (5G) mobile communication system, various service requirements of different communication environments are expected to be satisfied. As a new evolution of network structure, the heterogeneous network (HetNet) has been studied in recent years. Compared with homogeneous networks, HetNets can increase the opportunity for spatial resource reuse and improve users’ quality of service by deploying small cells within the coverage of macrocells. However, since there is mutual interference among different users and limited spectrum resources in HetNets, efficient resource allocation (RA) algorithms are vitally important for reducing the mutual interference and achieving spectrum sharing. In this article, we provide a comprehensive survey on RA in HetNets for 5G communications. Specifically, we first introduce the definition and different network scenarios of HetNets. Second, RA models are discussed. Then, we present a classification to analyze current RA algorithms in the existing works. Finally, some challenging issues and future research trends are discussed. Accordingly, we provide two potential structures for 6G communications to solve the RA problems of next-generation HetNets: a learning-based RA structure and a control-based RA structure. The goal of this article is to provide important information on HetNets, which could be used to guide the development of more efficient techniques in this research area.

321 citations


Journal ArticleDOI
TL;DR: In this paper, a comprehensive survey of the emerging applications of federated learning in IoT networks is provided, which explores and analyzes the potential of FL for enabling a wide range of IoT services, including IoT data sharing, data offloading and caching, attack detection, localization, mobile crowdsensing and IoT privacy and security.
Abstract: The Internet of Things (IoT) is penetrating many facets of our daily life with the proliferation of intelligent services and applications empowered by artificial intelligence (AI). Traditionally, AI techniques require centralized data collection and processing that may not be feasible in realistic application scenarios due to the massive scale of modern IoT networks and growing data privacy concerns. Federated Learning (FL) has emerged as a distributed collaborative AI approach that can enable many intelligent IoT applications, by allowing AI training at distributed IoT devices without the need for data sharing. In this article, we provide a comprehensive survey of the emerging applications of FL in IoT networks, beginning from an introduction to the recent advances in FL and IoT to a discussion of their integration. Particularly, we explore and analyze the potential of FL for enabling a wide range of IoT services, including IoT data sharing, data offloading and caching, attack detection, localization, mobile crowdsensing, and IoT privacy and security. We then provide an extensive survey of the use of FL in various key IoT applications such as smart healthcare, smart transportation, Unmanned Aerial Vehicles (UAVs), smart cities, and smart industry. The important lessons learned from this review of the FL-IoT services and applications are also highlighted. We complete this survey by highlighting the current challenges and possible directions for future research in this booming area.

319 citations
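The FL mechanism the survey builds on, local training on private data followed by server-side aggregation of model updates only, can be sketched with a toy linear model. This follows the common FedAvg pattern rather than the specific algorithm of any surveyed work; the data, learning rate, and round counts are illustrative:

```python
import numpy as np

# Toy setup: five IoT devices each hold private samples of the same
# linear task y = 3*x + noise; raw data never leaves a device.
rng = np.random.default_rng(0)
devices = []
for _ in range(5):
    x = rng.uniform(-1, 1, 50)
    y = 3.0 * x + rng.normal(0, 0.1, 50)
    devices.append((x, y))

def local_update(w, x, y, lr=0.1, epochs=20):
    """One round of local gradient descent on a device's private data."""
    for _ in range(epochs):
        grad = 2 * np.mean((w * x - y) * x)  # d/dw of the local MSE
        w -= lr * grad
    return w

w_global = 0.0
for _ in range(10):
    # Each device trains locally; only the resulting weight is sent back.
    local_weights = [local_update(w_global, x, y) for x, y in devices]
    # The server aggregates by (here, unweighted) averaging: FedAvg.
    w_global = float(np.mean(local_weights))

print(round(w_global, 2))  # converges near the true slope 3.0
```

In practice the devices would hold non-IID data and the server would weight each update by the device's sample count, but the communication pattern (weights up, averaged model down) is the same.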


Journal ArticleDOI
TL;DR: The landscape of MAR through the past and its future prospects with respect to the 5G systems and complementary technology MEC are discussed and an informative analysis of the network formation of current and future MAR systems in terms of cloud, edge, localized, and hybrid architectural options is provided.
Abstract: Augmented Reality (AR) technology enhances the human perception of the world by combining the real environment with the virtual space. With the explosive growth of powerful, less expensive mobile devices and the emergence of sophisticated communication infrastructure, Mobile Augmented Reality (MAR) applications are gaining increasing popularity. MAR allows users to run AR applications on mobile devices with greater mobility and at a lower cost. The emerging 5G communication technologies act as critical enablers for future MAR applications to achieve ultra-low latency and extremely high data rates, while Multi-access Edge Computing (MEC) brings enhanced computational power closer to the users to complement MAR. This paper extensively discusses the past landscape of MAR and its future prospects with respect to 5G systems and the complementary MEC technology. The paper especially provides an informative analysis of the network formation of current and future MAR systems in terms of cloud, edge, localized, and hybrid architectural options. The paper discusses key application areas for MAR and their future with the advent of 5G technologies. The paper also discusses the requirements and limitations of MAR technical aspects such as communication, mobility management, energy management, service offloading and migration, security, and privacy, and analyzes the role of 5G technologies.

259 citations


Journal ArticleDOI
TL;DR: This survey provides a comprehensive tutorial on federated learning and its associated concepts, technologies and learning approaches, and designs a three-level classification scheme that first categorizes the federated learning literature based on the high-level challenge that they tackle, and classifies each high-level challenge into a set of specific low-level challenges to foster a better understanding of the topic.
Abstract: The communication and networking field is hungry for machine learning decision-making solutions to replace the traditional model-driven approaches, which have proved too limited to capture the ever-growing complexity and heterogeneity of modern systems in the field. Traditional machine learning solutions assume the existence of (cloud-based) central entities that are in charge of processing the data. Nonetheless, the difficulty of accessing private data, together with the high cost of transmitting raw data to the central entity, gave rise to a decentralized machine learning approach called Federated Learning. The main idea of federated learning is to perform an on-device collaborative training of a single machine learning model without having to share the raw training data with any third-party entity. Although a few survey articles on federated learning already exist in the literature, the motivation of this survey stems from three essential observations. The first one is the lack of a fine-grained multi-level classification of the federated learning literature, where the existing surveys base their classification on only one criterion or aspect. The second observation is that the existing surveys focus only on some common challenges, but disregard other essential aspects such as reliable client selection, resource management and training service pricing. The third observation is the lack of explicit and straightforward directives for researchers to help them design future federated learning solutions that overcome the state-of-the-art research gaps. To address these points, we first provide a comprehensive tutorial on federated learning and its associated concepts, technologies and learning approaches. We then survey and highlight the applications and future directions of federated learning in the domain of communication and networking.
Thereafter, we design a three-level classification scheme that first categorizes the federated learning literature based on the high-level challenge that they tackle. Then, we classify each high-level challenge into a set of specific low-level challenges to foster a better understanding of the topic. Finally, we provide, within each low-level challenge, a fine-grained classification based on the technique used to address this particular challenge. For each category of high-level challenges, we provide a set of desirable criteria and future research directions that are aimed to help the research community design innovative and efficient future solutions. To the best of our knowledge, our survey is the most comprehensive in terms of challenges and techniques it covers and the most fine-grained in terms of the multi-level classification scheme it presents.

252 citations


Journal ArticleDOI
TL;DR: The struggles of designing a family of polar codes able to satisfy the demands of 5G systems are illustrated, with particular attention to rate flexibility and low decoding latency.
Abstract: Polar codes have attracted the attention of academia and industry alike in the past decade, such that the fifth-generation (5G) wireless systems standardization process of the 3rd Generation Partnership Project (3GPP) chose polar codes as a channel coding scheme. In this tutorial, we provide a description of the encoding process of polar codes adopted by the 5G standard. We illustrate the struggles of designing a family of polar codes able to satisfy the demands of 5G systems, with particular attention to rate flexibility and low decoding latency. The result of these efforts is an elaborate framework that applies novel coding techniques to provide a solid channel code for NR requirements.

197 citations
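The basic encoding step the tutorial builds on can be illustrated in a few lines: the generator matrix of a length-N polar code is the n-th Kronecker power of the 2x2 kernel, information bits occupy the most reliable synthetic channels, and the remaining (frozen) positions are set to zero. The information positions below are a common heuristic for N = 8, not the actual 5G reliability sequence, which the standard fixes separately:

```python
import numpy as np

def polar_transform(n):
    """Generator matrix G_N = n-fold Kronecker power of the kernel
    F = [[1, 0], [1, 1]], giving block length N = 2**n (GF(2) arithmetic)."""
    F = np.array([[1, 0], [1, 1]], dtype=np.uint8)
    G = np.array([[1]], dtype=np.uint8)
    for _ in range(n):
        G = np.kron(G, F)
    return G

def polar_encode(info_bits, info_positions, N):
    """Place information bits on the reliable positions, freeze the rest
    to zero, then multiply by G_N modulo 2."""
    u = np.zeros(N, dtype=np.uint8)
    u[info_positions] = info_bits
    return u @ polar_transform(int(np.log2(N))) % 2

# Example: N = 8, rate 1/2; info positions from a simple reliability
# heuristic (NOT the 5G sequence).
codeword = polar_encode([1, 0, 1, 1], info_positions=[3, 5, 6, 7], N=8)
print(codeword.tolist())
```

The 5G framework described in the paper adds rate matching, sub-channel selection from a standardized reliability sequence, and CRC-aided list decoding on top of this basic transform.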


Journal ArticleDOI
TL;DR: This survey presents a comprehensive analysis of the exploitation of network slicing in IoT realisation and discusses the role of other emerging technologies and concepts, such as blockchain and Artificial Intelligence/Machine Learning (AI/ML) in network slicing and IoT integration.
Abstract: The Internet of Things (IoT) is an emerging technology that makes people’s lives smarter by spanning a plethora of diverse application and service areas. In the near future, fifth-generation (5G) wireless networks will provide the connectivity for this IoT ecosystem; they have been carefully designed to facilitate the exponential growth in the IoT field. Network slicing is one of the key technologies in the 5G architecture that has the ability to divide the physical network into multiple logical networks (i.e., slices) with different network characteristics. Therefore, network slicing is also a key enabler of the realisation of IoT in 5G. Network slicing can satisfy the various networking demands of heterogeneous IoT applications via dedicated slices. In this survey, we present a comprehensive analysis of the exploitation of network slicing in IoT realisation. We discuss network slicing utilisation in different IoT application scenarios, along with the technical challenges that can be solved via network slicing. Furthermore, integration challenges and open research problems related to network slicing in the IoT realisation are also discussed in this paper. Finally, we discuss the role of other emerging technologies and concepts, such as blockchain and Artificial Intelligence/Machine Learning (AI/ML), in network slicing and IoT integration.

173 citations


Journal ArticleDOI
TL;DR: This article discusses the challenges in existing LTE for supporting V2X communications, such as physical layer structure, synchronization, multimedia broadcast multicast services (MBMS), resource allocation, and security, and surveys recent solutions to these challenges.
Abstract: A wide variety of works have been conducted in vehicle-to-everything (V2X) communications to enable a variety of applications for road safety, traffic efficiency and passenger infotainment. Although dedicated short-range communications (DSRC) based V2X is already in the deployment phase, cellular based V2X is gaining more interest in academia and industry most recently. This article surveys the existing work and challenges on LTE and 5G to support efficient V2X communications. First, we present the motivations for cellular based V2X communications. Second, we summarize the LTE V2X architecture and operating scenarios being considered. Third, we discuss the challenges in existing LTE for supporting V2X communications such as physical layer structure, synchronization, multimedia broadcast multicast services (MBMS), resource allocation, security and survey the recent solutions to these challenges. We further discuss the challenges and possible solutions for 5G based vehicular communications. Finally, we discuss the open research issues and possible research directions in cellular based vehicular communications.

159 citations


Journal ArticleDOI
TL;DR: In this paper, a taxonomy of federated learning over IoT networks is presented, where a set of metrics such as sparsification, robustness, quantization, scalability, security, and privacy are evaluated.
Abstract: The Internet of Things (IoT) will be ripe for the deployment of novel machine learning algorithms for both network and application management. However, given the presence of massively distributed and private datasets, it is challenging to use classical centralized learning algorithms in the IoT. To overcome this challenge, federated learning can be a promising solution that enables on-device machine learning without the need to migrate the private end-user data to a central cloud. In federated learning, only learning model updates are transferred between end-devices and the aggregation server. Although federated learning can offer better privacy preservation than centralized machine learning, it still raises privacy concerns. In this paper, first, we present the recent advances of federated learning towards enabling federated learning-powered IoT applications. A set of metrics, such as sparsification, robustness, quantization, scalability, security, and privacy, is delineated in order to rigorously evaluate the recent advances. Second, we devise a taxonomy for federated learning over IoT networks. Finally, we present several open research challenges with their possible solutions.

Journal ArticleDOI
TL;DR: A classification framework with three dimensions, including AUV performance, formation control, and communication capability is proposed for AUV formation research and can be used to compare different methods and help engineers choose suitable formation methods for various applications.
Abstract: Autonomous underwater vehicles (AUVs) are submersible underwater vehicles controlled by onboard computers. AUV formation is a cooperative control approach that focuses on controlling multiple AUVs to move in a group while executing tasks. In contrast to a single AUV, multi-AUV formation offers higher efficiency and better stability for many applications, such as oil and gas industries, hydrographic surveys, and military missions. Achieving better formation depends on several key factors, including AUV performance, formation control, and communication capability. However, most studies in the field of AUV formation mainly focus on formation control methods. We observe that research on the communication capability and AUV performance of multi-AUV formation is still at an early stage. A comprehensive survey of the state of the art of AUV formation research and development is therefore beneficial to researchers. In this paper, we study AUVs, formation control, and underwater acoustic communication capability in detail. We propose a classification framework with three dimensions: AUV performance, formation control, and communication capability. This framework provides a comprehensive classification method for future AUV formation research. It also can be used to compare different methods and help engineers choose suitable formation methods for various applications. Moreover, our survey discusses formation architecture with communication constraints, and we identify some common misconceptions and questionable research for formation control related to communication.

Journal ArticleDOI
TL;DR: This paper describes ARAN architecture and its fundamental features for the development of 6G networks, and introduces technologies that enable the success of ARAN implementations in terms of energy replenishment, operational management, and data delivery.
Abstract: Current access infrastructures are characterized by heterogeneity, low latency, high throughput, and high computational capability, enabling massive concurrent connections and various services. Unfortunately, this design does not pay significant attention to mobile services in underserved areas. In this context, the use of aerial radio access networks (ARANs) is a promising strategy to complement existing terrestrial communication systems. Involving airborne components such as unmanned aerial vehicles, drones, and satellites, ARANs can quickly establish a flexible access infrastructure on demand. ARANs are expected to support the development of seamless mobile communication systems toward a comprehensive sixth-generation (6G) global access infrastructure. This paper provides an overview of recent studies regarding ARANs in the literature. First, we investigate related work to identify areas for further exploration in terms of recent knowledge advancements and analyses. Second, we define the scope and methodology of this study. Then, we describe ARAN architecture and its fundamental features for the development of 6G networks. In particular, we analyze the system model from several perspectives, including transmission propagation, energy consumption, communication latency, and network mobility. Furthermore, we introduce technologies that enable the success of ARAN implementations in terms of energy replenishment, operational management, and data delivery. Subsequently, we discuss application scenarios envisioned for these technologies. Finally, we highlight ongoing research efforts and trends toward 6G ARANs.

Journal ArticleDOI
TL;DR: A thorough investigation of the identification and the analysis of threat vectors in the ETSI standardized MEC architecture is introduced and the vulnerabilities leading to the identified threat vectors are analyzed and potential security solutions to overcome these vulnerabilities are proposed.
Abstract: The European Telecommunications Standards Institute (ETSI) has introduced the paradigm of Multi-Access Edge Computing (MEC) to enable efficient and fast data processing in mobile networks. Among other technological requirements, security and privacy are significant factors in the realization of MEC deployments. In this paper, we analyse the security and privacy of the MEC system. We thoroughly identify and analyse threat vectors in the ETSI standardized MEC architecture. Furthermore, we analyse the vulnerabilities leading to the identified threat vectors and propose potential security solutions to overcome these vulnerabilities. The privacy issues of MEC are also highlighted, and clear objectives for preserving privacy are defined. Finally, we present future directions to enhance the security and privacy of MEC services.

Journal ArticleDOI
TL;DR: This tutorial focuses on the role of DRL with an emphasis on deep Multi-Agent Reinforcement Learning (MARL) for AI-enabled wireless networks, and provides a selective description of RL algorithms such as Model-Based RL (MBRL) and cooperative MARL and highlights their potential applications in future wireless networks.
Abstract: Deep Reinforcement Learning (DRL) has recently witnessed significant advances that have led to multiple successes in solving sequential decision-making problems in various domains, particularly in wireless communications. The next generation of wireless networks is expected to provide scalable, low-latency, ultra-reliable services empowered by the application of data-driven Artificial Intelligence (AI). The key enabling technologies of future wireless networks, such as intelligent meta-surfaces, aerial networks, and AI at the edge, involve more than one agent which motivates the importance of multi-agent learning techniques. Furthermore, cooperation is central to establishing self-organizing, self-sustaining, and decentralized networks. In this context, this tutorial focuses on the role of DRL with an emphasis on deep Multi-Agent Reinforcement Learning (MARL) for AI-enabled wireless networks. The first part of this paper will present a clear overview of the mathematical frameworks for single-agent RL and MARL. The main idea of this work is to motivate the application of RL beyond the model-free perspective which was extensively adopted in recent years. Thus, we provide a selective description of RL algorithms such as Model-Based RL (MBRL) and cooperative MARL and we highlight their potential applications in future wireless networks. Finally, we overview the state-of-the-art of MARL in fields such as Mobile Edge Computing (MEC), Unmanned Aerial Vehicles (UAV) networks, and cell-free massive MIMO, and identify promising future research directions. We expect this tutorial to stimulate more research endeavors to build scalable and decentralized systems based on MARL.

Journal ArticleDOI
TL;DR: The state of the art of V-VLC is studied, the characteristics of the VLC channel are examined, open research questions are identified, and the V-VLC research community as a whole is introduced.
Abstract: Visible Light Communications (VLC) is becoming a mature communication technology, particularly for indoor usage. The application in outdoor environments is especially interesting in the scope of Vehicular VLC (V-VLC); however, some critical challenges remain. In general, VLC is a good complement to Radio Frequency (RF)-based communication. For automotive use cases, V-VLC can benefit from the huge available spectrum and the readily available Light Emitting Diode (LED)-based lighting systems of modern cars. Its Line Of Sight (LOS) characteristics, the directionality of the light, and the smaller collision domain substantially reduce interference. In this survey article, we study the state of the art of V-VLC and identify open issues and challenges. We study the V-VLC communication system as a whole and also dig into the characteristics of the VLC channel. For the beginner in the field, this review acts as a guide to the most relevant literature to quickly catch up with current trends and achievements. For the expert, we identify open research questions and also introduce the V-VLC research community as a whole.

Journal ArticleDOI
TL;DR: A vision and framework for the HAPS networks of the future, supported by a comprehensive and state-of-the-art literature review, are provided; the unrealized potential of HAPS systems is highlighted, with elaboration on their unique ability to serve metropolitan areas.
Abstract: A High Altitude Platform Station (HAPS) is a network node that operates in the stratosphere at an altitude of around 20 km and is instrumental for providing communication services. Precipitated by technological innovations in the areas of autonomous avionics, array antennas, solar panel efficiency levels, and battery energy densities, and fueled by flourishing industry ecosystems, the HAPS has emerged as an indispensable component of the next generations of wireless networks. In this article, we provide a vision and framework for the HAPS networks of the future supported by a comprehensive and state-of-the-art literature review. We highlight the unrealized potential of HAPS systems and elaborate on their unique ability to serve metropolitan areas. The latest advancements and promising technologies in the HAPS energy and payload systems are discussed. The integration of the emerging Reconfigurable Smart Surface (RSS) technology in the communications payload of HAPS systems for providing a cost-effective deployment is proposed. A detailed overview of the radio resource management in HAPS systems is presented along with synergistic physical layer techniques, including Faster-Than-Nyquist (FTN) signaling. Numerous aspects of handoff management in HAPS systems are described. The notable contributions of Artificial Intelligence (AI) in HAPS, including machine learning in the design, topology management, handoff, and resource allocation aspects are emphasized. The extensive overview of the literature we provide is crucial for substantiating our vision that depicts the expected deployment opportunities and challenges in the next 10 years (next-generation networks), as well as in the subsequent 10 years (next-next-generation networks).

Journal ArticleDOI
TL;DR: The IoUT, BMD, and their synthesis are comprehensively surveyed to inspire researchers, engineers, data scientists, and governmental bodies to further progress the field, to develop new tools and techniques, as well as to make informed decisions and set regulations related to the maritime and underwater environments around the world.
Abstract: The Internet of Underwater Things (IoUT) is an emerging communication ecosystem developed for connecting underwater objects in maritime and underwater environments. The IoUT technology is intricately linked with intelligent boats and ships, smart shores and oceans, automatic marine transportations, positioning and navigation, underwater exploration, disaster prediction and prevention, as well as with intelligent monitoring and security. The IoUT has an influence at various scales, ranging from small scientific observatories to mid-sized harbors to global oceanic trade. The network architecture of IoUT is intrinsically heterogeneous and should be sufficiently resilient to operate in harsh environments. This creates major challenges in terms of underwater communications, whilst relying on limited energy resources. Additionally, the volume, velocity, and variety of data produced by sensors, hydrophones, and cameras in IoUT is enormous, giving rise to the concept of Big Marine Data (BMD), which has its own processing challenges. Hence, conventional data processing techniques will falter, and bespoke Machine Learning (ML) solutions have to be employed for automatically learning the specific BMD behavior and features facilitating knowledge extraction and decision support. The motivation of this article is to comprehensively survey the IoUT, BMD, and their synthesis. It also aims for exploring the nexus of BMD with ML. We set out from underwater data collection and then discuss the family of IoUT data communication techniques with an emphasis on the state-of-the-art research challenges. We then review the suite of ML solutions suitable for BMD handling and analytics. We treat the subject deductively from an educational perspective, critically appraising the material surveyed.
Accordingly, the reader will become familiar with the pivotal issues of IoUT and BMD processing, whilst gaining an insight into the state-of-the-art applications, tools, and techniques. Finally, we analyze the architectural challenges of the IoUT, followed by proposing a range of promising directions for research and innovation in the broad areas of IoUT and BMD. Our hope is to inspire researchers, engineers, data scientists, and governmental bodies to further progress the field, to develop new tools and techniques, as well as to make informed decisions and set regulations related to the maritime and underwater environments around the world.

Journal ArticleDOI
TL;DR: In this paper, the authors present a survey-style introduction to HLWNets, starting with a framework of system design in the aspects of network architectures, cell deployments, multiple access and modulation schemes, illumination requirements and backhaul.
Abstract: In order to tackle the rapidly growing number of mobile devices and their expanding demands for Internet services, network convergence is envisaged to integrate different technology domains. For indoor wireless communications, one promising approach is to coordinate light fidelity (LiFi) and wireless fidelity (WiFi), namely hybrid LiFi and WiFi networks (HLWNets). This hybrid network combines the high-speed data transmission of LiFi with the ubiquitous coverage of WiFi. In this article, we present a survey-style introduction to HLWNets, starting with a framework of system design covering network architectures, cell deployments, multiple access and modulation schemes, illumination requirements and backhaul. Key performance metrics and recent achievements are then reviewed to demonstrate the superiority of HLWNets over stand-alone networks. Further, the unique challenges facing HLWNets are elaborated on, covering key research topics such as user behavior modeling, interference management, handover and load balancing. Moreover, the potential of HLWNets in several application areas is presented, exemplified by indoor positioning and physical layer security. Finally, the remaining challenges and future research directions are discussed.

Journal ArticleDOI
TL;DR: This paper presents a comprehensive literature review on resource management for JRC, discusses security issues of JRC, and provides countermeasures to these issues.
Abstract: Joint radar and communication (JRC) has recently attracted substantial attention. The first reason is that JRC allows individual radar and communication systems to share spectrum bands, thus improving spectrum utilization. The second reason is that JRC enables a single hardware platform, e.g., an autonomous vehicle or a UAV, to simultaneously perform the communication function and the radar function. As a result, JRC is able to improve the efficiency of resources, i.e., spectrum and energy, reduce the system size, and minimize the system cost. However, there are several challenges to be solved in the JRC design. In particular, sharing the spectrum introduces mutual interference between the systems, while sharing the hardware platform and energy resource complicates the design of the JRC transmitter and compromises the performance of each function. To address these challenges, several resource management approaches have been recently proposed, and this paper presents a comprehensive literature review on resource management for JRC. First, we give the fundamental concepts of JRC, the important performance metrics used in JRC systems, and the applications of JRC systems. Then, we review and analyze resource management approaches, i.e., spectrum sharing, power allocation, and interference management, for JRC. In addition, we present security issues of JRC and discuss countermeasures to these issues. Finally, we highlight important challenges in the JRC design and discuss future research directions related to JRC.

Journal ArticleDOI
TL;DR: In this paper, the authors present a tutorial and a comprehensive survey on Segment Routing (SR) technology, analyzing standardization efforts, patents, research activities and implementation results.
Abstract: Fixed and mobile telecom operators, enterprise network operators and cloud providers strive to face the challenging demands coming from the evolution of IP networks (e.g., huge bandwidth requirements, integration of billions of devices and millions of services in the cloud). Proposed in the early 2010s, the Segment Routing (SR) architecture helps meet these demands, and it is currently being adopted and deployed. The SR architecture is based on the concept of source routing and has interesting scalability properties, as it dramatically reduces the amount of state information that needs to be configured in the core nodes to support complex services. The SR architecture was first implemented with the MPLS dataplane and then, quite recently, with the IPv6 dataplane (SRv6). The IPv6 SR architecture (SRv6) has been extended from the simple steering of packets across nodes to a general network programming approach, making it very suitable for use cases such as Service Function Chaining and Network Function Virtualization. In this article, we present a tutorial and a comprehensive survey on SR technology, analyzing standardization efforts, patents, research activities and implementation results. We start with an introduction to the motivations for Segment Routing and an overview of its evolution and standardization. Then, we provide a tutorial on Segment Routing technology, with a focus on the novel SRv6 solution. We discuss the standardization efforts and the patents, providing details on the most important documents and mentioning other ongoing activities. We then thoroughly analyze research activities according to a taxonomy. We have identified 8 main categories during our analysis of the current state of play: Monitoring, Traffic Engineering, Failure Recovery, Centrally Controlled Architectures, Path Encoding, Network Programming, Performance Evaluation and Miscellaneous. 
We report the current status of SR deployments in production networks and of SR implementations (including several open source projects). Finally, we report our experience from this survey work and we identify a set of future research directions related to Segment Routing.
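The source-routing idea at the heart of SR can be illustrated with a minimal sketch (a hedged toy model, not an SRv6 implementation; the `Packet` class, the segment names and the `endpoint` function are illustrative). The packet carries the whole segment list in its header together with a Segments Left pointer; each segment endpoint simply advances the pointer, so core nodes keep no per-flow state:

```python
from dataclasses import dataclass, field

@dataclass
class Packet:
    payload: str
    segments: list            # segment list stored in reverse order, as in the SRv6 SRH
    segments_left: int = field(init=False)

    def __post_init__(self):
        # Segments Left initially points at the first segment to visit.
        self.segments_left = len(self.segments) - 1

    @property
    def active_segment(self) -> str:
        return self.segments[self.segments_left]

def endpoint(pkt: Packet) -> str:
    """Plain SRv6-style 'End' behavior: advance to the next segment in the list."""
    if pkt.segments_left > 0:
        pkt.segments_left -= 1
    return pkt.active_segment

# Steer a packet through three segments; the whole path lives in the packet
# header, so no state is configured on intermediate nodes.
pkt = Packet("data", ["fcbb::3", "fcbb::2", "fcbb::1"])
visited = [pkt.active_segment]
while pkt.segments_left > 0:
    visited.append(endpoint(pkt))
print(visited)  # ['fcbb::1', 'fcbb::2', 'fcbb::3']
```

In real SRv6, the segment list lives in the Segment Routing Header (SRH) of an IPv6 packet and segments are IPv6 addresses carrying programmable behaviors; the sketch only mirrors the Segments Left processing logic.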

Journal ArticleDOI
TL;DR: This survey overviews standards, with particular emphasis on 5G and virtualization of network functions, and then addresses the flexibility of MEC smart resource deployment and its migration capabilities.
Abstract: The increasing number of heterogeneous devices connected to the Internet, together with tight 5G requirements, has generated new challenges for designing network infrastructures. Industrial verticals such as automotive, smart city and eHealthcare (among others) need secure, low-latency and reliable communications. To meet these stringent requirements, computing resources have to be moved closer to the user, from the core to the edge of the network. In this context, ETSI standardized Multi-Access Edge Computing (MEC). However, due to the cost of resources, MEC provisioning has to be carefully designed and evaluated. This survey first overviews the relevant standards, with particular emphasis on 5G and the virtualization of network functions; it then addresses the flexibility of MEC smart resource deployment and its migration capabilities. The survey also explores how the MEC is used today and how it will enable industrial verticals.

Journal ArticleDOI
TL;DR: In this article, the authors provide a comprehensive survey of various machine learning techniques applied to both the communication and networking parts of vehicular networks, and present several open issues and potential directions that are worthy of research for future intelligent vehicular networks.
Abstract: Towards the future intelligent vehicular network, machine learning, as a promising artificial intelligence tool, is widely researched to bring intelligence to communication and networking functions. In this paper, we provide a comprehensive survey of various machine learning techniques applied to both the communication and networking parts of vehicular networks. To aid the reader, we first give preliminaries on communication technologies and machine learning technologies in vehicular networks. Then, we describe in detail the challenges that conventional techniques face in vehicular networks and the corresponding machine learning based solutions. Finally, we present several open issues and emphasize potential directions that are worthy of research for the future intelligent vehicular network.

Journal ArticleDOI
TL;DR: A comprehensive survey that overviews DRL algorithms, discusses DRL-enabled IoT applications, highlights emerging challenges, and outlines future research directions for driving the further success of DRL in IoT applications.
Abstract: The incumbent Internet of Things suffers from poor scalability and elasticity, exhibited in its communication, computing, caching and control (4Cs) problems. Recent advances in deep reinforcement learning (DRL) algorithms can potentially address these problems of IoT systems. In this context, this paper provides a comprehensive survey that overviews DRL algorithms and discusses DRL-enabled IoT applications. In particular, we first briefly review the state-of-the-art DRL algorithms and present a comprehensive analysis of their advantages and challenges. We then discuss applying DRL algorithms to a wide variety of IoT applications, including smart grid, intelligent transportation systems, industrial IoT applications, mobile crowdsensing, and blockchain-empowered IoT. Meanwhile, the discussion of each IoT application domain is accompanied by an in-depth summary and comparison of DRL algorithms. Moreover, we highlight emerging challenges and outline future research directions for driving the further success of DRL in IoT applications.

Journal ArticleDOI
TL;DR: In this article, the authors present the architecture of edge computing, under which different modes of collaboration for resource scheduling are discussed, and introduce a unified model before summarizing the current works on resource scheduling along three research issues.
Abstract: With the proliferation of the Internet of Things (IoT) and the wide penetration of wireless networks, the surging demand for data communications and computing calls for the emerging edge computing paradigm. By moving the services and functions located in the cloud to the proximity of users, edge computing can provide powerful computing, storage, networking, and communication capacity. Resource scheduling in edge computing, which is key to the success of edge computing systems, has attracted increasing research interest. In this paper, we survey the state-of-the-art research findings to capture the research progress in this field. Specifically, we present the architecture of edge computing, under which different modes of collaboration for resource scheduling are discussed. In particular, we introduce a unified model before summarizing the current works on resource scheduling along three research issues: computation offloading, resource allocation, and resource provisioning. Based on the two modes of operation, i.e., centralized and distributed, different techniques for resource scheduling are discussed and compared. Also, we summarize the main performance indicators based on the surveyed literature. To shed light on the significance of resource scheduling in real-world scenarios, we discuss several typical application scenarios involved in the research of resource scheduling in edge computing. Finally, we highlight some open research challenges yet to be addressed and outline several open issues as future research directions.
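The computation offloading trade-off mentioned above can be illustrated with a textbook latency model (a minimal sketch under simplifying assumptions, not the unified model proposed in the survey; all parameter names are illustrative): offloading pays off only when uploading the input data plus executing on the edge server is faster than executing locally.

```python
def should_offload(cycles: float, local_cps: float,
                   data_bits: float, uplink_bps: float,
                   edge_cps: float) -> bool:
    """Offload a task when edge execution (upload + remote compute)
    beats local execution; downlink time for results is neglected."""
    t_local = cycles / local_cps                          # local execution time
    t_edge = data_bits / uplink_bps + cycles / edge_cps   # upload + edge execution
    return t_edge < t_local

# A 10^9-cycle task on a 1 GHz device (1 s locally) is worth offloading to a
# 10 GHz edge server over a 10 Mbit/s uplink when its input is only 1 Mbit:
print(should_offload(1e9, 1e9, 1e6, 10e6, 10e9))  # 0.1 s + 0.1 s < 1 s -> True
```

The same comparison flips for data-heavy tasks on slow links, which is why offloading decisions depend jointly on task size, input size, link rate, and server capacity.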

Journal ArticleDOI
TL;DR: A comprehensive survey of recent machine learning based network optimization methods for guaranteeing end-to-end QoS and QoE, along with a discussion of open issues and potential future research directions.
Abstract: Guaranteeing end-to-end quality of service (QoS) and quality of experience (QoE) is crucial for network optimization. Current 5G networks and the envisioned future 6G networks, with their ultra-high density, bandwidth, mobility and large scale, create an urgent need for highly efficient end-to-end optimization methods. Conventional network optimization methods, lacking learning and intelligent decision-making abilities, can hardly handle the highly complex and dynamic scenarios of 6G. Recently, machine learning based QoS- and QoE-aware network optimization algorithms have emerged as a hot research area and attracted much attention, being widely acknowledged as a potential solution for end-to-end optimization in 6G. However, there are still many critical issues in employing machine learning in networks, especially in 6G. In this paper, we give a comprehensive survey of recent machine learning based network optimization methods for guaranteeing end-to-end QoS and QoE. To make the survey easy to follow, we introduce the investigated works following the end-to-end transmission flow, from network access and routing to network congestion control and adaptive streaming control. Then we discuss some open issues and potential future research directions.

Journal ArticleDOI
TL;DR: This paper presents a survey on current solutions for the deployment of services in remote/rural areas by exploiting satellites, highlighting that low-orbit satellites offer an efficient solution to support long-range services, with a good trade-off in terms of coverage and latency.
Abstract: The Internet of Things (IoT) is expected to bring new opportunities for improving several services for society, from transportation to agriculture, from smart cities to fleet management. In this framework, massive connectivity represents one of the key issues. This is especially relevant when IoT systems are expected to cover a large geographical area or a region not reached by terrestrial network connections. In such scenarios, the use of satellites might represent a viable solution for providing wide-area coverage and connectivity in a flexible and affordable manner. Our paper presents a survey of current solutions for the deployment of IoT services in remote/rural areas by exploiting satellites. Several architectures and technical solutions are analyzed, underlining their features and limitations, and real test cases are presented. We highlight that low-orbit satellites offer an efficient solution to support long-range IoT services, with a good trade-off in terms of coverage and latency. Moreover, we focus on open issues, new challenges, and innovative technologies, carefully considering the boundaries that the current IoT standardization framework will impose on the practical implementation of future satellite-based IoT systems.

Journal ArticleDOI
TL;DR: This paper presents a detailed survey of features and techniques that can be used in Physical-Layer Authentication (PLA) schemes, and divides the existing PLA schemes into two categories: passive and active schemes.
Abstract: Authentication is an important issue in wireless communications, because the open nature of the wireless medium introduces more security vulnerabilities. Recently, Physical-Layer Authentication (PLA) has attracted much research interest, because it provides information-theoretic security at low complexity. Although many researchers have focused on PLA and exploited its potential for enhancing wireless security, the literature is surprisingly sparse, with no comprehensive overview of the state-of-the-art PLA and the key fundamentals involved. Thus, this article provides a detailed survey of features and techniques that can be used in PLA. We categorize the existing PLA schemes into two categories: passive and active schemes. In the passive schemes, a receiver authenticates the transmitter based on the physical-layer features of the received signals. We further divide the passive schemes into two sub-categories: device-based features and channel-based features. In the active schemes, a transmitter generates a tag based on a secret key and embeds it into a source message; a receiver then authenticates the transmitter based on whether the tag exists in the received signal. We further divide the active schemes into two sub-categories: non-covert schemes and covert schemes. Moreover, we also provide some future research directions.
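The active scheme described above (tag generated from a secret key, embedded by the transmitter, checked by the receiver) can be sketched as follows, with HMAC standing in for the tag-generation function; all function names are illustrative. Note that real active PLA schemes embed the tag at the physical layer, e.g., superimposed on the transmitted waveform at low power, rather than appended to the message as done here.

```python
import hmac
import hashlib

TAG_LEN = 32  # SHA-256 digest length in bytes

def embed_tag(message: bytes, key: bytes) -> bytes:
    """Transmitter: generate a tag from the secret key and embed it
    into the source message (here, simply appended)."""
    tag = hmac.new(key, message, hashlib.sha256).digest()
    return message + tag

def authenticate(signal: bytes, key: bytes) -> bool:
    """Receiver: recompute the expected tag and check whether it
    exists in the received signal."""
    message, tag = signal[:-TAG_LEN], signal[-TAG_LEN:]
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

key = b"shared-secret"
signal = embed_tag(b"source message", key)
print(authenticate(signal, key))           # True: legitimate transmitter accepted
print(authenticate(signal, b"wrong key"))  # False: impostor without the key rejected
```

The covert variants surveyed in the paper additionally hide the tag's very presence from third parties, which this upper-layer sketch does not attempt.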

Journal ArticleDOI
TL;DR: In this article, the authors provide a systematic overview of security and privacy issues based on prospective technologies for 6G in the physical, connection, and service layers, as well as through lessons learned from the failures of existing security architectures and state-of-the-art defenses.
Abstract: Sixth-generation (6G) mobile networks will have to cope with diverse threats on a space-air-ground integrated network environment, novel technologies, and an accessible user information explosion. For now, however, security and privacy issues for 6G remain largely conceptual. This survey provides a systematic overview of security and privacy issues based on prospective technologies for 6G in the physical, connection, and service layers, as well as through lessons learned from the failures of existing security architectures and state-of-the-art defenses. Two key lessons learned are as follows. First, besides inheriting vulnerabilities from the previous generations, 6G faces new threat vectors from new radio technologies, such as the exposed location of radio stripes in ultra-massive MIMO systems at Terahertz bands and attacks against pervasive intelligence. Second, physical layer protection, deep network slicing, quantum-safe communications, artificial intelligence (AI) security, platform-agnostic security, real-time adaptive security, and novel data protection mechanisms such as distributed ledgers and differential privacy are the most promising techniques for substantially mitigating attack magnitude and personal data breaches.

Journal ArticleDOI
TL;DR: In this article, a review of underwater routing protocols for UWSNs is presented, which classifies the existing protocols into three categories: energy-based, data-based, and geographic information-based protocols.
Abstract: The underwater wireless sensor network (UWSN) is currently a hot research field in academia and industry, with many underwater applications such as ocean monitoring, seismic monitoring, environment monitoring, and seabed exploration. However, UWSNs suffer from various limitations and challenges: high ocean interference and noise, high propagation delay, narrow bandwidth, dynamic network topology, and the limited battery energy of sensor nodes. The design of routing protocols is one way to address these issues, since a routing protocol can efficiently transfer data from the source node to the destination node in the network. This article presents a review of underwater routing protocols for UWSNs. We classify the existing underwater routing protocols into three categories: energy-based, data-based, and geographic information-based protocols. We summarize the underwater routing protocols proposed in recent years, describe them in detail, and give their advantages and disadvantages. Meanwhile, the performance of the different underwater routing protocols is analyzed in detail. Finally, we present the research challenges and future directions of underwater routing protocols, which can help researchers better explore this field in the future.