Showing papers in "IEEE Communications Surveys and Tutorials in 2020"


Journal Article
TL;DR: The concept of federated learning (FL), as mentioned in this paper, has been proposed to enable the collaborative training of an ML model and also enable DL for mobile edge network optimization in large-scale and complex mobile edge networks, where heterogeneous devices with varying constraints are involved.
Abstract: In recent years, mobile devices have been equipped with increasingly advanced sensing and computing capabilities. Coupled with advancements in Deep Learning (DL), this opens up countless possibilities for meaningful applications, e.g., for medical purposes and in vehicular networks. Traditional cloud-based Machine Learning (ML) approaches require the data to be centralized in a cloud server or data center. However, this results in critical issues related to unacceptable latency and communication inefficiency. To this end, Mobile Edge Computing (MEC) has been proposed to bring intelligence closer to the edge, where data is produced. However, conventional enabling technologies for ML at mobile edge networks still require personal data to be shared with external parties, e.g., edge servers. Recently, in light of increasingly stringent data privacy legislation and growing privacy concerns, the concept of Federated Learning (FL) has been introduced. In FL, end devices use their local data to train an ML model required by the server. The end devices then send the model updates rather than raw data to the server for aggregation. FL can serve as an enabling technology in mobile edge networks since it enables the collaborative training of an ML model and also enables DL for mobile edge network optimization. However, in a large-scale and complex mobile edge network, heterogeneous devices with varying constraints are involved. This raises challenges of communication costs, resource allocation, and privacy and security in the implementation of FL at scale. In this survey, we begin with an introduction to the background and fundamentals of FL. Then, we highlight the aforementioned challenges of FL implementation and review existing solutions. Furthermore, we present the applications of FL for mobile edge network optimization. Finally, we discuss the important challenges and future research directions in FL.
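
To make the aggregation step concrete, below is a minimal sketch of FedAvg-style training in the spirit of the FL workflow described above: clients fit a model on private data and the server averages the resulting weights, never seeing raw data. The logistic-regression client model and all parameters are illustrative assumptions, not the survey's specification.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: plain logistic-regression SGD
    (an illustrative stand-in for any ML model)."""
    w = weights.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # predictions on local data only
        w -= lr * X.T @ (p - y) / len(y)      # gradient step
    return w

def federated_round(global_w, clients):
    """Server side: aggregate model updates, not raw data (FedAvg-style
    weighted average by local dataset size)."""
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    return np.average(local_ws, axis=0, weights=sizes / sizes.sum())

# Toy run: three "devices" with private data.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(50, 4)), rng.integers(0, 2, 50).astype(float))
           for _ in range(3)]
w = np.zeros(4)
for _ in range(10):
    w = federated_round(w, clients)
print(w)
```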

895 citations


Journal Article
TL;DR: A literature review on recent applications and design aspects of the intelligent reflecting surface (IRS) in future wireless networks, and the joint optimization of the IRS’s phase control and the transceivers’ transmission control in different network design problems, e.g., rate maximization and power minimization problems.
Abstract: This paper presents a literature review on recent applications and design aspects of the intelligent reflecting surface (IRS) in future wireless networks. Conventionally, network optimization has been limited to transmission control at the two endpoints, i.e., end users and the network controller. The fading wireless channel is uncontrollable and becomes one of the main limiting factors for performance improvement. The IRS is composed of a large array of scattering elements, which can be individually configured to generate additional phase shifts to the signal reflections. Hence, it can actively control the signal propagation properties in favor of signal reception, and thus realize the notion of a smart radio environment. As such, the IRS’s phase control, combined with the conventional transmission control, can potentially bring performance gain compared to wireless networks without IRS. In this survey, we first introduce basic concepts of the IRS and the realizations of its reconfigurability. Then, we focus on applications of the IRS in wireless communications. We overview different performance metrics and analytical approaches to characterize the performance improvement of IRS-assisted wireless networks. To exploit the performance gain, we discuss the joint optimization of the IRS’s phase control and the transceivers’ transmission control in different network design problems, e.g., rate maximization and power minimization problems. Furthermore, we extend the discussion of IRS-assisted wireless networks to some emerging use cases. Finally, we highlight important practical challenges and future research directions for realizing IRS-assisted wireless networks in beyond 5G communications.
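
The gain from phase control can be illustrated with a textbook single-user model (an assumption for illustration, not this survey's exact formulation): choosing each element's phase shift to co-phase the reflected paths with the direct path makes the N reflected contributions add coherently, so received power grows roughly as N².

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64  # number of IRS elements (assumed)
# Rayleigh-faded channels: direct BS->user, BS->IRS, IRS->user.
h_d = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)
h_r = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
g   = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)

# Effective channel: h_d + sum_n g_n * e^{j*theta_n} * h_r_n.
# Classic closed-form choice: rotate every reflected path onto the direct path.
theta = np.angle(h_d) - np.angle(g * h_r)
h_eff = h_d + np.sum(g * h_r * np.exp(1j * theta))

print("Received power without IRS:", abs(h_d) ** 2)
print("Received power with IRS   :", abs(h_eff) ** 2)  # scales roughly as N^2
```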

642 citations


Journal Article
TL;DR: By consolidating information scattered across the communication, networking, and DL areas, this survey can help readers to understand the connections between enabling technologies while promoting further discussions on the fusion of edge intelligence and intelligent edge, i.e., Edge DL.
Abstract: Ubiquitous sensors and smart devices from factories and communities are generating massive amounts of data, and ever-increasing computing power is driving the core of computation and services from the cloud to the edge of the network. As an important enabler broadly changing people’s lives, from face recognition to ambitious smart factories and cities, developments of artificial intelligence (especially deep learning, DL) based applications and services are thriving. However, due to efficiency and latency issues, the current cloud computing service architecture hinders the vision of “providing artificial intelligence for every person and every organization at everywhere”. Thus, unleashing DL services using resources at the network edge near the data sources has emerged as a desirable solution. Therefore, edge intelligence, aiming to facilitate the deployment of DL services by edge computing, has received significant attention. In addition, DL, as the representative technique of artificial intelligence, can be integrated into edge computing frameworks to build an intelligent edge for dynamic, adaptive edge maintenance and management. With regard to the mutually beneficial edge intelligence and intelligent edge, this paper introduces and discusses: 1) the application scenarios of both; 2) the practical implementation methods and enabling technologies, namely DL training and inference in the customized edge computing framework; 3) challenges and future trends of more pervasive and fine-grained intelligence. We believe that by consolidating information scattered across the communication, networking, and DL areas, this survey can help readers to understand the connections between enabling technologies while promoting further discussions on the fusion of edge intelligence and intelligent edge, i.e., Edge DL.

611 citations


Journal Article
TL;DR: A comprehensive survey of ML methods and recent advances in DL methods that can be used to develop enhanced security methods for IoT systems and presents the opportunities, advantages and shortcomings of each method.
Abstract: The Internet of Things (IoT) integrates billions of smart devices that can communicate with one another with minimal human intervention. IoT is one of the fastest developing fields in the history of computing, with an estimated 50 billion devices by the end of 2020. However, the crosscutting nature of IoT systems and the multidisciplinary components involved in the deployment of such systems have introduced new security challenges. Implementing security measures, such as encryption, authentication, access control, and network and application security, is ineffective for IoT devices given their inherent vulnerabilities. Therefore, existing security methods should be enhanced to effectively secure the IoT ecosystem. Machine learning and deep learning (ML/DL) have advanced considerably over the last few years, and machine intelligence has transitioned from laboratory novelty to practical machinery in several important applications. Consequently, ML/DL methods are important in transforming the security of IoT systems from merely facilitating secure communication between devices to security-based intelligence systems. The goal of this work is to provide a comprehensive survey of ML methods and recent advances in DL methods that can be used to develop enhanced security methods for IoT systems. IoT security threats that are related to inherent or newly introduced threats are presented, and various potential IoT system attack surfaces and the possible threats related to each surface are discussed. We then thoroughly review ML/DL methods for IoT security and present the opportunities, advantages, and shortcomings of each method. We discuss the opportunities and challenges involved in applying ML/DL to IoT security. These opportunities and challenges can serve as potential future research directions.
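
As a flavor of the unsupervised ML methods such surveys cover, the sketch below trains an isolation forest on benign traffic statistics and flags deviating flows. The per-flow features, their distributions, and the contamination level are all hypothetical, chosen only to illustrate the workflow.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)
# Hypothetical per-flow features: [packets/s, mean payload bytes, distinct dst ports]
normal = rng.normal(loc=[20, 300, 2],   scale=[5, 50, 1],   size=(500, 3))
attack = rng.normal(loc=[900, 60, 40],  scale=[100, 20, 5], size=(10, 3))  # e.g., scan/flood

# Fit only on (presumed) benign traffic; anything isolated easily is suspect.
model = IsolationForest(contamination=0.02, random_state=0).fit(normal)
print(model.predict(attack))  # -1 = flagged anomalous, +1 = considered normal
```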

543 citations


Journal Article
TL;DR: The Internet of Nano Things and the Tactile Internet are driving innovation in H-IoT applications, and the future course for improving the Quality of Service (QoS) using these new technologies is identified.
Abstract: The impact of the Internet of Things (IoT) on the advancement of the healthcare industry is immense. The ushering in of Medicine 4.0 has resulted in an increased effort to develop platforms, both at the hardware level and at the underlying software level. This vision has led to the development of Healthcare IoT (H-IoT) systems. The basic enabling technologies include the communication systems between the sensing nodes and the processors, and the processing algorithms for generating an output from the data collected by the sensors. However, at present, these enabling technologies are also supported by several new technologies. The use of Artificial Intelligence (AI) has transformed H-IoT systems at almost every level. The fog/edge paradigm brings computing power close to the deployed network and hence mitigates many challenges in the process, while big data techniques allow the handling of enormous amounts of data. Additionally, Software Defined Networks (SDNs) bring flexibility to the system, while blockchains are finding novel use cases in H-IoT systems. The Internet of Nano Things (IoNT) and the Tactile Internet (TI) are driving innovation in H-IoT applications. This paper delves into the ways these technologies are transforming H-IoT systems and also identifies the future course for improving Quality of Service (QoS) using these new technologies.

446 citations


Journal Article
TL;DR: The purpose of this paper is to identify and discuss the main issues involved in the complex process of IoT-based investigations, particularly all legal, privacy and cloud security challenges, as well as some promising cross-cutting data reduction and forensics intelligence techniques.
Abstract: Today is the era of the Internet of Things (IoT). The recent advances in hardware and information technology have accelerated the deployment of billions of interconnected, smart and adaptive devices in critical infrastructures like health, transportation, environmental control, and home automation. Transferring data over a network without requiring any kind of human-to-computer or human-to-human interaction brings reliability and convenience to consumers, but also opens a new world of opportunity for intruders, and introduces a whole set of unique and complicated questions to the field of Digital Forensics. Although IoT data could be a rich source of evidence, forensics professionals cope with diverse problems, starting from the huge variety of IoT devices and non-standard formats, to the multi-tenant cloud infrastructure and the resulting multi-jurisdictional litigations. A further challenge is end-to-end encryption, which represents a trade-off between users’ right to privacy and the success of the forensics investigation. Due to its volatile nature, digital evidence has to be acquired and analyzed using validated tools and techniques that ensure the maintenance of the Chain of Custody. Therefore, the purpose of this paper is to identify and discuss the main issues involved in the complex process of IoT-based investigations, particularly all legal, privacy and cloud security challenges. Furthermore, this work provides an overview of the past and current theoretical models in the digital forensics science. Special attention is paid to frameworks that aim to extract data in a privacy-preserving manner or secure the evidence integrity using decentralized blockchain-based solutions. In addition, the present paper addresses the ongoing Forensics-as-a-Service (FaaS) paradigm, as well as some promising cross-cutting data reduction and forensics intelligence techniques. Finally, several other research trends and open issues are presented, with emphasis on the need for proactive Forensics Readiness strategies and generally agreed-upon standards.
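
The evidence-integrity idea behind the blockchain-based frameworks mentioned above can be sketched as a simple hash chain over custody records: each record commits to its predecessor, so any later tampering breaks every subsequent link. This is a toy illustration, not any specific framework from the survey; the evidence labels are placeholders.

```python
import hashlib, json, time

def add_record(chain, evidence, handler):
    """Append a custody record whose hash covers the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"evidence": evidence, "handler": handler,
              "timestamp": time.time(), "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)

def verify(chain):
    """Recompute every hash and check every back-link."""
    for i, rec in enumerate(chain):
        body = {k: v for k, v in rec.items() if k != "hash"}
        if rec["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        if i and rec["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_record(chain, "sha256-of-device-image (placeholder)", "examiner-A")
add_record(chain, "sha256-of-router-logs (placeholder)", "examiner-B")
print(verify(chain))  # True; altering any field makes this False
```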

440 citations


Journal Article
TL;DR: In this article, the authors review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning and investigate their employment in the compelling applications of wireless networks, including heterogeneous networks, cognitive radios (CR), Internet of Things (IoT), machine to machine networks (M2M), and so on.
Abstract: Future wireless networks have a substantial potential in terms of supporting a broad range of complex compelling applications both in military and civilian fields, where the users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in the compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), Internet of Things (IoT), machine to machine networks (M2M), and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks.

413 citations


Journal Article
TL;DR: This paper systematically reviews the security requirements, attack vectors, and the current security solutions for IoT networks, and sheds light on the gaps in these security solutions that call for ML and DL approaches.
Abstract: The future Internet of Things (IoT) will have a deep economic, commercial and social impact on our lives. The participating nodes in IoT networks are usually resource-constrained, which makes them attractive targets for cyber attacks. In this regard, extensive efforts have been made to address the security and privacy issues in IoT networks, primarily through traditional cryptographic approaches. However, the unique characteristics of IoT nodes render the existing solutions insufficient to encompass the entire security spectrum of IoT networks. Machine Learning (ML) and Deep Learning (DL) techniques, which are able to provide embedded intelligence in IoT devices and networks, can be leveraged to cope with different security problems. In this paper, we systematically review the security requirements, attack vectors, and the current security solutions for IoT networks. We then shed light on the gaps in these security solutions that call for ML and DL approaches. Finally, we discuss in detail the existing ML and DL solutions for addressing different security problems in IoT networks. We also discuss several future research directions for ML- and DL-based IoT security.

407 citations


Journal Article
TL;DR: The most significant use cases expected for 5G, including their corresponding scenarios and traffic models, are presented, and guidelines are provided to help and ease the performance evaluation of current and future 5G innovations, as well as the dimensioning of future 5G deployments.
Abstract: The fifth-generation mobile initiative, 5G, is a tremendous and collective effort to specify, standardize, design, manufacture, and deploy the next cellular network generation. 5G networks will support demanding services such as enhanced Mobile Broadband, Ultra-Reliable and Low Latency Communications and massive Machine-Type Communications, which will require data rates of tens of Gbps, latencies of a few milliseconds and connection densities of millions of devices per square kilometer. This survey presents the most significant use cases expected for 5G, including their corresponding scenarios and traffic models. First, the paper analyzes the characteristics and requirements for 5G communications, considering aspects such as traffic volume, network deployments, and main performance targets. Second, emphasizing the definition of performance evaluation criteria for 5G technologies, the paper reviews related proposals from principal standards development organizations and industry alliances. Finally, well-defined and significant 5G use cases are provided. As a result, these guidelines will help and ease the performance evaluation of current and future 5G innovations, as well as the dimensioning of future 5G deployments.

399 citations


Journal Article
TL;DR: A comprehensive review and analysis on the state-of-the-art blockchain consensus protocols is presented in this article, where the authors identify five core components of a blockchain consensus protocol, namely, block proposal, block validation, information propagation, block finalization, and incentive mechanism.
Abstract: Since the inception of Bitcoin, cryptocurrencies and the underlying blockchain technology have attracted increasing interest from both academia and industry. Among various core components, the consensus protocol is the defining technology behind the security and performance of blockchain. From incremental modifications of the Nakamoto consensus protocol to innovative alternative consensus mechanisms, many consensus protocols have been proposed to improve the performance of the blockchain network itself or to accommodate other specific application needs. In this survey, we present a comprehensive review and analysis of the state-of-the-art blockchain consensus protocols. To facilitate the discussion of our analysis, we first introduce the key definitions and relevant results in the classic theory of fault tolerance, which help to lay the foundation for further discussion. We identify five core components of a blockchain consensus protocol, namely, block proposal, block validation, information propagation, block finalization, and incentive mechanism. A wide spectrum of blockchain consensus protocols are then carefully reviewed, accompanied by algorithmic abstractions and vulnerability analyses. The surveyed consensus protocols are analyzed using the five-component framework and compared with respect to different performance metrics. These analyses and comparisons provide new insights into the fundamental differences of the various proposals in terms of their suitable application scenarios, key assumptions, expected fault tolerance, scalability, drawbacks and trade-offs. We believe this survey will provide blockchain developers and researchers with a comprehensive view of the state-of-the-art consensus protocols and facilitate the process of designing future protocols.
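
As a concrete instance of the block proposal and block validation components, here is a minimal Nakamoto-style proof-of-work loop. The difficulty, payload encoding, and hash construction are toy assumptions: proposal is expensive nonce grinding, while validation is a single cheap hash check.

```python
import hashlib

DIFFICULTY = 16  # require 16 leading zero bits; real networks adapt this over time

def block_hash(prev_hash, payload, nonce):
    return hashlib.sha256(f"{prev_hash}|{payload}|{nonce}".encode()).hexdigest()

def propose(prev_hash, payload):
    """Block proposal: grind nonces until the hash meets the difficulty target."""
    nonce = 0
    while True:
        h = block_hash(prev_hash, payload, nonce)
        if int(h, 16) >> (256 - DIFFICULTY) == 0:  # top DIFFICULTY bits are zero
            return nonce, h
        nonce += 1

def validate(prev_hash, payload, nonce, h):
    """Block validation: one hash recomputation plus the target check."""
    return (block_hash(prev_hash, payload, nonce) == h
            and int(h, 16) >> (256 - DIFFICULTY) == 0)

nonce, h = propose("0" * 64, "tx-batch-1")
print(validate("0" * 64, "tx-batch-1", nonce, h))  # True
```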

381 citations


Journal Article
TL;DR: This paper surveys the application and implementation of differential privacy in four major applications of CPSs, namely energy systems, transportation systems, healthcare and medical systems, and the industrial Internet of things (IIoT).
Abstract: Modern cyber physical systems (CPSs) are widely used in our daily lives because of the development of information and communication technologies (ICT). With the provision of CPSs, the security and privacy threats associated with these systems are also increasing. Passive attacks are used by intruders to gain access to the private information of CPSs. In order to make CPS data more secure, certain privacy preservation strategies such as encryption and k-anonymity have been presented in the past. However, with the advances in CPS architecture, these techniques also need certain modifications. Meanwhile, differential privacy has emerged as an efficient technique to protect CPS data privacy. In this paper, we present a comprehensive survey of differential privacy techniques for CPSs. In particular, we survey the application and implementation of differential privacy in four major applications of CPSs, namely energy systems, transportation systems, healthcare and medical systems, and the industrial Internet of things (IIoT). Furthermore, we present open issues, challenges, and future research directions for differential privacy techniques for CPSs. This survey can serve as a basis for the development of modern differential privacy techniques to address various problems and data privacy scenarios of CPSs.
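
The canonical primitive behind most of the surveyed techniques is the Laplace mechanism: perturb a query answer with noise scaled to sensitivity/ε. A minimal sketch follows, with a made-up smart-meter counting query; smaller ε means stronger privacy and noisier answers.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value + Lap(sensitivity / epsilon) noise.
    For a counting query (sensitivity 1) this satisfies epsilon-DP."""
    rng = rng or np.random.default_rng()
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Hypothetical aggregate: number of homes exceeding 5 kWh this hour.
true_count = 132
for eps in (0.1, 1.0, 10.0):
    noisy = laplace_mechanism(true_count, sensitivity=1.0, epsilon=eps)
    print(f"epsilon={eps}: released {noisy:.1f}")
```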

Journal Article
TL;DR: Some typical application scenarios of edge computing in IIoT, such as prognostics and health management, smart grids, manufacturing coordination, intelligent connected vehicles (ICV), and smart logistics, are introduced.
Abstract: The Industrial Internet of Things (IIoT) is a crucial research field spawned by the Internet of Things (IoT). IIoT links all types of industrial equipment through the network; establishes data acquisition, exchange, and analysis systems; and optimizes processes and services, so as to reduce cost and enhance productivity. The introduction of edge computing in IIoT can significantly reduce the decision-making latency, save bandwidth resources, and to some extent, protect privacy. This paper outlines the research progress concerning edge computing in IIoT. First, the concepts of IIoT and edge computing are discussed, and subsequently, the research progress of edge computing is discussed and summarized in detail. Next, the future architecture from the perspective of edge computing in IIoT is proposed, and its technical progress in routing, task scheduling, data storage and analytics, security, and standardization is analyzed. Furthermore, we discuss the opportunities and challenges of edge computing in IIoT in terms of 5G-based edge communication, load balancing and data offloading, edge intelligence, as well as data sharing security. Finally, we introduce some typical application scenarios of edge computing in IIoT, such as prognostics and health management (PHM), smart grids, manufacturing coordination, intelligent connected vehicles (ICV), and smart logistics.

Journal Article
TL;DR: Comprehensive detail is presented on the core and enabling technologies used to build the 5G security model: network softwarization security, PHY (physical) layer security, and 5G privacy concerns, among others.
Abstract: Security has become the primary concern in many telecommunications industries today, as risks can have high consequences. Especially as the core and enabling technologies become associated with the 5G network, confidential information will move across all layers of future wireless systems. Several incidents have revealed that the hazards encountered by an infected wireless network not only affect security and privacy, but also impede the complex dynamics of the communications ecosystem. Consequently, the complexity and strength of security attacks have increased in the recent past, making the detection or prevention of sabotage a global challenge. From the security and privacy perspectives, this paper presents comprehensive detail on the core and enabling technologies used to build the 5G security model: network softwarization security, PHY (physical) layer security, and 5G privacy concerns, among others. Additionally, the paper includes a discussion on the security monitoring and management of 5G networks. This paper also evaluates the related security measures and standards of core 5G technologies by resorting to different standardization bodies and provides a brief overview of 5G standardization security forces. Furthermore, key projects of international significance, in line with the security concerns of 5G and beyond, are also presented. Finally, a future directions and open challenges section has been included to encourage future research.

Journal Article
TL;DR: In this paper, the authors present a framework for the performance analysis of transmission scheduling with the QoS support along with the issues involved in short data packet transmission in the mMTC scenario and provide a detailed overview of the existing and emerging solutions toward addressing RAN congestion problem.
Abstract: The ever-increasing number of resource-constrained machine-type communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced mobile broadband (eMBB), massive machine type communications (mMTCs), and ultra-reliable and low latency communications (URLLCs), the mMTC brings the unique technical challenge of supporting a huge number of MTC devices in cellular networks, which is the main focus of this paper. The related challenges include quality of service (QoS) provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead, and radio access network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with the highlights on the inefficiency of the legacy random access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely, LTE-M and narrowband IoT (NB-IoT). Subsequently, we present a framework for the performance analysis of transmission scheduling with the QoS support along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions toward addressing RAN congestion problem, and then identify potential advantages, challenges, and use cases for the applications of emerging machine learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of low-complexity Q-learning approach in the mMTC scenario along with the recent advances toward enhancing its learning performance and convergence. Finally, we discuss some open research challenges and promising future research directions.
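
To illustrate the low-complexity Q-learning approach in a random-access setting, the sketch below uses stateless (bandit-style) Q-learning in which each device learns to stick to an RA slot where it avoids collisions. The rewards, device and slot counts, and exploration schedule are illustrative assumptions rather than the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(3)
N_DEV, N_SLOTS, ALPHA = 20, 20, 0.1
Q = np.zeros((N_DEV, N_SLOTS))  # one Q-value per (device, RA slot)

for frame in range(2000):
    eps = max(0.01, 1.0 - frame / 1000)           # decaying exploration
    greedy = Q.argmax(axis=1)
    explore = rng.integers(0, N_SLOTS, N_DEV)
    choices = np.where(rng.random(N_DEV) < eps, explore, greedy)
    # Reward +1 if a device is alone in its slot, -1 on collision.
    counts = np.bincount(choices, minlength=N_SLOTS)
    rewards = np.where(counts[choices] == 1, 1.0, -1.0)
    idx = np.arange(N_DEV)
    Q[idx, choices] += ALPHA * (rewards - Q[idx, choices])  # stateless update

final = Q.argmax(axis=1)
occupancy = np.bincount(final, minlength=N_SLOTS)
print("collision-free devices:", int((occupancy[final] == 1).sum()))
```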

Journal Article
TL;DR: In this article, a detailed review of security attacks towards WSNs and the IoT, along with techniques for the prevention, detection, and mitigation of those attacks, is provided.
Abstract: Wireless Sensor Networks (WSNs) constitute one of the most promising third-millennium technologies and have a wide range of applications in our surrounding environment. The reason behind the vast adoption of WSNs in various applications is that they have tremendously appealing features, e.g., low production cost, low installation cost, unattended network operation, and autonomous and long-time operation. WSNs have started to merge with the Internet of Things (IoT) through the introduction of Internet access capability in sensor nodes and sensing ability in Internet-connected devices. Thereby, the IoT is providing access to huge amounts of data, collected by WSNs, over the Internet. Hence, the security of the IoT should start with securing WSNs first, ahead of the other components. However, owing to the absence of a physical line of defense, i.e., no dedicated infrastructure such as gateways to watch and observe the information flowing in the network, the security of WSNs along with the IoT is of great concern to the scientific community. More specifically, for the application areas in which CIA (confidentiality, integrity, availability) is of prime importance, WSNs and the emerging IoT technology might constitute an open avenue for attackers. Besides, the recent integration and collaboration of WSNs with the IoT will open new challenges and problems in terms of security, which would be a nightmare for the individuals using these systems as well as for the security administrators managing those networks. Therefore, a detailed review of security attacks towards WSNs and the IoT, along with techniques for the prevention, detection, and mitigation of those attacks, is provided in this paper. Attacks are categorized into two main classes, “Passive Attacks” and “Active Attacks”, and most or all types of attacks towards WSNs and the IoT are investigated under that umbrella. Understanding these attacks and their associated defense mechanisms will help pave a secure path towards the proliferation and public acceptance of IoT technology.

Journal Article
TL;DR: Various grant-free NOMA schemes are presented, their potential and related practical challenges are highlighted, and possible future directions are thoroughly discussed at the end.
Abstract: Massive machine-type communications (mMTC) is one of the main three focus areas in the 5th generation (5G) of wireless communications technologies to enable connectivity of a massive number of Internet of things (IoT) devices with little or no human intervention. In conventional human-type communications (HTC), due to the limited number of available channel resources and orthogonal resource allocation techniques, users get a transmission slot by making scheduling/connection requests. The involved control channel signaling, negligible with respect to the huge transmit data, is not a major issue. However, this may turn into a potential performance bottleneck in mMTC, where a huge number of devices transmit short packet data in a sporadic way. To tackle the limited radio resources and massive connectivity challenges, non-orthogonal multiple access (NOMA) has emerged as a promising technology that allows multiple users to simultaneously transmit their data over the same channel resource. This is achieved by employing user-specific signature sequences at the transmitting devices, which are exploited by the receiver for multi-user data detection. Due to its massive connectivity potential, NOMA has also been considered to enable grant-free transmissions, especially in mMTC, where devices can transmit their data whenever they need without scheduling requests. The existing surveys mainly discuss different NOMA schemes, and explore their potential, in typical grant-based HTC scenarios, where users are connected with the base station, and various system parameters are pre-defined in the scheduling phase. Different from these works, this survey provides a comprehensive review of the recent advances in NOMA from a grant-free connectivity perspective. Various grant-free NOMA schemes are presented, their potential and related practical challenges are highlighted, and possible future directions are thoroughly discussed at the end.

Journal Article
TL;DR: This paper surveys the different rate optimization scenarios studied in the literature when PD-NOMA is combined with one or more of the candidate schemes and technologies for B5G networks, including multiple-input-single-output (MISO), multiple-input-multiple-output (MIMO), massive MIMO (mMIMO), advanced antenna architectures, and higher frequency millimeter-wave (mmWave) and terahertz (THz) communications.
Abstract: The ambitious high data-rate applications in the envisioned future beyond fifth-generation (B5G) wireless networks require new solutions, including the advent of more advanced architectures than the ones already used in 5G networks, and the combination of different communications schemes and technologies to meet the requirements of these applications. Among the candidate communications schemes for future wireless networks are non-orthogonal multiple access (NOMA) schemes that allow serving more than one user in the same resource block by multiplexing users in domains other than frequency or time. In this way, NOMA schemes tend to offer several advantages over orthogonal multiple access (OMA) schemes, such as improved user fairness and spectral efficiency, higher cell-edge throughput, massive connectivity support, and low transmission latency. With these merits, NOMA-enabled transmission schemes are being increasingly looked at as promising multiple access schemes for future wireless networks. When the power domain is used to multiplex the users, it is referred to as power domain NOMA (PD-NOMA). In this paper, we survey the integration of PD-NOMA with the enabling communications schemes and technologies that are expected to meet the various requirements of B5G networks. In particular, this paper surveys the different rate optimization scenarios studied in the literature when PD-NOMA is combined with one or more of the candidate schemes and technologies for B5G networks, including multiple-input-single-output (MISO), multiple-input-multiple-output (MIMO), massive MIMO (mMIMO), advanced antenna architectures, higher frequency millimeter-wave (mmWave) and terahertz (THz) communications, advanced coordinated multi-point (CoMP) transmission and reception schemes, cooperative communications, cognitive radio (CR), visible light communications (VLC), unmanned aerial vehicle (UAV) assisted communications and others. The considered system models, the optimization methods utilized to maximize the achievable rates, and the main lessons learnt on the optimization and the performance of these NOMA-enabled schemes and technologies are discussed in detail, along with the future research directions for these combined schemes. Moreover, the role of machine learning in optimizing these NOMA-enabled technologies is addressed.
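
The core PD-NOMA mechanism the survey builds on, superposition coding plus successive interference cancellation (SIC), can be illustrated with standard two-user downlink rate expressions; the channel gains and power split below are assumed values, not drawn from the paper.

```python
import numpy as np

# Two-user downlink PD-NOMA: superpose signals with power fraction `a` for the
# weak (cell-edge) user and `1 - a` for the strong (cell-center) user.
g_strong, g_weak = 10.0, 0.5  # normalized channel gains |h|^2 / noise (assumed)
a = 0.8

# Weak user decodes directly, treating the strong user's signal as interference.
r_weak = np.log2(1 + a * g_weak / ((1 - a) * g_weak + 1))
# Strong user first cancels the weak user's signal via SIC, then decodes its own.
r_strong = np.log2(1 + (1 - a) * g_strong)

# OMA baseline: each user gets half of the orthogonal resources.
r_weak_oma = 0.5 * np.log2(1 + g_weak)
r_strong_oma = 0.5 * np.log2(1 + g_strong)
print(f"NOMA: weak={r_weak:.2f}, strong={r_strong:.2f} bit/s/Hz")
print(f"OMA : weak={r_weak_oma:.2f}, strong={r_strong_oma:.2f} bit/s/Hz")
```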

Journal Article
TL;DR: This article provides a comprehensive survey on LoRa networks, including the technical challenges of deploying LoRa networks and recent solutions, and discusses some open issues of LoRa networking.
Abstract: Wireless networks have been widely deployed for many Internet-of-Things (IoT) applications, like smart cities and precision agriculture. Low Power Wide Area Networking (LPWAN) is an emerging IoT networking paradigm to meet three key requirements of IoT applications, i.e., low cost, large scale deployment and high energy efficiency. Among all available LPWAN technologies, LoRa networking has attracted much attention from both academia and industry, since it specifies an open standard and allows us to build autonomous LPWAN networks without any third-party infrastructure. Many LoRa networks have been developed recently, e.g., managing solar plants in Carson City, Nevada, USA and power monitoring in Lyon and Grenoble, France. However, there are still many research challenges to develop practical LoRa networks, e.g., link coordination, resource allocation, reliable transmissions and security. This article provides a comprehensive survey on LoRa networks, including the technical challenges of deploying LoRa networks and recent solutions. Based on our detailed analysis of current solutions, some open issues of LoRa networking are discussed. The goal of this survey paper is to inspire more works on improving the performance of LoRa networks and enabling more practical deployments.
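
One concrete design constraint in LoRa deployments is frame time-on-air, which drives duty-cycle and energy budgets. The sketch below follows the airtime formula published in the Semtech SX127x datasheet; treat it as a sketch under stated defaults and check the radio's datasheet for edge cases such as low-data-rate optimization at high spreading factors.

```python
from math import ceil

def lora_airtime(pl_bytes, sf=7, bw=125e3, cr=1, preamble=8,
                 explicit_header=True, crc=True, low_dr_opt=False):
    """Time-on-air (seconds) of one LoRa frame per the SX127x datasheet formula."""
    t_sym = (2 ** sf) / bw                      # symbol duration
    de, ih = int(low_dr_opt), int(not explicit_header)
    num = 8 * pl_bytes - 4 * sf + 28 + 16 * int(crc) - 20 * ih
    n_payload = 8 + max(ceil(num / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble + 4.25) * t_sym + n_payload * t_sym

for sf in (7, 10, 12):
    print(f"SF{sf}, 20-byte payload: {lora_airtime(20, sf=sf) * 1e3:.1f} ms")
# Higher spreading factors trade airtime (energy, duty cycle) for link range.
```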

Journal Article
TL;DR: Inspired by the extensive research results in NDN-based VANET, this paper provides a detailed and systematic review of NDN-driven VANET and discusses the feasibility of the NDN architecture in the VANET environment.
Abstract: Information-centric networking (ICN) has been proposed as one of the future Internet architectures. It is poised to address the challenges faced by today’s Internet that include, but are not limited to, scalability, addressing, security, and privacy. Furthermore, it also aims at meeting the requirements for new emerging Internet applications. To realize ICN, named data networking (NDN) is one of the recent implementations of ICN that provides a suitable communication approach due to its clean slate design and simple communication model. There are a plethora of applications realized through ICN in different domains where data is the focal point of communication. One such domain is the intelligent transportation system realized through the vehicular ad hoc network (VANET), where vehicles exchange information and content with each other and with the infrastructure. To date, excellent research results have been achieved in the VANET domain, aiming at a safe, reliable, and infotainment-rich driving experience. However, due to the dynamic topologies, host-centric model, and ephemeral nature of vehicular communication, various challenges are faced by VANET that hinder the realization of successful vehicular networks and adversely affect data dissemination, content delivery, and user experiences. To fill these gaps, NDN has been extensively used as an underlying communication paradigm for VANET. Inspired by the extensive research results in NDN-based VANET, in this paper, we provide a detailed and systematic review of NDN-driven VANET. More precisely, we investigate the role of NDN in VANET and discuss the feasibility of the NDN architecture in the VANET environment. Subsequently, we cover in detail NDN-based naming, routing and forwarding, caching, mobility, and security mechanisms for VANET. Furthermore, we discuss the existing standards, solutions, and simulation tools used in NDN-based VANET. Finally, we also identify open challenges and issues faced by NDN-driven VANET and highlight future research directions that should be addressed by the research community.

Journal Article
TL;DR: This survey addresses the issue of how blockchain technology fits into currently deployed cloud solutions and enables the reengineering of the cloud datacenter, and investigates recent efforts in the technical fusion of blockchain and clouds.
Abstract: Blockchain technology has been deemed an ideal choice for strengthening existing computing systems in various ways. As one of the network-enabled technologies, cloud computing has been broadly adopted in the industry through numerous cloud service models. Fusing blockchain technology with existing cloud systems has great potential in both functionality/performance enhancement and security/privacy improvement. The question remains how blockchain technology can be inserted into currently deployed cloud solutions and enable the reengineering of the cloud datacenter. This survey addresses this issue and investigates recent efforts in the technical fusion of blockchain and clouds. Three technical dimensions are covered in this work. First, we consider the service model and review an emerging cloud-relevant blockchain service model, Blockchain-as-a-Service (BaaS); second, security is considered a key technical dimension in this work, and both access control and searchable encryption schemes are assessed; finally, we examine the performance of the cloud datacenter with the support and participation of blockchain, from hardware and software perspectives. The main findings of this survey will serve as theoretical support for future work on the blockchain-enabled reengineering of cloud datacenters.

Journal Article
TL;DR: An extensive survey on SDN and the edge computing ecosystem to solve the challenge of complex IoT management, comprehensively presenting security and privacy vulnerabilities in SDIoT-Edge computing and detailed taxonomies of multiple attack possibilities in this paradigm.
Abstract: Millions of sensors continuously produce and transmit data to control real-world infrastructures using complex networks in the Internet of Things (IoT). However, IoT devices are limited in computational power, including storage, processing, and communication resources, to effectively perform compute-intensive tasks locally. Edge computing resolves the resource limitation problems by bringing computation closer to the edge of IoT devices. Providing distributed edge nodes across the network reduces the stress of centralized computation and overcomes latency challenges in the IoT. Therefore, edge computing presents low-cost solutions for compute-intensive tasks. Software-Defined Networking (SDN) enables effective network management by presenting a global perspective of the network. While SDN was not explicitly developed for IoT challenges, it can, however, provide impetus to solve the complexity issues and help in efficient IoT service orchestration. The current IoT paradigm of massive data generation, complex infrastructures, security vulnerabilities, and requirements from the newly developed technologies make IoT realization a challenging issue. In this research, we provide an extensive survey on SDN and the edge computing ecosystem to solve the challenge of complex IoT management. We present the latest research on Software-Defined Internet of Things orchestration using Edge (SDIoT-Edge) and highlight key requirements and standardization efforts in integrating these diverse architectures. An extensive discussion on different case studies using SDIoT-Edge computing is presented to envision the underlying concept. Furthermore, we classify state-of-the-art research in the SDIoT-Edge ecosystem based on multiple performance parameters. We comprehensively present security and privacy vulnerabilities in the SDIoT-Edge computing and provide detailed taxonomies of multiple attack possibilities in this paradigm. We highlight the lessons learned based on our findings at the end of each section. Finally, we discuss critical insights toward current research issues, challenges, and further research directions to efficiently provide IoT services in the SDIoT-Edge paradigm.

Journal Article
TL;DR: A comprehensive survey on the use of ML in MEC systems is provided, offering insight into the current progress of this research area; helpful guidance is supplied by pointing out which MEC challenges can be solved by ML solutions, which algorithms are currently trending in frontier ML research, and how they could be used in MEC.
Abstract: Mobile Edge Computing (MEC) is considered an essential future service for the implementation of 5G networks and the Internet of Things, as it is the best method of delivering computation and communication resources to mobile devices. It is based on the connection of the users to servers located on the edge of the network, which is especially relevant for real-time applications that demand minimal latency. In order to guarantee a resource-efficient MEC (which, for example, could mean improved Quality of Service for users or lower costs for service providers), it is important to consider certain aspects of the service model, such as where to offload the tasks generated by the devices, how many resources to allocate to each user (especially in the wired or wireless device-server communication) and how to handle inter-server communication. However, in the MEC scenarios with many and varied users, servers and applications, these problems are characterized by parameters with exceedingly high levels of dimensionality, resulting in too much data to be processed and complicating the task of finding efficient configurations. This will be particularly troublesome when 5G networks and Internet of Things roll out, with their massive amounts of devices. To address this concern, the best solution is to utilize Machine Learning (ML) algorithms, which enable the computer to draw conclusions and make predictions based on existing data without human supervision, leading to quick near-optimal solutions even in problems with high dimensionality. Indeed, in scenarios with too much data and too many parameters, ML algorithms are often the only feasible alternative. In this paper, a comprehensive survey on the use of ML in MEC systems is provided, offering an insight into the current progress of this research area. Furthermore, helpful guidance is supplied by pointing out which MEC challenges can be solved by ML solutions, which algorithms are currently trending in frontier ML research, and how they could be used in MEC. These pieces of information should prove fundamental in encouraging future research that combines ML and MEC.
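
The basic offloading trade-off the abstract alludes to can be captured by comparing local execution time against transmission-plus-edge execution time. All task and network parameters below are hypothetical, chosen only to show the comparison.

```python
def local_latency(cycles, f_local):
    """Time to run the task on the device's own CPU."""
    return cycles / f_local

def offload_latency(bits, rate, cycles, f_edge):
    """Uplink transfer plus edge execution; result download ignored for brevity."""
    return bits / rate + cycles / f_edge

# Hypothetical task: 1 Mb of input data, 10^9 CPU cycles of work.
task_bits, task_cycles = 1e6, 1e9
t_loc = local_latency(task_cycles, f_local=1e9)              # 1 GHz device CPU
t_off = offload_latency(task_bits, rate=20e6,                # 20 Mbps uplink
                        cycles=task_cycles, f_edge=10e9)     # 10 GHz edge server
print(f"local: {t_loc:.2f} s, offload: {t_off:.2f} s ->",
      "offload" if t_off < t_loc else "compute locally")
```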

Journal Article
TL;DR: The existing wireless sensing systems are surveyed in terms of their basic principles, techniques and system structures to describe how the wireless signals could be utilized to facilitate an array of applications including intrusion detection, room occupancy monitoring, daily activity recognition, gesture recognition, vital signs monitoring, user identification and indoor localization.
Abstract: With the advancement of wireless technologies and sensing methodologies, many studies have shown the success of re-using wireless signals (e.g., WiFi) to sense human activities and thereby realize a set of emerging applications, ranging from intrusion detection, daily activity recognition, gesture recognition to vital signs monitoring and user identification involving even finer-grained motion sensing. These applications arguably can brace various domains for smart home and office environments, including safety protection, well-being monitoring/management, smart healthcare and smart-appliance interaction. The movements of the human body impact the wireless signal propagation (e.g., reflection, diffraction and scattering), which provide great opportunities to capture human motions by analyzing the received wireless signals. Researchers take advantage of the existing wireless links among mobile/smart devices (e.g., laptops, smartphones, smart thermostats, smart refrigerators and virtual assistance systems) by either extracting the ready-to-use signal measurements or adopting frequency modulated signals to detect the frequency shift. Due to the low-cost and non-intrusive sensing nature, wireless-based human activity sensing has drawn considerable attention and become a prominent research field over the past decade. In this paper, we survey the existing wireless sensing systems in terms of their basic principles, techniques and system structures. Particularly, we describe how the wireless signals could be utilized to facilitate an array of applications including intrusion detection, room occupancy monitoring, daily activity recognition, gesture recognition, vital signs monitoring, user identification and indoor localization. The future research directions and limitations of using wireless signals for human activity sensing are also discussed.
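
A toy example of the principle: body motion modulates the received signal at low, Doppler-scale frequencies, which a spectral analysis of the measurement stream can recover. The synthetic "CSI amplitude" stream, sampling rate, and 1.3 Hz motion component below are assumptions for illustration only.

```python
import numpy as np

fs, dur = 500.0, 4.0                      # measurement rate (Hz) and duration (s)
t = np.arange(0, dur, 1 / fs)
doppler = 1.3                             # Hz shift a moving body might induce (assumed)

# Synthetic stream: static path + body-reflected path + measurement noise.
rng = np.random.default_rng(4)
csi = 1.0 + 0.3 * np.cos(2 * np.pi * doppler * t) + 0.05 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(csi - csi.mean()))   # remove DC, take spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(f"dominant motion frequency: {freqs[spectrum.argmax()]:.2f} Hz")  # ~1.3 Hz
```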

Journal Article
TL;DR: The definition of constructive interference (CI) is presented and the corresponding mathematical characterization is formulated for popular modulation types, based on which optimization-based precoding techniques are discussed.
Abstract: Interference is traditionally viewed as a performance limiting factor in wireless communication systems, which is to be minimized or mitigated. Nevertheless, a recent line of work has shown that by manipulating the interfering signals such that they add up constructively at the receiver side, known interference can be made beneficial and further improve the system performance in a variety of wireless scenarios, achieved by symbol-level precoding (SLP). This paper aims to provide a tutorial on interference exploitation techniques from the perspective of precoding design in a multi-antenna wireless communication system, by beginning with the classification of constructive interference (CI) and destructive interference (DI). The definition for CI is presented and the corresponding mathematical characterization is formulated for popular modulation types, based on which optimization-based precoding techniques are discussed. In addition, the extension of CI precoding to other application scenarios as well as for hardware efficiency is also described. Proof-of-concept testbeds are demonstrated for the potential practical implementation of CI precoding, and finally a list of open problems and practical challenges are presented to inspire and motivate further research directions in this area.
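
One common mathematical characterization of CI for M-PSK in the SLP literature (notation and exact form vary across papers, so take this as a representative formulation rather than this survey's definition) rotates the noiseless received signal by the intended symbol's phase and constrains it to the constructive sector around that symbol:

```latex
\tilde{y}_k = \mathbf{h}_k^{T}\mathbf{x}\,e^{-j\angle s_k},
\qquad
\left|\Im\{\tilde{y}_k\}\right| \;\le\;
\left(\Re\{\tilde{y}_k\} - \sqrt{\Gamma_k}\,\sigma\right)\tan\theta,
\qquad \theta = \pi/M,
```

where Γ_k is the SNR target and σ the noise standard deviation. Instead of suppressing interference, the precoder pushes each user's noiseless received point deeper into this sector, away from the decision thresholds.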

Journal Article
TL;DR: This paper systematically explore the attack surface of the Blockchain technology, with an emphasis on public Blockchains, and outlines several attacks, including selfish mining, the 51% attack, DNS attacks, distributed denial-of-service (DDoS) attacks, consensus delay, orphaned and stale blocks, block ingestion, wallet thefts, smart contract attacks, and privacy attacks.
Abstract: In this paper, we systematically explore the attack surface of the Blockchain technology, with an emphasis on public Blockchains. Towards this goal, we attribute attack viability in the attack surface to 1) the Blockchain cryptographic constructs, 2) the distributed architecture of the systems using Blockchain, and 3) the Blockchain application context. To each of those contributing factors, we outline several attacks, including selfish mining, the 51% attack, DNS attacks, distributed denial-of-service (DDoS) attacks, consensus delay (due to selfish behavior or distributed denial-of-service attacks), Blockchain forks, orphaned and stale blocks, block ingestion, wallet thefts, smart contract attacks, and privacy attacks. We also explore the causal relationships between these attacks to demonstrate how various attack vectors are connected to one another. A secondary contribution of this work is outlining effective defense measures taken by the Blockchain technology or proposed by researchers to mitigate the effects of these attacks and patch associated vulnerabilities.
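
For the 51% attack, the standard quantitative reference point is Nakamoto's gambler's-ruin analysis in the Bitcoin whitepaper: with honest hash-power share p and attacker share q = 1 − p, the probability that an attacker ever catches up from z blocks behind is

```latex
q_z =
\begin{cases}
1, & q > p \quad \text{(a majority attacker eventually succeeds)},\\[2pt]
\left(q/p\right)^{z}, & q \le p \quad \text{(drops geometrically with confirmation depth } z\text{)}.
\end{cases}
```

This is why merchants wait for several confirmations: each additional block multiplies a minority attacker's success probability by q/p.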

Journal Article
TL;DR: This paper makes two contributions: the primary contribution is a systematic review of the literature over the period 2011–2019 on IIoT security, focusing on the security requirements of the IIoT; the secondary contribution is a reflection on how the relatively new paradigm of Fog computing can be leveraged to address these requirements, and thus improve the security of the IIoT.
Abstract: A key application of the Internet of Things (IoT) paradigm lies within industrial contexts. Indeed, the emerging Industrial Internet of Things (IIoT), commonly referred to as Industry 4.0, promises to revolutionize production and manufacturing through the use of large numbers of networked embedded sensing devices, and the combination of emerging computing technologies, such as Fog/Cloud Computing and Artificial Intelligence. The IIoT is characterized by an increased degree of inter-connectivity, which not only creates opportunities for the industries that adopt it, but also for cyber-criminals. Indeed, IoT security currently represents one of the major obstacles that prevent the widespread adoption of IIoT technology. Unsurprisingly, such concerns led to an exponential growth of published research over the last few years. To get an overview of the field, we deem it important to systematically survey the academic literature so far, and distill from it various security requirements as well as their popularity. This paper consists of two contributions: our primary contribution is a systematic review of the literature over the period 2011–2019 on IIoT Security, focusing in particular on the security requirements of the IIoT. Our secondary contribution is a reflection on how the relatively new paradigm of Fog computing can be leveraged to address these requirements, and thus improve the security of the IIoT.

Journal Article
TL;DR: In this paper, the authors conduct a systematic and in-depth survey of the ML- and DL-based resource management mechanisms in cellular wireless and IoT networks, and identify the future research directions in using ML and DL for resource allocation and management in IoT networks.
Abstract: Internet-of-Things (IoT) refers to a massively heterogeneous network formed through smart devices connected to the Internet. In the wake of disruptive IoT with a huge amount and variety of data, Machine Learning (ML) and Deep Learning (DL) mechanisms will play a pivotal role to bring intelligence to the IoT networks. Among other aspects, ML and DL can play an essential role in addressing the challenges of resource management in large-scale IoT networks. In this article, we conduct a systematic and in-depth survey of the ML- and DL-based resource management mechanisms in cellular wireless and IoT networks. We start with the challenges of resource management in cellular IoT and low-power IoT networks, review the traditional resource management mechanisms for IoT networks, and motivate the use of ML and DL techniques for resource management in these networks. Then, we provide a comprehensive survey of the existing ML- and DL-based resource management techniques in wireless IoT networks and the techniques specifically designed for HetNets, MIMO and D2D communications, and NOMA networks. To this end, we also identify the future research directions in using ML and DL for resource allocation and management in IoT networks.

Journal Article
TL;DR: This study surveys the state of the art and key research directions regarding optical wireless hybrid networks, being the first extensive survey dedicated to this topic, and outlines important challenges that need to be addressed for the successful deployment of optical wireless hybrid network systems for the 5G and IoT paradigms.
Abstract: Optical wireless communication (OWC) is an excellent complementary solution to its radio frequency (RF) counterpart. OWC technologies have been demonstrated to be able to support the high traffic generated by the massive connectivity of the Internet of Things (IoT) and upcoming 5th generation (5G) wireless communication systems. As the characteristics of OWC and RF are complementary, a combined application is regarded as a promising approach to support 5G and beyond communication systems. Hybrid RF/optical and optical/optical wireless systems offer an excellent solution for overcoming the limitations of individual systems as well as for providing the positive features of each technology. An RF/optical wireless hybrid system consists of both RF- and optical-based wireless technologies, whereas an optical/optical wireless hybrid system consists of two or more types of OWC technologies. The co-deployment of wireless systems can improve system performance in terms of throughput, reliability, and energy efficiency of individual networks. This study surveys the state of the art and key research directions regarding optical wireless hybrid networks, being the first extensive survey dedicated to this topic. We provide a technology overview of the existing literature on optical wireless hybrid networks, such as RF/optical and optical/optical systems. We consider the RF-based macrocell, small cell, wireless fidelity, and Bluetooth, as well as optical-based visible light communication, light fidelity, optical camera communication, and free-space optical communication technologies for different combinations of hybrid systems. Moreover, we consider underwater acoustic communication for hybrid acoustic/optical systems. The opportunities brought by hybrid systems are presented in detail. We outline important challenges that need to be addressed for the successful deployment of optical wireless hybrid network systems for the 5G and IoT paradigms.

Journal Article
TL;DR: The overall trends of MTD research are provided in terms of critical aspects of defense systems, for researchers who seek to develop proactive, adaptive MTD mechanisms.
Abstract: Significant effort has gone into reactive defense mechanisms, such as intrusion detection systems, to secure systems and networks over the last several decades. However, reactive security mechanisms are inherently limited because potential attackers cannot be prevented in advance. We face a reality of proliferating persistent, advanced, intelligent attacks, while defenders often lag far behind attackers in taking appropriate actions to thwart them. The concept of moving target defense (MTD) has emerged as a proactive defense mechanism aiming to prevent attacks. In this work, we conducted a comprehensive, in-depth survey discussing the following aspects of MTD: key roles, design principles, classifications, common attacks, key methodologies, important algorithms, metrics, evaluation methods, and application domains. We discuss the pros and cons of all aspects of MTD surveyed in this work. Lastly, we highlight insights and lessons learned from this study and suggest future work directions. The aim of this paper is to provide the overall trends of MTD research in terms of critical aspects of defense systems for researchers who seek to develop proactive, adaptive MTD mechanisms.
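
A minimal sketch of the MTD idea (a hypothetical port-hopping scheme, not a mechanism prescribed by the survey): both endpoints derive the current service port from a shared secret and the time epoch, so an attacker's reconnaissance goes stale within one epoch.

```python
import hashlib, hmac, time

SECRET = b"shared-demo-secret"   # hypothetical key shared by client and server

def service_port(epoch_len=30, lo=20000, hi=60000, now=None):
    """Derive the current service port from the shared secret and time epoch,
    so the exposed surface moves every epoch_len seconds."""
    epoch = int((now if now is not None else time.time()) // epoch_len)
    digest = hmac.new(SECRET, str(epoch).encode(), hashlib.sha256).digest()
    return lo + int.from_bytes(digest[:4], "big") % (hi - lo)

print(service_port())                        # port for the current 30 s epoch
print(service_port(now=time.time() + 31))    # a different port next epoch
```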

Journal Article
TL;DR: This paper identifies several important aspects of integrating blockchain and ML, including overview, benefits, and applications, and discusses some open issues, challenges, and broader perspectives that need to be addressed to jointly consider blockchain andML for communications and networking systems.
Abstract: Recently, with the rapid development of information and communication technologies, the infrastructures, resources, end devices, and applications in communications and networking systems are becoming much more complex and heterogeneous. In addition, the large volume of data and massive end devices may bring serious security, privacy, service provisioning, and network management challenges. In order to achieve decentralized, secure, intelligent, and efficient network operation and management, the joint consideration of blockchain and machine learning (ML) may bring significant benefits and has attracted great interest from both academia and industry. On one hand, blockchain can significantly facilitate training data and ML model sharing, decentralized intelligence, security, privacy, and trusted decision-making of ML. On the other hand, ML will have significant impacts on the development of blockchain in communications and networking systems, including energy and resource efficiency, scalability, security, privacy, and intelligent smart contracts. However, some essential open issues and challenges remain to be addressed before the widespread deployment of the integration of blockchain and ML, including resource management, data processing, scalable operation, and security issues. In this paper, we present a survey of the existing works on blockchain and ML technologies. We identify several important aspects of integrating blockchain and ML, including overview, benefits, and applications. Then we discuss some open issues, challenges, and broader perspectives that need to be addressed to jointly consider blockchain and ML for communications and networking systems.