
Showing papers in "IEEE Communications Magazine in 2019"


Journal Article•DOI•
TL;DR: Potential technologies for 6G to enable mobile AI applications, as well as AI-enabled methodologies for 6G network design and optimization, are discussed.
Abstract: The recent upsurge of diversified mobile applications, especially those supported by AI, is spurring heated discussions on the future evolution of wireless communications. While 5G is being deployed around the world, efforts from industry and academia have started to look beyond 5G and conceptualize 6G. We envision 6G to undergo an unprecedented transformation that will make it substantially different from the previous generations of wireless cellular systems. In particular, 6G will go beyond mobile Internet and will be required to support ubiquitous AI services from the core to the end devices of the network. Meanwhile, AI will play a critical role in designing and optimizing 6G architectures, protocols, and operations. In this article, we discuss potential technologies for 6G to enable mobile AI applications, as well as AI-enabled methodologies for 6G network design and optimization. Key trends in the evolution to 6G will also be discussed.

1,245 citations


Journal Article•DOI•
TL;DR: In this paper, the authors introduce a general framework for deep-learning-based traffic classification, present commonly used deep learning methods and their applications to traffic classification tasks, and discuss open problems, challenges, and opportunities in the field.
Abstract: Traffic classification has been studied for two decades and applied to a wide range of applications from QoS provisioning and billing in ISPs to security-related applications in firewalls and intrusion detection systems. Port-based, payload inspection, and classical machine learning methods have been used extensively in the past, but their accuracy has declined due to the dramatic changes in Internet traffic, particularly the increase in encrypted traffic. With the proliferation of deep learning methods, researchers have recently investigated these methods for traffic classification and reported high accuracy. In this article, we introduce a general framework for deep-learning-based traffic classification. We present commonly used deep learning methods and their application in traffic classification tasks. Then we discuss open problems, challenges, and opportunities for traffic classification.

330 citations


Journal Article•DOI•
TL;DR: In this paper, the authors provide an overview of channel coding techniques for URLLC, compare them in terms of performance and complexity, and identify and discuss several important research directions.
Abstract: This article reviews state-of-the-art channel coding techniques for URLLC. The stringent requirements of URLLC services, such as ultrahigh reliability and low latency, have made it the most challenging feature of 5G mobile networks. The problem is even more challenging for services beyond the 5G promise, such as tele-surgery and factory automation, which require latencies of less than 1 ms and packet error rates as low as 10^-9. This article provides an overview of channel coding techniques for URLLC and compares them in terms of performance and complexity. Several important research directions are identified and discussed in more detail.

293 citations


Journal Article•DOI•
Bin Cao, Long Zhang, Li Yun, Daquan Feng, Wei Cao
TL;DR: The basic concept of MEC and main applications are introduced, and existing fundamental works using various ML-based approaches are reviewed, and some potential issues of AI in MEC for future work are discussed.
Abstract: Multi-access edge computing (MEC), which is deployed in the proximity area of the mobile user side as a supplement to the traditional remote cloud center, has been regarded as a promising technique for 5G heterogeneous networks. With the assistance of MEC, mobile users can access computing resources effectively. Also, congestion in the core network can be alleviated by offloading. To adapt to stochastic and constantly varying environments, augmented intelligence (AI) is introduced in MEC for intelligent decision making. For this reason, several recent works have focused on intelligent offloading in MEC to harvest its potential benefits. Therefore, machine learning (ML)-based approaches, including reinforcement learning, supervised/unsupervised learning, deep learning, as well as deep reinforcement learning for AI in MEC have become hot topics. However, many technical challenges still remain to be addressed for AI in MEC. In this article, the basic concept of MEC and main applications are introduced, and existing fundamental works using various ML-based approaches are reviewed. Furthermore, some potential issues of AI in MEC for future work are discussed.

215 citations


Journal Article•DOI•
TL;DR: This article provides an accessible introduction to the emerging idea of Age of Information (AoI) that quantifies freshness of information and explores its possible role in the efficient design of freshness-aware Internet of Things (IoT).
Abstract: In this article, we provide an accessible introduction to the emerging idea of Age of Information (AoI) that quantifies freshness of information and explore its possible role in the efficient design of freshness-aware Internet of Things (IoT). We start by summarizing the concept of AoI and its variants with emphasis on the differences between AoI and other well-known performance metrics in the literature, such as throughput and delay. Building on this, we explore freshness-aware IoT design for a network in which IoT devices sense potentially different physical processes and are supposed to frequently update the status of these processes at a destination node (e.g., a cellular base station). Inspired by recent interest, we also assume that these IoT devices are powered by wireless energy transfer by the destination node. For this setting, we investigate the optimal sampling policy that jointly optimizes wireless energy transfer and scheduling of update packet transmissions from IoT devices with the goal of minimizing long-term weighted sum-AoI. Using this, we characterize the achievable AoI region. We also compare this AoI-optimal policy with the one that maximizes average throughput (throughput-optimal policy), and demonstrate the impact of system state on their structures. Several promising directions for future research are also presented.

202 citations
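The sawtooth age process that defines AoI is simple enough to compute directly. The sketch below is a toy illustration of the metric itself, not the article's energy-harvesting model; the update times and the assumption of zero age at t = 0 are illustrative.

```python
def average_aoi(updates, horizon):
    """Time-averaged Age of Information for a sawtooth age process.

    updates: list of (generation_time, delivery_time) pairs, sorted by
    delivery time. Age grows at unit rate and drops to
    (delivery_time - generation_time) when an update arrives.
    Assumes the destination starts with a fresh sample (age 0 at t = 0).
    """
    t, age, area = 0.0, 0.0, 0.0
    for gen, dly in updates:
        dt = dly - t
        area += age * dt + dt * dt / 2.0  # linear growth over [t, dly)
        t, age = dly, dly - gen           # age resets to the new sample's age
    dt = horizon - t
    area += age * dt + dt * dt / 2.0      # tail after the last delivery
    return area / horizon
```

A single update generated at t = 0 and delivered at t = 1 gives an average age of 1.0 over a horizon of 2; a throughput metric would not distinguish a fresh sample from a stale one, which is exactly the gap AoI is meant to capture.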


Journal Article•DOI•
TL;DR: A hybrid NOMA strategy that reaps the joint benefits of resource allocation and grant-free transmission is investigated to simultaneously accomplish high throughput, large connectivity, and low energy cost.
Abstract: NOMA has been recognized as a highly promising FRA technology to satisfy the requirements of the fifth generation era on high spectral efficiency and massive connectivity. Since the EE has become a growing concern in FRA from both the industrial and societal perspectives, this article discusses the sustainability issues of NOMA. We first thoroughly examine the theoretical power regions of NOMA to show the minimum transmission power with a fixed data rate requirement, demonstrating the EE performance advantage of NOMA over orthogonal multiple access. Then we explore the role of energy-aware resource allocation and grant-free transmission in further enhancing the EE performance of NOMA. Based on this exploration, a hybrid NOMA strategy that reaps the joint benefits of resource allocation and grant-free transmission is investigated to simultaneously accomplish high throughput, large connectivity, and low energy cost. Finally, we identify some important and interesting future directions for NOMA designers to follow in the next decade.

173 citations
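The minimum-power reasoning behind the NOMA power region can be sketched for a two-user downlink with successive interference cancellation (SIC); the symbols below are generic illustration, not the article's notation.

```latex
% Two-user downlink NOMA, |h_1|^2 \ge |h_2|^2 (user 1 strong), noise power N_0,
% rate targets R_1, R_2 (b/s/Hz). User 2 decodes its signal treating user 1's
% as noise; user 1 cancels user 2's signal via SIC, then decodes its own:
R_1 \le \log_2\!\Big(1 + \frac{P_1 |h_1|^2}{N_0}\Big), \qquad
R_2 \le \log_2\!\Big(1 + \frac{P_2 |h_2|^2}{P_1 |h_2|^2 + N_0}\Big).
% Meeting both targets with equality gives the minimum transmit powers:
P_1^{\min} = \big(2^{R_1}-1\big)\,\frac{N_0}{|h_1|^2}, \qquad
P_2^{\min} = \big(2^{R_2}-1\big)\Big(P_1^{\min} + \frac{N_0}{|h_2|^2}\Big).
```

The pair (P_1^min, P_2^min) traces the boundary of the power region as the rate targets vary; comparing its sum against the power an orthogonal scheme needs for the same rates is the kind of EE comparison the abstract describes.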


Journal Article•DOI•
TL;DR: This work proposes a new DRL-based offloading framework, which can efficiently learn the offloading policy uniquely represented by a specially designed S2S neural network, and shows that the method outperforms two heuristic baselines and achieves nearly optimal performance.
Abstract: MEC is an emerging paradigm that utilizes computing resources at the network edge to deploy heterogeneous applications and services. In the MEC system, mobile users and enterprises can offload computation-intensive tasks to nearby computing resources to reduce latency and save energy. When users make offloading decisions, the task dependency needs to be considered. Due to the NP-hardness of the offloading problem, the existing solutions are mainly heuristic, and therefore have difficulties in adapting to the increasingly complex and dynamic applications. To address the challenges of task dependency and adapting to dynamic scenarios, we propose a new DRL-based offloading framework, which can efficiently learn the offloading policy uniquely represented by a specially designed S2S neural network. The proposed DRL solution can automatically discover the common patterns behind various applications so as to infer an optimal offloading policy in different scenarios. Simulation experiments were conducted to evaluate the performance of the proposed DRL-based method with different data transmission rates and task numbers. The results show that our method outperforms two heuristic baselines and achieves nearly optimal performance.

164 citations
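The heuristic baselines the article compares against can be sketched as a greedy, dependency-aware placement rule. Everything below (the cost model, task names, a single transmission cost) is illustrative, and this is emphatically not the proposed DRL/S2S method.

```python
def greedy_offload(tasks, deps, local_cost, remote_cost, tx_cost):
    """Greedy dependency-aware offloading baseline (toy model).

    tasks: task ids in topological order; deps: {task: [parent tasks]}.
    Each task is placed on whichever side ("local" or "remote") is cheaper,
    where placing a task away from a parent incurs one transmission cost.
    """
    place, total = {}, 0.0
    for t in tasks:
        def cost(side):
            # data transfers needed for parents placed on the other side
            moves = sum(1 for p in deps.get(t, ()) if place[p] != side)
            run = local_cost[t] if side == "local" else remote_cost[t]
            return run + moves * tx_cost
        side = min(("local", "remote"), key=cost)
        total += cost(side)
        place[t] = side
    return place, total
```

Being myopic, this rule can lock itself into bad placements for later tasks in the dependency graph, which is the adaptivity gap the DRL-based framework is designed to close.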


Journal Article•DOI•
TL;DR: A green and sustainable virtual network embedding framework for cooperative edge computing in wireless-optical broadband access networks is put forward, which leverages a reliability function to determine the number of backup edge devices and embeds virtual networks onto suitable edge devices in CoTs.
Abstract: The proliferation of IoT devices, alongside the emergence of various cloud services, pushes the horizon of edge computing. By offering cloud capabilities at the network edge closer to mobile devices, edge computing is a promising paradigm to resolve several vital challenges in IoTs, such as bandwidth saturation, energy constraints, low latency transmission, and data security and privacy. To provide a comprehensive understanding of edge computing supported by the integration of IoTs and cloud computing, that is, CoTs, this article first discusses some distinct research directions in CoTs with respect to edge computing. Given the significance of energy efficiency and sustainability of edge deployment in CoTs, we put forward a green and sustainable virtual network embedding framework for cooperative edge computing in wireless-optical broadband access networks. Specifically, we leverage a reliability function to determine the number of backup edge devices, and embed virtual networks onto suitable edge devices in CoTs. Finally, several research challenges and open issues are discussed.

150 citations


Journal Article•DOI•
TL;DR: The slicing concept in the 5G RAN with the related challenges and research problems is investigated to identify the plausible options for implementing the slicing concept at the RAN level by the mobile network operator to respond to the needs of verticals.
Abstract: This article investigates the slicing concept in the 5G RAN with the related challenges and research problems. The objective is to identify the plausible options for implementing the slicing concept at the RAN level by the mobile network operator to respond to the needs of verticals. We start by identifying the different slice granularity options, that is, how to define slices by combining customer and service needs. We then present how the 5G NR features can be used for facilitating slice implementation and provide typical configurations for different slice types from technology and RAN architecture perspectives. The main challenges for RAN slicing are then discussed, with special attention to the resource allocation problem between slices sharing the same spectrum band. We also investigate the multi-tenant slicing implementation in terms of the openness of the network to third parties, which is regarded as a key issue that may encourage vertical players to use operators' networks rather than deploying their own infrastructure.

149 citations


Journal Article•DOI•
TL;DR: This work proposes leveraging "free" green energy to power IoT devices and enable wireless charging of these devices, lays out the basic design principles for the three proposed steps, sheds light on possible solutions, and presents the corresponding challenges.
Abstract: With the ongoing worldwide development of IoT, an unprecedented number of IoT devices imperatively consume a substantial amount of energy. IoT devices have been predicted to be the leading energy guzzler in Information and Communications Technology by 2020. Considering the finite amount of brown energy sources along with their potential harmful impacts to the climate and environment, we propose to leverage "free" green energy to power IoT devices and enable wireless charging of these devices. Specifically, we propose to green IoT in three steps, namely, ambient green energy harvesting, green energy wireless charging and green energy balancing, in which the latter step reinforces the former step to ensure the availability of green energy. We lay out the basic design principles for these three steps, shed some light on the solutions and present the corresponding challenges individually.

121 citations


Journal Article•DOI•
TL;DR: An overview of fog-computing-enabled mobile communication networks (FogMNW) is provided, covering network architecture, system capacity, and resource management, and a heterogeneous communication and hierarchical fog computing network architecture is proposed.
Abstract: The convergence of communication and computing (COM2P) has been taken as a promising solution for the sustainable development of mobile communication systems. The introduction of fog computing in future mobile networks makes COM2P possible. This article provides an overview of fog-computing-enabled mobile communication networks (FogMNW), including network architecture, system capacity and resource management. First, this article analyzes the heterogeneity of FogMNW with both advanced communication techniques and fog computing. Then a heterogeneous communication and hierarchical fog computing network architecture is proposed. With both communication and computing resources, FogMNW is enabled to achieve much higher capacity than conventional communication networks. This has been well demonstrated by the coded multicast scheme. Furthermore, a systematic management of communication and computing resources is necessary for FogMNW. By exploiting the communication load diversity in N cells, a communication-load-aware (CLA) scheme can achieve much higher computing resource efficiency than competing schemes. The performance gap increases with N, and CLA can improve efficiency by more than 100 percent when there are 14 cells.

Journal Article•DOI•
TL;DR: The construction of green IoT systems in the whole life cycle of agri-products will have great impact on farmers' interest in IoT techniques, and with the life cycle framework, emerging finance, operation, and management (FOM) issues are observed.
Abstract: The increasing population in the world forces humans to improve farm yields using advanced technologies. The Internet of Things (IoT) is one promising technique to achieve precision agriculture, which is expected to greatly increase yields. However, the large-scale application of IoT systems in agriculture is facing challenges such as huge investment in agriculture IoT systems and non-tech-savvy farmers. To identify these challenges, we summarize the applications of IoT techniques in agriculture in four categories: controlled environment planting, open-field planting, livestock breeding, and aquaculture and aquaponics. The focus on implementing agriculture IoT systems is suggested to be expanded from the growth cycle to the agri-products life cycle. Meanwhile, the energy concern should be considered in the implementation of agriculture IoT systems. The construction of green IoT systems in the whole life cycle of agri-products will have great impact on farmers' interest in IoT techniques. With the life cycle framework, emerging finance, operation, and management (FOM) issues in the implementation of green IoT systems in agriculture are observed, such as IoT finance, supply chain and big data financing, network nodes recharging and repairing, and IoT data management. These FOM issues call for innovative farm production modes and new types of agribusiness enterprises.

Journal Article•DOI•
TL;DR: This article outlines three battery charging options that may be considered by a network operator and uses simulations to demonstrate the performance impact of incorporating those options into a cellular network where UAV infrastructure provides wireless service.
Abstract: UAVs can play an important role in next generation cellular networks, acting as flying infrastructure that can serve ground users when regular infrastructure is overloaded or unavailable. As these devices operate wirelessly, they rely on an internal battery for their power supply, which limits the amount of time they can operate over an area of interest before having to recharge. To accommodate this limitation, UAV networks will have to rely on dedicated infrastructure to recharge the UAV in between deployments. In this article, we outline three battery charging options that may be considered by a network operator and use simulations to demonstrate the performance impact of incorporating those options into a cellular network where UAV infrastructure provides wireless service.

Journal Article•DOI•
TL;DR: In this article, the authors present the main objectives and timelines of this new 802.11be amendment, thoroughly describe its main candidate features and enhancements, and cover the important issue of coexistence with other wireless technologies.
Abstract: Wi-Fi technology is continuously innovating to cater to the growing customer demands, driven by the digitalization of everything, in the home as well as in enterprise and hotspot spaces. In this article, we introduce to the wireless community the next generation Wi-Fi, based on IEEE 802.11be Extremely High Throughput (EHT), present the main objectives and timelines of this new 802.11be amendment, thoroughly describe its main candidate features and enhancements, and cover the important issue of coexistence with other wireless technologies. We also provide simulation results to assess the potential throughput gains brought by 802.11be with respect to 802.11ax.

Journal Article•DOI•
TL;DR: In this paper, the authors proposed an RCLSTM model by introducing stochastic connectivity to conventional LSTM neurons, which exhibits a certain level of sparsity and leads to a decrease in computational complexity.
Abstract: Time series prediction can be generalized as a process that extracts useful information from historical records and then determines future values. Learning long-range dependencies that are embedded in time series is often an obstacle for most algorithms, whereas LSTM solutions, as a specific kind of scheme in deep learning, promise to effectively overcome the problem. In this article, we first give a brief introduction to the structure and forward propagation mechanism of LSTM. Then, aiming at reducing the considerable computing cost of LSTM, we put forward an RCLSTM model by introducing stochastic connectivity to conventional LSTM neurons. Therefore, RCLSTM exhibits a certain level of sparsity and leads to a decrease in computational complexity. In the field of telecommunication networks, the prediction of traffic and user mobility could directly benefit from this improvement: we leverage a realistic dataset to show that RCLSTM achieves prediction performance comparable to that of LSTM while requiring considerably less computing time. We strongly argue that RCLSTM is more competent than LSTM in latency-stringent or power-constrained application scenarios.
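The stochastic-connectivity idea can be sketched as a fixed binary mask drawn once and multiplied into the LSTM gate weight matrices; the connectivity probability, shapes, and seeding below are illustrative, and a real implementation would apply the mask inside an LSTM cell rather than to standalone matrices.

```python
import random

def random_connectivity_mask(rows, cols, p, seed=0):
    """Draw a fixed 0/1 mask: each connection is kept with probability p.

    The mask is decided once at construction time (not per step, unlike
    dropout), so dropped weights can be skipped entirely by a sparse
    implementation, reducing the computing cost of the recurrent cell.
    """
    rng = random.Random(seed)
    return [[1 if rng.random() < p else 0 for _ in range(cols)]
            for _ in range(rows)]

def apply_mask(weights, mask):
    """Elementwise product: dropped connections contribute nothing."""
    return [[w * m for w, m in zip(wrow, mrow)]
            for wrow, mrow in zip(weights, mask)]
```

With p = 1 the masked cell degenerates to a conventional LSTM, so the connectivity probability directly trades accuracy against computation.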

Journal Article•DOI•
TL;DR: In this article, the authors provide a comprehensive simulation study of TCP considering various factors such as the congestion control algorithm, including the recently proposed TCP BBR, edge vs. remote servers, handover and multi-connectivity, TCP packet size, and 3GPP stack parameters.
Abstract: The vast available spectrum in the millimeter-wave (mmWave) bands offers the possibility of multi-gigabit-per-second data rates for fifth generation cellular networks. However, mmWave capacity can be highly intermittent due to the vulnerability of mmWave signals to blockages and delays in directional searching. Such highly variable links present unique challenges for adaptive control mechanisms in transport layer protocols and end-to-end applications. This article considers the fundamental question of whether TCP, the most widely used transport protocol, will work in mmWave cellular systems. The article provides a comprehensive simulation study of TCP considering various factors such as the congestion control algorithm, including the recently proposed TCP BBR, edge vs. remote servers, handover and multi-connectivity, TCP packet size, and 3GPP stack parameters. We show that the performance of TCP on mmWave links is highly dependent on different combinations of these parameters, and identify the open challenges in this area.
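Why blockages are so punishing for loss-based congestion control can be caricatured in a few lines of additive-increase/multiplicative-decrease (AIMD) dynamics. The slot model, capacity, and blockage set below are illustrative toys, far cruder than the article's full-stack simulations.

```python
def aimd_on_intermittent_link(capacity, blocked, slots):
    """Goodput of a loss-based AIMD sender over an on/off link.

    Each slot the sender delivers min(cwnd, capacity) packets. A slot in
    `blocked` models a mmWave blockage: everything is lost and the
    congestion window halves; otherwise the window grows by one per slot.
    """
    cwnd, delivered = 1.0, 0.0
    for t in range(slots):
        if t in blocked:
            cwnd = max(cwnd / 2.0, 1.0)       # multiplicative decrease on loss
        else:
            delivered += min(cwnd, capacity)  # goodput this slot
            cwnd = min(cwnd + 1.0, capacity)  # additive increase
    return delivered
```

Even a single one-slot blockage in this tiny run cuts goodput by more than half (7.0 packets delivered vs. 15.0 without it), hinting at why TCP behavior on intermittent mmWave links depends so strongly on the congestion control algorithm.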

Journal Article•DOI•
TL;DR: In this paper, a distributed risk-aware radio resource management (RRM) solution is proposed for coexistence of scheduled and non-scheduled URLLC traffic. And the proposed solution benefits from hybrid orthogonal/non-orthogonal radio resource slicing, and proactively regulates the spectrum needed for satisfying the delay/reliability requirement of each URLLc traffic type.
Abstract: Supporting ultra-reliable low-latency communications (URLLC) is a major challenge of 5G wireless networks. Stringent delay and reliability requirements need to be satisfied for both scheduled and non-scheduled URLLC traffic to enable a diverse set of 5G applications. Although physical and media access control layer solutions have been investigated to satisfy scheduled URLLC traffic, there is a lack of studies on enabling transmission of non-scheduled URLLC traffic, especially in coexistence with the scheduled URLLC traffic. Machine learning (ML) is an important enabler to manage this coexistence scenario due to its ability to exploit spatial/temporal correlation in user behaviors and use of radio resources. In this article, we first study the coexistence design challenges, especially the radio resource management (RRM) problem, and propose a distributed risk-aware ML solution for RRM. The proposed solution benefits from hybrid orthogonal/non-orthogonal radio resource slicing, and proactively regulates the spectrum needed for satisfying the delay/reliability requirement of each URLLC traffic type. A case study is introduced to investigate the potential of the proposed RRM in serving coexisting URLLC traffic types. The results further provide insights on the benefits of leveraging intelligent RRM. For example, a 75 percent increase in data rate with respect to the conservative design approach for the scheduled traffic is achieved, while the 99.99 percent reliability of both scheduled and non-scheduled traffic types is satisfied.

Journal Article•DOI•
TL;DR: Numerical studies show that the network throughput can increase by eight times through adopting stochastic online learning as compared to existing offline implementations of MapReduce, the widely adopted big data analytic framework.
Abstract: ML has been increasingly adopted in wireless communications, with popular techniques, such as supervised, unsupervised, and reinforcement learning, applied to traffic classification, channel encoding/decoding, and cognitive radio. This article discusses a different class of ML technique, stochastic online learning, and its promising applications to MEC. Based on stochastic gradient descent, stochastic online learning learns from the changes of dynamic systems (i.e., the gradient of the Lagrange multipliers) rather than training data, decouples tasks between time slots and edge devices, and asymptotically minimizes the time-averaged operational cost of MEC in a fully distributed fashion with the increase of the learning time. By taking the widely adopted big data analytic framework MapReduce as an example, numerical studies show that the network throughput can increase by eight times through adopting stochastic online learning as compared to existing offline implementations.
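Learning from "the gradient of the Lagrange multipliers" rather than from training data can be illustrated with a drift-plus-penalty style virtual queue. The threshold rule, prices, and demands below are a toy stand-in, not the article's MEC/MapReduce formulation.

```python
def drift_plus_penalty(prices, demands, V=10.0):
    """Online control driven by a virtual queue (Lagrange-multiplier proxy).

    Each slot we pick a service amount x in {0, 1} to minimize the
    drift-plus-penalty expression V*price*x + Q*(demand - x); over this
    action set that reduces to a threshold rule. The backlog Q plays the
    role of a stochastically updated Lagrange multiplier: the controller
    uses no training data, only the running state of the system.
    """
    Q, xs = 0.0, []
    for price, demand in zip(prices, demands):
        x = 1.0 if Q >= V * price else 0.0  # serve only when backlog justifies the cost
        Q = max(Q + demand - x, 0.0)        # virtual-queue / multiplier update
        xs.append(x)
    return xs, Q
```

Larger V weights cost more heavily against backlog, so the time-averaged cost approaches the offline optimum at the price of a longer queue, which is the usual drift-plus-penalty trade-off.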

Journal Article•DOI•
TL;DR: This article describes how to replicate data from the cloud to the edge, and then to mobile devices to provide faster data access for users, and shows how services can be composed in crowded environments using service-specific overlays.
Abstract: Densely crowded environments such as stadiums and metro stations have shown shortcomings when users request data and services simultaneously. This is due to the excessive amount of requested and generated traffic from the user side. Based on the wide availability of user smart-mobile devices, and noting their technological advancements, devices are not being categorized only as data/service requesters anymore, but are readily being transformed into data/service-providing network-side tools. In essence, to offload some of the workload burden from the cloud, data can be either fully or partially replicated to edge and mobile devices for faster and more efficient data access in such dense environments. Moreover, densely crowded environments provide an opportunity to deliver, in a timely manner, through node collaboration, enriched user-specific services using the replicated data and device-specific capabilities. In this article, we first highlight the challenges that arise in densely crowded environments in terms of data/service management and delivery. Then we show how data replication and service composition are considered promising solutions for data and service management in densely crowded environments. Specifically, we describe how to replicate data from the cloud to the edge, and then to mobile devices to provide faster data access for users. We also discuss how services can be composed in crowded environments using service-specific overlays. We conclude the article with most of the open research areas that remain to be investigated.

Journal Article•DOI•
TL;DR: A novel centralized and converged analog Fiber-Wireless Fronthaul architecture, specifically designed to facilitate mmWave access in the above scenarios, is proposed; it can facilitate Gb/s-enabled data transport while abiding by the 5G low-latency KPIs in various network traffic conditions.
Abstract: mmWave radio, although instrumental for achieving the required 5G capacity KPIs, necessitates the need for a very large number of access points, which places an immense strain on the current network infrastructure. In this article, we try to identify the major challenges that inhibit the design of the Next Generation Fronthaul Interface in two upcoming, distinctly dense environments: in Urban 5G deployments in metropolitan areas, and in ultra-dense Hotspot scenarios. We then propose a novel centralized and converged analog Fiber-Wireless Fronthaul architecture, specifically designed to facilitate mmWave access in the above scenarios. The proposed architecture leverages optical transceivers, optical add/drop multiplexers and optical beamforming integrated photonics towards a Digital Signal Processing analog fronthaul. The functional administration of the fronthaul infrastructure is achieved by means of a packetized Medium Transparent Dynamic Bandwidth Allocation protocol. Preliminary results show that the protocol can facilitate Gb/s-enabled data transport while abiding by the 5G low-latency KPIs in various network traffic conditions.

Journal Article•DOI•
TL;DR: A flexible, rapidly deployable, cross-layer artificial intelligence (AI)-based framework to meet the imminent and future demands of 5G and beyond is presented, and the value of AI for enabling network evolution is discussed.
Abstract: Massive multiple-input multiple-output antenna systems, millimeter-wave communications, and ultra-dense networks have been widely perceived as the three key enablers that facilitate the development and deployment of 5G systems. This article discusses the intelligent agent that combines sensing, learning, and optimizing to facilitate these enablers. We present a flexible, rapidly deployable, and cross-layer artificial intelligence (AI)-based framework to meet the imminent and future demands of 5G and beyond. We present example AI-enabled 5G use cases that accommodate important 5G-specific capabilities and discuss the value of AI for enabling network evolution.

Journal Article•DOI•
TL;DR: A hierarchical system architecture is proposed, which aims at synthesizing the paradigms of software defined networking and fog computing in IoV and best exploiting their synergistic effects on information services.
Abstract: Recent advances in wireless communication, sensing, computation and control technologies have paved the way for the development of a new era of Internet of Vehicles (IoV). Demanded by the requirements of information-centric and data-driven intelligent transportation systems (ITS), it is of great significance to explore new paradigms of IoV in supporting large-scale, real-time, and reliable information services. In this article, we propose a hierarchical system architecture, which aims at synthesizing the paradigms of software defined networking and fog computing in IoV and best exploiting their synergistic effects on information services. Specifically, a four-layer architecture is designed, comprising the application layer, the control layer, the virtualization layer, and the data layer, with objectives of enabling logically centralized control via the separation of the control plane and the data plane; facilitating adaptive resource allocation and QoS oriented services based on network functions virtualization and network slicing, and enhancing system scalability, responsiveness, and reliability by exploiting the networking, computation, communication, and storage capacities of fog-based services. On this basis, we further analyze newly arising challenges and discuss future research directions by presenting a cross-layer protocol stack. Finally, for the proof of concept, we implement the system prototype and give two case studies in real-world IoV environments. The results of field tests not only demonstrate the great potential of the new architecture, but also give insight into the development of future ITS.

Journal Article•DOI•
TL;DR: The distribution of a typical additive white Gaussian noise channel is successfully approximated by using the proposed GAN-based channel modeling framework, thus verifying its good performance and effectiveness.
Abstract: In modern wireless communication systems, wireless channel modeling has always been a fundamental task in system design and performance optimization. Traditional channel modeling methods, such as ray-tracing and geometry-based stochastic channel models, require in-depth domain-specific knowledge and technical expertise in radio signal propagations across electromagnetic fields. To avoid these difficulties and complexities, a novel generative adversarial network (GAN) framework is proposed for the first time to address the problem of autonomous wireless channel modeling without complex theoretical analysis or data processing. Specifically, the GAN is trained by raw measurement data to reach the Nash equilibrium of a MinMax game between a channel data generator and a channel data discriminator. Once this process converges, the resulting channel data generator is extracted as the target channel model for a specific application scenario. To demonstrate, the distribution of a typical additive white Gaussian noise channel is successfully approximated by using the proposed GAN-based channel modeling framework, thus verifying its good performance and effectiveness.
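The MinMax game between the channel data generator and discriminator is the standard GAN objective; written out for the AWGN demonstration (generic symbols below, not the article's notation):

```latex
% Generator G maps latent noise z ~ p_z to synthetic channel samples;
% discriminator D scores whether a sample came from measurements or from G.
\min_{G}\,\max_{D}\;
\mathbb{E}_{x \sim p_{\mathrm{data}}}\big[\log D(x)\big]
+ \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z))\big)\big]
% For the AWGN case the measured samples follow a zero-mean Gaussian,
% and at the Nash equilibrium the generator reproduces that distribution:
p_G = p_{\mathrm{data}} = \mathcal{N}(0, \sigma^2), \qquad D(x) = \tfrac{1}{2}.
```

The equilibrium condition is what justifies extracting the trained generator as the channel model: once D cannot distinguish generated from measured samples, sampling G(z) is statistically equivalent to sampling the channel.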

Journal Article•DOI•
TL;DR: This paper proposes SecureNet, the first verifiable and privacy-preserving prediction protocol to protect model integrity and user privacy in DNNs, and shows the superior performance of SecureNet for detecting various integrity attacks against DNN models.
Abstract: Benefiting from algorithmic advances, massive data, and powerful computing resources, deep learning has been explored in a wide variety of fields and has produced unparalleled performance results. It plays a vital role in daily applications and is also subtly changing the rules, habits, and behaviors of society. However, data-based learning strategies inevitably pose potential security and privacy threats, raising public and government concerns about their deployment in the real world. In this article, we mainly focus on data security issues in deep learning. We first investigate the potential threats of deep learning in this area, and then present the latest countermeasures based on various underlying technologies, discussing the challenges and research opportunities on both offense and defense. Then, we propose SecureNet, the first verifiable and privacy-preserving prediction protocol to protect model integrity and user privacy in DNNs. It can significantly resist various security and privacy threats during the prediction process. We evaluate SecureNet on a real dataset, and the experimental results show its superior performance in detecting various integrity attacks against DNN models.

Journal Article•DOI•
TL;DR: It is shown that the effects of self-interference and inter-user interference due to full-duplex operation can be effectively mitigated by optimizing/enhancing the beamforming, power control, and link scheduling techniques.
Abstract: In this article, we study the combination of NOMA and full-duplex operation as a promising solution to improve the capacity of next-generation wireless systems. We study the application of full-duplex NOMA transmission in wireless cellular, relay and cognitive radio networks, and demonstrate achievable performance gains. It is shown that the effects of self-interference and inter-user interference due to full-duplex operation can be effectively mitigated by optimizing/enhancing the beamforming, power control, and link scheduling techniques. We also discuss research challenges and future directions so that full-duplex NOMA can be made practical in the near future.
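As a rough numerical illustration of the trade-offs the abstract discusses, the sketch below evaluates textbook rate expressions for a two-user downlink NOMA pair with SIC, plus a full-duplex uplink degraded by residual self-interference. The link gains, power split `alpha`, and residual-interference factor `beta_si` are invented for illustration, not the paper's system model.

```python
import math

def rate(snr):
    """Shannon spectral efficiency in b/s/Hz."""
    return math.log2(1 + snr)

# Illustrative (assumed) parameters, all in linear scale with unit noise power.
P_BS = 10.0                 # BS transmit power
g_near, g_far = 4.0, 0.5    # downlink channel gains (near user is stronger)
alpha = 0.2                 # NOMA power fraction allocated to the near user
P_UL, g_ul = 2.0, 1.0       # uplink user power and channel gain
beta_si = 0.05              # residual self-interference after cancellation

# Far user decodes its own signal, treating the near user's signal as noise.
snr_far = (1 - alpha) * P_BS * g_far / (alpha * P_BS * g_far + 1)
r_far = rate(snr_far)
# Near user first removes the far user's signal via SIC, then decodes its own.
r_near = rate(alpha * P_BS * g_near)
# The full-duplex uplink at the BS suffers residual self-interference.
r_ul = rate(P_UL * g_ul / (beta_si * P_BS + 1))

print(f"far {r_far:.2f}, near {r_near:.2f}, uplink {r_ul:.2f} b/s/Hz")
```

Sweeping `alpha` or `beta_si` in this toy model shows the point the abstract makes: power control and better self-interference cancellation directly trade off the three rates.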

Journal Article•DOI•
TL;DR: A novel spectrum management (SM) architecture for UAV-assisted cellular networks is proposed that accounts for the special features of mmWave, along with SM techniques for opportunistic utilization of a low-altitude UAV swarm using multi-mode radio access technologies (RATs).
Abstract: In wireless communications, as the spatio-temporal distribution of traffic is dynamic, performance degradation of cellular networks becomes inevitable. Especially in catastrophic scenarios or hot-spot areas, terrestrial base stations may be poorly functioning and/or congested; thus, deploying ASCs carried by a UAV swarm is reasonable and cost-effective. Moreover, services such as disaster evaluation and live broadcasting require high-definition video streams, which undoubtedly need broadband wireless transmission. To achieve this goal, the mmWave approach is introduced in the UAV swarm. However, wireless backhaul links, the mobility of UAVs, and system coexistence hamper performance. In this article, we propose a novel spectrum management (SM) architecture for UAV-assisted cellular networks. Considering the special features of mmWave, we also study SM techniques for opportunistic utilization of a low-altitude UAV swarm using multi-mode radio access technologies (RATs). Both the motivations for and the challenges of the proposed SM architecture for the UAV swarm are analyzed. To evaluate the performance of the proposed mmWave-based wireless backhaul in UAV-assisted cellular networks, different SM schemes are discussed and verified with numerical results in five typical scenarios.

Journal Article•DOI•
TL;DR: A deep-reinforcement-learning-based smart routing algorithm to make the distributed computing and communication infrastructure thoroughly viable while simultaneously satisfying the latency constraints of service requests from the crowd is proposed.
Abstract: The concept of smart city has been flourishing based on the prosperous development of various advanced technologies: mobile edge computing (MEC), ultra-dense networking, and software defined networking. However, it becomes increasingly complicated to design routing strategies to meet the stringent and ever-changing network requirements due to the dynamic distribution of the crowd in different sectors of smart cities. To alleviate the network congestion and balance the network load for supporting smart city services with dramatic disparities, we design a deep-reinforcement-learning-based smart routing algorithm to make the distributed computing and communication infrastructure thoroughly viable while simultaneously satisfying the latency constraints of service requests from the crowd. Besides the proposed algorithm, extensive numerical results are also presented to validate its efficacy.
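The latency-aware routing idea can be shown in miniature with tabular Q-learning on a made-up four-node topology; the paper's algorithm is deep RL over a far richer state space, so this is only a conceptual stand-in with assumed link latencies.

```python
import random

# Toy sketch: tabular Q-learning learns next hops that minimize end-to-end
# latency on a small edge-network graph (links labeled with latency in ms).
LINKS = {
    'A': {'B': 10.0, 'C': 2.0},
    'B': {'D': 1.0},
    'C': {'B': 2.0, 'D': 8.0},
    'D': {},
}
SRC, DST = 'A', 'D'
ALPHA, GAMMA, EPS = 0.5, 1.0, 0.2

random.seed(1)
Q = {u: {v: 0.0 for v in nbrs} for u, nbrs in LINKS.items()}  # cost-to-go estimates

for _ in range(2000):                          # training episodes
    node = SRC
    while node != DST:
        nbrs = list(LINKS[node])
        if random.random() < EPS:              # epsilon-greedy exploration
            nxt = random.choice(nbrs)
        else:
            nxt = min(nbrs, key=lambda v: Q[node][v])
        # We learn latency cost directly (equivalently, reward = -latency).
        future = 0.0 if nxt == DST else min(Q[nxt].values())
        target = LINKS[node][nxt] + GAMMA * future
        Q[node][nxt] += ALPHA * (target - Q[node][nxt])
        node = nxt

# Greedy rollout of the learned policy.
path, node = [SRC], SRC
while node != DST:
    node = min(LINKS[node], key=lambda v: Q[node][v])
    path.append(node)
print(path)
```

On this graph the learned policy routes A→C→B→D (total 5 ms) rather than the direct but slower A→B→D, which is the load-aware behavior the abstract targets at city scale.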

Journal Article•DOI•
TL;DR: This article proposes a fog-computing-enabled cognitive network functions virtualization approach for an information-centric future Internet, including an on-demand caching function virtualization scheme and a communication scheme between the fog nodes and the future Internet nodes for the forwarding process.
Abstract: Information-centric networking (ICN) is an important trend that will impact the future of the Internet. ICN caters to large content consumption patterns while achieving high performance. New features in the information-centric future Internet, such as caching, name-based routing, and content-based security, bring novel challenges to a decentralized environment. On one hand, the processing capabilities on the edge in an information-centric future Internet need to implement smart analysis for large quantities of content. On the other hand, the computational and storage resources need to be configured and controlled on demand and based on cognition of the content from users. To address these challenges, this article proposes a fog-computing-enabled cognitive network functions virtualization approach for an information-centric future Internet. We first propose an on-demand caching function virtualization scheme and design a communication scheme between the fog nodes and the future Internet nodes for the forwarding process. Then, to attain smart control for related operations (i.e., routing, cache policy, and security), we propose a control function virtualization approach. Finally, a cognitive resource configuration mechanism is proposed. The simulation results show the advantages and efficiency of the proposed approach.
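As a conceptual sketch of on-demand, name-based caching at a fog node, the toy class below admits a content item only once it has proven popular and evicts in LRU order. The class name, threshold policy, and fields are illustrative assumptions, not the paper's scheme.

```python
from collections import Counter, OrderedDict

class FogCache:
    """Sketch of an on-demand, name-based cache at a fog node: content is
    admitted only after THRESHOLD requests for its name have been observed,
    and the least recently used item is evicted when capacity is exceeded."""
    def __init__(self, capacity=2, threshold=2):
        self.capacity, self.threshold = capacity, threshold
        self.store = OrderedDict()      # name -> content, in LRU order
        self.interest = Counter()       # name -> observed request count

    def request(self, name, fetch):
        self.interest[name] += 1
        if name in self.store:          # cache hit: refresh recency
            self.store.move_to_end(name)
            return self.store[name], True
        content = fetch(name)           # miss: forward toward the origin
        if self.interest[name] >= self.threshold:   # popular: cache on demand
            self.store[name] = content
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)      # evict the LRU entry
        return content, False

cache = FogCache()
origin = lambda name: f"data({name})"   # stand-in for the future Internet node
for name in ['a', 'a', 'a', 'b']:
    content, hit = cache.request(name, origin)
print(hit, sorted(cache.store))
```

The threshold acts as the "cognition" trigger: the caching function is only exercised for names whose demand justifies it, while one-off requests are simply forwarded.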

Journal Article•DOI•
TL;DR: This article brings UAVs into V2X communications to enhance V2X security from the physical layer security perspective, presenting security problems in V2X systems, highlighting security threats in V2X networks, and illustrating some potential applications of UAVs in V2X security.
Abstract: Enabling vehicles to communicate with other vehicles, infrastructure, pedestrians, and everything for use cases such as road safety, automatic driving, and infotainment is important for future cellular networks. However, some vehicle-to-everything (V2X) services need secure transmissions and should be protected from eavesdropping. Physical layer security, an information-theoretical framework, utilizes the randomness of the underlying channels to ensure secrecy at the physical layer. Unlike ground communication nodes, unmanned aerial vehicles (UAVs) create line-of-sight connections to vehicles and mobile users, making them an ideal platform for physical layer security strategies. In this article, we bring UAVs into V2X communications to enhance V2X security from the physical layer security perspective. To be specific, we present security problems in V2X systems, highlight security threats in V2X networks, and illustrate some potential applications of UAVs in V2X security. Open problems for UAV-assisted V2X secure communications are also discussed.
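The physical-layer-security intuition, that a UAV's line-of-sight link widens the capacity gap over the eavesdropper's channel, can be made concrete with the classic secrecy-rate formula. The SNR values below are invented purely for illustration.

```python
import math

def secrecy_rate(snr_legit, snr_eve):
    """Secrecy rate in b/s/Hz: the capacity advantage of the legitimate
    link over the eavesdropper's link, floored at zero."""
    return max(0.0, math.log2(1 + snr_legit) - math.log2(1 + snr_eve))

# Illustrative (assumed) numbers: a UAV's line-of-sight link to the vehicle
# is strong, while blockage leaves the eavesdropper with a weaker channel.
ground = secrecy_rate(snr_legit=4.0, snr_eve=3.0)   # similar ground links
uav = secrecy_rate(snr_legit=15.0, snr_eve=3.0)     # LoS boosts legitimate SNR
print(f"ground {ground:.2f} vs UAV-assisted {uav:.2f} b/s/Hz")
```

When the eavesdropper's channel is at least as good as the legitimate one, the formula returns zero, which is exactly why boosting the legitimate link via UAV placement matters.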

Journal Article•DOI•
TL;DR: By introducing atomic meta-transactions, SPB eliminates reliance on a TTP to ensure that both energy producer and consumer commit to their obligations, and it reduces monetary cost and delay compared to existing solutions.
Abstract: Blockchain is increasingly being used to provide a distributed, secure, trusted, and private framework for energy trading in smart grids. However, existing solutions suffer from a lack of privacy, processing and packet overheads, and reliance on a trusted third party (TTP) to secure the trade. To address these challenges, we propose a secure private blockchain (SPB) framework. SPB enables energy producers and consumers to directly negotiate the energy price. To reduce the associated overheads, we propose a routing method that routes packets based on the destination public key (PK). SPB eliminates the reliance on a TTP to ensure that both energy producer and consumer commit to their obligations by introducing atomic meta-transactions, which consist of two transactions: first, the consumer generates a commit-to-pay (CTP) transaction, committing to pay the energy price to the producer. On receipt of the energy, the consumer's smart meter generates an energy receipt confirmation (ERC), which triggers a smart contract to transfer the price committed in the CTP to the energy producer. To verify that the ERC is generated by a genuine smart meter, SPB supports authentication of anonymous smart meters to prevent malicious nodes from linking ERC transactions and thus enhance user privacy. Qualitative security analysis shows the resilience of SPB against a range of attacks. Implementation results demonstrate that SPB reduces monetary cost and delay compared to existing solutions.
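The two-step atomic flow described above can be mimicked in a few lines of plain Python: a commit-to-pay escrows the price, and only an energy receipt confirmation from an authenticated meter releases it. The class and field names are illustrative; in SPB these are blockchain transactions with cryptographic meter authentication, not an in-memory set.

```python
class AtomicMetaTransaction:
    """Sketch of SPB's two-step trade: a commit-to-pay (CTP) escrows the
    price, and an energy receipt confirmation (ERC) from an authenticated
    smart meter releases it. Names and fields are illustrative only."""
    def __init__(self, consumer, producer, price, trusted_meters):
        self.consumer, self.producer, self.price = consumer, producer, price
        self.trusted_meters = trusted_meters   # registered meter public keys
        self.committed = False                 # CTP seen on the ledger?
        self.settled = False                   # payment transferred?

    def commit_to_pay(self):
        self.committed = True                  # consumer's CTP is recorded

    def confirm_receipt(self, meter_pk):
        # The ERC triggers settlement only if the meter is genuine and a CTP
        # was committed first; otherwise nothing is transferred.
        if self.committed and meter_pk in self.trusted_meters:
            self.settled = True
        return self.settled

trade = AtomicMetaTransaction('alice', 'bob', price=5, trusted_meters={'pk1'})
assert not trade.confirm_receipt('pk1')    # no CTP yet: nothing moves
trade.commit_to_pay()
assert not trade.confirm_receipt('rogue')  # unauthenticated meter rejected
print(trade.confirm_receipt('pk1'))        # genuine ERC settles the trade
```

The atomicity the abstract claims is visible in the guard: payment can never move without both a prior CTP and a genuine ERC, so neither party can walk away with the other's side of the trade.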