Showing papers on "Wireless" published in 2018


Posted Content
TL;DR: This tutorial provides key guidelines on how to analyze, optimize, and design UAV-based wireless communication systems on the basis of 3D deployment, performance analysis, channel modeling, and energy efficiency.
Abstract: The use of flying platforms such as unmanned aerial vehicles (UAVs), popularly known as drones, is rapidly growing. In particular, with their inherent attributes such as mobility, flexibility, and adaptive altitude, UAVs admit several key potential applications in wireless systems. On the one hand, UAVs can be used as aerial base stations to enhance coverage, capacity, reliability, and energy efficiency of wireless networks. On the other hand, UAVs can operate as flying mobile terminals within a cellular network. Such cellular-connected UAVs can enable several applications ranging from real-time video streaming to item delivery. In this paper, a comprehensive tutorial on the potential benefits and applications of UAVs in wireless communications is presented. Moreover, the important challenges and the fundamental tradeoffs in UAV-enabled wireless networks are thoroughly investigated. In particular, the key UAV challenges such as three-dimensional deployment, performance analysis, channel modeling, and energy efficiency are explored along with representative results. Then, open problems and potential research directions pertaining to UAV communications are introduced. Finally, various analytical frameworks and mathematical tools such as optimization theory, machine learning, stochastic geometry, transport theory, and game theory are described. The use of such tools for addressing unique UAV problems is also presented. In a nutshell, this tutorial provides key guidelines on how to analyze, optimize, and design UAV-based wireless communication systems.

1,071 citations


Journal ArticleDOI
TL;DR: An in-depth study on the performance of deep learning-based radio signal classification for radio communications signals considers a rigorous baseline method using higher-order moments and strong boosted gradient tree classification, and compares performance between the two approaches across a range of configurations and channel impairments.
Abstract: We conduct an in-depth study on the performance of deep learning-based radio signal classification for radio communications signals. We consider a rigorous baseline method using higher-order moments and strong boosted gradient tree classification, and compare performance between the two approaches across a range of configurations and channel impairments. We consider the effects of carrier frequency offset, symbol rate, and multipath fading in simulation, and conduct over-the-air measurement of radio classification performance in the lab using software radios, and we compare performance and training strategies for both. Finally, we conclude with a discussion of remaining problems and design considerations for using such techniques.

865 citations
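
The baseline described above pairs higher-order statistical moments with boosted tree classification. A minimal sketch of that style of pipeline is shown below, assuming synthetic complex baseband data; the exact moment/cumulant set, modulations, and classifier settings are illustrative assumptions, not those used in the paper.

```python
# Sketch of a higher-order-moment + gradient-boosted-tree baseline for
# modulation classification. Illustrative only: feature set, data, and
# hyperparameters are assumptions, not the paper's configuration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synth_symbols(mod, n):
    """Generate toy noisy symbol sequences for two example modulations."""
    if mod == "bpsk":
        s = rng.choice([-1.0, 1.0], size=n) + 0j
    else:  # "qpsk"
        s = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
    return s + 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

def moment_features(x):
    """Higher-order moments M_pq = E[x^(p-q) conj(x)^q] used as features."""
    feats = []
    for p, q in [(2, 0), (2, 1), (4, 0), (4, 1), (4, 2), (6, 0), (6, 3)]:
        m = np.mean(x ** (p - q) * np.conj(x) ** q)
        feats += [np.abs(m), np.angle(m)]
    return np.array(feats)

X, y = [], []
for label, mod in enumerate(["bpsk", "qpsk"]):
    for _ in range(400):
        X.append(moment_features(synth_symbols(mod, 1024)))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("baseline accuracy:", clf.score(X_te, y_te))
```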


Journal ArticleDOI
TL;DR: This article proposes a radically different approach, enabling deterministic, programmable control over the behavior of wireless environments, using the so-called HyperSurface tile, a novel class of planar meta-materials that can interact with impinging electromagnetic waves in a controlled manner.
Abstract: Electromagnetic waves undergo multiple uncontrollable alterations as they propagate within a wireless environment. Free space path loss, signal absorption, as well as reflections, refractions, and diffractions caused by physical objects within the environment highly affect the performance of wireless communications. Currently, such effects are intractable to account for and are treated as probabilistic factors. This article proposes a radically different approach, enabling deterministic, programmable control over the behavior of wireless environments. The key enabler is the so-called HyperSurface tile, a novel class of planar meta-materials that can interact with impinging electromagnetic waves in a controlled manner. The HyperSurface tiles can effectively re-engineer electromagnetic waves, including steering toward any desired direction, full absorption, polarization manipulation, and more. Multiple tiles are employed to coat objects such as walls, furniture, and overall, any objects in indoor and outdoor environments. An external software service calculates and deploys the optimal interaction types per tile to best fit the needs of communicating devices. Evaluation via simulations highlights the potential of the new concept.

860 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a unified MEC-WPT design by considering a wireless powered multiuser MEC system, where a multiantenna access point (AP) integrated with an MEC server broadcasts wireless power to charge multiple users and each user node relies on the harvested energy to execute computation tasks.
Abstract: Mobile-edge computing (MEC) and wireless power transfer (WPT) have been recognized as promising techniques in the Internet of Things era to provide massive low-power wireless devices with enhanced computation capability and sustainable energy supply. In this paper, we propose a unified MEC-WPT design by considering a wireless powered multiuser MEC system, where a multiantenna access point (AP) (integrated with an MEC server) broadcasts wireless power to charge multiple users and each user node relies on the harvested energy to execute computation tasks. With MEC, these users can execute their respective tasks locally by themselves or offload all or part of them to the AP based on a time-division multiple access protocol. Building on the proposed model, we develop an innovative framework to improve the MEC performance, by jointly optimizing the energy transmit beamforming at the AP, the central processing unit frequencies and the numbers of offloaded bits at the users, as well as the time allocation among users. Under this framework, we address a practical scenario where latency-limited computation is required. In this case, we develop an optimal resource allocation scheme that minimizes the AP’s total energy consumption subject to the users’ individual computation latency constraints. Leveraging the state-of-the-art optimization techniques, we derive the optimal solution in a semiclosed form. Numerical results demonstrate the merits of the proposed design over alternative benchmark schemes.

752 citations
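
One concrete reasoning step behind such latency-constrained designs is the commonly used dynamic-power model for local computing; it is a standard assumption in this literature (not necessarily the exact expressions derived in the paper) and shows why the latency constraint directly sets the minimum-energy CPU frequency:

```latex
% Illustrative standard model: a user must execute C CPU cycles within a
% latency bound T, with per-cycle energy \kappa f^2 at CPU frequency f.
% Running at a constant frequency is optimal (by convexity of f^2), and the
% deadline forces f \ge C/T:
\[
\begin{aligned}
  f^{\star} &= \frac{C}{T}, &
  E_{\mathrm{loc}} &= \kappa\, C\, (f^{\star})^{2} = \frac{\kappa\, C^{3}}{T^{2}},
\end{aligned}
\]
% so a tighter latency bound (smaller T) raises the local-computing energy
% quadratically, which is what makes offloading to the MEC server attractive
% under stringent deadlines.
```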


Journal ArticleDOI
TL;DR: In this paper, the authors study the potential advantages of allowing for non-orthogonal sharing of RAN resources in uplink communications from a set of eMBB, mMTC, and URLLC devices to a common base station.
Abstract: The grand objective of 5G wireless technology is to support three generic services with vastly heterogeneous requirements: enhanced mobile broadband (eMBB), massive machine-type communications (mMTCs), and ultra-reliable low-latency communications (URLLCs). Service heterogeneity can be accommodated by network slicing, through which each service is allocated resources to provide performance guarantees and isolation from the other services. Slicing of the radio access network (RAN) is typically done by means of orthogonal resource allocation among the services. This paper studies the potential advantages of allowing for non-orthogonal sharing of RAN resources in uplink communications from a set of eMBB, mMTC, and URLLC devices to a common base station. The approach is referred to as heterogeneous non-orthogonal multiple access (H-NOMA), in contrast to the conventional NOMA techniques that involve users with homogeneous requirements and hence can be investigated through a standard multiple access channel. The study devises a communication-theoretic model that accounts for the heterogeneous requirements and characteristics of the three services. The concept of reliability diversity is introduced as a design principle that leverages the different reliability requirements across the services in order to ensure performance guarantees with non-orthogonal RAN slicing. This paper reveals that H-NOMA can lead, in some regimes, to significant gains in terms of performance tradeoffs among the three generic services as compared to orthogonal slicing.

654 citations


Journal ArticleDOI
TL;DR: This paper aims to provide a contemporary and comprehensive literature review on fundamentals, applications, challenges, and research efforts/progress of ambient backscatter communications.
Abstract: Recently, ambient backscatter communication has been introduced as a cutting-edge technology which enables smart devices to communicate by utilizing ambient radio frequency (RF) signals without requiring active RF transmission. This technology is especially effective in addressing communication and energy efficiency problems for low-power communications systems such as sensor networks, and thus it is expected to realize numerous Internet-of-Things applications. Therefore, this paper aims to provide a contemporary and comprehensive literature review on fundamentals, applications, challenges, and research efforts/progress of ambient backscatter communications. In particular, we first present fundamentals of backscatter communications and briefly review bistatic backscatter communications systems. Then, the general architecture, advantages, and solutions to address existing issues and limitations of ambient backscatter communications systems are discussed. Additionally, emerging applications of ambient backscatter communications are highlighted. Finally, we outline some open issues and future research directions.

650 citations


Journal ArticleDOI
TL;DR: This paper identifies and provides a detailed description of various potential emerging technologies for the fifth generation communications with SWIPT/WPT and provides some interesting research challenges and recommendations with the objective of stimulating future research in this emerging domain.
Abstract: Initial efforts on wireless power transfer (WPT) have concentrated toward long-distance transmission and high-power applications. Nonetheless, the lower achievable transmission efficiency and potential health concerns arising due to high-power applications have caused limitations in their further development. Due to tremendous energy consumption growth with ever-increasing connected devices, alternative wireless information and power transfer techniques have been important not only for theoretical research but also for operational cost savings and for the sustainable growth of wireless communications. In this regard, radio frequency energy harvesting (RF-EH) for a wireless communications system presents a new paradigm that allows wireless nodes to recharge their batteries from the RF signals instead of fixed power grids and the traditional energy sources. In this approach, the RF energy is harvested from ambient electromagnetic sources or from the sources that directionally transmit RF energy for EH purposes. Notable research activities and major advances have occurred over the last decade in this direction. Thus, this paper provides a comprehensive survey of the state-of-the-art techniques, based on advances and open issues presented by simultaneous wireless information and power transfer (SWIPT) and WPT-assisted technologies. More specifically, in contrast to the existing works, this paper identifies and provides a detailed description of various potential emerging technologies for fifth-generation communications with SWIPT/WPT. Moreover, we provide some interesting research challenges and recommendations with the objective of stimulating future research in this emerging domain.

621 citations


Journal ArticleDOI
TL;DR: In this article, the authors considered a multi-user MEC network powered by the WPT, where each energy-harvesting WD follows a binary computation offloading policy, i.e., the data set of a task has to be executed as a whole either locally or remotely at the MEC server via task offloading.
Abstract: Finite battery lifetime and low computing capability of size-constrained wireless devices (WDs) have been longstanding performance limitations of many low-power wireless networks, e.g., wireless sensor networks and Internet of Things. The recent development of radio frequency-based wireless power transfer (WPT) and mobile edge computing (MEC) technologies provides a promising solution to fully remove these limitations so as to achieve sustainable device operation and enhanced computational capability. In this paper, we consider a multi-user MEC network powered by WPT, where each energy-harvesting WD follows a binary computation offloading policy, i.e., the data set of a task has to be executed as a whole either locally or remotely at the MEC server via task offloading. In particular, we are interested in maximizing the (weighted) sum computation rate of all the WDs in the network by jointly optimizing the individual computing mode selection (i.e., local computing or offloading) and the system transmission time allocation (on WPT and task offloading). The major difficulty lies in the combinatorial nature of the multi-user computing mode selection and its strong coupling with the transmission time allocation. To tackle this problem, we first consider a decoupled optimization, where we assume that the mode selection is given and propose a simple bi-section search algorithm to obtain the conditional optimal time allocation. On top of that, a coordinate descent method is devised to optimize the mode selection. The method is simple in implementation but may suffer from high computational complexity in a large-size network. To address this problem, we further propose a joint optimization method based on the alternating direction method of multipliers (ADMM) decomposition technique, which enjoys a much slower increase of computational complexity as the network size increases. Extensive simulations show that both the proposed methods can efficiently achieve near-optimal performance under various network setups, and significantly outperform the other representative benchmark methods considered.

563 citations
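
The coordinate-descent idea described above can be sketched in a few lines: given any routine that returns the conditional-optimal objective for a fixed mode-selection vector (the paper obtains it via a bi-section search over the time allocation; here it is a toy placeholder), modes are flipped one user at a time and a flip is kept only if it improves the weighted sum computation rate. All names, channel values, and the evaluation routine below are illustrative assumptions.

```python
# Sketch of coordinate descent over binary computing-mode selections.
# evaluate_rate() stands in for the inner problem (optimal time allocation
# for a fixed mode vector); it is a toy placeholder so the outer loop runs.
import numpy as np

rng = np.random.default_rng(1)
N = 8                                  # number of wireless devices
h = rng.rayleigh(scale=1.0, size=N)    # toy channel gains

def evaluate_rate(modes):
    """Placeholder for the conditional-optimal weighted sum computation rate.
    modes[i] = 0 -> local computing, 1 -> offloading to the MEC server."""
    tau_wpt = 0.3                                        # toy WPT time fraction
    offload = np.flatnonzero(modes == 1)
    local = np.flatnonzero(modes == 0)
    rate = np.sum(np.cbrt(tau_wpt * h[local]))           # toy local-computing term
    if offload.size:
        share = (1.0 - tau_wpt) / offload.size           # offloaders share the rest
        rate += np.sum(share * np.log2(1.0 + 10.0 * h[offload] ** 2))
    return rate

def coordinate_descent(max_sweeps=20):
    modes = rng.integers(0, 2, size=N)           # random initial mode selection
    best = evaluate_rate(modes)
    for _ in range(max_sweeps):
        improved = False
        for i in range(N):                       # flip one user's mode at a time
            modes[i] ^= 1
            val = evaluate_rate(modes)
            if val > best:
                best, improved = val, True       # keep the improving flip
            else:
                modes[i] ^= 1                    # revert
        if not improved:
            break                                # converged to a local optimum
    return modes, best

modes, rate = coordinate_descent()
print("mode selection:", modes, "objective:", round(float(rate), 3))
```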


Proceedings ArticleDOI
05 Sep 2018
TL;DR: In this paper, an IRS-enhanced point-to-point multiple-input single-output (MISO) wireless system is considered, where one IRS is deployed to assist in the communication from a multi-antenna access point (AP) to a single-antenna user, and the user simultaneously receives the signal sent directly from the AP as well as that reflected by the IRS.
Abstract: Intelligent reflecting surface (IRS) is envisioned to have abundant applications in future wireless networks by smartly reconfiguring the signal propagation for performance enhancement. Specifically, an IRS consists of a large number of low-cost passive elements, each reflecting the incident signal with a certain phase shift to collaboratively achieve beamforming and suppress interference at one or more designated receivers. In this paper, we study an IRS-enhanced point-to-point multiple-input single-output (MISO) wireless system where one IRS is deployed to assist in the communication from a multi-antenna access point (AP) to a single-antenna user. As a result, the user simultaneously receives the signal sent directly from the AP as well as that reflected by the IRS. We aim to maximize the total received signal power at the user by jointly optimizing the (active) transmit beamforming at the AP and the (passive) reflect beamforming by the phase shifters at the IRS. We first propose a centralized algorithm based on the technique of semidefinite relaxation (SDR) by assuming that the global channel state information (CSI) is available at the IRS. Since the centralized implementation requires excessive channel estimation and signal exchange overheads, we further propose a low-complexity distributed algorithm where the AP and the IRS independently adjust the transmit beamforming and the phase shifts in an alternating manner until convergence is reached. Simulation results show that significant performance gains can be achieved by the proposed algorithms as compared to benchmark schemes. Moreover, it is verified that the IRS is able to drastically enhance the link quality and/or coverage over the conventional setup without the IRS.

557 citations
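
A compact sketch of the SDR step for the passive (IRS) reflect beamforming is given below, under simplifying assumptions: the transmit beamformer is fixed to maximum-ratio transmission, the phase-shift optimization is written as a rank-relaxed semidefinite program, and a feasible unit-modulus phase vector is recovered by Gaussian randomization. Channels and sizes are synthetic; this shows only the standard SDR pattern the paper builds on, not its exact algorithm.

```python
# Sketch of semidefinite relaxation (SDR) for IRS reflect beamforming.
# Synthetic single-user MISO setup; channels and sizes are illustrative.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
M, N = 4, 16                                   # AP antennas, IRS elements
hd = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)           # AP -> user
hr = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)           # IRS -> user
G = (rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))) / np.sqrt(2)  # AP -> IRS

w = hd / np.linalg.norm(hd)                    # fixed transmit beamformer (MRT toward hd)
a = np.diag(hr.conj()) @ G @ w                 # cascaded IRS term; received amplitude = c + v^H a
c = hd.conj() @ w

# Lift the unit-modulus phase vector v into V = [v; 1][v; 1]^H and drop the rank constraint.
R = np.zeros((N + 1, N + 1), dtype=complex)
R[:N, :N] = np.outer(a, a.conj())
R[:N, N] = a * np.conj(c)
R[N, :N] = np.conj(a) * c
V = cp.Variable((N + 1, N + 1), hermitian=True)
prob = cp.Problem(cp.Maximize(cp.real(cp.trace(R @ V))),
                  [V >> 0, cp.diag(V) == 1])
prob.solve()

# Gaussian randomization: draw vectors with covariance V, project to unit modulus.
evals, evecs = np.linalg.eigh(V.value)
evals = np.clip(evals, 0, None)
best_v, best_val = None, -np.inf
for _ in range(200):
    z = (rng.standard_normal(N + 1) + 1j * rng.standard_normal(N + 1)) / np.sqrt(2)
    r = evecs @ (np.sqrt(evals) * z)
    v = np.exp(1j * np.angle(r[:N] / r[N]))    # enforce unit-modulus phase shifts
    val = np.abs(c + v.conj() @ a) ** 2        # received signal power
    if val > best_val:
        best_v, best_val = v, val

print("power without IRS:", round(float(np.abs(c) ** 2), 3),
      " with IRS:", round(float(best_val), 3))
```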


Journal ArticleDOI
TL;DR: The suitability of hybrid beamforming methods, both existing and those proposed up to the first quarter of 2017, is explored, and the exciting future challenges in this domain are identified.
Abstract: The increasing wireless data traffic demands have driven the need to explore suitable spectrum regions for meeting the projected requirements. In the light of this, millimeter wave (mmWave) communication has received considerable attention from the research community. Typically, in fifth generation (5G) wireless networks, mmWave massive multiple-input multiple-output (MIMO) communication is realized by hybrid transceivers, which combine high-dimensional analog phase shifters and power amplifiers with lower-dimensional digital signal processing units. This hybrid beamforming design reduces the cost and power consumption, which is aligned with an energy-efficient design vision of 5G. In this paper, we track the progress in hybrid beamforming for massive MIMO communications in the context of the system models of the hybrid transceivers' structures, the digital and analog beamforming matrices with the possible antenna configuration scenarios, and the hybrid beamforming in heterogeneous wireless networks. We extend the scope of the discussion by including resource management issues in hybrid beamforming. We explore the suitability of hybrid beamforming methods, both existing and those proposed up to the first quarter of 2017, and identify the exciting future challenges in this domain.

505 citations


Journal ArticleDOI
TL;DR: In this paper, a UAV-enabled MEC wireless powered system is investigated under both partial and binary computation offloading modes, subject to the energy harvesting causal constraint and the UAV's speed constraint.
Abstract: Mobile-edge computing (MEC) and wireless power transfer are two promising techniques to enhance the computation capability and to prolong the operational time of low-power wireless devices that are ubiquitous in Internet of Things. However, the computation performance and the harvested energy are significantly impacted by the severe propagation loss. In order to address this issue, an unmanned aerial vehicle (UAV)-enabled MEC wireless-powered system is studied in this paper. The computation rate maximization problems in a UAV-enabled MEC wireless powered system are investigated under both partial and binary computation offloading modes, subject to the energy-harvesting causal constraint and the UAV’s speed constraint. These problems are non-convex and challenging to solve. A two-stage algorithm and a three-stage alternative algorithm are, respectively, proposed for solving the formulated problems. The closed-form expressions for the optimal central processing unit frequencies, user offloading time, and user transmit power are derived. The optimal selection scheme on whether users choose to locally compute or offload computation tasks is proposed for the binary computation offloading mode. Simulation results show that our proposed resource allocation schemes outperform other benchmark schemes. The results also demonstrate that the proposed schemes converge fast and have low computational complexity.

Journal ArticleDOI
TL;DR: The preliminary outcomes of extensive research on mmWave massive MIMO are presented and emerging trends together with their respective benefits, challenges, and proposed solutions are highlighted to point out current trends, evolving research issues and future directions on this technology.
Abstract: Several enabling technologies are being explored for the fifth-generation (5G) mobile system era. The aim is to evolve a cellular network that remarkably pushes forward the limits of legacy mobile systems across all dimensions of performance metrics. One dominant technology that consistently features in the list of the 5G enablers is the millimeter-wave (mmWave) massive multiple-input-multiple-output (massive MIMO) system. It shows potentials to significantly raise user throughput, enhance spectral and energy efficiencies and increase the capacity of mobile networks using the joint capabilities of the huge available bandwidth in the mmWave frequency bands and high multiplexing gains achievable with massive antenna arrays. In this survey, we present the preliminary outcomes of extensive research on mmWave massive MIMO (as research on this subject is still in the exploratory phase) and highlight emerging trends together with their respective benefits, challenges, and proposed solutions. The survey spans broad areas in the field of wireless communications, and the objective is to point out current trends, evolving research issues and future directions on mmWave massive MIMO as a technology that will open up new frontiers of services and applications for next-generation cellular networks.

Journal ArticleDOI
TL;DR: In this article, a new data-driven model for automatic modulation classification based on long short-term memory (LSTM) is proposed, which learns from the time-domain amplitude and phase information of the modulation schemes present in the training data without requiring expert features like higher-order cyclic moments.
Abstract: This paper looks into the modulation classification problem for a distributed wireless spectrum sensing network. First, a new data-driven model for automatic modulation classification based on long short-term memory (LSTM) is proposed. The model learns from the time domain amplitude and phase information of the modulation schemes present in the training data without requiring expert features like higher-order cyclic moments. Analyses show that the proposed model yields an average classification accuracy of close to 90% at varying signal-to-noise ratio conditions ranging from 0 dB to 20 dB. Further, we explore the utility of this LSTM model for a variable symbol rate scenario. We show that an LSTM-based model can learn good representations of variable-length time-domain sequences, which is useful in classifying modulation signals with different symbol rates. The achieved accuracy of 75% on an input sample length of 64, for which it was not trained, substantiates the representation power of the model. To reduce the data communication overhead from distributed sensors, the feasibility of classification using averaged magnitude spectrum data and of online classification on the low-cost spectrum sensors is studied. Furthermore, quantized realizations of the proposed models are analyzed for deployment on sensors with low processing power.
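
A minimal Keras sketch of an amplitude/phase LSTM classifier in the spirit described above is shown below; the layer sizes, sequence length, and synthetic data are assumptions for illustration, not the paper's architecture or dataset.

```python
# Sketch of an LSTM modulation classifier fed with time-domain
# amplitude/phase sequences. Architecture and data are illustrative only.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
T, n_classes = 128, 4                      # sequence length, number of modulations

def toy_batch(n):
    """Random complex baseband snippets standing in for captured signals."""
    x = rng.standard_normal((n, T)) + 1j * rng.standard_normal((n, T))
    y = rng.integers(0, n_classes, size=n)
    # Stack instantaneous amplitude and phase as the two input channels.
    feats = np.stack([np.abs(x), np.angle(x)], axis=-1).astype("float32")
    return feats, y

model = tf.keras.Sequential([
    tf.keras.Input(shape=(T, 2)),          # (time, [amplitude, phase])
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

x_train, y_train = toy_batch(512)
model.fit(x_train, y_train, epochs=1, batch_size=64, verbose=0)
model.summary()
```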

Journal ArticleDOI
21 Nov 2018 - Sensors
TL;DR: An overview of five Co-CPS use cases, as introduced in the SafeCOP EU project, and a comprehensive analysis of the main existing wireless communication technologies giving details about the protocols developed within particular standardization bodies are provided.
Abstract: Cooperative Cyber-Physical Systems (Co-CPSs) can be enabled using wireless communication technologies, which in principle should address reliability and safety challenges. Safety for Co-CPS enabled by wireless communication technologies is a crucial aspect and requires new dedicated design approaches. In this paper, we provide an overview of five Co-CPS use cases, as introduced in our SafeCOP EU project, and analyze their safety design requirements. Next, we provide a comprehensive analysis of the main existing wireless communication technologies, giving details about the protocols developed within particular standardization bodies. We also investigate to what extent they address the non-functional requirements in terms of safety, security, and real time in the different application domains of each use case. Finally, we discuss general recommendations about the use of different wireless communication technologies, showing their potential in the selected real-world use cases. The discussion also takes into consideration the ongoing 5G standardization process within 3GPP, whose current efforts are in line with addressing the current gaps in wireless communication protocols for Co-CPSs, including many future use cases.

Posted Content
TL;DR: Simulation results show that the proposed resource allocation schemes outperform other benchmark schemes and converge fast and have low computational complexity.
Abstract: Mobile edge computing (MEC) and wireless power transfer (WPT) are two promising techniques to enhance the computation capability and to prolong the operational time of low-power wireless devices that are ubiquitous in Internet of Things. However, the computation performance and the harvested energy are significantly impacted by the severe propagation loss. In order to address this issue, an unmanned aerial vehicle (UAV)-enabled MEC wireless powered system is studied in this paper. The computation rate maximization problems in a UAV-enabled MEC wireless powered system are investigated under both partial and binary computation offloading modes, subject to the energy harvesting causal constraint and the UAV's speed constraint. These problems are non-convex and challenging to solve. A two-stage algorithm and a three-stage alternative algorithm are respectively proposed for solving the formulated problems. The closed-form expressions for the optimal central processing unit frequencies, user offloading time, and user transmit power are derived. The optimal selection scheme on whether users choose to locally compute or offload computation tasks is proposed for the binary computation offloading mode. Simulation results show that our proposed resource allocation schemes outperform other benchmark schemes. The results also demonstrate that the proposed schemes converge fast and have low computational complexity.

Journal ArticleDOI
TL;DR: In this article, a Deep Reinforcement Learning-based Online Offloading (DROO) framework is proposed to optimally adapt task offloading decisions and wireless resource allocation to the time-varying wireless channel conditions.
Abstract: Wireless powered mobile-edge computing (MEC) has recently emerged as a promising paradigm to enhance the data processing capability of low-power networks, such as wireless sensor networks and the Internet of Things (IoT). In this paper, we consider a wireless powered MEC network that adopts a binary offloading policy, so that each computation task of wireless devices (WDs) is either executed locally or fully offloaded to an MEC server. Our goal is to acquire an online algorithm that optimally adapts task offloading decisions and wireless resource allocations to the time-varying wireless channel conditions. This requires quickly solving hard combinatorial optimization problems within the channel coherence time, which is hardly achievable with conventional numerical optimization methods. To tackle this problem, we propose a Deep Reinforcement learning-based Online Offloading (DROO) framework that implements a deep neural network as a scalable solution that learns the binary offloading decisions from experience. It eliminates the need to solve combinatorial optimization problems, and thus greatly reduces the computational complexity, especially in large-size networks. To further reduce the complexity, we propose an adaptive procedure that automatically adjusts the parameters of the DROO algorithm on the fly. Numerical results show that the proposed algorithm can achieve near-optimal performance while significantly decreasing the computation time by more than an order of magnitude compared with existing optimization methods. For example, the CPU execution latency of DROO is less than 0.1 second in a 30-user network, making real-time and optimal offloading truly viable even in a fast-fading environment.
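
The core DROO loop can be sketched compactly: a neural network maps the channel-gain vector to a relaxed offloading decision, several binary candidates are generated by quantizing it, each candidate is scored by the (here, placeholder) resource-allocation subproblem, and the best candidate is stored in a replay memory used to retrain the network. Network sizes, the reward function, and the simplified quantizer below are illustrative assumptions, not the authors' released implementation.

```python
# Sketch of the DROO idea: DNN -> relaxed offloading decision -> K quantized
# binary candidates -> evaluate each -> keep the best -> replay training.
# The reward function and all sizes are toy placeholders.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
N, K, MEM = 10, 8, 256                      # users, candidates per step, memory size

def reward(h, a):
    """Placeholder for the weighted sum computation rate given channel gains h
    and a binary offloading decision a (1 = offload, 0 = local)."""
    return float(np.sum(a * np.log2(1 + 50 * h) + (1 - a) * np.cbrt(h)))

def quantize(relaxed, k):
    """Simplified quantization: round at 0.5, then flip the k-1 most uncertain entries."""
    cands = [(relaxed > 0.5).astype(float)]
    for i in np.argsort(np.abs(relaxed - 0.5))[:k - 1]:
        c = cands[0].copy()
        c[i] = 1.0 - c[i]
        cands.append(c)
    return cands

net = tf.keras.Sequential([
    tf.keras.Input(shape=(N,)),
    tf.keras.layers.Dense(120, activation="relu"),
    tf.keras.layers.Dense(80, activation="relu"),
    tf.keras.layers.Dense(N, activation="sigmoid"),
])
net.compile(optimizer="adam", loss="binary_crossentropy")

memory_h, memory_a = [], []
for step in range(200):                     # online loop over channel realizations
    h = rng.rayleigh(scale=1.0, size=N)
    relaxed = net(h[None, :].astype("float32")).numpy()[0]
    best = max(quantize(relaxed, K), key=lambda a: reward(h, a))  # best binary candidate
    memory_h.append(h); memory_a.append(best)
    memory_h, memory_a = memory_h[-MEM:], memory_a[-MEM:]
    if step % 10 == 0 and len(memory_h) >= 32:                    # periodic replay training
        batch = rng.choice(len(memory_h), size=32, replace=False)
        net.fit(np.array(memory_h)[batch].astype("float32"),
                np.array(memory_a)[batch].astype("float32"),
                epochs=1, verbose=0)

print("example offloading decision:", best.astype(int))
```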

Journal ArticleDOI
TL;DR: A novel integrated machine learning and coordinated beamforming solution is developed to overcome challenges and enable highly-mobile mmWave applications with reliable coverage, low latency, and negligible training overhead.
Abstract: Supporting high mobility in millimeter wave (mmWave) systems enables a wide range of important applications, such as vehicular communications and wireless virtual/augmented reality. Realizing this in practice, though, requires overcoming several challenges. First, the use of narrow beams and the sensitivity of mmWave signals to blockage greatly impact the coverage and reliability of highly-mobile links. Second, highly-mobile users in dense mmWave deployments need to frequently hand off between base stations (BSs), which is associated with critical control and latency overhead. Furthermore, identifying the optimal beamforming vectors in large antenna array mmWave systems requires considerable training overhead, which significantly affects the efficiency of these mobile systems. In this paper, a novel integrated machine learning and coordinated beamforming solution is developed to overcome these challenges and enable highly-mobile mmWave applications. In the proposed solution, a number of distributed yet coordinating BSs simultaneously serve a mobile user. This user ideally needs to transmit only one uplink training pilot sequence that will be jointly received at the coordinating BSs using omni or quasi-omni beam patterns. These received signals draw a defining signature not only for the user location, but also for its interaction with the surrounding environment. The developed solution then leverages a deep learning model that learns how to use these signatures to predict the beamforming vectors at the BSs. This renders a comprehensive solution that supports highly mobile mmWave applications with reliable coverage, low latency, and negligible training overhead. Extensive simulation results based on accurate ray tracing show that the proposed deep-learning coordinated beamforming strategy approaches the achievable rate of the genie-aided solution that knows the optimal beamforming vectors with no training overhead. Compared with traditional beamforming solutions, the results show that the proposed deep learning-based strategy attains higher rates, especially in high-mobility large-array regimes.
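
A hedged sketch of the learning stage described above: the omni-received uplink pilot signatures at the coordinating BSs are concatenated and fed to a neural network that predicts, for each BS, the index of the best beam in a fixed codebook. The data, codebook size, and network below are synthetic placeholders, meant only to show the input/output structure of such a predictor.

```python
# Sketch of deep-learning coordinated beamforming: map uplink pilot
# signatures (received at several coordinating BSs) to predicted best
# beamforming-codebook indices. All data here is synthetic.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
n_bs, n_sub, n_beams = 4, 32, 64       # coordinating BSs, pilot subcarriers, codebook size

def toy_dataset(n):
    # Complex pilot signatures per BS, flattened into real-valued features.
    sig = rng.standard_normal((n, n_bs, n_sub)) + 1j * rng.standard_normal((n, n_bs, n_sub))
    x = np.concatenate([sig.real, sig.imag], axis=-1).reshape(n, -1).astype("float32")
    y = rng.integers(0, n_beams, size=(n, n_bs))        # best beam index per BS
    return x, y

inp = tf.keras.Input(shape=(n_bs * 2 * n_sub,))
h = tf.keras.layers.Dense(512, activation="relu")(inp)
h = tf.keras.layers.Dense(512, activation="relu")(h)
# One softmax head per BS, each predicting a beam index from the shared signature.
outs = [tf.keras.layers.Dense(n_beams, activation="softmax", name=f"bs{i}")(h)
        for i in range(n_bs)]
model = tf.keras.Model(inp, outs)
model.compile(optimizer="adam",
              loss=["sparse_categorical_crossentropy"] * n_bs)

x, y = toy_dataset(1024)
model.fit(x, [y[:, i] for i in range(n_bs)], epochs=1, batch_size=64, verbose=0)
pred = model.predict(x[:1], verbose=0)
print("predicted beam per BS:", [int(np.argmax(p)) for p in pred])
```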

Journal ArticleDOI
TL;DR: Deep learning is used to detect physical-layer attributes for the identification of cognitive radio devices, and the method is based on the empirical principle that manufacturing variability among wireless transmitters that conform to the same standard creates unique, repeatable signatures in each transmission.
Abstract: With the increasing presence of cognitive radio networks as a means to address limited spectral resources, improved wireless security has become a necessity. In particular, the potential of a node to impersonate a licensed user demonstrates the need for techniques to authenticate a radio's true identity. In this paper, we use deep learning to detect physical-layer attributes for the identification of cognitive radio devices, and demonstrate the performance of our method on a set of IEEE 802.15.4 devices. Our method is based on the empirical principle that manufacturing variability among wireless transmitters that conform to the same standard creates unique, repeatable signatures in each transmission, which can then be used as a fingerprint for device identification and verification. We develop a framework for training a convolutional neural network using the time-domain complex baseband error signal and demonstrate 92.29% identification accuracy on a set of seven 2.4 GHz commercial ZigBee devices. We also demonstrate the robustness of our method over a wide range of signal-to-noise ratios.
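
A minimal 1-D CNN sketch in the spirit of the fingerprinting pipeline described above, operating on complex baseband error signals split into I/Q channels; the window length, architecture, and the seven-device setting are placeholders chosen for illustration, not the paper's configuration.

```python
# Sketch of a 1-D CNN device-fingerprinting classifier on I/Q baseband
# error signals. Sizes, data, and architecture are illustrative assumptions.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
n_devices, win = 7, 256                    # e.g. seven devices, samples per window

def toy_windows(n):
    x = rng.standard_normal((n, win)) + 1j * rng.standard_normal((n, win))
    iq = np.stack([x.real, x.imag], axis=-1).astype("float32")   # (n, win, 2)
    y = rng.integers(0, n_devices, size=n)
    return iq, y

model = tf.keras.Sequential([
    tf.keras.Input(shape=(win, 2)),
    tf.keras.layers.Conv1D(64, 7, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(64, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(n_devices, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

x, y = toy_windows(512)
model.fit(x, y, epochs=1, batch_size=64, verbose=0)
```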

Posted Content
TL;DR: An overview of the essentials of the state of the art in 5G wireless technology represented by the 3GPP NR technical specifications is provided, with a focus on the physical layer.
Abstract: The 5th generation (5G) wireless access technology, known as new radio (NR), will address a variety of usage scenarios from enhanced mobile broadband to ultra-reliable low-latency communications to massive machine type communications. Key technology features include ultra-lean transmission, support for low latency, advanced antenna technologies, and spectrum flexibility including operation in high frequency bands and inter-working between high and low frequency bands. This article provides an overview of the essentials of the state of the art in 5G wireless technology represented by the 3GPP NR technical specifications, with a focus on the physical layer. We describe the fundamental concepts of 5G NR, explain in detail the design of physical channels and reference signals, and share the various design rationales influencing standardization.

Journal ArticleDOI
TL;DR: This survey presents the state-of-the-art wireless network design and optimization for WNCS, while highlighting the tradeoff between the achievable performance and complexity of various approaches.
Abstract: Wireless networked control systems (WNCSs) are composed of spatially distributed sensors, actuators, and controllers communicating through wireless networks instead of conventional point-to-point wired connections. Due to their main benefits in the reduction of deployment and maintenance costs, large flexibility, and possible enhancement of safety, WNCSs are becoming a fundamental infrastructure technology for critical control systems in automotive electrical systems, avionics control systems, building management systems, and industrial automation systems. The main challenge in WNCSs is to jointly design the communication and control systems considering their tight interaction to improve the control performance and the network lifetime. In this survey, we provide an exhaustive review of the literature on wireless network design and optimization for WNCSs. First, we discuss what we call the critical interactive variables, including sampling period, message delay, message dropout, and network energy consumption. The mutual effects of these communication and control variables motivate their joint tuning. We discuss the analysis and design of control systems taking into account the effect of the interactive variables on the control system performance. Moreover, we discuss the effect of controllable wireless network parameters at all layers of the communication protocols on the probability distribution of these interactive variables. We also review the current wireless network standardization for WNCSs and their corresponding methodology for adapting the network parameters. Finally, we present the state-of-the-art wireless network design and optimization for WNCSs, while highlighting the tradeoff between the achievable performance and complexity of various approaches. We conclude the survey by highlighting major research issues and identifying future research directions.

Journal ArticleDOI
TL;DR: In this article, the authors provide a technology overview and a review on optical wireless technologies, such as visible light communication, light fidelity, optical camera communication, free space optical communication, and light detection and ranging.
Abstract: New high-data-rate multimedia services and applications are evolving continuously and exponentially increasing the demand for wireless capacity of fifth-generation (5G) and beyond. The existing radio frequency (RF) communication spectrum is insufficient to meet the demands of future high-data-rate 5G services. Optical wireless communication (OWC), which uses an ultra-wide range of unregulated spectrum, has emerged as a promising solution to overcome the RF spectrum crisis. It has attracted growing research interest worldwide in the last decade for indoor and outdoor applications. OWC offloads huge data traffic applications from RF networks. A 100 Gb/s data rate has already been demonstrated through OWC. It offers services indoors as well as outdoors, and communication distances range from several nm to more than 10 000 km. This paper provides a technology overview and a review on optical wireless technologies, such as visible light communication, light fidelity, optical camera communication, free space optical communication, and light detection and ranging. We survey the key technologies for understanding OWC and present state-of-the-art criteria in aspects, such as classification, spectrum use, architecture, and applications. The key contribution of this paper is to clarify the differences among different promising optical wireless technologies and between these technologies and their corresponding similar existing RF technologies.

Journal ArticleDOI
TL;DR: The principles for supporting URLLC are discussed from the perspective of the traditional assumptions and models applied in communication/information theory, along with how these principles are applied in various elements of system design, such as the use of various diversity sources, the design of packets, and access protocols.
Abstract: Ultra-reliable low-latency communication (URLLC) is an important new feature brought by 5G, with the potential to support a vast set of applications that rely on mission-critical links. In this article, we first discuss the principles for supporting URLLC from the perspective of the traditional assumptions and models applied in communication/information theory. We then discuss how these principles are applied in various elements of system design, such as the use of various diversity sources, the design of packets, and access protocols. The important message is that there is a need to optimize the transmission of signaling information, as well as a need for lean use of various sources of diversity.

Journal ArticleDOI
TL;DR: A user authentication protocol with privacy protection for IIoT is proposed; its security is proved under a random oracle model, and further security discussions show that the proposed protocol is robust to various attacks.
Abstract: Wireless sensor networks (WSNs) play an important role in the industrial Internet of Things (IIoT) and have been widely used in many industrial fields to gather data from the monitored area. However, due to the open nature of the wireless channel and the resource-constrained nature of sensor nodes, how to guarantee that sensitive sensor data can only be accessed by a valid user becomes a key challenge in the IIoT environment. Some user authentication protocols for WSNs have been proposed to address this issue. However, previous works more or less have their own weaknesses, such as not providing user anonymity and other ideal functions, or being vulnerable to some attacks. To provide secure communication for IIoT, a user authentication protocol with privacy protection is proposed in this paper. The security of the proposed scheme is proved under a random oracle model, and other security discussions show that the proposed protocol is robust to various attacks. Furthermore, comparison results with other related protocols and simulation with NS-3 show that the proposed protocol is secure and efficient for IIoT.

Posted Content
TL;DR: The HyperSurface tiles as discussed by the authors can effectively re-engineer electromagnetic waves, including steering towards any desired direction, full absorption, polarization manipulation, and more, by using planar meta-materials.
Abstract: Electromagnetic waves undergo multiple uncontrollable alterations as they propagate within a wireless environment. Free space path loss, signal absorption, as well as reflections, refractions, and diffractions caused by physical objects within the environment highly affect the performance of wireless communications. Currently, such effects are intractable to account for and are treated as probabilistic factors. The paper proposes a radically different approach, enabling deterministic, programmable control over the behavior of wireless environments. The key enabler is the so-called HyperSurface tile, a novel class of planar meta-materials that can interact with impinging electromagnetic waves in a controlled manner. The HyperSurface tiles can effectively re-engineer electromagnetic waves, including steering towards any desired direction, full absorption, polarization manipulation, and more. Multiple tiles are employed to coat objects such as walls and furniture, and, overall, any objects in indoor and outdoor environments. An external software service calculates and deploys the optimal interaction types per tile to best fit the needs of communicating devices. Evaluation via simulations highlights the potential of the new concept.

Journal ArticleDOI
TL;DR: In this paper, UAV-assisted secure transmission for scalable videos in hyper-dense networks via caching is studied, the feasibility conditions of the proposed scheme are derived, and the secrecy performance is analyzed.
Abstract: Unmanned aerial vehicles (UAVs) can help small-cell base stations (SBSs) offload traffic via wireless backhaul to improve coverage and increase rate. However, the capacity of backhaul is limited. In this paper, UAV-assisted secure transmission for scalable videos in hyper-dense networks via caching is studied. In the proposed scheme, UAVs can act as SBSs to provide videos to mobile users in some small cells. To reduce the pressure of wireless backhaul, UAVs and SBSs are both equipped with caches to store videos at off-peak time. To keep the UAVs simple, each UAV is equipped with a single antenna, and thus only the precoding matrices of the SBSs need to be cooperatively designed to manage interference by exploiting the principle of interference alignment. On the other hand, the SBSs replaced by UAVs will be idle. Thus, in order to guarantee secure transmission, the idle SBSs can be further exploited to generate a jamming signal to disrupt eavesdropping. The jamming signal is zero-forced at the legitimate users through the precoding of the idle SBSs, without affecting the legitimate transmission. The feasibility conditions of the proposed scheme are derived, and the secrecy performance is analyzed. Finally, simulation results are presented to verify the effectiveness of the proposed scheme.
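
The zero-forcing step used by the idle SBSs can be illustrated with a small numpy sketch: a jamming precoder is chosen in the null space of the aggregate channel toward the legitimate users, so the jamming power lands on the eavesdropper's channel but cancels at the intended receivers. Dimensions and channels below are synthetic assumptions, not the paper's system parameters.

```python
# Sketch of zero-forced jamming: project the jamming signal onto the null
# space of the channel to the legitimate users. Synthetic dimensions/channels.
import numpy as np

rng = np.random.default_rng(0)
Nt, K = 8, 3                 # idle-SBS transmit antennas, legitimate users (Nt > K)

# Aggregate channel from the idle SBS to the K legitimate users (K x Nt).
H_leg = (rng.standard_normal((K, Nt)) + 1j * rng.standard_normal((K, Nt))) / np.sqrt(2)
# Channel to a single-antenna eavesdropper (1 x Nt).
h_eve = (rng.standard_normal(Nt) + 1j * rng.standard_normal(Nt)) / np.sqrt(2)

# Right singular vectors associated with zero singular values span null(H_leg).
_, s, Vh = np.linalg.svd(H_leg)
null_basis = Vh[K:].conj().T              # Nt x (Nt - K), columns span the null space

# Jamming precoder: random combination of null-space directions, unit power.
w_jam = null_basis @ (rng.standard_normal(Nt - K) + 1j * rng.standard_normal(Nt - K))
w_jam /= np.linalg.norm(w_jam)

print("jamming power at legitimate users:", np.round(np.abs(H_leg @ w_jam) ** 2, 12))
print("jamming power at eavesdropper:   ", round(float(np.abs(h_eve @ w_jam) ** 2), 3))
```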

Journal ArticleDOI
15 Oct 2018 - Nature
TL;DR: Contrary to current expectation, eavesdropping on terahertz wireless data links is shown to be easier than expected, by placing an object in the path of the signal that scatters part of it to a receiver located elsewhere.
Abstract: Resiliency against eavesdropping and other security threats has become one of the key design considerations for communication systems. As wireless systems become ubiquitous, there is an increasing need for security protocols at all levels, including software (such as encryption), hardware (such as trusted platform modules) and the physical layer (such as wave-front engineering) [1–5]. With the inevitable shift to higher carrier frequencies, especially in the terahertz range (above 100 gigahertz), an important consideration is the decreased angular divergence (that is, the increased directionality) of transmitted signals, owing to the reduced effects of diffraction on waves with shorter wavelengths. In recent years, research on wireless devices [6–8] and systems [9–11] that operate at terahertz frequencies has ramped up markedly. These high-frequency, narrow-angle broadcasts present a more challenging environment for eavesdroppers compared to the wide-area broadcasts used at lower frequencies [12,13]. However, despite the widespread assumption of improved security for high-frequency wireless data links [14–16], the possibility of terahertz eavesdropping has not yet been characterized. A few recent studies have considered the issue at lower frequencies [5,12,13,17,18], but generally with the idea that the eavesdropper's antenna must be located within the broadcast sector of the transmitting antenna, leading to the conclusion that eavesdropping becomes essentially impossible when the transmitted signal has sufficiently high directionality [15]. Here we demonstrate that, contrary to this expectation, an eavesdropper can intercept signals in line-of-sight transmissions, even when they are transmitted at high frequencies with narrow beams. The eavesdropper's techniques are different from those for lower-frequency transmissions, as they involve placing an object in the path of the transmission to scatter radiation towards the eavesdropper. We also discuss one counter-measure for this eavesdropping technique, which involves characterizing the backscatter of the channel. We show that this counter-measure can be used to detect some, although not all, eavesdroppers. Our work highlights the importance of physical-layer security in terahertz wireless networks and the need for transceiver designs that incorporate new counter-measures.

Proceedings ArticleDOI
01 Dec 2018
TL;DR: In this article, the suitability of LIS for green communications in terms of energy efficiency (EE), expressed as the number of bits per Joule, is investigated, and the transmit powers per user and the values for the surface elements are jointly designed to maximize the system's EE performance.
Abstract: We consider a multi-user Multiple-Input Single-Output (MISO) communication system comprising a multiantenna base station communicating in the downlink simultaneously with multiple single-antenna mobile users. This communication is assumed to be assisted by a Large Intelligent Surface (LIS) that consists of many nearly passive antenna elements, whose parameters can be tuned according to desired objectives. The latest design advances on these surfaces suggest cheap elements effectively acting as low-resolution (even 1-bit resolution) phase shifters, whose joint configuration affects the electromagnetic behavior of the wireless propagation channel. In this paper, we investigate the suitability of LIS for green communications in terms of Energy Efficiency (EE), which is expressed as the number of bits per Joule. In particular, for the considered multi-user MISO system, we design the transmit powers per user and the values for the surface elements that jointly maximize the system's EE performance. Our representative simulation results show that LIS-assisted communication, even with nearly passive 1-bit resolution antenna elements, provides significant EE gains compared to conventional relay-assisted communication.
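
A toy illustration of the energy-efficiency objective and of 1-bit (0/pi) surface elements: for a single-user link, a greedy per-element flip aligns the cascaded channel, and EE is computed as rate divided by total consumed power. The power-consumption figures and channels below are invented for illustration; the paper's multi-user design and optimization are considerably more involved.

```python
# Toy energy-efficiency (bits/Joule) calculation for an LIS-assisted link with
# 1-bit phase elements. Channels and power figures are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 64                                        # LIS elements
B, noise = 1e7, 1e-12                         # bandwidth [Hz], noise power [W]
P_tx, P_static, P_elem = 0.1, 1.0, 0.005      # transmit, static, per-element power [W]

# Cascaded per-element channel coefficients (BS -> LIS element -> user).
g = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) * 1e-5

def snr(bits):
    """Receive SNR when element n applies phase 0 (bits[n]=0) or pi (bits[n]=1)."""
    phases = np.where(bits == 1, -1.0, 1.0)               # e^{j*0}=1, e^{j*pi}=-1
    return P_tx * np.abs(np.sum(phases * g)) ** 2 / noise

def energy_efficiency(bits):
    rate = B * np.log2(1.0 + snr(bits))                   # bits per second
    power = P_tx + P_static + N * P_elem                  # total consumed watts
    return rate / power                                   # bits per Joule

bits = np.zeros(N, dtype=int)
best = energy_efficiency(bits)
improved = True
while improved:                                           # greedy 1-bit flips
    improved = False
    for n in range(N):
        bits[n] ^= 1                       # try switching element n between 0 and pi
        val = energy_efficiency(bits)
        if val > best:
            best, improved = val, True     # keep the improving flip
        else:
            bits[n] ^= 1                   # revert

print(f"EE with all-zero phases  : {energy_efficiency(np.zeros(N, dtype=int)):.3e} bits/J")
print(f"EE after greedy 1-bit opt: {best:.3e} bits/J")
```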

Journal ArticleDOI
TL;DR: A survey of the existing methodologies related to aspects such as interference management, network discovery, proximity services, and network security in D2D networks is presented, and new dimensions with regard to D2D communication are introduced.
Abstract: The increasing number of mobile users has given impetus to the demand for high data rate proximity services. The fifth-generation (5G) wireless systems promise to improve the existing technology according to the future demands and provide a road map for reliable and resource-efficient solutions. Device-to-device (D2D) communication has been envisioned as an allied technology of 5G wireless systems for providing services that include live data and video sharing. A D2D communication technique opens new horizons of device-centric communications, i.e., exploiting direct D2D links instead of relying solely on cellular links. Offloading traffic from traditional network-centric entities to D2D network enables low computational complexity at the base station besides increasing the network capacity. However, there are several challenges associated with D2D communication. In this paper, we present a survey of the existing methodologies related to aspects such as interference management, network discovery, proximity services, and network security in D2D networks. We conclude by introducing new dimensions with regard to D2D communication and delineate aspects that require further research.

Proceedings ArticleDOI
20 May 2018
TL;DR: A spatial-modulation based full-duplex decode-and-forward relaying protocol is proposed in which the energy-constrained dual-antenna relay is powered by radio frequency energy from the source using a time-switching architecture; tight capacity upper bounds are derived and the optimal time split ratio for maximum system throughput is obtained.
Abstract: In this paper, we propose an innovative spatial-modulation (SM) based full-duplex (FD) decode-and-forward (DF) relaying protocol where the energy-constrained dual-antenna relay is powered by the radio frequency (RF) energy from the single-antenna source using the time-switching (TS) architecture. In this system, either one or both of the relay antennas receive the energy signal from the source in the energy harvesting phase. In the information transmission phase, one of the two relay antennas is selected to be active to decode and forward the information transmitted from the source, and the other relay antenna receives the information from the source at the same time. In this way, the throughput of the information transmission between the relay and the destination can be significantly improved by the additional information mapped to the active antenna index, which consequently leads to an improvement of the overall system throughput. Since the current SM capacity solution is not in closed form, we propose two tight SM capacity upper bounds and present the solution of the optimal time split ratio for the maximum system throughput according to the proposed upper bound. Monte Carlo simulations are conducted to verify the analysis and reveal the throughput gain of the proposed SM-FD relaying protocol in comparison with the conventional FD relaying protocol.
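
The time-switching trade-off at the heart of such protocols can be illustrated with a simple one-dimensional search: a larger harvesting fraction alpha gives the relay more energy (hence more forward-link power) but leaves less time for information transmission. The throughput expression below is a generic harvest-then-forward model, not the paper's SM capacity bounds, and all constants are assumptions.

```python
# Toy illustration of optimizing the time-switching (TS) ratio alpha for a
# wireless-powered relay: harvest for alpha*T, transmit for (1-alpha)*T.
# The throughput model and constants are generic assumptions, not the
# paper's spatial-modulation capacity bounds.
import numpy as np

eta = 0.7              # energy-harvesting efficiency
P_s = 1.0              # source transmit power [W]
g_sr, g_rd = 0.8, 0.5  # source->relay and relay->destination channel gains
noise = 1e-3           # noise power at the destination [W]

def throughput(alpha):
    # Energy harvested during fraction alpha is spent over fraction (1 - alpha).
    e_harv = eta * P_s * g_sr * alpha
    p_relay = e_harv / (1.0 - alpha)
    return (1.0 - alpha) * np.log2(1.0 + p_relay * g_rd / noise)

alphas = np.linspace(0.01, 0.99, 981)
rates = np.array([throughput(a) for a in alphas])
best = alphas[np.argmax(rates)]
print(f"best TS ratio alpha ~ {best:.2f}, throughput ~ {rates.max():.2f} bit/s/Hz")
```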

Journal ArticleDOI
TL;DR: A comprehensive tutorial on technologies, requirements, architectures, challenges, and potential solutions on means of achieving an efficient C-RAN optical fronthaul for the next-generation network such as the fifth generation network and beyond is presented.
Abstract: The exponential traffic growth, demand for high speed wireless data communications, as well as incessant deployment of innovative wireless technologies, services, and applications, have put considerable pressure on the mobile network operators (MNOs). Consequently, cellular access network performance in terms of capacity, quality of service, and network coverage needs further considerations. In order to address the challenges, MNOs, as well as equipment vendors, have given significant attention to the small-cell schemes based on cloud radio access network (C-RAN). This is due to its beneficial features in terms of performance optimization, cost-effectiveness, easier infrastructure deployment, and network management. Nevertheless, the C-RAN architecture imposes stringent requirements on the fronthaul link for seamless connectivity. Digital radio over fiber-based common public radio interface (CPRI) is the fundamental means of distributing baseband samples in the C-RAN fronthaul. However, optical links which are based on CPRI have bandwidth and flexibility limitations. Therefore, these limitations might constrain or make them impractical for the next generation mobile systems which are envisaged not only to support carrier aggregation and multi-band but also envisioned to integrate technologies like millimeter-wave (mm-wave) and massive multiple-input multiple-output antennas into the base stations. In this paper, we present comprehensive tutorial on technologies, requirements, architectures, challenges, and proffer potential solutions on means of achieving an efficient C-RAN optical fronthaul for the next-generation network such as the fifth generation network and beyond. A number of viable fronthauling technologies such as mm-wave and wireless fidelity are considered and this paper mainly focuses on optical technologies such as optical fiber and free-space optical. We also present feasible means of reducing the system complexity, cost, bandwidth requirement, and latency in the fronthaul. Furthermore, means of achieving the goal of green communication networks through reduction in the power consumption by the system are considered.