
Showing papers on "Base station published in 2020"


Journal ArticleDOI
TL;DR: This paper develops a DRL-based algorithm in which the joint design is obtained through trial-and-error interactions with the environment by observing predefined rewards, in the context of continuous states and actions, and achieves performance comparable to two state-of-the-art benchmarks.
Abstract: Recently, the reconfigurable intelligent surface (RIS), benefiting from breakthroughs in the fabrication of programmable meta-materials, has been put forward as one of the key enabling technologies for future sixth-generation (6G) wireless communication systems, scaling beyond massive multiple-input multiple-output (massive MIMO) to achieve smart radio environments. Employed as reflecting arrays, RISs are able to assist MIMO transmissions without the need for radio frequency chains, resulting in a considerable reduction in power consumption. In this paper, we investigate the joint design of the transmit beamforming matrix at the base station and the phase shift matrix at the RIS by leveraging recent advances in deep reinforcement learning (DRL). We first develop a DRL-based algorithm in which the joint design is obtained through trial-and-error interactions with the environment by observing predefined rewards, in the context of continuous states and actions. Unlike most reported works, which use alternating optimization techniques to alternately obtain the transmit beamforming and phase shifts, the proposed DRL-based algorithm obtains the joint design simultaneously as the output of the DRL neural network. Simulation results show that the proposed algorithm is not only able to learn from the environment and gradually improve its behavior, but also achieves performance comparable to two state-of-the-art benchmarks. It is also observed that appropriate neural network parameter settings significantly improve the performance and convergence rate of the proposed algorithm.
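
As a rough illustration of the reward such a DRL agent would observe, the sketch below builds the RIS-assisted effective channel and evaluates the downlink sum rate for a given precoder and phase-shift vector (the action). All dimensions, variable names, and the flat-fading channel model are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def effective_channel(H_d, H_r, G, theta):
    """Cascaded channel H_d + H_r diag(e^{j*theta}) G (a common RIS model;
    dimensions and naming here are illustrative assumptions)."""
    Phi = np.diag(np.exp(1j * theta))          # RIS phase-shift matrix
    return H_d + H_r @ Phi @ G                 # K x M effective channel

def sum_rate(H_eff, W, noise_power=1.0):
    """Per-user SINR and sum rate for a linear precoder W (M x K)."""
    K = H_eff.shape[0]
    rates = []
    for k in range(K):
        signal = np.abs(H_eff[k] @ W[:, k]) ** 2
        interf = sum(np.abs(H_eff[k] @ W[:, j]) ** 2 for j in range(K) if j != k)
        rates.append(np.log2(1 + signal / (interf + noise_power)))
    return sum(rates)

# Toy dimensions: M BS antennas, N RIS elements, K users
M, N, K = 8, 32, 4
rng = np.random.default_rng(0)
G   = (rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))) / np.sqrt(2)  # BS -> RIS
H_r = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)  # RIS -> users
H_d = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)  # direct BS -> users

theta = rng.uniform(0, 2 * np.pi, N)           # action part 1: RIS phase shifts
W = rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))
W /= np.linalg.norm(W)                         # action part 2: normalized precoder
reward = sum_rate(effective_channel(H_d, H_r, G, theta), W)
print(f"reward (sum rate) = {reward:.2f} bit/s/Hz")
```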

575 citations


Journal ArticleDOI
TL;DR: Both theoretical analysis and numerical validations show that the RIS-based system can achieve good sum-rate performance by setting a reasonable size of the RIS and a small number of discrete phase shifts.
Abstract: Reconfigurable intelligent surfaces (RISs) have drawn considerable attention from the research community recently. RISs create favorable propagation conditions by controlling the phase shifts of reflected waves at the surface, thereby enhancing wireless transmissions. In this paper, we study a downlink multi-user system where the transmission from a multi-antenna base station (BS) to various users is achieved by an RIS reflecting the incident signals of the BS towards the users. Unlike most existing works, we consider the practical case where only a limited number of discrete phase shifts can be realized by a finite-sized RIS. A hybrid beamforming scheme is proposed and the sum-rate maximization problem is formulated. Specifically, continuous digital beamforming and discrete RIS-based analog beamforming are performed at the BS and the RIS, respectively, and an iterative algorithm is designed to solve this problem. Both theoretical analysis and numerical validations show that the RIS-based system can achieve good sum-rate performance by setting a reasonable size of the RIS and a small number of discrete phase shifts.
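
The iterative discrete-phase idea can be conveyed with an element-wise search over a b-bit phase codebook. The sketch below uses a single-user received-power objective as a stand-in for the paper's multi-user sum-rate problem; all names, dimensions, and parameter values are assumptions.

```python
import numpy as np

def discrete_phase_search(h_d, h_r, g, bits=2, sweeps=3):
    """Element-wise search over a 2^bits phase codebook, maximizing the single-user
    received power |h_d + sum_n h_r[n] e^{j theta_n} g[n]|^2 (an illustrative
    stand-in for the paper's sum-rate objective)."""
    codebook = 2 * np.pi * np.arange(2 ** bits) / (2 ** bits)
    N = len(g)
    theta = np.zeros(N)
    for _ in range(sweeps):                      # a few coordinate sweeps
        for n in range(N):
            best_phase, best_power = theta[n], -np.inf
            for phase in codebook:
                theta[n] = phase
                h_eff = h_d + np.sum(h_r * np.exp(1j * theta) * g)
                power = np.abs(h_eff) ** 2
                if power > best_power:
                    best_phase, best_power = phase, power
            theta[n] = best_phase
    return theta

rng = np.random.default_rng(1)
N = 64
g   = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)   # BS -> RIS (single antenna)
h_r = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)   # RIS -> user
h_d = (rng.standard_normal()  + 1j * rng.standard_normal()) / np.sqrt(2)    # direct link
theta = discrete_phase_search(h_d, h_r, g, bits=2)
print("optimized power:", np.abs(h_d + np.sum(h_r * np.exp(1j * theta) * g)) ** 2)
```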

435 citations


Journal ArticleDOI
TL;DR: Results show that intelligent reflecting surfaces (IRSs) can help create effective virtual line-of-sight (LOS) paths and thus substantially improve robustness against blockages in mmWave communications.
Abstract: Millimeter wave (MmWave) communications is capable of supporting multi-gigabit wireless access thanks to its abundant spectrum resource. However, severe path loss and high directivity make it vulnerable to blockage events, which can be frequent in indoor and dense urban environments. To address this issue, in this paper, we introduce intelligent reflecting surface (IRS) as a new technology to provide effective reflected paths to enhance the coverage of mmWave signals. In this framework, we study joint active and passive precoding design for IRS-assisted mmWave systems, where multiple IRSs are deployed to assist the data transmission from a base station (BS) to a single-antenna receiver. Our objective is to maximize the received signal power by jointly optimizing the BS's transmit precoding vector and IRSs’ phase shift coefficients. Although such an optimization problem is generally non-convex, we show that, by exploiting some important characteristics of mmWave channels, an optimal closed-form solution can be derived for the single IRS case and a near-optimal analytical solution can be obtained for the multi-IRS case. Our analysis reveals that the received signal power increases quadratically with the number of reflecting elements for both the single IRS and multi-IRS cases. Simulation results are included to verify the optimality and near-optimality of our proposed solutions. Results also show that IRSs can help create effective virtual line-of-sight (LOS) paths and thus substantially improve robustness against blockages in mmWave communications.
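
The closed-form single-IRS solution amounts to co-phasing each reflected path with the direct one, which is what drives the quadratic growth of received power in the number of reflecting elements. A toy Monte Carlo check of that scaling under an assumed Rayleigh model (not the paper's mmWave channel) looks like this:

```python
import numpy as np

rng = np.random.default_rng(2)

def received_power(N):
    """Single-antenna BS/user toy model: direct path h_d plus N reflected paths
    h_r[n]*g[n]; each IRS phase is chosen in closed form to co-phase its
    reflected path with the direct one (the intuition behind the single-IRS
    solution, not the paper's exact channel model)."""
    h_d = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
    g   = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
    h_r = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
    theta = np.angle(h_d) - np.angle(h_r * g)        # closed-form co-phasing
    return np.abs(h_d + np.sum(h_r * np.exp(1j * theta) * g)) ** 2

for N in (16, 32, 64, 128):
    avg = np.mean([received_power(N) for _ in range(2000)])
    print(f"N = {N:4d}: average received power ~ {avg:8.1f}")
# Doubling N roughly quadruples the average power, i.e. power grows ~ N^2.
```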

391 citations


Journal ArticleDOI
TL;DR: In this paper, the robust beamforming based on the imperfect cascaded BS-IRS-user channels at the transmitter was studied, where the transmit power minimization problems were formulated subject to the worst-case rate constraints under the bounded CSI error model, and the rate outage probability constraint under the statistical CSI estimation model, respectively.
Abstract: Intelligent reflecting surface (IRS) has recently been recognized as a promising technique to enhance the performance of wireless systems due to its ability to reconfigure the signal propagation environment. However, perfect channel state information (CSI) is challenging to obtain at the base station (BS) due to the lack of radio frequency (RF) chains at the IRS. Since most existing channel estimation methods were developed to acquire the cascaded BS-IRS-user channels, this paper is the first work to study robust beamforming based on the imperfect cascaded BS-IRS-user channels at the transmitter (CBIUT). Specifically, transmit power minimization problems are formulated subject to the worst-case rate constraints under the bounded CSI error model and the rate outage probability constraints under the statistical CSI error model, respectively. After approximating the worst-case rate constraints by using the S-procedure and the rate outage probability constraints by using the Bernstein-type inequality, the reformulated problems can be efficiently solved. Numerical results show that the negative impact of the CBIUT error on the system performance is greater than that of the direct CSI error.
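
The rate-outage constraint that the Bernstein-type inequality safely approximates can be pictured with a quick Monte Carlo estimate of the empirical outage probability under a Gaussian CSI error, here for a fixed maximum-ratio beamformer on a single direct link. This is a simplified sketch with assumed parameters, not the paper's robust optimization.

```python
import numpy as np

rng = np.random.default_rng(3)
M = 8                                           # BS antennas
sigma_e = 0.1                                   # std of the CSI error
noise = 1.0

# Estimated channel (single user, direct link) and an MRT beamformer built from it
g_hat = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
w = g_hat.conj() / np.linalg.norm(g_hat)        # unit transmit power
nominal_rate = np.log2(1 + np.abs(g_hat @ w) ** 2 / noise)
r_target = 0.95 * nominal_rate                  # rate the design should guarantee

outages, trials = 0, 20000
for _ in range(trials):
    err = sigma_e * (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
    rate = np.log2(1 + np.abs((g_hat + err) @ w) ** 2 / noise)   # rate on the true channel
    outages += rate < r_target
print(f"empirical rate-outage probability ~ {outages / trials:.3f}")
```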

334 citations


Proceedings ArticleDOI
04 May 2020
TL;DR: Small cell base stations (SBSs) are introduced to orchestrate FEEL among MUs within their cells and to periodically exchange model updates with the MBS for global consensus; this hierarchical federated learning (HFL) scheme is shown to significantly reduce the communication latency without sacrificing accuracy.
Abstract: We consider federated edge learning (FEEL), where mobile users (MUs) collaboratively learn a global model by sharing local updates on the model parameters rather than their datasets, with the help of a mobile base station (MBS). We optimize the resource allocation among MUs to reduce the communication latency in learning iterations. Observing that the performance in this centralized setting is limited due to the distance of the cell-edge users to the MBS, we introduce small cell base stations (SBSs) orchestrating FEEL among MUs within their cells, and periodically exchanging model updates with the MBS for global consensus. We show that this hierarchical federated learning (HFL) scheme significantly reduces the communication latency without sacrificing the accuracy.
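
A minimal sketch of the hierarchical averaging pattern described above: each SBS runs a few intra-cell FedAvg iterations among its MUs, and the MBS then averages the SBS models. The local least-squares task, learning rates, and iteration counts are all assumptions, not the paper's setup.

```python
import numpy as np

def local_sgd(model, data, lr=0.1, steps=5):
    """Placeholder local update: a few gradient steps on a least-squares loss."""
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ model - y) / len(y)
        model = model - lr * grad
    return model

def hfl_round(global_model, cells, intra_cell_iters=4):
    """One MBS round of hierarchical FL: each SBS runs several intra-cell
    FedAvg iterations among its MUs, then the MBS averages the SBS models."""
    sbs_models = []
    for cell in cells:                              # each cell = list of MU datasets
        cell_model = global_model.copy()
        for _ in range(intra_cell_iters):           # SBS-level consensus
            cell_model = np.mean([local_sgd(cell_model.copy(), d) for d in cell], axis=0)
        sbs_models.append(cell_model)
    return np.mean(sbs_models, axis=0)              # MBS-level consensus

rng = np.random.default_rng(4)
true_w = rng.standard_normal(5)
def make_mu():                                      # synthetic MU dataset
    X = rng.standard_normal((50, 5))
    return X, X @ true_w + 0.1 * rng.standard_normal(50)

cells = [[make_mu() for _ in range(3)] for _ in range(4)]   # 4 SBSs, 3 MUs each
w = np.zeros(5)
for _ in range(10):
    w = hfl_round(w, cells)
print("error to true model:", np.linalg.norm(w - true_w))
```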

271 citations


Journal ArticleDOI
TL;DR: A block coordinate descent (BCD) algorithm is proposed to solve the secrecy rate maximization (SRM) problem, and simulation results validate the effectiveness of enhancing system security via an IRS.
Abstract: This article considers an artificial noise (AN)-aided secure MIMO wireless communication system. To enhance the system security performance, the advanced intelligent reflecting surface (IRS) is invoked, and the base station (BS), legitimate information receiver (IR), and eavesdropper (Eve) are equipped with multiple antennas. With the aim of maximizing the secrecy rate (SR), the transmit precoding (TPC) matrix at the BS, the covariance matrix of the AN, and the phase shifts at the IRS are jointly optimized subject to constraints on the transmit power limit and the unit modulus of the IRS phase shifts. The resulting secrecy rate maximization (SRM) problem is non-convex with multiple coupled variables. To tackle it, we propose to utilize the block coordinate descent (BCD) algorithm to alternately update the variables while keeping the SR non-decreasing. Specifically, the optimal TPC matrix and AN covariance matrix are derived by the Lagrangian multiplier method, and the optimal phase shifts are obtained by the majorization-minimization (MM) algorithm. Since all variables can be calculated in closed form, the proposed algorithm is very efficient. We also extend the SRM problem to the more general scenario with multiple IRs and propose a BCD algorithm to solve it. Simulation results validate the effectiveness of system security enhancement via an IRS.
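
The objective being maximized is the secrecy rate for a given TPC matrix, AN covariance, and (implicitly) IRS phases. The sketch below only evaluates that objective, treating H_ir and H_eve as the effective channels after IRS reflection; it does not reproduce the paper's BCD, Lagrangian, or MM update steps, and all dimensions and power splits are assumptions.

```python
import numpy as np

def rate(H, V, Z, sigma2=1.0):
    """log2 det(I + H V V^H H^H (sigma2 I + H Z H^H)^{-1}) for a receiver with
    channel H, data precoder V and artificial-noise covariance Z."""
    Nr = H.shape[0]
    interference = sigma2 * np.eye(Nr) + H @ Z @ H.conj().T
    signal = H @ V @ V.conj().T @ H.conj().T
    return np.real(np.linalg.slogdet(np.eye(Nr) + signal @ np.linalg.inv(interference))[1]) / np.log(2)

def secrecy_rate(H_ir, H_eve, V, Z):
    """The SR objective the BCD algorithm maximizes: [C_IR - C_Eve]^+ (evaluation only)."""
    return max(rate(H_ir, V, Z) - rate(H_eve, V, Z), 0.0)

rng = np.random.default_rng(5)
Nt, Nr, Ne, d = 4, 2, 2, 2                          # BS / IR / Eve antennas, data streams
H_ir  = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)
H_eve = (rng.standard_normal((Ne, Nt)) + 1j * rng.standard_normal((Ne, Nt))) / np.sqrt(2)
V = rng.standard_normal((Nt, d)) + 1j * rng.standard_normal((Nt, d))
V *= np.sqrt(0.8 * 10) / np.linalg.norm(V)          # 80% of the power budget on data
A = rng.standard_normal((Nt, Nt)) + 1j * rng.standard_normal((Nt, Nt))
Z = A @ A.conj().T
Z *= 0.2 * 10 / np.real(np.trace(Z))                # 20% of the power budget on AN
print(f"secrecy rate = {secrecy_rate(H_ir, H_eve, V, Z):.2f} bit/s/Hz")
```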

259 citations


Journal ArticleDOI
TL;DR: This survey provides a comprehensive overview of several emerging technologies for 5G systems, such as massive multiple-input multiple-output (MIMO) technologies, multiple access technologies, hybrid analog-digital precoding and combining, non-orthogonal multiple access (NOMA), cell-free massive MIMO, and simultaneous wireless information and power transfer (SWIPT) technologies.
Abstract: Fifth-generation (5G) cellular networks will almost certainly operate in the high-bandwidth, underutilized millimeter-wave (mmWave) frequency spectrum, which offers the potential for high-capacity wireless transmission at multi-gigabit-per-second (Gbps) data rates. Despite the enormous available bandwidth, mmWave signal transmissions suffer from fundamental technical challenges such as severe path loss, sensitivity to blockage, directivity, and narrow beamwidth, due to their short wavelengths. To effectively support system design and deployment, accurate channel modeling comprising several 5G technologies and scenarios is essential. This survey provides a comprehensive overview of several emerging technologies for 5G systems, such as massive multiple-input multiple-output (MIMO), multiple access technologies, hybrid analog-digital precoding and combining, non-orthogonal multiple access (NOMA), cell-free massive MIMO, and simultaneous wireless information and power transfer (SWIPT). These technologies induce distinct propagation characteristics and impose specific requirements on 5G channel modeling. To tackle these challenges, we first provide a survey of existing solutions and standards and discuss the radio-frequency (RF) spectrum and regulatory issues for mmWave communications. Second, we compare existing wireless communication techniques such as sub-6-GHz WiFi and sub-6-GHz 4G LTE with mmWave communications, which offer benefits including narrow beams, high signal quality, large-capacity data transmission, and strong detection potential. Third, we describe the fundamental propagation characteristics of the mmWave band and survey the existing channel models for mmWave communications. Fourth, we track the evolution and advancements of hybrid beamforming for massive MIMO systems in terms of system models of hybrid precoding architectures, hybrid analog and digital precoding/combining matrices, potential antenna configuration scenarios, and mmWave channel estimation (CE) techniques. Fifth, we extend the scope of the discussion by including multiple access technologies for mmWave systems, such as non-orthogonal multiple access (NOMA) and space-division multiple access (SDMA), with limited RF chains at the base station. Lastly, we explore the integration of SWIPT in mmWave massive MIMO systems with limited RF chains to realize spectrally and energy-efficient communications.

234 citations


Journal ArticleDOI
TL;DR: In this paper, a system for serving paired power-domain non-orthogonal multiple access (NOMA) users by designing the passive beamforming weights at the reconfigurable intelligent surfaces (RISs) is proposed.
Abstract: Reconfigurable intelligent surfaces (RISs) constitute a promising performance enhancement for next-generation (NG) wireless networks in terms of enhancing both their spectral efficiency (SE) and energy efficiency (EE). We conceive a system for serving paired power-domain non-orthogonal multiple access (NOMA) users by designing the passive beamforming weights at the RISs. In an effort to evaluate the network performance, we first derive the best-case and worst-case new channel statistics for characterizing the effective channel gains. Then, we derive best-case and worst-case closed-form expressions for both the outage probability and the ergodic rate of the prioritized user. For gleaning further insights, we investigate both the diversity orders of the outage probability and the high-signal-to-noise-ratio (SNR) slopes of the ergodic rate. We also derive both the SE and EE of the proposed network. Our analytical results demonstrate that the base station (BS)-user links have almost no impact on the attainable diversity orders when the number of RISs is sufficiently high. Numerical results confirm that: i) the high-SNR slope of the RIS-aided network is one; and ii) the proposed RIS-aided NOMA network has superior network performance compared to its orthogonal counterpart.

213 citations


Posted Content
TL;DR: In this article, a channel estimation framework based on the PARAllel FACtor (PARAFAC) decomposition is proposed to unfold the resulting cascaded channel model for the downlink of a RIS-empowered multi-user MISO communication system.
Abstract: Reconfigurable Intelligent Surfaces (RISs) have been recently considered as an energy-efficient solution for future wireless networks due to their fast and low-power configuration, which has increased potential in enabling massive connectivity and low-latency communications. Accurate and low-overhead channel estimation in RIS-based systems is one of the most critical challenges due to the usually large number of RIS unit elements and their distinctive hardware constraints. In this paper, we focus on the downlink of a RIS-empowered multi-user Multiple Input Single Output (MISO) communication system and propose a channel estimation framework based on the PARAllel FACtor (PARAFAC) decomposition to unfold the resulting cascaded channel model. We present two iterative estimation algorithms for the channels between the base station and the RIS, as well as the channels between the RIS and the users. One is based on alternating least squares (ALS), while the other uses vector approximate message passing to iteratively reconstruct the two unknown channels from the estimated vectors. To theoretically assess the performance of the ALS-based algorithm, we derive its estimation Cramér-Rao bound (CRB). We also discuss the achievable sum-rate computation with estimated channels and different precoding schemes at the base station. Our extensive simulation results show that our algorithms outperform benchmark schemes and that the ALS technique achieves the CRB. It is also demonstrated that the sum rate obtained with the estimated channels reaches that of perfect channel estimation under various settings, thus verifying the effectiveness and robustness of the proposed estimation algorithms.
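
The ALS idea can be sketched on a simplified pilot model in which, for each training RIS configuration, a user observes Phi diag(h_k) G plus noise (orthogonal BS pilots absorbed). The toy loop below alternates least-squares updates of the RIS-user vectors {h_k} and the BS-RIS matrix G; it is not the paper's PARAFAC implementation or its VAMP variant, and the two factors are only recoverable up to the usual scaling ambiguity.

```python
import numpy as np

rng = np.random.default_rng(6)
M, N, K, P = 4, 16, 3, 40          # BS antennas, RIS elements, users, training configurations
snr_lin = 10 ** (20 / 10)

# Ground-truth channels and unit-modulus training RIS phase configurations
G_true = (rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))) / np.sqrt(2)
H_true = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)
Phi = np.exp(1j * rng.uniform(0, 2 * np.pi, (P, N)))

# Noisy observations Y_k = Phi diag(h_k) G + noise
Y = np.stack([Phi @ np.diag(H_true[k]) @ G_true for k in range(K)])
Y += np.sqrt(1 / (2 * snr_lin)) * (rng.standard_normal(Y.shape) + 1j * rng.standard_normal(Y.shape))

# Alternating least squares over {h_k} and G
G_est = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))
H_est = np.zeros((K, N), dtype=complex)
for _ in range(30):
    # Step 1: with G fixed, each h_k is a linear least-squares problem
    A = (Phi[:, None, :] * G_est.T[None, :, :]).reshape(P * M, N)   # A[(p,m),n] = Phi[p,n] G[n,m]
    for k in range(K):
        H_est[k] = np.linalg.lstsq(A, Y[k].reshape(-1), rcond=None)[0]
    # Step 2: with {h_k} fixed, all columns of G solve one least-squares problem
    B = np.vstack([Phi * H_est[k] for k in range(K)])                # (K*P) x N
    G_est = np.linalg.lstsq(B, np.vstack([Y[k] for k in range(K)]), rcond=None)[0]

# Compare the cascaded channels (the individual factors carry a scaling ambiguity)
D_true = np.stack([np.diag(H_true[k]) @ G_true for k in range(K)])
D_est = np.stack([np.diag(H_est[k]) @ G_est for k in range(K)])
print("normalized cascaded-channel error:",
      np.linalg.norm(D_est - D_true) / np.linalg.norm(D_true))
```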

198 citations


Journal ArticleDOI
TL;DR: Simulation results demonstrate that the proposed deep PDS-PER learning based secure beamforming approach can significantly improve the system secrecy rate and QoS satisfaction probability in IRS-aided secure communication systems.
Abstract: In this paper, we study an intelligent reflecting surface (IRS)-aided wireless secure communication system for physical layer security, where an IRS is deployed to adjust its surface reflecting elements to guarantee secure communication of multiple legitimate users in the presence of multiple eavesdroppers. Aiming to improve the system secrecy rate, a design problem for jointly optimizing the base station (BS)'s beamforming and the IRS's reflecting beamforming is formulated given the different quality of service (QoS) requirements and time-varying channel conditions. As the system is highly dynamic and complex, and it is challenging to address the non-convex optimization problem, a novel deep reinforcement learning (DRL)-based secure beamforming approach is first proposed to achieve the optimal beamforming policy against eavesdroppers in dynamic environments. Furthermore, post-decision state (PDS) and prioritized experience replay (PER) schemes are utilized to enhance the learning efficiency and secrecy performance. Specifically, PDS is capable of tracking the dynamic characteristics of the environment and adjusting the beamforming policy accordingly. Simulation results demonstrate that the proposed deep PDS-PER learning-based secure beamforming approach can significantly improve the system secrecy rate and QoS satisfaction probability in IRS-aided secure communication systems.
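
Prioritized experience replay is a generic DRL component; a minimal proportional-priority buffer is sketched below. The paper's post-decision-state learning and beamforming model are not reproduced, and all parameter values are assumptions.

```python
import numpy as np

class PrioritizedReplay:
    """Minimal proportional PER buffer: sample transitions with probability
    p_i^alpha / sum_j p_j^alpha and correct with importance-sampling weights."""
    def __init__(self, capacity, alpha=0.6, beta=0.4, eps=1e-5):
        self.capacity, self.alpha, self.beta, self.eps = capacity, alpha, beta, eps
        self.data, self.priorities, self.pos = [], [], 0

    def add(self, transition):
        max_p = max(self.priorities, default=1.0)      # new samples get the current max priority
        if len(self.data) < self.capacity:
            self.data.append(transition)
            self.priorities.append(max_p)
        else:
            self.data[self.pos] = transition
            self.priorities[self.pos] = max_p
            self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size, rng=np.random.default_rng()):
        p = np.asarray(self.priorities) ** self.alpha
        p /= p.sum()
        idx = rng.choice(len(self.data), size=batch_size, p=p)
        weights = (len(self.data) * p[idx]) ** (-self.beta)   # importance-sampling weights
        weights /= weights.max()
        return idx, [self.data[i] for i in idx], weights

    def update_priorities(self, idx, td_errors):
        for i, err in zip(idx, td_errors):
            self.priorities[i] = abs(err) + self.eps

# Usage: store (state, action, reward, next_state) tuples, sample a weighted
# minibatch, and feed the TD errors back to refresh priorities.
buf = PrioritizedReplay(capacity=1000)
for t in range(200):
    buf.add((f"s{t}", f"a{t}", 0.0, f"s{t + 1}"))
idx, batch, w = buf.sample(32)
buf.update_priorities(idx, np.random.randn(32))
```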

161 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate the resource allocation design for intelligent reflecting surface (IRS)-assisted full-duplex (FD) cognitive radio systems, where an IRS is deployed to enhance the performance of the secondary network while helping to mitigate the interference caused to the primary users (PUs).
Abstract: In this article, we investigate the resource allocation design for intelligent reflecting surface (IRS)-assisted full-duplex (FD) cognitive radio systems. In particular, a secondary network employs an FD base station (BS) for serving multiple half-duplex downlink (DL) and uplink (UL) users simultaneously. An IRS is deployed to enhance the performance of the secondary network while helping to mitigate the interference caused to the primary users (PUs). The DL transmit beamforming vectors and the UL receive beamforming vectors at the FD BS, the transmit power of the UL users, and the phase shift matrix at the IRS are jointly optimized for maximization of the total spectral efficiency of the secondary system. The design task is formulated as a non-convex optimization problem taking into account the imperfect knowledge of the PUs’ channel state information (CSI) and their maximum interference tolerance. Since the maximum interference tolerance constraint is intractable, we apply a safe approximation to transform it into a convex constraint. To efficiently handle the resulting approximated optimization problem, which is still non-convex, we develop an iterative block coordinate descent (BCD)-based algorithm. This algorithm exploits semidefinite relaxation, a penalty method, and successive convex approximation and is guaranteed to converge to a stationary point of the approximated optimization problem. Our simulation results do not only reveal that the proposed scheme yields a substantially higher system spectral efficiency for the secondary system than several baseline schemes, but also confirm its robustness against CSI uncertainty. Besides, our results illustrate the tremendous potential of IRS for managing the various types of interference arising in FD cognitive radio networks.

Journal ArticleDOI
TL;DR: This work proposes a load balancing scheme in a fog network to minimize the latency of data flows in the communications and processing procedures by associating IoT devices to suitable BSs and proves the convergence and the optimality of the proposed workload balancing scheme.
Abstract: As latency is the key performance metric for IoT applications, fog nodes co-located with cellular base stations can move the computing resources close to IoT devices. Therefore, data flows of IoT devices can be offloaded to fog nodes in their proximity, instead of the remote cloud, for processing. However, the latency of data flows in IoT devices consists of both the communications latency and the computing latency. Owing to the spatial and temporal dynamics of IoT device distributions, some BSs and fog nodes are lightly loaded, while others, which may be overloaded, may incur congestion. Thus, the traffic load allocation among base stations (BSs) and the computing load allocation among fog nodes affect the communications latency and computing latency of data flows, respectively. To solve this problem, we propose a workload balancing scheme in a fog network to minimize the latency of data flows in the communications and processing procedures by associating IoT devices to suitable BSs. We further prove the convergence and the optimality of the proposed workload balancing scheme. Through extensive simulations, we have compared the performance of the proposed load balancing scheme with other schemes and verified its advantages for fog networking.

Journal ArticleDOI
TL;DR: This article considers millimeter-wave (mmWave) communication on a UAV platform, where the UAV base station (UAV-BS) serves multiple ground users, which generate big sensor data.
Abstract: Unmanned aerial vehicles (UAVs), with flexible mobility and low cost, have become a promising technology for wireless communication and can be used for wireless data collection in the Internet of Things (IoT). In this article, we consider millimeter-wave (mmWave) communication on a UAV platform, where the UAV base station (UAV-BS) serves multiple ground users that generate big sensor data. Both the deployment of the UAV-BS and the beamforming design have an essential impact on the throughput of the system. Thus, we formulate a problem to maximize the achievable sum rate of all users, subject to a minimum rate constraint for each user, a position constraint for the UAV-BS, and a constant-modulus (CM) constraint for the beamforming vector. We solve the non-convex problem in two steps. First, by introducing the approximate beam pattern, we solve the deployment and beam gain allocation subproblem. Then, we utilize the artificial bee colony (ABC) algorithm to solve the beamforming subproblem. For the global optimization problem, we find the near-optimal position of the UAV-BS and the beamforming vector to steer toward each user, subject to an analog beamforming structure. The simulation results demonstrate that the proposed solution achieves superior performance compared with the existing random steering beamforming strategy in terms of achievable sum rate.

Journal ArticleDOI
TL;DR: IRS placement is designed to maximize the network performance in terms of the spatial signal-to-interference-plus-noise ratio (SINR), and numerical analysis results indicate that the proposed IRS-aided network outperforms a benchmark system without IRSs when the IRS installation positions are optimally determined.
Abstract: Intelligent reflecting surfaces (IRSs) have emerged as a key enabler for beyond fifth-generation (B5G) communication technology and for realizing sixth-generation (6G) cellular communication. In addition, B5G and 6G networks are expected to support aerial user communications in accordance with the expanded requirements of data transmission for an aerial user. However, there are challenges in providing wireless communication for aerial users owing to the different radio wave propagation properties between terrestrial areas and aerial areas. In this article, we propose an IRS-aided cellular network coverage extension for aerial users. In our proposed network, IRS and base stations (BSs) cooperate with each other to provide air-ground communication to aerial users (AUs), the aim of which is to prevent interference signals from spreading to a wide area. Furthermore, IRS placement is designed to maximize the network performance in terms of the spatial signal-to-interference-plus-noise ratio (SINR) while mitigating inter-cell interference. Numerical analysis results indicate that the proposed IRS-aided network outperforms the benchmark system without IRSs when the IRS installation positions are optimally determined.

Journal ArticleDOI
TL;DR: This work provides a general introduction to the application of FDL in UAV-enabled wireless networks, addressing its suitability and how FDL can be used to deal with target challenges, and discusses key technical challenges, open issues, and future research directions for FDL-based approaches.
Abstract: The use of Unmanned Aerial Vehicles (UAVs) in wireless networks is rapidly growing, as they are key enablers of new applications, including surveillance and monitoring, military operations, delivery of medical supplies, telecommunications, etc. In particular, owing to their unique properties such as flexibility, mobility, and adaptive altitude, UAVs can act as mobile base stations to improve the capacity, coverage, and energy efficiency of wireless networks. On the other hand, UAVs can operate as mobile terminals to enable many applications such as item delivery and real-time video streaming. In this context, data-driven Deep Learning (DL)-assisted approaches are attracting growing interest, not only to exploit the huge amount of generated data, but also to optimize network operations and hence meet the QoS requirements of these emerging wireless networks. However, UAVs are resource-constrained devices, especially in terms of computing and power resources, and traditional DL-assisted schemes are cloud-centric, requiring UAVs' data to be sent to and stored in a centralized server. This is a critical issue, since it generates a huge communication overhead to send raw data towards the centralized entity and hence may lead to network bandwidth and energy inefficiency for UAV devices. In addition, the transferred data may contain personal information such as UAVs' location and identity, which directly raises privacy concerns. As a solution, Federated Deep Learning (FDL), or distributed DL, was introduced; the basic idea is to keep raw data where it is generated, while sending only users' locally trained DL models to the centralized entity for aggregation. Due to its privacy preservation and low communication overhead and latency, FDL is much better suited to many UAV-enabled wireless applications. In this work, we provide a general introduction to the application of FDL in UAV-enabled wireless networks. We first introduce the FDL concept and its fundamentals. Then, we highlight possible applications of FDL in UAV-enabled wireless networks, addressing their suitability and how FDL can be used to deal with target challenges. Finally, we discuss key technical challenges, open issues, and future research directions for FDL-based approaches in this context.

Posted Content
TL;DR: A probabilistic user selection scheme is proposed such that the BS is connected, with high probability, to the users whose local FL models have significant effects on the global FL model, which enables the BS to improve the global model, speed up FL convergence, and reduce the training loss.
Abstract: In this paper, the convergence time of federated learning (FL), when deployed over a realistic wireless network, is studied. In particular, a wireless network is considered in which wireless users transmit their local FL models (trained using their locally collected data) to a base station (BS). The BS, acting as a central controller, generates a global FL model using the received local FL models and broadcasts it back to all users. Due to the limited number of resource blocks (RBs) in a wireless network, only a subset of users can be selected to transmit their local FL model parameters to the BS at each learning step. Moreover, since each user has unique training data samples, the BS prefers to include all local user FL models to generate a converged global FL model. Hence, the FL performance and convergence time will be significantly affected by the user selection scheme. Therefore, it is necessary to design an appropriate user selection scheme that enables users of higher importance to be selected more frequently. This joint learning, wireless resource allocation, and user selection problem is formulated as an optimization problem whose goal is to minimize the FL convergence time while optimizing the FL performance. To solve this problem, a probabilistic user selection scheme is proposed such that the BS is connected to the users whose local FL models have significant effects on its global FL model with high probabilities. Given the user selection policy, the uplink RB allocation can be determined. To further reduce the FL convergence time, artificial neural networks (ANNs) are used to estimate the local FL models of the users that are not allocated any RBs for local FL model transmission at each given learning step, which enables the BS to enhance its global FL model and improve the FL convergence speed and performance.
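
A toy version of probabilistic user selection: each user's selection probability is made proportional to the norm of its local update, used here as an assumed importance proxy. The paper defines its own probability rule and also handles RB allocation and ANN-based model estimation, which are omitted from this sketch.

```python
import numpy as np

def fl_round(global_w, local_data, n_selected, rng):
    """One FL round with probabilistic user selection: users whose local update
    has a larger norm are picked with higher probability; only the selected
    users' updates reach the BS (limited resource blocks) and are averaged."""
    updates = []
    for X, y in local_data:                                   # one local gradient step per user
        grad = X.T @ (X @ global_w - y) / len(y)
        updates.append(-0.1 * grad)
    importance = np.array([np.linalg.norm(u) for u in updates])
    probs = importance / importance.sum()                     # selection probabilities
    chosen = rng.choice(len(local_data), size=n_selected, replace=False, p=probs)
    return global_w + np.mean([updates[i] for i in chosen], axis=0), chosen

rng = np.random.default_rng(7)
true_w = rng.standard_normal(4)
users = []
for _ in range(10):                                           # synthetic local datasets
    X = rng.standard_normal((40, 4))
    users.append((X, X @ true_w + 0.05 * rng.standard_normal(40)))

w = np.zeros(4)
for _ in range(100):
    w, chosen = fl_round(w, users, n_selected=3, rng=rng)
print("distance to true model:", np.linalg.norm(w - true_w))
```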

Journal ArticleDOI
TL;DR: Simulation results demonstrate that the proposed JCC-UA algorithm can effectively reduce the latency of user content downloading and improve the hit rates of contents cached at the BSs as compared to several baseline schemes.
Abstract: Deploying small cell base stations (SBS) under the coverage area of a macro base station (MBS), and caching popular contents at the SBSs in advance, are effective means to provide high-speed and low-latency services in next generation mobile communication networks. In this paper, we investigate the problem of content caching (CC) and user association (UA) for edge computing. A joint CC and UA optimization problem is formulated to minimize the content download latency. We prove that the joint CC and UA optimization problem is NP-hard. Then, we propose a CC and UA algorithm (JCC-UA) to reduce the content download latency. JCC-UA includes a smart content caching policy (SCCP) and dynamic user association (DUA). SCCP utilizes the exponential smoothing method to predict content popularity and cache contents according to prediction results. DUA includes a rapid association (RA) method and a delayed association (DA) method. Simulation results demonstrate that the proposed JCC-UA algorithm can effectively reduce the latency of user content downloading and improve the hit rates of contents cached at the BSs as compared to several baseline schemes.
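
The SCCP ingredient, exponential smoothing of observed request counts followed by caching the predicted top-C contents, can be sketched as below. Class and parameter names are assumptions, and the user-association part of JCC-UA is not modeled.

```python
import random
from collections import defaultdict

class SmartContentCache:
    """Track request counts per interval, predict the next interval's popularity
    with exponential smoothing, and cache the top-C contents."""
    def __init__(self, capacity, alpha=0.3):
        self.capacity = capacity
        self.alpha = alpha
        self.predicted = defaultdict(float)   # smoothed popularity per content id
        self.counts = defaultdict(int)        # requests observed in the current interval
        self.cached = set()

    def record_request(self, content_id):
        self.counts[content_id] += 1
        return content_id in self.cached      # cache hit?

    def end_of_interval(self):
        # Exponential smoothing: p_hat <- alpha * observed + (1 - alpha) * p_hat
        for cid in set(self.counts) | set(self.predicted):
            self.predicted[cid] = (self.alpha * self.counts.get(cid, 0)
                                   + (1 - self.alpha) * self.predicted[cid])
        top = sorted(self.predicted, key=self.predicted.get, reverse=True)
        self.cached = set(top[: self.capacity])
        self.counts.clear()

# Usage: feed requests interval by interval and measure the hit rate.
random.seed(0)
cache, hits, total = SmartContentCache(capacity=5), 0, 0
catalog = list(range(50))
weights = [1 / (i + 1) for i in catalog]      # Zipf-like popularity
for interval in range(20):
    for _ in range(500):
        cid = random.choices(catalog, weights=weights)[0]
        hits += cache.record_request(cid)
        total += 1
    cache.end_of_interval()
print(f"hit rate: {hits / total:.2f}")
```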

Proceedings ArticleDOI
01 Nov 2020
TL;DR: This paper introduces a distributed IRS-aided mmWave system to support multi-user transmission and designs a quantitative projection method for the IRS with discrete phase shifts, thereby reducing power consumption whilst improving communication performance.
Abstract: Intelligent reflecting surface (IRS) is envisioned as a promising solution for controlling radio propagation environments in future wireless systems. In this paper, we propose a distributed intelligent reflecting surface (IRS) assisted multi-user millimeter wave (mmWave) system, where IRSs are exploited to enhance the mmWave signal coverage when direct links between base station and users are unavailable. First, a joint active and passive beamforming problem is established for weighted sum-rate maximization. Then, an alternating iterative algorithm with closed-form expressions is proposed to tackle the challenging non-convex problem, thereby decoupling the active and passive beamforming variables. Moreover, we design a constraint relaxation technique to address the unit modulus constraints pertaining to the IRS. Numerical results demonstrate that the distributed IRS can potentially enhance the communication performance of existing wireless systems.

Posted Content
TL;DR: This article presents an alternative application of metasurfaces for wireless communications as active reconfigurable antennas with advanced analog signal processing capabilities for next generation transceivers.
Abstract: Next generation wireless base stations and access points will transmit and receive using extremely massive numbers of antennas. A promising technology for realizing such massive arrays in a dynamically controllable and scalable manner with reduced cost and power consumption utilizes surfaces of radiating metamaterial elements, known as metasurfaces. To date, metasurfaces are mainly considered in the context of wireless communications as passive reflecting devices, aiding conventional transceivers in shaping the propagation environment. This article presents an alternative application of metasurfaces for wireless communications as active reconfigurable antennas with advanced analog signal processing capabilities for next generation transceivers. We review the main characteristics of metasurfaces used for radiation and reception, and analyze their main advantages as well as their effect on the ability to reliably communicate in wireless networks. As current studies unveil only a portion of the potential of metasurfaces, we detail a list of exciting research and implementation challenges which arise from the application of metasurface antennas for wireless transceivers.

Proceedings ArticleDOI
08 Jun 2020
TL;DR: In this paper, the authors focus on the downlink of a RIS-assisted multi-user MISO communication system and present a method based on the PARAllel FACtor (PARAFAC) decomposition to unfold the resulting cascaded channel model.
Abstract: Reconfigurable Intelligent Surfaces (RISs) have been recently considered as an energy-efficient solution for future wireless networks due to their fast and low power configuration enabling massive connectivity and low latency communications. Channel estimation in RIS-based systems is one of the most critical challenges due to the large number of reflecting unit elements and their distinctive hardware constraints. In this paper, we focus on the downlink of a RIS-assisted multi-user Multiple Input Single Output (MISO) communication system and present a method based on the PARAllel FACtor (PARAFAC) decomposition to unfold the resulting cascaded channel model. The proposed method includes an alternating least squares algorithm to iteratively estimate the channel between the base station and RIS, as well as the channels between RIS and users. Our selective simulation results show that the proposed iterative channel estimation method outperforms a benchmark scheme using genie-aided information. We also provide insights on the impact of different RIS settings on the proposed algorithm.

Journal ArticleDOI
TL;DR: A new method for cooperative vehicle positioning and mapping of the radio environment is proposed, comprising a multiple-model probability hypothesis density filter and a map fusion routine, which is able to consider different types of objects and different fields of views.
Abstract: 5G millimeter wave (mmWave) signals can enable accurate positioning in vehicular networks when the base station and vehicles are equipped with large antenna arrays. However, radio-based positioning suffers from multipath signals generated by different types of objects in the physical environment. Multipath can be turned into a benefit, by building up a radio map (comprising the number of objects, object type, and object state) and using this map to exploit all available signal paths for positioning. We propose a new method for cooperative vehicle positioning and mapping of the radio environment, comprising a multiple-model probability hypothesis density filter and a map fusion routine, which is able to consider different types of objects and different fields of views. Simulation results demonstrate the performance of the proposed method.

Journal ArticleDOI
TL;DR: The research shows that the proposed method has a clear energy-saving effect and can meet the energy efficiency requirements of 5G ultra-dense base stations; within an ultra-dense base station group, its complexity also meets system operation requirements, giving it a degree of practicality and providing a reference for follow-up research.
Abstract: Owing to time and space constraints, 5G base stations suffer from severe energy consumption problems in certain time periods, so corresponding sleep strategies are needed to reduce energy consumption. Based on an analysis of the 5G ultra-dense base station network structure, the current situation, and user demand, a cluster sleep method based on a genetic algorithm is constructed, which dynamically matches energy consumption in the time and space domains and puts low-load base stations into the sleep state. To verify the performance of the algorithm, a simulation network is built on the MATLAB platform; comparative analysis demonstrates the advantages of the proposed algorithm, and relevant test parameters are set for its technical performance analysis. The research shows that the proposed method has a clear energy-saving effect, can meet the energy efficiency requirements of 5G ultra-dense base stations, and, within an ultra-dense base station group, its complexity also meets system operation requirements; it therefore has a degree of practicality and can provide a reference for follow-up research.
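
A toy genetic algorithm over binary sleep/awake chromosomes conveys the idea: fitness rewards energy savings and penalizes any awake base station that would be overloaded by the traffic offloaded from sleeping ones. The power, load, and capacity figures are made up, not the paper's model.

```python
import random
random.seed(0)

N_BS = 20
load = [random.random() for _ in range(N_BS)]        # normalized traffic load per BS
P_ACTIVE, P_SLEEP, CAPACITY = 100.0, 10.0, 1.5        # toy power and capacity figures

def fitness(chromosome):
    """Negative total power plus a large penalty whenever an awake BS exceeds
    capacity after absorbing its share of the sleeping BSs' traffic."""
    awake = [i for i, bit in enumerate(chromosome) if bit == 1]
    if not awake:
        return -1e9
    offloaded = sum(load[i] for i, bit in enumerate(chromosome) if bit == 0)
    share = offloaded / len(awake)
    overload = sum(max(0.0, load[i] + share - CAPACITY) for i in awake)
    energy = sum(P_ACTIVE if bit else P_SLEEP for bit in chromosome)
    return -(energy + 1e4 * overload)

def genetic_sleep(pop_size=40, generations=100, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(N_BS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_BS)            # single-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]   # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_sleep()
print("sleeping BSs:", [i for i, bit in enumerate(best) if bit == 0])
```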

Proceedings ArticleDOI
07 Jun 2020
TL;DR: Simulation results show that the proposed CRNet outperforms the state-of-the-art CsiNet under the same computational complexity without any extra information.
Abstract: In massive multiple-input multiple-output (MIMO) systems, user equipment (UE) needs to send downlink channel state information (CSI) back to the base station (BS). However, the feedback becomes expensive with the growing complexity of CSI in massive MIMO systems. Recently, deep learning (DL) approaches have been used to improve the reconstruction efficiency of CSI feedback. In this paper, a novel feedback network named CRNet is proposed to achieve better performance via extracting CSI features at multiple resolutions. An advanced training scheme that further boosts the network performance is also introduced. Simulation results show that the proposed CRNet outperforms the state-of-the-art CsiNet under the same computational complexity without any extra information. The open source codes are available at https://github.com/Kylin9511/CRNet.
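
For readers unfamiliar with the setup, a minimal CSI feedback autoencoder (UE-side encoder compressing the two-channel real/imaginary CSI matrix into a short codeword, BS-side decoder reconstructing it) is sketched below in PyTorch. It is only a toy stand-in for CRNet's multi-resolution architecture; the real implementation is in the linked repository.

```python
import torch
import torch.nn as nn

class CsiFeedbackAE(nn.Module):
    """Toy CSI feedback autoencoder: encoder at the UE, decoder at the BS."""
    def __init__(self, h=32, w=32, codeword_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.LeakyReLU(0.3),
            nn.Flatten(),
            nn.Linear(16 * h * w, codeword_dim),        # compressed codeword fed back
        )
        self.decoder = nn.Sequential(
            nn.Linear(codeword_dim, 2 * h * w),
            nn.Unflatten(1, (2, h, w)),
            nn.Conv2d(2, 16, 3, padding=1), nn.LeakyReLU(0.3),
            nn.Conv2d(16, 2, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, csi):
        return self.decoder(self.encoder(csi))

# Toy training loop on random "CSI" tensors scaled to [0, 1]
model = CsiFeedbackAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(50):
    csi = torch.rand(16, 2, 32, 32)
    loss = nn.functional.mse_loss(model(csi), csi)
    opt.zero_grad(); loss.backward(); opt.step()
print("final MSE:", loss.item())
```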

Journal ArticleDOI
TL;DR: An adaptive computation offloading method based on deep reinforcement learning (ACORL) that can address the continuous action space is proposed and the numerical results illustrate that the proposed ACORL can effectively learn the optimal policy, which outperforms the Dueling DQN and greedy policy in the stochastic environment.
Abstract: The vehicular network needs efficient and reliable data communication technology to maintain low latency. It is very challenging to minimize the energy consumption and data communication delay while the vehicle is moving and the wireless channels and bandwidth are time-varying. With the help of the emerging mobile edge computing (MEC) server, vehicles and roadside units (RSUs) can offload computing tasks to the MEC server associated with a base station (BS). However, the environment for offloading tasks to the MEC, e.g., the wireless channel states and available bandwidth, is unstable. Therefore, ensuring the efficiency of computation offloading under such an unstable environment is a challenge. In this work, we design a task computation offloading model in a heterogeneous vehicular network; this model takes into account multiple stochastic tasks and the variety of wireless channels and bandwidth. To obtain the tradeoff between the cost of energy consumption and the cost of data transmission delay, and to avoid the curse of dimensionality caused by the complexity of the large action space, we propose an adaptive computation offloading method based on deep reinforcement learning (ACORL) that can address continuous action spaces. ACORL adds an Ornstein-Uhlenbeck (OU) noise vector to the action space, with a different factor for each action, to facilitate exploration. Multiple transmission devices can execute local processing and computation offloading to the MEC. Moreover, ACORL considers the variety of wireless channels and available bandwidth between adjacent time slots. The numerical results illustrate that the proposed ACORL can effectively learn the optimal policy, which outperforms the Dueling DQN and greedy policies in the stochastic environment.
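
The Ornstein-Uhlenbeck exploration noise mentioned above is a standard ingredient of DDPG-style agents with continuous actions; a self-contained sketch with per-action-dimension factors follows (the values and the action names are assumptions, not the paper's settings).

```python
import numpy as np

class OUNoise:
    """Ornstein-Uhlenbeck noise: x_{t+1} = x_t + theta*(mu - x_t)*dt
    + sigma*sqrt(dt)*N(0, I), commonly added to continuous actions for exploration."""
    def __init__(self, mu, theta, sigma, dt=1.0, rng=None):
        self.mu = np.asarray(mu, dtype=float)
        self.theta = np.asarray(theta, dtype=float)
        self.sigma = np.asarray(sigma, dtype=float)
        self.dt = dt
        self.rng = rng or np.random.default_rng()
        self.x = self.mu.copy()

    def sample(self):
        dx = (self.theta * (self.mu - self.x) * self.dt
              + self.sigma * np.sqrt(self.dt) * self.rng.standard_normal(self.mu.shape))
        self.x = self.x + dx
        return self.x

# Two action dimensions (e.g. offloading ratio and transmit power), each with
# its own noise factors, perturbing the actor's deterministic action.
noise = OUNoise(mu=[0.0, 0.0], theta=[0.15, 0.3], sigma=[0.2, 0.1])
action = np.array([0.5, 0.8])
noisy_action = np.clip(action + noise.sample(), 0.0, 1.0)
print(noisy_action)
```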

Posted Content
TL;DR: This paper considers an uplink reconfigurable intelligent surface (RIS)-aided massive multiple-input multiple-output (MIMO) system, where the phase shifts of the RIS are designed relying on statistical channel state information (CSI).
Abstract: This paper considers an uplink reconfigurable intelligent surface (RIS)-aided massive multiple-input multiple-output (MIMO) system with statistical channel state information (CSI). The RIS is deployed to help conventional massive MIMO networks serve users in dead zones. We consider the Rician channel model and exploit the long-term statistical CSI to design the phase shifts of the RIS, while the maximum ratio combining (MRC) technique is applied for active beamforming at the base station (BS), relying on the instantaneous CSI. Firstly, we reveal the power scaling laws and derive closed-form expressions for the uplink achievable rate that hold for arbitrary numbers of BS antennas. Based on the theoretical expressions, we discuss the rate performance in some special cases and provide the average asymptotic rate when random phase shifts are used. Then, we consider the sum-rate maximization and minimum-user-rate maximization problems by optimizing the phase shifts at the RIS. However, these two optimization problems are challenging to solve due to the complicated data rate expression. To solve them, we propose a novel genetic algorithm (GA) with low complexity that nevertheless achieves considerable performance. Finally, extensive simulations are provided to validate the benefits of integrating an RIS into conventional massive MIMO systems. Moreover, our simulations demonstrate the feasibility of deploying large-size but low-resolution RISs in massive MIMO systems.

Journal ArticleDOI
TL;DR: A framework integrating energy, computation and communication (ECC), and a joint beamforming design algorithm for the BS and the IoT devices to improve the overall performance of ECC are designed.
Abstract: To jointly address the critical issues of the sixth-generation (6G) cellular internet of things (IoT), i.e., energy supply, data aggregation, and information transmission, we design a framework integrating energy, computation and communication (ECC). Firstly, the base station (BS) charges massive IoT devices simultaneously via wireless power transfer (WPT) in the downlink. Then, IoT devices with the harvested energy carry out the computation task and the communication task in the uplink over the same spectrum. To improve the overall performance of ECC, we propose a joint beamforming design algorithm for the BS and the IoT devices. Finally, simulation results validate the effectiveness of the proposed algorithm in 6G cellular IoT.

Posted Content
TL;DR: This article investigates the resource allocation design for intelligent reflecting surface (IRS)-assisted full-duplex (FD) cognitive radio systems and develops an iterative block coordinate descent (BCD)-based algorithm to efficiently handle the resulting approximated optimization problem.
Abstract: In this paper, we investigate the resource allocation design for intelligent reflecting surface (IRS)-assisted full-duplex (FD) cognitive radio systems. In particular, a secondary network employs an FD base station (BS) for serving multiple half-duplex downlink (DL) and uplink (UL) users simultaneously. An IRS is deployed to enhance the performance of the secondary network while helping to mitigate the interference caused to the primary users (PUs). The DL transmit beamforming vectors and the UL receive beamforming vectors at the FD BS, the transmit power of the UL users, and the phase shift matrix at the IRS are jointly optimized for maximization of the total sum rate of the secondary system. The design task is formulated as a non-convex optimization problem taking into account the imperfect knowledge of the PUs' channel state information (CSI) and their maximum interference tolerance. Since the maximum interference tolerance constraint is intractable, we apply a safe approximation to transform it into a convex constraint. To efficiently handle the resulting approximated optimization problem, which is still non-convex, we develop an iterative block coordinate descent (BCD)-based algorithm. This algorithm exploits semidefinite relaxation, a penalty method, and successive convex approximation and is guaranteed to converge to a stationary point of the approximated optimization problem. Our simulation results do not only reveal that the proposed scheme yields a substantially higher system sum rate for the secondary system than several baseline schemes, but also confirm its robustness against CSI uncertainty. Besides, our results illustrate the tremendous potential of IRS for managing the various types of interference arising in FD cognitive radio networks.

Proceedings ArticleDOI
25 May 2020
TL;DR: In this article, the authors analyze the potential of RIS in enhancing cellular communications for UAVs, which currently suffers from poor signal strength due to the down-tilt of base station antennas optimized to serve ground users.
Abstract: Intelligent reflective surfaces (IRSs) capable of reconfiguring their electromagnetic absorption and reflection properties in real time are offering unprecedented opportunities to enhance the wireless communication experience in challenging environments. In this paper, we analyze the potential of IRSs in enhancing cellular communications for UAVs, which currently suffer from poor signal strength due to the down-tilt of base station antennas optimized to serve ground users. We consider the deployment of IRSs on building walls, which can be remotely configured by cellular base stations to coherently direct the reflected radio waves towards specific UAVs in order to increase their received signal strengths. Using the recently released 3GPP ground-to-air channel models, we analyze the signal gains at UAVs due to the IRS deployments as a function of UAV height as well as various IRS parameters including size, altitude, and distance from the base station. Our analysis suggests that even with a small IRS, we can achieve significant signal gain for UAVs flying above the cellular base station. We also find that the maximum gain can be achieved by optimizing the location of the IRS, including its altitude and distance to the BS.

Journal ArticleDOI
TL;DR: In this paper, an unmanned aerial vehicle (UAV)-aided cellular framework against jamming is presented, in which a UAV uses reinforcement learning to choose the relay policy for a mobile user whose serving base station is attacked by a jammer.
Abstract: Cellular systems have to resist smart jammers that can optimize their selection of jamming channels and powers based on the estimated ongoing network states. In this article, we present an unmanned aerial vehicle (UAV)-aided cellular framework against jamming, in which a UAV uses reinforcement learning to choose the relay policy for a mobile user whose serving base station is attacked by a jammer. More specifically, the UAV applies deep reinforcement learning and transfer learning to help cellular systems resist smart jamming without knowing the cellular topology, the message generation model, the server computation model, or the jamming model, based on previous anti-jamming relay experience and the observed current communication status. The performance bound in terms of the bit error rate and the UAV energy consumption is derived from the Nash equilibrium of the studied dynamic relay game and verified via simulations. Simulation results show that this scheme can reduce the bit error rate and save UAV energy consumption in comparison with the benchmark.
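
The relay-selection loop can be conveyed with a tabular Q-learning toy, where the state is the channel the jammer hit last and the action is the relay channel; the jammer behavior, reward, and parameters are made up, and the paper's deep RL and transfer learning are not reproduced.

```python
import numpy as np
import random
random.seed(0)

N_CHANNELS = 4                            # candidate relay channels
# State = channel the jammer used last; action = relay channel to use next
Q = np.zeros((N_CHANNELS, N_CHANNELS))
alpha, gamma, eps = 0.1, 0.9, 0.1

def step(state, action):
    """Toy jammer: it tends to stay on the channel it jammed last (= state).
    The reward trades off a made-up bit error rate against relay energy."""
    jammed = state if random.random() < 0.8 else random.randrange(N_CHANNELS)
    ber = 0.5 if action == jammed else 0.01
    energy = 0.1 * (action + 1)           # higher-index channels assumed costlier
    return jammed, -(10.0 * ber + energy)

state = 0
for _ in range(20000):
    action = random.randrange(N_CHANNELS) if random.random() < eps else int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    Q[state, action] += alpha * (reward + gamma * np.max(Q[next_state]) - Q[state, action])
    state = next_state

print("relay channel chosen per observed jamming state:", np.argmax(Q, axis=1))
```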