Book

4G: LTE/LTE-Advanced for Mobile Broadband

TL;DR: In this article, the authors focus on LTE with full updates including LTE-Advanced to provide a complete picture of the LTE system, including the physical layer, access procedures, broadcast, relaying, spectrum and RF characteristics, and system performance.
Abstract: Based on the bestseller "3G Evolution - HSPA and LTE for mobile broadband" and reflecting the ongoing success of LTE throughout the world, this book focuses on LTE with full updates including LTE-Advanced to provide a complete picture of the LTE system. Overview and detailed explanations are given for the latest LTE standards for radio interface architecture, the physical layer, access procedures, broadcast, relaying, spectrum and RF characteristics, and system performance. Key technologies presented include multi-carrier transmission, advanced single-carrier transmission, advanced receivers, OFDM, MIMO and adaptive antenna solutions, advanced radio resource management and protocols, and different radio network architectures. Their role and use in the context of mobile broadband access in general is explained. Both a high-level overview and more detailed step-by-step explanations of the LTE/LTE-Advanced implementation are given. An overview of other related systems such as GSM/EDGE, HSPA, CDMA2000, and WiMAX is also provided. This book is a 'must-have' resource for engineers and other professionals in the telecommunications industry, working with cellular or wireless broadband technologies, giving an understanding of how to utilize the new technology in order to stay ahead of the competition. The authors of the book all work at Ericsson Research and have been deeply involved in 3G and 4G development and standardisation since the early days of 3G research. They are leading experts in the field and are today still actively contributing to the standardisation of LTE within 3GPP.
  • Includes full details of the latest additions to the LTE Radio Access standards and technologies up to and including 3GPP Release 10
  • Clear explanations of the role of the underlying technologies for LTE, including OFDM and MIMO
  • Full coverage of LTE-Advanced, including LTE carrier aggregation, extended multi-antenna transmission, relaying functionality and heterogeneous deployments
  • LTE radio interface architecture, physical layer, access procedures, MBMS, RF characteristics and system performance covered in detail
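
To make the OFDM concept concrete (one of the key technologies the book explains), the following Python sketch builds a single OFDM symbol with an LTE-like numerology: subcarrier mapping around DC, an IFFT, and a cyclic prefix. The FFT size, cyclic-prefix length, and QPSK mapping are illustrative assumptions, not details taken from the book.

import numpy as np

def ofdm_symbol(data_symbols, fft_size=2048, cp_len=144):
    # Map frequency-domain data symbols onto subcarriers around DC (DC left
    # empty), transform to the time domain, and prepend a cyclic prefix.
    # Toy transmit chain only, not the full LTE resource-grid mapping.
    half = len(data_symbols) // 2
    grid = np.zeros(fft_size, dtype=complex)
    grid[1:half + 1] = data_symbols[:half]         # positive-frequency subcarriers
    grid[-half:] = data_symbols[half:]             # negative-frequency subcarriers
    time = np.fft.ifft(grid) * np.sqrt(fft_size)
    return np.concatenate([time[-cp_len:], time])  # cyclic prefix + OFDM symbol

# example: 1200 QPSK symbols, roughly the occupied subcarriers of a 20 MHz carrier
qpsk = (np.sign(np.random.randn(1200)) + 1j * np.sign(np.random.randn(1200))) / np.sqrt(2)
tx = ofdm_symbol(qpsk)

A receiver would strip the cyclic prefix and apply an FFT, which is what makes per-subcarrier equalization in OFDM so simple.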
Citations
Patent
21 Jan 2013
TL;DR: In this article, a terminal device is described which communicates with a base station supporting a host carrier comprising radio resource segments distributed in time, each radio resource segment providing radio resources across a first frequency band.
Abstract: A terminal device is described which communicates with a base station supporting a host carrier comprising radio resource segments distributed in time, each radio resource segment providing radio resources across a first frequency band. The terminal device is operable to communicate with said base station using a subordinate carrier comprising radio resources within a second frequency band of a subset of the radio resource segments of the host carrier, the second frequency band being narrower than and contained within the first frequency band. In this way, an efficient and flexible mechanism for specifying and controlling an amount and location of radio resources to be dedicated to a subordinate carrier for use by a particular (e.g. reduced capability) type of device can be provided.
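
As a rough illustration of the containment constraints described in the abstract, the sketch below models a host and a subordinate carrier and checks that the subordinate band is narrower than and contained within the host band and that it uses only a subset of the host's time-distributed resource segments. The Carrier structure and its field names are hypothetical, not taken from the patent.

from dataclasses import dataclass

@dataclass
class Carrier:
    f_low_hz: float
    f_high_hz: float
    segments: frozenset    # indices of the time-distributed resource segments used

def subordinate_is_valid(host, sub):
    # Check the two constraints from the abstract: the subordinate band is
    # narrower than and contained within the host band, and the subordinate
    # carrier only uses a subset of the host's resource segments.
    contained = host.f_low_hz <= sub.f_low_hz and sub.f_high_hz <= host.f_high_hz
    narrower = (sub.f_high_hz - sub.f_low_hz) < (host.f_high_hz - host.f_low_hz)
    return contained and narrower and sub.segments <= host.segments

# example: a 2 MHz subordinate carrier inside a 20 MHz host, active in 3 of 10 segments
host = Carrier(0.0, 20e6, frozenset(range(10)))
sub = Carrier(2e6, 4e6, frozenset({0, 4, 8}))
print(subordinate_is_valid(host, sub))   # True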

20 citations

Posted Content
TL;DR: An experimental evaluation of several aspects of the use of deep neural networks in the context of channel state information (CSI)-based localization for Massive MIMO cellular systems, including localization accuracy, generalization capability, and data aging.
Abstract: We consider the use of deep neural networks (DNNs) in the context of channel state information (CSI)-based localization for Massive MIMO cellular systems. We discuss the impairments that are likely to be present in practical CSI estimates, and introduce a principled approach to feature design for CSI-based DNN applications based on the objective of making the features invariant to the considered impairments. We demonstrate the efficiency of this approach by applying it to a dataset of geo-tagged CSI measured in an outdoor campus environment, and training a DNN to estimate the position of the UE on the basis of the CSI. We provide an experimental evaluation of several aspects of that learning approach, including localization accuracy, generalization capability, and data aging.
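
One plausible instance of the impairment-invariant feature design mentioned in the abstract is sketched below: frequency-lag autocorrelations of the CSI, whose magnitudes are unaffected by a global phase rotation or a common timing offset (which appears as a linear phase across subcarriers). The array shapes and choice of lags are assumptions for illustration; the paper's actual feature set may differ.

import numpy as np

def invariant_csi_features(H, max_lag=16):
    # H: complex CSI of shape (n_antennas, n_subcarriers).
    # For each lag k, a timing offset contributes a phase that depends only on k,
    # so taking the magnitude removes it along with any global phase rotation.
    feats = []
    for k in range(1, max_lag + 1):
        r = np.sum(H[:, k:] * np.conj(H[:, :-k]), axis=1)   # per-antenna lag-k correlation
        feats.append(np.abs(r))
    return np.concatenate(feats)   # length n_antennas * max_lag

# example: 64 antennas, 100 subcarriers of synthetic CSI
H = np.random.randn(64, 100) + 1j * np.random.randn(64, 100)
x = invariant_csi_features(H)

The resulting feature vector could then be fed to any regressor trained on the geo-tagged positions.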

20 citations

Journal ArticleDOI
TL;DR: This paper develops an analytical framework targeting the downlink performance evaluation of FFR-aided orthogonal frequency division multiple access-based two-tier heterogeneous networks and proposes different optimization designs of the FFR component that allow a tradeoff between throughput performance and fairness.
Abstract: Two-tier networks that combine an operator-managed infrastructure of macrocell base stations with a user-deployed network of femtocells have recently emerged in the context of modern wireless standards as a solution to meet the ambitious performance requirements envisaged in 4G/5G architectures. Most often, these systems require interference coordination schemes that allow near-universal frequency reuse while maintaining considerably high signal-to-interference-plus-noise ratio levels across the coverage area. In particular, fractional frequency reuse (FFR) and its variants are deemed to play a fundamental role in the next generation of cellular systems. This paper develops an analytical framework targeting the downlink performance evaluation of FFR-aided orthogonal frequency division multiple access-based two-tier heterogeneous networks. In the considered scenario, the macrocell and femtocell tiers are assumed to be uncoordinated and co-channel deployed, thus representing a worst-case scenario in terms of inter-tier interference. The proposed framework allows the evaluation of the impact produced by both inter- and co-tier interference on the performance of either the macro-users (MUs) or the femto-users. Analytical results are used to optimize the FFR parameters as a function of, for example, the density of MUs per cell, the resource block scheduling policy, the density of femto base stations per area unit, or the degree of isolation provided by wall penetration losses. Moreover, different optimization designs of the FFR component are proposed that allow a tradeoff between throughput performance and fairness by suitably dimensioning the FFR inner and outer areas and the corresponding frequency allocation.
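
The flavour of such an evaluation can be reproduced with a crude Monte-Carlo sketch like the one below: a macro user and Poisson-distributed co-channel femtocells are dropped in a single macrocell, and the user's downlink SINR is computed under a simple FFR rule (outer-region users are assumed to see only a third of the femto interferers thanks to the sub-band split). All parameters and the geometry are toy assumptions, not the paper's analytical model.

import numpy as np

rng = np.random.default_rng(0)

R_CELL = 500.0                    # macrocell radius [m] (assumed)
P_MACRO, P_FEMTO = 43.0, 20.0     # transmit powers [dBm] (assumed)
ALPHA = 3.5                       # path-loss exponent (assumed)
WALL_LOSS_DB = 10.0               # penetration loss isolating indoor femtos
NOISE_DBM = -104.0
N_FEMTO_MEAN = 20                 # mean number of co-channel femtocells per macrocell
FFR_INNER_R = 250.0               # users beyond this radius are scheduled on an FFR sub-band

def dbm_to_mw(x):
    return 10.0 ** (x / 10.0)

def macro_user_sinr(n_drops=5000):
    # Drop a macro user uniformly in the cell and Poisson-distributed femtocells;
    # femto-to-user distances are modelled very crudely as uniform in the cell.
    sinr = np.empty(n_drops)
    for i in range(n_drops):
        r_u = R_CELL * np.sqrt(rng.uniform())
        signal = dbm_to_mw(P_MACRO) * r_u ** (-ALPHA)
        n_f = rng.poisson(N_FEMTO_MEAN)
        d_f = np.maximum(R_CELL * np.sqrt(rng.uniform(size=n_f)), 10.0)
        if r_u >= FFR_INNER_R:                       # outer region: only 1/3 of femtos share the sub-band
            d_f = d_f[rng.uniform(size=n_f) < 1.0 / 3.0]
        interference = dbm_to_mw(P_FEMTO - WALL_LOSS_DB) * np.sum(d_f ** (-ALPHA))
        sinr[i] = signal / (interference + dbm_to_mw(NOISE_DBM))
    return sinr

print("median macro-user SINR [dB]:", round(10 * np.log10(np.median(macro_user_sinr())), 1))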

20 citations

Journal ArticleDOI
TL;DR: A decoupled learning strategy is developed to jointly and dynamically adapt the access control factors of those three access schemes, where a Recurrent Neural Network model is first employed to predict the real-time traffic values of the network environment, and multiple DRL agents are employed to cooperatively configure parameters of each RACH scheme.
Abstract: Cellular-based networks are expected to offer connectivity for massive Internet of Things (mIoT) systems. However, their Random Access CHannel (RACH) procedure suffers from unreliability due to collisions caused by simultaneous massive access. Although this collision problem has been treated in existing RACH schemes, these schemes usually organize IoT devices’ transmission and re-transmission with fixed parameters and thus can hardly adapt to time-varying traffic patterns. Without adaptation, the RACH procedure easily suffers from high access delay, high energy consumption, or even access unavailability. With the goal of improving the RACH procedure, this paper optimizes the RACH procedure in real time by maximizing a long-term hybrid multi-objective function, which consists of the number of devices that successfully complete access, the average energy consumption, and the average access delay. To do so, we first optimize the long-term objective of the number of successfully accessing devices by using Deep Reinforcement Learning (DRL) algorithms for different RACH schemes, including Access Class Barring (ACB), Back-Off (BO), and Distributed Queuing (DQ). The convergence capability and efficiency of different DRL algorithms, including Policy Gradient (PG), Actor-Critic (AC), Deep Q-Network (DQN), and Deep Deterministic Policy Gradient (DDPG), are compared. Inspired by the results of this comparison, a decoupled learning strategy is developed to jointly and dynamically adapt the access control factors of those three access schemes. This decoupled strategy integrates predicted traffic into the learning process to improve training efficiency: a Recurrent Neural Network (RNN) model is first employed to predict the real-time traffic values of the network environment, and then multiple DRL agents are employed to cooperatively configure the parameters of each RACH scheme. Our results demonstrate that the decoupled strategy remarkably accelerates training.
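
For context, the sketch below simulates a single contention-based RACH opportunity under Access Class Barring and counts how many devices succeed, i.e. pick a preamble that no other device picked; this is the kind of quantity the DRL agents above are tuned to maximize. The number of preambles, the device count, and the barring factors swept are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)

def rach_slot_successes(n_backlogged, p_acb, n_preambles=54):
    # One RACH opportunity under ACB: each backlogged device transmits with
    # probability p_acb and picks a preamble uniformly at random; a preamble
    # chosen by exactly one device succeeds, otherwise the colliding devices'
    # Msg 3 cannot be decoded and they must re-attempt later.
    attempting = rng.uniform(size=n_backlogged) < p_acb
    picks = rng.integers(n_preambles, size=int(attempting.sum()))
    counts = np.bincount(picks, minlength=n_preambles)
    return int(np.sum(counts == 1))

# sweep the ACB factor: too small wastes preambles, too large causes collisions
for p in (0.1, 0.3, 0.6, 1.0):
    mean_succ = np.mean([rach_slot_successes(200, p) for _ in range(2000)])
    print(f"p_acb={p:.1f}: {mean_succ:.1f} successful devices on average")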

20 citations


Additional excerpts

  • ...may encompass mixtures of these schemes [21], as each is...


  • ...This collision comes from the fact that the BS cannot decode the Msg 3 of RACH, due to the overlapped transmissions from collided devices using the same channel at the same time [21]....


Journal ArticleDOI
Xiao-Ya Li, Jiandong Li, Wei Liu, Yan Zhang, Heng-Sheng Shan
TL;DR: A joint power and resource block (RB) allocation (JPRBA) algorithm with low complexity is proposed, which addresses the intra- and inter-cell interference management problem for multicell device-to-device (D2D) communication underlaying an LTE-Advanced network.
Abstract: In this paper, a joint power and resource block (RB) allocation (JPRBA) algorithm with low complexity is proposed, which addresses the intra- and inter-cell interference management problem for multicell device-to-device (D2D) communication underlaying an LTE-Advanced network. We first introduce a power control and resource allocation vector (PORAVdm) for each D2D transmitter; the set of all PORAVdm serves two functions: one is to select appropriate reused RBs for each D2D link, whereas the other is to determine the optimal power for D2D transmitters on each selected RB. To obtain the appropriate PORAVdms, we exploit the group sparse structure to formulate a sum-rate maximization problem (referred to as the group least absolute shrinkage and selection operator programming). Then we derive the stationary solution by solving its equivalent sparse weighted mean square error minimization problem. Finally, simulation results show that the proposed JPRBA algorithm can efficiently improve the total throughput.
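
For contrast with the JPRBA algorithm, the sketch below shows a naive greedy baseline (not the paper's method): each D2D pair independently picks the resource block on which its SINR-based rate is largest and transmits at full power, ignoring the coupling between links that the group-sparse formulation is designed to handle. All channel gains and power values are toy assumptions.

import numpy as np

rng = np.random.default_rng(2)

def greedy_d2d_allocation(g_d2d, g_cell_to_d2d, p_max=0.1, p_cell=0.2, noise=1e-13):
    # Naive baseline: g_d2d[d, b] is the channel gain of D2D link d on RB b,
    # g_cell_to_d2d[d, b] the interference gain from the cellular user reusing RB b.
    # Each D2D pair picks the single RB maximizing its SINR at full power p_max.
    n_d2d, _ = g_d2d.shape
    sinr = (p_max * g_d2d) / (p_cell * g_cell_to_d2d + noise)
    best_rb = np.argmax(sinr, axis=1)                          # one RB per D2D pair
    rates = np.log2(1 + sinr[np.arange(n_d2d), best_rb])       # spectral efficiency [bit/s/Hz]
    return best_rb, rates

# example: 4 D2D pairs, 10 RBs, Rayleigh-faded toy gains
g_direct = rng.exponential(1e-9, size=(4, 10))
g_interf = rng.exponential(1e-11, size=(4, 10))
rb, r = greedy_d2d_allocation(g_direct, g_interf)
print("chosen RBs:", rb, " rates [bit/s/Hz]:", np.round(r, 2))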

20 citations


Cites background from "4G: LTE/LTE-Advanced for Mobile Broadband"

  • ...where P_{2dk,max} is the maximum permitted transmit power for the 2dk-th UE_dm [28]....
