scispace - formally typeset
Topic

LTE Advanced

About: LTE Advanced is a research topic. Over its lifetime, 4,055 publications have been published within this topic, receiving 74,262 citations. The topic is also known as: Long-Term Evolution Advanced & LTE-A.


Papers
Proceedings ArticleDOI
06 Apr 2014
TL;DR: This paper proposes a very tight coupling solution between LTE and Wi-Fi that can be used to enhance offloading procedures, and describes how user packets are transmitted.
Abstract: Wi-Fi access points are now widely deployed by customers or by operators. This represents an interesting solution to offload LTE networks. In this paper we propose a very tight coupling solution between LTE and Wi-Fi, which can be used to enhance the offloading procedures. In this architecture PDCP (Packet Data Convergence Protocol) is used as the common layer between LTE and Wi-Fi, and the security procedures defined for LTE are reused for Wi-Fi transmission. It is thus possible to use Wi-Fi transmissions even when a terminal is covered by a Wi-Fi access point only for a short period. We describe the entities of this architecture and the protocol stack, and explain how user packets are transmitted.

37 citations
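The core of the very-tight-coupling idea is that both radio links sit under one PDCP entity, sharing a single sequence-number space and security context. The sketch below is a toy illustration of that idea, not the paper's implementation: the names `Link` and `PdcpEntity` and the XOR "cipher" (standing in for the LTE security procedures) are all assumptions of this sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Link:
    """One radio leg (LTE or Wi-Fi) below the common PDCP layer."""
    name: str
    available: bool = True
    delivered: list = field(default_factory=list)

    def send(self, pdu):
        self.delivered.append(pdu)

class PdcpEntity:
    """Common PDCP layer: one ciphering context, two radio links below it."""
    def __init__(self, lte: Link, wifi: Link, key: int):
        self.lte, self.wifi, self.key = lte, wifi, key
        self.sn = 0  # PDCP sequence number shared by both links

    def cipher(self, payload: bytes) -> bytes:
        # Toy XOR cipher standing in for the reused LTE security procedures
        return bytes(b ^ self.key for b in payload)

    def transmit(self, payload: bytes) -> str:
        pdu = (self.sn, self.cipher(payload))
        self.sn += 1
        # Prefer Wi-Fi offload whenever the AP is in coverage
        link = self.wifi if self.wifi.available else self.lte
        link.send(pdu)
        return link.name
```

Because the sequence numbers and keys live above both links, traffic can fall back from Wi-Fi to LTE mid-stream without renegotiating security, which is what makes short Wi-Fi coverage windows usable:

```python
lte, wifi = Link("LTE"), Link("WiFi")
pdcp = PdcpEntity(lte, wifi, key=0x5A)
pdcp.transmit(b"pkt0")     # goes over Wi-Fi
wifi.available = False     # terminal leaves AP coverage
pdcp.transmit(b"pkt1")     # falls back to LTE, same PDCP context
```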

Proceedings ArticleDOI
TL;DR: In this paper, the authors proposed a patch for the LTE module of ns-3, one of the most prominent open-source network simulators, to improve the accuracy of the routine that simulates the LTE Random Access Channel (RACH).
Abstract: Several studies assert that the random access procedure of the Long Term Evolution (LTE) cellular standard may not be effective whenever a massive number of simultaneous connection attempts are performed by terminals, as may happen in a typical Internet of Things or Smart City scenario. Nevertheless, simulation studies in real deployment scenarios are missing because many system-level simulators do not implement the LTE random access procedure in detail. In this paper, we propose a patch for the LTE module of ns-3, one of the most prominent open-source network simulators, to improve the accuracy of the routine that simulates the LTE Random Access Channel (RACH). The patched version of the random access procedure is compared with the default one and the issues arising from massive simultaneous access from mobile terminals in LTE are assessed via a simulation campaign.

37 citations
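The contention problem the patch targets can be illustrated with a toy slotted model of the RACH preamble phase (this simplified model is an assumption of the sketch, not the ns-3 patch itself): each terminal picks one of the 54 contention-based preambles at random, and only a preamble chosen by exactly one terminal succeeds, so the per-terminal success rate collapses under massive simultaneous access.

```python
import random

def rach_attempt(n_terminals: int, n_preambles: int = 54, seed: int = 0) -> int:
    """One RACH opportunity: each terminal picks a random preamble;
    a terminal succeeds only if no other terminal chose the same one.
    Returns the number of successful terminals."""
    rng = random.Random(seed)
    picks = [rng.randrange(n_preambles) for _ in range(n_terminals)]
    counts = {}
    for p in picks:
        counts[p] = counts.get(p, 0) + 1
    return sum(1 for c in counts.values() if c == 1)
```

With 10 terminals most attempts succeed; with 500 terminals (an IoT/Smart City burst) nearly every preamble collides, which is exactly the regime where an accurate RACH model matters.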

Journal ArticleDOI
TL;DR: Novel models to estimate PAD, PLR, and energy consumption for MIDs are devised, specifically for the group paging mechanism, and the superiority of the proposed approach over a random grouping approach in terms of the energy consumption of MIDs is demonstrated.
Abstract: The latest evolution of cellular technologies, i.e., 5G, including long term evolution-advanced (LTE-A) Pro and 5G new radio, promises enhancements to mobile technologies for the Internet of Things (IoT). Despite 5G’s vision to cater to IoT, some aspects are still optimized for human-to-human (H2H) communication. More specifically, the existing group paging mechanism in LTE-A Pro has not yet clearly defined approaches to group mobile IoT devices (MIDs) having diverse characteristics, such as discontinuous reception (DRX) and data transmission frequency (DTF), with various mobility patterns. Inappropriate grouping of MIDs may lead to increased energy consumption and degraded quality of service, especially in terms of packet arrival delay (PAD) and packet loss rate (PLR). Therefore, in this paper, we devise novel models to estimate PAD, PLR, and energy consumption for MIDs, specifically for the group paging mechanism. Based on the proposed models, we formulate an optimization problem with the objective of minimizing the energy consumption of MIDs while providing the required PAD and PLR. The nonlinear convex optimization problem addressed herein is solved using the Lagrangian approach, and the Karush–Kuhn–Tucker conditions are applied to derive the optimal characteristics, namely DRX and DTF, for MIDs to join a group. The extensive numerical results verify the effectiveness of the proposed method, and the mathematical models demonstrate the superiority of our approach over the random grouping approach in terms of the energy consumption of MIDs.

37 citations
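Why characteristic-aware grouping beats random grouping can be shown with a deliberately simplified energy model (an assumption of this sketch, not the paper's closed-form models): if a group is paged at the shortest DRX cycle among its members, a device with a longer cycle wakes up more often than it needs to, so grouping devices with similar DRX cycles wastes fewer wake-ups.

```python
def group_energy(groups):
    """Toy energy cost: a group is paged at the shortest DRX cycle of
    its members, so each device pays own_cycle / group_cycle wake-ups
    relative to its own need (1.0 is optimal)."""
    cost = 0.0
    for g in groups:
        page_cycle = min(g)
        cost += sum(cycle / page_cycle for cycle in g)
    return cost

def group_by_drx(cycles, n_groups):
    """Characteristic-aware grouping: sort by DRX cycle, then split into
    contiguous chunks so similar devices share a group."""
    s = sorted(cycles)
    size = -(-len(s) // n_groups)  # ceiling division
    return [s[i:i + size] for i in range(0, len(s), size)]
```

With devices whose DRX cycles are [1, 1, 2, 2, 8, 8, 16, 16], sorting into two homogeneous groups costs 12 toy units, while a random-style mix like [[1, 8, 2, 16], [1, 8, 2, 16]] costs 54, since every long-cycle device is dragged to the 1-unit paging rate.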

Proceedings ArticleDOI
Qinlong Qiu, Jian Chen, Lingdi Ping, Qifei Zhang, Xuezeng Pan
14 Dec 2009
TL;DR: This paper introduces how to build a sufficiently accurate LTE/SAE model in NS-2 so that other optimization features can be tested, and gives one example demonstrating how to use the model.
Abstract: Expectations and requirements for future wireless communication systems continue to grow and evolve. Thus, 3GPP has considered LTE/SAE to ensure its competitiveness in the future. In LTE/SAE, one of the recurring problems is dimensioning and testing. Modeling is an effective way to address this problem, because a model makes it easy to generate test scenarios and is inexpensive when changing test configurations and running test cases. This paper introduces how to build a sufficiently accurate LTE/SAE model in NS-2 so that other optimization features can be tested. This open-source model includes a traffic model and a network model. The network model concentrates on the air interface and the S1 interface. At the end of the paper, one example is given to demonstrate how to use the model.

37 citations
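The traffic-model/network-model split the paper describes can be sketched as a minimal event-driven simulation (a generic Python illustration of the structure, not the NS-2 model itself): the traffic model supplies packet arrival times, and the network model, here a single FIFO link standing in for the air interface, serves them in order.

```python
import heapq

def simulate(arrivals, service_time):
    """Traffic model: a list of packet arrival times.
    Network model: one FIFO link that takes `service_time` per packet.
    Returns the departure time of each packet."""
    events = [(t, "arrival") for t in arrivals]
    heapq.heapify(events)           # event queue ordered by time
    free_at = 0.0                   # when the link next becomes idle
    departures = []
    while events:
        t, _ = heapq.heappop(events)
        start = max(t, free_at)     # wait if the link is busy
        free_at = start + service_time
        departures.append(free_at)
    return departures
```

Keeping the two models separate is what makes test configurations cheap to change: swapping the arrival list changes the scenario without touching the link model, which is the property the paper exploits for dimensioning and testing.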

Journal ArticleDOI
TL;DR: The potential benefit of global cell selection versus the current local, SNR-based decision protocol is studied, along with the new possibility, available in OFDMA-based systems, of satisfying the minimal demand of a mobile station simultaneously by more than one base station.
Abstract: Cell selection is the process of determining the cell(s) that provide service to each mobile station. Optimizing these processes is an important step toward maximizing the utilization of current and future cellular networks. We study the potential benefit of global cell selection versus the current local, SNR-based decision protocol. In particular, we study the new possibility available in OFDMA-based systems, such as IEEE 802.16m and LTE-Advanced, of satisfying the minimal demand of a mobile station simultaneously by more than one base station. We formalize the problem as an optimization problem, and show that in the general case this problem is not only NP-hard but also cannot be approximated within any reasonable factor. In contrast, under the very practical assumption that the maximum required bandwidth of a single mobile station is at most an r-fraction of the capacity of a base station, we present two different algorithms for cell selection. The first algorithm produces a (1-r)-approximate solution, where a mobile station can be covered simultaneously by more than one base station. The second algorithm produces a (1-r)/(2-r)-approximate solution, while every mobile station is covered by at most one base station. We complete our study with an extensive simulation study demonstrating the benefits of using our algorithms in highly loaded, capacity-constrained future 4G networks, compared to currently used methods. Specifically, our algorithms obtain up to 20 percent better usage of the network's capacity in comparison with the current cell selection algorithms.

37 citations
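The cover-by-many setting, where a mobile station's demand may be split across several covering base stations, can be illustrated with a simplified greedy sketch (this shows the problem setting only; it is not the paper's approximation algorithm, and all names here are hypothetical):

```python
def cover_by_many(demands, capacities, coverage):
    """Greedy illustration of multi-station cell selection.
    demands[i]    : bandwidth required by mobile i
    capacities[b] : capacity of base station b
    coverage[i]   : base stations that can serve mobile i
    Returns per-mobile {station: share} dicts, or None if some
    mobile cannot be fully served."""
    remaining = list(capacities)
    assignment = []
    for i, demand in enumerate(demands):
        shares, need = {}, demand
        for bs in coverage[i]:
            take = min(need, remaining[bs])
            if take > 0:
                shares[bs] = take
                remaining[bs] -= take
                need -= take
            if need == 0:
                break
        if need > 0:
            return None  # this mobile's minimal demand is unsatisfiable
        assignment.append(shares)
    return assignment
```

The example in the test shows the payoff: with two stations of capacity 8 and two mobiles demanding 6 each, splitting the second mobile's demand (2 from one station, 4 from the other) serves everyone, while the classic one-station-per-mobile rule fails, which is the flexibility OFDMA-based systems like LTE-Advanced add.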


Network Information
Related Topics (5)
Wireless network: 122.5K papers, 2.1M citations, 92% related
Wireless ad hoc network: 49K papers, 1.1M citations, 92% related
Wireless: 133.4K papers, 1.9M citations, 91% related
Network packet: 159.7K papers, 2.2M citations, 90% related
Fading: 55.4K papers, 1M citations, 89% related
Performance Metrics
No. of papers in the topic in previous years

Year   Papers
2023   16
2022   42
2021   56
2020   82
2019   135
2018   192