
Showing papers by "Preben Mogensen published in 2012"


Proceedings ArticleDOI
31 Dec 2012
TL;DR: A novel LTE user equipment (UE) power consumption model was developed for LTE system level optimization, because it is important to understand how network settings like scheduling of resources and transmit power control affect the UE's battery life.
Abstract: In this work a novel LTE user equipment (UE) power consumption model is presented. It was developed for LTE system level optimization, because it is important to understand how network settings like scheduling of resources and transmit power control affect the UE's battery life. The proposed model is based on a review of the major power-consuming parts in an LTE UE radio modem. The model includes functions of UL and DL power and data rate. Measurements on a commercial LTE USB dongle were used to assign realistic power consumption values to each model parameter. Verification measurements on the dongle show that the model results in an average error of 2.6%. The measurements show that UL transmit power and DL data rate determine the overall power consumption, while UL data rate and DL receive power have a smaller impact.
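The abstract describes a model built from functions of UL/DL power and data rate. A minimal sketch of the general shape of such a linear model is given below; every coefficient value is invented for illustration and is not one of the paper's measured parameters:

```python
# Hypothetical sketch of a linear LTE UE power model of the general form
# P = P_base + f(P_tx) + a_ul * R_ul + a_dl * R_dl + g(S_rx).
# All coefficient values are illustrative assumptions, NOT the measured
# values from the paper.

def ue_power_mw(tx_dbm, r_ul_mbps, r_dl_mbps, rx_dbm):
    """Estimate UE modem power draw in mW while RRC-connected."""
    p_base = 1250.0                 # baseline connected-mode power (assumed)
    # UL PA power: roughly flat at low Tx power, steep above ~10 dBm (assumed)
    p_tx = 30.0 + 0.8 * tx_dbm if tx_dbm <= 10 else 38.0 + 55.0 * (tx_dbm - 10)
    p_ul_rate = 2.0 * r_ul_mbps     # small per-Mbps UL baseband cost (assumed)
    p_dl_rate = 10.0 * r_dl_mbps    # DL decoding cost dominates (assumed)
    p_rx = 0.5 * max(0.0, -rx_dbm - 60.0)  # weak Rx-level dependence (assumed)
    return p_base + p_tx + p_ul_rate + p_dl_rate + p_rx
```

With coefficients in this shape, high UL transmit power and DL data rate dominate the total, mirroring the paper's qualitative finding.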

130 citations


Journal ArticleDOI
TL;DR: This paper discusses the various degrees of freedom involved, and points out which design paradigms appear most promising and which major fields of future research remain.
Abstract: This article highlights particular challenges inherent in the design and operation of future mobile communications systems. It also discusses the various degrees of freedom involved, and points out which design paradigms appear most promising and which major fields of future research remain.

73 citations


Journal ArticleDOI
TL;DR: This paper addresses the interference management problem in the context of LTE-Advanced femtocells with a distributed carrier-based inter-cell interference coordination scheme that represents one step towards cognitive radio networks.
Abstract: This paper addresses the interference management problem in the context of LTE-Advanced femtocells. Due to the expected large number of user-deployed cells, centralized network planning becomes increasingly less viable. Consequently, we consider an architecture of autonomous decision makers. Our main contribution in this paper, termed Generalized Autonomous Component Carrier Selection (G-ACCS), is a distributed carrier-based inter-cell interference coordination scheme that represents one step towards cognitive radio networks. The algorithm relies on expected rather than sensed interference levels. This approach facilitates scheduler-independent decisions; however, it can lead to overestimation of the interference coupling among cells when the resources are not fully utilized. Acknowledging this fact, G-ACCS leverages the power domain to circumvent the restrictive nature of expected interference coupling. This work focuses on the downlink and also provides an extensive characterization of the network performance as a function of the topology as well as the often overlooked temporal traits of traffic. We compare G-ACCS with other carrier-based solutions, including the simplest universal reuse strategy. The encouraging simulation results demonstrate that G-ACCS achieves an efficient and fair distribution of resources in all considered traffic and deployment conditions. More importantly, this is attained in a truly autonomous fashion, without any explicit parametrization.

58 citations


Proceedings ArticleDOI
01 Dec 2012
TL;DR: The research motivations and high level requirements for a B4G local area concept are discussed, and suggestions on the design of the B4G system as well as on the choice of its key technology components are presented.
Abstract: A next generation Beyond 4G (B4G) radio access technology is expected to become available around 2020 in order to cope with the exponential increase of mobile data traffic. In this paper, research motivations and high level requirements for a B4G local area concept are discussed. Our suggestions on the design of the B4G system as well as on the choice of its key technology components are also presented.

50 citations


Journal ArticleDOI
TL;DR: It is concluded that designs of scheduling algorithms for NRT services for OFDMA systems carried out under the full buffer model assumption may fail to provide the desired performance benefits in realistic scenarios.
Abstract: This article studies the impact on the design of scheduling algorithms for Orthogonal Frequency Division Multiple Access (OFDMA) systems of two traffic models described in the evaluation methodology proposals from standardization bodies: the full buffer and the finite buffer traffic models. The analysis concentrates on utility-based scheduling with an α-fair utility function for Non-Real Time (NRT) services. The results show that a gradient scheduling algorithm is able to maximize the aggregate utility over all the users when the less realistic full buffer model is adopted; but not when the finite buffer model is applied. The results also show that with the full buffer model a gradient scheduler exhibits a trade-off between average user throughput and the user throughput at 5% outage, but it does not when the more realistic finite buffer is used. Therefore, it is concluded that designs of scheduling algorithms for NRT services for OFDMA systems carried out under the full buffer model assumption may fail to provide the desired performance benefits in realistic scenarios. Based on the results presented, a recommendation on scheduling design is provided.
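The α-fair gradient scheduler analyzed in the article has a standard textbook form: each scheduling instant, the resource goes to the user with the largest product of instantaneous rate and marginal utility U'(R) = R^(-α). A minimal sketch of this generic rule (not the authors' simulator) is:

```python
# Sketch of utility-based (gradient) scheduling with an alpha-fair
# utility U(R) = R^(1-alpha) / (1-alpha).  The gradient rule assigns the
# resource to the user maximizing r_i * U'(R_i) = r_i / R_i^alpha, where
# r_i is the instantaneous rate and R_i the average throughput.
# Generic textbook formulation, not the paper's implementation.

def gradient_schedule(inst_rates, avg_rates, alpha):
    """Return the index of the user to schedule this TTI."""
    metrics = [r / (max(R, 1e-9) ** alpha) for r, R in zip(inst_rates, avg_rates)]
    return max(range(len(metrics)), key=metrics.__getitem__)
```

Setting α = 0 yields max-rate scheduling, α = 1 proportional fair, and α → ∞ approaches max-min fairness, which is the fairness/throughput trade-off the article examines under the two traffic models.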

48 citations


Proceedings ArticleDOI
31 Dec 2012
TL;DR: This study shows that a simple geometry-based extension to standard empirical path loss prediction models can give quite reasonable accuracy in predicting the signal strength from tilted base station antennas in small urban macro-cells.
Abstract: Base station antenna downtilt is one of the most important parameters for optimizing a cellular network with tight frequency reuse. By downtilting, inter-site interference is reduced, which leads to an improved performance of the network. In this study we show that a simple geometry-based extension to standard empirical path loss prediction models can give quite reasonable accuracy in predicting the signal strength from tilted base station antennas in small urban macro-cells. Our evaluation is based on measurements on several sectors in a 2.6 GHz Long Term Evolution (LTE) cellular network, with electrical antenna downtilt in the range from 0 to 10 degrees, as well as predictions based on ray-tracing and 3D building databases covering the measurement area. Although the calibrated ray-tracing predictions are highly accurate compared with the measured data, the combined LOS/NLOS COST-WI model with downtilt correction performs similarly for distances above a few hundred meters. Generally, predicting the effect of base station antenna tilt close to the base station is difficult due to multiple vertical sidelobes.
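One plausible form of such a geometrical extension (an assumed illustration using the standard 3GPP parabolic vertical antenna pattern; the paper's exact correction term may differ) adds the vertical-pattern attenuation, evaluated at the geometric elevation angle toward the UE, on top of an empirical path loss model:

```python
import math

# Illustrative geometric downtilt correction: compute the elevation angle
# from the base station antenna to the UE and apply a 3GPP-style parabolic
# vertical antenna pattern, capped at the sidelobe attenuation level.
# The pattern parameters are assumptions, not the paper's calibration.

def tilt_loss_db(distance_m, h_bs_m, h_ue_m, tilt_deg,
                 beamwidth_deg=6.2, sla_db=20.0):
    """Extra loss (dB) from antenna downtilt at a given ground distance."""
    # Elevation angle from antenna to UE (positive = below the horizon)
    theta = math.degrees(math.atan2(h_bs_m - h_ue_m, distance_m))
    # Parabolic vertical pattern, capped at the sidelobe level sla_db
    return min(12.0 * ((theta - tilt_deg) / beamwidth_deg) ** 2, sla_db)
```

Note that close to the base station the elevation angle is large and the single-parabola pattern caps at the sidelobe floor, which is consistent with the paper's remark that prediction near the site is difficult due to multiple vertical sidelobes.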

18 citations


Proceedings ArticleDOI
Liang Hu, Claudio Coletti, Nguyen Huan, Preben Mogensen, Jan Elling
06 May 2012
TL;DR: This paper is envisaged to provide a first quantitative study on how much indoor deployed Wi-Fi can offload the operator's 3G HSPA macro cellular networks in a real large-scale dense-urban scenario.
Abstract: This paper is envisaged to provide a first quantitative study of how much indoor-deployed Wi-Fi can offload an operator's 3G HSPA macro cellular network in a real large-scale dense-urban scenario. Wi-Fi has been perceived as a cost-effective means of adding wireless capacity by leveraging low-cost access points and unlicensed spectrum. However, the quantitative offloading gain that Wi-Fi can achieve is still unknown. We studied the Wi-Fi offloading gain as a function of access point density, and show that 10 access points/km² can already boost average user throughput by 300%, with the gain increasing linearly with access point density. Indoor Wi-Fi deployment also significantly reduces the number of users in outage, especially in indoor areas. A user is considered to be in outage if their throughput is below 512 kbps. We also propose three Wi-Fi deployment algorithms: Traffic-centric, Outage-centric, and Uniform Random. Simulation results show that Traffic-centric performs best in boosting average user throughput, while Outage-centric performs best in reducing user outage. Finally, the Wi-Fi offloading solution is compared with another offloading solution, HSPA femtocells. We show that Wi-Fi provides both much higher average user throughput and greater network outage reduction than HSPA femtocells by exploiting the 20 MHz unlicensed ISM band.
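The outage criterion used in the study is a fixed throughput threshold. A trivial sketch of that metric (throughput values invented for illustration) is:

```python
# Sketch of the outage criterion from the study: a user is in outage if
# its throughput falls below 512 kbps.  The sample values are made up.

OUTAGE_THRESHOLD_KBPS = 512.0

def outage_fraction(user_tputs_kbps):
    """Fraction of users whose throughput is below the outage threshold."""
    in_outage = [t for t in user_tputs_kbps if t < OUTAGE_THRESHOLD_KBPS]
    return len(in_outage) / len(user_tputs_kbps)
```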

18 citations


Proceedings ArticleDOI
01 Apr 2012
TL;DR: A novel method for autonomous selection of spectrum/channels in femtocells is proposed that effectively mitigates co-tier interference with no signaling at all across different femtocells.
Abstract: The traffic in cellular networks has been growing at an accelerated rate. In order to meet the rising demand for large data volumes, shrinking the cell size may be the only viable option. In fact, locally deployed small cells, namely picocells and femtocells, will certainly play a major role in meeting the IMT-Advanced requirements for the next generation of cellular networks. Nevertheless, several aspects of femtocell deployment are very challenging, especially in closed subscriber group femtocells: massive deployment, user definition of access point location, and high density. When such characteristics are combined, the traditional network planning and optimization of cellular networks fails to be cost effective. Therefore, a greater degree of automation is needed in femtocells. In particular, this paper proposes a novel method for autonomous selection of spectrum/channels in femtocells. This method effectively mitigates co-tier interference with no signaling at all across different femtocells. Still, the method has a remarkably simple implementation. The efficiency of the proposed method was evaluated by system level simulations. The results show large throughput gains for the cells with unfavorable geometry, up to nearly 300% when compared to the unmanaged situation (reuse 1).

16 citations


Proceedings ArticleDOI
31 Dec 2012
TL;DR: Results have shown that the proposed MLB scheme can significantly improve overall network resource utilization by eliminating potential load imbalances amongst the deployment layers and consequently enhance user experience; however, this comes at the cost of increased Radio Link Failures (RLF).
Abstract: This paper analyzes the behavior of a distributed Mobility Load Balancing (MLB) scheme in a multi-layer 3GPP (3rd Generation Partnership Project) Long Term Evolution (LTE) deployment with different User Equipment (UE) densities in certain network areas covered with pico cells. The target of the study is to evaluate MLB in terms of efficient pico cell utilization and macro layer load balancing (LB). The analysis focuses on video streaming traffic due to specific service characteristics (e.g. play-out buffer delay/jitter protection) that can make mobility performance degradation transparent to the end user. Results have shown that the proposed MLB scheme can significantly improve overall network resource utilization by eliminating potential load imbalances amongst the deployment layers and consequently enhance user experience. However, this comes at the cost of increased Radio Link Failures (RLF), a fact that might be critical for further applying MLB in real-time conversational services without additional mobility optimization and interference management techniques.

16 citations


Proceedings ArticleDOI
06 May 2012
TL;DR: This work examines the energy saving potential of powering down RRC_connected but unscheduled User Equipment (UE) when it is determined by Fast Control Channel Decoding (FCCD) that the UE is not scheduled to receive downlink data in the current TTI.
Abstract: This work examines the energy saving potential of powering down RRC_connected but unscheduled User Equipment (UE). The idea is to power down energy-consuming circuits in RF and BB when it is determined by Fast Control Channel Decoding (FCCD) that the UE is not scheduled to receive downlink data in the current TTI. The cost is that some reference signals are not received, leading to a degraded channel estimate. Calculations show that this causes an SINR degradation of approximately 0.5 dB, which will result in a throughput loss of at most 4%. Comparing this with energy saving potentials of 5-25%, it is concluded that the FCCD method is a valuable aid to prolong LTE phones' battery lifetime. The results are generated using a two-state Markov chain model to simulate traffic and scheduling, and verified mathematically. The work also includes an examination of the on/off characteristics of various data traffic types and an evaluation of how this relation affects power consumption. The FCCD method can complement DRX sleep mode since it is applicable when the signal is too aperiodic or fast-switching for DRX.
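The traffic and scheduling model is described as a two-state Markov chain. A small sketch of such an on/off simulation with FCCD-style power-down in unscheduled TTIs is given below; the transition probabilities and power figures are assumptions for illustration, not the paper's measured values:

```python
import random

# Sketch of a two-state (on/off) Markov chain for per-TTI scheduling
# activity, used to estimate the benefit of FCCD-style power-down.
# Transition probabilities and power numbers are illustrative assumptions.

def simulate_energy(p_on_to_off, p_off_to_on, n_tti,
                    p_active_mw=1800.0, p_fccd_mw=1400.0, seed=0):
    """Return (duty_cycle, mean_power_mw) with power-down in off-TTIs."""
    rng = random.Random(seed)
    state_on, on_ttis, energy = True, 0, 0.0
    for _ in range(n_tti):
        if state_on:
            on_ttis += 1
            energy += p_active_mw            # fully active TTI
            state_on = rng.random() >= p_on_to_off
        else:
            energy += p_fccd_mw              # RF/BB circuits partly powered down
            state_on = rng.random() < p_off_to_on
    return on_ttis / n_tti, energy / n_tti
```

Sweeping the transition probabilities reproduces the qualitative point of the paper: the more "off" TTIs a traffic type produces, the larger the energy saving from powering down during them.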

15 citations


MonographDOI
01 Jan 2012
TL;DR: This chapter provides architecture designers with a set of guidelines synthesized from an analysis of the state of the art, but enriched with the perspective of the development of future generations of communication systems such as Cognitive Radio.
Abstract: Wireless communications are a fast growing part of the telecommunication market. While new types of traffic and challenges related to the wireless medium are appearing, the methodologies for designing system architectures substantially remain the same. Under the increasing pressure of market, users, and physical medium issues, designers are in need for new approaches. Cross Layer becomes, then, a handy solution for coping with such problems. In fact, it allows both tighter optimizations of the existing functionalities, and the introduction of new ones that do not fit within the traditional protocol stack design methodology. However, Cross Layer also carries a risk due to possibly unexpected and undesired effects. In this chapter, the authors provide architecture designers with a set of guidelines synthesized from an analysis of the state of the art, but enriched with the perspective of the development of future generations of communication systems such as Cognitive Radio.


Proceedings ArticleDOI
31 Dec 2012
TL;DR: The aim is to identify the most cost-economical solution for indoor wireless deployment which meets the indoor capacity need and provides ubiquitous indoor coverage, and to suggest when DAS becomes more economical for large buildings with low data traffic needs.
Abstract: As small cell solutions for efficient in-building wireless deployment, the distributed antenna system (DAS) and femtocells constitute two major options. In this work, we study the Total Cost of Ownership (TCO) in combination with system performance in order to reach technology-economical conclusions regarding these deployment options. Our aim is to identify the most cost-economical solution for indoor wireless deployment which meets the indoor capacity need and provides ubiquitous indoor coverage. In the analysis we focus on LTE enhancement in enterprise buildings. We discuss the unique cost features of DAS and Femto systems and highlight the importance of backhaul visibility in the cost analysis. For enterprise building solutions, we show the cost advantage of low-cost femtocells over DAS in all scenarios, regardless of building size. We also suggest when DAS becomes more economical: for large buildings with low data traffic needs.

Proceedings ArticleDOI
06 May 2012
TL;DR: In this paper, year-by-year network performance is simulated for an example TD-LTE network by using four networks upgrade approaches, including carrier upgrade, macro densification, micro deployment in dedicated carrier and micro deployment re-using macro carrier.
Abstract: The wireless network gradually changes from voice service to broadband data access services and faces unprecedented growth in data traffic. The way to meet future capacity demand is by combining macro site expansion and outdoor micro layer deployment. Thus it is important for the operator to understand the basic cost and performance of different combinations of technologies according to their own network. In this paper, year-by-year network performance is simulated for an example TD-LTE network using four network upgrade approaches: carrier upgrade, macro densification, micro deployment in a dedicated carrier, and micro deployment re-using the macro carrier. Downlink TD-LTE network capacity evolution is analyzed in terms of the needed network upgrades. That is, we examine the number of macro and micro sites that need to be upgraded or added each year for different network evolution paths, while a given network key performance indicator (KPI) is maintained. The network cost in terms of relative Total Cost of Ownership (TCO) is then compared among network evolution paths.

Journal ArticleDOI
TL;DR: Results show that network operators can get relatively close to their targets, with energy reductions of up to 40% noted, and further CO2 emissions can be offset through the use of carbon-neutral energy sources.
Abstract: While operators have to upgrade the capacity of their networks, they have committed themselves to reducing their CO2 emissions, partly by reducing their energy consumption. This article investigates the challenges faced by operators and quantifies, through a number of case studies, the impact of specific solutions and how the energy consumption trend can be expected to develop over the next decade. With different options for upgrading capacity, studies show that a hybrid macro-pico upgrade is more energy-efficient than a macro- or pico-only solution. The study is extended further by quantifying the possible savings of adopting an energy-efficient capacity evolution together with an equipment replacement and site upgrade strategy. Results show that network operators can get relatively close to their targets, with energy reductions of up to 40% noted. While this can be improved further through software-based energy saving features, further CO2 emissions can be offset through the use of carbon-neutral energy sources.

Proceedings ArticleDOI
31 Dec 2012
TL;DR: It is concluded that femto interference does not affect macro downlink (DL) performance as long as the macro Received Signal Code Power (RSCP) is stronger than femto RSCP, and that a macro escape carrier is a robust DL interference management solution.
Abstract: In this paper macro performance in a co-channel macro and femto setup is studied. Measurements are performed in a live Universal Mobile Telecommunication System (UMTS) network. It is concluded that femto interference does not affect macro downlink (DL) performance as long as the macro Received Signal Code Power (RSCP) is stronger than the femto RSCP. We also conclude that a macro escape carrier is a robust DL interference management solution. In the uplink (UL) direction it is shown that a single femto UE close to a macro cell can potentially cause a noise rise of 6 dB in the surrounding macro cell. In order to limit the noise rise from femto UEs, femto UE power capping and lowering the femto common pilot channel (CPICH) power are recommended. The consequence is less uplink interference towards the macro, but also decreased femto coverage. Measurements close to the macro cell centre showed a femto coverage radius smaller than 5 meters with realistic power settings. This makes co-channel femto deployment less promising in dense macro environments with good macro RSCP coverage.
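The reported 6 dB uplink noise rise corresponds, by generic link-budget arithmetic (not the paper's measurement data), to an interference power roughly three times the noise floor:

```python
import math

# Uplink noise rise in dB: noise_rise = 10 * log10((N + I) / N).
# A 6 dB rise means (N + I) / N ~= 4, i.e. interference about three
# times the thermal noise floor.  Generic formula, values illustrative.

def noise_rise_db(noise_mw, interference_mw):
    """Noise rise in dB caused by added interference power."""
    return 10.0 * math.log10((noise_mw + interference_mw) / noise_mw)
```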


Proceedings ArticleDOI
16 Jul 2012
TL;DR: This paper addresses carrier-based inter-cell interference coordination (CB-ICIC) among LTE femtocells operating on a single carrier by showing that conservative methods can achieve most of the benefits of unrestricted off-line coloring algorithms with a very modest number of CCs to choose from.
Abstract: This paper addresses carrier-based inter-cell interference coordination (CB-ICIC) among LTE femtocells operating on a single carrier. CB-ICIC is in many ways linked to the widely investigated dynamic channel assignment problem, which is often studied in the context of graph coloring. The investigation revolves around the sensible definition of the underlying graph, i.e. the network model, rather than focusing on the coloring algorithms and their properties. Ultimately, we posit that improper online graph-coloring suffices and is actually preferable. In short, settling for less-than-optimal configurations avoids uncontrolled service interruptions. Such disruptions tend to raise understandable concerns when it comes to fully autonomous selection of operational CCs. Our results dispel such concerns by showing that conservative methods can achieve most of the benefits of unrestricted off-line coloring algorithms with a very modest number (2-3) of CCs to choose from.
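A conservative online selection rule in the spirit of the paper's improper-coloring argument might look like the following hypothetical sketch (not the authors' algorithm): a cell keeps its current component carrier unless a conflict-free one is available, tolerating collisions rather than forcing disruptive reconfigurations.

```python
# Hypothetical sketch of conservative online carrier selection with a
# small fixed palette of component carriers (CCs).  Collisions (improper
# colorings) are tolerated: a cell never drops service just to resolve
# a conflict.  Illustration only, not the paper's algorithm.

def pick_carrier(neighbor_carriers, n_carriers, current=None):
    """Choose a CC index given the CCs heard from neighboring cells."""
    used = set(neighbor_carriers)
    free = [c for c in range(n_carriers) if c not in used]
    if current is not None and current in free:
        return current               # current carrier is still conflict-free
    if free:
        return free[0]               # switch only to a conflict-free carrier
    # All carriers in use: stay put if possible (improper but avoids
    # service interruption); otherwise take the least-used carrier.
    return current if current is not None else min(
        range(n_carriers), key=lambda c: neighbor_carriers.count(c))
```

With only 2-3 carriers in the palette, a rule of this shape leaves some conflicts unresolved by design, which matches the paper's finding that such conservative methods capture most of the benefit of unrestricted off-line coloring.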

Proceedings ArticleDOI
31 Dec 2012
TL;DR: This paper quantifies the power savings that can be achieved by assuming that the available spectrum for an operator can be reorganized within a single band, and have multiple carriers bundled together to fully exploit the capabilities of modern equipment.
Abstract: Technological improvements and evolving user requirements have led to operators running and supporting three distinct wireless access technologies: GSM, UMTS, and LTE. While the most recent layer (LTE) introduces improvements in spectral efficiency and peak data rates, the remaining layers are still required for supporting legacy devices and providing wider network coverage. In order to facilitate and reduce the cost of rolling out a new network, mobile operators often reuse existing sites. Radio frequency modules in base station sites house power amplifiers, which are designed to operate within a specific frequency band. Since some access technologies have spectrum split across multiple bands, this results in operators installing multiple modules for each access technology. This paper quantifies the power savings that can be achieved by assuming that the available spectrum for an operator can be reorganized within a single band, with multiple carriers bundled together to fully exploit the capabilities of modern equipment. These modifications are applied on all network layers, maintaining the same number of carriers and baseband capacity. For the presented case, this results in the elimination of at least four separate modules in each site, reducing power consumption by 31%. Indirectly, this also translates into a 40% reduction in required site space. These savings are crucial for mobile network operators to reach the energy and carbon emission targets they have committed to.