Other affiliations: Bell Labs
Bio: Esa Tuomaala is an academic researcher from Nokia. The author has contributed to research on topics including wireless networks and the hidden node problem. The author has an h-index of 16 and has co-authored 37 publications receiving 1463 citations. Previous affiliations of Esa Tuomaala include Bell Labs.
09 Jun 2013
TL;DR: This paper considers two of the most prominent wireless technologies available today, namely Long Term Evolution (LTE), and WiFi, and addresses some problems that arise from their coexistence in the same band, and proposes a simple coexistence scheme that reuses the concept of almost blank subframes in LTE.
Abstract: The recent development of regulatory policies that permit the use of TV bands spectrum on a secondary basis has motivated discussion about coexistence of primary (e.g. TV broadcasts) and secondary users (e.g. WiFi users in TV spectrum). However, much less attention has been given to coexistence of different secondary wireless technologies in the TV white spaces. Lack of coordination between secondary networks may create severe interference situations, resulting in less efficient usage of the spectrum. In this paper, we consider two of the most prominent wireless technologies available today, namely Long Term Evolution (LTE), and WiFi, and address some problems that arise from their coexistence in the same band. We perform exhaustive system simulations and observe that WiFi is hampered much more significantly than LTE in coexistence scenarios. A simple coexistence scheme that reuses the concept of almost blank subframes in LTE is proposed, and it is observed that it can improve the WiFi throughput per user up to 50 times in the studied scenarios.
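The almost-blank-subframe idea summarized above can be illustrated with a toy airtime model; the function name, parameters, and the simple saturation assumption are ours for illustration, not taken from the paper's simulator:

```python
def wifi_airtime_share(abs_ratio: float, lte_load: float = 1.0) -> float:
    """Fraction of time Wi-Fi senses an idle channel when the LTE cell
    blanks `abs_ratio` of its subframes and transmits in the remaining
    subframes with activity factor `lte_load` (1.0 = fully loaded)."""
    lte_busy = (1.0 - abs_ratio) * lte_load   # airtime occupied by LTE
    return 1.0 - lte_busy                     # airtime left for Wi-Fi CSMA

# A saturated LTE cell with no blanking starves Wi-Fi entirely;
# blanking half of the subframes hands half of the airtime back.
print(wifi_airtime_share(0.0))  # 0.0
print(wifi_airtime_share(0.5))  # 0.5
```

In this crude model, Wi-Fi throughput scales with the returned airtime share, which is consistent with the paper's observation that blanking LTE subframes can raise Wi-Fi per-user throughput dramatically.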
02 Jun 2013
TL;DR: A simulator-based system-level analysis in order to assess the network performance in an office scenario shows that LTE system performance is slightly affected by coexistence whereas Wi-Fi is significantly impacted by LTE transmissions.
Abstract: The deployment of modern mobile systems has faced severe challenges due to the current spectrum scarcity. The situation has been further worsened by the development of different wireless technologies and standards that can be used in the same frequency band. Furthermore, with the usage of smaller cells (e.g. pico, femto and wireless LAN), coexistence among heterogeneous networks (including among different wireless technologies such as LTE and Wi-Fi deployed in the same frequency band) has been a major field of research in academia and industry. In this paper, we provide a performance evaluation of coexistence between LTE and Wi-Fi systems and show some of the challenges faced by the different technologies. We focus on a simulator-based system-level analysis in order to assess the network performance in an office scenario. Simulation results show that LTE system performance is only slightly affected by coexistence, whereas Wi-Fi is significantly impacted by LTE transmissions. In coexistence, the Wi-Fi channel is most often blocked by LTE interference, forcing the Wi-Fi nodes to stay in LISTEN mode more than 96% of the time. This is directly reflected in the Wi-Fi user throughput, which decreases by 70% to ≈100% depending on the scenario. Finally, some of the main issues that limit LTE/Wi-Fi coexistence and some pointers on the mutual interference management of both systems are provided.
TL;DR: The issues that arise from the concurrent operation of LTE and Wi-Fi in the same unlicensed bands from the point of view of radio resource management are discussed and it is shown that Wi-fi is severely impacted by LTE transmissions.
Abstract: The expansion of wireless broadband access network deployments is resulting in increased scarcity of available radio spectrum. It is very likely that in the near future, cellular technologies and wireless local area networks will need to coexist in the same unlicensed bands. However, the two most prominent technologies, LTE and Wi-Fi, were designed to work in different bands and not to coexist in a shared band. In this article, we discuss the issues that arise from the concurrent operation of LTE and Wi-Fi in the same unlicensed bands from the point of view of radio resource management. We show that Wi-Fi is severely impacted by LTE transmissions; hence, the coexistence of LTE and Wi-Fi needs to be carefully investigated. We discuss some possible coexistence mechanisms and future research directions that may lead to successful joint deployment of LTE and Wi-Fi in the same unlicensed band.
08 Oct 2007
TL;DR: Simulation results show that semi-persistent scheduling can support high system capacity while at the same time guaranteeing the QoS requirements such as packet delay and packet loss rate of VoIP.
Abstract: This paper presents an effective scheduling scheme called semi-persistent scheduling for VoIP service in the LTE system. The main challenges of effectively supporting VoIP service in the LTE system are 1) the tight delay requirement combined with the frequent arrival of small packets of VoIP traffic, and 2) the scarcity of radio resources along with control channel restrictions in the LTE system. Simulation results show that semi-persistent scheduling can support high system capacity while at the same time guaranteeing the QoS requirements of VoIP, such as packet delay and packet loss rate. Furthermore, semi-persistent scheduling requires less control signaling overhead, which is very important for efficient resource utilization in a practical system.
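The control-signaling saving that motivates semi-persistent scheduling can be sketched with a simple grant count; the function name and the one-grant-per-talk-spurt assumption are illustrative, not from the paper:

```python
def control_grants(num_packets: int, semi_persistent: bool) -> int:
    """Downlink control (PDCCH) grants needed to deliver `num_packets`
    VoIP packets arriving every 20 ms: dynamic scheduling signals a grant
    per packet, while semi-persistent scheduling signals one grant and
    reuses the same time/frequency allocation for the whole talk spurt."""
    return 1 if semi_persistent else num_packets

# A 2 s talk spurt at one packet per 20 ms = 100 packets.
print(control_grants(100, semi_persistent=False))  # 100
print(control_grants(100, semi_persistent=True))   # 1
```

The two orders of magnitude between the counts is the overhead reduction that frees the constrained control channel for other users.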
27 Jun 2012
TL;DR: In this article, the authors propose a signaling mechanism for wireless networks composed of a large number of stations, where a wireless terminal device receives a first message from an access point, the first message comprising information indicating a plurality of restricted access windows, each allocated for a different group of terminal devices associated with a wireless network managed by the access point.
Abstract: Embodiments of the invention provide signaling mechanisms for wireless networks composed of a large number of stations. An example method embodiment comprises: receiving by a wireless terminal device, a first message from an access point, the first message comprising information indicating a plurality of restricted access windows, each allocated for a different group of terminal devices associated to a wireless network managed by the access point; receiving by the terminal device, a second message from the access point, within a restricted access window of the plurality of restricted access windows, the restricted access window allocated to a group of terminal devices of which the terminal device is a member, the second message comprising information indicating that a communications channel is available; and determining by the terminal device, based on the second message, that the communications channel is not occupied by hidden ones of the terminal devices associated to the network.
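The terminal-side decision described in this embodiment can be sketched as follows; the type and field names are hypothetical, chosen only to mirror the claim language:

```python
from dataclasses import dataclass

@dataclass
class RawWindow:
    """One restricted access window announced in the AP's first message."""
    group_id: int
    start_us: int
    end_us: int

def channel_clear(my_group: int, windows: list, msg_time_us: int,
                  says_available: bool) -> bool:
    """Return True only if the AP's 'channel available' indication (the
    second message) arrived inside the restricted access window allocated
    to this terminal's group; only then can the terminal conclude the
    channel is not occupied by hidden terminals of the same network."""
    for w in windows:
        if w.group_id == my_group and w.start_us <= msg_time_us < w.end_us:
            return says_available
    return False

raw = RawWindow(group_id=1, start_us=0, end_us=1000)
print(channel_clear(1, [raw], 500, True))   # True  — message inside our RAW
print(channel_clear(1, [raw], 1500, True))  # False — outside our RAW
```

The point of restricting the check to the group's own window is that stations outside the group (potential hidden nodes) are barred from transmitting then, so the AP's availability indication is trustworthy.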
TL;DR: An overview of the key issues that arise in the design of a resource allocation algorithm for LTE networks is provided, intended for a wide range of readers as it covers the topic from basics to advanced aspects.
Abstract: Future generation cellular networks are expected to provide ubiquitous broadband access to a continuously growing number of mobile users. In this context, LTE systems represent an important milestone towards the so-called 4G cellular networks. A key feature of LTE is the adoption of advanced Radio Resource Management procedures in order to increase the system performance up to the Shannon limit. Packet scheduling mechanisms, in particular, play a fundamental role, because they are responsible for choosing, with fine time and frequency resolutions, how to distribute radio resources among different stations, taking into account channel conditions and QoS requirements. This goal should be accomplished by providing, at the same time, an optimal trade-off between spectral efficiency and fairness. In this context, this paper provides an overview of the key issues that arise in the design of a resource allocation algorithm for LTE networks. It is intended for a wide range of readers as it covers the topic from basics to advanced aspects. The downlink channel under frequency division duplex configuration is considered as the object of our study, but most of the considerations are valid for other configurations as well. Moreover, a survey of the most recent techniques is reported, including a classification of the different approaches presented in the literature. Performance comparisons of the best-known schemes, with particular focus on QoS provisioning capabilities, are also provided to complement the described concepts. Thus, this survey would be useful for readers interested in learning the basic concepts before going into the details of a particular scheduling strategy, as well as for researchers aiming to deepen more specific aspects.
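As a concrete instance of the spectral-efficiency/fairness trade-off the survey discusses, here is a minimal proportional fair scheduler for one resource block per TTI; the EWMA update and all parameter values are illustrative defaults, not taken from any specific scheme in the survey:

```python
import random

def pf_pick(rates, avg_tput):
    """Proportional fair metric: serve the user maximizing the ratio of
    instantaneous achievable rate to exponentially averaged throughput."""
    return max(range(len(rates)), key=lambda u: rates[u] / avg_tput[u])

def simulate(num_users=4, steps=1000, beta=0.05, seed=0):
    rng = random.Random(seed)
    avg = [1e-6] * num_users          # tiny init avoids division by zero
    served = [0] * num_users
    for _ in range(steps):
        # i.i.d. channel draws stand in for per-TTI CQI reports
        rates = [rng.uniform(0.1, 1.0) for _ in range(num_users)]
        u = pf_pick(rates, avg)
        served[u] += 1
        for v in range(num_users):    # EWMA throughput update for everyone
            r = rates[v] if v == u else 0.0
            avg[v] = (1 - beta) * avg[v] + beta * r
    return served

# On statistically identical channels, PF splits service roughly evenly
# while still riding each user's channel peaks.
print(simulate())
```

Dividing by the averaged throughput is what makes the scheduler fair: a user served often sees its metric shrink, so recently starved users win the next contention even with a somewhat weaker channel.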
TL;DR: The design challenges and proposed solutions for the radio interface and network architecture to fulfill the requirements of latency-critical IoT applications are discussed; these solutions mainly benefit from flexibility and service-centric approaches.
Abstract: Next generation mobile networks not only envision enhancing the traditional mobile broadband (MBB) use case but also aim to meet the requirements of new use cases, such as the IoT. This article focuses on latency-critical IoT applications and analyzes their requirements. We discuss the design challenges and propose solutions for the radio interface and network architecture to fulfill these requirements, which mainly benefit from flexibility and service-centric approaches. The article also discusses new business opportunities through IoT connectivity enabled by future networks.
TL;DR: In this paper, the potential gains and limitations of network densification, the use of higher frequency bands, and spectral efficiency enhancement techniques in ultra-dense small cell deployments are analyzed, and the top ten challenges to be addressed to bring ultra-dense small cell deployments to reality are discussed.
Abstract: Today's heterogeneous networks, comprised of mostly macrocells and indoor small cells, will not be able to meet the upcoming traffic demands. Indeed, it is forecasted that at least a $100\times$ network capacity increase will be required to meet the traffic demands in 2020. As a result, vendors and operators are now looking at using every tool at hand to improve network capacity. In this epic campaign, three paradigms are noteworthy, i.e., network densification, the use of higher frequency bands, and spectral efficiency enhancement techniques. This paper aims to bring further common understanding and to analyse the potential gains and limitations of these three paradigms, together with the impact of idle mode capabilities at the small cells as well as the user equipment density and distribution in outdoor scenarios. Special attention is paid to network densification and its implications when transitioning to ultra-dense small cell deployments. Simulation results show that, compared to the baseline case with an average inter-site distance of 200 m and a 100 MHz bandwidth, network densification with an average inter-site distance of 35 m can increase the average UE throughput by $7.56\times$, while the use of the 10 GHz band with a 500 MHz bandwidth can further increase the network capacity up to $5\times$, resulting in an average of 1.27 Gbps per UE. The use of beamforming with up to 4 antennas per small cell BS lags behind, with average throughput gains around 30% and cell-edge throughput gains of up to $2\times$. Considering an extreme densification, an average inter-site distance of 5 m can increase the average and cell-edge UE throughput by $18\times$ and $48\times$, respectively. Our study also shows how network densification reduces multi-user diversity, and thus proportional fair-like schedulers start losing their advantages with respect to round robin ones.
The energy efficiency of these ultra-dense small cell deployments is also analysed, indicating the benefits of energy harvesting approaches to make these deployments more energy-efficient. Finally, the top ten challenges to be addressed to bring ultra-dense small cell deployments to reality are also discussed.
TL;DR: Simulation results show that the proposed network architecture and interference avoidance schemes can significantly increase the capacity of 4G heterogeneous cellular networks while maintaining the service quality of Wi-Fi systems.
Abstract: As two major players in terrestrial wireless communications, Wi-Fi systems and cellular networks have different origins and have largely evolved separately. Motivated by the exponentially increasing wireless data demand, cellular networks are evolving towards a heterogeneous and small cell network architecture, wherein small cells are expected to provide very high capacity. However, due to the limited licensed spectrum for cellular networks, any effort to achieve capacity growth through network densification will face the challenge of severe inter-cell interference. In view of this, recent standardization developments have started to consider the opportunities for cellular networks to use the unlicensed spectrum bands, including the 2.4 GHz and 5 GHz bands that are currently used by Wi-Fi, Zigbee and some other communication systems. In this article, we look into the coexistence of Wi-Fi and 4G cellular networks sharing the unlicensed spectrum. We introduce a network architecture where small cells use the same unlicensed spectrum that Wi-Fi systems operate in without affecting the performance of Wi-Fi systems. We present an almost blank subframe (ABS) scheme without priority to mitigate the co-channel interference from small cells to Wi-Fi systems, and propose an interference avoidance scheme based on small cells estimating the density of nearby Wi-Fi access points to facilitate their coexistence while sharing the same unlicensed spectrum. Simulation results show that the proposed network architecture and interference avoidance schemes can significantly increase the capacity of 4G heterogeneous cellular networks while maintaining the service quality of Wi-Fi systems.
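The density-based adaptation proposed above can be sketched as a mapping from the number of Wi-Fi APs a small cell detects to its ABS blanking ratio; this particular linear-with-cap rule is our own illustration, not the paper's algorithm:

```python
def abs_ratio_from_density(n_wifi_aps: int, max_ratio: float = 0.8) -> float:
    """Fraction of subframes a small cell leaves blank, grown linearly
    with the number of nearby Wi-Fi APs it estimates and capped at
    `max_ratio` so the small cell retains some guaranteed airtime."""
    return min(max_ratio, n_wifi_aps / 10)

print(abs_ratio_from_density(0))   # 0.0 — no Wi-Fi detected, no blanking
print(abs_ratio_from_density(3))   # 0.3
print(abs_ratio_from_density(20))  # 0.8 — capped
```

The cap reflects the trade-off the abstract describes: enough blank subframes to keep Wi-Fi service quality intact, while the small cell still contributes capacity to the cellular network.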
TL;DR: Simulation results demonstrate that LTE-U can provide better user experience to LTE users while well protecting the incumbent WiFi users' performance compared to two existing advanced technologies: cellular/WiFi interworking and licensed-only heterogeneous networks (Het-Nets).
Abstract: The phenomenal growth of mobile data demand has brought about increasing scarcity in available radio spectrum. Meanwhile, mobile customers pay more attention to their own experience, especially in communication reliability and service continuity on the move. To address these issues, LTE-Unlicensed, or LTE-U, is considered one of the latest groundbreaking innovations to provide high performance and seamless user experience under a unified radio technology by extending LTE to the readily available unlicensed spectrum. In this article, we offer a comprehensive overview of the LTE-U technology from both operator and user perspectives, and examine its impact on the incumbent unlicensed systems. Specifically, we first introduce the implementation regulations, principles, and typical deployment scenarios of LTE-U. Potential benefits for both operators and users are then discussed. We further identify three key challenges in bringing LTE-U into reality together with related research directions. In particular, the most critical issue of LTE-U is coexistence with other unlicensed systems, such as widely deployed WiFi. The LTE/WiFi coexistence mechanisms are elaborated in time, frequency, and power aspects, respectively. Simulation results demonstrate that LTE-U can provide better user experience to LTE users while well protecting the incumbent WiFi users' performance, compared to two existing advanced technologies: cellular/WiFi interworking and licensed-only heterogeneous networks (HetNets).