Rafael C. D. Paiva
Other affiliations: Aalto University
Bio: Rafael C. D. Paiva is an academic researcher from Nokia who has contributed to research on topics including GSM and GERAN. The author has an h-index of 13 and has co-authored 24 publications receiving 1084 citations. Previous affiliations of Rafael C. D. Paiva include Aalto University.
09 Jun 2013
TL;DR: This paper considers two of the most prominent wireless technologies available today, namely Long Term Evolution (LTE) and WiFi, addresses some problems that arise from their coexistence in the same band, and proposes a simple coexistence scheme that reuses the concept of almost blank subframes in LTE.
Abstract: The recent development of regulatory policies that permit the use of TV bands spectrum on a secondary basis has motivated discussion about coexistence of primary (e.g. TV broadcasts) and secondary users (e.g. WiFi users in TV spectrum). However, much less attention has been given to coexistence of different secondary wireless technologies in the TV white spaces. Lack of coordination between secondary networks may create severe interference situations, resulting in less efficient usage of the spectrum. In this paper, we consider two of the most prominent wireless technologies available today, namely Long Term Evolution (LTE) and WiFi, and address some problems that arise from their coexistence in the same band. We perform exhaustive system simulations and observe that WiFi is hampered much more significantly than LTE in coexistence scenarios. A simple coexistence scheme that reuses the concept of almost blank subframes in LTE is proposed, and it is observed that it can improve the WiFi throughput per user by up to 50 times in the studied scenarios.
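The almost-blank-subframe idea described above can be sketched in a few lines. Everything below is an illustrative assumption (the muting period and muted-subframe set are not taken from the paper): LTE mutes selected 1 ms subframes in a repeating pattern, and coexisting WiFi nodes can then sense an idle channel and transmit.

```python
# Illustrative sketch of LTE "almost blank subframes" (ABS): the eNB mutes
# selected 1 ms subframes in a repeating pattern, freeing airtime in which
# coexisting WiFi nodes can sense an idle channel and transmit.
# (The 40-subframe period and the muting set are assumptions for illustration.)

def abs_pattern(period=40, muted=frozenset({0, 8, 16, 24, 32})):
    """True = LTE transmits in this subframe, False = almost blank."""
    return [i not in muted for i in range(period)]

def wifi_airtime_fraction(pattern):
    """Fraction of subframes ceded to WiFi (ignoring residual LTE reference signals)."""
    return pattern.count(False) / len(pattern)

pattern = abs_pattern()
print(wifi_airtime_fraction(pattern))  # 5 of 40 subframes muted -> 0.125
```

Muting more subframes trades LTE capacity for WiFi airtime; the paper's reported 50x WiFi throughput gain reflects how starved WiFi is without any such coordination.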
02 Jun 2013
TL;DR: A simulator-based system-level analysis is performed to assess network performance in an office scenario, showing that LTE system performance is only slightly affected by coexistence whereas Wi-Fi is significantly impacted by LTE transmissions.
Abstract: The deployment of modern mobile systems has faced severe challenges due to the current spectrum scarcity. The situation has been further worsened by the development of different wireless technologies and standards that can be used in the same frequency band. Furthermore, with the usage of smaller cells (e.g. pico, femto and wireless LAN), coexistence among heterogeneous networks (including different wireless technologies such as LTE and Wi-Fi deployed in the same frequency band) has become a major field of research in academia and industry. In this paper, we provide a performance evaluation of coexistence between LTE and Wi-Fi systems and show some of the challenges faced by the different technologies. We focus on a simulator-based system-level analysis in order to assess the network performance in an office scenario. Simulation results show that LTE system performance is slightly affected by coexistence whereas Wi-Fi is significantly impacted by LTE transmissions. In coexistence, the Wi-Fi channel is most often blocked by LTE interference, causing the Wi-Fi nodes to stay in LISTEN mode more than 96% of the time. This directly affects the Wi-Fi user throughput, which decreases by 70% to ≈100% depending on the scenario. Finally, some of the main issues that limit LTE/Wi-Fi coexistence and some pointers on the mutual interference management of both systems are provided.
TL;DR: The issues that arise from the concurrent operation of LTE and Wi-Fi in the same unlicensed bands from the point of view of radio resource management are discussed, and it is shown that Wi-Fi is severely impacted by LTE transmissions.
Abstract: The expansion of wireless broadband access network deployments is resulting in increased scarcity of available radio spectrum. It is very likely that in the near future, cellular technologies and wireless local area networks will need to coexist in the same unlicensed bands. However, the two most prominent technologies, LTE and Wi-Fi, were designed to work in different bands and not to coexist in a shared band. In this article, we discuss the issues that arise from the concurrent operation of LTE and Wi-Fi in the same unlicensed bands from the point of view of radio resource management. We show that Wi-Fi is severely impacted by LTE transmissions; hence, the coexistence of LTE and Wi-Fi needs to be carefully investigated. We discuss some possible coexistence mechanisms and future research directions that may lead to successful joint deployment of LTE and Wi-Fi in the same unlicensed band.
TL;DR: A new model for an ideal operational amplifier that does not include implicit equations and is thus suitable for implementation using wave digital filters (WDFs) is introduced and a novel WDF model for a diode is proposed using the Lambert W function.
Abstract: This brief presents a generic model to emulate distortion circuits using operational amplifiers and diodes. Distortion circuits are widely used for enhancing the sound of guitars and other musical instruments. This brief introduces a new model for an ideal operational amplifier that does not include implicit equations and is thus suitable for implementation using wave digital filters (WDFs). Furthermore, a novel WDF model for a diode is proposed using the Lambert W function. A comparison of output signals of the proposed models to those obtained from a reference simulation using SPICE shows that the distortion characteristics are accurately reproduced over a wide frequency range. Additionally, the proposed model enables real-time emulation of distortion circuits using ten multiplications, 22 additions, and two interpolations from a lookup table per output sample.
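The diode model mentioned in the abstract admits a closed-form reflected wave via the Lambert W function, which is what makes it free of implicit equations. The following is a minimal sketch under stated assumptions: an ideal Shockley diode i = Is*(exp(v/(n*Vt)) - 1) at a wave digital filter port of resistance R, with illustrative constants (Is, n, Vt) and a hand-rolled Newton solver for W; the paper's exact parameterization may differ.

```python
# Sketch: WDF diode reflection computed in closed form with the Lambert W
# function. At a WDF port, v = (a+b)/2 and i = (a-b)/(2R); substituting the
# Shockley law and solving for the reflected wave b gives
#   b = a + 2*R*Is - 2*n*Vt * W((R*Is/(n*Vt)) * exp((a + R*Is)/(n*Vt))).
# Constants below are illustrative assumptions.
import math

def lambert_w(x, tol=1e-12):
    """Principal branch of w*exp(w) = x for x >= 0, by Newton iteration."""
    w = math.log1p(x)                       # good initial guess on [0, inf)
    for _ in range(50):
        ew = math.exp(w)
        step = (w * ew - x) / (ew * (w + 1.0))
        w -= step
        if abs(step) < tol:
            break
    return w

def diode_reflected_wave(a, R, Is=1e-15, n=1.0, Vt=0.02585):
    """Reflected wave b for incident wave a at a WDF port of resistance R."""
    lam = n * Vt
    x = (R * Is / lam) * math.exp((a + R * Is) / lam)
    return a + 2.0 * R * Is - 2.0 * lam * lambert_w(x)

# Port voltage and current then satisfy the Shockley law with no iterative
# solver inside the audio loop:
b = diode_reflected_wave(0.7, 100.0)
v, i = 0.5 * (0.7 + b), (0.7 - b) / (2.0 * 100.0)
```

Avoiding the implicit exponential equation per sample is the practical payoff: the whole nonlinearity reduces to one Lambert W evaluation, which can itself be tabulated, consistent with the brief's lookup-table operation count.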
TL;DR: This paper proposes to simulate the audio transformer using a wave digital filter model, which is based on the gyrator-capacitor analogy, and shows that these practical transformer designs introduce distortion at low frequencies only, below about 100 Hz for the Fender and 30 Hz for the Hammond transformer.
Abstract: An audio transformer is used in a guitar amplifier to match the impedances of the power amplifier and a loudspeaker. It is important to understand the influence of the audio transformer on the overall sound quality for realistic tube amplifier emulation. This paper proposes to simulate the audio transformer using a wave digital filter model, which is based on the gyrator-capacitor analogy. The proposed model is two-directional in the sense that it outputs the loudspeaker current, but it also connects backward to the power amplifier thus affecting its behavior in a nonlinear manner. A practical parameter estimation procedure is introduced, which requires only the measurement of basic electrical quantities but no knowledge of material properties. Measurements of a Fender NSC041318 and a Hammond T1750V transformer are presented as case studies, as well as parameter fitting and simulation for the Fender transformer. The results show that these practical transformer designs introduce distortion at low frequencies only, below about 100 Hz for the Fender and 30 Hz for the Hammond transformer, and that the proposed model faithfully reproduces this effect. The proposed audio transformer model is implemented in real time using the BlockCompiler software. Parametric control allows varying and also exaggerating the model nonlinearities.
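The gyrator-capacitor analogy underlying the model can be illustrated with a toy linear simulation. Everything below, including all component values, is an illustrative assumption (the paper's model is a nonlinear, bidirectional wave digital filter): each winding acts as a gyrator relating voltage to the rate of change of core flux (v = N dPhi/dt) and current to magnetomotive force (F = N i), while the core permeance plays the role of a capacitance that stores flux.

```python
# Toy linear gyrator-capacitor transformer: two windings (gyrators with
# gyration ratios N1, N2) share one magnetic core modeled as a "capacitor"
# whose capacitance is the core permeance P (flux = P * total MMF).
# All values are illustrative; the paper's model is nonlinear and bidirectional.
import math

def simulate(N1=100, N2=50, P=1e-3, Rs=10.0, RL=100.0,
             amp=1.0, freq=100.0, dt=1e-6, t_end=0.05):
    """Explicit-Euler simulation; returns peak secondary voltage in the last cycle."""
    phi = 0.0                      # core flux, the state of the magnetic "capacitor"
    peak_v2, t = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        vs = amp * math.sin(2 * math.pi * freq * t)
        # Balance at the magnetic node: N1*i1 + N2*i2 = phi / P, with
        # i1 = (vs - N1*dphi)/Rs and i2 = -N2*dphi/RL; solving for dphi/dt:
        dphi = (N1 * vs / Rs - phi / P) / (N1**2 / Rs + N2**2 / RL)
        v2 = N2 * dphi             # secondary gyrator: v2 = N2 * dPhi/dt
        if t > t_end - 1.0 / freq: # record the peak over the final cycle
            peak_v2 = max(peak_v2, abs(v2))
        phi += dphi * dt
        t += dt
    return peak_v2

# Infinite-permeance (ideal transformer) prediction for the peak:
# n*RL/(n**2*Rs + RL) with n = N2/N1, i.e. about 0.488 V here.
```

Replacing the constant permeance with a saturating flux-MMF curve is what introduces the low-frequency distortion the measurements show, since flux (and hence saturation) grows as frequency falls.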
TL;DR: In this paper, the potential gains and limitations of network densification and spectral efficiency enhancement techniques in ultra-dense small cell deployments are analyzed, and the top ten challenges to be addressed to bring ultra-dense small cell deployments to reality are discussed.
Abstract: Today's heterogeneous networks comprised of mostly macrocells and indoor small cells will not be able to meet the upcoming traffic demands. Indeed, it is forecasted that at least a $100\times$ network capacity increase will be required to meet the traffic demands in 2020. As a result, vendors and operators are now looking at using every tool at hand to improve network capacity. In this epic campaign, three paradigms are noteworthy, i.e., network densification, the use of higher frequency bands and spectral efficiency enhancement techniques. This paper aims at bringing further common understanding and analysing the potential gains and limitations of these three paradigms, together with the impact of idle mode capabilities at the small cells as well as the user equipment density and distribution in outdoor scenarios. Special attention is paid to network densification and its implications when transiting to ultra-dense small cell deployments. Simulation results show that, compared to the baseline case with an average inter site distance of 200 m and a 100 MHz bandwidth, network densification with an average inter site distance of 35 m can increase the average UE throughput by $7.56\times$, while the use of the 10 GHz band with a 500 MHz bandwidth can further increase the network capacity up to $5\times$, resulting in an average of 1.27 Gbps per UE. The use of beamforming with up to 4 antennas per small cell BS lags behind, with average throughput gains around 30% and cell-edge throughput gains of up to $2\times$. Considering an extreme densification, an average inter site distance of 5 m can increase the average and cell-edge UE throughput by $18\times$ and $48\times$, respectively. Our study also shows how network densification reduces multi-user diversity, and thus proportional-fair-like schedulers start losing their advantages with respect to round robin ones.
The energy efficiency of these ultra-dense small cell deployments is also analysed, indicating the benefits of energy harvesting approaches to make these deployments more energy-efficient. Finally, the top ten challenges to be addressed to bring ultra-dense small cell deployments to reality are also discussed.
TL;DR: Simulation results show that the proposed network architecture and interference avoidance schemes can significantly increase the capacity of 4G heterogeneous cellular networks while maintaining the service quality of Wi-Fi systems.
Abstract: As two major players in terrestrial wireless communications, Wi-Fi systems and cellular networks have different origins and have largely evolved separately. Motivated by the exponentially increasing wireless data demand, cellular networks are evolving towards a heterogeneous and small cell network architecture, wherein small cells are expected to provide very high capacity. However, due to the limited licensed spectrum for cellular networks, any effort to achieve capacity growth through network densification will face the challenge of severe inter-cell interference. In view of this, recent standardization developments have started to consider the opportunities for cellular networks to use the unlicensed spectrum bands, including the 2.4 GHz and 5 GHz bands that are currently used by Wi-Fi, Zigbee and some other communication systems. In this article, we look into the coexistence of Wi-Fi and 4G cellular networks sharing the unlicensed spectrum. We introduce a network architecture where small cells use the same unlicensed spectrum that Wi-Fi systems operate in without affecting the performance of Wi-Fi systems. We present an almost blank subframe (ABS) scheme without priority to mitigate the co-channel interference from small cells to Wi-Fi systems, and propose an interference avoidance scheme based on small cells estimating the density of nearby Wi-Fi access points to facilitate their coexistence while sharing the same unlicensed spectrum. Simulation results show that the proposed network architecture and interference avoidance schemes can significantly increase the capacity of 4G heterogeneous cellular networks while maintaining the service quality of Wi-Fi systems.
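The density-based interference avoidance idea can be sketched as a simple duty-cycle rule. The function and thresholds below are illustrative assumptions, not the paper's exact algorithm: the small cell estimates how many Wi-Fi APs are nearby and cedes a correspondingly larger share of airtime.

```python
# Illustrative sketch (the rule and the floor value are assumptions, not the
# paper's exact scheme): a small cell scales back its unlicensed-band duty
# cycle as the estimated density of nearby Wi-Fi access points grows, keeping
# a minimum share so its own users are not starved.

def lte_duty_cycle(n_wifi_aps, floor=0.2):
    """Fraction of subframes the small cell may use; the rest is ceded to Wi-Fi."""
    if n_wifi_aps <= 0:
        return 1.0                  # no Wi-Fi detected: use the full band
    return max(floor, 1.0 / (1 + n_wifi_aps))

print([lte_duty_cycle(n) for n in (0, 1, 4, 10)])  # [1.0, 0.5, 0.2, 0.2]
```

The muted share would then be realized with the ABS pattern described in the abstract, so that the protection granted to Wi-Fi tracks how crowded the channel actually is.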
TL;DR: Simulation results demonstrate that LTE-U can provide a better user experience to LTE users while adequately protecting the performance of incumbent WiFi users, compared to two existing advanced technologies: cellular/WiFi interworking and licensed-only heterogeneous networks (HetNets).
Abstract: The phenomenal growth of mobile data demand has brought about increasing scarcity in available radio spectrum. Meanwhile, mobile customers pay more attention to their own experience, especially in communication reliability and service continuity on the move. To address these issues, LTE-Unlicensed, or LTE-U, is considered one of the latest groundbreaking innovations to provide high performance and seamless user experience under a unified radio technology by extending LTE to the readily available unlicensed spectrum. In this article, we offer a comprehensive overview of the LTE-U technology from both operator and user perspectives, and examine its impact on the incumbent unlicensed systems. Specifically, we first introduce the implementation regulations, principles, and typical deployment scenarios of LTE-U. Potential benefits for both operators and users are then discussed. We further identify three key challenges in bringing LTE-U into reality together with related research directions. In particular, the most critical issue of LTE-U is coexistence with other unlicensed systems, such as widely deployed WiFi. The LTE/WiFi coexistence mechanisms are elaborated in the time, frequency, and power aspects, respectively. Simulation results demonstrate that LTE-U can provide a better user experience to LTE users while adequately protecting the performance of incumbent WiFi users, compared to two existing advanced technologies: cellular/WiFi interworking and licensed-only heterogeneous networks (HetNets).
TL;DR: This paper provides a comprehensive overview of the extensive ongoing research efforts and categorizes them based on the fundamental green tradeoffs, focusing on research progress in 4G and 5G communications, such as orthogonal frequency division multiplexing and non-orthogonal aggregation, multiple input multiple output, and heterogeneous networks.
Abstract: With years of tremendous traffic and energy consumption growth, green radio has been valued not only for theoretical research interests but also for the operational expenditure reduction and the sustainable development of wireless communications. Fundamental green tradeoffs, which serve as an important framework for analysis, include four basic relationships: 1) spectrum efficiency versus energy efficiency; 2) deployment efficiency versus energy efficiency; 3) delay versus power; and 4) bandwidth versus power. In this paper, we first provide a comprehensive overview of the extensive ongoing research efforts and categorize them based on the fundamental green tradeoffs. We then focus on research progress in 4G and 5G communications, such as orthogonal frequency division multiplexing and non-orthogonal aggregation, multiple input multiple output, and heterogeneous networks. We also discuss potential challenges and impacts of fundamental green tradeoffs, to shed some light on energy-efficient research and design for future wireless networks.
07 Jan 2015
TL;DR: In this paper, the authors review the expected future WLAN scenarios and use-cases that justify the push for a new PHY/MAC IEEE 802.11 amendment and discuss some of the network-level functionalities that are required to fully improve the user experience in next-generation WLANs.
Abstract: IEEE 802.11ax-2019 will replace both IEEE 802.11n-2009 and IEEE 802.11ac-2013 as the next high-throughput Wireless Local Area Network (WLAN) amendment. In this paper, we review the expected future WLAN scenarios and use-cases that justify the push for a new PHY/MAC IEEE 802.11 amendment. After that, we overview a set of new technical features that may be included in the IEEE 802.11ax-2019 amendment and describe both their advantages and drawbacks. Finally, we discuss some of the network-level functionalities that are required to fully improve the user experience in next-generation WLANs and note their relation with other on-going IEEE 802.11 amendments.