Author

Amitava Ghosh

Other affiliations: Motorola, Motorola Solutions
Bio: Amitava Ghosh is an academic researcher from Nokia. The author has contributed to research in topics: Telecommunications link & Base station. The author has an h-index of 35 and has co-authored 103 publications receiving 5,760 citations. Previous affiliations of Amitava Ghosh include Motorola & Motorola Solutions.


Papers
Journal ArticleDOI
TL;DR: An overview of the techniques being considered for LTE Release 10 (aka LTE-Advanced) is discussed, which includes bandwidth extension via carrier aggregation to support deployment bandwidths up to 100 MHz, downlink spatial multiplexing including single-cell multi-user multiple-input multiple-output transmission and coordinated multipoint transmission, and heterogeneous networks with emphasis on Type 1 and Type 2 relays.
Abstract: LTE Release 8 is one of the primary broadband technologies based on OFDM, which is currently being commercialized. LTE Release 8, which is mainly deployed in a macro/microcell layout, provides improved system capacity and coverage, high peak data rates, low latency, reduced operating costs, multi-antenna support, flexible bandwidth operation and seamless integration with existing systems. LTE-Advanced (also known as LTE Release 10) significantly enhances the existing LTE Release 8 and supports much higher peak rates, higher throughput and coverage, and lower latencies, resulting in a better user experience. Additionally, LTE Release 10 will support heterogeneous deployments where low-power nodes comprising picocells, femtocells, relays, remote radio heads, and so on are placed in a macrocell layout. The LTE-Advanced features enable one to meet or exceed IMT-Advanced requirements. It may also be noted that LTE Release 9 provides some minor enhancements to LTE Release 8 with respect to the air interface, and includes features like dual-layer beamforming and time-difference-of-arrival-based location techniques. In this article an overview of the techniques being considered for LTE Release 10 (aka LTE-Advanced) is discussed. This includes bandwidth extension via carrier aggregation to support deployment bandwidths up to 100 MHz, downlink spatial multiplexing including single-cell multi-user multiple-input multiple-output transmission and coordinated multipoint transmission, uplink spatial multiplexing including extension to four-layer MIMO, and heterogeneous networks with emphasis on Type 1 and Type 2 relays. Finally, the performance of LTE-Advanced using IMT-A scenarios is presented and compared against IMT-A targets for full buffer and bursty traffic models.
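To make the carrier-aggregation figure concrete, here is a minimal Python sketch of the bandwidth accounting: up to five Release 8 component carriers of at most 20 MHz each are combined to reach the 100 MHz deployment bandwidth mentioned above. The function name and the validation limits are illustrative, not taken from any specification text.

```python
# Minimal sketch of LTE-Advanced carrier aggregation bandwidth accounting.
# Component-carrier bandwidths below are illustrative; a single Release 8
# carrier is at most 20 MHz wide, and Release 10 aggregates several of them
# to reach deployment bandwidths up to 100 MHz.

MAX_CC_BANDWIDTH_MHZ = 20.0   # widest single LTE Release 8 carrier
MAX_AGGREGATED_MHZ = 100.0    # Release 10 deployment-bandwidth target

def aggregated_bandwidth(component_carriers_mhz):
    """Return the total aggregated bandwidth, validating each component carrier."""
    for bw in component_carriers_mhz:
        if bw > MAX_CC_BANDWIDTH_MHZ:
            raise ValueError(f"{bw} MHz exceeds a single Release 8 carrier")
    total = sum(component_carriers_mhz)
    if total > MAX_AGGREGATED_MHZ:
        raise ValueError(f"{total} MHz exceeds the 100 MHz aggregation target")
    return total

# Example: five 20 MHz component carriers reach the 100 MHz deployment bandwidth.
print(aggregated_bandwidth([20.0] * 5))  # -> 100.0
```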

1,044 citations

Journal ArticleDOI
TL;DR: A case is made for using mmWave for a fifth generation (5G) wireless system for ultradense networks by presenting an overview of enhanced local area (eLA) technology at mmWave with emphasis on 5G requirements, spectrum considerations, propagation and channel modeling, air-interface and multiantenna design, and network architecture solutions.
Abstract: Wireless data traffic is projected to skyrocket 10,000-fold within the next 20 years. To tackle this incredible increase in wireless data traffic, a first approach is to further improve spectrally efficient systems such as 4G LTE in bands below 6 GHz by using more advanced spectral efficiency techniques. However, the required substantial increase in system complexity along with fundamental limits on hardware implementation and channel conditions may limit the viability of this approach. Furthermore, the end result would be an extremely spectrally efficient system with little room for future improvement to meet the ever-growing wireless data usage. The second approach is to move up in frequency, into an unused nontraditional spectrum where enormous bandwidths are available, such as at millimeter wave (mmWave). The mmWave option enables the use of simple air interfaces since large bandwidths can be exploited (e.g., 2 GHz) to achieve high data rates rather than relying on highly complex techniques originally aimed at achieving a high spectral efficiency with smaller bandwidths. In addition, mmWave systems will easily evolve to even higher system capacities, because there will be plenty of margin to improve the spectral efficiency as data demands further increase. In this paper, a case is made for using mmWave for a fifth generation (5G) wireless system for ultradense networks by presenting an overview of enhanced local area (eLA) technology at mmWave with emphasis on 5G requirements, spectrum considerations, propagation and channel modeling, air-interface and multiantenna design, and network architecture solutions.
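The bandwidth-versus-spectral-efficiency argument above can be illustrated with a back-of-the-envelope Shannon capacity comparison. This is a hedged sketch: the SNR values and the 100 MHz sub-6 GHz bandwidth are assumptions made for illustration; only the 2 GHz mmWave bandwidth figure comes from the abstract.

```python
import math

# Back-of-the-envelope comparison motivating the mmWave argument above:
# a very wide channel at modest spectral efficiency can exceed a narrow
# channel pushed to high spectral efficiency. The SNR figures below are
# illustrative assumptions, not values from the paper.

def shannon_rate_gbps(bandwidth_hz, snr_db):
    """Shannon capacity B*log2(1+SNR) in Gb/s for an AWGN channel."""
    snr_linear = 10 ** (snr_db / 10.0)
    return bandwidth_hz * math.log2(1.0 + snr_linear) / 1e9

# Sub-6 GHz carrier: 100 MHz at a high 30 dB SNR (~10 b/s/Hz).
print(f"100 MHz @ 30 dB: {shannon_rate_gbps(100e6, 30):.2f} Gb/s")

# mmWave carrier: 2 GHz (as cited in the abstract) at a modest 5 dB SNR.
print(f"2 GHz   @  5 dB: {shannon_rate_gbps(2e9, 5):.2f} Gb/s")
```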

793 citations

Proceedings ArticleDOI
15 May 2016
TL;DR: This document describes an initial 3D channel model which includes a baseline model for incorporating path loss, shadow fading, line of sight probability, penetration and blockage models for the typical scenarios of 5G channel models for bands up to 100 GHz.
Abstract: For the development of new 5G systems to operate in bands up to 100 GHz, there is a need for accurate radio propagation models at these bands that currently are not addressed by existing channel models developed for bands below 6 GHz. This document presents a preliminary overview of 5G channel models for bands up to 100 GHz. These have been derived based on extensive measurement and ray tracing results across a multitude of frequencies from 6 GHz to 100 GHz, and this document describes an initial 3D channel model which includes: 1) typical deployment scenarios for urban microcells (UMi) and urban macrocells (UMa), and 2) a baseline model for incorporating path loss, shadow fading, line of sight probability, penetration and blockage models for the typical scenarios. Various processing methodologies such as clustering and antenna decoupling algorithms are also presented.
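As a rough illustration of two of the baseline-model ingredients listed above (line-of-sight probability and shadow fading), the following Python sketch uses the common 3GPP-style UMi functional form with d1 = 18 m, d2 = 36 m and an assumed 4 dB shadow-fading standard deviation; these parameters are illustrative and are not taken from the paper.

```python
import math, random

# Hedged sketch of two ingredients of the baseline model described above:
# a distance-dependent line-of-sight probability and lognormal shadow fading.
# The functional form and parameters follow the common 3GPP-style UMi model
# (d1 = 18 m, d2 = 36 m); treat them as illustrative, not the paper's values.

def los_probability_umi(d2d_m, d1=18.0, d2=36.0):
    """P(LOS) versus 2-D distance for an urban-microcell street canyon."""
    if d2d_m <= d1:
        return 1.0
    return (d1 / d2d_m) * (1.0 - math.exp(-d2d_m / d2)) + math.exp(-d2d_m / d2)

def shadow_fading_db(sigma_db=4.0):
    """One lognormal shadow-fading sample (sigma is an assumed value in dB)."""
    return random.gauss(0.0, sigma_db)

for d in (10, 50, 100, 200):
    print(f"d = {d:4d} m  P(LOS) = {los_probability_umi(d):.2f}")
print(f"shadow fading sample: {shadow_fading_db():+.1f} dB")
```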

281 citations

Proceedings ArticleDOI
15 May 2016
TL;DR: This paper presents and compares two candidate large-scale propagation path loss models, the alpha-beta-gamma (ABG) model and the close-in (CI) free space reference distance model, for the design of fifth generation (5G) wireless communication systems in urban micro- and macro-cellular scenarios.
Abstract: This paper presents and compares two candidate large-scale propagation path loss models, the alpha-beta-gamma (ABG) model and the close-in (CI) free space reference distance model, for the design of fifth generation (5G) wireless communication systems in urban micro- and macro-cellular scenarios. Comparisons are made using the data obtained from 20 propagation measurement campaigns or ray-tracing studies from 2 GHz to 73.5 GHz over distances ranging from 5 m to 1429 m. The results show that the one-parameter CI model has a very similar goodness of fit (i.e., the shadow fading standard deviation) in both line-of-sight and non-line-of-sight environments, while offering substantial simplicity and more stable behavior across frequencies and distances, as compared to the three-parameter ABG model. Additionally, the CI model needs only one very subtle and simple modification to the existing 3GPP floating-intercept path loss model (replacing a constant with a close-in free space reference value) in order to provide greater simulation accuracy, more simplicity, better repeatability across experiments, and higher stability across a vast range of frequencies.
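For reference, the two model forms compared in the paper can be written down directly. The Python sketch below implements the standard CI and ABG expressions; the path loss exponent and the alpha/beta/gamma values in the example call are chosen purely for illustration rather than taken from the paper's fits.

```python
import math

# Hedged sketch of the two large-scale path loss models compared above.
# Model forms are the standard definitions; the parameter values passed in
# the example calls are illustrative placeholders, not the paper's fits.

C = 3e8  # speed of light, m/s

def ci_path_loss_db(freq_hz, dist_m, n):
    """Close-in (CI) free space reference distance model, d0 = 1 m:
    PL = FSPL(f, 1 m) + 10 * n * log10(d)."""
    fspl_1m = 20.0 * math.log10(4.0 * math.pi * freq_hz / C)
    return fspl_1m + 10.0 * n * math.log10(dist_m)

def abg_path_loss_db(freq_hz, dist_m, alpha, beta, gamma):
    """Alpha-beta-gamma (ABG) model with d in metres and f in GHz:
    PL = 10*alpha*log10(d) + beta + 10*gamma*log10(f_GHz)."""
    return (10.0 * alpha * math.log10(dist_m)
            + beta
            + 10.0 * gamma * math.log10(freq_hz / 1e9))

# Example at 28 GHz and 100 m (illustrative parameters: n = 3.0;
# alpha = 3.4, beta = 19.2 dB, gamma = 2.3):
print(f"CI : {ci_path_loss_db(28e9, 100, 3.0):.1f} dB")
print(f"ABG: {abg_path_loss_db(28e9, 100, 3.4, 19.2, 2.3):.1f} dB")
```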

269 citations

Patent
21 Mar 1996
TL;DR: In this article, a method and apparatus for determining the location of a communication unit in a CDMA system includes, in a first embodiment, sending a location request via a spread spectrum signal to the subscriber (140), and receiving in return a subscriber signal including a response message showing a receive time of a particular symbol of the base's spreading sequence and a transmit time of a particular symbol of the subscriber's spreading sequence.
Abstract: A method and apparatus for determining the location of a communication unit in a CDMA system includes, in a first embodiment, sending a location request via a spread spectrum signal to the subscriber (140), and receiving in return a subscriber signal including a response message showing a receive time of a particular symbol of the base's spreading sequence and a transmit time of a particular symbol of the subscriber's spreading sequence. The base (130), along with other receiving base(s) (140), also receives a predetermined symbol of the subscriber spreading sequence, and each determines a respective receive time of the predetermined symbol. The received information is then processed, along with known base location and delay information, to determine the subscriber location. If an insufficient number of bases is capable of communicating with the subscriber, for example due to high loading/interference, auxiliary bases (121) are also provided for receiving from or transmitting to the subscriber.
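The final processing step, turning the reported symbol times at several bases into a position estimate, can be illustrated with a generic least-squares multilateration sketch. This is not the method claimed in the patent; the base coordinates and the conversion of timing into one-way ranges are assumptions made for the example.

```python
import numpy as np

# Hedged sketch of the position-estimation step described above: once each
# base's one-way propagation delay to the subscriber has been estimated from
# the reported transmit/receive symbol times, the location can be recovered
# from the resulting ranges by linearized least squares. This is a generic
# multilateration illustration, not the patent's claimed method.

C = 3e8  # propagation speed, m/s

def locate(base_positions, delays_s):
    """Estimate (x, y) from one-way delays to >= 3 bases (linearized LS)."""
    p = np.asarray(base_positions, dtype=float)
    r = C * np.asarray(delays_s, dtype=float)
    # Subtract the first base's circle equation to obtain a linear system.
    A = 2.0 * (p[1:] - p[0])
    b = (r[0] ** 2 - r[1:] ** 2
         + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2))
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    return xy

bases = [(0.0, 0.0), (3000.0, 0.0), (0.0, 3000.0)]
true_pos = np.array([1200.0, 800.0])
delays = [np.linalg.norm(true_pos - np.array(b)) / C for b in bases]
print(locate(bases, delays))  # ~ [1200.  800.]
```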

254 citations


Cited by
Book
01 Jan 2005

9,038 citations

Journal ArticleDOI
TL;DR: This paper discusses all of these topics, identifying key challenges for future research and preliminary 5G standardization activities, while providing a comprehensive overview of the current literature, and in particular of the papers appearing in this special issue.
Abstract: What will 5G be? What it will not be is an incremental advance on 4G. The previous four generations of cellular technology have each been a major paradigm shift that has broken backward compatibility. Indeed, 5G will need to be a paradigm shift that includes very high carrier frequencies with massive bandwidths, extreme base station and device densities, and unprecedented numbers of antennas. However, unlike the previous four generations, it will also be highly integrative: tying any new 5G air interface and spectrum together with LTE and WiFi to provide universal high-rate coverage and a seamless user experience. To support this, the core network will also have to reach unprecedented levels of flexibility and intelligence, spectrum regulation will need to be rethought and improved, and energy and cost efficiencies will become even more critical considerations. This paper discusses all of these topics, identifying key challenges for future research and preliminary 5G standardization activities, while providing a comprehensive overview of the current literature, and in particular of the papers appearing in this special issue.

7,139 citations

Journal ArticleDOI
TL;DR: An overview of the Internet of Things with emphasis on enabling technologies, protocols, and application issues, and some of the key IoT challenges presented in the recent literature are provided and a summary of related research work is provided.
Abstract: This paper provides an overview of the Internet of Things (IoT) with emphasis on enabling technologies, protocols, and application issues. The IoT is enabled by the latest developments in RFID, smart sensors, communication technologies, and Internet protocols. The basic premise is to have smart sensors collaborate directly without human involvement to deliver a new class of applications. The current revolution in Internet, mobile, and machine-to-machine (M2M) technologies can be seen as the first phase of the IoT. In the coming years, the IoT is expected to bridge diverse technologies to enable new applications by connecting physical objects together in support of intelligent decision making. This paper starts by providing a horizontal overview of the IoT. Then, we give an overview of some technical details that pertain to the IoT enabling technologies, protocols, and applications. Compared to other survey papers in the field, our objective is to provide a more thorough summary of the most relevant protocols and application issues to enable researchers and application developers to get up to speed quickly on how the different protocols fit together to deliver desired functionalities without having to go through RFCs and the standards specifications. We also provide an overview of some of the key IoT challenges presented in the recent literature and provide a summary of related research work. Moreover, we explore the relation between the IoT and other emerging technologies including big data analytics and cloud and fog computing. We also present the need for better horizontal integration among IoT services. Finally, we present detailed service use-cases to illustrate how the different protocols presented in the paper fit together to deliver desired IoT services.

6,131 citations

Journal ArticleDOI
TL;DR: This paper considers transmit precoding and receiver combining in mmWave systems with large antenna arrays and develops algorithms that accurately approximate optimal unconstrained precoders and combiners such that they can be implemented in low-cost RF hardware.
Abstract: Millimeter wave (mmWave) signals experience orders-of-magnitude more pathloss than the microwave signals currently used in most wireless applications and all cellular systems. MmWave systems must therefore leverage large antenna arrays, made possible by the decrease in wavelength, to combat pathloss with beamforming gain. Beamforming with multiple data streams, known as precoding, can be used to further improve mmWave spectral efficiency. Both beamforming and precoding are done digitally at baseband in traditional multi-antenna systems. The high cost and power consumption of mixed-signal devices in mmWave systems, however, make analog processing in the RF domain more attractive. This hardware limitation restricts the feasible set of precoders and combiners that can be applied by practical mmWave transceivers. In this paper, we consider transmit precoding and receiver combining in mmWave systems with large antenna arrays. We exploit the spatial structure of mmWave channels to formulate the precoding/combining problem as a sparse reconstruction problem. Using the principle of basis pursuit, we develop algorithms that accurately approximate optimal unconstrained precoders and combiners such that they can be implemented in low-cost RF hardware. We present numerical results on the performance of the proposed algorithms and show that they allow mmWave systems to approach their unconstrained performance limits, even when transceiver hardware constraints are considered.
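A compact sketch in the spirit of this sparse-reconstruction idea is given below: an orthogonal-matching-pursuit loop greedily selects array steering vectors from a dictionary to approximate an unconstrained precoder, then computes a small baseband precoder by least squares. Array sizes, the dictionary resolution, the random target precoder, and the helper names are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Simplified sketch, in the spirit of the sparse-reconstruction approach
# described above: approximate an unconstrained precoder with an analog
# precoder built from dictionary steering vectors plus a small digital
# precoder, using orthogonal matching pursuit.

def ula_response(n_ant, angles_rad):
    """Columns are unit-norm ULA steering vectors (half-wavelength spacing)."""
    n = np.arange(n_ant)[:, None]
    return np.exp(1j * np.pi * n * np.sin(angles_rad)[None, :]) / np.sqrt(n_ant)

def omp_hybrid_precoder(f_opt, dictionary, n_rf):
    """Greedily pick n_rf steering vectors and the matching baseband precoder."""
    n_ant, n_s = f_opt.shape
    f_rf = np.zeros((n_ant, 0), dtype=complex)
    residual = f_opt
    for _ in range(n_rf):
        corr = dictionary.conj().T @ residual             # correlate with residual
        k = np.argmax(np.sum(np.abs(corr) ** 2, axis=1))  # best-matching column
        f_rf = np.hstack([f_rf, dictionary[:, [k]]])
        f_bb = np.linalg.pinv(f_rf) @ f_opt               # least-squares baseband
        residual = f_opt - f_rf @ f_bb
        residual /= np.linalg.norm(residual, 'fro')
    # Rescale to meet the total power constraint ||F_rf F_bb||_F^2 = n_s.
    f_bb *= np.sqrt(n_s) / np.linalg.norm(f_rf @ f_bb, 'fro')
    return f_rf, f_bb

rng = np.random.default_rng(0)
n_ant, n_s, n_rf = 32, 2, 4
dictionary = ula_response(n_ant, np.linspace(-np.pi / 2, np.pi / 2, 64))
f_opt, _ = np.linalg.qr(rng.standard_normal((n_ant, n_s))
                        + 1j * rng.standard_normal((n_ant, n_s)))
f_rf, f_bb = omp_hybrid_precoder(f_opt, dictionary, n_rf)
print("approximation error:", np.linalg.norm(f_opt - f_rf @ f_bb, 'fro'))
```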

3,146 citations

Journal ArticleDOI
TL;DR: This survey makes an exhaustive review of wireless evolution toward 5G networks, including the new architectural changes associated with the radio access network (RAN) design, including air interfaces, smart antennas, cloud and heterogeneous RAN, and underlying novel mm-wave physical layer technologies.
Abstract: The vision of next generation 5G wireless communications lies in providing very high data rates (typically of Gbps order), extremely low latency, manifold increase in base station capacity, and significant improvement in users' perceived quality of service (QoS), compared to current 4G LTE networks. The ever-increasing proliferation of smart devices, the introduction of new emerging multimedia applications, together with an exponential rise in wireless data (multimedia) demand and usage is already creating a significant burden on existing cellular networks. 5G wireless systems, with improved data rates, capacity, latency, and QoS are expected to be the panacea of most of the current cellular networks' problems. In this survey, we make an exhaustive review of wireless evolution toward 5G networks. We first discuss the new architectural changes associated with the radio access network (RAN) design, including air interfaces, smart antennas, cloud and heterogeneous RAN. Subsequently, we make an in-depth survey of underlying novel mm-wave physical layer technologies, encompassing new channel model estimation, directional antenna design, beamforming algorithms, and massive MIMO technologies. Next, the details of MAC layer protocols and multiplexing schemes needed to efficiently support this new physical layer are discussed. We also look into the killer applications, considered the major driving force behind 5G. In order to understand the improved user experience, we provide highlights of new QoS, QoE, and SON features associated with the 5G evolution. For alleviating the increased network energy consumption and operating expenditure, we make a detailed review of energy awareness and cost efficiency. As understanding the current status of 5G implementation is important for its eventual commercialization, we also discuss relevant field trials, drive tests, and simulation experiments. Finally, we point out major existing research issues and identify possible future research directions.

2,624 citations