scispace - formally typeset
Author

Mamta Agiwal

Bio: Mamta Agiwal is an academic researcher from Sejong University. The author has contributed to research in topics: Wireless network & Random access. The author has an h-index of 9 and has co-authored 21 publications receiving 2,126 citations. Previous affiliations of Mamta Agiwal include Hanyang University & Sungkyunkwan University.

Papers
Journal ArticleDOI
TL;DR: This survey makes an exhaustive review of wireless evolution toward 5G networks, covering the new architectural changes associated with radio access network (RAN) design (air interfaces, smart antennas, and cloud and heterogeneous RAN) as well as the underlying novel mm-wave physical layer technologies.
Abstract: The vision of next generation 5G wireless communications lies in providing very high data rates (typically of Gbps order), extremely low latency, manifold increase in base station capacity, and significant improvement in users’ perceived quality of service (QoS), compared to current 4G LTE networks. Ever increasing proliferation of smart devices, introduction of new emerging multimedia applications, together with an exponential rise in wireless data (multimedia) demand and usage is already creating a significant burden on existing cellular networks. 5G wireless systems, with improved data rates, capacity, latency, and QoS are expected to be the panacea of most of the current cellular networks’ problems. In this survey, we make an exhaustive review of wireless evolution toward 5G networks. We first discuss the new architectural changes associated with the radio access network (RAN) design, including air interfaces, smart antennas, cloud and heterogeneous RAN. Subsequently, we make an in-depth survey of underlying novel mm-wave physical layer technologies, encompassing new channel model estimation, directional antenna design, beamforming algorithms, and massive MIMO technologies. Next, the details of MAC layer protocols and multiplexing schemes needed to efficiently support this new physical layer are discussed. We also look into the killer applications, considered as the major driving force behind 5G. In order to understand the improved user experience, we provide highlights of new QoS, QoE, and SON features associated with the 5G evolution. For alleviating the increased network energy consumption and operating expenditure, we provide a detailed review of energy awareness and cost efficiency. As understanding the current status of 5G implementation is important for its eventual commercialization, we also discuss relevant field trials, drive tests, and simulation experiments.
Finally, we point out major existing research issues and identify possible future research directions.

2,624 citations

Journal ArticleDOI
TL;DR: Technical details of emerging 5G networks, in line with pressing IoT requirements, are presented as essential for the ultimate shaping of connected living, and the limitations of legacy networks in providing for the peculiarities of IoT requirements are delineated.
Abstract: Connected living – the true vision of the Internet of Things (IoT) – offers improvement in the quality of life while presenting new business avenues. A combined effort by researchers, industries, m...

83 citations

Journal ArticleDOI
TL;DR: In this article, the authors survey and consolidate the 4G-5G interworking solutions, providing insight into the various interworking possibilities and their challenges, and discuss spectrum sharing possibilities between 4G and 5G wireless networks.
Abstract: Rising popularity of 5G communications is making tremendous demands on the cellular network operators for providing true 5G services to the users. With limited numbers of 5G users initially, the investments for 5G services can be very high. In the early stage of 5G deployments, the 5G cells would not be lavishly spread and there would be 5G coverage holes. The operators can provide seamless services to the 5G users by interworking with the existing 4G Long-Term Evolution (LTE) network. The 5G interworking with fully deployed LTE would not only provide fast and seamless coverage but would also provide economic viability to the network operators. In this paper, we survey and consolidate the 4G-5G interworking solutions that can assist in attaining insight into various interworking possibilities and their challenges. It is important that a network operator is able to optimize its deployed infrastructure while being able to guarantee a fast and seamless transition to 5G for its subscribers. In this regard, we evaluate the performance and radio resource management challenges for different 4G-5G dual connectivity options proposed by the 3rd Generation Partnership Project (3GPP) standardization. We also discuss spectrum sharing possibilities between 4G and 5G wireless networks. Finally, various research challenges and discussions on the path for migration to 5G standalone networks are also presented.

83 citations

Journal ArticleDOI
TL;DR: This work proposes to exploit dual connectivity of the UE, to both the LTE eNB and the NR nodeB, for effective 5G DRX, where beam searching is performed only when necessary, and achieves a 13% improvement in power saving for HD-DRX compared with directional-DRX.
Abstract: For the mmWave directional air interface expected in 5G communications, current discontinuous reception (DRX) mechanisms would be inadequate. Beam searching, for alignment of beams at the User Equipment (UE) and the 5G base station (NR nodeB), cannot be avoided in directional communication. We propose to exploit dual connectivity of the UE, to both the LTE eNB and the NR nodeB, for effective 5G DRX. We present a novel hybrid directional-DRX (HD-DRX) mechanism, where beam searching is performed only when necessary. A probabilistic estimate of power saving and delay is conducted by capturing the various states of the UE through a semi-Markov process. Our numerical analysis achieves a 13% improvement in power saving for HD-DRX compared with directional-DRX. We validate our numerical analysis with simulation studies on a real traffic trace.
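The semi-Markov estimate described in the abstract can be sketched numerically. In a minimal version of such a model, the stationary distribution of the embedded chain is weighted by the mean holding times to obtain time-stationary state probabilities, from which an average power and a power-saving factor follow. All states, transition probabilities, holding times, and power figures below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Illustrative UE states for a directional-DRX cycle (assumed, not from the paper).
states = ["active", "sleep", "beam_search"]
P = np.array([            # embedded-chain transition probabilities
    [0.0, 1.0, 0.0],      # active -> sleep after inactivity
    [0.3, 0.0, 0.7],      # sleep -> active directly, or via beam search
    [1.0, 0.0, 0.0],      # beam search -> active once beams are realigned
])
tau = np.array([10.0, 40.0, 5.0])       # mean holding time per state (ms, assumed)
power = np.array([500.0, 20.0, 350.0])  # mean power draw per state (mW, assumed)

# Stationary distribution of the embedded chain: solve pi P = pi with sum(pi) = 1.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# A semi-Markov process spends time in each state proportional to pi_i * tau_i.
theta = pi * tau / np.dot(pi, tau)

avg_power = np.dot(theta, power)
saving = 1.0 - avg_power / power[0]  # power saving relative to an always-active UE
print(f"average power: {avg_power:.1f} mW, power saving: {saving:.1%}")
```

The same machinery extends to the paper's richer state sets; only the transition matrix, holding times, and power figures change.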

38 citations

Journal ArticleDOI
TL;DR: New Directional-Discontinuous Reception (DDRX) for directional air interface expected in mmWave enabled 5G communications is introduced and three new DDRX mechanisms are proposed to limit the impediments on power saving.
Abstract: Emerging mmWave-enabled 5G wireless communications must support a tremendous increase in data demands and better quality expectations. However, high mmWave frequencies call for directional beamforming to overcome propagation limitations and enhance spatial capabilities. The establishment of this new directional paradigm challenges the current Discontinuous Reception (DRX) mechanism for power saving in the User Equipment (UE). In this article, we introduce a new Directional-Discontinuous Reception (DDRX) mechanism for the directional air interface expected in mmWave-enabled 5G communications. The DDRX mechanism emphasizes the importance of beam searching, for alignment of directional beams between the UE and the 5G base station (gNB), after every sleep cycle. Beam searching, though inevitable, reduces the effective sleep time. We propose three new DDRX mechanisms, Integrated DDRX (I-DDRX), Standalone DDRX (S-DDRX), and Cooperative DDRX (C-DDRX), to limit the impediments on power saving. Probabilistic estimation of the UE's power saving and delay is performed for all three DDRX mechanisms using a semi-Markov process. The power saving achieved in I-DDRX is 12.1 percent higher than in S-DDRX, and the power saving of C-DDRX is 6.1 percent higher than in S-DDRX. Analytical results of the DDRX proposals are validated by simulation studies performed over a real wireless trace.
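The abstract's point that mandatory beam searching "reduces the effective sleep time" can be illustrated with simple cycle arithmetic: if every wake-up must be preceded by a beam search, the search time is paid out of each DRX cycle. All durations below are assumed for illustration and are not taken from the paper:

```python
# One DRX cycle of a directional UE (all durations in ms, assumed values).
on_duration = 4.0    # UE monitors the control channel
sleep_time = 36.0    # UE sleeps with its radio chain powered down
beam_search = 6.0    # beam realignment required after waking in a directional system

# Conventional DRX needs no beam search; directional DRX pays for it every cycle.
plain_drx_sleep_frac = sleep_time / (on_duration + sleep_time)
ddrx_cycle = on_duration + sleep_time + beam_search
ddrx_sleep_frac = sleep_time / ddrx_cycle

print(f"sleep fraction without beam search: {plain_drx_sleep_frac:.1%}")
print(f"sleep fraction with beam search:    {ddrx_sleep_frac:.1%}")
```

The gap between the two fractions is the overhead the paper's I-DDRX, S-DDRX, and C-DDRX variants try to limit, by differing in when and how the beam search is performed.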

28 citations


Cited by
Journal ArticleDOI
TL;DR: A comprehensive survey, analyzing how edge computing improves the performance of IoT networks and considers security issues in edge computing, evaluating the availability, integrity, and the confidentiality of security strategies of each group, and proposing a framework for security evaluation of IoT Networks with edge computing.
Abstract: The Internet of Things (IoT) now permeates our daily lives, providing important measurement and collection tools to inform our every decision. Millions of sensors and devices are continuously producing data and exchanging important messages via complex networks supporting machine-to-machine communications and monitoring and controlling critical smart-world infrastructures. As a strategy to mitigate the escalation in resource congestion, edge computing has emerged as a new paradigm to solve IoT and localized computing needs. Compared with the well-known cloud computing, edge computing will migrate data computation or storage to the network “edge,” near the end users. Thus, a number of computation nodes distributed across the network can offload the computational stress away from the centralized data center, and can significantly reduce the latency in message exchange. In addition, the distributed structure can balance network traffic and avoid the traffic peaks in IoT networks, reducing the transmission latency between edge/cloudlet servers and end users, as well as reducing response times for real-time IoT applications in comparison with traditional cloud services. Furthermore, by transferring computation and communication overhead from nodes with limited battery supply to nodes with significant power resources, the system can extend the lifetime of the individual nodes. In this paper, we conduct a comprehensive survey, analyzing how edge computing improves the performance of IoT networks. We categorize edge computing into different groups based on architecture, and study their performance by comparing network latency, bandwidth occupation, energy consumption, and overhead. In addition, we consider security issues in edge computing, evaluating the availability, integrity, and the confidentiality of security strategies of each group, and propose a framework for security evaluation of IoT networks with edge computing. 
Finally, we compare the performance of various IoT applications (smart city, smart grid, smart transportation, and so on) in edge computing and traditional cloud computing architectures.

1,008 citations

Journal ArticleDOI
TL;DR: This paper bridges the gap between deep learning and mobile and wireless networking research by presenting a comprehensive survey of the crossovers between the two areas, and provides an encyclopedic review of mobile and wireless networking research based on deep learning, categorized by different domains.
Abstract: The rapid uptake of mobile devices and the rising popularity of mobile applications and services pose unprecedented demands on mobile and wireless networking infrastructure. Upcoming 5G systems are evolving to support exploding mobile traffic volumes, real-time extraction of fine-grained analytics, and agile management of network resources, so as to maximize user experience. Fulfilling these tasks is challenging, as mobile environments are increasingly complex, heterogeneous, and evolving. One potential solution is to resort to advanced machine learning techniques, in order to help manage the rise in data volumes and algorithm-driven applications. The recent success of deep learning underpins new and powerful tools that tackle problems in this space. In this paper, we bridge the gap between deep learning and mobile and wireless networking research, by presenting a comprehensive survey of the crossovers between the two areas. We first briefly introduce essential background and state-of-the-art in deep learning techniques with potential applications to networking. We then discuss several techniques and platforms that facilitate the efficient deployment of deep learning onto mobile systems. Subsequently, we provide an encyclopedic review of mobile and wireless networking research based on deep learning, which we categorize by different domains. Drawing from our experience, we discuss how to tailor deep learning to mobile environments. We complete this survey by pinpointing current challenges and open future directions for research.

975 citations

Journal ArticleDOI
TL;DR: This article provides a comprehensive review of emerging and enabling technologies related to the 5G system that enable IoT, such as 5G new radio, multiple-input–multiple-output antennas with beamforming technology, mm-wave communication technology, heterogeneous networks (HetNets), and the role of augmented reality (AR) in IoT, which are discussed in detail.
Abstract: Recently, wireless technologies have been growing actively all around the world. In the context of wireless technology, fifth-generation (5G) technology has become the most challenging and interesting topic in wireless research. This article provides an overview of the Internet of Things (IoT) in 5G wireless systems. IoT in the 5G system will be a game changer for the future generation. It will open the door for new wireless architectures and smart services. The current 4G LTE cellular network will not be sufficient or efficient to meet the demands of multiple-device connectivity, high data rates, more bandwidth, low-latency quality of service (QoS), and low interference. To address these challenges, we consider 5G the most promising technology. We provide a detailed overview of the challenges and the vision of various communication industries for 5G IoT systems. The different layers in 5G IoT systems are discussed in detail. This article provides a comprehensive review of emerging and enabling technologies related to the 5G system that enable IoT. We consider the technology drivers for 5G wireless technology, such as 5G new radio (NR), multiple-input–multiple-output (MIMO) antennas with beamforming technology, mm-wave communication technology, heterogeneous networks (HetNets), and the role of augmented reality (AR) in IoT, which are discussed in detail. We also provide a review of low-power wide-area networks (LPWANs), security challenges, and their control measures in the 5G IoT scenario. This article also discusses the research gaps and future directions, with a focus on application areas of IoT in 5G systems. We, therefore, outline some of the important research directions in 5G IoT.

896 citations

Journal ArticleDOI
TL;DR: This survey makes an exhaustive review on the state-of-the-art research efforts on mobile edge networks, including definition, architecture, and advantages, and presents a comprehensive survey of issues on computing, caching, and communication techniques at the network edge.
Abstract: With the explosive growth of smart devices and the advent of many new applications, traffic volume has been growing exponentially. The traditional centralized network architecture cannot accommodate such user demands due to the heavy burden on backhaul links and long latency. Therefore, new architectures that bring network functions and contents to the network edge, i.e., mobile edge computing and caching, are proposed. Mobile edge networks provide cloud computing and caching capabilities at the edge of cellular networks. In this survey, we make an exhaustive review of the state-of-the-art research efforts on mobile edge networks. We first give an overview of mobile edge networks, including their definition, architecture, and advantages. Next, a comprehensive survey of issues on computing, caching, and communication techniques at the network edge is presented. The applications and use cases of mobile edge networks are discussed. Subsequently, the key enablers of mobile edge networks, such as cloud technology, SDN/NFV, and smart devices, are discussed. Finally, open research challenges and future directions are presented as well.

782 citations

Journal ArticleDOI
TL;DR: This paper constitutes the first holistic tutorial on the development of ANN-based ML techniques tailored to the needs of future wireless networks, providing an overview of how artificial neural network (ANN)-based ML algorithms can be employed to solve various wireless networking problems.
Abstract: In order to effectively provide ultra-reliable low-latency communications and pervasive connectivity for Internet of Things (IoT) devices, next-generation wireless networks can leverage intelligent, data-driven functions enabled by the integration of machine learning (ML) notions across the wireless core and edge infrastructure. In this context, this paper provides a comprehensive tutorial that overviews how artificial neural network (ANN)-based ML algorithms can be employed for solving various wireless networking problems. For this purpose, we first present a detailed overview of a number of key types of ANNs, including recurrent, spiking, and deep neural networks, that are pertinent to wireless networking applications. For each type of ANN, we present the basic architecture as well as specific examples that are particularly important and relevant to wireless network design. Such ANN examples include echo state networks, liquid state machines, and long short-term memory. Then, we provide an in-depth overview of the variety of wireless communication problems that can be addressed using ANNs, ranging from communication using unmanned aerial vehicles to virtual reality applications over wireless networks, as well as edge computing and caching. For each individual application, we present the main motivation for using ANNs along with the associated challenges, and we also provide a detailed example for a use case scenario and outline future works that can be addressed using ANNs. In a nutshell, this paper constitutes the first holistic tutorial on the development of ANN-based ML techniques tailored to the needs of future wireless networks.

666 citations