Author

Saeed R. Khosravirad

Bio: Saeed R. Khosravirad is an academic researcher from Bell Labs. The author has contributed to research in topics: Communication channel & Computer science. The author has an h-index of 12 and has co-authored 42 publications receiving 459 citations. Previous affiliations of Saeed R. Khosravirad include McGill University & Nokia Networks.


Papers
Journal ArticleDOI
TL;DR: Numerical examples obtained in a Rayleigh-fading channel show that rate adaptation provides notable gains over rate allocation and non-adaptive HARQ, and that, for high SNR, only a few transmissions are necessary to closely approach the ergodic capacity.
Abstract: This paper considers incremental redundancy hybrid ARQ (HARQ) transmission over independent block-fading channels. The transmitter, having no knowledge of the instantaneous channel state information (CSI), can either (i) allocate the transmission rates knowing only the statistics of the channel, or (ii) adapt the transmission rates using outdated CSI, i.e., the CSI experienced by the receiver in past transmissions that resulted in a packet decoding failure. Aiming at throughput maximization under a constraint on the outage probability, we show how to optimize the rate-adaptation and rate-allocation policies using a dynamic programming framework. Numerical examples obtained in a Rayleigh-fading channel show that rate adaptation provides notable gains over rate allocation and non-adaptive HARQ, and that, for high SNR, only a few transmissions are necessary to closely approach the ergodic capacity.
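For illustration only, the sketch below estimates the throughput of the non-adaptive IR-HARQ baseline over Rayleigh block fading using a Gaussian-codebook mutual-information decoding model, then picks the throughput-maximizing initial rate by grid search. The SNR, rate grid, and round limit are assumed values; the paper's dynamic-programming rate adaptation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def harq_throughput(R, snr, K, n_mc=20000):
    """Monte Carlo throughput of truncated IR-HARQ with K equal-length rounds
    over i.i.d. Rayleigh block fading (toy model): decoding succeeds once the
    accumulated mutual information reaches the initial-transmission rate R."""
    gains = rng.exponential(1.0, size=(n_mc, K))        # Rayleigh power gains
    mi = np.cumsum(np.log2(1.0 + snr * gains), axis=1)  # accumulated mutual info
    decoded = mi[:, -1] >= R                            # success within K rounds
    first_ok = np.argmax(mi >= R, axis=1) + 1           # first decodable round
    rounds_used = np.where(decoded, first_ok, K)        # failures spend all K rounds
    return R * decoded.mean() / rounds_used.mean()      # renewal-reward throughput

snr = 10 ** (10.0 / 10)                                 # 10 dB average SNR, assumed
rates = np.linspace(0.5, 8.0, 31)                       # candidate initial rates
best = max(rates, key=lambda R: harq_throughput(R, snr, K=4))
print(f"throughput-maximizing initial rate ~ {best:.2f} bit/s/Hz")
```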

114 citations

Proceedings ArticleDOI
01 Sep 2017
TL;DR: A punctured scheduling scheme is presented for efficient transmission of low latency communication (LLC) traffic multiplexed on a downlink shared channel with enhanced mobile broadband (eMBB) traffic, and it is found advantageous to include an element of eMBB-awareness in the scheduling decisions for the LLC transmissions.
Abstract: In this paper, we present a punctured scheduling scheme for efficient transmission of low latency communication (LLC) traffic, multiplexed on a downlink shared channel with enhanced mobile broadband (eMBB) traffic. Puncturing allows eMBB traffic to be scheduled on all shared channel resources, without prior reservation of transmission resources for sporadically arriving LLC traffic. When LLC traffic arrives, it is immediately scheduled with a short transmission by puncturing part of the ongoing eMBB transmissions. To make this work efficiently, we propose recovery mechanisms for punctured eMBB transmissions, as well as a service-specific scheduling policy and link adaptation. Among other findings, we show that it is advantageous to include an element of eMBB-awareness in the scheduling decisions for the LLC transmissions (i.e., those that puncture ongoing eMBB transmissions), so as to primarily puncture eMBB transmission(s) that use a low modulation and coding scheme index. System level simulations are presented to demonstrate the benefits of the proposed solution.
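To make the eMBB-aware puncturing decision concrete, here is a minimal sketch that takes the resources for an arriving LLC packet from the ongoing eMBB transmissions with the lowest modulation and coding scheme (MCS) index first. The data structures and field names are assumptions for the example, not the paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class EmbbAllocation:
    user_id: int
    mcs_index: int   # lower index = more robust modulation and coding
    prbs: list       # physical resource blocks held by this eMBB transmission

def select_puncture_victims(ongoing, prbs_needed):
    """eMBB-aware puncturing: take resources first from transmissions with the
    lowest MCS index, which tolerate punctured resource elements best."""
    victims = []
    for alloc in sorted(ongoing, key=lambda a: a.mcs_index):
        take = min(prbs_needed, len(alloc.prbs))
        if take:
            victims.append((alloc.user_id, alloc.prbs[:take]))
            prbs_needed -= take
        if prbs_needed == 0:
            break
    return victims  # (eMBB user, PRBs punctured for the arriving LLC packet)

# Example: an LLC packet needing 3 PRBs punctures the most robust eMBB user first
ongoing = [EmbbAllocation(1, mcs_index=22, prbs=[0, 1, 2, 3]),
           EmbbAllocation(2, mcs_index=7, prbs=[4, 5])]
print(select_puncture_victims(ongoing, prbs_needed=3))
```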

96 citations

Posted Content
TL;DR: In this paper, the authors provide a comprehensive survey on existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, as well as non-terrestrial networks.
Abstract: The next wave of wireless technologies is proliferating in connecting things among themselves as well as to humans. In the era of the Internet of things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from this data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey on existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, as well as non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum are presented. Aligned with the main key performance indicators of 5G and beyond 5G networks, we investigate solutions and standards that enable energy efficiency, reliability, low latency, and scalability (connection density) of current and future IoT networks. The solutions include grant-free access and channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies like artificial intelligence, non-terrestrial networks, and new spectra is elaborated. Finally, future research directions toward beyond 5G IoT networks are pointed out.

69 citations

Journal ArticleDOI
TL;DR: This paper addresses the latency/reliability optimization problem in the digital twin-enabled metaverse by optimizing various communication and computation variables, namely offloading portions, edge caching policies, bandwidth allocation, transmit power, and the computation resources of user devices and edge servers.
Abstract: In this letter, we propose a novel digital twin scheme to support the metaverse by jointly considering an integrated model of communications, computing, and storage through the employment of mobile edge computing (MEC) and ultra-reliable and low latency communications (URLLC). The MEC-based URLLC digital twin architecture is proposed to provide a powerful computing infrastructure by exploiting task offloading and task caching techniques in nearby edge servers to reduce the latency. In addition, the proposed digital twin scheme can guarantee stringent requirements of reliability and low latency, which are highly applicable to the future networked systems of the metaverse. For the first time in the literature, our paper addresses the latency/reliability optimization problem in the digital twin-enabled metaverse by optimizing various communication and computation variables, namely offloading portions, edge caching policies, bandwidth allocation, transmit power, and the computation resources of user devices and edge servers. The proposed scheme can improve the quality-of-experience of the digital twin in terms of latency and reliability for metaverse applications.
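As a hedged illustration of the kind of latency model such a joint optimization works with, the toy function below computes task latency under partial offloading, where the local and offloaded parts run in parallel. All parameter names and values are assumptions for the example, not the letter's actual model.

```python
def offloading_latency(task_bits, cycles_per_bit, x, f_local, f_edge, uplink_rate):
    """Toy partial-offloading latency: a fraction x of the task is offloaded.
    Latency is the max of (i) local computing time for the remaining part and
    (ii) uplink transmission plus edge computing time for the offloaded part."""
    t_local = (1 - x) * task_bits * cycles_per_bit / f_local
    t_edge = x * task_bits / uplink_rate + x * task_bits * cycles_per_bit / f_edge
    return max(t_local, t_edge)

# Example: 1 Mbit task, 500 cycles/bit, 40% offloaded, 1 GHz device CPU,
# 10 GHz edge server, 50 Mb/s uplink (all assumed values)
print(f"{offloading_latency(1e6, 500, 0.4, 1e9, 10e9, 50e6) * 1e3:.1f} ms latency")
```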

44 citations

Proceedings ArticleDOI
15 May 2016
TL;DR: This paper investigates the feasibility of using early Hybrid Automatic Repeat reQuest (HARQ) feedback to reduce the latency of acknowledged transmissions without disregarding spectral efficiency.
Abstract: Besides coping with the increasing demand for broadband services, 5th Generation (5G) radio access technology is expected to support mission critical communication (MCC) services targeting very low latencies. In this paper, we investigate the feasibility of using early Hybrid Automatic Repeat reQuest (HARQ) feedback to reduce the latency of acknowledged transmissions without disregarding spectral efficiency. As an enabler of such early feedback, a new technique is proposed for predicting the decoder outcome before decoding occurs. This technique is intended to generate an early ACK/NACK, or an uncertain feedback in case a reliable prediction cannot be achieved. Simulation results show a very limited occurrence of false positives and false negatives, and uncertain feedback rates not exceeding 6% when adaptive modulation and coding (AMC) and a 10% Block Error Rate (BLER) target are assumed. A correct early ACK/NACK can be generated for around 90% of the transmissions or more.
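The early-feedback idea can be sketched as a threshold test on a decoder-side soft metric computed before full decoding. The mean-absolute-LLR metric and thresholds below are illustrative assumptions, not the paper's actual predictor.

```python
import numpy as np

def early_feedback(llrs, ack_thresh=4.0, nack_thresh=1.5):
    """Predict the decoder outcome from soft demapper output (LLRs) before
    decoding completes. A confident prediction yields an early ACK or NACK;
    otherwise an 'uncertain' feedback defers to the regular HARQ feedback.
    Thresholds are illustrative and would be tuned per MCS in practice."""
    reliability = float(np.mean(np.abs(llrs)))
    if reliability >= ack_thresh:
        return "ACK"        # decoding very likely to succeed
    if reliability <= nack_thresh:
        return "NACK"       # trigger an early retransmission
    return "UNCERTAIN"      # cannot predict reliably; wait for actual decoding

# Example with assumed LLR magnitudes for a well-received transport block
rng = np.random.default_rng(3)
print(early_feedback(rng.normal(0.0, 6.0, size=1024)))
```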

43 citations


Cited by
Book Chapter
01 Jan 2017
TL;DR: Considering the trend in 5G, achieving significant gains in capacity and system throughput performance is a high-priority requirement in view of the recent exponential increase in the volume of mobile traffic, and the proposed system should be able to support enhanced delay-sensitive high-volume services.
Abstract: Radio access technologies for cellular mobile communications are typically characterized by multiple access schemes, e.g., frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), and OFDMA. In the 4th generation (4G) mobile communication systems such as Long-Term Evolution (LTE) (Au et al., Uplink contention based SCMA for 5G radio access. Globecom Workshops (GC Wkshps), 2014. doi:10.1109/GLOCOMW.2014.7063547) and LTE-Advanced (Baracca et al., IEEE Trans. Commun., 2011. doi:10.1109/TCOMM.2011.121410.090252; Barry et al., Digital Communication, Kluwer, Dordrecht, 2004), standardized by the 3rd Generation Partnership Project (3GPP), orthogonal multiple access based on OFDMA or single carrier (SC)-FDMA is adopted. Orthogonal multiple access was a reasonable choice for achieving good system-level throughput performance with simple single-user detection. However, considering the trend in 5G, achieving significant gains in capacity and system throughput performance is a high-priority requirement in view of the recent exponential increase in the volume of mobile traffic. In addition, the proposed system should be able to support enhanced delay-sensitive high-volume services such as video streaming and cloud computing. Other high-level targets of 5G are reduced cost, higher energy efficiency, and robustness against emergencies.

635 citations

Journal ArticleDOI
TL;DR: In this paper, the major design aspects of such a cellular joint communication and sensing (JCAS) system are discussed, and an analysis of the waveform choice is presented, pointing towards using the waveform best suited for communication also for radar sensing.
Abstract: The 6G vision of creating authentic digital twin representations of the physical world calls for new sensing solutions to compose multi-layered maps of our environments. Radio sensing using the mobile communication network as a sensor has the potential to become an essential component of the solution. With the evolution of cellular systems to mmWave bands in 5G and potentially sub-THz bands in 6G, small cell deployments will begin to dominate. Large bandwidth systems deployed in small cell configurations provide an unprecedented opportunity to employ the mobile network for sensing. In this paper, we focus on the major design aspects of such a cellular joint communication and sensing (JCAS) system. We present an analysis of the waveform choice that points towards using the waveform best suited for communication also for radar sensing. We discuss several techniques for efficiently integrating the sensing capability into the JCAS system, some of which are applicable with the NR air interface for evolved 5G systems. Specifically, methods for reducing sensing overhead by appropriate sensing signal design or by configuring separate numerologies for communications and sensing are presented. Sophisticated use of the sensing signals is shown to reduce the signaling overhead by a factor of 2.67 for an exemplary road traffic monitoring use case. We then present a vision for future advanced JCAS systems building upon distributed massive MIMO and discuss various other research challenges for JCAS that need to be addressed in order to pave the way towards natively integrated JCAS in 6G.

223 citations

Journal ArticleDOI
TL;DR: This paper evaluates the relevant PHY and MAC techniques for their ability to improve reliability and reduce latency, and identifies that enabling Long-Term Evolution to coexist in the unlicensed spectrum is also a potential enabler of URLLC in the unlicensed band.
Abstract: Future 5th generation (5G) networks are expected to enable three key services: enhanced mobile broadband, massive machine type communications, and ultra-reliable and low latency communications (URLLC). As per the 3rd Generation Partnership Project URLLC requirements, it is expected that the reliability of one transmission of a 32-byte packet will be at least 99.999% and the latency will be at most 1 ms. This unprecedented level of reliability and latency will yield various new applications, such as smart grids, industrial automation and intelligent transport systems. In this survey, we present potential future URLLC applications, and summarize the corresponding reliability and latency requirements. We provide a comprehensive discussion on physical (PHY) and medium access control (MAC) layer techniques that enable URLLC, addressing both licensed and unlicensed bands. This paper evaluates the relevant PHY and MAC techniques for their ability to improve reliability and reduce latency. We identify that enabling Long-Term Evolution to coexist in the unlicensed spectrum is also a potential enabler of URLLC in the unlicensed band, and provide numerical evaluations. Lastly, this paper discusses the potential future research directions and challenges in achieving the URLLC requirements.

185 citations

Journal ArticleDOI
TL;DR: Simulation results show that the proposed approach allocates resources to the incoming URLLC traffic efficiently, while satisfying the reliability of both eMBB and URLLC.
Abstract: Ultra Reliable Low Latency Communication (URLLC) is a 5G New Radio (NR) application that requires strict reliability and latency. URLLC traffic is usually scheduled on top of the ongoing enhanced Mobile Broadband (eMBB) transmissions (i.e., puncturing the current eMBB transmission) and cannot be queued due to its hard latency requirements. In this letter, we propose a risk-sensitive formulation to allocate resources to the incoming URLLC traffic, while minimizing the risk for the eMBB transmissions (i.e., protecting the eMBB users with low data rate) and ensuring URLLC reliability. Specifically, the Conditional Value at Risk (CVaR) is introduced as a risk measure for eMBB transmission. Moreover, the reliability constraint of URLLC is formulated as a chance constraint and relaxed based on Markov's inequality. We decompose the formulated problem into two subproblems in order to transform it into a convex form, and then alternately solve them until convergence. Simulation results show that the proposed approach allocates resources to the incoming URLLC traffic efficiently, while satisfying the reliability of both eMBB and URLLC.
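To make the risk measure concrete, here is a small numerical sketch of the lower-tail Conditional Value at Risk (CVaR) of eMBB rates: the mean of the worst (1 - alpha) fraction of rate outcomes, which is what the formulation protects when incoming URLLC traffic punctures eMBB transmissions. The rate samples and alpha are assumed, illustrative values.

```python
import numpy as np

def cvar_lower_tail(rates, alpha=0.95):
    """Lower-tail CVaR: the mean of the worst (1 - alpha) fraction of eMBB
    rate outcomes. Keeping this tail high protects low-data-rate eMBB users."""
    rates = np.sort(np.asarray(rates))
    k = max(1, int(np.ceil((1 - alpha) * rates.size)))
    return rates[:k].mean()

# Illustrative eMBB rate samples (Mb/s) after URLLC puncturing, assumed values
samples = np.random.default_rng(7).gamma(shape=3.0, scale=5.0, size=1000)
print(f"mean rate   : {samples.mean():.2f} Mb/s")
print(f"5%-tail CVaR: {cvar_lower_tail(samples, alpha=0.95):.2f} Mb/s")
```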

177 citations

Journal ArticleDOI
TL;DR: Simulation results show that the proposed approach can satisfy the stringent URLLC reliability while keeping the eMBB reliability higher than 90%.
Abstract: In this paper, we study the resource slicing problem in a dynamic multiplexing scenario of two distinct 5G services, namely Ultra-Reliable Low Latency Communications (URLLC) and enhanced Mobile BroadBand (eMBB). While eMBB services focus on high data rates, URLLC is very strict in terms of latency and reliability. In view of this, the resource slicing problem is formulated as an optimization problem that aims at maximizing the eMBB data rate subject to a URLLC reliability constraint, while considering the variance of the eMBB data rate to reduce the impact of immediately scheduled URLLC traffic on the eMBB reliability. To solve the formulated problem, an optimization-aided Deep Reinforcement Learning (DRL) based framework is proposed, including: 1) an eMBB resource allocation phase, and 2) a URLLC scheduling phase. In the first phase, the optimization problem is decomposed into three subproblems and then each subproblem is transformed into a convex form to obtain an approximate resource allocation solution. In the second phase, a DRL-based algorithm is proposed to intelligently distribute the incoming URLLC traffic among eMBB users. Simulation results show that our proposed approach can satisfy the stringent URLLC reliability while keeping the eMBB reliability higher than 90%.

163 citations