scispace - formally typeset
Author

Amit Bhardwaj

Bio: Amit Bhardwaj is an academic researcher from the Indian Institute of Technology Bombay. The author has contributed to research in the topics of Haptic technology and Random forest. The author has an h-index of 6 and has co-authored 22 publications receiving 210 citations. Previous affiliations of Amit Bhardwaj include the Indian Institutes of Technology and Technische Universität München.

Papers
Journal ArticleDOI
08 Jan 2019
TL;DR: This paper discusses aspects of the framework, such as the TI architecture it creates (including its elements, functions, interfaces, and other considerations), as well as the novel aspects and differentiating factors compared with, e.g., 5G Ultra-Reliable Low-Latency Communication.
Abstract: The IEEE “Tactile Internet” (TI) Standards working group (WG), designated the numbering IEEE 1918.1, undertakes pioneering work on the development of standards for the TI. This paper describes the WG, its intentions, and its developing baseline standard and the associated reasoning behind that and touches on a further standard already initiated under its scope: IEEE 1918.1.1 on “Haptic Codecs for the TI.” IEEE 1918.1 and its baseline standard aim to set the framework and act as the foundations for the TI, thereby also serving as a basis for further standards developed on TI within the WG. This paper discusses the aspects of the framework such as its created TI architecture, including the elements, functions, interfaces, and other considerations therein, as well as the novel aspects and differentiating factors compared with, e.g., 5G Ultra-Reliable Low-Latency Communication, where it is noted that the TI will likely operate as an overlay on other networks or combinations of networks. Key foundations of the WG and its baseline standard are also highlighted, including the intended use cases and associated requirements that the standard must serve, and the TI’s fundamental definition and assumptions as understood by the WG, among other aspects.

113 citations

Journal ArticleDOI
01 Feb 2019
TL;DR: In this article, the authors present the fundamentals and state of the art in haptic codec design for the Tactile Internet and discuss how limitations of the human haptic perception system can be exploited for efficient perceptual coding of kinesthetic and tactile information.
Abstract: The Tactile Internet will enable users to physically explore remote environments and to make their skills available across distances. An important technological aspect in this context is the acquisition, compression, transmission, and display of haptic information. In this paper, we present the fundamentals and state of the art in haptic codec design for the Tactile Internet. The discussion covers both kinesthetic data reduction and tactile signal compression approaches. We put a special focus on how limitations of the human haptic perception system can be exploited for efficient perceptual coding of kinesthetic and tactile information. Further aspects addressed in this paper are the multiplexing of audio and video with haptic information and the quality evaluation of haptic communication solutions. Finally, we describe the current status of the ongoing IEEE standardization activity P1918.1.1 which has the ambition to standardize the first set of codecs for kinesthetic and tactile information exchange across communication networks.

104 citations

Journal ArticleDOI
TL;DR: In this paper, near-infrared observations of population II Cepheids in the Galactic bulge from the VVV survey were used to derive period-luminosity relations after correcting mean magnitudes for extinction.
Abstract: We present the near-infrared observations of population II Cepheids in the Galactic bulge from the VVV survey. We identify 340 population II Cepheids in the Galactic bulge from the VVV survey based on their match with the OGLE-III Catalogue. The single-epoch $JH$ and multi-epoch $K_s$ observations complement the accurate periods and optical $(VI)$ mean-magnitudes from OGLE. The sample, consisting of BL Herculis and W Virginis subtypes, is used to derive period-luminosity relations after correcting mean-magnitudes for extinction. Our $K_s$-band period-luminosity relation, $K_s = -2.189(0.056)~[\log(P) - 1] + 11.187(0.032)$, is consistent with published work for BL Herculis and W Virginis variables in the Large Magellanic Cloud. We present a combined OGLE-III and VVV catalogue with periods, classification, mean magnitudes and extinction for 264 Galactic bulge population II Cepheids having good-quality $K_s$-band light curves. The absolute magnitudes for population II Cepheids and RR Lyraes calibrated using Gaia and Hubble Space Telescope parallaxes, together with calibrated magnitudes for Large Magellanic Cloud population II Cepheids, are used to obtain a distance to the Galactic center, $R_0=8.34\pm0.03{\mathrm{(stat.)}}\pm0.41{\mathrm{(syst.)}}$, which changes by $^{+ 0.05}_{-0.25}$ with different extinction laws. While noting the limitation of small number statistics, we find that the present sample of population II Cepheids in the Galactic bulge shows a nearly spheroidal spatial distribution, similar to metal-poor RR Lyrae variables. We do not find evidence of the inclined bar as traced by the metal-rich red-clump stars. Population II Cepheids are less numerous than the abundant RR Lyraes, but they are bright and exhibit a wide range in period, which provides a robust period-luminosity relation for an accurate estimate of the distance to the Galactic center.
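For context, the quoted $K_s$-band period-luminosity relation can be evaluated directly. A minimal sketch (the function name is ours; the coefficients are those quoted in the abstract, with uncertainties omitted):

```python
import math

# Period-luminosity relation quoted in the abstract (K_s band, population II Cepheids):
#   K_s = -2.189 * (log10(P) - 1) + 11.187
SLOPE, ZERO_POINT = -2.189, 11.187

def predicted_ks(period_days):
    """Predicted mean K_s magnitude for a bulge population II Cepheid of the given period."""
    return SLOPE * (math.log10(period_days) - 1.0) + ZERO_POINT

# A 10-day pulsator sits exactly at the zero point of the relation:
print(round(predicted_ks(10.0), 3))   # -> 11.187
```

Longer-period Cepheids come out brighter (smaller magnitude), as the negative slope implies.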

34 citations

Proceedings ArticleDOI
27 May 2013
TL;DR: An experimental setup is described in which users respond with a click to piecewise-constant haptic stimuli; the results suggest adaptive sampling schemes that improve over Weber sampling.
Abstract: In applications such as telesurgery, it is required to transmit haptic signals to a remote location with a delay of at most a few milliseconds. To reduce the packet rate and yet retain perceptual quality, adaptive sampling has been explored in the literature. In particular, in earlier work we proposed and analyzed an adaptive sampling scheme based on Weber's law of perception. In this paper, we explore other possible adaptive sampling candidates. We describe an experimental setup where users are subjected to piecewise-constant haptic stimuli to which they can respond with a click. We record the clicks and ask the question: can we identify signal features and classifiers to predict the clicks? The answer suggests adaptive sampling schemes that improve over Weber sampling.
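The Weber-sampling baseline mentioned above can be sketched as a simple deadband scheme: a new sample is transmitted only when the relative change since the last transmitted value exceeds a Weber fraction. A minimal illustration (function name and threshold value are ours, not from the paper):

```python
def weber_sample(signal, weber_fraction=0.1):
    """Deadband sampling based on Weber's law: emit a sample only when the
    relative change since the last transmitted value exceeds the Weber
    fraction. Returns the transmitted samples as (index, value) pairs."""
    if not signal:
        return []
    transmitted = [(0, signal[0])]   # always send the first sample
    last = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        # Transmit if the change is perceptually significant relative to `last`.
        if abs(x - last) > weber_fraction * abs(last):
            transmitted.append((i, x))
            last = x
    return transmitted

# A slowly drifting force signal: only the perceptually significant jumps survive.
sig = [1.0, 1.02, 1.05, 1.2, 1.21, 0.8]
print(weber_sample(sig))  # -> [(0, 1.0), (3, 1.2), (5, 0.8)]
```

With a 10% Weber fraction, the small drifts are suppressed and the packet rate drops from six samples to three.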

18 citations

Proceedings ArticleDOI
01 Oct 2017
TL;DR: A reference hardware and software setup for the evaluation of kinesthetic codecs is proposed and a typical teleoperation scenario in a virtual environment for the realization of closed-loop kinesthetic interactions is defined.
Abstract: Recently, the IEEE P1918.1 standardization activity has been initiated to define a framework for the Tactile Internet. Within this activity, IEEE P1918.1.1 is a task group for the standardization of Haptic Codecs for the Tactile Internet. The primary goal of the task group is to define and develop codecs for both closed-loop (kinesthetic information exchange) and open-loop (tactile information exchange) communications. In this paper, we propose a reference hardware and software setup for the evaluation of kinesthetic codecs. The setup defines a typical teleoperation scenario in a virtual environment for the realization of closed-loop kinesthetic interactions. We provide detailed guidelines for the installation and testing of the setup. The paper also provides sample data traces for both static and dynamic kinesthetic interactions, which may be used for preliminary testing of kinesthetic codecs, as well as links to download both the reference setup and the data traces.

18 citations


Cited by
Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, handwriting recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules.
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
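The mail-filtering example can be made concrete with a toy naive Bayes learner that infers per-user filtering rules from labeled examples. This is an illustrative sketch only (not from the article; all names and data are ours):

```python
from collections import Counter
import math

def train(messages, labels):
    """Count word occurrences per class from (message, label) examples."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter(labels)
    for text, label in zip(messages, labels):
        counts[label].update(text.lower().split())
    return counts, totals

def classify(text, counts, totals):
    """Pick the class maximizing log P(class) + sum of log P(word | class),
    using add-one (Laplace) smoothing for unseen words."""
    vocab = set(counts["spam"]) | set(counts["ham"])
    best, best_score = None, -math.inf
    for label in ("spam", "ham"):
        n = sum(counts[label].values())
        score = math.log(totals[label] / sum(totals.values()))
        for w in text.lower().split():
            score += math.log((counts[label][w] + 1) / (n + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

# Each user supplies their own labeled examples; the filter adapts accordingly.
msgs = ["win money now", "cheap money offer", "meeting at noon", "lunch at noon"]
labels = ["spam", "spam", "ham", "ham"]
counts, totals = train(msgs, labels)
print(classify("free money", counts, totals))  # -> spam
```

As the abstract notes, the user never writes rules by hand: retraining on newly rejected messages keeps the filter up to date automatically.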

13,246 citations

Journal ArticleDOI
TL;DR: The Internet of Nano Things and the Tactile Internet are driving innovation in H-IoT applications; the future course for improving Quality of Service (QoS) using these new technologies is identified.
Abstract: The impact of the Internet of Things (IoT) on the advancement of the healthcare industry is immense. The ushering in of Medicine 4.0 has resulted in an increased effort to develop platforms, both at the hardware level and at the underlying software level. This vision has led to the development of Healthcare IoT (H-IoT) systems. The basic enabling technologies include the communication systems between the sensing nodes and the processors, and the processing algorithms for generating an output from the data collected by the sensors. However, at present, these enabling technologies are also supported by several new technologies. The use of Artificial Intelligence (AI) has transformed H-IoT systems at almost every level. The fog/edge paradigm brings computing power close to the deployed network and hence mitigates many challenges in the process, while big data techniques allow handling enormous amounts of data. Additionally, Software Defined Networks (SDNs) bring flexibility to the system, while blockchains are finding novel use cases in H-IoT systems. The Internet of Nano Things (IoNT) and the Tactile Internet (TI) are driving innovation in H-IoT applications. This paper delves into the ways these technologies are transforming H-IoT systems and also identifies the future course for improving the Quality of Service (QoS) using these new technologies.

446 citations

Journal ArticleDOI
24 Apr 2020-Symmetry
TL;DR: This study highlights the most promising lines of research from the recent literature in common directions for the 6G project, exploring the critical issues and key potential features of 6G communications and contributing significantly to opening new horizons for future research directions.
Abstract: The standardization activities of the fifth generation communications are clearly over and deployment has commenced globally. To sustain the competitive edge of wireless networks, industrial and academia synergy have begun to conceptualize the next generation of wireless communication systems (namely, sixth generation, (6G)) aimed at laying the foundation for the stratification of the communication needs of the 2030s. In support of this vision, this study highlights the most promising lines of research from the recent literature in common directions for the 6G project. Its core contribution involves exploring the critical issues and key potential features of 6G communications, including: (i) vision and key features; (ii) challenges and potential solutions; and (iii) research activities. These controversial research topics were profoundly examined in relation to the motivation of their various sub-domains to achieve a precise, concrete, and concise conclusion. Thus, this article will contribute significantly to opening new horizons for future research directions.

207 citations

Journal ArticleDOI
04 Mar 2021
TL;DR: In this paper, the authors discuss the potential of applying supervised/unsupervised deep learning and deep reinforcement learning in ultrareliable and low-latency communications (URLLCs) in future 6G networks.
Abstract: As one of the key communication scenarios in the fifth-generation and also the sixth-generation (6G) mobile communication networks, ultrareliable and low-latency communications (URLLCs) will be central for the development of various emerging mission-critical applications. State-of-the-art mobile communication systems do not fulfill the end-to-end delay and overall reliability requirements of URLLCs. In particular, a holistic framework that takes into account latency, reliability, availability, scalability, and decision-making under uncertainty is lacking. Driven by recent breakthroughs in deep neural networks, deep learning algorithms have been considered as promising ways of developing enabling technologies for URLLCs in future 6G networks. This tutorial illustrates how domain knowledge (models, analytical tools, and optimization frameworks) of communications and networking can be integrated into different kinds of deep learning algorithms for URLLCs. We first provide some background of URLLCs and review promising network architectures and deep learning frameworks for 6G. To better illustrate how to improve learning algorithms with domain knowledge, we revisit model-based analytical tools and cross-layer optimization frameworks for URLLCs. Following this, we examine the potential of applying supervised/unsupervised deep learning and deep reinforcement learning in URLLCs and summarize related open problems. Finally, we provide simulation and experimental results to validate the effectiveness of different learning algorithms and discuss future directions.

203 citations