Proceedings ArticleDOI

Can we improve over Weber sampling of haptic signals?

TL;DR: Describes an experimental setup in which users are subjected to piecewise-constant haptic stimuli to which they can respond with a click; identifying signal features and classifiers that predict the clicks suggests adaptive sampling schemes that improve over Weber sampling.
Abstract: In applications such as telesurgery, it is required to transmit haptic signals to a remote location with a delay of at most a few milliseconds. To reduce the packet rate and yet retain perceptual quality, adaptive sampling has been explored in the literature. In particular, in earlier work we proposed and analyzed an adaptive sampling scheme based on Weber's law of perception. In this paper, we explore other possible adaptive sampling candidates. We describe an experimental setup where users are subjected to piecewise-constant haptic stimuli to which they can respond with a click. We record the clicks and ask the question: can we identify signal features and classifiers to predict the clicks? The answer suggests adaptive sampling schemes that improve over Weber sampling.
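The Weber-sampling idea the abstract builds on can be sketched minimally as follows (an illustrative reconstruction, not the authors' code; the function name and the 10% threshold are assumptions): a sample is transmitted only when the signal has changed, relative to the last transmitted value, by more than a fixed Weber fraction.

```python
def weber_sample(signal, weber_fraction=0.1):
    """Return indices of samples that would be transmitted under a
    Weber (perceptual deadband) rule: send a sample only when its
    relative deviation from the last transmitted value exceeds
    weber_fraction, per Weber's law of perception."""
    if not signal:
        return []
    sent = [0]                      # always transmit the first sample
    last = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        # relative deviation from the last transmitted value
        if last == 0 or abs(x - last) / abs(last) > weber_fraction:
            sent.append(i)
            last = x
    return sent

samples = [1.00, 1.02, 1.05, 1.20, 1.21, 1.50]
print(weber_sample(samples, weber_fraction=0.1))   # small changes are dropped
```

Here only the samples at indices 0, 3 and 5 cross the 10% deadband; the intermediate samples are perceptually insignificant and never leave the transmitter, which is how the packet rate is reduced.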
Citations
Journal ArticleDOI
TL;DR: This survey focuses on how the fifth generation of mobile networks will allow haptic applications to come to life, in combination with the haptic data communication protocols, bilateral teleoperation control schemes and haptic data processing needed.
Abstract: Touch is currently seen as the modality that will complement audition and vision as a third media stream over the Internet in a variety of future haptic applications which will allow full immersion and that will, in many ways, impact society. Nevertheless, the high requirements of these applications demand networks which allow ultra-reliable and low-latency communication for the challenging task of applying the required quality of service for maintaining the user's quality of experience at optimum levels. In this survey, we list, discuss, and evaluate methodologies and technologies of the necessary infrastructure for haptic communication. Furthermore, we focus on how the fifth generation of mobile networks will allow haptic applications to come to life, in combination with the haptic data communication protocols, bilateral teleoperation control schemes and haptic data processing needed. Finally, we state the lessons learned throughout the surveyed research material along with the future challenges and draw our conclusions.

179 citations


Additional excerpts

  • ...In comparison to other sampling methods such as the level crossings method (that incorporates absolute differences instead of percentages between samples), the perceptual-based kinesthetic data reduction schemes are proven to have good but similar accuracy [90]....

    [...]

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a transport layer congestion control protocol for telehaptic applications operating over shared networks, termed the Dynamic Packetization Module (DPM): a lossless, network-aware protocol that tunes the packetization rate based on the level of congestion in the network.
Abstract: Telehaptic applications involve delay-sensitive multimedia communication between remote locations with distinct Quality of Service (QoS) requirements for different media components. These QoS constraints pose a variety of challenges, especially when the communication occurs over a shared network, with unknown and time-varying cross-traffic. In this work, we propose a transport layer congestion control protocol for telehaptic applications operating over shared networks, termed the Dynamic Packetization Module (DPM). DPM is a lossless, network-aware protocol that tunes the telehaptic packetization rate based on the level of congestion in the network. To monitor the network congestion, we devise a novel network feedback module, which communicates the end-to-end delays encountered by the telehaptic packets to the respective transmitters with negligible overhead. Via extensive simulations, we show that DPM meets the QoS requirements of telehaptic applications over a wide range of network cross-traffic conditions. We also report qualitative results of a real-time telepottery experiment with several human subjects, which reveal that DPM preserves the quality of telehaptic activity even under heavily congested network scenarios. Finally, we compare the performance of DPM with several previously proposed telehaptic communication protocols and demonstrate that DPM outperforms these protocols.
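The delay-driven rate tuning the abstract describes can be illustrated with a toy controller (this is not the actual DPM algorithm; the deadline, step sizes and function name are assumptions): back off when measured end-to-end delay violates the QoS deadline, and probe upward when there is headroom.

```python
def tune_packet_rate(rate, delay_ms, deadline_ms=30.0,
                     min_rate=100, max_rate=1000,
                     backoff=0.5, step=50):
    """Illustrative delay-driven packet-rate controller (a simplification,
    not the DPM protocol itself): reduce the packetization rate
    multiplicatively when the measured end-to-end delay exceeds the QoS
    deadline, otherwise increase it additively to reclaim unused bandwidth."""
    if delay_ms > deadline_ms:
        rate = max(min_rate, int(rate * backoff))   # congestion: back off
    else:
        rate = min(max_rate, rate + step)           # headroom: probe upward
    return rate

rate = 1000
for delay in [10, 45, 45, 12]:   # sample end-to-end delay measurements (ms)
    rate = tune_packet_rate(rate, delay)
print(rate)
```

The additive-increase/multiplicative-decrease shape is a generic congestion-control pattern; the point is only that the feedback signal is the packets' own end-to-end delay, as in the paper's feedback module.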

16 citations

Proceedings ArticleDOI
08 Apr 2016
TL;DR: A network-based opportunistic improvisation to adaptive sampling for the forward channel telehaptic data stream on a time-varying network is proposed, which explores real-time tuning of the perceptual deadband parameter to minimize network underutilization, and consequently improves the quality oftelehaptic communication.
Abstract: We propose a network-based opportunistic improvisation to adaptive sampling for the forward channel telehaptic data stream on a time-varying network. The algorithm explores real-time tuning of the perceptual deadband parameter to minimize network underutilization, and consequently improves the quality of telehaptic communication. We describe in detail the rationale behind the design choices of the proposed sampling scheme. We perform both real-time telehaptic experiments and simulations to test the proof of concept. The reconstructed haptic signals reveal a substantial improvement in average SNR of 3.57 dB, suggesting that the proposed method outperforms the conventional adaptive sampling technique to a large extent. In addition to satisfying the telehaptic Quality of Service (QoS) requirements, we also demonstrate that our method does not overwhelm the network or penalize the concurrent traffic streams.

13 citations


Cites background from "Can we improve over weber sampling ..."

  • ...In order to overcome the risk of QoS violations, researchers have proposed a variety of haptic data compression techniques; see, for example, [7], [8], [9], [10], [11], [12]....

    [...]

Proceedings ArticleDOI
12 Dec 2013
TL;DR: Describes in detail the frame structure used in HoIP and the rationale behind the design choices, the implementation of HoIP at the transmitter and receiver, and processing delay measurements made using real haptic devices and HAPI (an open-source library used to interface with haptic devices).
Abstract: Applications such as telesurgery need low latency communication of haptic data between remote nodes. For the stability of the control loop, a typical round trip delay target is 5 ms or lower. In this paper, we specify an application layer protocol - Haptics over Internet Protocol (HoIP) - which is designed to allow such low latency haptic data transmission without sacrificing the quality of haptic perception. The key ingredients of HoIP include adaptive sampling strategies for the digitization of the haptic signal, use of a multithreaded architecture at the transmitter and receiver to minimize processing delay, and use of an existing UDP implementation. In this paper, we describe in detail the frame structure used in HoIP and the rationale behind our design choices, the implementation details of HoIP at the transmitter and receiver, and report processing delay measurements done using real haptic devices and HAPI (an open-source library used to interface with the haptic device). The HoIP software is entirely written in C++ and our results show that in the worst case the transmitter- and receiver-side processing delays are less than 0.6 ms.
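A hypothetical frame layout in the spirit of what the abstract describes (the actual HoIP frame structure is specified in the paper; the field names, widths and ordering below are purely illustrative assumptions):

```python
import struct

# Hypothetical HoIP-like frame header: media type (1 byte), sequence
# number (4 bytes), timestamp in microseconds (8 bytes), payload length
# (2 bytes), all in network byte order. These fields are assumptions,
# not the frame structure defined in the paper.
HEADER = struct.Struct("!BIQH")

def pack_frame(media_type, seq, timestamp_us, payload):
    """Serialize one frame for UDP transmission."""
    return HEADER.pack(media_type, seq, timestamp_us, len(payload)) + payload

def unpack_frame(frame):
    """Parse a received frame back into header fields and payload."""
    media_type, seq, ts, length = HEADER.unpack_from(frame)
    payload = frame[HEADER.size:HEADER.size + length]
    return media_type, seq, ts, payload

frame = pack_frame(1, 42, 1_000_000, b"\x00\x01\x02")
print(unpack_frame(frame))
```

A fixed, compact binary header like this keeps per-packet overhead small, which matters when haptic samples are transmitted at rates approaching 1 kHz over UDP.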

13 citations


Cites background from "Can we improve over weber sampling ..."

  • ...In order to reduce the packet rate, many researchers have proposed the use of adaptive sampling based on perceptually significant characteristics of the haptic signal such as Weber’s law and level crossings; see for example [10], [11], [12], [13]....

    [...]

  • ...We refer to [13], [19] for a study of adaptive sampling of haptic signals....

    [...]

  • ...The thresholds are initially learned through a controlled experiment as discussed in [13]....

    [...]

  • ...22, as discussed in [13], [19], and sample-hold extrapolator at both ends....

    [...]

  • ...12, as discussed in [13], [19], and linear extrapolator at both ends....

    [...]

Proceedings ArticleDOI
16 Apr 2015
TL;DR: Presents HoIP, a low latency application layer protocol that enables haptic, audio and video data transmission over a network between two remotely connected nodes; its transmission scheduling keeps latencies well under the QoS thresholds.
Abstract: Telehaptics applications are usually characterized by a strict imposition of a round trip haptic data latency of less than 30 ms. In this paper, we present Haptics over Internet Protocol (HoIP) - a low latency application layer protocol that enables haptic, audio and video data transmission over a network between two remotely connected nodes. The evaluation of the protocol is carried out through a set of three experiments, each with distinct objectives. First, a haptic-audio-visual (HAV) interactive application, in which two remotely located human operators communicate via haptic, auditory and visual media, is used to evaluate the QoS violations due to the protocol. Second, a haptic sawing experiment assesses the impact of HoIP and network delays in telehaptics applications, taking a typical telesurgical activity as the example. Third, a telepottery system determines the protocol's ability to reproduce a real-time interactive user experience with a remote virtual object, in the presence of perceptual data compression and reconstruction techniques. Our experiments reveal that the transmission scheduling of multimedia packets performs well in terms of maintaining the latencies well under the QoS thresholds.

12 citations

References
Proceedings Article
Ron Kohavi
20 Aug 1995
TL;DR: The results indicate that for real-world datasets similar to the authors', the best method to use for model selection is ten-fold stratified cross-validation, even if computation power allows using more folds.
Abstract: We review accuracy estimation methods and compare the two most common methods: cross-validation and bootstrap. Recent experimental results on artificial data and theoretical results in restricted settings have shown that for selecting a good classifier from a set of classifiers (model selection), ten-fold cross-validation may be better than the more expensive leave-one-out cross-validation. We report on a large-scale experiment -- over half a million runs of C4.5 and a Naive-Bayes algorithm -- to estimate the effects of different parameters of these algorithms on real-world datasets. For cross-validation we vary the number of folds and whether the folds are stratified or not; for bootstrap, we vary the number of bootstrap samples. Our results indicate that for real-world datasets similar to ours, the best method to use for model selection is ten-fold stratified cross-validation, even if computation power allows using more folds.
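The stratification idea can be illustrated minimally (this is not Kohavi's experimental code; the function name and the round-robin assignment are my own simplification): split the data so every fold preserves the overall class proportions.

```python
from collections import defaultdict

def stratified_kfold(labels, k=10):
    """Assign each example index to one of k folds so that every fold
    has roughly the same class proportions (stratification), as in the
    ten-fold stratified cross-validation the paper recommends."""
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    folds = [[] for _ in range(k)]
    for indices in by_class.values():
        # deal each class's indices round-robin across the folds
        for j, idx in enumerate(indices):
            folds[j % k].append(idx)
    return folds

labels = [0] * 20 + [1] * 10      # a 2:1 class imbalance
for fold in stratified_kfold(labels, k=5):
    print(sorted(fold))           # each fold keeps the 2:1 ratio
```

Each fold then serves once as the held-out test set while the remaining folds train the classifier; stratification keeps the per-fold accuracy estimates from being skewed by class imbalance.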

11,185 citations


"Can we improve over weber sampling ..." refers methods in this paper

  • ...For training of the classifier and evaluating its performance, we use holdout cross-validation [22]....

    [...]

Journal ArticleDOI
TL;DR: In this paper, a control law for teleoperators is presented which overcomes the instability caused by time delay by using passivity and scattering theory, a criterion is developed which shows why existing bilateral control systems are unstable for certain environments, and why the proposed bilateral control law is stable for any environment and any time delay.
Abstract: A control law for teleoperators is presented which overcomes the instability caused by time delay. By using passivity and scattering theory, a criterion is developed which shows why existing bilateral control systems are unstable for certain environments, and why the proposed bilateral control law is stable for any environment and any time delay. The control law has been implemented on a single-axis force-reflecting hand controller, and preliminary results are shown. To keep the presentation clear, a single-degree-of-freedom (DOF) linear time-invariant (LTI) teleoperator system is discussed. Nevertheless, results can be extended, without loss of generality, to an n-DOF nonlinear teleoperation system.
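One common way to make the passivity argument concrete is the wave-variable transformation that grew out of this scattering approach; as an illustrative sketch (notation assumed: force $F$, velocity $\dot{x}$, wave impedance $b > 0$):

```latex
u = \frac{F + b\,\dot{x}}{\sqrt{2b}}, \qquad
v = \frac{F - b\,\dot{x}}{\sqrt{2b}}, \qquad
F\,\dot{x} = \tfrac{1}{2}\left(u^{2} - v^{2}\right)
```

Because the power flow $F\dot{x}$ splits into an incoming and an outgoing wave term, transmitting $u$ and $v$ instead of force and velocity keeps the communication channel passive for an arbitrary constant delay, which is the essence of the stability criterion.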

2,131 citations


"Can we improve over weber sampling ..." refers background in this paper

  • ...In order to maintain stability and good quality of perception, it is common in closed loop systems - such as the teleoperation system ([4], [6], [19], [23]) - to sample haptic signals in excess of 1 KHz....

    [...]

Book
26 Mar 1999
TL;DR: A book on recovering depth from defocused images, covering block shift-variant blur models, space-variant filtering, ML estimation of depth with optimal camera settings, MRF model-based identification of shift-variant PSF, and simultaneous depth recovery and image restoration.
Abstract (table of contents):
1. Passive Methods for Depth Recovery: Introduction; Different Methods of Depth Recovery (Depth from Stereo; Structure from Motion; Shape from Shading; Range from Focus; Depth from Defocus); Difficulties in Passive Ranging; Organization of the Book.
2. Depth Recovery from Defocused Images: Introduction; Theory of Depth from Defocus (Real Aperture Imaging; Modeling the Camera Defocus; Depth Recovery; Sources of Errors); Related Work; Summary of the Book.
3. Mathematical Background: Introduction; Time-Frequency Representation (The Complex Spectrogram; The Wigner Distribution); Calculus of Variations; Markov Random Fields and Gibbs Distributions (Theory of MRF; Gibbs Distribution; Incorporating Discontinuities).
4. Depth Recovery with a Block Shift-Variant Blur Model: Introduction; The Block Shift-Variant Blur Model (Estimation of Blur; Special Cases); Experimental Results; Discussion.
5. Space-Variant Filtering Models for Recovering Depth: Introduction; Space-Variant Filtering; Depth Recovery Using the Complex Spectrogram; The Pseudo-Wigner Distribution for Recovery of Depth; Imposing Smoothness Constraint (Regularized Solution Using the Complex Spectrogram; The Pseudo-Wigner Distribution and Regularized Solution); Experimental Results; Discussion.
6. ML Estimation of Depth and Optimal Camera Settings: Introduction; Image and Observation Models; ML-Based Recovery of Depth; Computation of the Likelihood Function; Optimality of Camera Settings (The Cramer-Rao Bound; Optimality Criterion); Experimental Results; Discussion.
7. Recursive Computation of Depth from Multiple Images: Introduction; Blur Identification from Multiple Images; Minimization by Steepest Descent; Recursive Algorithm for Computing the Likelihood Function (Single Observation; Two Observations; General Case of M Observations); Experimental Results; Discussion.
8. MRF Model-Based Identification of Shift-Variant PSF: Introduction; A MAP-MRF Approach; The Posterior Distribution and Its Neighborhood; MAP Estimation by Simulated Annealing; Experimental Results; Discussion.
9. Simultaneous Depth Recovery and Image Restoration: Introduction; Depth Recovery and Restoration using MRF Models; Locality of the Posterior Distribution; Parameter Estimation; Experimental Results; Discussion.
10. Conclusions.
Appendix A: Partial Derivatives of Various Quantities in CRB. References.

253 citations


"Can we improve over weber sampling ..." refers methods in this paper

  • ...To ensure that we do not get trapped in a local minima, we use the simulated annealing algorithm (see for example [9])....

    [...]

Journal ArticleDOI
TL;DR: The experimental results show that the presented approach is able to reduce the packet rate between an operator and a teleoperator by up to 90% of the original rate without affecting the performance of the system.
Abstract: We present a novel approach for the transmission of haptic data in telepresence and teleaction systems. The goal of this work is to reduce the packet rate between an operator and a teleoperator without impairing the immersiveness of the system. Our approach exploits the properties of human haptic perception and is, more specifically, based on the concept of just noticeable differences. In our scheme, updates of the haptic amplitude values are signaled across the network only if the change of a haptic stimulus is detectable by the human operator. We investigate haptic data communication for a 1 degree-of-freedom (DoF) and a 3 DoF teleaction system. Our experimental results show that the presented approach is able to reduce the packet rate between the operator and teleoperator by up to 90% of the original rate without affecting the performance of the system.

205 citations


"Can we improve over weber sampling ..." refers background in this paper

  • ...This main idea is exploited for adaptive sampling of haptic signals in [15], [18]....

    [...]

  • ...For real time applications, several authors have attempted to exploit Weber’s law of perception to sample the haptic signal - see for example [10], [11], [15], [16], [17], [18], [20], [26], [29], [32], [33]....

    [...]

Journal ArticleDOI
TL;DR: Investigates the relative roles of force/displacement and surface deformation cues in hardness perception, measuring discrimination thresholds with silicone rubber stimuli of differing thickness and compliance; differences in object thickness are found to be correctly taken into account.
Abstract: For the perception of the hardness of compliant materials, several cues are available. In this paper, the relative roles of force/displacement and surface deformation cues are investigated. We have measured discrimination thresholds with silicone rubber stimuli of differing thickness and compliance. Also, the influence of the finger span is assessed. When compliance is expressed as the Young's modulus, the thresholds in the different conditions follow Weber's law with a Weber fraction of 15 percent. When the surface deformation cue was removed, thresholds more than trebled. Under the assumption of optimal cue combination, this suggests that a large fraction of the information comes from the surface deformation cue. Using a matching experiment, we found that differences in object thickness are correctly taken into account. When cues appear to contradict each other, the conflict is resolved by means of a compromise.

195 citations


"Can we improve over weber sampling ..." refers result in this paper

  • ...6%, which is in the same range as studies of the Weber constant in prior literature (see for example [3], [7])....

    [...]