Book Chapter DOI

Passive Probing Perception: Effect of Latency in Visual-Haptic Feedback

TL;DR: In this article, latency was found to have a significant effect on users' ability to perform the task beyond 185.5 ms, with the effect significantly higher on force perception than on displacement perception.
Abstract: Latency is detrimental to haptic systems, particularly in networked telepresence systems. Although the effect of latency on stiffness perception is well studied in the literature, it is not clear whether these effects arise from displacement perception or from force perception. In this study, we propose passive probing, which involves force perception alone, without any displacement of the finger, for studying latency effects. A psychophysical experiment is conducted with a set of artificially induced latencies, providing a quantitative measure of their effect on three parameters: the Just Noticeable Difference (JND), the time taken to reach the reference forces, and the maximum overshoot. The results showed that latency has a significant effect on users' ability to perform the task beyond 185.5 ms. The latency effect on the JND in passive probing is similar to that in the stiffness perception (active probing) task, which indicates that the effect is significantly higher on force perception than on displacement perception.
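The JND reported above can be illustrated with a minimal sketch of how a JND is read off a psychometric function. This is not the paper's analysis; the linear interpolation, force levels, and response proportions below are all assumptions for illustration:

```python
# Hedged sketch: estimating a force JND from "stronger" judgments.
# The interpolation method and all numbers are illustrative assumptions,
# not taken from the paper.

def interp_level(levels, p_stronger, target):
    """Linearly interpolate the force level where P("stronger") = target."""
    pts = list(zip(levels, p_stronger))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if y0 <= target <= y1:
            return x0 + (target - y0) * (x1 - x0) / (y1 - y0)
    raise ValueError("target not bracketed by the data")

def jnd(levels, p_stronger):
    """JND as half the distance between the 25% and 75% points."""
    return 0.5 * (interp_level(levels, p_stronger, 0.75)
                  - interp_level(levels, p_stronger, 0.25))

# Illustrative comparison forces (N) and proportion judged "stronger"
levels = [0.8, 0.9, 1.0, 1.1, 1.2]
p = [0.05, 0.20, 0.50, 0.80, 0.95]
print(round(jnd(levels, p), 4))  # ~0.0833 N for this made-up data
```

With added latency one would expect the psychometric curve to flatten, pushing the 25% and 75% points apart and inflating the JND.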
Citations
Journal ArticleDOI
TL;DR: A novel method of training fine-motor skills such as the Microscopic Selection Task (MST) for robot-assisted surgery using virtual reality (VR), with objective quantification of performance, is proposed.
Abstract: BACKGROUND Training surgeons to use surgical robots is becoming part of surgical training curricula. We propose a novel method of training fine-motor skills, such as the Microscopic Selection Task (MST), for robot-assisted surgery using virtual reality (VR) with objective quantification of performance. We also introduce vibrotactile feedback (VTFB) to study its impact on training performance. METHODS We use a VR-based environment to perform the MST with varying degrees of difficulty. Using a well-known human-computer interaction paradigm and incorporating VTFB, we quantify performance as speed, precision and accuracy. RESULTS MST with VTFB showed statistically significant improvements in performance metrics, leading to faster completion of the MST with higher precision and accuracy compared to MST without VTFB. DISCUSSION The addition of VTFB to VR-based training for robot-assisted surgeries may improve performance outcomes in real robotic surgery. VTFB, along with the proposed performance metrics, can be used in training curricula for robot-assisted surgeries.

10 citations

Journal ArticleDOI
TL;DR: Zhang et al. simulate positional vibrotactile feedback (PVF) with multiple vibration motors when colliding with virtual objects in AR, and show that this approach could significantly reduce the alignment offset between virtual and physical objects with a tolerable increase in task completion time.
Abstract: Consistent visual and haptic feedback is an important way to improve the user experience when interacting with virtual objects. However, the perception provided in Augmented Reality (AR) mainly comes from visual cues and amorphous tactile feedback. This work explores how to simulate positional vibrotactile feedback (PVF) with multiple vibration motors when colliding with virtual objects in AR. By attaching spatially distributed vibration motors on a physical haptic proxy, users can obtain an augmented collision experience with positional vibration sensations from the contact point with virtual objects. We first developed a prototype system and conducted a user study to optimize the design parameters. Then we investigated the effect of PVF on user performance and experience in a virtual and real object alignment task in the AR environment. We found that this approach could significantly reduce the alignment offset between virtual and physical objects with tolerable task completion time increments. With the PVF cue, participants obtained a more comprehensive perception of the offset direction, more useful information, and a more authentic AR experience.

2 citations

Journal ArticleDOI
TL;DR: To reduce VKC during scaled movements, tasks should be designed such that the visual awareness of the real hand is avoided.
Abstract: Considering 3D interactions in Virtual Reality (VR), it is critical to study how visual awareness of real hands influences users' scaled interaction performance in different VR environments. We used...
References
Journal ArticleDOI
TL;DR: In this article, the effect of delay of visual and force information with respect to proprioception was investigated to understand how visual-haptic perception of compliance is achieved, and it was shown that perceived compliance decreases with a delay in the visual information.

69 citations

Journal ArticleDOI
TL;DR: It is found that forces were scaled based on previous lifts (sensorimotor memory) and these effects increased depending on the length of recent lifting experience, and perceptual weight estimates were influenced by the preceding lift, resulting in lower estimations after a heavy lift compared to a light one.
Abstract: When lifting an object, the brain uses visual cues and an internal object representation to predict its weight and scale fingertip forces accordingly. Once available, tactile information is rapidly integrated to update the weight prediction and refine the internal object representation. If visual cues cannot be used to predict weight, force planning relies on implicit knowledge acquired from recent lifting experience, termed sensorimotor memory. Here, we investigated whether perception of weight is similarly biased by previous lifting experience and how this relates to force scaling. Participants grasped and lifted a series of light or heavy objects in a semi-randomized order and estimated their weights. As expected, we found that forces were scaled based on previous lifts (sensorimotor memory) and that these effects increased with the length of recent lifting experience. Importantly, perceptual weight estimates were also influenced by the preceding lift, resulting in lower estimations after a heavy lift compared to a light one. In addition, weight estimations were negatively correlated with the magnitude of planned force parameters. This perceptual bias was found only when the current lift was light, not heavy, since the magnitude of sensorimotor memory effects had, according to Weber's law, relatively less impact on heavy than on light objects. A control experiment tested the importance of active lifting in mediating these perceptual changes and showed that when weights are passively applied to the hand, no effect of previous sensory experience on perception is found. These results highlight how fast learning of novel object lifting dynamics can shape weight perception and demonstrate a tight link between action planning and perception control.
If predictive force scaling and actual object weight do not match, the online motor corrections, rapidly implemented to downscale forces, will also downscale weight estimation in a proportional manner.
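The Weber's-law argument above, that a fixed carry-over bias matters less for heavy objects, can be made concrete with a small arithmetic sketch. The Weber fraction and all magnitudes below are hypothetical, not the study's values:

```python
# A fixed bias from the previous lift is compared against the Weber JND,
# k * weight. The Weber fraction and all magnitudes are assumed values.

WEBER_FRACTION = 0.1  # hypothetical discriminable fraction

def bias_is_noticeable(weight, bias):
    """True if the carry-over bias exceeds the Weber JND for this weight."""
    return abs(bias) > WEBER_FRACTION * weight

carry_over = 0.5  # hypothetical sensorimotor-memory bias, in newtons
print(bias_is_noticeable(2.0, carry_over))   # light object: bias noticeable
print(bias_is_noticeable(10.0, carry_over))  # heavy object: bias swamped
```

The same absolute bias is a quarter of the light object's JND-scaled range but well under the heavy object's, matching the reported asymmetry.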

47 citations

Journal ArticleDOI
TL;DR: This article simulated random packet dropouts and communication latency in the visual modality and investigated the effects on the temporal discrimination of visual-haptic collisions, demonstrating that the synchronous perception of crossmodal events was very sensitive to the packet loss rate.
Abstract: Temporal discontinuities and delay caused by packet loss or communication latency often occur in multimodal telepresence systems. It is known that such artifacts can influence the feeling of presence [1]. However, it is largely unknown how the packet loss and communication latency affect the temporal perception of multisensory events. In this article, we simulated random packet dropouts and communication latency in the visual modality and investigated the effects on the temporal discrimination of visual-haptic collisions. Our results demonstrated that the synchronous perception of crossmodal events was very sensitive to the packet loss rate. The packet loss caused the impression of time delay and influenced the perception of the subsequent events. The perceived time of the visual event increased linearly, and the temporal discrimination deteriorated, with increasing packet loss rate. The perceived time was also influenced by the communication delay, which caused time to be slightly overestimated.
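A minimal sketch of the kind of degradation the article simulates: visual frames are dropped with a fixed probability and shifted by a constant latency, while the haptic stream is untouched. The freeze-frame dropout model and all parameters here are assumptions, not the article's implementation:

```python
import random

def degrade_visual_stream(frames, loss_rate, latency_frames, rng):
    """Delay every frame by a constant latency and, with probability
    loss_rate, replace it with the last successfully received frame
    (a simple freeze-frame model of packet dropout)."""
    out, last = [], None
    for t, frame in enumerate(frames):
        if rng.random() >= loss_rate or last is None:
            last = frame  # packet arrived (the first frame always does here)
        out.append((t + latency_frames, last))
    return out

rng = random.Random(0)  # fixed seed for a reproducible example
stream = degrade_visual_stream(list(range(10)), loss_rate=0.3,
                               latency_frames=2, rng=rng)
# Each entry is (display_time, frame_shown); repeated frames mark dropouts.
```

Repeated frames in the output are what would be perceived as an added, variable delay on top of the constant latency, consistent with the reported overestimation of the visual event's time.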

46 citations

Journal ArticleDOI
TL;DR: The results suggest that the detection of delay in force feedback depends on the movement frequency and amplitude, while variation of the absolute feedback force level does not influence the detection threshold.
Abstract: Time delay is recognized as an important issue in haptic telepresence systems as it is inherent to long-distance data transmission. What factors influence haptic delay perception in a time-delayed environment are, however, largely unknown. In this article, we examine the impact of manual movement frequency and amplitude in a sinusoidal exploratory movement as well as the stiffness of the haptic environment on the detection threshold for delay in haptic feedback. The results suggest that the detection of delay in force feedback depends on the movement frequency and amplitude, while variation of the absolute feedback force level does not influence the detection threshold. A model based on the exploration movement is proposed and guidelines for system design with respect to the time delay in haptic feedback are provided.
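The dependence on movement frequency and amplitude can be seen in a simple delayed-spring model: for a sinusoidal exploration x(t) = A·sin(2πft) against stiffness k, a delay τ produces the feedback force k·x(t − τ). This is a sketch consistent with the abstract, not the paper's exact formulation:

```python
import math

def force_error(A, f, k, tau, t):
    """Difference between delayed and undelayed spring force at time t
    for x(t) = A * sin(2*pi*f*t) and feedback force k * x(t - tau)."""
    x_now = A * math.sin(2 * math.pi * f * t)
    x_delayed = A * math.sin(2 * math.pi * f * (t - tau))
    return k * (x_delayed - x_now)

# The peak error over a cycle is 2*k*A*sin(pi*f*tau): it grows with both
# movement frequency and amplitude, matching the reported dependence,
# while depending on force magnitude only through the product k*A.
```

In this model the delay artifact scales with f and A but only rescales with the absolute force level, consistent with the finding that the absolute feedback force does not shift the detection threshold.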

37 citations

Proceedings ArticleDOI
21 Jun 2011
TL;DR: The results were unexpected, but consistent in all three experiments: latency makes the user experience worse, even though performance does not decrease significantly.
Abstract: Touchscreens are becoming more and more popular, especially in mobile devices. There is also clear evidence of the benefits of tactile feedback in touchscreen interaction. However, the effect of the latency inherent in this interaction has been neglected in earlier investigations. In this study we examined the effect of tactile feedback latencies on the usability of a touchscreen keypad. We used a realistic use case for number and character keypads: users entered three-number sequences and short sentences using virtual buttons on the touch display. The experiments differed from each other in the tactile feedback type (press only, or press and release) and the keypad layout (number or QWERTY). The results were unexpected, but consistent across all three experiments: performance did not drop significantly within the latency values used, yet users rated the keypad with the shortest feedback latency as more pleasant to use than the others. We conclude that latency worsens the user experience even though performance does not decrease significantly.

29 citations