Topic

Haptic technology

About: Haptic technology is a research topic. Over its lifetime, 18,818 publications have been published on this topic, receiving 306,713 citations. The topic is also known as haptics and haptic media.


Papers
Journal ArticleDOI
TL;DR: The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness).
Abstract: Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.

90 citations
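The three-component surface model described in the abstract above combines friction, tapping transients, and texture vibrations at render time. The following sketch only illustrates how such components could be summed each servo tick; the spring, friction, transient, and texture terms and all gains are hypothetical placeholders, not the authors' recorded-data models.

"""Illustrative sketch (not the paper's code) of combining the three
haptic surface-model components described above: friction, tapping
transients, and texture vibrations. All gains and models are
hypothetical placeholders."""

import math
import random


def render_surface(depth, normal_vel, tangential_vel, time_since_impact,
                   k=800.0, mu=0.4, tap_gain=2.0, tex_gain=0.05):
    """Return (normal_force, friction_force, vibration) for one servo tick.

    depth             -- penetration depth into the virtual surface (m)
    normal_vel        -- approach velocity at impact (m/s)
    tangential_vel    -- sliding speed along the surface (m/s)
    time_since_impact -- seconds since the most recent contact onset
    """
    if depth <= 0.0:
        return 0.0, 0.0, 0.0          # no contact, no forces

    # 1) Penalty-based normal force (simple spring model).
    f_n = k * depth

    # 2) Coulomb-style friction opposing tangential motion.
    f_t = -math.copysign(mu * f_n, tangential_vel) if tangential_vel else 0.0

    # 3) Tapping transient: decaying sinusoid scaled by impact velocity.
    tap = tap_gain * abs(normal_vel) * math.exp(-80.0 * time_since_impact) \
        * math.sin(2 * math.pi * 300.0 * time_since_impact)

    # 4) Texture vibration: in the paper this is a data-driven model of
    #    speed and force; here it is stubbed with scaled random noise.
    texture = tex_gain * f_n * abs(tangential_vel) * random.uniform(-1.0, 1.0)

    return f_n, f_t, tap + texture


# Example tick: 1 mm penetration, sliding at 5 cm/s, 10 ms after impact.
print(render_surface(0.001, 0.1, 0.05, 0.01))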

Proceedings ArticleDOI
12 Mar 2005
TL;DR: A user study shows that PRISM significantly outperforms the more traditional direct manipulation approach; in contrast to techniques like Go-Go, which scale up hand movement to allow "long distance" manipulation, PRISM scales hand movement down to increase precision.
Abstract: A significant benefit of an immersive virtual environment is that it provides users the ability to interact with objects in a very natural, direct way, often realized by using a tracked, hand-held wand or stylus to "grab" and position objects. In the absence of force feedback or props, it is difficult and frustrating for users to move their arms, hands, or fingers to precise positions in 3D space, and more difficult still to hold them at a constant position or to move them in a uniform direction over time. The imprecision of user interaction in virtual environments is a fundamental problem that limits the complexity of the environment the user can interact with directly. We present PRISM (precise and rapid interaction through scaled manipulation), a novel interaction technique which acts on the user's behavior in the environment to determine whether they have precise or imprecise goals in mind. When precision is desired, PRISM dynamically adjusts the control/display ratio, which determines the relationship between physical hand movements and the motion of the controlled virtual object, making the object less sensitive to the user's hand movement. In contrast to techniques like Go-Go, which scale up hand movement to allow "long distance" manipulation, PRISM scales the hand movement down to increase precision. We present the results of a user study which shows that PRISM significantly outperforms the more traditional direct manipulation approach.

90 citations
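PRISM's central idea is a velocity-dependent control/display ratio: slow hand motion signals a precise goal and is scaled down, while fast motion passes through unchanged. The sketch below illustrates that idea under assumed thresholds; the constants and the omitted offset-recovery behavior are simplifications, not the published algorithm.

"""Hypothetical sketch of velocity-dependent control/display scaling in
the spirit of PRISM: slow hand motion is interpreted as a precise goal
and scaled down, fast motion passes through 1:1. Thresholds are
illustrative assumptions, not the paper's constants."""


def prism_step(hand_delta, hand_speed, min_speed=0.002, scaling_speed=0.2):
    """Map one frame of hand displacement to object displacement.

    hand_delta    -- hand displacement this frame (metres, one axis)
    hand_speed    -- current hand speed (m/s)
    min_speed     -- below this, motion is treated as jitter and ignored
    scaling_speed -- at or above this, motion is passed through 1:1
    """
    if hand_speed < min_speed:
        cd_scale = 0.0                         # suppress tremor and noise
    elif hand_speed < scaling_speed:
        cd_scale = hand_speed / scaling_speed  # scale motion down for precision
    else:
        cd_scale = 1.0                         # ordinary direct manipulation
    return cd_scale * hand_delta


# A careful 1 mm hand move at 5 cm/s is scaled down to 0.25 mm of object motion.
print(prism_step(0.001, 0.05))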

Proceedings ArticleDOI
01 Dec 2011
TL;DR: Algorithms and machine learning techniques were developed and validated for extracting the radius of curvature (ROC), point of application of force (PAF), and force vector (FV); the magnitude of the FV is useful in evaluating compliance from reaction forces when a finger is pushed into an object at a given velocity.
Abstract: The BioTac® is a biomimetic tactile sensor for grip control and object characterization. It has three sensing modalities: thermal flux, microvibration, and force. In this paper, we discuss feature extraction and interpretation of the force modality data. The data produced by this force sensing modality during sensor-object interaction are monotonic but non-linear. Algorithms and machine learning techniques were developed and validated for extracting the radius of curvature (ROC), point of application of force (PAF), and force vector (FV). These features have varying degrees of usefulness in extracting object properties using only cutaneous information; most robots can also provide the equivalent of proprioceptive sensing. For example, the PAF and ROC are useful for extracting contact points for grasping and object shape as the finger depresses and moves along an object; the magnitude of the FV is useful in evaluating compliance from the reaction forces when a finger is pushed into an object at a given velocity, while its direction is important for maintaining a stable grip.

90 citations
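The paper maps raw force-modality data to features such as the force vector using machine learning. As a loose illustration of that style of mapping, the sketch below fits an ordinary least-squares model from a hypothetical 19-channel tactile reading to a 3-D force vector using synthetic calibration data; the channel count, data, and linear model are assumptions, not the BioTac pipeline.

"""Illustrative sketch (not the authors' code) of one way to learn a
mapping from tactile-array readings to a 3-D force vector: ordinary
least squares on calibration data. The 19-channel layout and the
synthetic calibration set are assumptions for the example only."""

import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: 200 presses, 19 electrode readings each,
# paired with ground-truth force vectors from a reference sensor.
true_map = rng.normal(size=(19, 3))
electrodes = rng.normal(size=(200, 19))
forces = electrodes @ true_map + 0.01 * rng.normal(size=(200, 3))

# Fit a linear model electrode -> (Fx, Fy, Fz) by least squares.
weights, *_ = np.linalg.lstsq(electrodes, forces, rcond=None)

# Predict the force vector for a new contact and report its magnitude
# (useful for compliance estimation) and direction (useful for grip).
reading = rng.normal(size=19)
fv = reading @ weights
print("force vector:", fv, "magnitude:", np.linalg.norm(fv))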

Proceedings ArticleDOI
Majed Samad, Elia Gatti, Anne Hermes, Hrvoje Benko, Cesare Parise
02 May 2019
TL;DR: These findings provide the first quantification of the range of C/D ratios that can be used to simulate weight in virtual reality; the results are discussed in terms of the estimated physical work needed to lift an object.
Abstract: In virtual reality, the lack of kinesthetic feedback often prevents users from experiencing the weight of virtual objects. Control-to-display (C/D) ratio manipulation has been proposed as a method to induce weight perception without kinesthetic feedback. Based on the fact that lighter (heavier) objects are easier (harder) to move, this method induces an illusory perception of weight by manipulating the rendered position of users' hands, increasing or decreasing their displayed movements. In a series of experiments, we demonstrate that C/D-ratio manipulation induces a genuine perception of weight while preserving ownership over the virtual hand. This means that such a manipulation can easily be introduced into current VR experiences without disrupting the sense of presence. We discuss these findings in terms of the estimation of the physical work needed to lift an object. Our findings provide the first quantification of the range of C/D ratios that can be used to simulate weight in virtual reality.

90 citations
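The weight illusion rests on scaling the displayed hand displacement by the C/D ratio while an object is held. A minimal sketch of that mapping follows; the specific ratios are placeholders, since the usable range is exactly what the paper quantifies.

"""Minimal sketch of the C/D-ratio manipulation described above: while
an object is held, the rendered hand offset is the real hand offset
scaled by a ratio below 1 (object feels heavier) or above 1 (lighter).
The ratio values below are placeholders."""


def rendered_hand_pos(real_pos, grab_pos, cd_ratio):
    """Scale the hand displacement since grab onset by the C/D ratio."""
    return [g + cd_ratio * (r - g) for r, g in zip(real_pos, grab_pos)]


grab = [0.0, 1.0, 0.3]   # hand position (m) when the object was grabbed
now = [0.0, 1.2, 0.3]    # real hand position after lifting 20 cm

print(rendered_hand_pos(now, grab, 0.7))   # "heavy": displayed lift is only 14 cm
print(rendered_hand_pos(now, grab, 1.0))   # veridical rendering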

Journal ArticleDOI
TL;DR: It is concluded that although the haptic sense seems to be crucial for material perception, the information it can gather alone might not be quite fine-grained and rich enough for perfect material recognition.
Abstract: Research on material perception has received an increasing amount of attention recently. Clearly, both the visual and the haptic sense play important roles in the perception of materials, yet it is still unclear how the two senses compare in material perception tasks. Here, we set out to investigate the degree of correspondence between the visual and the haptic representations of different materials. We asked participants to both categorize and rate 84 different materials on several material properties. In the haptic condition, participants were blindfolded and asked to assess the materials based on haptic exploration alone. In the visual condition, participants assessed the stimuli based on their visual impressions only. While categorization performance was less consistent in the haptic condition than in the visual one, ratings correlated highly between the visual and the haptic modality. A principal component analysis (PCA) revealed that all material samples were similarly organized within the perceptual space in both modalities. Moreover, in both senses the first two principal components were dominated by hardness and roughness, two material features that are fundamental for the haptic sense. We conclude that although the haptic sense seems to be crucial for material perception, the information it can gather alone might not be fine-grained and rich enough for perfect material recognition.

90 citations
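The analysis compares visual and haptic rating matrices via cross-modal correlations and a per-modality PCA. The sketch below reproduces that style of analysis on random placeholder data standing in for the 84-material ratings; it is not the authors' dataset or code.

"""Sketch of the kind of analysis described above: correlate property
ratings across modalities and run a PCA per modality to compare the
perceptual spaces. The data are random placeholders."""

import numpy as np

rng = np.random.default_rng(1)

# Placeholder ratings: 84 materials x 6 properties per modality, with
# the haptic ratings made to resemble the visual ones plus noise.
visual = rng.normal(size=(84, 6))
haptic = visual + 0.3 * rng.normal(size=(84, 6))

# Cross-modal correlation per rated property.
for j in range(visual.shape[1]):
    r = np.corrcoef(visual[:, j], haptic[:, j])[0, 1]
    print(f"property {j}: visual-haptic correlation r = {r:.2f}")

# PCA per modality via SVD of the centred rating matrix; compare how
# much variance the first two components capture in each space.
for name, data in (("visual", visual), ("haptic", haptic)):
    centred = data - data.mean(axis=0)
    s = np.linalg.svd(centred, compute_uv=False)
    var = s**2 / np.sum(s**2)
    print(name, "PC1+PC2 variance explained:", round(var[:2].sum(), 2))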


Network Information
Related Topics (5)
Robot: 103.8K papers, 1.3M citations, 89% related
Mobile robot: 66.7K papers, 1.1M citations, 86% related
User interface: 85.4K papers, 1.7M citations, 82% related
Mobile device: 58.6K papers, 942.8K citations, 78% related
Control theory: 299.6K papers, 3.1M citations, 78% related
Performance Metrics
No. of papers in the topic in previous years:
2023: 647
2022: 1,508
2021: 745
2020: 1,056
2019: 1,180
2018: 1,034