Topic

Haptic technology

About: Haptic technology is a research topic. Over its lifetime, 18,818 publications have been published within this topic, receiving 306,713 citations. The topic is also known as: haptics & haptic media.


Papers
Proceedings ArticleDOI
22 Apr 2006
TL;DR: The paper presents the first stages of a systematic design effort to break a vicious cycle in which inadequate haptic technology obstructs the inception of vitalizing applications, beginning with specific usage scenarios and a new handheld display platform based on lateral skin stretch.
Abstract: Mobile interaction can potentially be enhanced with well-designed haptic control and display. However, advances have been limited by a vicious cycle whereby inadequate haptic technology obstructs inception of vitalizing applications. We present the first stages of a systematic design effort to break that cycle, beginning with specific usage scenarios and a new handheld display platform based on lateral skin stretch. Results of a perceptual device characterization inform mappings between device capabilities and specific roles in mobile interaction, and the next step of hardware re-engineering.

230 citations

Journal ArticleDOI
TL;DR: A vibration feedback model, a decaying sinusoidal waveform, was created by measuring the acceleration of the stylus of a three-degree-of-freedom haptic display as a human user tapped it on several real materials; subsequent perceptual tuning provided different parameters than those derived strictly from the acceleration data.
Abstract: Reality-based modeling of vibrations has been used to enhance the haptic display of virtual environments for impact events such as tapping, although the bandwidths of many haptic displays make it difficult to accurately replicate the measured vibrations. We propose modifying reality-based vibration parameters through a series of perceptual experiments with a haptic display. We created a vibration feedback model, a decaying sinusoidal waveform, by measuring the acceleration of the stylus of a three-degree-of-freedom haptic display as a human user tapped it on several real materials. A series of perceptual experiments, in which human users rated the realism of various parameter combinations, was performed to further enhance the realism of the vibration display for impact events. The results provided different parameters than those derived strictly from acceleration data. Additional experiments verified the effectiveness of these modified model parameters by showing that users could differentiate between materials in a virtual environment.
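The vibration model described above is a decaying sinusoid whose amplitude, decay rate, and frequency were first derived from measured stylus accelerations and then re-tuned perceptually. The sketch below shows, in broad strokes, how such a transient could be synthesized for playback; the parameter values, velocity scaling, and function name are illustrative assumptions, not the paper's measured data.

```python
import numpy as np

def impact_vibration(amplitude, decay, frequency, duration=0.05, sample_rate=10_000):
    """Synthesize a decaying-sinusoid transient for a tap/impact event.

    a(t) = amplitude * exp(-decay * t) * sin(2*pi*frequency*t)

    All parameter values here are illustrative; in the paper they were first
    estimated from measured accelerations and then adjusted perceptually.
    """
    t = np.arange(0.0, duration, 1.0 / sample_rate)
    return t, amplitude * np.exp(-decay * t) * np.sin(2.0 * np.pi * frequency * t)

# Example: a hypothetical "wood-like" transient scaled by the incoming tap velocity.
tap_velocity = 0.3  # m/s (assumed)
t, accel = impact_vibration(amplitude=1.0 * tap_velocity, decay=90.0, frequency=250.0)
```

In practice the resulting waveform would be sent to the display's actuators as an open-loop transient superimposed on the regular force output at the moment of contact.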

229 citations

Journal ArticleDOI
TL;DR: The authors conclude that presence may derive from the process of multi-modal integration and, therefore, may be associated with other illusions, such as cross-modal transfers, that result from the process of creating a coherent mental model of the space.
Abstract: How do users generate an illusion of presence in a rich and consistent virtual environment from an impoverished, incomplete, and often inconsistent set of sensory cues? We conducted an experiment to explore how multimodal perceptual cues are integrated into a coherent experience of virtual objects and spaces. Specifically, we explored whether inter-modal integration contributes to generating the illusion of presence in virtual environments. To discover whether intermodal integration might play a role in presence, we looked for evidence of intermodal integration in the form of cross-modal interactions---perceptual illusions in which users use sensory cues in one modality to “fill in” the “missing” components of perceptual experience. One form of cross-modal interaction, a cross-modal transfer, is defined as a form of synesthesia, that is, a perceptual illusion in which stimulation to a sensory modality connected to the interface (such as the visual modality) is accompanied by perceived stimulation to an unconnected sensory modality that receives no apparent stimulation from the virtual environment (such as the haptic modality). Users of our experimental virtual environment who manipulated the visual analog of a physical force, a virtual spring, reported haptic sensations of “physical resistance”, even though the interface included no haptic displays. A path model of the data suggested that this cross-modal illusion was correlated with and dependent upon the sensation of spatial and sensory presence. We conclude that this is evidence that presence may derive from the process of multi-modal integration and, therefore, may be associated with other illusions, such as cross-modal transfers, that result from the process of creating a coherent mental model of the space. Finally, we suggest that this perceptual phenomenon might be used to improve user experiences with multimodal interfaces, specifically by supporting limited sensory displays (such as haptic displays) with appropriate synesthetic stimulation to other sensory modalities (such as visual and auditory analogs of haptic forces).

229 citations

Proceedings ArticleDOI
16 Oct 2016
TL;DR: It is found that haptic feedback significantly increases the accuracy of VR interaction, most effectively by rendering high-fidelity shape output as in the case of mechanically-actuated hand-held controllers.
Abstract: We present an investigation of mechanically-actuated hand-held controllers that render the shape of virtual objects through physical shape displacement, enabling users to feel 3D surfaces, textures, and forces that match the visual rendering. We demonstrate two such controllers, NormalTouch and TextureTouch, which are tracked in 3D and produce spatially-registered haptic feedback to a user's finger. NormalTouch haptically renders object surfaces and provides force feedback using a tiltable and extrudable platform. TextureTouch renders the shape of virtual objects, including detailed surface structure, through a 4×4 matrix of actuated pins. By moving the controllers around while keeping a finger on the actuated platform, users obtain the impression of a much larger 3D shape by cognitively integrating output sensations over time. Our evaluation compares the effectiveness of our controllers with the two de-facto standards in Virtual Reality controllers: device vibration and visual feedback only. We find that haptic feedback significantly increases the accuracy of VR interaction, most effectively by rendering high-fidelity shape output as in the case of our controllers.
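As a rough illustration of how a pin-matrix controller such as TextureTouch can render local shape, the sketch below samples a virtual heightfield on a 4×4 grid under the tracked fingertip and maps the samples to pin extensions. The grid pitch, travel range, and the heightfield itself are assumptions made for this example, not specifications taken from the paper.

```python
import numpy as np

PIN_GRID = 4          # 4x4 matrix of actuated pins, as in TextureTouch
PIN_PITCH = 0.003     # assumed pin spacing in metres
PIN_TRAVEL = 0.002    # assumed maximum pin extension in metres

def virtual_height(x, y):
    """Hypothetical virtual surface: a shallow sinusoidal texture (metres)."""
    return 0.001 * np.sin(800.0 * x) * np.cos(800.0 * y)

def pin_extensions(finger_x, finger_y):
    """Sample the surface under the fingertip and map heights to pin commands."""
    offsets = (np.arange(PIN_GRID) - (PIN_GRID - 1) / 2.0) * PIN_PITCH
    xs = finger_x + offsets[:, None]   # sample positions across the 4x4 grid
    ys = finger_y + offsets[None, :]
    heights = virtual_height(xs, ys)   # broadcasts to a 4x4 array of heights
    # Shift so the lowest sampled point rests at zero extension, then clip
    # to the actuator's physical travel.
    return np.clip(heights - heights.min(), 0.0, PIN_TRAVEL)

# Example: pin extensions for a fingertip tracked at (2 cm, 1 cm).
print(pin_extensions(0.02, 0.01))
```

Re-evaluating this mapping as the tracked controller moves is what lets the user integrate the small 4×4 patch into the impression of a larger 3D shape over time.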

228 citations

Proceedings ArticleDOI
18 Mar 2005
TL;DR: The paper presents the mechanical design of the L-EXOS, a new exoskeleton for the human arm that has been optimized to obtain a solution with reduced mass and high stiffness, by employing special mechanical components and carbon fiber structural parts.
Abstract: The paper presents the mechanical design of the L-EXOS, a new exoskeleton for the human arm. The exoskeleton is a tendon-driven wearable haptic interface with 5 DoF (4 of them actuated) and is characterized by a workspace very close to that of the human arm. The design has been optimized to obtain a solution with reduced mass and high stiffness by employing special mechanical components and carbon fiber structural parts. The devised exoskeleton is very effective for simulating the touch of large objects by hand or manipulation within the whole workspace of the arm. The main features of the first prototype that has been developed at PERCRO are presented, together with an indication of the achieved and tested performance.

228 citations


Network Information
Related Topics (5)

Topic             Papers    Citations   Related
Robot             103.8K    1.3M        89%
Mobile robot      66.7K     1.1M        86%
User interface    85.4K     1.7M        82%
Mobile device     58.6K     942.8K      78%
Control theory    299.6K    3.1M        78%
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    647
2022    1,508
2021    745
2020    1,056
2019    1,180
2018    1,034