scispace - formally typeset
Topic

Haptic technology

About: Haptic technology is a research topic. Over its lifetime, 18,818 publications have been published within this topic, receiving 306,713 citations. The topic is also known as: haptics & haptic media.


Papers
Journal ArticleDOI
TL;DR: SmartTouch uses optical sensors to gather surface information and electrical stimulation to render it on a tactile display, so the wearer not only makes physical contact with an object but also touches surface information of any modality, even modalities that are typically untouchable.
Abstract: Augmented haptics lets users touch surface information of any modality. SmartTouch uses optical sensors to gather information and electrical stimulation to translate it into tactile display. Augmented reality is an engineer's approach to this dream. In AR, sensors capture artificial information from the world, and existing sensing channels display it. Hence, we virtually acquire the sensor's physical ability as our own. Augmented haptics, the result of applying AR to haptics, would allow a person to touch the untouchable. Our system, SmartTouch, uses a tactile display and a sensor. When the sensor contacts an object, an electrical stimulation translates the acquired information into a tactile sensation, such as a vibration or pressure, through the tactile display. Thus, an individual not only makes physical contact with an object, but also touches the surface information of any modality, even those that are typically untouchable.
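The SmartTouch pipeline — optical sensing translated into electrical tactile stimulation — can be sketched as a simple transfer function. The linear mapping, the 0–255 intensity range, and the `max_current_ma` parameter are illustrative assumptions, not the system's actual calibration:

```python
import numpy as np

def optical_to_tactile(optical_frame, max_current_ma=3.0):
    """Map a grayscale optical-sensor frame (values 0-255) to per-electrode
    stimulation currents. The linear transfer and the current limit are
    illustrative assumptions, not SmartTouch's actual transfer function."""
    frame = np.asarray(optical_frame, dtype=float)
    normalized = frame / 255.0              # 0..1 surface intensity
    currents = normalized * max_current_ma  # brighter surface -> stronger pulse
    return currents

# Example: a 2x2 patch of sensed surface intensities
patch = [[0, 255], [128, 64]]
currents = optical_to_tactile(patch)
```

In practice such a mapping would be nonlinear and per-user calibrated, since perceived intensity of electrical stimulation does not scale linearly with current.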

152 citations

Journal ArticleDOI
TL;DR: This paper presents a bag-of-features framework that uses several tactile-image descriptors, some that are adapted from the vision domain and others that are novel, to estimate a probability distribution over object identity as an unknown object is explored.
Abstract: This paper explores the connection between sensor-based perception and exploration in the context of haptic object identification. The proposed approach combines 1) object recognition from tactile appearance with 2) purposeful haptic exploration of unknown objects to extract appearance information. The recognition component brings to bear computer-vision techniques by viewing tactile-sensor readings as images. We present a bag-of-features framework that uses several tactile-image descriptors, some that are adapted from the vision domain and others that are novel, to estimate a probability distribution over object identity as an unknown object is explored. Haptic exploration is treated as a search problem in a continuous space to take advantage of sampling-based motion planning to explore the unknown object and construct its tactile appearance. Simulation experiments of a robot arm equipped with a haptic sensor at the end-effector provide promising validation, thereby indicating high accuracy in identifying complex shapes from tactile information gathered during exploration. The proposed approach is also validated by using readings from actual tactile sensors to recognize real objects.
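The recognition component — quantizing tactile-image descriptors against a codebook and maintaining a probability distribution over object identity — can be sketched as follows. The nearest-centroid assignment and the Bayesian update are a minimal illustration; the paper's descriptors and vocabulary construction are considerably more elaborate:

```python
import numpy as np

def bof_histogram(descriptors, codebook):
    """Quantize tactile-image descriptors against a codebook of 'visual words'
    and return a normalized bag-of-features histogram."""
    descriptors = np.atleast_2d(descriptors)
    # Nearest-centroid assignment of each descriptor to a codebook word.
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = dists.argmin(axis=1)
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

def update_posterior(prior, likelihoods):
    """Bayes update of the distribution over object identities, given the
    likelihood of the observed histogram under each object model."""
    post = prior * likelihoods
    return post / post.sum()

# Toy example: two codebook words, two candidate objects.
codebook = np.array([[0.0, 0.0], [1.0, 1.0]])
hist = bof_histogram([[0.1, 0.0], [0.9, 1.0]], codebook)
posterior = update_posterior(np.array([0.5, 0.5]),
                             np.array([0.8, 0.2]))
```

Repeating the update as exploration gathers new tactile images concentrates the posterior on the most likely object identity.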

152 citations

Proceedings ArticleDOI
01 Oct 2008
TL;DR: This paper proposes haptic guidance based on the concept of shared control, where both the driver and the support system influence the steering wheel torque, to support drivers in actively producing (more) optimal steering actions during curve negotiation.
Abstract: Haptic feedback on the steering wheel is reported in literature as a promising way to support drivers during steering tasks. Haptic support allows drivers to remain in the direct manual control loop, avoiding known human factors issues with automation. This paper proposes haptic guidance based on the concept of shared control, where both the driver and the support system influence the steering wheel torque. The haptic guidance is developed to continuously generate relatively low forces on the steering wheel, requiring the driver's active steering input to safely negotiate curves. An experiment in a fixed-base driving simulator was conducted, in which 12 young, experienced drivers steered a vehicle - with and without haptic guidance - at a fixed speed along a road with varying curvature. The haptic guidance allowed drivers to slightly but significantly improve safety boundaries in their curve negotiation behavior. Their steering activity was reduced and smoother. The results indicated that continuous haptic guidance is a promising way to support drivers in actively producing (more) optimal steering actions during curve negotiation.
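The shared-control idea — driver torque and a deliberately low guidance torque summed on the same steering wheel — can be sketched as below. The gain, the saturation limit, and the use of heading error as the feedback signal are assumptions for illustration; the paper tunes its guidance empirically:

```python
def shared_control_torque(driver_torque, heading_error,
                          k_guidance=0.5, max_guidance=2.0):
    """Blend the driver's torque with a guidance torque proportional to the
    heading error toward the desired path. The guidance is saturated at a low
    level so the driver must stay actively in the control loop."""
    guidance = -k_guidance * heading_error
    guidance = max(-max_guidance, min(max_guidance, guidance))  # keep forces low
    return driver_torque + guidance

# With zero heading error the driver's input passes through unchanged.
t_neutral = shared_control_torque(1.0, 0.0)
# A large heading error still yields only a bounded corrective torque.
t_bounded = shared_control_torque(0.0, 10.0)
```

The saturation is the key design choice: unlike full automation, the guidance alone cannot steer the car, which is what keeps the driver engaged.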

152 citations

17 Oct 2001
TL;DR: In this paper, the authors investigated the benefits of force feedback for virtual reality training of a real task, the construction of a LEGO biplane model, and found a significant change in performance due to training level.
Abstract: This paper describes an experiment conducted to investigate the benefits of force feedback in virtual reality training for a real task. Three groups of subjects received different levels of training before completing a manual task: the construction of a LEGO biplane model. One group trained on a Virtual Building Block (VBB) simulation, which emulated the real task in a virtual environment, including haptic feedback. A second group also trained on the VBB system, but without force feedback. The last group received no virtual reality training. Completion times for building the actual biplane model in the real world were compared across the groups. An ANOVA showed a significant change in performance due to training level.
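The analysis is a one-way ANOVA on completion times across the three training groups. A minimal sketch with hypothetical data (the paper's actual measurements are not reproduced here):

```python
from scipy import stats

# Hypothetical completion times in seconds for the three training conditions;
# these numbers are illustrative, not the paper's data.
haptic_vbb    = [210, 195, 205, 220]   # VBB training with force feedback
visual_vbb    = [230, 240, 225, 235]   # VBB training without force feedback
no_vr_control = [260, 255, 270, 265]   # no virtual reality training

# One-way ANOVA: does mean completion time differ across training levels?
f_stat, p_value = stats.f_oneway(haptic_vbb, visual_vbb, no_vr_control)
```

A significant F statistic only says the group means differ; a post-hoc test (e.g. Tukey's HSD) would be needed to say which pairs of training levels differ.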

152 citations

Patent
03 Sep 2014
TL;DR: In this article, the authors describe an electronic device, coupleable to a display screen, that includes a camera system acquiring optical data of a user comfortably gesturing in a user-customizable interaction zone having a z0 plane.
Abstract: An electronic device coupleable to a display screen includes a camera system that acquires optical data of a user comfortably gesturing in a user-customizable interaction zone having a z0 plane, while controlling operation of the device. Subtle gestures include hand movements commenced in a dynamically resizable and relocatable interaction zone. Preferably, (x,y,z) locations in the interaction zone are mapped to two-dimensional display-screen locations. Detected user hand movements can signal the device that an interaction is occurring in gesture mode. Device responses include presenting a GUI on the display screen and creating user feedback, including haptic feedback. User three-dimensional interaction can manipulate displayed virtual objects, including releasing such objects. User hand-gesture trajectory clues enable the device to anticipate probable user intent and to appropriately update display-screen renderings.
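The mapping of (x, y) positions inside a resizable interaction zone to display-screen coordinates can be sketched as a normalize-and-scale step. The `zone` dictionary layout and the clamping behavior are illustrative assumptions, not the patent's claimed method:

```python
def zone_to_screen(x, y, zone, screen_w, screen_h):
    """Map an (x, y) hand position inside a resizable, relocatable interaction
    zone to 2-D display coordinates. `zone` gives the zone bounds in the
    camera's coordinate frame (an assumed layout for illustration)."""
    # Normalize position to 0..1 within the zone.
    u = (x - zone["x0"]) / (zone["x1"] - zone["x0"])
    v = (y - zone["y0"]) / (zone["y1"] - zone["y0"])
    # Clamp so hands straying outside the zone pin to the screen edge.
    u = min(1.0, max(0.0, u))
    v = min(1.0, max(0.0, v))
    return int(u * (screen_w - 1)), int(v * (screen_h - 1))

# A hand at the zone's center maps to (roughly) the screen's center.
zone = {"x0": -0.2, "x1": 0.2, "y0": -0.1, "y1": 0.1}
center = zone_to_screen(0.0, 0.0, zone, 1920, 1080)
```

Because the zone bounds are parameters, resizing or relocating the zone only changes the dictionary, not the mapping code — which matches the patent's emphasis on a dynamically resizable, relocatable zone.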

152 citations


Network Information
Related Topics (5)
Robot: 103.8K papers, 1.3M citations (89% related)
Mobile robot: 66.7K papers, 1.1M citations (86% related)
User interface: 85.4K papers, 1.7M citations (82% related)
Mobile device: 58.6K papers, 942.8K citations (78% related)
Control theory: 299.6K papers, 3.1M citations (78% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    647
2022    1,508
2021    745
2020    1,056
2019    1,180
2018    1,034