scispace - formally typeset
Topic

Haptic technology

About: Haptic technology is a research topic. Over the lifetime, 18,818 publications have been published within this topic, receiving 306,713 citations. The topic is also known as: haptics & haptic media.


Papers
Journal ArticleDOI
TL;DR: An evaluation of feedback conditions for learning the discrete and continuous elements of a timing task showed that haptic guidance outperformed visual feedback, although additional studies are needed to further analyze the effect of other types of feedback visualization on motor learning of time-critical tasks.
Abstract: While haptic guidance can improve ongoing performance of a motor task, several studies have found that it ultimately impairs motor learning. However, some recent studies suggest that the haptic demonstration of optimal timing, rather than movement magnitude, enhances learning in subjects trained with haptic guidance. Timing of an action plays a crucial role in the proper accomplishment of many motor skills, such as hitting a moving object (discrete timing task) or learning a velocity profile (time-critical tracking task). The aim of the present study is to evaluate which feedback conditions—visual or haptic guidance—optimize learning of the discrete and continuous elements of a timing task. The experiment consisted of performing a fast tennis forehand stroke in a virtual environment. A tendon-based parallel robot connected to the end of a racket was used to apply haptic guidance during training. In two different experiments, we evaluated which feedback condition was more adequate for learning: (1) a time-dependent discrete task—learning to start a tennis stroke—and (2) a tracking task—learning to follow a velocity profile. The effect that task difficulty and the subjects' initial skill level have on the selection of the optimal training condition was further evaluated. Results showed that the training condition that maximizes learning of the discrete time-dependent motor task depends on the subjects' initial skill level. Haptic guidance was especially suitable for less-skilled subjects and for especially difficult discrete tasks, while visual feedback seemed to benefit more-skilled subjects. Additionally, haptic guidance seemed to promote learning in a time-critical tracking task, while visual feedback tended to deteriorate performance independently of the task difficulty and the subjects' initial skill level.

79 citations

Book ChapterDOI
01 Jan 2003

79 citations

Proceedings ArticleDOI
10 Nov 2003
TL;DR: An MR compatible master slave concept using a hydraulic transmission is introduced, and an operational robot able to work within an MRI/fMRI scanner and acquire images continuously during motion is presented.
Abstract: Magnetically compatible robots are required to develop haptic interfaces for neuroscience studies and MRI-guided robots for minimally invasive interventions. This paper introduces an MR-compatible master-slave concept using a hydraulic transmission, and presents an operational robot able to work within an MRI/fMRI scanner and acquire images continuously during motion. It describes a magnetically inert actuator using a direct drive to power the hydraulic circuitry, and a modular set of position and force/torque sensors that we have developed. These were integrated into a haptic interface prototype with one rotary degree of freedom, which can be used in conjunction with fMRI. The MR compatibility was confirmed experimentally, and the performance shows a manipulation accuracy of a few micrometers over a range of several centimeters, and forces of up to several thousand newtons.
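The master-slave coupling described above relies on standard hydraulics: in an ideal lossless transmission, Pascal's law makes the two cylinders share pressure, so force scales with piston area and displacement scales inversely. A minimal sketch of that relationship, with illustrative piston areas and forces that are assumptions rather than values from the paper:

```python
# Idealized lossless hydraulic transmission between a master and a slave
# cylinder: shared pressure (Pascal's law) means force scales with piston
# area while displacement scales inversely, preserving mechanical work.

def hydraulic_transmission(master_area_m2, slave_area_m2,
                           master_force_n, master_disp_m):
    pressure = master_force_n / master_area_m2                    # Pa
    slave_force = pressure * slave_area_m2                        # N
    # Incompressible fluid: swept volumes on both sides are equal.
    slave_disp = master_disp_m * master_area_m2 / slave_area_m2   # m
    return slave_force, slave_disp

# Doubling the piston area doubles the force and halves the stroke,
# so work in (10 N * 0.01 m) equals work out (20 N * 0.005 m).
force, disp = hydraulic_transmission(1e-4, 2e-4, 10.0, 0.01)
```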

79 citations

Proceedings ArticleDOI
21 May 2018
TL;DR: In this paper, a deep recurrent model was proposed to predict the forces a garment will apply to a person's body during dressing, which can be used to provide better dressing assistance for people with disabilities.
Abstract: Robot-assisted dressing offers an opportunity to benefit the lives of many people with disabilities, such as some older adults. However, robots currently lack common sense about the physical implications of their actions on people. The physical implications of dressing are complicated by non-rigid garments, which can result in a robot indirectly applying high forces to a person's body. We present a deep recurrent model that, when given a proposed action by the robot, predicts the forces a garment will apply to a person's body. We also show that a robot can provide better dressing assistance by using this model with model predictive control. The predictions made by our model use only haptic and kinematic observations from the robot's end effector, which are readily attainable. Collecting training data from real-world physical human-robot interaction can be time consuming, costly, and put people at risk. Instead, we train our predictive model using data collected in an entirely self-supervised fashion from a physics-based simulation. We evaluated our approach with a PR2 robot that attempted to pull a hospital gown onto the arms of 10 human participants. With a 0.2 s prediction horizon, our controller succeeded at high rates and lowered applied force while navigating the garment around a person's fist and elbow without getting caught. Shorter prediction horizons resulted in significantly reduced performance, with the sleeve catching on the participants' fists and elbows, demonstrating the value of our model's predictions. These behaviors of mitigating catches emerged from our deep predictive model and the controller objective function, which primarily penalizes high forces.

79 citations

Journal IssueDOI
TL;DR: An integrated system for training ultrasound (US) guided needle puncture is presented, aiming to provide a validated training tool for interventional radiology (IR) that uses actual patient data; a volume haptic model that implements an effective model of needle puncture is also proposed.
Abstract: We present an integrated system for training ultrasound (US) guided needle puncture. Our aim is to provide a validated training tool for interventional radiology (IR) that uses actual patient data. IR procedures are highly reliant on the sense of touch and so haptic hardware is an important part of our solution. A hybrid surface-volume haptic rendering of an US transducer is proposed to constrain the device to remain outside the bony structures when scanning the patient's skin. A volume haptic model is proposed that implements an effective model of needle puncture. Force measurements have been made on real tissue and the resulting data is incorporated into the model. The other input data required is a computed tomography (CT) scan of the patient that is used to create the patient specific models. It is also the data source for a novel simulation of a virtual US scanner, which is used to guide the needle to the correct location. Copyright © 2007 John Wiley & Sons, Ltd.
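Haptic needle-puncture models of the kind described above are often piecewise: a spring-like membrane force while the needle deforms the tissue surface, then a drop at rupture followed by friction and cutting forces as the shaft slides deeper. A minimal sketch of that structure — the coefficients are made-up illustrative values, not the measured tissue data incorporated in the paper:

```python
# Illustrative piecewise needle-insertion force model for a haptic
# simulator: elastic membrane force before rupture, friction plus a
# constant cutting force after. All coefficients are assumptions.

def needle_force(depth_m, punctured, k=800.0,
                 friction_per_m=300.0, cutting=0.8):
    if not punctured:
        return k * depth_m                         # surface deformation (N)
    return friction_per_m * depth_m + cutting      # post-puncture sliding (N)

# Before rupture the force ramps up with tissue deformation...
pre = needle_force(0.004, punctured=False)   # 800 * 0.004 = 3.2 N
# ...then drops at the moment of puncture and grows with inserted length.
post = needle_force(0.004, punctured=True)   # 300 * 0.004 + 0.8 = 2.0 N
```

The sudden force drop at rupture is what gives the trainee the characteristic "pop" of a successful puncture; fitting the coefficients to measured tissue data, as the paper does, is what makes the sensation realistic.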

79 citations


Network Information
Related Topics (5)
Robot
103.8K papers, 1.3M citations
89% related
Mobile robot
66.7K papers, 1.1M citations
86% related
User interface
85.4K papers, 1.7M citations
82% related
Mobile device
58.6K papers, 942.8K citations
78% related
Control theory
299.6K papers, 3.1M citations
78% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    647
2022    1,508
2021    745
2020    1,056
2019    1,180
2018    1,034