
Showing papers on "Haptic technology published in 2006"


Patent
21 Feb 2006
TL;DR: In this article, a computer system is programmed to implement control parameters for controlling a surgical device to provide haptic guidance to the user and a limit on user manipulation of the surgical device, based on a relationship between an anatomy of the patient and at least one of a position, an orientation, a velocity, and an acceleration of a portion of the surgical device.
Abstract: A surgical apparatus includes a surgical device, configured to be manipulated by a user to perform a procedure on a patient, and a computer system. The computer system is programmed to implement control parameters for controlling the surgical device to provide at least one of haptic guidance to the user and a limit on user manipulation of the surgical device, based on a relationship between an anatomy of the patient and at least one of a position, an orientation, a velocity, and an acceleration of a portion of the surgical device, and to adjust the control parameters in response to movement of the anatomy during the procedure.

822 citations


Journal ArticleDOI
TL;DR: A new method for generating appropriate transients inverts a dynamic model of the haptic device to determine the motor forces required to create prerecorded acceleration profiles at the user's fingertips, providing an important new avenue for increasing the realism of contact in haptic interactions.
Abstract: Tapping on surfaces in a typical virtual environment feels like contact with soft foam rather than a hard object. The realism of such interactions can be dramatically improved by superimposing event-based, high-frequency transient forces over traditional position-based feedback. When scaled by impact velocity, hand-tuned pulses and decaying sinusoids produce haptic cues that resemble those experienced during real impacts. Our new method for generating appropriate transients inverts a dynamic model of the haptic device to determine the motor forces required to create prerecorded acceleration profiles at the user's fingertips. After development, the event-based haptic paradigm and the method of acceleration matching were evaluated in a carefully controlled user study. Sixteen individuals blindly tapped on nine virtual and three real samples, rating the degree to which each felt like real wood. Event-based feedback achieved significantly higher realism ratings than the traditional rendering method. The display of transient signals made virtual objects feel similar to a real sample of wood on a foam substrate, while position feedback alone received ratings similar to those of foam. This work provides an important new avenue for increasing the realism of contact in haptic interactions.

343 citations
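The event-based rendering idea above (a brief decaying sinusoid, scaled by impact velocity and superimposed on ordinary spring feedback) can be sketched as follows. All gains and constants here are illustrative assumptions, not the paper's hand-tuned or model-inverted values:

```python
import math

# Illustrative parameters (assumed for the sketch, not from the paper)
STIFFNESS = 800.0    # N/m, position-based spring term
AMP_PER_VEL = 30.0   # transient amplitude gain, N per (m/s) of impact velocity
FREQ_HZ = 150.0      # transient oscillation frequency
DECAY = 60.0         # 1/s exponential decay rate

def contact_force(penetration, impact_velocity, t_since_impact):
    """Spring force plus an event-based decaying-sinusoid transient,
    scaled by the velocity measured at the moment of impact."""
    spring = STIFFNESS * max(penetration, 0.0)
    transient = (AMP_PER_VEL * impact_velocity
                 * math.exp(-DECAY * t_since_impact)
                 * math.sin(2 * math.pi * FREQ_HZ * t_since_impact))
    return spring + transient
```

The transient is largest just after contact and dies out within tens of milliseconds, which is what makes the tap feel like wood rather than foam.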


Journal ArticleDOI
TL;DR: In this paper, the authors present a detailed review of the requirements and constraints that are involved in the design of a high-quality haptic arm exoskeleton for training and rehabilitation in virtual environments.
Abstract: A high-quality haptic interface is typically characterized by low apparent inertia and damping, high structural stiffness, minimal backlash, and absence of mechanical singularities in the workspace. In addition to these specifications, exoskeleton haptic interface design involves consideration of space and weight limitations, workspace requirements, and the kinematic constraints placed on the device by the human arm. These constraints impose conflicting design requirements on the engineer attempting to design an arm exoskeleton. In this paper, the authors present a detailed review of the requirements and constraints that are involved in the design of a high-quality haptic arm exoskeleton. In this context, the design of a five-degree-of-freedom haptic arm exoskeleton for training and rehabilitation in virtual environments is presented. The device is capable of providing kinesthetic feedback to the joints of the lower arm and wrist of the operator, and will be used in future work for robot-assisted rehabilitation and training. Motivation for such applications is based on findings that show robot-assisted physical therapy aids in the rehabilitation process following neurological injuries. As a training tool, the device provides a means to implement flexible, repeatable, and safe training methodologies.

303 citations


Journal ArticleDOI
TL;DR: Providing users with inadequate somesthetic feedback in virtual environments might impair their performance, just as major somesthetic loss does.
Abstract: What would be worse, losing your sight or your sense of touch? Although touch (more generally, somesthesis) is commonly underrated, major somesthetic loss can't be adequately compensated for by sight. It results in catastrophic impairments of hand dexterity, haptic capabilities, walking, perception of limb position, and so on. Providing users with inadequate somesthetic feedback in virtual environments might impair their performance, just as major somesthetic loss does.

300 citations


Journal ArticleDOI
TL;DR: A study of the stiff-object rendering problem is presented, which relates the maximum achievable object stiffness to the elements of the control loop and examines how the sampling rate, quantization, computational delay, and amplifier dynamics interact with the inertia and the natural viscous and Coulomb damping of the haptic device.
Abstract: Rendering stiff virtual objects remains a core challenge in the field of haptics. A study of this problem is presented, which relates the maximum achievable object stiffness to the elements of the control loop. In particular, we examine how the sampling rate, quantization, computational delay, and amplifier dynamics interact with the inertia and the natural viscous and Coulomb damping of the haptic device. Nonlinear effects create distinct stability regions, and many common devices operate stably, yet in violation of passivity criteria. An energy-based approach provides theoretical insights, supported by simulations, experimental data, and a describing function analysis. The presented results subsume previously known stability conditions.

295 citations
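For context, the classic sampled-data passivity bound (Colgate and Brown) limits the renderable wall stiffness K by the device's viscous damping b and the servo period T: K < 2b/T. The study above shows how Coulomb friction and quantization move the real stability boundary away from this conservative limit. A minimal sketch of the conservative bound:

```python
def max_passive_stiffness(damping_b, sample_period_t):
    """Conservative passivity limit K < 2b/T for a sampled virtual
    wall rendered on a device with viscous damping b (N*s/m) at
    servo period T (s). Illustrative of why stiff walls are hard."""
    return 2.0 * damping_b / sample_period_t

# A 1 kHz servo loop on a lightly damped device (0.01 N*s/m):
k_max = max_passive_stiffness(0.01, 0.001)   # only 20 N/m
```

Even at 1 kHz, a lightly damped device passively supports only a very soft wall, which is why devices that exceed this bound yet remain stable (via Coulomb friction and other nonlinear effects) matter in practice.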


Journal ArticleDOI
TL;DR: This study demonstrates that prior training on the LapMentor™ laparoscopic simulator leads to improved resident performance of basic skills in the animate operating room environment, and provides evidence that LapMentor™ training may lead to improved operating room performance.
Abstract: Numerous influences have led to the development of computer-aided simulation to facilitate training in minimally invasive surgery (MIS). These forces include the unique and complex nature of MIS procedures, the requirement for greater efficiency of surgical training due to resident duty hour restrictions and the stringent financial reality of the operating room environment, and most importantly, the legitimate increasing public demand to demonstrate some level of procedural competence prior to performing procedures in the human operating room. Ziv et al have suggested that simulation-based medical education is an ethical imperative, and that the use of simulation in training sends a message that patients are to be protected whenever possible and are not to be used as a convenience of training.1 Simulation-based training has long provided the framework for training in many other complex, high-risk professions (ie, aviation, nuclear power, and the military) with the goal of maximizing safety during training and minimizing risk. It is only recently, however, that simulation has been embraced in the healthcare environment as a possible means of facilitating safer surgical training. Laparoscopic procedures require psychomotor mapping of 3-dimensional space while interacting with a 2-dimensional image. It is this fact combined with the evolution of MIS which has provided the perfect platform and opportunity for the development of computer-aided simulation. Performance of laparoscopic procedures also requires complex psychomotor skills and utilization of optics and instrumentation that are vastly different from those used in conventional open surgery. Additionally, it is far more difficult for the mentor to directly guide the hand of the student and control the course of the laparoscopic procedure, thus often leading to frustration for both learner and instructor. 
Increasingly, evidence suggests that a well-structured curriculum, which incorporates virtual reality training for laparoscopic surgery, improves performance in both the animate2–6 and human7–10 operating rooms. The majority of these studies, however, have evaluated only 2 of the commercially available laparoscopic simulators (MIST VR™3,6,8,10 or LapSim™2,5), and very few of the other simulators currently on the market have undergone rigorous evaluation and validation. The LapMentor™ (Simbionix USA Corp., Cleveland, OH) is a high-fidelity, computer-aided simulator that provides a laparoscopic training curriculum comprised of basic skills, tutorial procedural tasks, and full procedures (such as laparoscopic cholecystectomy). At the completion of a task or procedure, the simulator provides immediate feedback to the trainee with measures of time, accuracy rate, efficiency of motion, and safety parameters displayed on the screen. The LapMentor™ is also equipped with a high-end technological haptic system, which transmits resistance when tissues or objects are encountered during the simulated task, a feature lacking in many other computer-aided simulators but obviously present in the operating room. While all of these features are highly attractive, the LapMentor™ costs in excess of $100,000, and effective transfer of laparoscopic skills from this simulator to the operating room has not yet been demonstrated. Therefore, the purpose of this study was to determine if training on the LapMentor™ laparoscopic simulator improves resident performance of basic MIS skills in the operating room environment as compared with residents who did not receive any prior training.

267 citations


Journal ArticleDOI
TL;DR: A new computer haptics algorithm to be used in general interactive manipulations of deformable virtual objects is presented and stable and realistic 6D haptic feedback is demonstrated through a clipping task experiment.
Abstract: A new computer haptics algorithm to be used in general interactive manipulations of deformable virtual objects is presented. In multimodal interactive simulations, haptic feedback computation often comes from contact forces. Subsequently, the fidelity of haptic rendering depends significantly on contact space modeling. Contact and friction laws between deformable models are often simplified in up to date methods. They do not allow a "realistic" rendering of the subtleties of contact space physical phenomena (such as slip and stick effects due to friction or mechanical coupling between contacts). In this paper, we use Signorini's contact law and Coulomb's friction law as a computer haptics basis. Real-time performance is made possible thanks to a linearization of the behavior in the contact space, formulated as the so-called Delassus operator, and iteratively solved by a Gauss-Seidel type algorithm. Dynamic deformation uses corotational global formulation to obtain the Delassus operator in which the mass and stiffness ratio are dissociated from the simulation time step. This last point is crucial to keep stable haptic feedback. This global approach has been packaged, implemented, and tested. Stable and realistic 6D haptic feedback is demonstrated through a clipping task experiment.

266 citations
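The Gauss-Seidel-type solve over the Delassus operator can be illustrated, for the frictionless Signorini case only, as a projected Gauss-Seidel sweep on a small linear complementarity problem. The matrix W below stands in for the Delassus operator and the numbers are illustrative, not from the paper:

```python
def projected_gauss_seidel(W, b, iters=50):
    """Solve the frictionless Signorini contact LCP
        g = W f + b,  g >= 0,  f >= 0,  g . f = 0
    by Gauss-Seidel sweeps, projecting each contact force
    onto the non-negative cone after every update."""
    n = len(b)
    f = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # residual gap at contact i, excluding its own force
            r = b[i] + sum(W[i][j] * f[j] for j in range(n)) - W[i][i] * f[i]
            f[i] = max(0.0, -r / W[i][i])
    return f

W = [[2.0, 0.0], [0.0, 2.0]]     # illustrative Delassus-like matrix
b = [-1.0, 1.0]                  # contact 0 penetrating, contact 1 separated
forces = projected_gauss_seidel(W, b)   # -> [0.5, 0.0]
```

Adding Coulomb friction, as the paper does, extends the projection from the non-negative cone to the friction cone at each contact.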


Journal ArticleDOI
01 Jul 2006
TL;DR: A framework is proposed for haptic interaction with a reactive virtual human in a physically simulated virtual world; experiments confirm that the framework creates realistic reactions and that users can easily estimate the input motions of the avatar.
Abstract: In this article we propose a framework for haptic interaction with a reactive virtual human in a physically simulated virtual world. The user controls an avatar in the virtual world via human-scale haptic interface and interacts with the virtual human through the avatar. The virtual human recognizes the user's motion and reacts to it. We create a virtual boxing system as an application of the proposed framework. We performed an experiment to evaluate the validity of the reaction of the virtual human. We got confirmation that the proposed framework creates realistic reactions and that users can easily estimate the input motions of the avatar.

241 citations


Proceedings ArticleDOI
22 Apr 2006
TL;DR: The first stages are presented of a systematic design effort to break a vicious cycle whereby inadequate haptic technology obstructs the inception of vitalizing applications, beginning with specific usage scenarios and a new handheld display platform based on lateral skin stretch.
Abstract: Mobile interaction can potentially be enhanced with well-designed haptic control and display. However, advances have been limited by a vicious cycle whereby inadequate haptic technology obstructs inception of vitalizing applications. We present the first stages of a systematic design effort to break that cycle, beginning with specific usage scenarios and a new handheld display platform based on lateral skin stretch. Results of a perceptual device characterization inform mappings between device capabilities and specific roles in mobile interaction, and the next step of hardware re-engineering.

230 citations


PatentDOI
TL;DR: In this article, a wide variety of actuator types may be employed to provide synchronized vibration, including linear actuators, rotary actuators and rotating eccentric mass actuators (REMA).
Abstract: The present invention relates to synchronized vibration devices (620) that can provide haptic feedback to a user. A wide variety of actuator types may be employed to provide synchronized vibration, including linear actuators (100), rotary actuators (300), rotating eccentric mass actuators (304), and rocking or pivoting mass actuators (400, 490). A controller (502) may send signals to one or more driver circuits (504) for directing operation of the actuators. The controller may provide direction and amplitude control (508), vibration control (512), and frequency control (510) to direct the haptic experience. Parameters such as frequency, phase, amplitude, duration, and direction can be programmed or input as different patterns suitable for use in gaming, virtual reality and real-world situations.

217 citations
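The directional effect of synchronizing two counter-rotating eccentric masses can be sketched with a simple kinematic model. This is an illustrative reading of the patent's idea, not its actual control circuitry: when both masses spin at the same frequency, their shared phase sets the axis along which the centrifugal forces reinforce, while the orthogonal components cancel.

```python
import math

def net_force(t, freq_hz, amplitude, phase):
    """Combined force of two counter-rotating eccentric masses
    driven at the same frequency. The shared phase picks the
    axis of the resulting linear (directional) vibration."""
    w = 2 * math.pi * freq_hz
    # mass A rotates one way, mass B the other; components sum
    fx = amplitude * (math.cos(w * t + phase) + math.cos(-w * t + phase))
    fy = amplitude * (math.sin(w * t + phase) + math.sin(-w * t + phase))
    return fx, fy
```

By the angle-sum identities the result is a force of magnitude 2A·cos(wt) along the direction given by the phase angle, i.e. changing only the phase steers the vibration axis.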


Patent
Rudolf Ritter1, Eric Lauper1
04 Sep 2006
TL;DR: In this paper, a communication device, system and method comprising a Virtual Retinal Display (VRD) in the form of glasses, with at least one haptic sensor mounted on the frame of said glasses or connected by a short range communication interface to said glasses, was disclosed.
Abstract: Disclosed are a communication device, system and method comprising a Virtual Retinal Display (VRD) in the form of glasses (1) and at least one haptic sensor (12) mounted on the frame of said glasses or connected by a short range communication interface (13) to said glasses (1), such that it is possible to navigate by means of a cursor through an image displayed by the Virtual Retinal Display (VRD) with the at least one haptic sensor (12). A central control unit (11) controls the Virtual Retinal Display (VRD) and the at least one haptic sensor (12). When the Virtual Retinal Display (VRD) is connected to an external device (2, 9) by a short range communication interface (13), the user can navigate through the content of the external device (2, 9) by easy use of the haptic sensor (12).

Patent
19 Apr 2006
TL;DR: A palpation simulator as discussed by the authors is an interface for interfacing a user with a computer running a palpation simulation, which generates a graphical environment comprising a cursor and a graphical representation of at least a portion of a living body.
Abstract: A palpation simulator comprises an interface for interfacing a user with a computer running a palpation simulation. The computer generates a graphical environment comprising a cursor and a graphical representation of at least a portion of a living body. In one version, a method comprises providing an object in communication with the computer, controlling the cursor in relation to manipulation of at least a portion of the object by the user, and outputting a haptic sensation to the user when the cursor interacts with a region within the graphical representation to provide the user with haptic feedback related to a simulated palpation of the region.

Journal ArticleDOI
TL;DR: In this paper, a shared-control interaction paradigm for haptic interface systems is presented, where the haptic device contributes to execution of a dynamic target-hitting task via force commands from an automatic controller.
Abstract: This paper presents a shared-control interaction paradigm for haptic interface systems, with experimental data from two user studies. Shared control, evolved from its initial telerobotics applications, is adapted as a form of haptic assistance in that the haptic device contributes to execution of a dynamic manual target-hitting task via force commands from an automatic controller. Compared to haptic virtual environments, which merely display the physics of the virtual system, or to passive methods of haptic assistance for performance enhancement based on virtual fixtures, the shared-control approach offers a method for actively demonstrating desired motions during virtual environment interactions. The paper presents a thorough review of the literature related to haptic assistance. In addition, two experiments were conducted to independently verify the efficacy of the shared-control approach for performance enhancement and improved training effectiveness of the task. In the first experiment, shared control is found to be as effective as virtual fixtures for performance enhancement, with both methods resulting in significantly better performance in terms of time between target hits for the manual target-hitting task than sessions where subjects feel only the forces arising from the mass-spring-damper system dynamics. Since shared control is more general than virtual fixtures, this approach may be extremely beneficial for performance enhancement in virtual environments. In terms of training enhancement, shared control and virtual fixtures were no better than practice in an unassisted mode. For manual control tasks, such as the one described in this paper, shared control is beneficial for performance enhancement, but may not be viable for enhancing training effectiveness.

Journal ArticleDOI
TL;DR: This technical note describes a new robotic workstation for neurological rehabilitation, shortly named Braccio di Ferro, designed by having in mind the range of forces and the frequency bandwidth that characterize the interaction between a patient and a physical therapist.
Abstract: This technical note describes a new robotic workstation for neurological rehabilitation, shortly named Braccio di Ferro. It has been designed by having in mind the range of forces and the frequency bandwidth that characterize the interaction between a patient and a physical therapist, as well as a number of requirements that we think are essential for allowing a natural haptic interaction: back-driveability, very low friction and inertia, mechanical robustness, the possibility to operate in different planes, and an open software environment, which allows the operator to add new functionalities and design personalized rehabilitation protocols. Braccio di Ferro is an open system and, in the spirit of open source design, is intended to foster the dissemination of robot therapy. Moreover, its combination of features is not present in commercially available systems.

Patent
20 Jun 2006
TL;DR: In this article, a design interface tool for designing force sensations for use with a host computer and haptic feedback interface device is presented, where the user selects and characterizes force sensations using the interface tool, and a graphical representation of the characterized force sensation is displayed.
Abstract: A design interface tool for designing force sensations for use with a host computer and haptic feedback interface device. A haptic feedback device communicates with a host computer that displays the interface tool. The user selects and characterizes force sensations using the interface tool, and a graphical representation of the characterized force sensation is displayed. The characterized force sensation is output to a user manipulatable object of the force feedback device so that the user can feel the designed force sensation. The user can include multiple force sensations in a compound force sensation, where the compound sensation is graphically displayed to indicate the relative start times and duration of each of the force sensations. The user can also associate a sound with the force sensation, such that the sound is output in conjunction with the output of the force sensation.

Journal ArticleDOI
TL;DR: It is confirmed that the tactile display can be used as a navigation aid outdoors and that the vibrotactile patterns presented can be interpreted as directional or instructional cues with almost perfect accuracy.
Abstract: A wirelessly controlled tactile display has been designed, fabricated and tested for use as a navigation aid. The display comprises a 4 × 4 array of vibrating motors that is mounted on a waist band...

Journal ArticleDOI
TL;DR: A predictive haptic guidance method based on a look-ahead algorithm, along with a user evaluation which compares it with other approaches (no guidance and a standard potential-field method) in a 1-DoF steered path-following scenario, suggests the potential of predictive methods in aiding manual control of dynamic interactive tasks where intelligent support is available.
Abstract: Intelligent systems are increasingly able to offer real-time information relevant to a user's manual control of an interactive system, such as dynamic system control space constraints for animation control and driving. However, it is difficult to present this information in a usable manner and other approaches which have employed haptic cues for manual control in "slow" systems often lead to instabilities in highly dynamic tasks. We present a predictive haptic guidance method based on a look-ahead algorithm, along with a user evaluation which compares it with other approaches (no guidance and a standard potential-field method) in a 1-DoF steered path-following scenario. Look-ahead guidance outperformed the other methods in both quantitative performance and subjective preference across a range of path complexity and visibility and a force analysis demonstrated that it applied smaller and fewer forces to users. These results (which appear to derive from the predictive guidance's supporting users in taking earlier and more subtle corrective action) suggest the potential of predictive methods in aiding manual control of dynamic interactive tasks where intelligent support is available.

Patent
Kim Kyu Yong1, Sang-Youn Kim1, Byung-seok Soh1, Gyung-hye Yang1, Lee Yong Beom1 
27 Oct 2006
TL;DR: In this article, a haptic button providing various stimulations to a user according to a current application and a haptic device using the same are provided; the button includes an electro-active polymer having a flat shape, a pair of electrodes contacting two sides of the electro-active polymer, an electric circuit applying a predetermined voltage, and a sensor sensing a button input from a user.
Abstract: A haptic button providing various stimulations to a user according to a current application and a haptic device using the same are provided. The haptic button includes an electro-active polymer having a flat shape, a pair of electrodes contacting two sides of the electro-active polymer, an electric circuit applying a predetermined voltage to the pair of electrodes, and a sensor sensing a button input from a user, wherein stimulation provided from the electro-active polymer to the user is changed by changing a waveform of the voltage according to a current application status.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the impact of haptic augmentation of a science inquiry program on students' learning about viruses and nanoscale science and found that haptic feedback from the haptic gaming joystick and the PHANToM provided a more immersive learning environment that not only made the instruction more engaging but may also influence the way in which the students construct their understandings about abstract science concepts.
Abstract: This study investigated the impact of haptic augmentation of a science inquiry program on students' learning about viruses and nanoscale science. The study assessed how the addition of different types of haptic feedback (active touch and kinesthetic feedback) combined with computer visualizations influenced middle and high school students' experiences. The influences of a PHANToM (a sophisticated haptic desktop device), a Sidewinder (a haptic gaming joystick), and a mouse (no haptic feedback) interface were compared. The levels of engagement in the instruction and students' attitudes about the instructional program were assessed using a combination of constructed response and Likert scale items. Potential cognitive differences were examined through an analysis of spontaneously generated analogies that appeared during student discourse. Results showed that the addition of haptic feedback from the haptic-gaming joystick and the PHANToM provided a more immersive learning environment that not only made the instruction more engaging but may also influence the way in which the students construct their understandings about abstract science concepts. © 2005 Wiley Periodicals, Inc. Sci Ed 90:111–123, 2006

Journal ArticleDOI
TL;DR: In this paper, the role of the visual and haptic senses on the elongation bias was highlighted, which predicts that the taller of two equivolume objects will appear bigger.
Abstract: We highlight the role of interacting senses on consumer judgment. Specifically, we focus on the role of the visual and haptic (touch) senses on the elongation bias, which predicts that the taller of two equivolume objects will appear bigger. We show that sensory modality will affect the extent (and even direction) of the elongation bias—with visual cues alone and with bimodal “visual and haptic cues” (seeing and handling the objects), we obtain the elongation bias; however, with haptic cues alone (handling the objects blindfolded) and in bimodal judgments with visual load, we obtain a reversal of the elongation bias.

Journal ArticleDOI
TL;DR: This paper presents a new teleoperation approach using a virtual spring and local contact force control on the slave robot, which allows the desired motion and contact forces to be combined in a single force command.
Abstract: This paper presents a new teleoperation approach using a virtual spring, and local contact force control on the slave robot. The operational space framework provides the control structure needed to achieve decoupled task dynamics. A virtual spring connects the master and slave systems and a closed-loop force controller compensates for the dynamics of the slave system, rendering transparent the effector of the slave robotic system. The active force control approach allows the desired motion and contact forces to be combined in a single force command. The required performance and robustness of force control are achieved by a full state reconstruction using a modified Kalman estimator, which addresses disturbances and modeling uncertainties. The performance of both telepresence and force control are further improved by on-line stiffness estimation of the object in contact with the effector. The redundancy of the mobile manipulation system is addressed through a decoupled decomposition of task and posture dynamics.
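The virtual-spring coupling at the heart of this approach can be sketched in a few lines. The stiffness and damping values are illustrative assumptions; the real controller adds the closed-loop force compensation, Kalman-based state reconstruction, and online stiffness estimation described above:

```python
def coupling_force(x_master, x_slave, v_master=0.0, v_slave=0.0,
                   stiffness=200.0, damping=5.0):
    """Force command sent to the slave's local force controller:
    a virtual spring (with an illustrative damping term) stretched
    between the master and slave positions, so that desired motion
    and contact force travel as a single command."""
    return (stiffness * (x_master - x_slave)
            + damping * (v_master - v_slave))
```

In free space the command drives the slave to track the master; in contact, the spring stretch directly becomes the commanded contact force, which is what lets one channel carry both.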

Proceedings ArticleDOI
11 Oct 2006
TL;DR: A modular electronic device is presented that allows users to perceive and respond simultaneously to multiple spatial information sources using haptic stimuli; among the numerous potential applications are electronic travel aids and visual prosthetics for the blind, augmentation of spatial awareness in hazardous working environments, as well as enhanced obstacle awareness for motorcycle or car drivers.
Abstract: We are developing a modular electronic device to allow users to perceive and respond simultaneously to multiple spatial information sources using haptic stimulus. Each module of this wearable "haptic radar" acts as an artificial hair capable of sensing obstacles, measuring their range and transducing this information as a vibro-tactile cue on the skin directly beneath the module. Our first prototype (a headband) provides the wearer with 360 degrees of spatial awareness thanks to invisible, insect-like antennas. During a proof-of-principle experiment, a significant proportion (87%, p = 1.26 × 10^-5) of participants moved to avoid an unseen object approaching from behind without any previous training. Participants reported the system to be helpful, easy to use, and intuitive. Among the numerous potential applications of this interface are electronic travel aids and visual prosthetics for the blind, augmentation of spatial awareness in hazardous working environments, as well as enhanced obstacle awareness for motorcycle or car drivers (in this case the sensors may cover the surface of the car).

Proceedings ArticleDOI
21 Apr 2006
TL;DR: A working prototype informed by a pilot study is presented of TapTap, a wearable haptic system that allows nurturing human touch to be recorded, broadcast and played back for emotional therapy.
Abstract: TapTap is a wearable haptic system that allows nurturing human touch to be recorded, broadcast and played back for emotional therapy. Haptic input/output modules in a convenient modular scarf provide affectionate touch that can be personalized. We present a working prototype informed by a pilot study.

Journal ArticleDOI
TL;DR: A multi-modal human-scale VE VIREPSE (virtual reality platform for simulation and experimentation) that provides haptic interaction using a string-based interface called SPIDAR, olfactory and auditory feedbacks is described and an application that allows students experiencing the abstract concept of the Bohr atomic model and the quantization of the energy levels has been developed.
Abstract: It has been suggested that immersive virtual reality (VR) technology allows knowledge-building experiences and in this way provides an alternative educational process. Important key features of constructivist educational computer-based environments for science teaching and learning, include interaction, size, transduction and reification. Indeed, multi-sensory VR technology suits very well the needs of sciences that require a higher level of visualization and interaction. Haptics that refers to physical interactions with virtual environments (VEs) may be coupled with other sensory modalities such as vision and audition but are hardly ever associated with other feedback channels, such as olfactory feedback. A survey of theory and existing VEs including haptic or olfactory feedback, especially in the field of education is provided. Our multi-modal human-scale VE VIREPSE (virtual reality platform for simulation and experimentation) that provides haptic interaction using a string-based interface called SPIDAR (space interface device for artificial reality), olfactory and auditory feedbacks is described. An application that allows students experiencing the abstract concept of the Bohr atomic model and the quantization of the energy levels has been developed. Different configurations that support interaction, size and reification through the use of immersive and multi-modal (visual, haptic, auditory and olfactory) feedback are proposed for further evaluation. Haptic interaction is achieved using different techniques ranging from desktop pseudo-haptic feedback to human-scale haptic interaction. Olfactory information is provided using different fan-based olfactory displays (ODs). Significance of developing such multi-modal VEs for education is discussed.

Patent
24 Jan 2006
TL;DR: In this article, a compact haptic and augmented virtual reality system that produces an augmented reality environment is presented, equipped with software and devices that provide users with stereoscopic visualization and force feedback simultaneously in real time.
Abstract: The invention provides a compact haptic and augmented virtual reality system that produces an augmented reality environment. The system is equipped with software and devices that provide users with stereoscopic visualization and force feedback simultaneously in real time. High resolution, high pixel density, and head- and hand-tracking ability are provided. Well-matched haptics and graphics volumes are realized. Systems of the invention are compact, making use of a standard personal display device, e.g., a computer monitor, as the display driver. Systems of the invention may therefore be inexpensive compared to many conventional virtual reality systems.

Patent
13 Sep 2006
TL;DR: In this article, a method for mapping between an event of interest and a corresponding haptic effect is presented, allowing a user to associate haptic effects with one or more events of interest.
Abstract: In one embodiment, a method for mapping between an event of interest and a corresponding haptic effect includes: providing a plurality of haptic effects to a user; allowing the user to associate haptic effects with one or more events of interest; and compiling the mappings made between various events of interest and corresponding haptic effects into a haptic lookup table, storable in memory.

Proceedings ArticleDOI
02 Nov 2006
TL;DR: It is found that users could consistently recall an arbitrary association between a haptic stimulus and its assigned arbitrary meaning in a 9-phoneme set, during a 45 minute test period following a reinforced learning stage.
Abstract: A haptic phoneme represents the smallest unit of a constructed haptic signal to which a meaning can be assigned. These haptic phonemes can be combined serially or in parallel to form haptic words, or haptic icons, which can hold more elaborate meanings for their users. Here, we use phonemes which consist of brief (
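The composition idea in the abstract, atomic phonemes concatenated into meaning-bearing haptic icons, can be sketched as a small data model. The waveform/frequency attributes and the particular 3×3 phoneme set below are illustrative assumptions, not the authors' stimulus design:

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Phoneme:
    """Smallest unit of a constructed haptic signal to which meaning is assigned."""
    waveform: str     # hypothetical, e.g. "sine", "square", "ramp"
    frequency: float  # Hz

# An illustrative 9-phoneme set: 3 waveforms x 3 frequencies.
PHONEMES = [Phoneme(w, f) for w, f in product(
    ("sine", "square", "ramp"), (80.0, 150.0, 300.0))]

def haptic_icon(*phonemes: Phoneme) -> tuple:
    """Serially concatenate phonemes into a haptic 'word' (icon)."""
    return tuple(phonemes)

# Arbitrary learned associations between icons and their meanings.
meanings: dict[tuple, str] = {
    haptic_icon(PHONEMES[0], PHONEMES[4]): "message received",
    haptic_icon(PHONEMES[8]): "battery low",
}
```

Parallel composition (playing phonemes simultaneously rather than in sequence) would be modeled analogously, e.g. as a set of phonemes sharing one time slot.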


Journal ArticleDOI
TL;DR: These results indicate that both forms of robotic demonstration can improve short-term performance of a novel desired path and the motor system is inclined to repeat its previous mistakes following just a few movements without robotic demonstration, but these systematic errors can be reduced with periodic training.
Abstract: Background: Mechanical guidance with a robotic device is a candidate technique for teaching people desired movement patterns during motor rehabilitation, surgery, and sports training, but it is unclear how effective this approach is as compared to visual demonstration alone. Further, little is known about motor learning and retention involved with either robot-mediated mechanical guidance or visual demonstration alone. Methods: Healthy subjects (n = 20) attempted to reproduce a novel three-dimensional path after practicing it with mechanical guidance from a robot. Subjects viewed their arm as the robot guided it, so this "haptic guidance" training condition provided both somatosensory and visual input. Learning was compared to reproducing the movement following only visual observation of the robot moving along the path, with the hand in the lap (the "visual demonstration" training condition). Retention was assessed periodically by instructing the subjects to reproduce the path without robotic demonstration. Results: Subjects improved in ability to reproduce the path following practice in the haptic guidance or visual demonstration training conditions, as evidenced by a 30–40% decrease in spatial error across 126 movement attempts in each condition. Performance gains were not significantly different between the two techniques, but there was a nearly significant trend for the visual demonstration condition to be better than the haptic guidance condition (p = 0.09). The 95% confidence interval of the mean difference between the techniques was at most 25% of the absolute error in the last cycle. When asked to reproduce the path repeatedly following either training condition, the subjects' performance degraded significantly over the course of a few trials. The tracing errors were not random, but instead were consistent with a systematic evolution toward another path, as if being drawn to an "attractor path". 
Conclusion: These results indicate that both forms of robotic demonstration can improve short-term performance of a novel desired path. The availability of both haptic and visual input during the haptic guidance condition did not significantly improve performance compared to visual input alone in the visual demonstration condition. Further, the motor system is inclined to repeat its previous mistakes following just a few movements without robotic demonstration, but these systematic errors can be reduced with periodic training.

Journal ArticleDOI
TL;DR: The combination of a linearized contact model that frees the simulation from the computational bottleneck of collision detection with penalty-based collision response well suited for fixed time-stepping guarantees that the motion of the virtual tool is simulated at the same high rate as the synthesis of feedback force and torque.
Abstract: This paper presents a modular algorithm for six-degree-of-freedom (6-DOF) haptic rendering. The algorithm aims to provide transparent manipulation of rigid models with a high polygon count. On the one hand, enabling a stable display is simplified by exploiting the concept of virtual coupling and employing passive implicit integration methods for the simulation of the virtual tool. On the other hand, transparency is enhanced by maximizing the update rate of the simulation of the virtual tool, and thereby the coupling impedance, and allowing for stable simulation with small mass values. The combination of a linearized contact model that frees the simulation from the computational bottleneck of collision detection, with penalty-based collision response well suited for fixed time-stepping, guarantees that the motion of the virtual tool is simulated at the same high rate as the synthesis of feedback force and torque. Moreover, sensation-preserving multiresolution collision detection ensures a fast update of the linearized contact model in complex contact scenarios, and a novel contact clustering technique alleviates possible instability problems induced by penalty-based collision response.
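The abstract's core loop, a penalty force pushing the tool out of penetration, a virtual-coupling spring-damper linking the device to the tool, and a fixed-step tool simulation at the haptic rate, can be sketched as follows. The gains, masses, and semi-implicit Euler step are illustrative stand-ins (the paper uses passive implicit integration), not the authors' implementation:

```python
import numpy as np

def penalty_force(depth: float, normal: np.ndarray, k: float = 2000.0) -> np.ndarray:
    """Penalty-based collision response: restoring force proportional to
    penetration depth, directed along the contact normal."""
    return k * depth * normal

def coupling_force(device_pos, tool_pos, device_vel, tool_vel,
                   k: float = 500.0, b: float = 5.0) -> np.ndarray:
    """Virtual coupling: a spring-damper between the haptic device
    configuration and the simulated virtual tool. The same force (negated)
    is what the user feels through the device."""
    return k * (device_pos - tool_pos) + b * (device_vel - tool_vel)

def step_tool(pos, vel, force, mass: float = 0.1, dt: float = 1e-3):
    """One fixed time step of the virtual-tool simulation at the haptic
    rate (~1 kHz). Semi-implicit Euler here; the paper uses a passive
    implicit integrator to allow small tool masses stably."""
    vel = vel + (force / mass) * dt
    pos = pos + vel * dt
    return pos, vel
```

Because the contact model is linearized between (slower) collision-detection updates, both `penalty_force` and `coupling_force` can be evaluated every 1 ms step, keeping the tool simulation and the force/torque synthesis at the same high rate.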