
Showing papers on "Haptic technology published in 1997"


Proceedings ArticleDOI
03 Aug 1997
TL;DR: This work has developed a haptic rendering system that allows for the efficient tactile display of graphical information and uses a common high-level framework to model contact constraints, surface shading, friction and texture.
Abstract: Force feedback coupled with visual display allows people to interact intuitively with complex virtual environments. For this synergy of haptics and graphics to flourish, however, haptic systems must be capable of modeling environments with the same richness, complexity and interactivity that can be found in existing graphic systems. To help meet this challenge, we have developed a haptic rendering system that allows for the efficient tactile display of graphical information. The system uses a common high-level framework to model contact constraints, surface shading, friction and texture. The multilevel control system also helps ensure that the haptic device will remain stable even as the limits of the renderer's capabilities are reached.

661 citations


Proceedings ArticleDOI
03 Aug 1997
TL;DR: A unified framework for virtual-environment interaction based on proprioception, a person's sense of the position and orientation of his body and limbs, is presented, allowing a user to interact with a virtual world intuitively, efficiently, precisely, and lazily.
Abstract: Manipulation in immersive virtual environments is difficult partly because users must do without the haptic contact with real objects they rely on in the real world to orient themselves and their manipulanda. To compensate for this lack, we propose exploiting the one real object every user has in a virtual environment, his body. We present a unified framework for virtual-environment interaction based on proprioception, a person's sense of the position and orientation of his body and limbs. We describe three forms of body-relative interaction:
• Direct manipulation: ways to use body sense to help control manipulation
• Physical mnemonics: ways to store/recall information relative to the body
• Gestural actions: ways to use body-relative actions to issue commands
Automatic scaling is a way to bring objects instantly within reach so that users can manipulate them using proprioceptive cues. Several novel virtual interaction techniques based upon automatic scaling and our proposed framework of proprioception allow a user to interact with a virtual world intuitively, efficiently, precisely, and lazily. We report the results of both informal user trials and formal user studies of the usability of the body-relative interaction techniques presented.

661 citations
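The automatic scaling described above reduces to a simple geometric rule: shrink the world uniformly about the user's head until the grabbed object sits within arm's reach. A minimal sketch of that idea in Python; the function name, single-scalar treatment, and default reach are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def auto_scale(head_pos, object_pos, arm_reach=0.6):
    """Scale the world about the head so a grabbed object lands within reach.

    Returns a uniform scale factor and the object's scaled position.
    (Hypothetical names; a sketch of the scaled-world idea only.)
    """
    offset = object_pos - head_pos
    distance = np.linalg.norm(offset)
    if distance <= arm_reach:
        return 1.0, object_pos          # already manipulable in place
    scale = arm_reach / distance        # shrink the world uniformly about the head
    return scale, head_pos + scale * offset

head = np.array([0.0, 1.7, 0.0])
obj = np.array([3.0, 1.0, -4.0])
s, new_pos = auto_scale(head, obj)
print(s, new_pos)                       # the object now sits exactly at arm's reach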


Journal ArticleDOI
TL;DR: The major advances in a new discipline, Computer Haptics (analogous to computer graphics), that is concerned with the techniques and processes associated with generating and displaying haptic stimuli to the human user are described.

576 citations


Journal ArticleDOI
TL;DR: Passivity of systems comprising a continuous time plant and discrete time controller is considered and an example—implementation of a “virtual wall” via a one degree-of-freedom haptic interface—is given and discussed in some detail.
Abstract: Passivity of systems comprising a continuous time plant and discrete time controller is considered. This topic is motivated by stability considerations arising in the control of robots and force-reflecting human interfaces (“haptic interfaces”). Necessary conditions for passivity are found via a small gain condition, and sufficient conditions are found via an application of Parseval's theorem and a sequence of frequency domain manipulations. An example—implementation of a “virtual wall” via a one degree-of-freedom haptic interface—is given and discussed in some detail.

456 citations
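This analysis underlies the widely cited sampled-data passivity bound for a virtual wall of stiffness K and virtual damping B rendered at sample period T on a device with physical damping b, usually stated as b > KT/2 + |B|. A minimal checker in Python, assuming that form of the condition:

```python
def virtual_wall_passive(b, K, B, T):
    """Check the sampled-data passivity condition b > K*T/2 + |B|.

    b: physical damping of the device [N*s/m]
    K: virtual wall stiffness [N/m]
    B: virtual wall damping [N*s/m]
    T: servo period [s]
    The exact form of the bound is taken as an assumption here.
    """
    return b > K * T / 2.0 + abs(B)

# A stiff wall rendered at 1 kHz on a device with 2 N*s/m of friction:
print(virtual_wall_passive(b=2.0, K=3000.0, B=0.0, T=0.001))  # True:  2 > 1.5
print(virtual_wall_passive(b=2.0, K=5000.0, B=0.0, T=0.001))  # False: 2 < 2.5
```

The practical reading: halving the servo period doubles the wall stiffness a given device can render passively.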



Proceedings ArticleDOI
22 Mar 1997
TL;DR: This paper presents the design of the prototype inTouch system which provides a physical link between users separated by distance, and introduces a new approach for applying haptic feedback technology to interpersonal communication.
Abstract: In this paper, we introduce a new approach for applying haptic feedback technology to interpersonal communication. We present the design of our prototype inTouch system which provides a physical link between users separated by distance.

318 citations


Patent
16 May 1997
TL;DR: In this paper, a method and apparatus for providing force feedback to a user operating a human/computer interface device and interacting with a computer-generated simulation are described; the physical object manipulated by the user imparts sensations corresponding to the interaction of the simulated objects.
Abstract: A method and apparatus for providing force feedback to a user operating a human/computer interface device (14) and interacting with a computer-generated simulation (20). In one aspect, a computer implemented method simulates the interaction of simulated objects displayed to a user who controls one of the simulated objects by manipulating a physical object (34) of an interface device (14). The physical object provides force feedback to the user which imparts a physical sensation corresponding to the interaction of the computer simulated objects.

287 citations


Journal ArticleDOI
TL;DR: Inspired by the authors' previous work in interpreting robot touch sensor information and study of human touch perception, the Phantom interface permits users to feel the forces of interaction they would encounter while touching objects with the end of a stylus or the tip of their finger.
Abstract: In 1993, haptic interaction with computers took a significant step forward with the development of the Phantom haptic interface. This simple device has spawned a new field analogous to computer graphics: computer haptics, defined as the discipline concerned with the techniques and processes associated with generating and displaying synthesized haptic stimuli to the human user. Inspired by the authors' previous work in interpreting robot touch sensor information and study of human touch perception, the Phantom interface permits users to feel the forces of interaction they would encounter while touching objects with the end of a stylus or the tip of their finger. The resulting sensations prove startling, and many first-time users are quite surprised at the compelling sense of physical presence they encounter when touching virtual objects. To appreciate why the Phantom system succeeded where others failed, you need to understand the nature and functioning of the human haptic system.

209 citations


Book ChapterDOI
19 Mar 1997
TL;DR: A system for simulating arthroscopic knee surgery that is based on volumetric object models derived from 3D Magnetic Resonance Imaging is presented and feedback is provided to the user via real-time volume rendering and force feedback for haptic exploration.
Abstract: A system for simulating arthroscopic knee surgery that is based on volumetric object models derived from 3D Magnetic Resonance Imaging is presented. Feedback is provided to the user via real-time volume rendering and force feedback for haptic exploration. The system is the result of a unique collaboration between an industrial research laboratory, two major universities, and a leading research hospital. In this paper, components of the system are detailed and the current state of the integrated system is presented. Issues related to future research and plans for expanding the current system are discussed.

175 citations
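Haptic exploration of a voxel model like the one above is often implemented as a gradient-based penalty force. The sketch below illustrates that generic technique, not the paper's actual algorithm; all names, constants, and the nearest-voxel sampling are assumptions:

```python
import numpy as np

def haptic_force_from_volume(volume, pos, stiffness=1.0, threshold=0.5):
    """Generic gradient-penalty sketch: push the probe out of 'tissue'
    (density above threshold) along the local density gradient."""
    i, j, k = np.clip(np.round(pos).astype(int), 1, np.array(volume.shape) - 2)
    if volume[i, j, k] < threshold:
        return np.zeros(3)              # probe in free space: no force
    # Central-difference gradient points toward increasing density;
    # the restoring force pushes the probe down the gradient.
    grad = np.array([
        volume[i + 1, j, k] - volume[i - 1, j, k],
        volume[i, j + 1, k] - volume[i, j - 1, k],
        volume[i, j, k + 1] - volume[i, j, k - 1],
    ]) / 2.0
    return -stiffness * volume[i, j, k] * grad

vol = np.zeros((32, 32, 32))
vol[10:20, 10:20, 10:20] = 1.0          # a toy block standing in for MRI data
print(haptic_force_from_volume(vol, np.array([10.0, 15.0, 15.0])))
```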


Journal ArticleDOI
TL;DR: Investigation of whether estimates of ease of part handling and part insertion can be provided by multimodal simulation using virtual environment (VE) technology, rather than by conventional table-based methods such as Boothroyd and Dewhurst charts, shows that the multimodal VE is able to replicate experimental results.
Abstract: The goal of this work is to investigate whether estimates of ease of part handling and part insertion can be provided by multimodal simulation using virtual environment (VE) technology, rather than by using conventional table-based methods such as Boothroyd and Dewhurst charts. The long term goal is to extend CAD systems to evaluate and compare alternative designs using design for assembly analysis. A unified physically based model has been developed for modeling dynamic interactions among virtual objects and haptic interactions between the human designer and the virtual objects. This model is augmented with auditory events in a multimodal VE system called the Virtual Environment for Design for Assembly (VEDA). The designer sees a visual representation of the objects, hears collision sounds when objects hit each other and can feel and manipulate the objects through haptic interface devices with force feedback. Currently these models are 2D in order to preserve interactive update rates. Experiments were conducted with human subjects using a two-dimensional peg-in-hole apparatus and a VEDA simulation of the same apparatus. The simulation duplicated as well as possible the weight, shape, size, peg-hole clearance, and frictional characteristics of the physical apparatus. The experiments showed that the multimodal VE is able to replicate experimental results in which increased task completion times correlated with increasing task difficulty (measured as increased friction, increased handling distance combined with decreased peg-hole clearance). However, the multimodal VE task completion times are approximately two times the physical apparatus completion times. A number of possible factors for this temporal discrepancy have been identified but their effect has not been quantified.

134 citations


01 Dec 1997
TL;DR: A set of software algorithms that work with a force reflecting haptic interface and enable the user to touch and feel arbitrary 3D polyhedral virtual objects in virtual environments are developed.
Abstract: In this paper, we propose a new ray-based haptic rendering method for displaying 3D objects in virtual environments. We have developed a set of software algorithms that work with a force reflecting haptic interface and enable the user to touch and feel arbitrary 3D polyhedral virtual objects. Using the interface device and the suggested model, the user can explore the shape and surface details of objects, such as their texture. The components of the model include a hierarchical database for storing geometrical and material properties of objects, collision detection algorithms, a simple mechanistic model for computing the forces of interaction between the 3D virtual objects and the force-reflecting device, and a haptic filtering technique for simulating surface details of objects such as smoothness and texture. The developed algorithms together with a haptic interface device have several applications in areas such as medicine, education, computer animation, teleoperation, entertainment, and rehabilitation.
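In ray-based rendering, the stylus is modeled as a line segment rather than a point. A heavily simplified sketch of the contact-force step against a single plane; the paper handles arbitrary polyhedra with a hierarchical database and haptic filtering, so treat the names and the penalty model here as assumptions:

```python
import numpy as np

def stylus_plane_force(tip, plane_point, plane_normal, k=800.0):
    """Return a spring force along the surface normal, proportional to how
    far the stylus tip has penetrated the plane (a one-constraint sketch
    of ray-based rendering, not the paper's full algorithm)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    depth = np.dot(plane_point - tip, n)   # >0 once the tip is below the plane
    return k * max(depth, 0.0) * n

f = stylus_plane_force(tip=np.array([0.0, -0.002, 0.0]),
                       plane_point=np.zeros(3),
                       plane_normal=np.array([0.0, 1.0, 0.0]))
print(f)   # [0, 1.6, 0]: 2 mm penetration against an 800 N/m surface
```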

Journal ArticleDOI
TL;DR: The operating room and hospital of the future will be first designed and tested in virtual reality, bringing together the full power of the digital physician.
Abstract: Medical applications for virtual reality (VR) are just beginning to emerge. These include VR surgical simulators, telepresence surgery, complex medical database visualization, and rehabilitation. These applications are mediated through the computer interface and as such are the embodiment of VR as an integral part of the paradigm shift in the field of medicine. The Green Telepresence Surgery System consists of two components, the surgical workstation and remote worksite. At the remote site there is a 3-D camera system and responsive manipulators with sensory input. At the workstation there is a 3-D monitor and dexterous handles with force feedback. The VR surgical simulator is a stylized recreation of the human abdomen with several essential organs. Using a helmet mounted display and DataGlove™, a person can learn anatomy from a new perspective by ‘flying’ inside and around the organs, or can practice surgical procedures with a scalpel and clamps. Database visualization creates 3-D images of complex medical data for new perspectives in analysis. Rehabilitation medicine permits impaired individuals to explore worlds not otherwise available to them, allows accurate assessment and therapy for their disabilities, and helps architects understand their critical needs in public or personal space. And to support these advanced technologies, the operating room and hospital of the future will be first designed and tested in virtual reality, bringing together the full power of the digital physician.

Proceedings ArticleDOI
30 Apr 1997
TL;DR: A new tracing algorithm is described that supports haptic rendering of NURBS surfaces without the use of any intermediate representation and by using this tracing algorithm in conjunction with algorithms for surface proximity testing and surface transitions, a complete haptic rendering system for sculptured models has been developed.
Abstract: A new tracing algorithm is described that supports haptic rendering of NURBS surfaces without the use of any intermediate representation. By using this tracing algorithm in conjunction with algorithms for surface proximity testing and surface transitions, a complete haptic rendering system for sculptured models has been developed. The system links an advanced CAD modeling system with a Sarcos force-reflecting exo-skeleton arm. A method for measuring the quality of the tracking component of the haptic rendering separately from the haptic device and force computation is also described.
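Tracing a parametric surface amounts to tracking the parameter values of the point closest to the moving probe, one servo tick at a time. A generic Gauss-Newton sketch of that step, using numerical partials; the paper's tracing, proximity testing, and surface-transition logic are considerably more involved, and the names here are assumptions:

```python
import numpy as np

def track_surface_point(S, uv, probe, iters=5, h=1e-5):
    """Locally minimize |S(u,v) - probe|^2 from the previous parameter
    estimate (a direct-tracing sketch, no intermediate mesh).

    S: parametric surface S(u, v) -> R^3; uv: current (u, v) estimate;
    probe: haptic probe position in R^3."""
    u, v = uv
    for _ in range(iters):
        p = S(u, v)
        Su = (S(u + h, v) - p) / h          # numerical partial in u
        Sv = (S(u, v + h) - p) / h          # numerical partial in v
        r = p - probe                       # residual toward the probe
        J = np.array([[Su @ Su, Su @ Sv],   # Gauss-Newton normal equations
                      [Su @ Sv, Sv @ Sv]])
        g = -np.array([r @ Su, r @ Sv])
        du, dv = np.linalg.solve(J, g)
        u, v = u + du, v + dv
    return u, v

# Example: a paraboloid patch standing in for a NURBS surface.
S = lambda u, v: np.array([u, v, u * u + v * v])
print(track_surface_point(S, (0.1, 0.1), np.array([0.3, 0.2, 0.0])))
```

Because each update starts from the previous tick's parameters, the cost per tick stays constant regardless of model complexity, which is the appeal of tracing over global closest-point queries.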

Journal ArticleDOI
TL;DR: Various techniques to integrate force feedback in shared virtual simulations, dealing with significant and unpredictable delays are described.

01 Dec 1997
TL;DR: Psychophysical experiments examining the influence of sound on the haptic perception of stiffness indicate that when the physical stiffnesses of the surfaces were the same, subjects consistently ranked the surfaces according to sound, i.e., surfaces paired with sound cues that are typically associated with tapping harder surfaces were generally perceived as stiffer.
Abstract: To explore the possibility that multisensory information may be useful in expanding the range of haptic experiences in virtual environments, psychophysical experiments examining the influence of sound on the haptic perception of stiffness were carried out. In these experiments, subjects utilized the PHANToM, a six-degree-of-freedom haptic interface device with force-reflection along three axes, to feel the stiffness of various virtual surfaces. As subjects tapped on the different virtual surfaces, they were simultaneously presented with various impact sounds. The subjects were asked to rank the surfaces based on their perceived stiffness. The results indicate that when the physical stiffnesses of the surfaces were the same, subjects consistently ranked the surfaces according to sound, i.e., surfaces paired with sound cues that are typically associated with tapping harder surfaces were generally perceived as stiffer. However, when sound cues were randomly paired with surfaces of different mechanical stiffnesses, the results were more equivocal: naive subjects who had not used the PHANToM previously tended to be more affected by sound cues than another group of subjects who had previously completed a set of stiffness discrimination experiments without sound cues. The possible implications of this result for the design of multimodal virtual environments and its comparison to prior work by some of the authors on the effects of vision on haptic perception are discussed.

Proceedings ArticleDOI
01 Mar 1997
TL;DR: A virtual reality training simulation utilizing the Rutgers Master II force feedback system was created to address the specific need of detecting subsurface tumors; a graphical user interface was developed to facilitate navigation.
Abstract: In the area of medical education, there is a strong need for palpation training to address the specific need of detecting subsurface tumors. A virtual reality training simulation was created to address this need. Utilizing the Rutgers Master II force feedback system, the simulation allows the user to perform a patient examination and palpate (touch) the patient's virtual liver to search for hard regions beneath the surface. When the user's fingertips pass over a "tumor", experimentally determined force/deflection curves are used to give the user the feeling of an object beneath the surface. A graphical user interface was developed to facilitate navigation as well as providing a training quiz. The trainee is asked to identify the location and relative hardness of tumors, and performance is evaluated in terms of positional and diagnostic errors.
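The force/deflection lookup at the heart of such a simulation is straightforward to sketch. The table values below are made up for illustration, standing in for the experimentally determined curves the paper describes:

```python
import numpy as np

# Illustrative force/deflection tables (invented numbers, not the paper's data).
deflection_mm  = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
force_tissue_N = np.array([0.0, 0.2, 0.5, 0.9, 1.4])   # soft liver tissue
force_tumor_N  = np.array([0.0, 0.5, 1.2, 2.2, 3.5])   # stiffer region over a tumor

def palpation_force(deflection, over_tumor):
    """Interpolate the fingertip reaction force for a given tissue deflection."""
    table = force_tumor_N if over_tumor else force_tissue_N
    return float(np.interp(deflection, deflection_mm, table))

print(palpation_force(3.0, over_tumor=False))  # ~0.35 N
print(palpation_force(3.0, over_tumor=True))   # ~0.85 N: feels harder under the finger
```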

Proceedings ArticleDOI
20 Apr 1997
TL;DR: Simulations and experiments with a two-fingered hand were conducted to investigate the robustness of the approach for exploring various object shapes.
Abstract: We present an approach for haptic exploration of unknown objects with dextrous robotic hands. The emphasis is on developing a robust manipulation process that allows fingers to traverse the surface of an object. The process consists of a sequence of phases in which some fingers are responsible for grasping and manipulating the object while others roll and slide over the object surface. The rolling/sliding fingers can utilize sensors to determine surface properties such as texture, friction or small features such as grooves and ridges. Simulations and experiments with a two-fingered hand were conducted to investigate the robustness of the approach for exploring various object shapes.

Proceedings ArticleDOI
02 Apr 1997
TL;DR: This paper discusses the development of a new approach or method for the off-line programming of robotic devices, indicating some of the potential applications, and highlighting some of the restricting limitations that need to be overcome.
Abstract: Virtual reality (VR) is an evolving technology being adopted by manufacturing in areas (such as operator training, and the virtual testing of new products before manufacture) where it is now becoming a widely accepted industrial tool. Of particular note are applications in off-line programming of robots. However, before this concept finds acceptance in industry, countless problems need to be resolved. Several problems are addressed within this paper, including the requirement of an acceptable interaction device, and the subject of haptic and tactile feedback. As manufacturing continually moves to become a more responsive environment, with products having shorter life cycles and batch quantities reducing in size, robot programming times become critical, and hence an area to be addressed in order to seek improved productivity. Off-line programming is one approach: for example, off-line programming within a virtual environment could reduce the required skill levels of a programmer, reduce programming times, give the operator a 'natural' interface with which to conduct the task as it would be conducted in the real world, and reduce the boredom factor. This paper discusses the development of a new approach or method for the off-line programming of robotic devices, indicating some of the potential applications, and highlighting some of the restricting limitations that need to be overcome. The paper also includes a review of previous published work on the off-line programming of robots using VR.

Proceedings ArticleDOI
07 Sep 1997
TL;DR: A haptic rendering framework that allows the tactile display of complex virtual environments and reduces the task of the haptic servo control loop to the minimization of the error between the user's position and that of the proxy.
Abstract: We present a haptic rendering framework that allows the tactile display of complex virtual environments. This framework allows surface constraints, surface shading, friction, texture and other effects to be modeled solely by updating the position of a representative object, the "virtual proxy". This abstraction reduces the task of the haptic servo control loop to the minimization of the error between the user's position and that of the proxy. This framework has been implemented in a system that is able to haptically render virtual environments of a complexity that is near and often in excess of the capabilities of current interactive graphic systems.
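The proxy abstraction is compact enough to sketch for a single planar constraint: the proxy chases the user's device but never passes through the surface, and the servo loop simply pulls the device toward the proxy. A minimal sketch; the paper's proxy handles arbitrary polygonal constraints plus shading, friction, and texture effects, and the names below are assumptions:

```python
import numpy as np

def update_proxy(user, plane_point, plane_normal):
    """One tick of a proxy update against a single plane: move the proxy
    to the user's position, projected back to the free side of the surface
    if that motion would penetrate it."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = np.dot(user - plane_point, n)
    return user if d >= 0.0 else user - d * n   # project out of the surface

def servo_force(proxy, user, k=1000.0):
    """The servo loop just pulls the user's device toward the proxy."""
    return k * (proxy - user)

user = np.array([0.0, -0.003, 0.0])             # device tip 3 mm below the plane
proxy = update_proxy(user, np.zeros(3), np.array([0.0, 1.0, 0.0]))
print(servo_force(proxy, user))                 # [0, 3, 0]: 3 N restoring force
```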

Proceedings ArticleDOI
20 Apr 1997
TL;DR: The design of a novel spherical remote-center-of-motion manipulator for force reflection in a laparoscopic surgery simulation environment is discussed and a description of the pantograph-linkage based parallel structure is presented.
Abstract: Force reflecting manual man-machine interfaces can provide the user with useful kinesthetic information in teleoperation tasks and virtual reality applications. In this paper we discuss the design of a novel spherical remote-center-of-motion manipulator for force reflection in a laparoscopic surgery simulation environment. A description of the pantograph-linkage based parallel structure is presented together with a discussion of its direct and inverse geometric models, instantaneous kinematics and statics, and usable workspace.

Book ChapterDOI
15 Jun 1997
TL;DR: A seven axis haptic device, called the Freedom-7, is described in relation to its application to surgical training and the generality of its concept makes it also relevant to most other haptic applications.
Abstract: A seven axis haptic device, called the Freedom-7, is described in relation to its application to surgical training. The generality of its concept makes it also relevant to most other haptic applications. The design rationale is driven by a long list of requirements since such a device is meant to interact with the human hand: uniform response, balanced inertial properties, static balancing, low inertia, high frequency response, high resolution, low friction, arbitrary reorientation, and low visual intrusion. Some basic performance figures are also reported.

Proceedings ArticleDOI
05 Oct 1997
TL;DR: In this paper, a tactile display for presenting the moving or shearing direction to the skin of a human finger was proposed, using micro resonators driven by electromagnetic force as vibrating pins.
Abstract: This paper proposes a tactile display for presenting the moving or shearing direction to the skin of a human finger. We consider stimulating the tactile sense by vibration and the development of a tactile feedback system. The proposed vibro-tactile stimulator employs a number of vibrating pins to provide a ticking sensation to the human operator's skin. We used micro resonators driven by electromagnetic force as the vibrating pins. We performed an experiment presenting vibro-tactile stimulation with a single micro resonator, and confirmed its ability to deliver such stimulation. We then arrayed the micro resonators in a matrix and constructed the tactile display. A tactile feedback experiment using this display made clear that it can present rich information to a human finger.
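One common way to convey a direction with a pin matrix is to sweep an activation envelope across the array while each pin vibrates at a fixed carrier frequency. The drive scheme below is purely illustrative, not the paper's electronics; every name and constant is an assumption:

```python
import numpy as np

def pin_pattern(t, rows=4, cols=4, direction=(1, 0), wave_speed=2.0, freq=250.0):
    """Drive amplitudes for a rows x cols pin matrix at time t: each pin
    carries a 'freq' Hz vibration, gated by an envelope that sweeps across
    the array along 'direction', so the skin feels motion that way."""
    dx, dy = direction
    amp = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            delay = (c * dx + r * dy) / wave_speed                 # spatial phase lag
            gate = 0.5 * (1.0 + np.sin(2 * np.pi * (t - delay)))   # sweeping envelope
            amp[r, c] = gate * np.sin(2 * np.pi * freq * t)        # gated carrier
    return amp

print(np.round(pin_pattern(0.001), 2))   # snapshot of the per-pin drive levels
```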

Proceedings ArticleDOI
20 Apr 1997
TL;DR: Design issues for teletaction systems, particularly sampling density, aliasing, and the limitations of using an array of 1-DOF actuators to approximate a continuous stress distribution on the human finger are considered.
Abstract: Teletaction is the transmission of cutaneous information from a remote tactile sensor to an operator's skin, typically the finger tips. Ideally, one would like a realistic sensation of directly touching an object with one's own finger and sense properties such as local shape, hardness, or texture. Teletaction or tactile feedback is one component of haptic feedback, the other component being force or kinesthetic feedback. This paper considers design issues for teletaction systems, particularly sampling density, aliasing, and the limitations of using an array of 1-DOF actuators to approximate a continuous stress distribution on the human finger.
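The sampling-density question has a familiar Nyquist flavor: to display spatial detail up to some maximum spatial frequency without aliasing, the tactor pitch must be at most half the smallest wavelength. A one-line sketch of that bound, with the caveat that the perceptual limits discussed in the paper are more nuanced:

```python
def max_pin_pitch(f_max_cycles_per_mm):
    """Nyquist-style bound: to render spatial detail up to f_max (cycles/mm)
    without aliasing, pin pitch must not exceed 1/(2*f_max) mm.
    (An illustrative simplification of the paper's analysis.)"""
    return 1.0 / (2.0 * f_max_cycles_per_mm)

# Fingertip features of roughly 1 mm (~0.5 cycles/mm) would call for
# a tactor pitch of at most 1 mm:
print(max_pin_pitch(0.5))  # 1.0 mm
```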

01 Dec 1997
TL;DR: The use of an impulse-based simulation as a general purpose multibody simulator for haptic display using Poisson’s hypothesis in conjunction with Coulomb friction as the basis of the contact model is discussed.
Abstract: To date, haptic interfaces have rarely been used to display complex virtual environments such as those involving 3D mechanical assembly operations, or the use of hand tools. Of the limitations which have been encountered, one is the absence of suitable simulation techniques which provide a general, user-defined virtual environment. This paper discusses the use of an impulse-based simulation as a general purpose multibody simulator for haptic display. This technique uses Poisson's hypothesis in conjunction with Coulomb friction as the basis of the contact model. Since haptic displays are required to run in real time with a constant step size and high update rate, changes to the algorithm are required. We identify the "impact state" as a critical feature of the contact model, and outline the various ways it can be chosen. Finally, we describe the implementation of a planar multibody simulator on one and two-axis haptic displays.
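The core of an impulse-based contact step can be sketched for two bodies treated as particles. In this simple case Poisson's hypothesis reduces to Newton's restitution coefficient e, with a Coulomb cap on the tangential impulse; the full paper handles rigid-body inertia and the choice of impact state, so the following is only a sketch under those stated simplifications:

```python
import numpy as np

def collision_impulse(v_rel, n, m1, m2, e=0.5, mu=0.4):
    """Impulse for one frictional point contact between two particles.

    v_rel: velocity of body 1 relative to body 2 at the contact;
    n: contact normal pointing from body 2 toward body 1."""
    m_eff = 1.0 / (1.0 / m1 + 1.0 / m2)        # effective contact mass
    vn = np.dot(v_rel, n)
    if vn >= 0.0:
        return np.zeros(3)                     # separating: no impulse
    jn = -(1.0 + e) * m_eff * vn               # restitution on the normal impulse
    vt = v_rel - vn * n                        # tangential (sliding) velocity
    t_norm = np.linalg.norm(vt)
    if t_norm < 1e-9:
        return jn * n                          # no sliding: normal impulse only
    jt = min(mu * jn, m_eff * t_norm)          # Coulomb limit vs full stick
    return jn * n - jt * (vt / t_norm)

v = np.array([0.0, -1.0, 0.5])                 # falling and sliding
print(collision_impulse(v, np.array([0.0, 1.0, 0.0]), m1=1.0, m2=1e9))
```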

Journal ArticleDOI
TL;DR: Whether estimates of ease of part handling and part insertion can be provided by multimodal simulation using virtual environment (VE) technology is investigated and it is shown that the multimodal VE is able to replicate experimental results in which increased task completion times correlated with increasing task difficulty.
Abstract: The goal of this work is to investigate whether estimates of ease of part handling and part insertion can be provided by multimodal simulation using virtual environment (VE) technology. The long-term goal is to use this data to extend computer-aided design (CAD) systems in order to evaluate and compare alternate designs using design for assembly analysis. A unified, physically-based model has been developed for modeling dynamic interactions and has been built into a multimodal VE system called the Virtual Environment for Design for Assembly (VEDA). The designer sees a visual representation of objects, hears collision sounds when objects hit each other, and can feel and manipulate the objects through haptic interface devices with force feedback. Currently these models are 2D in order to preserve interactive update rates. Experiments were conducted with human subjects using a two-dimensional peg-in-hole apparatus and a VEDA simulation of the same apparatus. The simulation duplicated as well as possible the weight, shape, size, peg-hole clearance, and frictional characteristics of the physical apparatus. The experiments showed that the multimodal VE is able to replicate experimental results in which increased task completion times correlated with increasing task difficulty (measured as increased friction, increased handling distance, and decreased peg-hole clearance). However, the multimodal VE task completion times are approximately twice the physical apparatus completion times. A number of possible factors have been identified, but their effect has not been quantified.

Journal ArticleDOI
TL;DR: A virtual reality training simulation that allows the trainee to perform a patient examination and palpate (touch) the patient's virtual liver to search for hard regions beneath the surface that indicate subsurface tumors.

Proceedings ArticleDOI
06 Jan 1997
TL;DR: An intelligent adaptive system for the integration of haptic output in graphical user interfaces and several methods to generate haptic cues which might be integrated in multimodal user interfaces in the future are presented.
Abstract: This paper presents an intelligent adaptive system for the integration of haptic output in graphical user interfaces. The system observes the user's actions, extracts meaningful features, and generates a user and application specific model. When the model is sufficiently detailed, it is used to predict the widget which is most likely to be used next by the user. Upon entering this widget, two magnets in a specialized mouse are activated to stop the movement, so target acquisition becomes easier and more comfortable. Besides the intelligent control system, we will present several methods to generate haptic cues which might be integrated in multimodal user interfaces in the future.
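The prediction step can be illustrated with a toy transition-count model: record which widget tends to follow which, then brake the mouse when the cursor enters the most likely successor. This is only a stand-in for the paper's richer feature-based model; the class and method names are assumptions:

```python
from collections import defaultdict

class WidgetPredictor:
    """Toy stand-in for the adaptive model: count widget-to-widget
    transitions and predict the most frequent successor."""
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))
        self.last = None

    def observe(self, widget):
        """Record that 'widget' was used after the previous one."""
        if self.last is not None:
            self.counts[self.last][widget] += 1
        self.last = widget

    def predict(self):
        """Most likely next widget, or None while the model is too sparse."""
        successors = self.counts.get(self.last)
        if not successors:
            return None
        return max(successors, key=successors.get)

p = WidgetPredictor()
for w in ["file_menu", "save_btn", "file_menu", "save_btn", "file_menu"]:
    p.observe(w)
print(p.predict())  # 'save_btn': fire the mouse magnets when the cursor enters it
```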

Proceedings ArticleDOI
20 Apr 1997
TL;DR: This paper explores using industrial robotics equipment in a haptic (or kinesthetic) force display system conceived for mechanism design applications; experimental results demonstrate human user interaction with the virtual mechanism.
Abstract: This paper explores using industrial robotics equipment in a haptic (or kinesthetic) force display system conceived for mechanism design applications. The dynamics and kinematics of an aircraft flight control column/wheel are simulated as a human interacts directly with the end effector of a commonly available robotic manipulator. An admittance control paradigm is used for developing a haptic system wherein realistic simulation of the dynamic interaction forces between a human user and the simulated virtual object or mechanism is required. Experimental results are presented which demonstrate human user interaction with the virtual mechanism.
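Admittance control inverts the usual haptic loop: measure the force the user applies, integrate the virtual mechanism's dynamics, and command the robot to the resulting position. A minimal one-tick sketch in a standard textbook form; the virtual mass/damping/stiffness values are illustrative, not those of the paper's flight-column model:

```python
def admittance_step(x, v, f_meas, m=5.0, b=10.0, k=50.0, dt=0.001):
    """One servo tick of an admittance controller: feed the measured handle
    force through virtual dynamics m*a + b*v + k*x = f, then use the
    resulting x as the robot's position setpoint."""
    a = (f_meas - b * v - k * x) / m
    v += a * dt                     # semi-implicit Euler integration
    x += v * dt
    return x, v

x, v = 0.0, 0.0
for _ in range(10000):              # a steady 5 N push for ten seconds
    x, v = admittance_step(x, v, 5.0)
print(round(x, 3))                  # ~0.1 m = f/k once the virtual dynamics settle
```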

01 Jan 1997
TL;DR: In this article, a single-axis force reflecting joystick was used to teach students about electromechanical systems, dynamics and controls, and the students assembled the devices from kits, tested and analyzed them, and used them to interact with computer models of dynamic systems.
Abstract: Low cost, single-axis force reflecting joysticks were used to teach students about electromechanical systems, dynamics and controls. The students assembled the devices from kits, tested and analyzed them, and used them to interact with computer models of dynamic systems. The devices helped students to appreciate such phenomena as equivalent inertia, friction and the effects of changing control system parameters. They also generated high enthusiasm among the students, particularly when used in cooperative “haptic video games” at the end of the course.

Proceedings ArticleDOI
20 Apr 1997
TL;DR: This paper explores the sensation of touch from a physiological and technological perspective and shows how this can be combined with an integrated touch/force reflecting system to produce a 'realistic' haptic rendering.
Abstract: Haptic sensation has two complex components: skin (cutaneous) sensing, which is mediated by a variety of sensing organs that respond to pressure, vibration, displacement and temperature; and kinaesthetic/proprioceptive sensing (muscles and joints), which responds to motions and forces exerted by the interaction of the body with the external environment. Although haptic interaction has been identified as being crucial for many applications, achieving realism in haptic feedback has not been possible due to problems of physical implementation, understanding and modelling. This paper explores the sensation of touch from a physiological and technological perspective and shows how this can be combined with an integrated touch/force reflecting system to produce a 'realistic' haptic rendering.