
Showing papers on "Haptic technology published in 1998"


Proceedings ArticleDOI
01 Nov 1998
TL;DR: Presents a new approach to enhancing remote collaboration and communication, based on the idea of Tangible Interfaces, which places a greater emphasis on touch and physicality; the approach employs telemanipulation technology to create the illusion that distant users are interacting with shared physical objects.
Abstract: Current systems for real-time distributed CSCW are largely rooted in traditional GUI-based groupware and voice/video conferencing methodologies. In these approaches, interactions are limited to visual and auditory media, and shared environments are confined to the digital world. This paper presents a new approach to enhance remote collaboration and communication, based on the idea of Tangible Interfaces, which places a greater emphasis on touch and physicality. The approach is grounded in a concept called Synchronized Distributed Physical Objects, which employs telemanipulation technology to create the illusion that distant users are interacting with shared physical objects. We describe two applications of this approach: PSyBench, a physical shared workspace, and inTouch, a device for haptic interpersonal communication.

350 citations


Proceedings ArticleDOI
14 Mar 1998
TL;DR: The study explored the impact of physically touching a virtual object on how realistic the VE seems to the user and empirically demonstrates the effectiveness of mixed reality as a simple, safe, inexpensive technique for adding physical texture and force feedback cues to virtual objects with large freedom of motion.
Abstract: The study explored the impact of physically touching a virtual object on how realistic the VE seems to the user. Subjects in a "no touch" group picked up a 3D virtual image of a kitchen plate in a VE, using a traditional 3D wand. "See and touch" subjects physically picked up a virtual plate possessing solidity and weight, using a mixed-reality force feedback technique. Afterwards, subjects made predictions about the properties of other virtual objects they saw but did not interact with in the VE. "See and touch" subjects predicted these objects would be more solid, heavier, and more likely to obey gravity than the "no touch" group. Results provide converging evidence for the value of adding physical qualities to virtual objects. The study is the first to empirically demonstrate the effectiveness of mixed reality as a simple, safe, inexpensive technique for adding physical texture and force feedback cues to virtual objects with large freedom of motion. Examples of practical applications are discussed.

237 citations


Patent
15 Jun 1998
TL;DR: In this article, a design interface tool (300) is presented for designing force sensations for a force feedback interface device (14) connected to a host computer (12) that displays the interface tool (300).
Abstract: A design interface tool (300) for designing force sensations for a force feedback interface device (14) connected to a host computer (12) that displays the interface tool (300). Input from a user is received in the interface (300) to select a type of force sensation to be commanded by a host computer (12) and output by a force feedback interface device (14). A graphical representation of the force sensation is displayed on the host computer (12) which provides a visual demonstration of a feel of the force sensation so that the user can view an effect of user input parameters on said force sensation. The force sensation is output to a user manipulatable object (34) of the force feedback device (14) so that the user can feel the designed force sensation, where the graphical representation is updated in conjunction with the output of the force sensation. The user can interactively modify force sensation characteristics and feel the results.

217 citations


Proceedings ArticleDOI
16 May 1998
TL;DR: In order to create realistic vibrotactile feedback, vibrations, forces, and velocities were collected during various tasks executed with a stylus: tapping on materials, stroking textures, and puncturing membranes.
Abstract: Vibrations can significantly enhance touch perception for virtual environment applications with minimal design complexity and cost. In order to create realistic vibrotactile feedback, we collected vibrations, forces, and velocities during various tasks executed with a stylus: tapping on materials, stroking textures, and puncturing membranes. Empirical models were fit to these waveforms and a library of model parameters was compiled. These models simulated tasks involving simultaneous display of forces and vibrations on a high-bandwidth force-feedback joystick. Vibration feedback adds little complexity to virtual environment algorithms. Human subjects interacting with the system showed improved execution and perception when performing surface feature discrimination tasks.
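A minimal sketch of how such an empirical vibration model and parameter library might look, assuming a decaying-sinusoid transient for tapping (a common form in this literature); the material names, parameter values, and update rate are hypothetical illustrations, not the paper's fitted data:

```python
import math

def tap_transient(t, amplitude, decay, freq_hz):
    """Decaying-sinusoid model of a tapping vibration transient:
    a(t) = A * exp(-B*t) * sin(2*pi*f*t)."""
    return amplitude * math.exp(-decay * t) * math.sin(2 * math.pi * freq_hz * t)

# Hypothetical library of fitted parameters, one entry per material.
VIBRATION_LIBRARY = {
    "wood":  {"amplitude": 1.0, "decay": 90.0, "freq_hz": 100.0},
    "metal": {"amplitude": 1.5, "decay": 40.0, "freq_hz": 300.0},
}

def render_tap(material, duration_s=0.05, rate_hz=1000):
    """Sample a tap waveform at the haptic display's update rate."""
    p = VIBRATION_LIBRARY[material]
    n = int(duration_s * rate_hz)
    return [tap_transient(i / rate_hz, **p) for i in range(n)]
```

At render time, a waveform like this would be superimposed on the force-feedback output whenever the stylus taps the corresponding virtual material.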

211 citations


Book ChapterDOI
01 Jan 1998
TL;DR: From visual or haptic information the authors identify objects and automatically retrieve the relevant models; the formation of these models and their swift updating with changes in object properties depend, however, on signals from tactile sensors in the fingertips.
Abstract: When we use our digits to manipulate objects the applied fingertip forces and torques tangential to the grip surfaces are a result of complex muscle activity. These patterns are acquired during our ontogenetic development and we select them according to the manipulative intent. But the basic force coordination expressed in these patterns has to be tuned to the physical properties of the current object, e.g. shape, surface friction and weight. This takes place primarily by parametric adjustments of the force output based on internal models of the target object, i.e. implicit memory systems that represent critical object properties. From visual or haptic information we identify objects and automatically retrieve the relevant models. These models are then used to adapt the motor commands prior to their execution. The formation of models and their swift updating with changes in object properties depend, however, on signals from tactile sensors in the fingertips.

179 citations


Proceedings ArticleDOI
19 Oct 1998
TL;DR: A wearable navigation system based on a haptic directional display embedded in the back of a vest that includes an infrared-based input system for locating the user in an environment, and a wearable computer for route planning is described.
Abstract: This paper describes a wearable navigation system based on a haptic directional display embedded in the back of a vest. The system consists of a 4-by-4 array of micromotors for delivering haptic navigational signals to the user's back, an infrared-based input system for locating the user in an environment, and a wearable computer for route planning. User testing was conducted to evaluate the effectiveness of this system as a navigation guide for sighted users in an unfamiliar lab area. It is hoped that such a system can be a useful navigation guide for individuals with severe visual impairments in an unfamiliar environment. Future work will address the specific issues concerning blind navigation.
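One way such a directional display might encode a route heading is sketched below, as a mapping from relative bearing to a tactor column of the 4-by-4 array; the bearing range and column coding are assumptions for illustration, since the paper does not specify its signal design:

```python
def column_for_bearing(bearing_deg, columns=4):
    """Quantize a relative bearing (negative = left, positive = right)
    into one of the array's 4 tactor columns. The +/-90 degree range
    and this column coding are hypothetical choices."""
    clamped = max(-90.0, min(90.0, bearing_deg))
    frac = (clamped + 90.0) / 180.0   # 0.0 = hard left .. 1.0 = hard right
    return min(columns - 1, int(frac * columns))
```

The route planner would call this each update cycle and pulse the micromotors in the selected column.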

174 citations


Patent
18 Feb 1998
TL;DR: In this paper, a force feedback interface (34) is presented that includes a haptic accelerator (60), which relieves the force feedback processor (54) of the computational burden associated with force feedback generation.
Abstract: A force feedback interface (34) including a haptic accelerator (60) that relieves the computational burden associated with force feedback generation from a force feedback processor (54). The force feedback processor (54) is preferably a device microprocessor (54) included in the interface device and separate from a controlling host computer (32) for determining forces to be output. The haptic accelerator (60) quickly determines velocity and/or acceleration information describing motion of a user manipulatable object (66) from raw position data received from sensors (62) of the interface device and representing the position of the user object (66). The velocity and/or acceleration data is used by the force feedback processor (54) in the determination of forces to be output on the user object (66). The haptic accelerator (60) can in some embodiments also quickly and reliably determine condition forces which depend on the motion of the user object (66), thus relieving additional computation burden from the force feedback processor (54) and permitting the force feedback processor (54) to focus on determining other types of forces and overseeing the operation of the force feedback interface device (34).
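The velocity/acceleration estimation that the patent assigns to the haptic accelerator (60) can be sketched in software as filtered finite differencing of the raw position samples; the smoothing factor and sampling period here are illustrative assumptions, not values from the patent:

```python
class MotionEstimator:
    """Estimates velocity and acceleration from raw sensor positions,
    a software stand-in for what the patent's haptic accelerator does
    in hardware. alpha is a hypothetical low-pass smoothing factor."""

    def __init__(self, dt, alpha=0.3):
        self.dt = dt            # sampling period in seconds
        self.alpha = alpha      # smoothing factor in (0, 1]
        self.prev_pos = None
        self.prev_vel = 0.0
        self.velocity = 0.0
        self.acceleration = 0.0

    def update(self, position):
        if self.prev_pos is not None:
            raw_vel = (position - self.prev_pos) / self.dt
            # Smooth the finite difference to suppress quantization noise.
            self.velocity += self.alpha * (raw_vel - self.velocity)
            self.acceleration = (self.velocity - self.prev_vel) / self.dt
        self.prev_vel = self.velocity
        self.prev_pos = position
        return self.velocity, self.acceleration
```

The force feedback processor would then consume these estimates when computing damping and inertia forces, rather than differentiating positions itself.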

173 citations


Proceedings ArticleDOI
01 Jan 1998
TL;DR: The results have implications for the future design of VEs in that it cannot be assumed that virtual textures and objects will feel to the user as the designer intends, but they do show that a haptic interface has considerable potential for blind computer users.
Abstract: This paper describes a series of studies involving a haptic device which can display virtual textures and 3-D objects. The device has potential for simulating real world objects and assisting in the navigation of virtual environments. Three experiments investigated: (a) whether previous results from experiments using real textures could be replicated using virtual textures; (b) whether participants perceived virtual objects to have the intended size and angle; and (c) whether simulated real objects could be recognised. In all the experiments differences in perception by blind and sighted people were also explored. The results have implications for the future design of VEs in that it cannot be assumed that virtual textures and objects will feel to the user as the designer intends. However, they do show that a haptic interface has considerable potential for blind computer users.

153 citations


Proceedings ArticleDOI
16 May 1998
TL;DR: The concept of a virtual lesson is proposed for transferring a teacher's skill to a student using haptic virtual reality technology, and a virtual calligraphy system is developed as one of its applications.
Abstract: The concept of a virtual lesson is proposed for transferring a teacher's skill to a student using haptic virtual reality technology. We have developed a virtual calligraphy system as one of its applications. In this system, the position and force trajectories of the teacher's writing brush are recorded first and then these trajectories are displayed to the student. What the student can learn is the teacher's horizontal brush position trajectory, his normal pushing force against a virtual paper, and the distance between the teacher's brush and the virtual paper. Recognizing that it is impossible to display both the normal force information and the normal position information at the same time, we have implemented two methods of skill display: one is to use the haptic display device for displaying the position information, and the other is to use it for displaying the force information. The remaining information is displayed using a secondary display device, a visual display in the developed system. A preliminary experimental result is also presented.

140 citations


Patent
18 Aug 1998
TL;DR: In this article, an apparatus is presented for using selectable instruments in virtual medical simulations: input devices actuated by the user and resembling medical instruments transmit identifying data to the virtual computer model, and the apparatus creates full immersion by tracking and homing to the instruments with haptic (force feedback generating) receptacles, with which the instruments dock by means of a numerical grid.
Abstract: The invention is an apparatus for using selectable instruments in virtual medical simulations. Input devices actuated by the user and resembling medical instruments transmit various identifying data to the virtual computer model from the instruments that have been selected. The apparatus then assists in creating full immersion for the user in the virtual reality model by tracking and homing to the instruments with haptic, or force feedback generating, receptacles with which the instruments dock by means of a numerical grid, creating a seamless interface of instrument selection and use in the virtual reality anatomy.

138 citations


Proceedings ArticleDOI
13 Oct 1998
TL;DR: A tele-nanorobotics system using an atomic force microscope (AFM) as the nanorobot and a bilateral teleoperation control system with virtual impedance approach has been introduced for feeling the nano forces.
Abstract: A tele-nanorobotics system using an atomic force microscope (AFM) as the nanorobot has been proposed. Modeling and control of the AFM cantilever, and modeling of nanometer scale forces have been realized for telemanipulation applications. Besides 3-D virtual reality visual feedback in the user interface, a 1 DOF haptic device has been constructed for nano scale haptic sensing. For feeling the nano forces, a bilateral teleoperation control system with virtual impedance approach has been introduced. Initial experiments and simulations on the AFM and teleoperation system show that the system can be utilized for different tele-nanomanipulation applications such as 2-D nano particle assembly or biological object manipulation.
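The macro/nano scaling at the heart of such a bilateral loop can be sketched as a pair of scale factors between the master handle and the AFM tip; both factors, and the use of static scaling alone without the paper's virtual impedance shaping, are illustrative assumptions:

```python
def scaled_bilateral_step(master_pos_m, tip_force_N,
                          pos_scale=1e-7, force_scale=1e9):
    """One cycle of scaled bilateral coupling across the macro/nano
    barrier: hand motion is scaled down to tip motion (1 cm -> 1 nm
    here) and tip force is scaled up to handle force (1 nN -> 1 N
    here). Both scale factors are hypothetical."""
    tip_target_m = master_pos_m * pos_scale      # commanded AFM tip position
    handle_force_N = tip_force_N * force_scale   # force reflected to operator
    return tip_target_m, handle_force_N
```

A full controller would additionally pass the reflected force through a virtual impedance (spring-damper) to keep the coupled loop stable.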

Proceedings ArticleDOI
13 Oct 1998
TL;DR: By decoupling the haptic display control problem from the design of virtual environments, the use of a virtual coupling network frees the developer of haptic-enabled virtual reality models from issues of mechanical stability.
Abstract: A haptic interface is a kinesthetic link between a human operator and a virtual environment. This paper addresses stability and performance issues associated with haptic interaction. It generalizes and extends the concept of a virtual coupling network, an artificial connection between a haptic display and a virtual world, to include both the impedance and admittance models of haptic interaction. A benchmark example exposes an important duality between these two cases. Linear circuit theory is used to develop necessary and sufficient conditions for the stability of a haptic simulation, assuming the human operator and virtual environment are passive. These equations lead to an explicit design procedure for virtual coupling networks which give maximum performance while guaranteeing stability. By decoupling the haptic display control problem from the design of virtual environments, the use of a virtual coupling network frees the developer of haptic-enabled virtual reality models from issues of mechanical stability.
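In the impedance model, the virtual coupling is essentially a spring-damper between the measured device state and the simulated tool state; a minimal sketch with illustrative gains (the paper derives the maximum stable gains from passivity conditions, which this sketch does not reproduce):

```python
def virtual_coupling_force(device_pos, device_vel, tool_pos, tool_vel,
                           k=500.0, b=2.0):
    """Force transmitted through the virtual coupling network.
    It is applied to the haptic device and, with opposite sign, to the
    virtual tool, so neither side sees the other's dynamics directly.
    Gains k and b are illustrative, not derived stability limits."""
    return k * (tool_pos - device_pos) + b * (tool_vel - device_vel)
```

Because all interaction passes through this element, the virtual environment can be designed freely while stability is guaranteed by choosing k and b within the derived limits.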

Proceedings ArticleDOI
16 May 1998
TL;DR: A new compact 6-DOF haptic interface with large workspace is proposed that contains a newly developed five bar spatial gimbal mechanism for orientation, placed on a modified Delta parallel-link mechanism.
Abstract: In this paper we propose a new compact 6-DOF haptic interface with large workspace. It contains a newly developed five bar spatial gimbal mechanism for orientation, placed on a modified Delta parallel-link mechanism. The motion range of each axis of the five bar mechanism is over ±70 degrees. Quick motions can be realized easily due to the parallelism inherent to both the modified Delta substructure and the five bar substructure.

01 Jan 1998
TL;DR: Presents techniques that can free the user from restrictive requirements such as working in calibrated environments, results with haptic interface technology incorporated into augmented reality domains, and systems considerations that underlie the practical realization of these interactive augmented reality techniques.
Abstract: Augmented reality is the merging of synthetic sensory information into a user's perception of a real environment. Until recently, it has presented a passive interface to its human users, who were merely viewers of the scene augmented only with visual information. In contrast, practically since its inception, computer graphics--and its outgrowth into virtual reality--has presented an interactive environment. It is our thesis that the augmented reality interface can be made interactive. We present: techniques that can free the user from restrictive requirements such as working in calibrated environments, results with haptic interface technology incorporated into augmented reality domains, and systems considerations that underlie the practical realization of these interactive augmented reality techniques.

Patent
21 Jan 1998
TL;DR: In this paper, a method for haptically deforming a virtual surface within a haptic virtual environment is used to plastically deform the virtual surface of a virtual object by sensing a user's position in real space.
Abstract: A method for haptically deforming a virtual surface within a haptic virtual environment is used to plastically deform the virtual surface of a virtual object by sensing a user's position in real space, determining a haptic interface location in the haptic environment based thereon, and determining whether the virtual surface collides with the haptic interface location. Upon detection of a collision above a threshold force, a visual representation of the virtual surface is plastically deformed and a corresponding force is calculated and fed back to the user. The virtual surface can be visco-elastically deformed.
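A one-frame sketch of such a thresholded scheme, assuming a linear elastic response below the yield threshold and permanent (plastic) surface motion above it; the stiffness and threshold values are illustrative, not taken from the patent:

```python
def deform_step(surface_pos, tool_pos, k=800.0, threshold_N=2.0):
    """One haptic frame of thresholded plastic deformation: below the
    threshold the surface pushes back elastically; above it, the
    surface yields (moves permanently) so the restoring force never
    exceeds the threshold. k and threshold_N are illustrative."""
    penetration = surface_pos - tool_pos       # > 0 when tool is inside
    if penetration <= 0.0:
        return surface_pos, 0.0                # no collision, no force
    force = k * penetration
    if force > threshold_N:
        # Plastic yield: relocate the surface to cap the force.
        surface_pos = tool_pos + threshold_N / k
        force = threshold_N
    return surface_pos, force
```

Visco-elastic deformation would differ only in letting the surface relax back toward its original position over subsequent frames.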

Proceedings ArticleDOI
01 Jan 1998
TL;DR: Developing a haptic interface like HandJive presented special challenges, such as creating rapid physical prototypes that could withstand abuse, developing a preliminary system of haptic interaction, and testing haptic interfaces through low-tech prototypes.
Abstract: The paper describes how we designed and prototyped HandJive, a haptic device for interpersonal entertainment. HandJive is notable because it relies entirely on haptic input and output. The design process included typical steps such as analyzing user needs and performing iterative prototyping and testing. However, developing a haptic interface like HandJive also presented special challenges, such as creating rapid physical prototypes that could withstand abuse, developing a preliminary system of haptic interaction, and testing haptic interfaces through low-tech prototypes.

Patent
Robert W. Rice1
10 Nov 1998
TL;DR: In this paper, a 3D, virtual reality, tissue-specific model of a human or animal body is presented which provides a high level of user interaction; the model functions can be analyzed and user-modified on a tissue-by-tissue basis, thereby allowing modeling of a wide variety of normal and abnormal tissue attributes and corresponding study thereof.
Abstract: A three-dimensional, virtual reality, tissue specific model of a human or animal body which provides a high level of user-interactivity. The model functions can be analyzed and user-modified on a tissue-by-tissue basis, thereby allowing modeling of a wide variety of normal and abnormal tissue attributes and corresponding study thereof. The model can be user-modified through a keyboard, or other VR tools such as a haptic interface. The haptic interface can modify the model to correspond to the tissue attributes of a user, and can provide sensory output corresponding to the interaction of the model to a prescripted scene.

Journal Article
TL;DR: A highly interactive training simulator system is developed by means of computer graphics and virtual reality techniques.
Abstract: Arthroscopy has already become an irreplaceable method in diagnostics. The arthroscope, with optics and light source, and the exploratory probe are inserted into the knee joint through two small incisions underneath the patella. Currently, the skills required for arthroscopy are taught through hands-on clinical experience. As arthroscopy became a more common procedure even in smaller hospitals, it became obvious that special training was necessary to guarantee qualification of the surgeons. On-the-job training proved to be insufficient. Therefore, research groups from the Berufsgenossenschaftliche Unfallklinik Frankfurt am Main approached the Fraunhofer Institute for Computer Graphics to develop a training system for arthroscopy based on virtual reality (VR) techniques. Two main issues are addressed: the three-dimensional (3-D) reconstruction process and the 3-D interaction. To provide the virtual environment, a realistic representation of the region of interest with all relevant anatomical structures is required. Based on a magnetic resonance image sequence, a realistic representation of the knee joint was obtained, suitable for computer simulation. Two main components of the VR interface can be distinguished: the 3-D interaction to guide the surgical instruments and the 2-D graphical user interface for visual feedback and control of the session. Moreover, the 3-D interaction has to be realized by means of virtual reality techniques, providing a simulation of an arthroscope and an intuitive handling of other surgical instruments. Currently, the main drawback of the developed simulator is the lack of haptic perception, especially of force feedback. In cooperation with the Department of Electro-Mechanical Construction at the Technical University Darmstadt, a haptic display is designed and built for the VR arthroscopy training simulator. In parallel, we developed a concept for the integration of the haptic display in a configurable way.

Journal ArticleDOI
TL;DR: This study is the first to empirically demonstrate the effectiveness of tactile augmentation as a simple, safe, inexpensive technique with large freedom of motion for adding physical texture, force feedback cues, smell and taste to virtual objects.
Abstract: Experiment 1 explored the impact of physically touching a virtual object on how realistic the virtual environment (VE) seemed to the user. Subjects in a `no touch' group picked up a 3D virtual image of a kitchen plate in a VE, using a traditional 3D wand. `See and touch' subjects physically picked up a virtual plate possessing solidity and weight, using a technique called `tactile augmentation'. Afterwards, subjects made predictions about the properties of other virtual objects they saw but did not interact with in the VE. `See and touch' subjects predicted these objects would be more solid, heavier, and more likely to obey gravity than the `no touch' group. In Experiment 2 (a pilot study), subjects `physically bit' a chocolate bar in one condition, and `imagined biting' a chocolate bar in another condition. Subjects rated the event more fun and realistic when allowed to physically bite the chocolate bar. Results of the two experiments converge with a growing literature showing the value of adding physical qualities to virtual objects. This study is the first to empirically demonstrate the effectiveness of tactile augmentation as a simple, safe, inexpensive technique with large freedom of motion for adding physical texture, force feedback cues, smell and taste to virtual objects. Examples of practical applications are discussed.

Journal Article
TL;DR: In this article, a seven axis haptic device, called the Freedom-7, is described in relation to its application to surgical training and the design rationale is driven by a long list of requirements since such a device is meant to interact with the human hand.
Abstract: A seven axis haptic device, called the Freedom-7, is described in relation to its application to surgical training. The generality of its concept makes it also relevant to most other haptic applications. The design rationale is driven by a long list of requirements since such a device is meant to interact with the human hand: uniform response, balanced inertial properties, static balancing, low inertia, high frequency response, high resolution, low friction, arbitrary reorientation, and low visual intrusion. Some basic performance figures are also reported.

01 Jan 1998
TL;DR: Numerical and experimental results for a two degree-of-freedom haptic display demonstrate the effectiveness of the proposed approach in achieving performance and stability in haptic simulation.
Abstract: In haptic simulation, a human operator kinesthetically explores a virtual environment. To achieve a virtual sense of touch, the human interacts with an active mechanical device, called a haptic display. This paper presents an approach to guarantee that this physical man-machine interface remains stable, while maximizing performance. The key element in ensuring stability is the virtual coupling network, an artificial link between the haptic display and the virtual environment. Considerations of structural flexibility in the haptic device are included in the derivation of design criteria for such networks. Solutions for both the impedance and admittance models of haptic interaction are included. Numerical and experimental results for a two degree-of-freedom haptic display demonstrate the effectiveness of the proposed approach in achieving performance and stability in haptic simulation.

Book ChapterDOI
01 Jan 1998
TL;DR: A survey of teleoperation control can be found in this paper, where the authors discuss issues of simulation and control that arise in the manipulation of virtual environments, including the control of haptic interfaces/teleoperator masters.
Abstract: The concept of teleoperation has evolved to accommodate not only manipulation at a distance but manipulation across barriers of scale and in virtual environments, with applications in many areas. Furthermore, the design of high-performance force-feedback teleoperation masters has been a significant driving force in the development of novel electromechanical or “haptic” computer-user interfaces that provide kinesthetic and tactile feedback to the computer user. Since haptic interfaces/teleoperator masters must interact with an operator and a real or virtual dynamic slave that exhibits significant dynamic uncertainty, including sometimes large and unknown delays, the control of such devices poses significant challenges. This chapter presents a survey of teleoperation control work and discusses issues of simulation and control that arise in the manipulation of virtual environments.

Journal ArticleDOI
01 Mar 1998
TL;DR: Interest in force-feedback devices is, however, gaining momentum in the commercial sector, notably in the area of personal computer games, and the authors believe this interest will drive down the cost of components and spur research efforts so that better, more cost-effective force-feedback devices will be available to the medical community for use in widespread surgical-simulation systems.
Abstract: Surgical simulation can provide great benefits to medicine by reducing the cost and duration of training and making the process more intuitive and informative. However, a simulation system imposes stringent requirements on the human-machine interface. A sense of touch greatly enhances the simulation experience, since much of the skill that a medical professional possesses is in his ability to explore and diagnose by touch. This sensory input can be provided by an input device with force and/or tactile feedback. There are many technical challenges associated with the creation of a robust surgical-simulation system incorporating touch feedback. The medical application has unique needs that drive the design of the mechanism, the control scheme, the tissue deformation engine, and the overall system architecture and distribution of computation. This technology is not yet mature; several companies are dedicated to creating various parts of a simulation system, but as yet there are no commercially available solutions that are cost effective. Interest in force-feedback devices is, however, gaining momentum in the commercial sector, notably in the area of personal computer games. The authors believe this interest will drive down the cost of components and spur research efforts so that better, more cost-effective force-feedback devices will be available to the medical community for use in widespread surgical-simulation systems.

Proceedings ArticleDOI
16 May 1998
TL;DR: A haptic display system is presented for manipulating virtual mechanisms derived from a mechanical CAD design and the operator experiences the dynamic forces from the mechanism plus constraint forces.
Abstract: A haptic display system is presented for manipulating virtual mechanisms derived from a mechanical CAD design. Links are designed and assembled into mechanisms using Utah's Alpha-1 CAD system, and are then manipulated with a Sarcos Dextrous Arm Master. Based on the mechanism's kinematics and the virtual grasp, the motion of the master is divided into motion of the mechanism and constraint violation. The operator experiences the dynamic forces from the mechanism plus constraint forces.

Journal ArticleDOI
TL;DR: This work proposes a method of selectively stimulating only superficial mechanoreceptors and shows that it makes people feel a more realistic, finer virtual texture than possible by adjusting the stimulator spacing.
Abstract: Research in virtual reality has recognized the need for more realistic tactile display in addition to touch and non-touch display and force display. We propose a method of selectively stimulating only superficial mechanoreceptors. We show that it makes people feel a more realistic, finer virtual texture than possible by adjusting the stimulator spacing. The apparatus is simple and we expect this idea to develop into a device to display varieties of tactile feeling.

Journal ArticleDOI
TL;DR: The CSHI concept is presented, along with the required mathematical transformations for use of the device; compared to commercially available haptic interfaces, the concept is driving for lighter, safer, crisper, more dexterous, and more economical operation.
Abstract: A cable-suspended haptic interface (CSHI) concept is presented. The goal is to create an input/output device to provide six-degree-of-freedom (dof) wrench (force and moment) feedback to a human operator in virtual reality or remote applications. Compared to commercially-available haptic interfaces for virtual reality applications, the present concept is driving for lighter, safer, crisper, more dexterous, and more economical operation. The CSHI concept is presented, along with the required mathematical transformations for use of the device.

Journal ArticleDOI
TL;DR: The results of the current study showed that the haptic device was accurate in and of itself, within its current physical limitations, and that the isosurface‐based simulator was preferred.
Abstract: Objective/Hypothesis: To determine the efficacy of a haptic (force feedback) device and to compare isosurface and volumetric models of a functional endoscopic sinus surgery (FESS) training simulator. Study Design: A pilot study involving faculty and residents from the Department of Otolaryngology at The Ohio State University. Methods: Objective trials evaluated the haptic device's ability to perceive three-dimensional shapes (stereognosis) without the aid of image visualization. Ethmoidectomy tasks were performed with both isosurface and volumetric FESS simulators, and surveys compared the two models. Results: The haptic device was 77% effective for stereognosis tasks. There was a preference toward the isosurface model over the volumetric model in terms of visual representation, comfort, haptic-visual fidelity, and overall performance. Conclusions: The FESS simulator uses both visual and haptic feedback to create a virtual reality environment to teach paranasal sinus anatomy and basic endoscopic sinus surgery techniques to ear, nose, and throat residents. The results of the current study showed that the haptic device was accurate in and of itself, within its current physical limitations, and that the isosurface-based simulator was preferred.

Proceedings ArticleDOI
01 Jan 1998
TL;DR: The developed algorithms deal directly with geometry of anatomical organs, surface and compliance characteristics of tissues, and the estimation of appropriate reaction forces to convey to the user a feeling of touch and force sensations.
Abstract: Research in the area of computer assisted surgery and surgical simulation has mainly focused on developing 3D geometrical models of the human body from 2D medical images, visualization of internal structures for educational and preoperative surgical planning purposes, and graphical display of soft tissue behavior in real time. Conveying to the surgeon the touch and force sensations with the use of haptic interfaces has not been investigated in detail. We have developed a set of haptic rendering algorithms for simulating "surgical instrument--soft tissue" interactions. Although the focus of the study is the development of algorithms for simulation of laparoscopic procedures, the developed techniques are also useful in simulating other medical procedures involving touch and feel of soft tissues. The proposed force-reflecting soft tissue models are of varying fidelity and have been developed to simulate the behavior of elastically deformable objects in virtual environments. The developed algorithms deal directly with geometry of anatomical organs, surface and compliance characteristics of tissues, and the estimation of appropriate reaction forces to convey to the user a feeling of touch and force sensations.
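The "estimation of appropriate reaction forces" described above is commonly done with a penalty-based spring-damper contact model. The sketch below illustrates that general technique; the function, stiffness, and damping values are assumptions for illustration, not the authors' actual algorithm:

```python
import numpy as np

# Illustrative penalty-based force model for instrument--soft-tissue
# contact: when the virtual instrument tip penetrates the tissue
# surface, a reaction force is fed back along the surface normal,
# proportional to penetration depth (tissue stiffness), with a damping
# term on the tip velocity for stability.

def reaction_force(tip_pos, tip_vel, surf_point, surf_normal,
                   stiffness=200.0, damping=1.5):
    """Return the 3D reaction force on the instrument tip, or zero
    when the tip is above the (locally planar) tissue surface."""
    n = surf_normal / np.linalg.norm(surf_normal)
    depth = np.dot(surf_point - tip_pos, n)   # > 0 when tip penetrates
    if depth <= 0.0:
        return np.zeros(3)                    # no contact, no force
    return (stiffness * depth) * n - damping * tip_vel
```

A softer tissue is modeled simply by lowering the stiffness, which matches the abstract's point that the algorithms work with the compliance characteristics of each tissue.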

Journal Article
TL;DR: In this paper, the VIRGY project at the VRAI Group (Virtual Reality and Active Interface), Swiss Federal Institute of Technology (Lausanne, Switzerland) developed an endoscopic surgical training tool which realistically simulates the interactions between one or more surgical instruments and gastrointestinal organs.
Abstract: This paper describes the VIRGY project at the VRAI Group (Virtual Reality and Active Interface), Swiss Federal Institute of Technology (Lausanne, Switzerland). Since 1994, we have been investigating a variety of virtual-reality based methods for simulating laparoscopic surgery procedures. Our goal is to develop an endoscopic surgical training tool which realistically simulates the interactions between one or more surgical instruments and gastrointestinal organs. To support real-time interaction and manipulation between instruments and organs, we have developed several novel graphic simulation techniques. In particular, we are using live video texturing to achieve dynamic effects such as bleeding or vaporization of fatty tissues. Special texture manipulations allow us to generate pulsing objects while minimizing processor load. Additionally, we have created a new surface deformation algorithm which enables real-time deformations under external constraints. Lastly, we have developed a new 3D object definition which allows us to perform operations such as total or partial object cutting, as well as to selectively render objects with different levels of detail. To provide realistic physical simulation of the forces and torques on surgical instruments encountered during an operation, we have also designed a new haptic device dedicated to the constraints of endoscopic surgery. We are using special interpolation and extrapolation techniques to integrate our 25 Hz visual simulation with the 300 Hz feedback required for realistic tactile interaction. The full VIRGY simulator has been tested by surgeons, and the quality of both our visual and haptic simulation has been judged sufficient for training basic surgical gestures.
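The multi-rate coupling mentioned above (25 Hz simulation feeding a 300 Hz haptic loop) can be sketched with a simple linear extrapolator over the two most recent force samples, so the user feels a smooth force rather than 25 Hz steps. This is a guess at the general technique, not the VIRGY implementation:

```python
# Multi-rate force coupling sketch: the slow 25 Hz physics/visual loop
# supplies force samples via push(); the fast 300 Hz haptic servo calls
# sample(), which extrapolates linearly along the last two samples.

SIM_DT = 1.0 / 25.0      # visual/physics update period (s)
HAPTIC_DT = 1.0 / 300.0  # haptic servo period (s)

class ForceExtrapolator:
    def __init__(self):
        self.t_prev = self.t_last = 0.0
        self.f_prev = self.f_last = 0.0

    def push(self, t, force):
        """Called at the simulation rate with a fresh force sample."""
        self.t_prev, self.f_prev = self.t_last, self.f_last
        self.t_last, self.f_last = t, force

    def sample(self, t):
        """Called at the haptic rate; extrapolates past the last sample."""
        dt = self.t_last - self.t_prev
        if dt <= 0.0:
            return self.f_last       # not enough history yet
        slope = (self.f_last - self.f_prev) / dt
        return self.f_last + slope * (t - self.t_last)
```

Between simulation updates the servo extrapolates forward; when a new 25 Hz sample arrives, the line is re-anchored, trading a small force discontinuity for low latency.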

Journal ArticleDOI
TL;DR: The article discusses the application of touch feedback systems to 3D modelling, which requires novel rendering techniques such as volume-based rendering algorithms to achieve a high interactivity level.
Abstract: As we progress into applications that incorporate interactive, life-like 3D computer graphics, the mouse falls short as a user interface device, and it becomes clear that 3D computer graphics could achieve much more with a more intuitive interface mechanism. Haptic interfaces, or force feedback devices, promise to increase the quality of human-computer interaction by accommodating our sense of touch. The article discusses the application of touch feedback systems to 3D modelling. Achieving a high level of interactivity requires novel rendering techniques such as volume-based rendering algorithms.
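One common form of volume-based haptic rendering treats the dataset as a scalar density field and pushes the tool out of denser material along the negative density gradient. The sketch below illustrates that general idea and is an assumption, not the article's specific algorithm:

```python
import numpy as np

# Volume-based haptic rendering sketch: sample the density gradient of
# a 3D voxel volume at the tool tip by central differences, and feed
# back a force opposing increasing density, so the tool resists being
# pushed into dense regions of the volume.

def gradient_force(volume, idx, stiffness=1.0):
    """Central-difference gradient of a 3D numpy array `volume` at
    interior integer voxel index idx = (i, j, k); returns a force
    vector pointing down the density gradient."""
    i, j, k = idx
    g = np.array([
        (volume[i + 1, j, k] - volume[i - 1, j, k]) / 2.0,
        (volume[i, j + 1, k] - volume[i, j - 1, k]) / 2.0,
        (volume[i, j, k + 1] - volume[i, j, k - 1]) / 2.0,
    ])
    return -stiffness * g
```

A production renderer would interpolate the gradient trilinearly at the continuous tool position; integer voxel indices keep the sketch short.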