
Showing papers on "Haptic technology published in 2001"


Patent
24 May 2001
TL;DR: In this paper, a haptic feedback interface device using electroactive polymer actuators is described; embodiments providing direct forces, inertial forces, and braking forces deliver haptic sensations to the user.
Abstract: Haptic feedback interface devices using electroactive polymer (EAP) actuators to provide haptic sensations and/or sensing capabilities. A haptic feedback interface device (12) is in communication with a host computer (14) and includes a sensor device (112) that detects the manipulation of the interface device by the user and an electroactive polymer actuator (18) responsive to input signals and operative to output a force to the user caused by motion of the actuator. The output force provides a haptic sensation to the user. Various embodiments of interface devices employing EAP actuators are described, including embodiments providing direct forces, inertial forces, and braking forces.

366 citations


Proceedings ArticleDOI
01 Aug 2001
TL;DR: This paper presents work carried out for a project to develop a new interactive technique that combines haptic sensation with computer graphics, and a new interface device comprising a flexible screen, an actuator array and a projector.
Abstract: This paper presents work carried out for a project to develop a new interactive technique that combines haptic sensation with computer graphics. The project has two goals. The first is to provide users with a spatially continuous surface on which they can effectively touch an image using any part of their bare hand, including the palm. The second goal is to present visual and haptic sensation simultaneously by using a single device that doesn't oblige the user to wear any extra equipment. In order to achieve these goals, we designed a new interface device comprising a flexible screen, an actuator array and a projector. The actuator array deforms the flexible screen onto which the image is projected. The user can then touch the image directly and feel its shape and rigidity. We initially fabricated two prototypes and examined their effectiveness through observations made by anonymous users and a performance evaluation test for spatial resolution.

349 citations


Journal ArticleDOI
TL;DR: A computer-based training system to simulate laparoscopic procedures in virtual environments for medical training is presented; the user can be trained to grasp and insert a flexible and freely moving catheter into the deformable cystic duct in virtual environments.
Abstract: We develop a computer-based training system to simulate laparoscopic procedures in virtual environments for medical training. The major hardware components of our system include a computer monitor to display visual interactions between 3D virtual models of organs and instruments together with a pair of force feedback devices interfaced with laparoscopic instruments to simulate haptic interactions. We simulate a surgical procedure that involves inserting a catheter into the cystic duct using a pair of laparoscopic forceps. This procedure is performed during laparoscopic cholecystectomy to search for gallstones in the common bile duct. Using the proposed system, the user can be trained to grasp and insert a flexible and freely moving catheter into the deformable cystic duct in virtual environments. The associated deformations are displayed on the computer screen and the reaction forces are fed back to the user through the force feedback devices. A hybrid modeling approach was developed to simulate the real-time visual and haptic interactions that take place between the forceps and the catheter, as well as the duct; and between the catheter and the duct.

294 citations


Patent
01 Nov 2001
TL;DR: In this article, a set of haptic elements (haptels) are arranged in a grid and each haptel is a haptic feedback device with linear motion and a touchable surface substantially perpendicular to the direction of motion.
Abstract: A set of haptic elements (haptels) is arranged in a grid. Each haptel is a haptic feedback device with linear motion and a touchable surface substantially perpendicular to the direction of motion. In a preferred embodiment, each haptel has a position sensor which measures the vertical position of the surface within its range of travel, a linear actuator which provides a controllable vertical bi-directional feedback force, and a touch location sensor on the touchable surface. All haptels have their sensors and effectors interfaced to a control processor. The touch location sensor readings are processed and sent to a computer, which returns the type of haptic response to use for each touch in progress. The control processor reads the position sensors, derives velocity, acceleration, net force and applied force measurements, and computes the desired force response for each haptel. The haptels are coordinated such that force feedback for a single touch is distributed across all haptels involved. This enables the feel of the haptic response to be independent of where the touch is located and how many haptels are involved in the touch. As a touch moves across the device, haptels are added to and removed from the coordination set such that the user experiences an uninterrupted haptic effect. Because the touch surface is composed of multiple haptels, the device can provide multiple simultaneous interactions, limited only by the size of the surface and the number of haptels. The size of the haptels determines the minimum distance between independent touches on the surface, but otherwise does not affect the properties of the device. Thus, the device is a pointing device for graphical user interfaces which provides dynamic haptic feedback under application control for multiple simultaneous interactions.

265 citations
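The coordination scheme described in the abstract above can be sketched in a few lines. This is a hypothetical illustration, assuming a square grid with a 10 mm haptel pitch, an invented `haptels_under_touch` helper, and an even split of the force response across the coordination set (the patent leaves the exact distribution policy open):

```python
import math

HAPTEL_PITCH = 10.0  # mm between haptel centers (assumed value)

def haptels_under_touch(x, y, radius):
    """Return grid coordinates of every haptel within `radius` of the touch."""
    cells = []
    r_cells = int(math.ceil(radius / HAPTEL_PITCH))
    cx, cy = round(x / HAPTEL_PITCH), round(y / HAPTEL_PITCH)
    for i in range(cx - r_cells, cx + r_cells + 1):
        for j in range(cy - r_cells, cy + r_cells + 1):
            if math.hypot(i * HAPTEL_PITCH - x, j * HAPTEL_PITCH - y) <= radius:
                cells.append((i, j))
    return cells

def distribute_force(total_force, cells):
    """Split one touch's force response evenly over its coordination set,
    so the feel is independent of how many haptels the touch covers."""
    share = total_force / len(cells)
    return {cell: share for cell in cells}
```

As a touch moves, recomputing `haptels_under_touch` each tick naturally adds and removes haptels from the coordination set while the summed response stays constant.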


Patent
15 May 2001
TL;DR: In this paper, an input control device with force sensors is configured to sense the hand movements of a surgeon performing robot-assisted microsurgery; the sensed hand movements actuate a mechanically decoupled robot manipulator.
Abstract: An input control device with force sensors is configured to sense hand movements of a surgeon performing a robot-assisted microsurgery. The sensed hand movements actuate a mechanically decoupled robot manipulator. A microsurgical manipulator, attached to the robot manipulator, is activated to move small objects and perform microsurgical tasks. A force-feedback element coupled to the robot manipulator and the input control device provides the input control device with an amplified sense of touch in the microsurgical manipulator.

253 citations


Proceedings ArticleDOI
01 Aug 2001
TL;DR: A physically based, deformable 3D brush model and a bidirectional, two-layer paint model allow the user to produce complex brush strokes intuitively, while haptic feedback enhances the sense of realism and provides critical tactile cues.
Abstract: We present a novel painting system with an intuitive haptic interface, which serves as an expressive vehicle for interactively creating painterly works. We introduce a deformable, 3D brush model, which gives the user natural control of complex brush strokes. The force feedback enhances the sense of realism and provides tactile cues that enable the user to better manipulate the paint brush. We have also developed a bidirectional, two-layer paint model that, combined with a palette interface, enables easy loading of complex blends onto our 3D virtual brushes to generate interesting paint effects on the canvas. The resulting system, DAB, provides the user with an artistic setting, which is conceptually equivalent to a real-world painting environment. Several users have tested DAB and were able to start creating original art work within minutes.

249 citations


Journal ArticleDOI
TL;DR: A vibration feedback model, a decaying sinusoidal waveform, was created by measuring the acceleration of the stylus of a three degree-of-freedom haptic display as a human user tapped it on several real materials; perceptual tuning of the model yielded different parameters than those derived strictly from acceleration data.
Abstract: Reality-based modeling of vibrations has been used to enhance the haptic display of virtual environments for impact events such as tapping, although the bandwidths of many haptic displays make it difficult to accurately replicate the measured vibrations. We propose modifying reality-based vibration parameters through a series of perceptual experiments with a haptic display. We created a vibration feedback model, a decaying sinusoidal waveform, by measuring the acceleration of the stylus of a three degree-of-freedom haptic display as a human user tapped it on several real materials. A series of perceptual experiments, where human users rated the realism of various parameter combinations, were performed to further enhance the realism of the vibration display for impact events. The results provided different parameters than those derived strictly from acceleration data. Additional experiments verified the effectiveness of these modified model parameters by showing that users could differentiate between materials in a virtual environment.

229 citations
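As an illustration of the decaying-sinusoid model described above, the sketch below samples a(t) = A·exp(−B·t)·sin(2πf·t) as an open-loop waveform to play on impact. The parameter values are invented for illustration; they are not the paper's measured or perceptually tuned values:

```python
import math

def impact_vibration(amplitude, decay, freq_hz, duration_s, sample_rate=1000):
    """Sample the decaying-sinusoid vibration waveform played on impact."""
    n = int(duration_s * sample_rate)
    return [
        amplitude * math.exp(-decay * (i / sample_rate))
        * math.sin(2 * math.pi * freq_hz * (i / sample_rate))
        for i in range(n)
    ]

# Stiffer materials are typically rendered with a higher frequency and a
# faster decay; these "wood"-like and "rubber"-like parameter sets are
# illustrative assumptions only.
wood = impact_vibration(amplitude=1.0, decay=80.0, freq_hz=250.0, duration_s=0.05)
rubber = impact_vibration(amplitude=1.0, decay=20.0, freq_hz=60.0, duration_s=0.05)
```

In practice the amplitude would also be scaled by the incoming tap velocity, which is one of the relationships the paper's perceptual experiments adjusted.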


Journal ArticleDOI
TL;DR: The authors conclude that presence may derive from the process of multi-modal integration and, therefore, may be associated with other illusions, such as cross-modal transfers, that result from the process of creating a coherent mental model of the space.
Abstract: How do users generate an illusion of presence in a rich and consistent virtual environment from an impoverished, incomplete, and often inconsistent set of sensory cues? We conducted an experiment to explore how multimodal perceptual cues are integrated into a coherent experience of virtual objects and spaces. Specifically, we explored whether inter-modal integration contributes to generating the illusion of presence in virtual environments. To discover whether intermodal integration might play a role in presence, we looked for evidence of intermodal integration in the form of cross-modal interactions---perceptual illusions in which users use sensory cues in one modality to “fill in” the “missing” components of perceptual experience. One form of cross-modal interaction, a cross-modal transfer, is defined as a form of synesthesia, that is, a perceptual illusion in which stimulation to a sensory modality connected to the interface (such as the visual modality) is accompanied by perceived stimulation to an unconnected sensory modality that receives no apparent stimulation from the virtual environment (such as the haptic modality). Users of our experimental virtual environment who manipulated the visual analog of a physical force, a virtual spring, reported haptic sensations of “physical resistance”, even though the interface included no haptic displays. A path model of the data suggested that this cross-modal illusion was correlated with and dependent upon the sensation of spatial and sensory presence. We conclude that this is evidence that presence may derive from the process of multi-modal integration and, therefore, may be associated with other illusions, such as cross-modal transfers, that result from the process of creating a coherent mental model of the space. 
Finally, we suggest that this perceptual phenomenon might be used to improve user experiences with multimodal interfaces, specifically by supporting limited sensory displays (such as haptic displays) with appropriate synesthetic stimulation to other sensory modalities (such as visual and auditory analogs of haptic forces).

229 citations


Patent
27 Sep 2001
TL;DR: In this paper, a haptic feedback interface device includes at least two actuator assemblies, each containing a moving inertial mass; in one embodiment, the control signals driving the assemblies have different duty cycles to provide directional sensations.
Abstract: Directional haptic feedback provided in a haptic feedback interface device. An interface device includes at least two actuator assemblies, which each include a moving inertial mass. A single control signal provided to the actuator assemblies at different magnitudes provides directional inertial sensations felt by the user. A greater magnitude waveform can be applied to one actuator to provide a sensation having a direction approximately corresponding to a position of that actuator in the housing. In another embodiment, the actuator assemblies each include a rotary inertial mass and the control signals have different duty cycles to provide directional sensations. For power-consumption efficiency, the control signals can be interlaced or pulsed at a different frequency and duty cycle to reduce average power requirements.

220 citations
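The magnitude-weighting idea in the first embodiment above can be sketched simply: with one actuator on the left of the housing and one on the right, driving them at different magnitudes biases the felt sensation toward one side. The linear crossfade and the `actuator_magnitudes` helper are assumptions for illustration, not the patent's exact scheme:

```python
def actuator_magnitudes(direction, base=1.0):
    """Map a desired direction in [-1, 1] (-1 = fully left, 0 = centered,
    +1 = fully right) to drive magnitudes for the left and right actuators."""
    left = base * (1.0 - direction) / 2.0
    right = base * (1.0 + direction) / 2.0
    return left, right
```

The same waveform is then sent to both assemblies scaled by these magnitudes, so the stronger inertial sensation appears to come from the heavier-driven actuator's position in the housing.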


Journal ArticleDOI
TL;DR: The authors' data indicate that the visual system recognizes the front view of objects best, whereas the hand recognizes objects best from the back, and when the sensory modalities differed between learning an object and recognizing it, recognition performance was best when the objects were rotated back-to-front between learning and recognition.
Abstract: On the whole, people recognize objects best when they see the objects from a familiar view and worse when they see the objects from views that were previously occluded from sight. Unexpectedly, we found haptic object recognition to be viewpoint-specific as well, even though hand movements were unrestricted. This viewpoint dependence was due to the hands preferring the back "view" of the objects. Furthermore, when the sensory modalities (visual vs. haptic) differed between learning an object and recognizing it, recognition performance was best when the objects were rotated back-to-front between learning and recognition. Our data indicate that the visual system recognizes the front view of objects best, whereas the hand recognizes objects best from the back.

216 citations


Proceedings ArticleDOI
01 Mar 2001
TL;DR: A system which permits manipulation of molecules in molecular dynamics simulations with real-time force feedback and graphical display, and results of an IMD study of a 4,000 atom system, the gramicidin A channel are discussed.
Abstract: We have implemented a system termed Interactive Molecular Dynamics (IMD), which permits manipulation of molecules in molecular dynamics simulations with real-time force feedback and graphical display. Communication is achieved through an efficient socket connection between the visualization program (VMD) and a molecular dynamics program (NAMD) running on single or multiple machines. A natural force feedback interface for molecular steering is provided by a haptic device. We model the effect of simulation speed on the haptic feedback, and discuss results of an IMD study of a 4,000 atom system, the gramicidin A channel.

Patent
16 Feb 2001
TL;DR: In this article, a haptic boundary is provided corresponding to scrollable or scalable portions of the application domain, and the user can position a cursor near such a boundary, feeling its presence haptically (reducing the requirement for visual attention for control of scrolling of the display).
Abstract: The present invention provides a method of human-computer interfacing that provides haptic feedback to control interface interactions such as scrolling or zooming within an application. Haptic feedback in the present method allows the user more intuitive control of the interface interactions, and allows the user's visual focus to remain on the application. The method comprises providing a control domain within which the user can control interactions. For example, a haptic boundary can be provided corresponding to scrollable or scalable portions of the application domain. The user can position a cursor near such a boundary, feeling its presence haptically (reducing the requirement for visual attention for control of scrolling of the display). The user can then apply force relative to the boundary, causing the interface to scroll the domain. The rate of scrolling can be related to the magnitude of applied force, providing the user with additional intuitive, non-visual control of scrolling.
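A minimal sketch of the force-to-scroll-rate mapping described above, assuming an invented deadband (so light contact with the boundary does not scroll) and a linear gain; the real patent leaves the exact mapping open:

```python
def scroll_rate(applied_force, deadband=0.2, gain=50.0):
    """Lines per second to scroll for a given force (in newtons) applied
    against the haptic boundary; zero until the deadband is exceeded."""
    excess = applied_force - deadband
    return gain * excess if excess > 0 else 0.0
```

A nonlinear (e.g. quadratic) mapping could equally be used to give finer control at low forces and faster scrolling at high forces.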

Journal ArticleDOI
01 Oct 2001
TL;DR: This work proposes a control architecture in which bi-directional information transfer occurs across the control interface, allowing the human to use the interface to simultaneously exert control and extract information.
Abstract: When humans interface with machines, the control interface is usually passive and its response contains little information pertinent to the state of the environment. Usually, information flows thro...

Patent
28 Sep 2001
TL;DR: A haptic feedback interface device, in communication with a host computer, includes a housing physically contacted by a user operating the interface device and a plurality of actuators producing inertial forces when the actuators are driven by control signals as mentioned in this paper.
Abstract: Directional haptic feedback for a haptic feedback interface device. A haptic feedback interface device, in communication with a host computer, includes a housing physically contacted by a user operating the interface device, and a plurality of actuators producing inertial forces when the actuators are driven by control signals. Each of the actuators includes a rotatable eccentric mass positioned offset on a rotating shaft of the actuator, where the actuators are rotated simultaneously such that centrifugal forces from the rotation of masses combine to output the inertial forces substantially only along a single axis having a desired direction approximately in a plane of rotation of the masses.
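The dual-eccentric-mass principle in the abstract above can be illustrated with a little trigonometry: two equal masses spun in opposite directions with mirrored phase produce centrifugal forces whose components perpendicular to a chosen axis cancel, leaving a net inertial force that oscillates along that single axis. The mass, radius, and speed values below are illustrative assumptions, not figures from the patent:

```python
import math

def net_force(m, r, omega, t, axis_angle=0.0):
    """Net (fx, fy) of two counter-rotating eccentric masses at time t.
    m: eccentric mass (kg), r: offset (m), omega: angular speed (rad/s)."""
    f = m * r * omega**2                 # centrifugal magnitude per mass
    a1 = omega * t + axis_angle          # mass 1 spins counter-clockwise
    a2 = -omega * t + axis_angle         # mass 2 spins clockwise, mirrored
    fx = f * (math.cos(a1) + math.cos(a2))
    fy = f * (math.sin(a1) + math.sin(a2))
    return fx, fy
```

By the sum-to-product identities, the result is 2·f·cos(ωt) directed along `axis_angle`: the perpendicular components cancel at every instant, so the user feels vibration along one controllable direction only.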

Proceedings ArticleDOI
01 Mar 2001
TL;DR: A novel, interactive sculpting framework founded upon subdivision solids and physics-based modeling, equipped with natural, haptic-based interaction to provide the user with a realistic sculpting experience.
Abstract: In this paper we systematically develop a novel, interactive sculpting framework founded upon subdivision solids and physics-based modeling. In contrast with popular subdivision surfaces, subdivision solids have the unique advantage of offering both the boundary representation and the interior material of a solid object. We unify the geometry of subdivision solids with the principle of physics-based models and formulate dynamic subdivision solids. Dynamic subdivision solids respond to applied forces in a natural and predictive manner and give the user the illusion of manipulating semi-elastic virtual clay. We have developed a real-time sculpting system that provides the user with a wide array of intuitive sculpting toolkits. The flexibility of the subdivision solid approach allows users to easily modify the topology of sculpted objects, while the inherent physical properties are exploited to provide a natural interface for direct, force-based deformation. More importantly, our sculpting system is equipped with natural, haptic-based interaction to provide the user with a realistic sculpting experience. CR Categories: I.3.5 [Computer Graphics]: Physics-based modeling; I.3.3 [Computer Graphics]: Modeling packages; I.3.6 [Computer Graphics]: Interaction techniques; H.5.2 [User Interfaces]: Haptic I/O; I.3.7 [Computer Graphics]: Virtual reality; I.3.7 [Computer Graphics]: Animation.

Patent
20 Jun 2001
TL;DR: In this paper, a chat interface allowing a user to exchange haptic chat messages with other users in a chat session over a computer network is presented, where the haptic sensation is based at least in part on force information received from the remote computer.
Abstract: A chat interface allowing a user to exchange haptic chat messages with other users in a chat session over a computer network. A chat interface can be displayed by a local computer and receives input data from a user of the local computer, such as text characters or speech input. The input data provides an outgoing chat message that can include sent force information. The outgoing chat message is sent to a remote computer that is connected to the local host computer via a computer network. The remote computer can display a chat interface and output a haptic sensation to a user of the remote computer based at least in part on the force information in the outgoing chat message. An incoming message from the remote computer can also be received at the chat interface, which may also include received force information. The incoming chat message is displayed on a display device to the user of the local computer. A haptic sensation can be output to the user of the local computer using a haptic device coupled to the local computer, where the haptic sensation is based at least in part on the force information received from the remote computer.
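A chat message that bundles force information with ordinary text might look like the sketch below. The field names and the "buzz" effect identifier are invented for illustration; the patent does not prescribe a wire format:

```python
import json

def make_chat_message(text, effect, magnitude, duration_ms):
    """Serialize an outgoing chat message with attached force information."""
    return json.dumps({
        "text": text,
        "haptic": {"effect": effect,
                   "magnitude": magnitude,
                   "duration_ms": duration_ms},
    })

def render_incoming(raw):
    """Display the incoming text; return the force information so the
    local haptic device can output the corresponding sensation."""
    msg = json.loads(raw)
    print(msg["text"])
    return msg.get("haptic")
```

A receiver without a haptic device could simply ignore the `haptic` field, which is one way such a protocol degrades gracefully.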

17 Oct 2001
TL;DR: In this paper, the authors investigated the benefits of force feedback for virtual reality training of a real task, the construction of a LEGO biplane model, and found a significant change in performance due to training level.
Abstract: This paper describes an experiment conducted to investigate the benefits of force feedback for virtual reality training of a real task. Three groups of subjects received different levels of training before completing a manual task, the construction of a LEGO biplane model. One group trained on a Virtual Building Block (VBB) simulation, which emulated the real task in a virtual environment, including haptic feedback. A second group was also trained on the VBB system, but without the force feedback. The last group received no virtual reality training. Completion times were compared for these different groups in building the actual biplane model in the real world. An ANOVA showed a significant change in performance due to training level.

Journal ArticleDOI
TL;DR: This paper studies Internet-based teleoperation systems that include haptic feedback, concentrating on the control of such systems and their performance, and examines key issues, such as stability, synchronization, and transparency.
Abstract: Many tasks that can be done easily by humans turn out to be very difficult to accomplish with a teleoperated robot. The main reason for this is the lack of tactile sensing, which cannot be replaced by visual feedback alone. Once haptic devices are developed, their potential in many fields is obvious, especially in teleoperation systems, where haptic feedback can increase efficiency and even render some tasks feasible. This paper studies Internet-based teleoperation systems that include haptic feedback, concentrating on the control of such systems and their performance. The potential of this technology and its advantages are explored. In addition, key issues such as stability, synchronization, and transparency are analyzed and studied. Specifically, event-based planning and control of Internet-based teleoperation systems is presented, with experimental results from several implemented system scenarios at micro- and macro-scales.
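The core of event-based planning can be shown in a toy loop: instead of indexing the reference trajectory by wall-clock time, which variable Internet delay corrupts, the planner advances one step each time the remote side acknowledges the previous command. This is only a sketch of the indexing idea; the paper's control framework is far more elaborate:

```python
def run_teleoperation(trajectory, ack_stream):
    """Send the next waypoint only when an acknowledgment event arrives,
    so master and slave stay synchronized regardless of network delay."""
    sent = []
    event = 0                            # event count, not elapsed time
    for ack in ack_stream:
        if ack and event < len(trajectory):
            sent.append(trajectory[event])
            event += 1
    return sent
```

Because progress is gated on events, a burst of delay (a run of missing acknowledgments) simply pauses the reference rather than letting it run ahead of the remote robot.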

Journal ArticleDOI
TL;DR: The authors consider the detection of small surface features, such as cracks, bumps, and ridges, on the surface of an object during haptic exploration and dexterous manipulation and presents several algorithms for feature detection based on feature definitions.
Abstract: The authors consider the detection of small surface features, such as cracks, bumps, and ridges, on the surface of an object during haptic exploration and dexterous manipulation. Surface feature de...

Journal ArticleDOI
TL;DR: An objective test for evaluating upper limb (UL) function in patients with neurological diseases (ND) is presented, based on creating a virtual environment, using a computer display for visual information and a PHANTOM haptic interface for providing tactile feedback.
Abstract: An objective test for evaluating upper limb (UL) function in patients with neurological diseases (ND) is presented. The method allows assessment of the kinematic and dynamic motor abilities of the UL. Our methodology is based on creating a virtual environment, using a computer display for visual information and a PHANTOM haptic interface. The haptic interface is used as a kinematic measuring device and for providing tactile feedback to the patient. In the virtual environment, a labyrinth in the patient's frontal plane was created at the start of each test. By moving the haptic interface control stick, the patient was able to move the pointer (a ball) through the labyrinth in three dimensions and to feel the reactive forces of the wall. The new test offers a wide range of numerical and graphic results. It has so far been applied to 13 subjects with various forms of ND (e.g., Friedreich's Ataxia, Parkinson's disease, Multiple Sclerosis) as well as to healthy subjects. A comparison in performance between the right and left UL has been carried out in healthy subjects.

Patent
16 Jan 2001
TL;DR: In this paper, a haptic pointing device includes a plurality of rigid, elongated proximal members, each connected to a separate rigid elongated distal member through an articulating joint.
Abstract: A haptic pointing device includes a plurality of rigid, elongated proximal members, each connected to a separate rigid, elongated distal member through an articulating joint. The other end of each proximal member is coupled to an actuator such as a motor, causing that member to swing within a separate plane perpendicular to the shaft of the motor in response to a control signal. An end-effector is interconnected to the second end of each distal member through an articulating joint, such that as the actuators move the proximal members, the end-effector moves in space. In a preferred embodiment, the device includes at least three proximal members and three distal members, and the end-effector is coupled to a user-graspable element such as a stylus which retains a preferred orientation in space as the members are driven by the actuators. In a force-feedback application, the haptic pointing device further includes a position sensor associated with each degree of freedom, and haptic processing means interfaced to a virtual reality system or teleoperations environment. Additional components may be provided to increase flexibility, degrees of freedom, or both.

Patent
16 Mar 2001
TL;DR: In this paper, an elongated member of a medical instrument can be sensed and an actuator can be used to apply force to the instrument for control and manipulation of the instrument.
Abstract: Apparatus and method for controlling force applied to and for manipulation of medical instruments. An elongated member of a medical instrument can be sensed, and an actuator can be used to apply force to the medical instrument for control and manipulation of the instrument. Via use of the applied forces, the instrument can be moved to a desired position in a working channel, haptic indications of position can be output to the user, and/or user control over the instrument can be enhanced.


Journal ArticleDOI
TL;DR: Two related examples in which a humanoid robot determines the models and representations that govern its behavior are shown, including a model that captures the dynamics of a haptic exploration of an object with a dextrous robot hand that supports skillful grasping.

Proceedings ArticleDOI
02 Mar 2001
TL;DR: Three novel user interfaces for remote driving are presented: GestureDriver, HapticDriver and PdaDriver, along with the motivation for and design of each interface.
Abstract: Remote driving is a difficult task. Not only do operators have problems perceiving and evaluating the remote environment, but they frequently make incorrect or sub-optimal control decisions. Thus, there is a need to develop alternative approaches which make remote driving easier and more productive. To address this need, we have developed three novel user interfaces: GestureDriver, HapticDriver and PdaDriver. In this paper, we present the motivation for and design of each interface. We also discuss research issues related to the use of gesture, haptics, and palm-size computers for remote driving. Finally, we describe lessons learned, potential applications and planned extensions for each interface.

Journal ArticleDOI
TL;DR: A novel and natural haptic interface is proposed and a physics-based geometric modeling approach is presented that facilitates interactive sculpting of spline-based virtual material in CAD/CAM, virtual prototyping, human–computer interface, and medical training and simulation.
Abstract: Conventional geometric design techniques based on B-splines and NURBS often require tedious control-point manipulation and/or painstaking constraint specification via unnatural mouse-based computer interfaces. In this paper, we propose a novel and natural haptic interface and present a physics-based geometric modeling approach that facilitates interactive sculpting of spline-based virtual material. Using the PHANToM haptic device, modelers can feel the physically realistic presence of virtual spline objects and interactively deform virtual materials with force feedback throughout the design process. We develop various haptic sculpting tools to expedite the deformation of B-spline surfaces with haptic feedback and constraints. The most significant contribution of this paper is that point, normal, and curvature constraints can be specified interactively and modified naturally using forces. To achieve the real-time sculpting performance, we devise a novel dual representation for B-spline surfaces in both physical and mathematical space: the physics-based mass-spring model is mathematically constrained by the B-spline surface throughout the sculpting session. We envision that the integration of haptics with traditional computer-aided design makes it possible to realize all the potential offered by both haptic sculpting and physics-based modeling in CAD/CAM, virtual prototyping, human–computer interface, and medical training and simulation.

Proceedings ArticleDOI
11 Nov 2001
TL;DR: It is demonstrated how continuous interaction through a haptically actuated device rather than discrete button and key presses can produce simple yet powerful tools that leverage physical intuition.
Abstract: We introduce a set of techniques for haptically manipulating digital media such as video, audio, voicemail and computer graphics, utilizing virtual mediating dynamic models based on intuitive physical metaphors. For example, a video sequence can be modeled by linking its motion to a heavy spinning virtual wheel: the user browses by grasping a physical force-feedback knob and engaging the virtual wheel through a simulated clutch to spin or brake it, while feeling the passage of individual frames. These systems were implemented on a collection of single axis actuated displays (knobs and sliders), equipped with orthogonal force sensing to enhance their expressive potential. We demonstrate how continuous interaction through a haptically actuated device rather than discrete button and key presses can produce simple yet powerful tools that leverage physical intuition.
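The spinning-wheel metaphor described above can be sketched as a tiny physics loop: the media position is tied to a virtual flywheel; while the simulated clutch is engaged, user torque from the knob spins it, and viscous friction slowly brakes it after release. The inertia, friction, and torque values are illustrative assumptions, not the paper's tuned parameters:

```python
def step_wheel(omega, user_torque, clutch_engaged,
               inertia=0.05, friction=0.2, dt=0.001):
    """Advance the virtual wheel's angular velocity by one control tick."""
    torque = user_torque if clutch_engaged else 0.0
    torque -= friction * omega           # viscous braking, always present
    return omega + (torque / inertia) * dt

# Spin the wheel up for 1 s, then release the clutch and let it coast.
omega = 0.0
for _ in range(1000):
    omega = step_wheel(omega, user_torque=0.5, clutch_engaged=True)
peak = omega
for _ in range(1000):
    omega = step_wheel(omega, user_torque=0.0, clutch_engaged=False)
```

Mapping `omega` to frames per second of video playback, and rendering a detent force as each frame boundary passes, gives the browsing behavior the paper describes.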

Proceedings ArticleDOI
01 Jan 2001
TL;DR: An epidural injection simulator for medical training and education that provides the user with realistic feel encountered during an actual procedure and includes a new training feature called "Haptic Guidance" that allows the user to follow a previously recorded expert procedure and feel the encountered forces.
Abstract: Performing epidural injections is a complex task that demands a high level of skill and precision from the physician, since an improperly performed procedure can result in serious complications for the patient. The objective of our project is to create an epidural injection simulator for medical training and education that provides the user with the realistic feel encountered during an actual procedure. We have used a Phantom haptic interface by SensAble Technologies, which is capable of three-dimensional force feedback, to simulate interactions between the needle and bones or tissues. An additional degree-of-freedom through an actual syringe was incorporated to simulate the "loss of resistance" effect, commonly considered to be the most reliable method for identifying the epidural space during an injection procedure. The simulator also includes a new training feature called "Haptic Guidance" that allows the user to follow a previously recorded expert procedure and feel the encountered forces. Evaluations of the simulator by experienced professionals indicate that the simulation system has considerable potential to become a useful aid in medical training.

Journal ArticleDOI
TL;DR: The results suggest that observers can involuntarily compare visual and haptic percepts in order to evaluate the relative reliabilities of visual cues, and that these reliabilities determine how cues are combined during three-dimensional visual perception.

Book
01 Dec 2001
TL;DR: Touch in Virtual Environments: Haptics and the Design of Interactive Systems is an outgrowth of a one-day conference on haptics held at the University of Southern California in February, 2001, sponsored by USC's Integrated Media Systems Center, a National Science Foundation Engineering Research Center, the Annenberg School for Communication at USC, and the IEEE Control Systems Society as mentioned in this paper.
Abstract: From the Book: Preface. The haptic interface is becoming an increasingly important component of immersive systems. Haptics refers to the modality of touch and the sensation of shape and texture an observer feels when exploring a virtual object, such as a three-dimensional model of a tool, instrument, or art object. Researchers in the field are interested in developing, refining, and testing haptic devices and interfaces, and applying findings from psychological studies of human touch to the simulation of the tactile sense in virtual environments. Touch in Virtual Environments: Haptics and the Design of Interactive Systems is an outgrowth of a one-day conference on haptics held at the University of Southern California in February 2001, sponsored by USC's Integrated Media Systems Center, a National Science Foundation Engineering Research Center; the Annenberg School for Communication at USC; and the IEEE Control Systems Society. Many of the chapters were first presented as papers at that venue. The contributors to this volume, who represent a variety of academic disciplines and institutional affiliations, are researchers who can fairly be said to be working at the cutting edge of engineering science, in an area that is just beginning to have an impact in the design of immersive systems. In Chapters 1-8 of this book, the contributors ponder questions about the haptic interface, such as: How can current state-of-the-art haptic displays be improved via better sensing? What are the software tools and models needed to facilitate multi-user tactile exploration of shared virtual environments? How can we optimize low-level force control for haptic devices? What algorithms and techniques are needed to convey the feel of deformable objects? How do we capture users' exploration with haptic devices? How do we compress haptic exploration data so that it becomes possible to store or transmit long interactive sessions?
In Chapters 9-12, the contributors consider the impact of the unpredictable, and highly variable, "human-in-the-loop". They examine questions like the following: How can we make haptic displays more usable for blind and visually impaired users? What are the differences between perceiving texture with the bare skin and with a probe, and how do factors like probe size and speed contribute? What can we learn about human thresholds for detecting small haptic effects that will be useful for the design of hand-held devices? To what extent do vision, sound, and haptics complement or interfere with one another in multimodal interactive systems? In addition to exploring basic research issues in haptics such as acquisition of models, contact detection, force feedback, compression, capture, collaboration, and human factors, the contributors to Touch in Virtual Environments describe in detail several promising applications. A primary application area for haptics has been in surgical simulation and medical training. Haptics has also been incorporated into scientific visualization, providing an intuitive interface to complex displays of biological and geoscientific data. In some projects haptic displays have been used as alternative input devices for painting, sculpting, and computer-assisted design. There have also been instances of the application of haptics to military training and simulation, providing an accurate source of orientation information in land, sea, and aerospace environments. In Chapters 13-15 the reader will find accounts of applications to telesurgery and surgical simulation, sign language recognition, and museum display.