Proceedings ArticleDOI

Recordable Haptic textures

01 Jan 2006, pp. 130-133
TL;DR: A method is presented to record the surface texture of real-life objects such as metal files and sandpaper and to play it back on virtual surfaces using commonly available haptic hardware, the 3-DOF SensAble PHANToM.
Abstract: In this paper we present a method to record the surface texture of real-life objects such as metal files and sandpaper. These textures can subsequently be played back on virtual surfaces. Our method has the advantage that it can record textures using commonly available haptic hardware; we use the 3-DOF SensAble PHANToM to record the textures. The algorithm involves creating recordings of the frequency content of a real surface by exploring it with a haptic device. We estimate the frequency spectra at two different velocities and subsequently interpolate between them on a virtual surface. The extent of correlation between the real and simulated spectra was estimated, and a near-exact spectral match was obtained. The simulated texture was played back using the same haptic device. The algorithm to record and play back textures is simple and can be easily implemented for planar surfaces with uniform textures.
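As a concrete illustration of the pipeline described in the abstract (estimate magnitude spectra of the vertical perturbations at two scan velocities, interpolate between them according to the current tip velocity, then play the result back), a minimal Python sketch follows. The function names and the random-phase synthesis step are assumptions made for illustration, not the authors' exact implementation.

import numpy as np

def magnitude_spectrum(displacement, n_fft=1024):
    # Magnitude spectrum of a recorded vertical-perturbation signal.
    return np.abs(np.fft.rfft(displacement, n=n_fft))

def interpolate_spectrum(spec_slow, spec_fast, v, v_slow, v_fast):
    # Linear interpolation between spectra recorded at two scan velocities.
    alpha = np.clip((v - v_slow) / (v_fast - v_slow), 0.0, 1.0)
    return (1.0 - alpha) * spec_slow + alpha * spec_fast

def synthesize_perturbation(spectrum, rng):
    # One possible playback strategy: impose the target magnitude spectrum
    # on a random-phase signal and transform back to the time domain.
    phase = rng.uniform(0.0, 2.0 * np.pi, size=spectrum.shape)
    return np.fft.irfft(spectrum * np.exp(1j * phase))

rng = np.random.default_rng(0)
slow = rng.standard_normal(4096)   # stand-ins for recordings at two velocities
fast = rng.standard_normal(4096)
spec = interpolate_spectrum(magnitude_spectrum(slow), magnitude_spectrum(fast),
                            v=0.03, v_slow=0.01, v_fast=0.05)
perturbation = synthesize_perturbation(spec, rng)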
Citations
Journal ArticleDOI
TL;DR: This paper presents a set of methods for creating a haptic texture model from tool-surface interaction data recorded by a human in a natural and unconstrained manner and uses these texture model sets to render synthetic vibration signals in real time as a user interacts with the TexturePad system.
Abstract: Texture gives real objects an important perceptual dimension that is largely missing from virtual haptic interactions due to limitations of standard modeling and rendering approaches. This paper presents a set of methods for creating a haptic texture model from tool-surface interaction data recorded by a human in a natural and unconstrained manner. The recorded high-frequency tool acceleration signal, which varies as a function of normal force and scanning speed, is segmented and modeled as a piecewise autoregressive (AR) model. Each AR model is labeled with the source segment's median force and speed values and stored in a Delaunay triangulation to create a model set for a given texture. We use these texture model sets to render synthetic vibration signals in real time as a user interacts with our TexturePad system, which includes a Wacom tablet and a stylus augmented with a Haptuator. We ran a human-subject study with two sets of ten participants to evaluate the realism of our virtual textures and the strengths and weaknesses of this approach. The results indicated that our virtual textures accurately capture and recreate the roughness of real textures, but other modeling and rendering approaches are required to completely match surface hardness and slipperiness.
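The core modeling step described above (fit a low-order autoregressive model to each segment of the recorded acceleration, then file the model under the segment's median force and speed) can be sketched in a few lines of Python. The least-squares AR fit below is one standard way to estimate the coefficients and is an assumption for illustration, not the paper's exact procedure.

import numpy as np

def fit_ar(segment, order=10):
    # Least-squares fit of AR coefficients to one acceleration segment:
    # segment[t] is predicted from the previous `order` samples.
    X = np.column_stack([segment[order - k - 1:len(segment) - k - 1]
                         for k in range(order)])
    y = segment[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    noise_var = np.var(y - X @ coeffs)   # variance of the residual (driving noise)
    return coeffs, noise_var

rng = np.random.default_rng(0)
segment = rng.standard_normal(2000)      # stand-in for one acceleration segment
coeffs, noise_var = fit_ar(segment)
# Each (coeffs, noise_var) pair would then be labeled with the segment's median
# force and speed and stored in a Delaunay triangulation over force-speed space.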

140 citations


Cites background from "Recordable Haptic textures"

  • ...types of data used to model textures include friction variations [25] and vertical perturbations [26] resulting from dragging across a surface....


Journal ArticleDOI
TL;DR: The tactual scanning of five naturalistic textures was recorded with a high-resolution apparatus; the recordings showed that the transformation from the geometry of a surface to the force of traction, and hence to the skin deformation experienced by a finger, is a highly nonlinear process, and led the authors to speculate that the mechanical properties of the finger enable spatial information to be used for perceptual purposes in humans without distributed sensing.
Abstract: The tactual scanning of five naturalistic textures was recorded with an apparatus that is capable of measuring the tangential interaction force with a high degree of temporal and spatial resolution. The resulting signal showed that the transformation from the geometry of a surface to the force of traction and, hence, to the skin deformation experienced by a finger is a highly nonlinear process. Participants were asked to identify simulated textures reproduced by stimulating their fingers with rapid, imposed lateral skin displacements as a function of net position. They performed the identification task with a high degree of success, yet not perfectly. The fact that the experimental conditions eliminated many aspects of the interaction, including low-frequency finger deformation, distributed information, as well as normal skin movements, shows that the nervous system is able to rely on only two cues: amplitude and spectral information. The examination of the “spatial spectrograms” of the imposed lateral skin displacement revealed that texture could be represented spatially, despite being sensed through time and that these spectrograms were distinctively organized into what could be called “spatial formants.” This finding led us to speculate that the mechanical properties of the finger enables spatial information to be used for perceptual purposes in humans with no distributed sensing, which is a principle that could be applied to robots.

122 citations


Cites background from "Recordable Haptic textures"

  • ...eration of a stylus is measured, or in [24], where the scanning velocity is measured....


Proceedings ArticleDOI
20 Mar 2014
TL;DR: To increase the adaptability and utility of HaTT, a method is presented for resampling the texture models so they can be rendered at sampling rates other than the 10 kHz used when recording the data.
Abstract: This paper introduces the Penn Haptic Texture Toolkit (HaTT), a publicly available repository of haptic texture models for use by the research community. HaTT includes 100 haptic texture and friction models, the recorded data from which the models were made, images of the textures, and the code and methods necessary to render these textures using an impedance-type haptic interface such as a SensAble Phantom Omni. This paper reviews our previously developed methods for modeling haptic virtual textures, describes our technique for modeling Coulomb friction between a tooltip and a surface, discusses the adaptation of our rendering methods for display using an impedance-type haptic device, and provides an overview of the information included in the toolkit. Each texture and friction model was based on a ten-second recording of the force, speed, and high-frequency acceleration experienced by a handheld tool moved by an experimenter against the surface in a natural manner. We modeled each texture's recorded acceleration signal as a piecewise autoregressive (AR) process and stored the individual AR models in a Delaunay triangulation as a function of the force and speed used when recording the data. To increase the adaptability and utility of HaTT, we developed a method for resampling the texture models so they can be rendered at a sampling rate other than the 10 kHz used when recording data. Measurements of the user's instantaneous normal force and tangential speed are used to synthesize texture vibrations in real time. These vibrations are transformed into a texture force vector that is added to the friction and normal force vectors for display to the user.
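The final rendering step mentioned in the abstract, adding a texture force vector to the friction and normal force vectors, can be illustrated with the short Python sketch below. The stiffness-based normal force, the Coulomb friction term, and the choice to apply the texture force along the surface normal are simplifying assumptions, not the toolkit's exact conventions.

import numpy as np

def rendered_force(penetration_depth, surface_normal, tangential_velocity,
                   stiffness, mu, texture_vibration):
    # Normal restoring force + Coulomb friction opposing motion + texture force
    # driven by the synthesized vibration.
    n = surface_normal / np.linalg.norm(surface_normal)
    f_normal = stiffness * penetration_depth * n
    speed = np.linalg.norm(tangential_velocity)
    if speed > 1e-9:
        f_friction = -mu * np.linalg.norm(f_normal) * tangential_velocity / speed
    else:
        f_friction = np.zeros(3)
    f_texture = texture_vibration * n
    return f_normal + f_friction + f_texture

force = rendered_force(penetration_depth=0.001,
                       surface_normal=np.array([0.0, 1.0, 0.0]),
                       tangential_velocity=np.array([0.02, 0.0, 0.0]),
                       stiffness=800.0, mu=0.3, texture_vibration=0.15)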

101 citations

Proceedings ArticleDOI
14 Apr 2013
TL;DR: A new method is presented for creating haptic texture models from data recorded during natural and unconstrained motions with a new haptic recording device, along with a new spectral metric for determining the perceptual match of the models, used to evaluate the effectiveness and consistency of the segmenting and modeling approach.
Abstract: If you pick up a tool and drag its tip across a table, a rock, or a swatch of fabric, you are able to feel variations in the textures even though you are not directly touching them. These vibrations are characteristic of the material and the motions made when interacting with the surface. This paper presents a new method for creating haptic texture models from data recorded during natural and unconstrained motions using a new haptic recording device. The recorded vibration data is parsed into short segments that represent the feel of the surface at the associated tool force and speed. We create a low-order auto-regressive (AR) model for each data segment and construct a Delaunay triangulation of models in force-speed space for each surface. During texture rendering, we stably interpolate between these models using barycentric coordinates and drive the interpolated model with white noise to output synthetic vibrations. Our methods were validated through application to data recorded by eight human subjects and the experimenter interacting with six textures. We present a new spectral metric for determining perceptual match of the models in order to evaluate the effectiveness and consistency of the segmenting and modeling approach. Multidimensional scaling (MDS) on the pairwise differences in the synthesized vibrations shows that the 54 created texture models cluster by texture in a two-dimensional perceptual space.
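The lookup-and-interpolate step (locate the current force-speed operating point inside the Delaunay triangulation, weight the three surrounding AR models by barycentric coordinates, and drive the result with white noise) is sketched below in Python. The scipy-based lookup and the single-step synthesis function are illustrative assumptions; the stable parameterization in which the paper actually interpolates the models is omitted here.

import numpy as np
from scipy.spatial import Delaunay

labels = np.array([[0.5, 20.0], [1.5, 30.0], [2.0, 70.0], [0.8, 80.0]])  # (force, speed)
tri = Delaunay(labels)

point = np.array([1.2, 50.0])                        # current force and speed
simplex = int(tri.find_simplex(point[None, :])[0])
T = tri.transform[simplex]
partial = T[:2] @ (point - T[2])
weights = np.append(partial, 1.0 - partial.sum())    # barycentric coordinates

def ar_step(history, coeffs, noise_std, rng):
    # One sample of white-noise-driven AR synthesis for the interpolated model.
    return history @ coeffs + rng.normal(0.0, noise_std)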

61 citations


Cites background from "Recordable Haptic textures"

  • ...Previous work in data-driven texture modeling [11,24,26,29] has shown that high-frequency vibrations created during interactions with real textures can be recreated from models made with recorded data....


Proceedings ArticleDOI
04 Mar 2012
TL;DR: The TexturePad system described in this paper uses a low-order auto-regressive moving-average (ARMA) model for texture modeling and rendering to generate a stable and spectrally accurate vibration waveform in real time.
Abstract: Dragging a tool across a textured object creates rich high-frequency vibrations that distinctly convey the physical interaction between the tool tip and the object surface. Varying one's scanning speed and normal force alters these vibrations, but it does not change the perceived identity of the tool or the surface. Previous research developed a promising data-driven approach to embedding this natural complexity in a haptic virtual environment: the approach centers on recording and modeling the tool contact accelerations that occur during real texture interactions at a limited set of force-speed combinations. This paper aims to optimize these prior methods of texture modeling and rendering to improve system performance and enable potentially higher levels of haptic realism. The key elements of our approach are drawn from time series analysis, speech processing, and discrete-time control. We represent each recorded texture vibration with a low-order auto-regressive moving-average (ARMA) model, and we optimize this set of models for a specific tool-surface pairing (plastic stylus and textured ABS plastic) using metrics that depend on spectral match, final prediction error, and model order. For rendering, we stably resample the texture models at the desired output rate, and we derive a new texture model at each time step using bilinear interpolation on the line spectral frequencies of the resampled models adjacent to the user's current force and speed. These refined processes enable our TexturePad system to generate a stable and spectrally accurate vibration waveform in real time, moving us closer to the goal of virtual textures that are indistinguishable from their real counterparts.
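The interpolation step named in the abstract, bilinear interpolation on line spectral frequencies (LSFs) over the user's current force and speed, is easy to sketch once the LSF vectors of the four surrounding models are available. The helper below is an illustrative assumption; extracting the LSFs and converting the interpolated result back into an ARMA filter are described in the paper and omitted here.

import numpy as np

def bilinear_lsf(lsf00, lsf10, lsf01, lsf11, f, s, f0, f1, s0, s1):
    # lsfXY: LSF vector of the model at force index X and speed index Y.
    tf = (f - f0) / (f1 - f0)
    ts = (s - s0) / (s1 - s0)
    return ((1 - tf) * (1 - ts) * lsf00 + tf * (1 - ts) * lsf10
            + (1 - tf) * ts * lsf01 + tf * ts * lsf11)

lsf = bilinear_lsf(np.array([0.30, 0.90, 1.70]), np.array([0.40, 1.00, 1.80]),
                   np.array([0.35, 0.95, 1.75]), np.array([0.50, 1.10, 1.90]),
                   f=1.2, s=50.0, f0=1.0, f1=1.5, s0=40.0, s1=60.0)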

52 citations


Cites methods from "Recordable Haptic textures"

  • ...Vasudevan and Manivannan [26] used a SensAble PHANToM to drag across a textured surface and modeled the resulting haptic texture from the frequency spectrum of the tooltip’s vertical displacements....


References
Proceedings ArticleDOI
01 Feb 1990
TL;DR: The force display technology used in the Sandpaper system is a motor-driven two-degree-of-freedom joystick; the system computes the appropriate forces for the joystick's motors in real time.

546 citations


"Recordable Haptic textures" refers methods in this paper

  • ...To incorporate these features in a virtual model, we require a quick and efficient method of recording and reproducing these textures....


Proceedings ArticleDOI
01 Aug 2001
TL;DR: A system for constructing computer models of several aspects of physical interaction behavior, by scanning the response of real objects, using a highly automated robotic facility that can scan behavior models of whole objects.
Abstract: We describe a system for constructing computer models of several aspects of physical interaction behavior, by scanning the response of real objects. The behaviors we can successfully scan and model include deformation response, contact textures for interaction with force-feedback, and contact sounds. The system we describe uses a highly automated robotic facility that can scan behavior models of whole objects. We provide a comprehensive view of the modeling process, including selection of model structure, measurement, estimation, and rendering at interactive rates. The results are demonstrated with two examples: a soft stuffed toy which has significant deformation behavior, and a hard clay pot which has significant contact textures and sounds. The results described here make it possible to quickly construct physical interaction models of objects for applications in games, animation, and e-commerce.

209 citations

Journal ArticleDOI
TL;DR: An efficient haptic rendering method for displaying the feel of 3-D polyhedral objects in virtual environments (VEs) using a hierarchical database, multithreading techniques, and efficient search procedures to reduce the computational time.
Abstract: Computer haptics, an emerging field of research that is analogous to computer graphics, is concerned with the generation and rendering of haptic virtual objects. In this paper, we propose an efficient haptic rendering method for displaying the feel of 3-D polyhedral objects in virtual environments (VEs). Using this method and a haptic interface device, the users can manually explore and feel the shape and surface details of virtual objects. The main component of our rendering method is the “neighborhood watch” algorithm that takes advantage of precomputed connectivity information for detecting collisions between the end effector of a force-reflecting robot and polyhedral objects in VEs. We use a hierarchical database, multithreading techniques, and efficient search procedures to reduce the computational time such that the haptic servo rate after the first contact is essentially independent of the number of polygons that represent the object. We also propose efficient methods for displaying surface properties of objects such as haptic texture and friction. Our haptic-texturing techniques and friction model can add surface details onto convex or concave 3-D polygonal surfaces. These haptic-rendering techniques can be extended to display dynamics of rigid and deformable objects.
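The key property claimed for the "neighborhood watch" algorithm, that per-cycle cost after first contact is independent of the total polygon count because only the previously contacted primitive and its precomputed neighbors are examined, can be illustrated with a deliberately simplified local walk over a mesh's connectivity graph. This centroid-based walk is only a stand-in for the paper's actual contact test and data structures.

import numpy as np

class Triangle:
    def __init__(self, centroid):
        self.centroid = np.asarray(centroid, dtype=float)
        self.neighbors = []          # filled in from precomputed connectivity

def track_contact(current, probe_point):
    # Walk to whichever neighboring triangle is closest to the probe point;
    # stop at a local minimum. Cost per call depends only on local connectivity.
    while True:
        best = min([current] + current.neighbors,
                   key=lambda t: np.linalg.norm(t.centroid - probe_point))
        if best is current:
            return current
        current = best

a, b = Triangle([0.0, 0.0, 0.0]), Triangle([1.0, 0.0, 0.0])
a.neighbors, b.neighbors = [b], [a]
closest = track_contact(a, np.array([0.9, 0.1, 0.0]))   # returns triangle b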

201 citations


"Recordable Haptic textures" refers methods in this paper

  • ...We present in this paper a simple and easily implementable method....


  • ...This algorithm is simple and an advantage of this method is the ease with which the texture is recorded and played....


Proceedings ArticleDOI
22 Apr 1996
TL;DR: The authors present a simple, fast algorithm to synthesize haptic textures from statistical properties of surfaces, which has been successfully implemented on a two-degree-of-freedom haptic interface (the Pantograph).
Abstract: All objects have a surface roughness which manifests itself as small forces when objects slide under load against each other. Simulating this roughness haptically enriches the interaction between a user and a virtual world, just as creating graphical textures enhances the depiction of a scene. As with graphical textures, a major design constraint for haptic textures is the generation of a sufficiently "realistic" texture given hard constraints on computational costs. The authors present a simple, fast algorithm to synthesize haptic textures from statistical properties of surfaces. The synthesized texture can be overlaid on other contact models, such as hard contact with Coulomb friction. The algorithm requires minimal hardware support, and can be implemented on a variety of force-feedback mechanisms. It has been successfully implemented on a two-degree-of-freedom haptic interface (the Pantograph).
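A minimal Python sketch of the statistical-texture idea, overlaying a zero-mean random force perturbation on a Coulomb-friction contact, is given below. The Gaussian perturbation and its scaling by roughness and normal force are assumptions about one way to realize the approach, not the Pantograph implementation itself.

import numpy as np

def textured_friction_force(tangential_velocity, normal_force, mu, roughness, rng):
    # Coulomb friction opposing motion, plus a zero-mean random texture force
    # whose spread grows with surface roughness and contact force.
    friction = -np.sign(tangential_velocity) * mu * normal_force
    texture = rng.normal(0.0, roughness * normal_force)
    return friction + texture

rng = np.random.default_rng(2)
f = textured_friction_force(tangential_velocity=0.02, normal_force=1.5,
                            mu=0.3, roughness=0.05, rng=rng)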

116 citations


"Recordable Haptic textures" refers background in this paper

  • ...Our method uses the commonly available SensAble PHANToM ® to record textures from real surfaces....


  • ...We use the 3DOF SensAble PHANToM ® to record the textures....


  • ...Keywords – Haptics, Textures, Signal Processing, FFT, PHANToM. I. INTRODUCTION: Modeling interaction behavior is essential and challenging for creating interactive virtual environments....


  • ...This is accomplished by estimating the frequency spectrum of vertical perturbations while dragging the tip of the PHANToM on a real surface laterally....


  • ...Figure 1 shows how the PHANToM is used to explore the surface of a metal file....


Proceedings ArticleDOI
TL;DR: Two new rendering methods for haptic texturing are presented for the implementation of stochastic-based texture models using a 3-DOF point-interaction haptic interface.
Abstract: Recent research in haptic systems has begun to focus on the generation of textures to enhance haptic simulations. Synthetic texture generation can be achieved through the use of stochastic modeling techniques to produce random and pseudo-random texture patterns. These models are based on techniques used in computer graphics texture generation and textured image analysis and modeling. The goal for this project is to synthesize haptic textures that are perceptually distinct. Two new rendering methods for haptic texturing are presented for implementation of stochastic based texture models using a 3 DOF point interaction haptic interface. The synthesized textures can be used in a myriad of applications, including haptic data visualization for blind individuals and overall enhancement of haptic simulations.

79 citations


"Recordable Haptic textures" refers methods in this paper

  • ...To simulate textures on a virtual surface, we employ linear interpolation between these two recorded samples based on the velocity of the tip of the phantom on the virtual surface....
