Proceedings ArticleDOI

Tangible Images: Runtime Generation of Haptic Textures From Images

13 Mar 2008, pp. 357-360
TL;DR: Although the main focus of this paper pertains to haptic interaction with static images, the algorithm developed could also be used to haptically interact with videos and generate textures for virtual surfaces in a simple and efficient manner.
Abstract: This paper describes an algorithm for generating forces from two-dimensional bitmapped digital images; the forces allow the user to feel the contours of the image with sufficient realism using haptic devices. The algorithm can also be used to generate textures for virtual surfaces in a simple and efficient manner. The paper describes the design of the mask used to achieve these two objectives and discusses the factors affecting the performance of the algorithm. Although the main focus of this paper is haptic interaction with static images, the algorithm developed could also be used to haptically interact with videos.
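The gradient-based idea underlying this approach (a citing paper notes that the force is directed along the maximum intensity gradient and scaled by its magnitude) can be sketched as follows. This is a minimal illustration, not the paper's actual mask design: central differences stand in for the mask, and the gain `k` is a hypothetical constant.

```python
import numpy as np

def haptic_force(image, x, y):
    """Return a 2-D force at pixel (x, y): directed along the local
    intensity gradient and proportional to its magnitude."""
    # Central differences stand in for the paper's mask (an assumption).
    gx = (float(image[y, x + 1]) - float(image[y, x - 1])) / 2.0
    gy = (float(image[y + 1, x]) - float(image[y - 1, x])) / 2.0
    k = 0.01  # hypothetical gain mapping intensity units to force units
    return np.array([k * gx, k * gy])
```

On a horizontal intensity ramp this yields a constant force along x and none along y, so flat regions feel smooth and edges push against the probe.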
Citations
Journal ArticleDOI
01 Apr 2019
TL;DR: This paper surveys the paradigm shift in haptic display over the past 30 years, classified into three stages: desktop haptics, surface haptics, and wearable haptics. It also addresses the importance of understanding human haptic perception for designing effective haptic devices.
Abstract: Immersion, interaction, and imagination are three features of virtual reality (VR). Existing VR systems provide fairly realistic visual and auditory feedback but remain poor in haptic feedback, through which humans perceive the rich physical properties of the real world. Haptic display is an interface that aims to enable bilateral signal communication between human and computer, and thus to greatly enhance the immersion and interaction of VR systems. This paper surveys the paradigm shift in haptic display over the past 30 years, classified into three stages: desktop haptics, surface haptics, and wearable haptics. The driving forces, key technologies, and typical applications in each stage are critically reviewed. Toward future high-fidelity VR interaction, research challenges are highlighted concerning handheld haptic devices, multimodal haptic devices, and high-fidelity haptic rendering. Finally, the importance of understanding human haptic perception for designing effective haptic devices is addressed.

98 citations

Journal ArticleDOI
TL;DR: A tactile model for rendering image-textures based on electrovibration is presented, achieved by varying stimulus signals to modulate the friction between the finger and the touchscreen.
Abstract: Image-textures contain most of the surface features of real objects that are largely missing from virtual tactile interactions. This paper presents a tactile model for rendering image-textures based on electrovibration, which is achieved by varying stimulus signals to modulate the friction between the finger and the touchscreen. We experimentally investigate the relationships between human tactile sensation and stimulus signals. Based on these relationships, we establish a mapping model built on the gradients of image-textures, obtained with the Roberts filter. We use the mapping model to synthesize the frequency and amplitude of the stimulus signals for rendering image-textures as a user interacts with our tactile prototype. Specifically, stimulus frequency mainly reflects the hardness and granularity of image-textures, while stimulus amplitude mainly reflects their height. We compare the proposed model with a model based on stimulus amplitude alone through experiments on the prototype. Results show that the proposed model effectively enhances the tactile realism of image-textures.
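The gradient-to-stimulus mapping described above might be sketched as follows. The Roberts cross kernels are standard, but the frequency/amplitude ranges and the linear mapping here are illustrative assumptions; the paper derives its actual mapping from perceptual experiments.

```python
import numpy as np

def roberts_gradient(image, x, y):
    """Roberts cross gradient magnitude at pixel (x, y)."""
    gx = float(image[y, x]) - float(image[y + 1, x + 1])
    gy = float(image[y, x + 1]) - float(image[y + 1, x])
    return np.hypot(gx, gy)

def stimulus(image, x, y, f_min=50.0, f_max=400.0, a_max=120.0):
    """Map the local gradient to a (frequency, amplitude) stimulus pair.
    The ranges and linear mapping are illustrative assumptions."""
    g = min(roberts_gradient(image, x, y), 255.0) / 255.0  # normalise to [0, 1]
    return f_min + g * (f_max - f_min), g * a_max
```

A flat region produces the baseline frequency with zero amplitude (no friction modulation), while a sharp edge drives both toward their maxima.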

35 citations


Cites background from "Tangible Images: Runtime Generation..."

  • ...[11] put forward the mask multiplied by the sub-image to model and render the edge of texture-images....

Proceedings ArticleDOI
29 Oct 2010
TL;DR: A method for generating forces from a two-dimensional static image to make texture tangible using haptic devices is described, based on the Tsai & Shah shape-from-shading algorithm and a new haptic texture rendering model.
Abstract: This paper describes a method for generating forces from a two-dimensional static image to make texture tangible using haptic devices. The proposed technique consists of creating a height map computed with the Tsai & Shah algorithm, a shape-from-shading method. This height map is then used to generate the virtual surface, and the force fields creating the haptic texture are calculated with a new haptic texture rendering model.
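Once a height map is available (the shape-from-shading step is not reproduced here), it can be turned into texture forces with a simple penetration-based spring model. This is an illustrative sketch, not the paper's rendering model; the stiffness `k` is an assumption.

```python
import numpy as np

def texture_force(height, x, y, probe_z, k=1.0):
    """Spring force pushing a probe out of a virtual surface defined by a
    height map (e.g. one recovered by shape-from-shading)."""
    penetration = height[y, x] - probe_z
    if penetration <= 0:
        return np.zeros(3)           # probe is above the surface: no contact
    gy, gx = np.gradient(height)     # surface slope for the lateral terms
    return np.array([-k * gx[y, x], -k * gy[y, x], k * penetration])
```

The lateral terms resist motion uphill, while the vertical term grows with penetration depth, giving the familiar spring-like contact feel.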

18 citations

Proceedings ArticleDOI
12 Dec 2010
TL;DR: A new image-based method to display haptic texture information extracted from a static two-dimensional image, allowing the user to feel the contours and textures of the image with sufficient realism using haptic devices.
Abstract: In this paper, we present a new image-based method to display haptic texture information extracted from a static two-dimensional image. The three-dimensional forces allow the user to feel the contours and textures of the image with sufficient realism using haptic devices. The texture force is decomposed into a normal force component and a tangential force component at the haptic interaction point. The magnitude of the normal force is determined either by the color temperature (i.e., the emotions of warmth/coolness evoked by colors) or by the luminance values in the image. Warmer/brighter colors produce stronger normal forces to generate bumps, and cooler/darker colors produce weaker normal forces to generate depressions. The tangential force describes the relative color variation between the haptic interaction point and its neighbors, and can therefore be considered a descriptor of the local texture feature. Because it is oriented toward the cooler-colored pixels in the neighborhood, tangential resistive forces are applied while the user explores a bump with a haptic device until its top is reached; past the top, the virtual probe is pulled in the opposite direction. Our force model relies on simple image-processing techniques: the calculation consists entirely of local, independent, and direct operations, so no global operation, such as recovering a height map, is necessary.
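The normal/tangential decomposition described above might be sketched on a luminance map as follows. This is an illustrative reading, not the authors' exact formulation; the gains `k_n` and `k_t` and the 4-neighbour rule are assumptions.

```python
import numpy as np

def image_force(lum, x, y, k_n=0.02, k_t=0.01):
    """Force at pixel (x, y) of a luminance map: brighter pixels push back
    harder (bumps); the tangential part points toward the darkest
    4-neighbour, i.e. downhill."""
    fz = k_n * float(lum[y, x])                    # normal component
    neighbours = {(1, 0): lum[y, x + 1], (-1, 0): lum[y, x - 1],
                  (0, 1): lum[y + 1, x], (0, -1): lum[y - 1, x]}
    (dx, dy), darkest = min(neighbours.items(), key=lambda kv: kv[1])
    drop = float(lum[y, x]) - float(darkest)       # local slope magnitude
    return np.array([k_t * drop * dx, k_t * drop * dy, fz])
```

Like the method in the abstract, this uses only local, direct per-pixel operations; no height map is recovered.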

17 citations


Cites methods from "Tangible Images: Runtime Generation..."

  • ...Vasudevan used well known edge detection algorithms from grayscale image processing literature and proposed the design of the haptic rendering mask to allow the user to feel the contours and textures of the image via haptic devices [Vasudevan and Manivannan 2008]....

Proceedings ArticleDOI
11 Dec 2011
TL;DR: This work proposes a framework for making tangible images that allows haptic perception of three features: scene geometry, texture, and physical properties, and proposes dynamic mapping of the haptic workspace in real time to enable the sensation of fine surface details.
Abstract: Visual and haptic rendering pipelines exist concurrently and compete for computing resources, while the refresh rate of haptic rendering is two orders of magnitude higher than that of visual rendering (1000 Hz vs. 30-50 Hz). However, in many cases, 3D visual rendering can be replaced by merely displaying 2D images, thus releasing resources to image-driven haptic rendering algorithms. These algorithms provide haptic texture rendering in the vicinity of a touch point, but usually require additional information augmented with the image to provide haptic perception of the geometry of the shapes displayed in images. We propose a framework for making tangible images that allows haptic perception of three features: scene geometry, texture, and physical properties. The haptic geometry rendering technique uses depth information, which can be acquired in a multitude of ways, to provide haptic interaction with images and videos in real time. The presented method neither performs 3D reconstruction nor requires polygonal models. It is based on direct force calculation and allows smooth haptic interaction even at object boundaries. We also propose dynamic mapping of the haptic workspace in real time to enable the sensation of fine surface details. Alternatively, one of the existing shading-based haptic texture rendering methods can be combined with the proposed haptic geometry rendering algorithm to provide believable interaction. Haptic perception of physical properties is achieved by automatic segmentation of an image into haptic regions and interactive assignment of physical properties to them.

16 citations


Cites background from "Tangible Images: Runtime Generation..."

  • ...In [Vasudevan and Manivannan 2008], for example, at any point on the image a force is generated in the direction of maximum gradient and is proportional to the magnitude of the gradient....

  • ...Different types of information present in an image can be exploited for force generation, such as illumination, visual texture information, color, histogram, luminance/chrominance, etc., [Juan et al. 2007; Kagawa et al. 2007; Vasudevan and Manivannan 2008]....

References
Journal ArticleDOI
TL;DR: An efficient haptic rendering method for displaying the feel of 3-D polyhedral objects in virtual environments (VEs) using a hierarchical database, multithreading techniques, and efficient search procedures to reduce the computational time.
Abstract: Computer haptics, an emerging field of research that is analogous to computer graphics, is concerned with the generation and rendering of haptic virtual objects. In this paper, we propose an efficient haptic rendering method for displaying the feel of 3-D polyhedral objects in virtual environments (VEs). Using this method and a haptic interface device, the users can manually explore and feel the shape and surface details of virtual objects. The main component of our rendering method is the “neighborhood watch” algorithm that takes advantage of precomputed connectivity information for detecting collisions between the end effector of a force-reflecting robot and polyhedral objects in VEs. We use a hierarchical database, multithreading techniques, and efficient search procedures to reduce the computational time such that the haptic servo rate after the first contact is essentially independent of the number of polygons that represent the object. We also propose efficient methods for displaying surface properties of objects such as haptic texture and friction. Our haptic-texturing techniques and friction model can add surface details onto convex or concave 3-D polygonal surfaces. These haptic-rendering techniques can be extended to display dynamics of rigid and deformable objects.
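The "neighborhood watch" idea of walking precomputed mesh adjacency instead of searching all polygons can be sketched abstractly. This is an illustrative greedy walk under assumed data structures (`triangles`, a `neighbors` adjacency map, and a `dist` callback), not the paper's implementation.

```python
def neighborhood_watch(triangles, neighbors, start, probe, dist):
    """Greedy walk over a mesh's precomputed adjacency: from the last
    contact triangle, keep moving to whichever neighbour is closest to
    the probe. Cost depends on local connectivity, not on the total
    polygon count, mirroring the servo-rate independence claimed above."""
    current = start
    while True:
        best = min(neighbors[current], default=current,
                   key=lambda t: dist(triangles[t], probe))
        if dist(triangles[best], probe) >= dist(triangles[current], probe):
            return current
        current = best
```

Because the probe moves only a tiny distance between 1 kHz servo ticks, the walk usually terminates after checking a handful of neighbours.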

201 citations

Proceedings ArticleDOI
22 Apr 1996
TL;DR: The authors present a simple, fast algorithm to synthesize haptic textures from statistical properties of surfaces, which has been successfully implemented on a two-degree-of-freedom haptic interface (the Pantograph).
Abstract: All objects have a surface roughness which manifests itself as small forces when objects slide under load against each other. Simulating this roughness haptically enriches the interaction between a user and a virtual world, just as creating graphical textures enhances the depiction of a scene. As with graphical textures, a major design constraint for haptic textures is the generation of a sufficiently "realistic" texture given hard constraints on computational costs. The authors present a simple, fast algorithm to synthesize haptic textures from statistical properties of surfaces. The synthesized texture can be overlaid on other contact models, such as hard contact with Coulomb friction. The algorithm requires minimal hardware support, and can be implemented on a variety of force-feedback mechanisms. It has been successfully implemented on a two-degree-of-freedom haptic interface (the Pantograph).
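Synthesizing a texture from statistical surface properties can be illustrated by low-pass filtering white noise to a target RMS height. This is a generic sketch, not the Pantograph implementation; the moving-average filter, the parameters, and the slope-based force law are assumptions.

```python
import numpy as np

def make_texture(rms_height=0.1, corr_len=8, n=256, seed=0):
    """Synthesize a 1-D texture profile whose roughness is set by two
    statistics: RMS height and a correlation length."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n)
    kernel = np.ones(corr_len) / corr_len          # moving-average low-pass
    profile = np.convolve(noise, kernel, mode="same")
    return profile * (rms_height / profile.std())  # enforce target RMS

def lateral_force(profile, x, k=50.0):
    """Sliding force proportional to the local slope of the profile."""
    i = int(x) % (len(profile) - 1)
    return -k * (profile[i + 1] - profile[i])
```

The slope-proportional force can be overlaid on any base contact model (e.g. hard contact with Coulomb friction), as the abstract describes.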

116 citations

Proceedings ArticleDOI
TL;DR: Two new rendering methods for haptic texturing are presented for implementing stochastic-based texture models using a 3-DOF point-interaction haptic interface.
Abstract: Recent research in haptic systems has begun to focus on the generation of textures to enhance haptic simulations. Synthetic texture generation can be achieved through the use of stochastic modeling techniques to produce random and pseudo-random texture patterns. These models are based on techniques used in computer-graphics texture generation and textured-image analysis and modeling. The goal of this project is to synthesize haptic textures that are perceptually distinct. Two new rendering methods for haptic texturing are presented for implementing stochastic-based texture models using a 3-DOF point-interaction haptic interface. The synthesized textures can be used in a myriad of applications, including haptic data visualization for blind individuals and the overall enhancement of haptic simulations.

79 citations


"Tangible Images: Runtime Generation..." refers background in this paper

  • ...Fritz and Barner [3] follow this up by presenting two stochastic models to generate haptic textures....

  • ...so far [3, 9, 8, 7, 1] need geometric and other information....

Proceedings ArticleDOI
10 Oct 2004
TL;DR: This work presents a new algorithm to display haptic texture information resulting from the interaction between two textured objects that is able to haptically display interaction due to fine surface textures that previous algorithms do not capture.
Abstract: Surface texture is among the most salient haptic characteristics of objects; it can induce vibratory contact forces that lead to the perception of roughness. In this paper, we present a new algorithm to display haptic texture information resulting from the interaction between two textured objects. We compute contact forces and torques using low-resolution geometric representations along with texture images that encode surface details. We also introduce a novel force model based on directional penetration depth and describe an efficient implementation on programmable graphics hardware that enables interactive haptic texture rendering of complex models. Our force model takes into account important factors identified by psychophysics studies and is able to haptically display interaction due to fine surface textures that previous algorithms do not capture.

74 citations

Proceedings ArticleDOI
07 Aug 2004
TL;DR: The first force model for haptic display of interaction between two textured objects is developed and it is shown that the model captures similar effects to those observed in the earlier experiments on roughness perception.
Abstract: One of the most salient haptic characteristics of objects is surface texture. Psychophysics studies have identified several key factors that affect perception of roughness during exploration of surface textures. Inspired by these recent findings, we develop the first force model for haptic display of interaction between two textured objects. We describe how our force model accounts for important elements identified by psychophysics studies. We then analyze and validate our model by comparing our simulation results against actual perceptual studies. We show that our model captures similar effects to those observed in the earlier experiments on roughness perception.

40 citations