
HAL Id: hal-01848314
https://hal.archives-ouvertes.fr/hal-01848314
Submitted on 24 Jul 2018
HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.
Playing with senses in VR: Alternate perceptions
combining vision and touch
Anatole Lécuyer
To cite this version:
Anatole Lécuyer. Playing with senses in VR: Alternate perceptions combining vision and touch. IEEE
Computer Graphics and Applications, Institute of Electrical and Electronics Engineers, 2017, 37 (1),
pp. 20-26. DOI: 10.1109/MCG.2017.14. hal-01848314.

Playing with senses in VR:
Alternate perceptions combining vision and touch
Anatole Lécuyer, Inria, Rennes, France
anatole.lecuyer@inria.fr
Abstract: Virtual Reality is an immersive experience based on computer-generated stimulations
perceived with multiple sensory channels. It is possible to manipulate these sensory stimulations
independently and create conflicting situations in which, for instance, vision and touch are spatially
and/or temporally inconsistent. In this article we show how to exploit these ambiguous sensory situations in order to generate new kinds of percepts as well as plausible 3D interactions in virtual environments. We focus in particular on three results obtained by playing with the visual and haptic senses in virtual reality: (i) pseudo-haptic effects, (ii) self-motion sensations, and (iii) body-ownership illusions.
Conflicts and coherence of senses in virtual reality
Virtual Reality (VR) technologies aim at generating the sensory illusion of an alternate reality: being located in a different place, or interacting with objects or characters that are not physically present in the user's real surroundings. A sensory illusion is a matter of interpretation. Each sense sends messages that are consistent with the stimulation it receives, but the combination of these messages is somehow inconsistent, and the brain is fooled in its interpretation and final percept (Berthoz, 2002).
In this quest for a proper sensory illusion, VR settings put the user in a situation of psychological conflict between two situations and two lived experiences: the real one (in the physical setup) and the artificial one (in the virtual environment). In general, the real and virtual situations share common properties, such as the same floor shape, so that the fusion or transition between them is smoother. The real setup remains mostly stable over time, and everything is done to make it imperceptible, which is literally the case when putting on a head-mounted display (HMD). So-called "breaks in presence" can occur whenever a discrepancy between the virtual and the real situations is noticed or when the virtual stimulation becomes less reliable, e.g., with high latency (Slater, 2000).
When it works, and the psychological conflict is resolved in favor of the virtual situation, the resulting sensory experience and subsequent feeling of immersion can be very strong. This can be revealed by the physiological or behavioral reactions of users. A good example is the vertigo felt when standing at the top of a virtual pit and looking down. People often refuse to jump into the virtual void, even though they know they are safely standing on a flat floor in reality. Interestingly, when a real physical edge is added on the floor, for instance using a wooden plate, the physiological reaction increases significantly compared to the situation with no physical edge (Meehan, 2002). Thus, the vertigo sensation is stronger in the presence of an additional, consistent tactile cue: the feeling of immersion is higher when there is sensory redundancy in VR.
An immersive experience relies on realistic sensory stimulations: essentially visual, but sometimes auditory or tactile (haptic). In practice, it is often impossible to perfectly reproduce a multi-sensory experience in VR with all the sensory stimulations involved. Paradoxically, the VR experience is usually a situation of "sensory deprivation": a perceptual isolation in which several senses are removed or cut off. Incidentally, in terms of interaction capabilities, the user is largely in a situation of handicap, unable to perform basic operations and provided with only limited possibilities of perception and action.

Figure 1: What happens when vision and touch are not collocated? In our experiment (Congedo, 2006), participants could watch and grasp a rotating virtual handle under two different conditions: VHc (Visual and Haptic information spatially Collocated, as in HMD-based VR settings) and VHd (Vision and Haptics "De-located", as in screen-based VR settings).
Sensory redundancy is an effective means of achieving stronger immersion, but the sensory stimulations are expected to remain spatially and temporally consistent. A spatial or temporal discrepancy between sensory sources is expected to decrease the plausibility of the virtual experience. In (Congedo, 2006), we showed that when visual and haptic information are not spatially collocated, multi-sensory integration is negatively impacted, and the weight given to the haptic modality decreases strongly in favor of vision (see Figure 1). The spatial offset between the visual and haptic displays ends up masking the haptic sensation. Interestingly, designers of VR systems can relate these findings to the relevance of the two sensory channels. For instance, if the contribution of touch is important for the task, great effort should be made to collocate the visual and haptic percepts as much as possible. On the other hand, with low-quality haptic feedback, it may be preferable to contain the limitations of the haptic device by keeping the two displays apart.
Virtual Reality can be used to create experimental situations that can hardly be reproduced in a real setup, such as artificial sensory conflicts. A sensory conflict implies that the information coming from one modality differs from the information coming from another. Sensory conflicts can be a source of problems in VR: according to the sensory conflict theory, "cybersickness" results from an inconsistency between the visual and the vestibular or proprioceptive senses. But sensory conflicts can also help psychologists better understand how humans perceive multi-sensory information, for instance by making it possible to compute the relative weights attributed to the various sensory channels (Ernst, 2002). For spatial interaction tasks, for example, visuo-haptic perception was found to be characterized by a strong visual dominance (Rock, 1964). In this context, the concept of sensory coherence is also central when perceiving and representing the environment with multiple senses. In this perspective, sensory signals are not processed to directly estimate the relevant variables, but rather to estimate the difference between mental estimates and the relevant variables (Cornilleau-Pérès, 1993).
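As an illustration of how such relative weights can be expressed (this is the standard maximum-likelihood formulation of cue integration, stated here for clarity rather than quoted from the cited paper), the combined visuo-haptic estimate of a spatial property S is a reliability-weighted average of the unimodal estimates:

\hat{S}_{VH} = w_V\,\hat{S}_V + w_H\,\hat{S}_H, \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad w_H = 1 - w_V

where \hat{S}_V and \hat{S}_H are the visual and haptic estimates and \sigma_V^2 and \sigma_H^2 their respective variances. In this view, the visual dominance reported for spatial tasks simply reflects the fact that \sigma_V^2 is typically much smaller than \sigma_H^2.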
In this article, we will show how sensory conflicts can become a source of inspiration for VR designers and, more generally, how combinations of visual and haptic feedback can be exploited to generate alternate percepts and novel 3D interaction schemes. We will focus on three series of results obtained recently on this research topic: pseudo-haptic effects, self-motion sensations, and body-ownership illusions. All of them were obtained by playing with redundant or conflicting visual and haptic cues in virtual environments, systematically bordering on sensory illusions.
Haptic illusions and “pseudo-haptic feedback”
As a first example, "pseudo-haptic feedback" aims to produce a wide range of haptic sensations, such as friction, relief, or stiffness, without using a haptic interface, by playing with visual feedback instead.

We state four key assertions concerning pseudo-haptic feedback (Lécuyer, 2009). First, pseudo-haptic
feedback implies one or more sensory conflicts between visual and haptic cues. Second, pseudo-haptic
feedback relies on the sensory dominance of vision over touch when perceiving spatial properties
(distance, position, size, displacement, etc.). Third, pseudo-haptic feedback corresponds to a new and
coherent representation of the environment resulting from a combination of haptic and visual
information. Fourth, pseudo-haptic feedback can create a haptic illusion, i.e., the perception of a haptic
property different from the one present in the real environment.
Since our first article on this topic, published in 2000 (Lécuyer, 2009), we have designed and studied numerous examples of pseudo-haptic effects. We provide hereafter a representative set of successful studies and setups.
The most famous technique based on pseudo-haptic feedback was originally designed in a 2D context: the "pseudo-haptic textures" (Lécuyer, 2009). This pseudo-haptic effect was meant to display the relief of 2D images using a simple computer mouse. As the user manipulates the mouse, the technique alters the cursor's visual motion as it moves over the image, i.e., it manipulates the Control/Display ratio. To create the impression that the cursor is climbing a slope, it is slowed down; conversely, to simulate the cursor sliding down a slope, it is sped up. For example, to simulate the cursor moving over a bump (as illustrated in Figure 2A), the cursor is slowed down until it reaches the top of the bump. Once past the top, the cursor accelerates until it reaches the foot of the bump, after which it returns to its normal speed. This technique has been evaluated in an extensive series of experiments which demonstrated that participants were able to reliably recognize and accurately draw texture patterns simulated with pseudo-haptic textures. Later on, we proposed an extension of this technique called "Elastic Images", which aims at simulating the local elasticity of images (Argelaguet, 2013). The elasticity sensation is generated by a procedural image deformation algorithm that modifies the image according to its simulated physical properties and to the virtual pressure exerted by the user (see Figure 2C). The simulated pressure depends on how long the user keeps the mouse button pressed. A psychophysical experiment showed that users were able to recognize up to eight different elasticity configurations.
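To make the Control/Display-ratio manipulation concrete, here is a minimal sketch of the idea in Python (an illustration under assumed parameters, not the published implementation: the bump profile, the exponential gain, and the clamping bounds are all hypothetical choices):

import math

K_GAIN = 2.0                       # how strongly the local slope modulates cursor speed
MIN_RATIO, MAX_RATIO = 0.3, 3.0    # keep the cursor controllable in all cases

def bump_height(x, y, cx=200.0, cy=200.0, radius=80.0, height=30.0):
    # Height map of a single circular bump centered at (cx, cy), in pixels.
    d = math.hypot(x - cx, y - cy)
    if d >= radius:
        return 0.0
    # Smooth cosine profile: maximal at the center, zero at the rim.
    return height * 0.5 * (1.0 + math.cos(math.pi * d / radius))

def move_cursor(x, y, dx, dy):
    # Scale the raw mouse displacement (dx, dy) by a slope-dependent C/D ratio.
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return x, y
    # Finite-difference slope of the height map along the motion direction.
    slope = (bump_height(x + dx, y + dy) - bump_height(x, y)) / dist
    # Climbing (slope > 0) slows the cursor down; descending speeds it up.
    ratio = min(MAX_RATIO, max(MIN_RATIO, math.exp(-K_GAIN * slope)))
    return x + dx * ratio, y + dy * ratio

In the actual technique the ratio would be driven by the relief map of the displayed image rather than by an analytic bump, but the principle is the same: the gain between physical and visual motion is modulated while their synchrony is preserved.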
More recently, pseudo-haptic feedback has been studied in the context of 3D interaction with virtual environments, to improve the selection or manipulation of virtual objects. A first example, illustrated in Figure 2B, improves the selection of items enclosed in a 3D carousel designed for virtual showcasing (Gaucher, 2013). The carousel is a 3D ring menu rendered on a 3D display. Interaction with the carousel is achieved by tracking the user's gestures: to rotate the carousel, the user performs swipe gestures. Our pseudo-haptic effect is introduced to highlight relevant items, such as promotional products, by locally modifying the friction of the carousel. This effect is expected to "attract" the user towards these specific items when interacting with the carousel: when facing an item with a strong friction coefficient, the user must increase the amplitude of the hand movement to move to the following or previous item (see the sketch after this paragraph). A second example is an interaction paradigm called the "Virtual Mitten" (Achibet, 2014). It is meant to simulate the 3D manipulation of objects using grip forces. It is based on the passive haptic feedback provided by a handheld elastic input device (an engineered hand-exerciser) and on the visual metaphor of a mitten that enables the user to grasp and manipulate 3D objects (see Figure 2E). The grip force exerted on the device enables the user to grasp objects and achieve various manipulation tasks, such as opening a drawer or pulling a lever. The grasping performed by the virtual mitten is directly correlated with the grip force applied on the elastic device. A pseudo-haptic effect is then introduced in order to generate the haptic perception of different levels of grasping effort. A psychophysical experiment showed that participants were well able to perceive different levels of effort during several manipulation tasks thanks to this pseudo-haptic approach.
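The carousel friction effect can be sketched as follows (a hypothetical reconstruction rather than the system described in (Gaucher, 2013); the item names, friction coefficients, and swipe amplitude are assumptions): the hand travel required to move past the currently faced item is simply scaled by that item's friction coefficient.

BASE_SWIPE_AMPLITUDE = 0.15   # hand travel (in meters) to pass an item of friction 1.0

class PseudoHapticCarousel:
    # Sketch of friction-modulated carousel rotation; items and values are illustrative.
    def __init__(self, items, frictions):
        self.items = items
        self.frictions = frictions    # one coefficient per item; > 1.0 means "sticky"
        self.index = 0
        self.travel = 0.0             # signed hand travel since the last item change

    def on_swipe(self, hand_displacement):
        # Accumulate hand travel; advance only once the friction-scaled amplitude is reached.
        self.travel += hand_displacement
        required = BASE_SWIPE_AMPLITUDE * self.frictions[self.index]
        if abs(self.travel) >= required:
            step = 1 if self.travel > 0 else -1
            self.index = (self.index + step) % len(self.items)
            self.travel = 0.0
        return self.items[self.index]

# The promoted item gets a higher friction coefficient, so the carousel tends to
# "settle" on it unless the user deliberately makes a larger gesture.
carousel = PseudoHapticCarousel(
    items=["watch", "promoted phone", "camera", "headphones"],
    frictions=[1.0, 2.5, 1.0, 1.0])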
Pseudo-haptic feedback has also been introduced for simulating perception of and interaction with a highly complex 3D object: the user's self-avatar. We introduced the notion of "Pseudo-Haptic Avatars" (Gomez, 2014) and showed how the visual animation of a self-avatar could be artificially modified in real time in order to generate different haptic perceptions. In our experimental setup, participants could watch their self-avatar in a virtual environment in mirror mode (see Figure 2D). Their gestures were mapped onto the self-animated avatar in real time using a Kinect. The experimental task consisted in lifting virtual dumbbells that participants manipulated by means of a tangible stick. We tested three kinds of modification of the visual animation of the self-avatar: an amplification (or reduction) of the user's motion, a change in the dynamic profile of the motion (temporal animation), and a change in the posture of the avatar (angle of inclination). Thus, to simulate the lifting of a "heavy" dumbbell, the avatar animation was distorted in real time using a reduction of the user's visual motion, slower dynamics, and a larger angle of inclination of the avatar. Experimental results showed that users were well able to discriminate weights with this pseudo-haptic feedback, relying only on the avatar's motion and posture. This technique could for instance be used in applications such as sport training, exercise games, or industrial training.
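As an illustration of how such a distortion can be parameterized (a minimal sketch under assumed scaling laws, not the mapping used in (Gomez, 2014)), the amplitude and the temporal profile of the tracked lifting motion can both be scaled by the simulated mass before driving the avatar:

def distort_lift(user_elbow_angles, dt, mass_kg, reference_mass_kg=2.0):
    # Sketch of a pseudo-haptic weight distortion applied to a lifting gesture.
    # The scaling laws and constants are illustrative assumptions, not the published mapping.
    # user_elbow_angles: tracked elbow angles (degrees); dt: capture time step (seconds).
    heaviness = mass_kg / reference_mass_kg
    motion_gain = min(1.0, 1.0 / heaviness)      # heavier => reduced visual excursion
    time_stretch = max(1.0, heaviness ** 0.5)    # heavier => slower displayed dynamics

    start = user_elbow_angles[0]
    avatar_angles = [start + (a - start) * motion_gain for a in user_elbow_angles]
    # The third distortion used in the study (a change of avatar posture/inclination)
    # would be applied at the skeleton level and is not sketched here.
    return avatar_angles, dt * time_stretch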
Figure 2: Pseudo-haptic effects. (A) Pseudo-haptic texture: simulating passage over a bump shape on screen by playing with the speed of the mouse cursor. (B) 3D carousel: a carousel-like ring menu is rotated using swipe gestures and is augmented with pseudo-haptic friction effects that can repel the user from, or attract the user towards, predetermined items. (C) Elastic image: simulating the local elasticity of a 2D image with a procedural deformation algorithm, using a "pressure" corresponding to the time elapsed while clicking. (D) Pseudo-haptic avatar: the user lifts different virtual dumbbells using Kinect-based gesture recognition and perceives different virtual weights through pseudo-haptic effects applied to the self-avatar's visual animation. (E) Virtual Mitten: an engineered hand-exerciser is used to grasp and manipulate virtual objects via a mitten metaphor augmented with pseudo-haptic effects enabling the user to feel different levels of grasping effort.
All these examples illustrate how a spatiotemporal sensory conflict, introduced and well controlled within the perception-action loop, can produce a wide range of haptic sensations and improve 3D interaction. The visual motion is distorted here in synchrony with the user's physical motion or sensorimotor action. The resulting pseudo-haptic percept corresponds to the subjective reinterpretation of these stimuli, and to an optimal visuo-haptic perception of a world which must remain coherent, depending on the interaction context. Interestingly, a similar decrease in speed will be interpreted in one context as a texture effect and in another context as a change in mass. This suggests that many

References
M.O. Ernst and M.S. Banks, "Humans integrate visual and haptic information in a statistically optimal fashion," Nature, 2002.
M. Botvinick and J. Cohen, "Rubber hands 'feel' touch that eyes see," Nature, 1998.
I. Rock and J. Victor, "Vision and touch: an experimentally created conflict between the two senses," Science, 1964.
M. Meehan, B. Insko, M. Whitton, and F.P. Brooks Jr., "Physiological measures of presence in stressful virtual environments," ACM Transactions on Graphics (Proc. SIGGRAPH), 2002.
M. Slater and A. Steed, "A Virtual Presence Counter," Presence: Teleoperators and Virtual Environments, 2000.