NormalTouch and TextureTouch: High-fidelity 3D Haptic
Shape Rendering on Handheld Virtual Reality Controllers
Hrvoje Benko, Christian Holz, Mike Sinclair, Eyal Ofek
Microsoft Research, Redmond, WA, USA
{benko, cholz, sinclair, eyalofek}@microsoft.com
ABSTRACT
We present an investigation of mechanically-actuated hand-
held controllers that render the shape of virtual objects
through physical shape displacement, enabling users to feel
3D surfaces, textures, and forces that match the visual ren-
dering. We demonstrate two such controllers, NormalTouch
and TextureTouch. Both controllers are tracked with 6 DOF
and produce spatially-registered haptic feedback to a user’s
finger. NormalTouch haptically renders object surfaces and
provides force feedback using a tiltable and extrudable plat-
form. TextureTouch renders the shape of virtual objects in-
cluding detailed surface structure through a 4×4 matrix of
actuated pins. By moving our controllers around in space
while keeping their finger on the actuated platform, users ob-
tain the impression of a much larger 3D shape by cognitively
integrating output sensations over time. Our evaluation com-
pares the effectiveness of our controllers with the two de-
facto standards in Virtual Reality controllers: device vibra-
tion and visual feedback only. We find that haptic feedback
significantly increases the accuracy of VR interaction, most
effectively by rendering high-fidelity shape output as in the
case of our controllers. Participants also generally found
NormalTouch and TextureTouch realistic in conveying the
sense of touch for a variety of 3D objects.
Author Keywords
Haptics; Controller Design; Tactile Display; Virtual Reality.
ACM Classification Keywords
H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems - Artificial, Augmented, and Virtual Realities; H.5.2 [User Interfaces]: Haptic I/O.
INTRODUCTION
The capabilities of current devices to render meaningful hap-
tics lag far behind their abilities to render highly realistic vis-
ual or audio content. In fact, the de-facto standard of haptic
output on commodity devices is vibrotactile feedback (e.g.,
built into mobile devices and game controllers). While ubiq-
uitous and small, these vibrotactile actuators produce haptic
sensations by varying the duration and intensity of vibra-
tions. This makes them well suited for user-interface notifi-
cations, but fairly limited in conveying a sense of shape,
force, or surface structure.
In Virtual Reality (VR), higher fidelity haptic rendering be-
yond vibrotactile feedback has been extensively explored
through actuated gloves [11], exoskeletons [6, 10], or sta-
tionary robotic arms [13, 25, 26, 34]. While these solutions
offer richer haptic rendering, they limit the convenience of
use because they either restrict the user to a small working
area or they require users to put on and wear additional gear.
As a result, handheld controllers—not gloves or exoskele-
tons—have emerged as the dominant interaction interface for
current VR devices and applications (e.g., Oculus Rift, HTC
Vive, and Sony PlayStation VR). The haptic feedback these
VR controllers provide, however, is vibrotactile—much like
on mobile phones and regular game controllers.
In this paper, we explore haptic 3D shape output on handheld
controllers that enables users to feel shapes, surfaces, forces,
and surface textures. We present two novel devices, Nor-
malTouch and TextureTouch, each using a different actuation
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.
UIST '16, October 16-19, 2016, Tokyo, Japan
Copyright is held by the owner/author(s). Publication rights licensed to ACM.
ACM 978-1-4503-4189-9/16/10…$15.00
DOI: http://dx.doi.org/10.1145/2984511.2984526
Figure 1: (a) Our 3D haptic shape controllers allow the Virtual Reality user to touch and feel what they would other-
wise only see. (b) Our controllers enable users to explore virtual 3D objects with their finger. (c) NormalTouch renders
the surface height and orientation using a tiltable and height-adjustable platform. (d) TextureTouch renders the
detailed surface texture of virtual objects using a 4×4 pin array, which users experience on their finger pad.

method to render haptic 3D shape output. As shown in Figure
1c, NormalTouch renders objects’ 3D surfaces and provides
force feedback to touch input using an active tiltable and ex-
trudable platform, on which the user rests their finger. Tex-
tureTouch (Figure 1d) houses a 4×4 matrix of actuated pins
underneath the user’s fingertip that individually render the
3D shape of virtual objects, including the coarse structure of
the surface texture. While we chose VR as an immersive en-
vironment for integrating our controllers, they would be
equally suitable for haptic output in other scenarios, such as
video games, 3D modeling applications, teleoperation, or
Augmented Reality.
In contrast to previous approaches, our controllers integrate
shape output inside a lightweight tracked handheld form fac-
tor as shown in Figure 1a. The low weight of the controllers
and the ability to track them in 3D throughout a larger envi-
ronment enables users to obtain the sensation of much larger
shapes by freely moving the controllers around and mentally
integrating output sensations over time. We argue that this
combination of haptic 3D shape output, movability, and 3D
tracking produces a much higher-fidelity immersion in vir-
tual scenes than other methods. Using our controllers, users
explore 3D scenes tactually in addition to visually, by feeling
the virtual objects and surfaces around them with their finger
(Figure 1b). Users can hold the controllers in either hand and
comfortably explore haptic output using either their index
finger or thumb.
Below, we describe the implementation and design of our
haptic 3D shape output controllers, their haptic rendering ca-
pabilities, as well as their implications for interaction in vir-
tual environments. Finally, we report the results of our three-
part evaluation that compared our controllers against vi-
brotactile feedback and a visual-only baseline condition. Our results indicate that users get a significantly more accurate sense of virtual objects using haptic feedback, augmenting their perception of shape in virtual 3D scenes.
Contributions
Our paper makes the following four specific contributions:
1. NormalTouch, a handheld controller that renders haptics
through an active tiltable and extrudable platform and
senses force input from the user upon touch.
2. TextureTouch, a handheld controller that renders 3D surface structure via a 4×4 array of actuated pins.
3. The integration of shape controllers in a VR system as
well as a series of solutions to interaction challenges,
such as object penetration and dynamic object behavior.
4. A user study comparing our two controllers with a vis-
ual-only and a vibrotactile feedback baseline, showing
gains in accuracy and fidelity of haptic feedback.
RELATED WORK
There is a wide spectrum of haptic solutions, each with
unique benefits and limitations. We focus our review of the
related work on haptic devices that provide feedback to the
user’s hand in VR, wearable and mobile haptics, and tactile
array displays.
Hand Haptics in Virtual Reality
Our research shares the same primary goal as many other
hand haptics VR devices: to effectively render collisions,
shapes and forces between the user’s hand and the virtual
scene.
As discussed in the introduction, the most widely used form
of haptic feedback to the hand is vibrotactile actuation. Vibrotactile actuators (including voice coils, eccentric-weight motors, and solenoids) are commonly integrated into handheld controllers (e.g., HTC Vive, PlayStation Move, and Nintendo Wii controllers), styluses [22, 23], or gloves [11]. For example, the commercially available CyberTouch glove [11] uses six vibrating motors, one at each fingertip and one on the palm, to render simulated tactile sensations to the hand. While most commonly used for simple touch notifications, vibrotactile actuators have been used
to render an illusion of one-dimensional force [30] and have
also been found effective in rendering varying surface stiff-
ness in virtual environments [37]. In our experiments, we
used vibrotactile actuation as a baseline technique to com-
pare against our shape rendering haptic actuators.
Larger forces and collisions have traditionally been rendered
in VR by actuated articulated arms (e.g., PHANToM [25],
Haptic Master [34], Virtuose 6D [13], Falcon [26], Snake
Charmer [2]). More recently, a robotic arm actuator has been
combined with a touch display in TouchMover 2.0 [32],
which is capable of rendering both large forces in one dimen-
sion as well as haptic texture feedback via two voice-coils
mounted on the display. While such devices render forces
and collisions with relatively high fidelity, they sacrifice mo-
bility and offer very restricted operating space. In particular,
haptic arms with a physical earth reference, such as the PHANToM device, are well suited for teleoperation of robots or tele-surgery where the user is stationary [5].
Large forces can also be rendered to the hand via the use of
glove-based exoskeletons [6, 10]. CyberGrasp glove [10]
uses five tendon actuators routed to the fingertips via the ex-
oskeleton to add resistive force to each finger and prevent the
user’s fingers from penetrating a virtual object. The Rutgers
Master II-ND [6] is a haptic glove that uses pneumatic actu-
ators between the palm and the fingers to render large grasp-
ing forces to the fingertip.
Our NormalTouch device is closest to the Marionette [21],
which uses tilt-platforms to convey the surface normal un-
derneath four fingertips while the user is moving the device
like a mouse. In contrast to the Marionette, our work focuses
on handheld 3D interactions in VR scenarios. We were in-
spired by the effectiveness of tilt platforms in conveying the
surface normal underneath the fingertip [1, 38].
In contrast to the actively actuated haptic devices, passive
haptics can also be used to provide highly realistic haptic
sensations in VR. With passive haptics, a stand-in physical

proxy object may provide appropriate haptics for a rendered
virtual object. For example, Azmandian et al. [3] recently
demonstrated how a user can be redirected to reuse the same
passive physical proxy for multiple virtual objects.
Wearable and Mobile Haptics
It is well understood in the haptics literature that the stimuli experienced by the hand when holding or exploring the shape of an object have both kinesthetic and cutaneous components [14].
and provides the user with information about the relative po-
sition of the parts of their body (e.g., joints). Cutaneous in-
formation is felt by the pressure receptors in the skin and is
a direct measure of the direction and intensity of contact
forces as well as texture.
Other than the previously mentioned exoskeleton gloves,
most of the wearable haptic actuators in the literature focus
on rendering cutaneous stimuli. Prattichizzo et al. [28] offer
a wearable haptic device that uses a three-string actuated
platform capable of rendering cutaneous forces at the finger-
tip. Choi et al. [9] have developed a haptic actuator as a wear-
able Braille display based on dielectric elastomer which can
be manufactured on a flexible substrate and wrapped around
the fingertip. Similarly, Brewster and Brown [7] produced
small wearable “Tactons”, capable of rendering non-visual
messages through Braille-like miniature pins. Velasquez et
al. provide a comprehensive survey of similar haptic technol-
ogies targeted at the blind population [35]. In contrast, our
devices are not wearable, but held in the hand, tracked with
6 DOF, and able to provide both kinesthetic and cutaneous
feedback.
Cutaneous feedback has also been explored in mobile device
form factors. Luk et al. [24] present a handheld haptic display
platform based on the concept of lateral skin stretch. Hem-
mert et al. [15] applied shape changing and weight shifting
in conceptual mobile devices to convey the sense of direction
during interaction with the device.
UltraHaptics [8] demonstrated another related technology for
providing ultrasonically created haptics to the hand in mid-
air without the need for the user to hold a device. While
promising technology, the sensations created are subtle and
the working area is very restricted.
Tactile Array Displays
Our TextureTouch device builds upon the rich history of
Tactile Array displays which use an array of electro-mechan-
ically actuated pins/rods to render a dense tactile surface [12,
18, 27, 29, 36]. For example, the Exeter touch array [33] used piezo actuators to move 100 small pins within a 1.5 cm square area underneath the fingertip. The Lumen device used a coarser 13×13 array of illuminated rods to explore on-demand UI elements [29]. inFORM [12] explored the affordances of shape and object actuation using a larger tactile array display with 30×30 actuated "pixels" covering an area of approximately 15 square inches. Recently, Jang et al. [19] used a one-dimensional actuated tactile array integrated along the edge of a conceptual smartphone to convey haptic notifications.
Closest to our TextureTouch are solutions that mounted a
tactile array display on a stylus. For example, UbiPen [22]
had a tactile array on a stylus that added texture in addition
to vibrotactile feedback when using the pen on the tablet.
Kim et al. attached a similar stylus to a PHANToM device [25] for a palpation simulation application [20]. In contrast to that work, TextureTouch explores interaction capabilities in much less constrained VR scenarios, where the tactile array is integrated into a highly movable, 3D-tracked handheld controller.
HANDHELD CONTROLLERS FOR 3D SHAPE OUTPUT
The design goal of our haptic controllers is seamless use in a virtual environment. As such, they satisfy three requirements: 1) they deliver 3D shape output in a handheld form factor, 2) they are compact and lightweight to facilitate unencumbered mid-air operation, and 3) they provide human-scale forces when rendering 3D shapes, for both cutaneous feedback (i.e., haptic sensations on the finger surface) and kinesthetic feedback (the sensation of actuating and displacing the finger).
Figure 2: Five device prototypes (three NormalTouch versions and two TextureTouch versions) produced during our iterative process. Each design built on lessons from the previous version, improving weight, robustness, and mobility.
NormalTouch TextureTouch
Basicprinciple Tiltplatform Tactilearray
Heightdynamicrange 2.6cm 1.4 cm
Renderingresolution 1movingplatform 4×4pinarray
Actuatedarea 3.3 cmdiameterdisk 1.3×1.3 cm
2
Maximumrenderedangle 45degrees 90degrees
Weight 150g 600g
Dimensions(excl.markers) 7×20×5cm
17×18×5cm
Fingerforcesensing Yes(FSR) No
Table 1: Comparison of haptic shape controller properties.
We identified two promising technologies that meet these
goals: tilt platforms [1, 21, 38] and tactile arrays [12, 18, 27,
29, 36]. While tilt platforms better render surface normals
and are simpler to implement, tactile arrays render features
smaller than the user’s finger using individual pins. Through
an iterative design process of the prototypes shown in Figure
2, we implemented two fully functioning prototype control-
lers, each one built around one of these two core technolo-
gies. Table 1 summarizes the main properties of each of our
two haptic shape controllers: NormalTouch and Tex-
tureTouch. For illustration purposes in the descriptions be-
low, we assume that the user moves their finger and the con-
troller in a virtual scene with a variety of 3D objects. In the

virtual environment, the user's hand is represented by a 3D hand model with matching 3D position and orientation.
NormalTouch: A 3D Tiltable and Extrudable Platform
As shown in Figure 3, the core of NormalTouch is an acetal
(Delrin) platform that is actuated by three servo motors. A
force sensor inside the disk detects touch input at a range of
forces. The handle of the controller encloses all electronics,
including the motor controller. The small retroreflective
spheres mounted around the motors serve as markers to track
NormalTouch in 3D with surrounding cameras.
Figure 3: (left) NormalTouch during interaction.
(right) Close-up of the tiltable and extrudable platform.
When the controller moves through the virtual scene and makes contact with a virtual object, NormalTouch replicates the surface normal of that object. NormalTouch's default state is a fully retracted platform. As soon as the user makes contact with a virtual object, NormalTouch tilts its platform to the relative 3D orientation of the object's surface and extrudes the platform according to the user's movement of the controller in physical space. This keeps the user's finger in the same 3D position, outside the virtual object's boundary, which is registered in physical space as shown in Figure 4.
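As a concrete illustration, the following minimal Python sketch shows this per-frame platform logic, assuming a collision query has already produced a penetration depth and contact normal; the function and argument names are hypothetical, and only the 2.6 cm range is taken from Table 1.

```python
import numpy as np

def normaltouch_pose(penetration_depth, surface_normal_world,
                     controller_rotation, max_extension=0.026):
    """Per-frame platform target for NormalTouch (illustrative sketch).

    penetration_depth:    how far the virtual fingertip sits inside the
                          object (meters); <= 0 means no contact
    surface_normal_world: unit surface normal at the contact point
    controller_rotation:  3x3 rotation matrix of the tracked controller
    max_extension:        2.6 cm dynamic range (Table 1)
    """
    if penetration_depth <= 0.0:
        # Default state: fully retracted, level platform.
        return 0.0, np.array([0.0, 0.0, 1.0])

    # Extrude just enough to keep the fingertip at the object boundary,
    # saturating once the dynamic range is exhausted (see Figure 10).
    extension = float(np.clip(penetration_depth, 0.0, max_extension))

    # Express the normal in the controller's local frame so the tilt cue
    # stays spatially registered as the user rotates the device.
    normal_local = controller_rotation.T @ surface_normal_world
    return extension, normal_local
```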
Figure 4. (A) Illustration of NormalTouch operation while
rendering the surface of a 3D virtual object (gray). (B) Device
height depicted in our 3D scene. Device’s height and angle
change to faithfully render the surface at the point of touch.
The core components of NormalTouch are the three servo
motors that impart the mechanical three-dimensional free-
dom of the platform. We used three Hitec HS-5035HD nano
servos arranged in a 3-DOF Stewart Platform as shown in
Figure 3b. Each servo's control arm connects through a revolute joint and a small rigid linkage to a ball-and-socket spherical joint under the platform. The rigid linkages are constrained to remain perpendicular to the servo's axis. This allows the three degrees of freedom imparted by the three servos to be mechanically transformed into the finger pad's yaw and pitch angles plus linear movement along the roll axis (towards and away from the user). All components were designed in CAD and mostly laser-cut in Delrin plastic. An advantage of our configuration is that the overall 3D mechanism occupies minimal volume compared to other implementations.
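To illustrate the geometry, the sketch below approximates the platform's inverse kinematics: the platform is modeled as a plane at height `extension` with unit normal `normal_local`, and each of the three linkages must meet that plane at its attachment point. The attachment radius and arm length are assumed values, not measurements of our device, and the servo-arm conversion is idealized.

```python
import numpy as np

def servo_angles(extension, normal_local, attach_radius=0.012, arm_len=0.015):
    """Approximate 3-DOF inverse kinematics for the tilt platform (sketch).

    The three attachment points sit 120 degrees apart at attach_radius
    from the platform center; the plane through (0, 0, extension) with
    normal n has height z = extension - (nx*x + ny*y)/nz at point (x, y).
    """
    nx, ny, nz = normal_local
    angles = []
    for theta in (0.0, 2 * np.pi / 3, 4 * np.pi / 3):
        x, y = attach_radius * np.cos(theta), attach_radius * np.sin(theta)
        z = extension - (nx * x + ny * y) / nz   # required linkage height
        # Convert linkage height to a servo arm angle, clamped to valid range.
        angles.append(float(np.arcsin(np.clip(z / arm_len, -1.0, 1.0))))
    return angles
```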
To control the servos, we integrated an off-the-shelf multi-
servo USB controller (Pololu.com Mini Maestro-12) into a
3D printed controller handle. NormalTouch draws 375 mA in average use (620 mA peak). Outfitted with a 3000 mAh LiPo rechargeable battery for wireless operation, the device yields roughly 8 hours of battery life. The controller also senses analog voltages, in our case to detect force.
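For illustration, servo targets can be sent to a Mini Maestro over its virtual serial port using Pololu's documented Compact Protocol (command byte 0x84, the channel number, then the 14-bit target in quarter-microseconds split into two 7-bit bytes); the port name below is an assumption for a Linux host.

```python
import serial  # pyserial

def set_servo_target(port, channel, pulse_us):
    """Set one servo target on a Pololu Maestro via the Compact Protocol."""
    target = int(pulse_us * 4)  # Maestro targets are in 0.25 us units
    port.write(bytes([0x84, channel, target & 0x7F, (target >> 7) & 0x7F]))

# Usage sketch: center servo channel 0 at a 1500 us pulse.
with serial.Serial('/dev/ttyACM0', 9600) as maestro:  # assumed port name
    set_servo_target(maestro, channel=0, pulse_us=1500)
```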
Force Sensing
NormalTouch senses force input using an off-the-shelf force
transducer (Interlink Electronics FSR-402) in an end effec-
tor. We chose this implementation rather than sensing motor current because the latter is compromised by gear and bearing friction. The force sensor is a 13 mm disk of force-sensing resistor material with electrodes, detecting forces between 0.2 and 20 N, which is adequate for our use.
The sensor is configured such that with applied force levels
of less than 0.2 N, one of the two electrodes in the sensor is
not in contact with the FSR material and results in infinite
resistance and no voltage to the ADC, allowing us to reliably
detect moments during which no touch is present. A small
force applied to the sensor (~0.2 N) results in electrode con-
tact and a reliable force reading.
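A hedged sketch of converting the resulting ADC reading into an approximate force: it assumes the FSR and a fixed resistor form a simple voltage divider read by a 10-bit ADC, and the power-law calibration constants are placeholders that would need fitting per device.

```python
def adc_to_force(adc_value, vcc=5.0, r_fixed=10_000.0, adc_max=1023,
                 k=60_000.0, exponent=1.0):
    """Estimate force in newtons from an FSR voltage-divider reading (sketch)."""
    if adc_value == 0:
        return 0.0  # open circuit below ~0.2 N: reliably no touch
    v_out = vcc * adc_value / adc_max
    r_fsr = r_fixed * (vcc - v_out) / v_out   # solve the divider for R_fsr
    # An FSR's resistance falls roughly as a power law of applied force.
    return (k / r_fsr) ** exponent
```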
Figure 5: Design of NormalTouch’s force-sensing platform.
To overcome this initial non-linearity and increase the low
force sensitivity, the platform is composed of two separate
Delrin-cut layers (Figure 5). In the top layer, the finger
touches a smooth depression that we added for finger place-
ment. The border of the platform disk is cut into a partial
three-legged spiral spring to allow for an adjustable and com-
pliant preload on the force sensor housed in the bottom disk.
Figure 6: (left) TextureTouch showing the 16 servo motors
and gear assembly. (right) Close up of the 4×4 array of pins.
TextureTouch: 3D Pixel Shape Output
As shown in Figure 6, TextureTouch packs a 4x4 actuated
pin array as the primary method for haptically rendering 3D

shapes and structures onto the user’s finger. All electronics
are attached to the side of the device. Similar to Nor-
malTouch, small retroreflective spheres mounted to the base
serve as markers for tracking the controller in 3D space.
TextureTouch behaves similarly to NormalTouch during use, except that 16 virtual probe lines detect contact with the surfaces of virtual objects in the scene, as shown in Figure 7. This individual probing enables TextureTouch to detect fine-grained surface structure and relay it to the extrusion of the individual pins for the user to feel on their finger. As with NormalTouch, when the user's finger is outside all virtual objects, all pins in TextureTouch's array are fully retracted and the finger rests flat on the platform.
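A minimal sketch of this per-pin probing, assuming a hypothetical `scene.raycast` query that returns the penetration depth at a pin's rest position; only the 1.4 cm travel limit comes from Table 1.

```python
import numpy as np

def texturetouch_pin_heights(pin_origins_world, probe_dir_world, scene,
                             max_height=0.014):
    """Per-frame extrusion targets for the 4x4 pin array (illustrative)."""
    heights = np.zeros(len(pin_origins_world))        # 16 pins in our case
    for i, origin in enumerate(pin_origins_world):
        hit = scene.raycast(origin, probe_dir_world)  # hypothetical API
        if hit is not None and hit.penetration_depth > 0.0:
            # Extrude to compensate the penetration, saturating at full travel.
            heights[i] = min(hit.penetration_depth, max_height)
    return heights
```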
Figure 7: (A) Illustration of TextureTouch operation while
rendering the surface of a 3D virtual object (gray). (B) Device
pin heights depicted in our 3D scene.
TextureTouch comprises 16 linearly actuated adjacent pins
in a 4×4 configuration. Each pin is individually driven by a
small servo motor (HiTec HS-5035HD). We used rack and
pinion mechanisms to convert the servos’ rotary output to
linear travel. An additional rack and pinion pair turns the mo-
tion at right angles for an optimized configuration and mini-
mum volume as shown in Figure 8. A Pololu Mini Maestro-
24 servo controller relays the extrusion levels determined by
the virtual reality system from the PC to each servo motor.
TextureTouch draws 800 mA in average use (1.5 A peak).
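As a rough plausibility check of this mechanism (the pinion radius is not reported, so the 5 mm value here is an assumption): a rack and pinion converts servo rotation θ into linear travel s = rθ. With r = 5 mm, covering the 1.4 cm dynamic range requires θ = 0.014 m / 0.005 m = 2.8 rad ≈ 160°, within the roughly 180° sweep of a typical hobby servo.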
Figure 8: TextureTouch actuation mechanism for a single pin.
INTEGRATION OF HAPTIC CONTROLLERS INTO VR
The basic concept of haptic shape rendering in simulated 3D
environments is a well-understood topic [31]. In principle,
one determines the collisions of the haptic proxy object (in
our case a 3D fingertip) with 3D virtual objects in the scene,
computes the resulting forces, and renders an equal and op-
posite force on the haptic device. This works well if the hap-
tic device is stationary and capable of exerting enough force
back to the user to prevent the user from moving further (oth-
erwise the device simply gives up).
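For reference, that grounded-device formulation is the classic penalty method, sketched below; the stiffness value is an arbitrary example, and this is the stationary case rather than what our handheld controllers do.

```python
import numpy as np

def penalty_force(penetration_depth, surface_normal, stiffness=500.0):
    """Penalty-based force rendering for a grounded haptic device (sketch).

    The restoring force grows linearly with penetration depth along the
    surface normal; stiffness is in N/m and is an assumed example value.
    """
    if penetration_depth <= 0.0:
        return np.zeros(3)
    return stiffness * penetration_depth * np.asarray(surface_normal, float)
```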
However, haptic shape rendering on a handheld device is
more complicated, because both the platform and the actu-
ated point are moving and held by the same hand. Therefore,
any force our controllers render at the user’s fingertip will
inevitably be felt against the rest of the palm which holds the
handle of the controller. In practice, however, the fingertip’s
sensitivity to kinesthetic and cutaneous forces is much higher
than the sensitivity of the rest of the hand [28], rendering the
actuation experience convincing. See our last experiment be-
low for an evaluation of rendering fidelities.
At their core, NormalTouch and TextureTouch operate on the same principle. When the tracked controller penetrates the surface of a virtual object, the controller's articulation points extend to compensate for the penetration, thus rendering the surface in contact (Figure 4 and Figure 7). NormalTouch has a single extension platform, which we additionally orient to relay the surface normal at the collision point. TextureTouch performs this calculation individually for every one of its 16 pins.
To further support the haptic sensations with visual feed-
back, we animate the joints of the virtual finger when touch-
ing virtual objects to signal to the user that a collision has
occurred (Figure 9).
Figure 9: The joints of the virtual finger are animated to re-
flect contact with an object and match the haptic actuation.
Handling Surface Penetration
Since our devices do not have a physical earth reference, it is
impossible to prevent the user from penetrating virtual ob-
jects in the scene with their controller hand. How the device
behaves when the dynamic range of its actuator is exhausted
and the fingertip penetrates the surface can have significant
impact on the quality of the experience.
Figure 10: Illustration of basic height rendering for Normal-
Touch. When in contact with the surface, only the controller
base keeps moving and the finger remains at the same location
within the dynamic range of the device. Note: the same behav-
ior applies to each pin in TextureTouch.
We deliberately chose to keep the platform fully extended
upon complete penetration (Figure 10). While this does not
give the user a clear signal that they have penetrated the sur-
face, it produces more consistent behavior. Alternatively, retracting the platform upon penetration (i.e., rendering 0 N of force) frequently results in undesirable behavior: most of the time, penetration was not intended, and the user will correct their movement and retract from the surface.
SpurGear
Mountedon
Servo
Servo
RackGear
SpurGear
Pin(1of16)
Gear
Aperture
platform

References
Bowman, D. A., Kruijff, E., LaViola, J. J., and Poupyrev, I. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2004.
Massie, T. H., and Salisbury, J. K. The PHANToM Haptic Interface: A Device for Probing Virtual Objects. Proc. ASME Dynamic Systems and Control Division, 1994.
Bau, O., Poupyrev, I., Israr, A., and Harrison, C. TeslaTouch: Electrovibration for Touch Surfaces. Proc. UIST 2010.
Hayward, V., Astley, O. R., Cruz-Hernandez, M., Grant, D., and Robles-De-La-Torre, G. Haptic Interfaces and Devices. Sensor Review 24(1), 2004.
Brewster, S., and Brown, L. M. Tactons: Structured Tactile Messages for Non-Visual Information Display. Proc. AUIC 2004.