
Showing papers on "Haptic technology published in 2018"


Journal ArticleDOI
TL;DR: This novel manufacturing approach enables the seamless integration of multiple ionically conductive and fluidic features within elastomeric matrices to produce SSAs with the desired bioinspired sensing and actuation capabilities.
Abstract: Humans possess manual dexterity, motor skills, and other physical abilities that rely on feedback provided by the somatosensory system. Herein, a method is reported for creating soft somatosensitive actuators (SSAs) via embedded 3D printing, which are innervated with multiple conductive features that simultaneously enable haptic, proprioceptive, and thermoceptive sensing. This novel manufacturing approach enables the seamless integration of multiple ionically conductive and fluidic features within elastomeric matrices to produce SSAs with the desired bioinspired sensing and actuation capabilities. Each printed sensor is composed of an ionically conductive gel that exhibits both long-term stability and hysteresis-free performance. As an exemplar, multiple SSAs are combined into a soft robotic gripper that provides proprioceptive and haptic feedback via embedded curvature, inflation, and contact sensors, including deep and fine touch contact sensors. The multimaterial manufacturing platform enables complex sensing motifs to be easily integrated into soft actuating systems, which is a necessary step toward closed-loop feedback control of soft robots, machines, and haptic devices.

385 citations


Journal ArticleDOI
TL;DR: The technology behind creating artificial touch sensations and the relevant aspects of human touch are reviewed and the need to consider the neuroscience and perception behind the human sense of touch in the design and control of haptic devices is addressed.
Abstract: This article reviews the technology behind creating artificial touch sensations and the relevant aspects of human touch. We focus on the design and control of haptic devices and discuss the best practices for generating distinct and effective touch sensations. Artificial haptic sensations can present information to users, help them complete a task, augment or replace the other senses, and add immersiveness and realism to virtual interactions. We examine these applications in the context of different haptic feedback modalities and the forms that haptic devices can take. We discuss the prior work, limitations, and design considerations of each feedback modality and individual haptic technology. We also address the need to consider the neuroscience and perception behind the human sense of touch in the design and control of haptic devices.

214 citations


Proceedings ArticleDOI
19 Apr 2018
TL;DR: Haptic Revolver is a handheld virtual reality controller that renders fingertip haptics when interacting with virtual surfaces through an actuated wheel that raises and lowers underneath the finger to render contact with a virtual surface.
Abstract: We present Haptic Revolver, a handheld virtual reality controller that renders fingertip haptics when interacting with virtual surfaces. Haptic Revolver's core haptic element is an actuated wheel that raises and lowers underneath the finger to render contact with a virtual surface. As the user's finger moves along the surface of an object, the controller spins the wheel to render shear forces and motion under the fingertip. The wheel is interchangeable and can contain physical textures, shapes, edges, or active elements to provide different sensations to the user. Because the controller is spatially tracked, these physical features can be spatially registered with the geometry of the virtual environment and rendered on-demand. We evaluated Haptic Revolver in two studies to understand how wheel speed and direction impact perceived realism. We also report qualitative feedback from users who explored three application scenarios with our controller.

180 citations


Journal ArticleDOI
TL;DR: This survey focuses on how the fifth generation of mobile networks will allow haptic applications to take life, in combination with the haptic data communication protocols, bilateral teleoperation control schemes and haptic data processing needed.
Abstract: Touch is currently seen as the modality that will complement audition and vision as a third media stream over the Internet in a variety of future haptic applications which will allow full immersion and that will, in many ways, impact society. Nevertheless, the high requirements of these applications demand networks which allow ultra-reliable and low-latency communication for the challenging task of applying the required quality of service for maintaining the user’s quality of experience at optimum levels. In this survey, we enlist, discuss, and evaluate methodologies and technologies of the necessary infrastructure for haptic communication. Furthermore, we focus on how the fifth generation of mobile networks will allow haptic applications to take life, in combination with the haptic data communication protocols, bilateral teleoperation control schemes and haptic data processing needed. Finally, we state the lessons learned throughout the surveyed research material along with the future challenges and infer our conclusions.

179 citations


Proceedings ArticleDOI
Inrak Choi, Eyal Ofek, Hrvoje Benko, Mike Sinclair, Christian Holz
21 Apr 2018
TL;DR: This work describes the design considerations for CLAW, a handheld virtual reality controller that augments the typical controller functionality with force feedback and actuated movement to the index finger, and evaluates its performance through two user studies.
Abstract: CLAW is a handheld virtual reality controller that augments the typical controller functionality with force feedback and actuated movement to the index finger. Our controller enables three distinct interactions (grasping virtual object, touching virtual surfaces, and triggering) and changes its corresponding haptic rendering by sensing the differences in the user's grasp. A servo motor coupled with a force sensor renders controllable forces to the index finger during grasping and touching. Using position tracking, a voice coil actuator at the index fingertip generates vibrations for various textures synchronized with finger movement. CLAW also supports a haptic force feedback in the trigger mode when the user holds a gun. We describe the design considerations for CLAW and evaluate its performance through two user studies. The first study obtained qualitative user feedback on the naturalness, effectiveness, and comfort when using the device. The second study investigated the ease of the transition between grasping and touching when using our device.

178 citations


Journal ArticleDOI
TL;DR: A physical haptic feedback mechanism is introduced to elicit muscle activity that generates EMG signals in a natural manner, in order to achieve intuitive human impedance transfer through a designed coupling interface.
Abstract: It has been established that the transfer of human adaptive impedance is of great significance for physical human–robot interaction (pHRI). By processing the electromyography (EMG) signals collected from human muscles, the limb impedance could be extracted and transferred to robots. The existing impedance transfer interfaces rely only on visual feedback and, thus, may be insufficient for skill transfer in a sophisticated environment. In this paper, a physical haptic feedback mechanism is introduced to elicit muscle activity that generates EMG signals in a natural manner, in order to achieve intuitive human impedance transfer through a designed coupling interface. Relevant processing methods are integrated into the system, including a spectral collaborative representation-based classification method for hand motion recognition, and a fast smooth envelope and dimensionality reduction algorithm for arm endpoint stiffness estimation. The tutor’s arm endpoint motion trajectory is directly transferred to the robot by the designed coupling module without the restriction of hands. Haptic feedback is provided to the human tutor according to skill learning performance to enhance the teaching experience. The interface has been experimentally tested by a plugging-in task and a cutting task. Compared with the existing interfaces, the developed one has shown a better performance. Note to Practitioners —This paper is motivated by the limited performance of skill transfer in the existing human–robot interfaces. Conventional robots perform tasks independently without interaction with humans. However, the new generation of robots with characteristics such as flexibility and compliance become more involved in interacting with humans. Thus, advanced human–robot interfaces are required to enable robots to learn human manipulation skills. In this paper, we propose a novel interface for human impedance adaptive skill transfer in a natural and intuitive manner.
The developed interface has the following functionalities: 1) it transfers human arm impedance-adaptive motion to the robot intuitively; 2) it senses human motion signals that are decoded into hand gestures and arm endpoint stiffness, which are employed for natural human–robot interaction; and 3) it provides the human tutor with haptic feedback for an enhanced teaching experience. The interface can be potentially used in pHRI, teleoperation, human motor training systems, etc.

172 citations


Proceedings ArticleDOI
19 Apr 2018
TL;DR: Canetroller, a haptic cane controller that simulates white cane interactions, enabling people with visual impairments to navigate a virtual environment by transferring their cane skills into the virtual world, was shown to be a promising tool that enabled visually impaired participants to navigate different virtual spaces.
Abstract: Traditional virtual reality (VR) mainly focuses on visual feedback, which is not accessible for people with visual impairments. We created Canetroller, a haptic cane controller that simulates white cane interactions, enabling people with visual impairments to navigate a virtual environment by transferring their cane skills into the virtual world. Canetroller provides three types of feedback: (1) physical resistance generated by a wearable programmable brake mechanism that physically impedes the controller when the virtual cane comes in contact with a virtual object; (2) vibrotactile feedback that simulates the vibrations when a cane hits an object or touches and drags across various surfaces; and (3) spatial 3D auditory feedback simulating the sound of real-world cane interactions. We designed indoor and outdoor VR scenes to evaluate the effectiveness of our controller. Our study showed that Canetroller was a promising tool that enabled visually impaired participants to navigate different virtual spaces. We discuss potential applications supported by Canetroller ranging from entertainment to mobility training.

127 citations


Journal ArticleDOI
31 Jan 2018
TL;DR: ALVU (Array of Lidars and Vibrotactile Units), a contactless, intuitive, hands-free, and discreet wearable device that allows visually impaired users to detect low- and high-hanging obstacles, as well as physical boundaries in their immediate environment, is presented.
Abstract: This paper presents ALVU (Array of Lidars and Vibrotactile Units), a contactless, intuitive, hands-free, and discreet wearable device that allows visually impaired users to detect low- and high-hanging obstacles, as well as physical boundaries in their immediate environment. The solution allows for safe local navigation in both confined and open spaces by enabling the user to distinguish free space from obstacles. The device presented is composed of two parts: a sensor belt and a haptic strap. The sensor belt is an array of time-of-flight distance sensors worn around the front of a user’s waist, and the pulses of infrared light provide reliable and accurate measurements of the distances between the user and surrounding obstacles or surfaces. The haptic strap communicates the measured distances through an array of vibratory motors worn around the user’s upper abdomen, providing haptic feedback. The linear vibration motors are combined with a point-loaded pretensioned applicator to transmit isolated vibrations to the user. We validated the device’s capability in an extensive user study entailing 162 trials with 12 blind users. Users wearing the device successfully walked through hallways, avoided obstacles, and detected staircases.
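The distance-to-vibration mapping such a haptic strap needs can be sketched as a simple linear ramp from the sensed obstacle distance to a motor intensity. This is an illustrative sketch, not the paper's implementation; the range values and function name are hypothetical.

```python
def vibration_intensity(distance_m, d_min_m=0.3, d_max_m=2.0):
    """Map a time-of-flight distance reading to a 0..1 vibration motor
    intensity: full strength at or inside d_min_m, off at or beyond
    d_max_m, linear ramp in between (closer obstacle -> stronger buzz)."""
    if distance_m <= d_min_m:
        return 1.0
    if distance_m >= d_max_m:
        return 0.0
    return (d_max_m - distance_m) / (d_max_m - d_min_m)
```

Each motor in the strap would be driven by the reading of the sensor covering the corresponding angular sector.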

122 citations


Proceedings ArticleDOI
21 Apr 2018
TL;DR: The Force Jacket is described, a novel array of pneumatically-actuated airbags and force sensors that provide precisely directed force and high frequency vibrations to the upper body and the use of those effects in prototype virtual reality applications is discussed.
Abstract: Immersive experiences seek to engage the full sensory system in ways that words, pictures, or touch alone cannot. With respect to the haptic system, however, physical feedback has been provided primarily with handheld tactile experiences or vibration-based designs, largely ignoring both pressure receptors and the full upper-body area as conduits for expressing meaning that is consistent with sight and sound. We extend the potential for immersion along these dimensions with the Force Jacket, a novel array of pneumatically-actuated airbags and force sensors that provide precisely directed force and high frequency vibrations to the upper body. We describe the pneumatic hardware and force control algorithms, user studies to verify perception of airbag location and pressure magnitude, and subsequent studies to define full-torso, pressure and vibration-based feel effects such as punch, hug, and snake moving across the body. We also discuss the use of those effects in prototype virtual reality applications.

113 citations


Journal ArticleDOI
TL;DR: A robot controller that concurrently adapts feedforward force, impedance, and reference trajectory when interacting with an unknown environment is developed, which can outperform conventional controllers in contact tooling.
Abstract: Humans can skilfully use tools and interact with the environment by adapting their movement trajectory, contact force, and impedance. Motivated by the human versatility, we develop here a robot controller that concurrently adapts feedforward force, impedance, and reference trajectory when interacting with an unknown environment. In particular, the robot's reference trajectory is adapted to limit the interaction force and maintain it at a desired level, while feedforward force and impedance adaptation compensates for the interaction with the environment. An analysis of the interaction dynamics using Lyapunov theory yields the conditions for convergence of the closed-loop interaction mediated by this controller. Simulations exhibit adaptive properties similar to human motor adaptation. The implementation of this controller for typical interaction tasks including drilling, cutting, and haptic exploration shows that this controller can outperform conventional controllers in contact tooling.
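A schematic sketch of the concurrent-adaptation idea, assuming simple error-driven update laws (the paper derives its own laws via Lyapunov analysis; the gains, names, and scalar state here are illustrative only):

```python
def adapt_step(state, x, f_meas, f_desired, x_ref,
               a_f=0.1, a_k=0.05, a_r=0.01):
    """One concurrent-adaptation step for a scalar interaction axis.
    `state` holds (feedforward force u, stiffness k). The feedforward
    force and stiffness adapt to compensate a persistent tracking error,
    while the reference trajectory shifts to keep the measured
    interaction force near the desired level."""
    u, k = state
    e = x - x_ref                                # tracking error
    u = u + a_f * e                              # feedforward force adaptation
    k = max(0.0, k + a_k * abs(e))               # impedance (stiffness) adaptation
    x_ref = x_ref + a_r * (f_meas - f_desired)   # reference trajectory adaptation
    command = u + k * (x_ref - x)                # resulting control force
    return (u, k), x_ref, command
```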

110 citations


Proceedings ArticleDOI
11 Oct 2018
TL;DR: DextrES, a flexible and wearable haptic glove which integrates both kinesthetic and cutaneous feedback in a thin and light form factor, is introduced, and it is demonstrated that the approach can provide rich haptic feedback under dexterous articulation of the user's hands and effective haptic feedback across a variety of different grasps.
Abstract: We introduce DextrES, a flexible and wearable haptic glove which integrates both kinesthetic and cutaneous feedback in a thin and light form factor (weight is less than 8g). Our approach is based on an electrostatic clutch generating up to 20 N of holding force on each finger by modulating the electrostatic attraction between flexible elastic metal strips to generate an electrically-controlled friction force. We harness the resulting braking force to rapidly render on-demand kinesthetic feedback. The electrostatic brake is mounted onto the index finger and thumb via modular 3D printed articulated guides which allow the metal strips to glide smoothly. Cutaneous feedback is provided via piezo actuators at the fingertips. We demonstrate that our approach can provide rich haptic feedback under dexterous articulation of the user's hands and provides effective haptic feedback across a variety of different grasps. A controlled experiment indicates that DextrES improves the grasping precision for different types of virtual objects. Finally, we report on the results of a psychophysical study which identifies discrimination thresholds for different levels of holding force.
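As a back-of-envelope illustration of the electrostatic-clutch principle (not the paper's model), a parallel-plate approximation relates the applied voltage to the friction braking force; all parameter values below are made up for illustration.

```python
# Parallel-plate approximation: F_normal = eps0*eps_r*A*V^2 / (2*d^2),
# F_friction = mu * F_normal. Values are illustrative, not DextrES specs.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def clutch_friction_force(voltage_v, overlap_area_m2, dielectric_thickness_m,
                          rel_permittivity=3.0, friction_coeff=0.3):
    """Friction braking force between electrostatically clamped strips."""
    normal_force = (EPS0 * rel_permittivity * overlap_area_m2 * voltage_v ** 2
                    / (2.0 * dielectric_thickness_m ** 2))
    return friction_coeff * normal_force
```

The quadratic dependence on voltage is what makes an electrically controlled, rapidly modulated brake practical.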

Proceedings ArticleDOI
11 Oct 2018
TL;DR: MetaArms, wearable anthropomorphic robotic arms and hands with six degrees of freedom operated by the user's legs and feet, is introduced; it demonstrates the feasibility of a body-remapping approach in designing robotic limbs that may help to re-imagine what the human body could do.
Abstract: We introduce MetaArms, wearable anthropomorphic robotic arms and hands with six degrees of freedom operated by the user's legs and feet. Our overall research goal is to re-imagine what our bodies can do with the aid of wearable robotics using a body-remapping approach. To this end, we present an initial exploratory case study. MetaArms' two robotic arms are controlled by the user's foot motion, and the robotic hands can grip objects according to the bending of the user's toes. Haptic feedback correlated with the objects touched by the robotic hands is also presented on the user's feet, creating a closed-loop system. We present formal and informal evaluations of the system, the former using a 2D pointing task according to Fitts' law. The overall throughput for 12 users of the system is reported as 1.01 bits/s (std 0.39). We also present informal feedback from over 230 users. We find that MetaArms demonstrates the feasibility of a body-remapping approach in designing robotic limbs that may help us re-imagine what the human body could do.
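The reported throughput follows the standard Shannon formulation of Fitts' law; a minimal sketch of that computation (not the authors' analysis code) looks like this:

```python
import math

def fitts_throughput(distance, width, movement_time_s):
    """Throughput (bits/s) from the Shannon formulation of Fitts' law:
    index of difficulty ID = log2(D/W + 1); throughput = ID / MT."""
    return math.log2(distance / width + 1.0) / movement_time_s

# Example: a target 3 widths away acquired in 2 s -> log2(4)/2 = 1.0 bits/s.
```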

Proceedings ArticleDOI
19 Apr 2018
TL;DR: This work proposes a solely software based approach of simulating weight in VR by deliberately using perceivable tracking offsets that nudge users to lift their arm higher and result in a visual and haptic perception of weight.
Abstract: Virtual reality (VR) technology strives to enable a highly immersive experience for the user by including a wide variety of modalities (e.g., visuals, haptics). Current VR hardware, however, lacks a sufficient way of communicating the perceived weight of an object, resulting in scenarios where users cannot distinguish between lifting a bowling ball and a feather. We propose a solely software-based approach to simulating weight in VR by deliberately using perceivable tracking offsets. These tracking offsets nudge users to lift their arm higher and result in a visual and haptic perception of weight. We conducted two user studies showing that participants intuitively associated them with the sensation of weight and accepted them as part of the virtual world. We further show that, compared to no weight simulation, our approach led to significantly higher levels of presence, immersion, and enjoyment. Finally, we report perceptual thresholds and offset boundaries as design guidelines for practitioners.
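One plausible way to realize such a tracking offset is a vertical displacement proportional to the simulated mass; the gain and function name below are hypothetical, not values from the paper.

```python
def rendered_hand_height(tracked_height_m, virtual_mass_kg,
                         offset_gain_m_per_kg=0.02):
    """Draw the virtual hand slightly below the tracked hand, scaled by
    the held object's simulated mass; the user compensates by lifting
    higher, which is perceived as weight."""
    return tracked_height_m - offset_gain_m_per_kg * virtual_mass_kg
```

The paper's perceptual thresholds would bound how large such an offset can be before users notice the mismatch.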

Proceedings ArticleDOI
21 Apr 2018
TL;DR: A study on haptic search tasks compares spatial manipulation of a shape display for egocentric exploration of a map against exploration using a fixed display and a touch pad, showing a 30% decrease in navigation path lengths and a 15% drop in mental demand.
Abstract: We explore interactions enabled by 2D spatial manipulation and self-actuation of a tabletop shape display. To explore these interactions, we developed shapeShift, a compact, high-resolution (7 mm pitch), mobile tabletop shape display. shapeShift can be mounted on passive rollers allowing for bimanual interaction where the user can freely manipulate the system while it renders spatially relevant content. shapeShift can also be mounted on an omnidirectional-robot to provide both vertical and lateral kinesthetic feedback, display moving objects, or act as an encountered-type haptic device for VR. We present a study on haptic search tasks comparing spatial manipulation of a shape display for egocentric exploration of a map versus exploration using a fixed display and a touch pad. Results show a 30% decrease in navigation path lengths, 24% decrease in task time, 15% decrease in mental demand and 29% decrease in frustration in favor of egocentric navigation.

Proceedings ArticleDOI
21 May 2018
TL;DR: In this paper, a deep recurrent model was proposed to predict the forces a garment will apply to a person's body during dressing, which can be used to provide better dressing assistance for people with disabilities.
Abstract: Robot-assisted dressing offers an opportunity to benefit the lives of many people with disabilities, such as some older adults. However, robots currently lack common sense about the physical implications of their actions on people. The physical implications of dressing are complicated by non-rigid garments, which can result in a robot indirectly applying high forces to a person's body. We present a deep recurrent model that, when given a proposed action by the robot, predicts the forces a garment will apply to a person's body. We also show that a robot can provide better dressing assistance by using this model with model predictive control. The predictions made by our model only use haptic and kinematic observations from the robot's end effector, which are readily attainable. Collecting training data from real-world physical human-robot interaction can be time consuming, costly, and put people at risk. Instead, we train our predictive model using data collected in an entirely self-supervised fashion from a physics-based simulation. We evaluated our approach with a PR2 robot that attempted to pull a hospital gown onto the arms of 10 human participants. With a 0.2 s prediction horizon, our controller succeeded at high rates and lowered applied force while navigating the garment around a person's fist and elbow without getting caught. Shorter prediction horizons resulted in significantly reduced performance, with the sleeve catching on the participants' fists and elbows, demonstrating the value of our model's predictions. These behaviors of mitigating catches emerged from our deep predictive model and the controller objective function, which primarily penalizes high forces.
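The control loop can be sketched as picking, at each step, the candidate end-effector action with the best trade-off between task progress and predicted force. The paper uses a learned deep recurrent force model; the stand-in model, cost weights, and action encoding below are illustrative only.

```python
def choose_action(candidates, predict_force, force_weight=10.0):
    """Pick the candidate action (progress, lateral) that minimizes a
    cost trading off forward progress against the force the model
    predicts the garment will apply to the person."""
    def cost(action):
        progress = action[0]
        return -progress + force_weight * predict_force(action)
    return min(candidates, key=cost)

# Toy stand-in for the learned model: predicted force grows with
# sideways motion of the end effector.
toy_model = lambda a: abs(a[1])
best = choose_action([(0.01, 0.00), (0.02, 0.01), (0.02, 0.00)], toy_model)
# The straight, larger step is chosen: best == (0.02, 0.0)
```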

Proceedings ArticleDOI
21 Apr 2018
TL;DR: A mobile system that enhances mixed reality experiences and games with force feedback by means of electrical muscle stimulation (EMS) while keeping the users' hands free to interact unencumbered, and demonstrates how this supports three classes of applications along the mixed-reality continuum.
Abstract: We present a mobile system that enhances mixed reality experiences and games with force feedback by means of electrical muscle stimulation (EMS). The benefit of our approach is that it adds physical forces while keeping the users' hands free to interact unencumbered-not only with virtual objects, but also with physical objects, such as props and appliances. We demonstrate how this supports three classes of applications along the mixed-reality continuum: (1) entirely virtual objects, such as furniture with EMS friction when pushed or an EMS-based catapult game. (2) Virtual objects augmented via passive props with EMS-constraints, such as a light control panel made tangible by means of a physical cup or a balance-the-marble game with an actuated tray. (3) Augmented appliances with virtual behaviors, such as a physical thermostat dial with EMS-detents or an escape-room that repurposes lamps as levers with detents. We present a user-study in which participants rated the EMS-feedback as significantly more realistic than a no-EMS baseline.

Proceedings ArticleDOI
25 Nov 2018
TL;DR: Results show that VRHapticDrones is best suited to simulate objects that are expected to feel either light-weight or have yielding surfaces, as well as insights for future VR haptic feedback systems.
Abstract: We present VRHapticDrones, a system utilizing quadcopters as levitating haptic feedback proxy. A touchable surface is attached to the side of the quadcopters to provide unintrusive, flexible, and programmable haptic feedback in virtual reality. Since the users' sense of presence in virtual reality is a crucial factor for the overall user experience, our system simulates haptic feedback of virtual objects. Quadcopters are dynamically positioned to provide haptic feedback relative to the physical interaction space of the user. In a first user study, we demonstrate that haptic feedback provided by VRHapticDrones significantly increases users' sense of presence compared to vibrotactile controllers and interactions without additional haptic feedback. In a second user study, we explored the quality of induced feedback regarding the expected feeling of different objects. Results show that VRHapticDrones is best suited to simulate objects that are expected to feel either light-weight or have yielding surfaces. With VRHapticDrones we contribute a solution to provide unintrusive and flexible feedback as well as insights for future VR haptic feedback systems.

Proceedings ArticleDOI
21 Apr 2018
TL;DR: The authors' user evaluation results confirm that users can perceive many two-handed objects or interactions as more realistic with Haptic Links than with typical unlinked VR controllers.
Abstract: We present Haptic Links, electro-mechanically actuated physical connections capable of rendering variable stiffness between two commodity handheld virtual reality (VR) controllers. When attached, Haptic Links can dynamically alter the forces perceived between the user's hands to support the haptic rendering of a variety of two-handed objects and interactions. They can rigidly lock controllers in an arbitrary configuration, constrain specific degrees of freedom or directions of motion, and dynamically set stiffness along a continuous range. We demonstrate and compare three prototype Haptic Links: Chain, Layer-Hinge, and Ratchet-Hinge. We then describe interaction techniques and scenarios leveraging the capabilities of each. Our user evaluation results confirm that users can perceive many two-handed objects or interactions as more realistic with Haptic Links than with typical unlinked VR controllers.

Journal ArticleDOI
TL;DR: A survey across 61 VR users to understand common interruptions and scenarios that would benefit from some form of notifications; a design exercise with VR professionals to explore possible notification methods; and an empirical study on the noticeability and perception of 5 different VR interruption scenarios.
Abstract: The proliferation of high resolution and affordable virtual reality (VR) headsets is quickly making room-scale VR experiences available in our homes. Most VR experiences strive to achieve complete immersion by creating a disconnect from the real world. However, due to the lack of a standardized notification management system and minimal context awareness in VR, an immersed user may face certain situations such as missing an important phone call (digital scenario), tripping over wandering pets (physical scenario), or losing track of time (temporal scenario). In this paper, we present the results of 1) a survey across 61 VR users to understand common interruptions and scenarios that would benefit from some form of notifications; 2) a design exercise with VR professionals to explore possible notification methods; and 3) an empirical study on the noticeability and perception of 5 different VR interruption scenarios across 6 modality combinations (i.e., audio, visual, haptic, audio + haptic, visual + haptic, and audio + visual) implemented in Unity and presented using the HTC Vive headset. Finally, we combine key learnings from each of these steps along with participant feedback to present a set of observations and recommendations for notification design in VR.

Proceedings ArticleDOI
11 Oct 2018
TL;DR: Pop-up Prop on Palm (PuPoP) is a light-weight pneumatic shape-proxy interface worn on the palm that pops up several airbags with predefined primitive shapes for grasping, and is believed to be a simple yet effective way to convey haptic shapes in VR.
Abstract: The sensation of being able to feel the shape of an object when grasping it in Virtual Reality (VR) enhances a sense of presence and the ease of object manipulation. Though most prior works focus on force feedback on fingers, the haptic emulation of grasping a 3D shape requires the sensation of touch using the entire hand. Hence, we present Pop-up Prop on Palm (PuPoP), a light-weight pneumatic shape-proxy interface worn on the palm that pops several airbags up with predefined primitive shapes for grasping. When a user's hand encounters a virtual object, an airbag of appropriate shape, ready for grasping, is inflated by way of the use of air pumps; the airbag then deflates when the object is no longer in play. Since PuPoP is a physical prop, it can provide the full sensation of touch to enhance the sense of realism for VR object manipulation. For this paper, we first explored the design and implementation of PuPoP with multiple shape structures. We then conducted two user studies to further understand its applicability. The first study shows that, when in conflict, visual sensation tends to dominate over touch sensation, allowing a prop with a fixed size to represent multiple virtual objects with similar sizes. The second study compares PuPoP with controllers and free-hand manipulation in two VR applications. The results suggest that utilization of dynamically-changing PuPoP, when grasped by users in line with the shapes of virtual objects, enhances enjoyment and realism. We believe that PuPoP is a simple yet effective way to convey haptic shapes in VR.

Journal ArticleDOI
18 Apr 2018
TL;DR: During virtual experiences, enhanced haptic feedback incongruent with other sensory cues can reduce subjective realism, producing an uncanny valley of haptics.
Abstract: During teleoperation and virtual reality experiences, enhanced haptic feedback incongruent with other sensory cues can reduce subjective realism, producing an uncanny valley of haptics.

Proceedings ArticleDOI
19 Apr 2018
TL;DR: In this paper, the authors employ visuo-haptic illusions to improve the perceived performance of encountered-type haptic devices, specifically shape displays, in virtual reality by redirecting sloped lines with angles less than 40 degrees onto a horizontal line.
Abstract: In this work, we utilize visuo-haptic illusions to improve the perceived performance of encountered-type haptic devices, specifically shape displays, in virtual reality. Shape displays are matrices of actuated pins that travel vertically to render physical shapes; however, they have limitations such as low resolution, small display size, and low pin speed. To address these limitations, we employ illusions such as redirection, scaling, and retargeting that take advantage of the visual dominance effect, the idea that vision often dominates when senses conflict. Our evaluation of these techniques suggests that redirecting sloped lines with angles less than 40 degrees onto a horizontal line is an effective technique for increasing the perceived resolution of the display. Scaling up the virtual object onto the shape display by a factor less than 1.8x can also increase the perceived resolution. Finally, vertical redirection can achieve a perceived 3x increase in pin speed.
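The slope-redirection technique can be sketched as keeping the physical pins flat while the visual hand is drawn on the virtual slope; this is an illustrative sketch, with the 40-degree bound taken from the evaluation above and everything else (names, geometry) assumed.

```python
import math

def redirect_slope(x_along_m, slope_deg, max_slope_deg=40.0):
    """Return (physical_height_m, visual_height_m) for a fingertip at
    distance x_along_m up a virtual slope: the pins stay flat while the
    rendered hand follows the slope; visual dominance makes the flat
    surface feel sloped. Raises if the slope exceeds the range the study
    found convincing (< 40 degrees)."""
    if slope_deg >= max_slope_deg:
        raise ValueError("slope too steep to redirect convincingly")
    return 0.0, x_along_m * math.tan(math.radians(slope_deg))
```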

Proceedings ArticleDOI
25 Mar 2018
TL;DR: A device for creating a continuous lateral motion on the arm to mimic a subset of the gestures used in social touch, composed of a linear array of voice coil actuators that is embedded in a fabric sleeve is presented.
Abstract: Touch is an essential method for communicating emotions between individuals. Humans use a variety of different gestures to convey these emotions, including squeezes, pats, and strokes. This paper presents a device for creating a continuous lateral motion on the arm to mimic a subset of the gestures used in social touch. The device is composed of a linear array of voice coil actuators that is embedded in a fabric sleeve. The voice coils are controlled to sequentially press into the user's arm to create the sensation of linear travel up the arm. We evaluate the device in a human-subject study to confirm that a linear lateral motion can be created using only normal force, and to determine the optimal actuation parameters for creating a continuous and pleasant sensation. The results of the study indicated that the voice coils should be controlled with a long duration for each indentation and a short delay between the indentation onsets of adjacent actuators to maximize both continuity and pleasantness.
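A minimal sketch of the sequential actuation timing the study optimizes: each coil indents for a fixed duration, and adjacent coils start a short onset delay apart, so successive presses overlap in time and the stroke feels continuous. Function names and parameter values here are illustrative assumptions, not the study's reported optima:

```python
def actuation_schedule(n_coils: int, duration_s: float, soa_s: float):
    """Return (onset, offset) times for each voice coil in a linear array.

    Each coil presses for `duration_s`; adjacent coils begin `soa_s` apart
    (the stimulus onset asynchrony). The study favors long durations with
    short onset delays, which makes consecutive presses overlap.
    """
    return [(i * soa_s, i * soa_s + duration_s) for i in range(n_coils)]

def overlaps(schedule) -> bool:
    """True if every consecutive pair of presses overlaps in time,
    a rough proxy for the continuity the study measures."""
    return all(schedule[i + 1][0] < schedule[i][1] for i in range(len(schedule) - 1))
```

With a long duration and short onset delay (e.g. `actuation_schedule(4, 1.0, 0.25)`), all neighboring presses overlap; shortening the duration or lengthening the delay breaks the apparent motion into discrete taps.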

Journal ArticleDOI
TL;DR: 3D printed models and guides and haptic simulators may help operators plan and tackle complicated non-surgical and surgical endodontic treatment and may aid skill acquisition.
Abstract: The adoption and adaptation of recent advances in digital technology, such as three-dimensional (3D) printed objects and haptic simulators, in dentistry have influenced teaching and/or management of cases involving implant, craniofacial, maxillofacial, orthognathic and periodontal treatments. 3D printed models and guides may help operators plan and tackle complicated non-surgical and surgical endodontic treatment and may aid skill acquisition. Haptic simulators may assist in the development of competency in endodontic procedures through the acquisition of psycho-motor skills. This review explores and discusses the potential applications of 3D printed models and guides, and haptic simulators in the teaching and management of endodontic procedures. An understanding of the pertinent technology related to the production of 3D printed objects and the operation of haptic simulators is also presented.

Proceedings ArticleDOI
23 Sep 2018
TL;DR: Results show that combining gestures with mid-air haptic feedback was particularly promising, reducing the number of long glances and mean off-road glance time associated with the in-vehicle tasks.
Abstract: Employing a 2x2 within-subjects design, forty-eight experienced drivers (28 male, 20 female) undertook repeated button selection and 'slider-bar' manipulation tasks, to compare a traditional touchscreen with a virtual mid-air gesture interface in a driving simulator. Both interfaces were tested with and without haptic feedback generated using ultrasound. Results show that combining gestures with mid-air haptic feedback was particularly promising, reducing the number of long glances and mean off-road glance time associated with the in-vehicle tasks. For slider-bar tasks in particular, gestures-with-haptics was also associated with the shortest interaction times, highest number of correct responses and least 'overshoots', and was favoured by participants. In contrast, for button-selection tasks, the touchscreen was most popular, enabling the highest accuracy and quickest responses, particularly when combined with haptic feedback to guide interactions, although this also increased visual demand. The study shows clear potential for gestures with mid-air ultrasonic haptic feedback in the automotive domain.

Journal ArticleDOI
01 Jan 2018
TL;DR: Haptic feedback provided by the wearable device improved performance in both haptic navigation experiments and matched sensory substitution via visual feedback, without overloading the visual channel.
Abstract: We present a wearable skin stretch device for the forearm. It is composed of four cylindrical end effectors, evenly distributed around the user's forearm. They can generate independent skin stretch stimuli at the palmar, dorsal, ulnar, and radial sides of the arm. When the four end effectors rotate in the same direction, the wearable device provides cutaneous stimuli indicating a desired pronation/supination of the forearm. On the other hand, when two opposite end effectors rotate in different directions, the device provides cutaneous stimuli indicating a desired translation of the forearm. To evaluate the effectiveness of our device in providing navigation information, we carried out two haptic navigation experiments. In the first one, subjects were asked to translate and rotate the forearm toward a target position and orientation, respectively. In the second experiment, subjects were asked to control a 6-DoF robotic manipulator to grasp and lift a target object. Haptic feedback provided by our wearable device improved performance in both experiments compared to providing no haptic feedback. Moreover, it achieved performance similar to sensory substitution via visual feedback, without overloading the visual channel.

Journal ArticleDOI
TL;DR: This paper presents the integration of a haptic vest with a multimodal virtual environment, consisting of video, audio, and haptic feedback, with the main objective of determining how users who interact with the virtual environment benefit from tactile and thermal stimuli provided by the haptic vest.
Abstract: This paper presents the integration of a haptic vest with a multimodal virtual environment, consisting of video, audio, and haptic feedback, with the main objective of determining how users who interact with the virtual environment benefit from the tactile and thermal stimuli provided by the haptic vest. Some experiments are performed using a game application of a train station after an explosion. The participants have to move inside the environment while receiving several stimuli, to check whether the vest yields any improvement in presence or realism. This is done by comparing the experimental results with those from similar scenarios obtained without haptic feedback. The experiments are carried out by three groups of participants, classified on the basis of their experience with haptics and virtual reality devices. Some differences among the groups have been found, which can be related to the levels of realism and synchronization of all the elements in the multimodal environment that fulfill expectations and maximize satisfaction. According to the participants in the experiment, the system should define two different levels of requirements to meet the expectations of professional and conventional users.

Journal ArticleDOI
28 Feb 2018
TL;DR: A wearable, unobtrusive haptic device for the forearm, able to provide skin stretch, pressure, and vibrotactile stimuli, and its application in robotic teleoperation.
Abstract: This letter presents a wearable haptic device for the forearm and its application in robotic teleoperation. The device is able to provide skin stretch, pressure, and vibrotactile stimuli. Two servo motors, housed in a 3D printed lightweight platform, actuate an elastic fabric belt, wrapped around the arm. When the two servo motors rotate in opposite directions, the belt is tightened (or loosened), thereby compressing (or decompressing) the arm. On the other hand, when the two motors rotate in the same direction, the belt applies a shear force to the arm skin. Moreover, the belt houses four vibrotactile motors, positioned evenly around the arm at 90° from each other. The device weighs 220 g and measures 115 × 122 × 50 mm, making it wearable and unobtrusive. We carried out a perceptual characterization of the device as well as two human-subjects teleoperation experiments in a virtual environment, employing a total of 34 subjects. In the first experiment, participants were asked to control the motion of a robotic manipulator for grasping an object; in the second experiment, participants were asked to teleoperate the motion of a quadrotor fleet along a given path. In both scenarios, the wearable haptic device provided feedback information about the status of the slave robot(s) and of the given task. Results showed the effectiveness of the proposed device. Performance in completion time, trajectory length, and perceived effectiveness improved by 19.8%, 25.1%, and 149.1%, respectively, when using the wearable device compared to wearing no device. Finally, all subjects but three preferred the conditions including wearable haptics.
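The two-motor actuation principle described above (opposite-direction rotations compress or decompress the arm, same-direction rotations shear the skin) can be sketched as a simple superposition of the two modes. Signs, units, and function names here are assumptions for illustration, not the letter's control law:

```python
def servo_commands(squeeze: float, shear: float):
    """Map desired belt stimuli to the two servo rotations (arbitrary units).

    Opposite-direction rotation tightens or loosens the belt (pressure);
    same-direction rotation drags the belt laterally to stretch the skin
    (shear). The two modes superpose linearly in this sketch.
    """
    left = shear + squeeze
    right = shear - squeeze
    return left, right

def stimuli(left: float, right: float):
    """Inverse map: recover (squeeze, shear) from the two motor commands."""
    return (left - right) / 2.0, (left + right) / 2.0
```

Pure compression yields equal and opposite commands, pure shear yields equal same-sign commands, and any mix decomposes uniquely via the inverse map.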

Journal ArticleDOI
TL;DR: This paper presents a methodology which enables a human operator to precisely control the motion of a multi-DOF piezo-actuated flexure mechanism with haptic feedback, and demonstrates that precisions of approximately 30 nm can be achieved during low-speed interactions.

Journal ArticleDOI
TL;DR: Two hand remapping techniques are studied: one that introduces a static translational offset between the virtual and physical hand before a reaching action, and one that dynamically interpolates the position of the virtual hand during a reaching motion.
Abstract: Virtual reality often uses motion tracking to incorporate physical hand movements into interaction techniques for selection and manipulation of virtual objects. To increase realism and allow direct hand interaction, real-world physical objects can be aligned with virtual objects to provide tactile feedback and physical grasping. However, unless a physical space is custom configured to match a specific virtual reality experience, the ability to perfectly match the physical and virtual objects is limited. Our research addresses this challenge by studying methods that allow one physical object to be mapped to multiple virtual objects that can exist at different virtual locations in an egocentric reference frame. We study two such techniques: one that introduces a static translational offset between the virtual and physical hand before a reaching action, and one that dynamically interpolates the position of the virtual hand during a reaching motion. We conducted two experiments to assess how the two methods affect reaching effectiveness, comfort, and ability to adapt to the remapping techniques when reaching for objects with different types of mismatches between physical and virtual locations. We also present a case study to demonstrate how the hand remapping techniques could be used in an immersive game application to support realistic hand interaction while optimizing usability. Overall, the translational technique performed better than the interpolated reach technique and was more robust for situations with larger mismatches between virtual and physical objects.
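The two remapping techniques compared above can be sketched as follows: a static translational offset applied before the reach, and a dynamic interpolation that blends the virtual hand toward the virtual target as the physical hand approaches the physical prop, so both arrive together. The linear progress-based blend and function names are assumptions for illustration:

```python
def static_offset(p_phys, offset):
    """Static translational remapping: a fixed offset between the virtual
    and physical hand, introduced before the reaching action."""
    return tuple(p + o for p, o in zip(p_phys, offset))

def interpolated_reach(p_phys, start, phys_target, virt_target):
    """Dynamic remapping: interpolate the virtual hand during the reach.

    The virtual hand starts aligned with the physical hand and converges on
    the virtual target as the physical hand nears the physical prop.
    """
    total = sum((t - s) ** 2 for t, s in zip(phys_target, start)) ** 0.5
    done = sum((p - s) ** 2 for p, s in zip(p_phys, start)) ** 0.5
    t = min(1.0, done / total) if total else 1.0
    # Blend in the target mismatch proportionally to reach progress.
    return tuple(p + t * (v - q) for p, q, v in zip(p_phys, phys_target, virt_target))
```

At the start of the reach the virtual and physical hands coincide; at the physical target the virtual hand sits exactly on the virtual target, which is how one physical prop can serve virtual objects at mismatched egocentric locations.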