
Showing papers on "Haptic technology published in 2015"


Proceedings ArticleDOI
05 Nov 2015
TL;DR: This work presents impacto, a device designed to render the haptic sensation of hitting or being hit in virtual reality, and demonstrates how to assemble multiple impacto units into a simple haptic suit.
Abstract: We present impacto, a device designed to render the haptic sensation of hitting or being hit in virtual reality. The key idea that allows the small and light impacto device to simulate a strong hit is that it decomposes the stimulus: it renders the tactile aspect of being hit by tapping the skin using a solenoid; it adds impact to the hit by thrusting the user's arm backwards using electrical muscle stimulation. The device is self-contained, wireless, and small enough for wearable use, thus leaving the user unencumbered and able to walk around freely in a virtual environment. The device is of generic shape, allowing it to also be worn on the legs, so as to enhance the experience of kicking, or merged into props, such as a baseball bat. We demonstrate how to assemble multiple impacto units into a simple haptic suit. Participants of our study rated impact simulated using impacto's combination of solenoid hit and electrical muscle stimulation as more realistic than either technique in isolation.
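A minimal sketch of the decomposition idea in Python, assuming hypothetical driver objects; the real impacto firmware, timings, and intensities are not given in the abstract:

class SolenoidDriver:          # stand-in for the real tap-actuator driver
    def fire(self, duration_ms):
        print(f"solenoid tap: {duration_ms} ms")

class EMSDriver:               # stand-in for the electrical muscle stimulator
    def stimulate(self, intensity, duration_ms):
        print(f"EMS pulse: intensity {intensity}, {duration_ms} ms")

def render_hit(solenoid, ems, tap_ms=20, ems_intensity=0.6, ems_ms=150):
    # Decompose one impact event from the VR engine into its two components:
    # a tactile tap on the skin plus a kinesthetic thrust of the arm via EMS.
    solenoid.fire(duration_ms=tap_ms)
    ems.stimulate(intensity=ems_intensity, duration_ms=ems_ms)

render_hit(SolenoidDriver(), EMSDriver())     # called when a virtual hit lands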

185 citations


Journal ArticleDOI
TL;DR: This review found that wearable haptic devices improved function for a variety of clinical applications including: rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss and hearing loss.
Abstract: Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed to improve function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded body worn devices that interact with skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment. Total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as trainer. This review found that wearable haptic devices improved function for a variety of clinical applications including: rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss and hearing loss. Future haptic wearables development should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.

169 citations


Journal ArticleDOI
TL;DR: Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.
Abstract: Background: With the decrease in the number of cerebral aneurysms treated surgically and the increase in complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. Objective: To develop and evaluate the usefulness of a new haptic-based virtual reality simulator in the training of neurosurgical residents. Methods: A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the ImmersiveTouch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomographic angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-dimensional immersive virtual reality environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from 3 residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Results: Residents thought that the simulation would be useful in preparing for real-life surgery. About two-thirds of the residents thought that the 3-dimensional immersive anatomic details provided a close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They thought the simulation was useful for preoperative surgical rehearsal and neurosurgical training. A third of the residents thought that the technology in its current form provided realistic haptic feedback for aneurysm surgery. Conclusion: Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.

139 citations


Patent
Cheol Ho Cheong, Chul-Hwan Lee, Byoung Tack Roh, Yeo Jaeyung, Im Yo Ywang
14 Apr 2015
TL;DR: In this paper, a haptic support module is used to perform screen information analysis, input information analysis, and execution information analysis, and to allocate haptic information according to the analysis result.
Abstract: An electronic device is provided. The electronic device includes a haptic support module configured to perform at least one of a screen information analysis, an input information analysis, and an execution information analysis, and to allocate haptic information according to the analysis result, and a haptic module configured to output a haptic feedback corresponding to the haptic information.
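A speculative sketch of the allocation step, assuming simple dictionary inputs; the patent does not disclose concrete rules, so the mapping below is purely illustrative:

def allocate_haptic(screen_info, input_info, execution_info):
    # Hypothetical allocation rules: combine the three analyses into a
    # haptic description that a haptic module could then play back.
    if execution_info.get("state") == "error":
        return {"pattern": "double_pulse", "strength": 1.0}
    if screen_info.get("element") == "button" and input_info.get("touch_down"):
        return {"pattern": "click", "strength": 0.5}
    return {"pattern": "none", "strength": 0.0}

print(allocate_haptic({"element": "button"}, {"touch_down": True}, {"state": "ok"}))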

130 citations


Proceedings ArticleDOI
18 Apr 2015
TL;DR: The findings from a study exploring the communication of emotions through a haptic system that uses tactile stimulation in mid-air are presented, and the non-arbitrary mapping between emotions and haptic descriptions across three groups is demonstrated.
Abstract: Touch is a powerful vehicle for communication between humans. The way we touch (how) embraces and mediates certain emotions such as anger, joy, fear, or love. While this phenomenon is well explored for human interaction, HCI research is only starting to uncover the fine granularity of sensory stimulation and responses in relation to certain emotions. Within this paper we present the findings from a study exploring the communication of emotions through a haptic system that uses tactile stimulation in mid-air. Here, haptic descriptions for specific emotions (e.g., happy, sad, excited, afraid) were created by one group of users to then be reviewed and validated by two other groups of users. We demonstrate the non-arbitrary mapping between emotions and haptic descriptions across three groups. This points to the huge potential for mediating emotions through mid-air haptics. We discuss specific design implications based on the spatial, directional, and haptic parameters of the created haptic descriptions and illustrate their design potential for HCI based on two design ideas.

126 citations


Journal ArticleDOI
TL;DR: The RML glove as mentioned in this paper is a lightweight, portable, and self-contained mechatronic system that fits on a bare hand and provides haptic force feedback to each finger of the hand without constraining their movement.
Abstract: This paper presents the design, implementation, and experimental validation of a haptic glove mechanism: the RML glove (Robotics and Mechatronics Lab). The designed haptic interface is a lightweight, portable, and self-contained mechatronic system that fits on a bare hand and provides haptic force feedback to each finger of the hand without constraining finger movement. In order to experimentally test the new design, teleoperation with this glove for mobile robot navigation is also studied. By comparing teleoperation experiments with and without force feedback, the results show that this new admittance (using force as input and position as output) glove with force feedback can provide effective force feedback to the user and augment telepresence.
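The abstract characterizes the glove as an admittance device (force in, position out). A minimal one-degree-of-freedom sketch of such a law, with made-up mass and damping values rather than the RML glove's actual parameters:

def admittance_step(force, pos, vel, dt=0.001, mass=0.05, damping=2.0):
    # Virtual mass-damper driven by the measured fingertip force; the
    # resulting position is what the finger actuator would be commanded to.
    acc = (force - damping * vel) / mass
    vel += acc * dt
    pos += vel * dt
    return pos, vel

pos, vel = 0.0, 0.0
for _ in range(100):               # 0.1 s of a constant 1 N fingertip force
    pos, vel = admittance_step(1.0, pos, vel)
print(round(pos, 4), round(vel, 4))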

124 citations


Journal ArticleDOI
TL;DR: A Willow Garage PR2 robot is augmented with a pair of SynTouch BioTac sensors to capture rich tactile signals during the execution of four exploratory procedures on 60 household objects, and several machine-learning algorithms were developed to discover the meaning of each adjective from the robot's sensory data.

123 citations


Journal ArticleDOI
TL;DR: In this paper, a synthetic sensory analog that can be 3D printed, using direct ink writing (DIW) onto soft, fluidic elastomer actuators (FEAs), is demonstrated.

120 citations


Journal ArticleDOI
TL;DR: Cutaneous feedback was outperformed by full haptic feedback provided by grounded haptic interfaces, but it outperformed conditions providing no force feedback at all and always kept the system stable, even in the presence of destabilizing factors such as communication delays and hard contacts.
Abstract: Cutaneous haptic feedback can be used to enhance the performance of robotic teleoperation systems while guaranteeing their safety. Delivering ungrounded cutaneous cues to the human operator in fact conveys information about the forces exerted at the slave side and does not affect the stability of the control loop. In this work we analyze the feasibility, effectiveness, and implications of providing solely cutaneous feedback in robotic teleoperation. We carried out two peg-in-hole experiments, both in a virtual environment and in a real teleoperated environment. Two novel 3-degree-of-freedom fingertip cutaneous displays deliver a suitable amount of cutaneous feedback at the thumb and index fingers. Results assessed the feasibility and effectiveness of the proposed approach. Cutaneous feedback was outperformed by full haptic feedback provided by grounded haptic interfaces, but it outperformed conditions providing no force feedback at all. Moreover, cutaneous feedback always kept the system stable, even in the presence of destabilizing factors such as communication delays and hard contacts.

111 citations


Journal ArticleDOI
TL;DR: The current status and benefits of haptic VR simulation-based medical training for bone and dental surgery, intubation procedures, eye surgery, and minimally invasive and endoscopic surgery are reviewed.
Abstract: Virtual reality (VR) medical simulations deliver a tailored learning experience that can be standardized, and can cater to different learning styles in ways that cannot be matched by traditional teaching. These simulations also facilitate self-directed learning, allow trainees to develop skills at their own pace and allow unlimited repetition of specific scenarios that enable them to remedy skills deficiencies in a safe environment. A number of simulators have been validated and have shown clear benefits to medical training. However, while graphical realism is high, realistic haptic feedback and interactive tissues are limited for many simulators. This paper reviews the current status and benefits of haptic VR simulation-based medical training for bone and dental surgery, intubation procedures, eye surgery, and minimally invasive and endoscopic surgery.

101 citations


Patent
29 Sep 2015
TL;DR: In this article, a plurality of haptic output variations are organized into a cohesive semantic framework that uses various information about the alert condition and trigger, application context, and other conditions to provide a system of hapt outputs that share characteristics between related events.
Abstract: Methods and apparatus organize a plurality of haptic output variations into a cohesive semantic framework that uses various information about the alert condition and trigger, application context, and other conditions to provide a system of haptic outputs that share characteristics between related events. In some embodiments, an event class or application class provides the basis for a corresponding haptic output. In some embodiments, whether an alert-salience setting is on provides the basis for adding an increased salience haptic output to the standard haptic output for the alert. In some embodiments, consistent haptics provide for branding of the associated application class, application, and/or context.
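A speculative sketch of the idea that related events share haptic characteristics through their class, with an alert-salience setting that prepends an extra cue; the class names and patterns below are invented for illustration:

BASE_HAPTIC = {                     # illustrative event-class -> output mapping
    "message": ["tap"],
    "calendar": ["tap", "tap"],
    "payment": ["buzz"],
}

def haptic_for_event(event_class, salience_setting_on=False):
    # Events of the same class share a base output; when the alert-salience
    # setting is on, an increased-salience component is added in front.
    base = BASE_HAPTIC.get(event_class, ["tap"])
    return (["salience_ramp"] + base) if salience_setting_on else base

print(haptic_for_event("payment", salience_setting_on=True))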

Journal ArticleDOI
TL;DR: Empirical research in which participants had to drive a vehicle in a real or simulated environment, were able to control the heading and/or speed of the vehicle, and were provided with a haptic signal, indicated that a clear distinction can be made between warning systems (using vibrations) and guidance systems (using continuous forces).
Abstract: A large number of haptic driver support systems have been described in the scientific literature. However, there is little consensus regarding the design, evaluation methods, and effectiveness of these systems. This literature survey aimed to investigate: (1) what haptic systems (in terms of function, haptic signal, channel, and supported task) have been experimentally tested, (2) how these haptic systems have been evaluated, and (3) their reported effects on driver performance and behaviour. We reviewed empirical research in which participants had to drive a vehicle in a real or simulated environment, were able to control the heading and/or speed of the vehicle, and a haptic signal was provided to them. The results indicated that a clear distinction can be made between warning systems (using vibrations) and guidance systems (using continuous forces). Studies typically used reaction time measures for evaluating warning systems and vehicle-centred performance measures for evaluating guidance systems. In general, haptic warning systems reduced the reaction time of a driver compared to no warnings, although these systems may cause annoyance. Guidance systems generally improved the performance of drivers compared to non-aided driving, but these systems may suffer from after-effects. Longitudinal research is needed to investigate the transfer and retention of effects caused by haptic support systems.

Proceedings ArticleDOI
17 Dec 2015
TL;DR: A light and simple wearable device for the distributed mechano-tactile stimulation of the user's arm skin with pressure and stretch cues, related to normal and tangential forces, respectively, is presented; it is capable of reliably delivering grasping force information, eliciting good softness discrimination in users and enhancing the overall grasping experience.
Abstract: Rendering forces to the user is one of the main goals of haptic technology. While most force-feedback interfaces are robotic manipulators, attached to a fixed frame and designed to exert forces on the users while being moved, more recent haptic research has introduced two important novel ideas. On one side, cutaneous stimulation aims at rendering haptic stimuli at the level of the skin, with a distributed, rather than concentrated, approach. On the other side, wearable haptics focuses on highly portable and mobile devices, which can be carried and worn by the user as the haptic equivalent of an mp3 player. This paper presents a light and simple wearable device (CUFF) for the distributed mechano-tactile stimulation of the user's arm skin with pressure and stretch cues, related to normal and tangential forces, respectively. The working principle and the mechanical and control implementation of the CUFF device are presented. Then, after a basic functional validation, a first application of the device is shown, where it is used to render the grasping force of a robotic hand (the Pisa/IIT SoftHand). Preliminary results show that the device is capable of reliably delivering grasping force information, thus eliciting good softness discrimination in users and enhancing the overall grasping experience.
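A hedged sketch of the force-to-cue mapping described above: normal (grasp) force rendered as band pressure and tangential force as skin stretch. The gains and the linear mapping are assumptions, not the published CUFF controller:

def cuff_cues(normal_force, tangential_force, k_pressure=0.8, k_stretch=0.5):
    # Map the robot hand's contact forces to the two cutaneous cues the
    # arm band can produce; gains are placeholders.
    pressure_cmd = k_pressure * normal_force      # squeeze of the arm band
    stretch_cmd = k_stretch * tangential_force    # tangential skin stretch
    return pressure_cmd, stretch_cmd

print(cuff_cues(normal_force=3.0, tangential_force=1.0))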

Journal ArticleDOI
TL;DR: In this article, the authors proposed a dynamic friction model and backlash hysteresis nonlinearity for a pair of TSM slave manipulators to deal with the nonlinear friction and backlash.

Proceedings ArticleDOI
01 Sep 2015
TL;DR: Findings suggest that the haptic seat can play a significant role in keeping drivers aware of surrounding traffic during automated driving, and consequently facilitate the control transitions between the vehicle and the driver.
Abstract: Drivers' situation awareness is known to be remarkably low in the automated driving mode, which can result in a delayed and inefficient response when requested to resume control of the vehicle. The present study examined the usefulness of a haptic seat that projects spatial information on approaching vehicles to facilitate drivers' preparedness to take control of the vehicle. The results of a simulator study on 26 participants using behavioral and eye tracking techniques showed that, when required to regain control, having the haptic seat led to faster reactions in scenarios requiring lane changing. The haptic seat also reduced the probability that the participants would slow down below acceptable speeds on a freeway. Eye tracking showed that drivers had a more systematic scan of the environment in the first two seconds following the transition of control with a haptic seat. Overall, these findings suggest that the haptic seat can play a significant role in keeping drivers aware of surrounding traffic during automated driving, and consequently facilitate the control transitions between the vehicle and the driver.

Proceedings ArticleDOI
22 Jun 2015
TL;DR: A method to synthesize a haptic holographic image using spatial modulation of ultrasound to create a completely silent image without temporal ultrasonic modulation noise that is free of the problems caused by feedback delay and errors is discussed.
Abstract: A method to present volumetric haptic objects in the air using spatial modulation of ultrasound is proposed. Previous methods of airborne ultrasonic tactile display were based on vibrotactile radiation pressure and sensor feedback systems, which result in low spatial receptive resolution. The proposed approach produces a spatially standing haptic image using stationary ultrasonic waves that enable users to touch 3D images without depending on vibrotactile stimulation and sensor feedback. The omnidirectional, spatially modulated haptic images are generated by a phased array surrounding a workspace, which provides enough power for users to feel shapes without vibrotactile techniques. Compared with previous methods, the proposed method can create a completely silent image without temporal ultrasonic modulation noise that is free of the problems caused by feedback delay and errors. To investigate the active touch profiles of an ultrasonic image, this paper discusses a method to synthesize a haptic holographic image, the evaluation of our algorithm, and the results of pressure measurement and subjective experiments.
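Shaping a static ultrasonic pressure field starts from standard delay-and-sum focusing: each transducer is phased so its wave arrives in phase at a target point. The sketch below shows only that textbook step, not the paper's hologram-synthesis algorithm; the array geometry and frequency are assumptions:

import math

SPEED_OF_SOUND = 346.0    # m/s in air, approximate
FREQ = 40e3               # Hz, a common airborne-ultrasound transducer frequency

def focus_phases(elements, focal_point):
    # Phase (radians) per transducer so all waves arrive in phase at the focus.
    k = 2 * math.pi * FREQ / SPEED_OF_SOUND        # wavenumber
    return [(-k * math.dist(e, focal_point)) % (2 * math.pi) for e in elements]

elements = [(0.01 * i, 0.0, 0.0) for i in range(8)]   # toy 8-element linear array
print([round(p, 2) for p in focus_phases(elements, (0.04, 0.0, 0.15))])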

Journal ArticleDOI
TL;DR: This study illustrates that both surgeons and non-surgeons prefer instrument vibration feedback during robotic surgery, indicating that this technology provides valuable tactile information to the surgeon.
Abstract: Clinical robotic surgery systems do not currently provide haptic feedback because surgical instrument interactions are difficult to measure and display. Our laboratory recently developed a technology that allows surgeons to feel and/or hear the high-frequency vibrations of robotic instruments as they interact with patient tissue and other tools. Until now, this type of feedback had not been carefully evaluated by users. We conducted two human-subject studies to discover whether surgeons and non-surgeons value the addition of vibration feedback from surgical instruments during robotic surgery. In the first experiment, 10 surgeons and 10 non-surgeons (n = 20) used an augmented Intuitive da Vinci Standard robot to repeatedly perform up to four dry-lab tasks both with and without haptic and audio feedback. In the second experiment, 68 surgeons and 26 non-surgeons (n = 94) tested the same robot at a surgical conference: each participant spent approximately 5 min performing one or two tasks. Almost all subjects in both experiments (95 and 98 %, respectively) preferred receiving feedback of tool vibrations, and all subjects in the second experiment thought it would be useful for surgeons to have the option of such feedback. About half of the subjects (50, 60 %) preferred haptic and audio feedback together, and almost all the rest (45, 35 %) preferred haptic feedback alone. Subjects stated that the feedback made them more aware of tool contacts and did not interfere with use of the robot. There were no significant differences between the responses of different subject populations for any questions in either experiment. This study illustrates that both surgeons and non-surgeons prefer instrument vibration feedback during robotic surgery. Some participants found audio feedback useful but most preferred haptic feedback overall. This strong preference for tool vibration feedback indicates that this technology provides valuable tactile information to the surgeon.

Journal ArticleDOI
01 May 2015
TL;DR: This work has proposed modality-mismatched stimulation and demonstrated that this promotes self-attribution of an alien hand on normally limbed subjects and opens up promising possibilities in this field.
Abstract: Tactile feedback is essential to intuitive control and to promote the sense of self-attribution of a prosthetic limb. Recent findings showed that amputees can be tricked to experience this embodiment, when synchronous and modality-matched stimuli are delivered to biological afferent structures and to an alien rubber hand. Hence, it was suggested to exploit this effect by coupling touch sensors in a prosthesis to an array of haptic tactile stimulators in the prosthetic socket. However, this approach is not clinically viable due to physical limits of current haptic devices. To address this issue we have proposed modality-mismatched stimulation and demonstrated that this promotes self-attribution of an alien hand on normally limbed subjects. In this work we investigated whether similar effects could be induced in transradial amputees with referred phantom sensations in a series of experiments fashioned after the Rubber Hand Illusion using vibrotactile stimulators. Results from three independent measures of embodiment demonstrated that vibrotactile sensory substitution elicits body-ownership of a rubber hand in transradial amputees. These results open up promising possibilities in this field; indeed miniature, safe and inexpensive vibrators could be fitted into commercially available prostheses and sockets to induce the illusion every time the prosthesis manipulates an object.

Journal ArticleDOI
TL;DR: A physical model developed to simulate accurate external ventricular drain placement with realistic haptic and visual feedback, serving as a platform for complete procedural training, is built together with a phantom brain mold based on 3D scans of a plastinated human brain.
Abstract: In this paper, the authors present a physical model developed to simulate accurate external ventricular drain (EVD) placement with realistic haptic and visual feedback to serve as a platform for complete procedural training. Insertion of an EVD via ventriculostomy is a common neurosurgical procedure used to monitor intracranial pressures and/or drain CSF. Currently, realistic training tools are scarce and mainly limited to virtual reality simulation systems. The use of 3D printing technology enables the development of realistic anatomical structures and customized design for physical simulators. In this study, the authors used the advantages of 3D printing to directly build the model geometry from stealth head CT scans and build a phantom brain mold based on 3D scans of a plastinated human brain. The resultant simulator provides realistic haptic feedback during a procedure, with visualization of catheter trajectory and fluid drainage. A multi-institutional survey was also used to prove content validity of the simulator.

Proceedings ArticleDOI
23 Mar 2015
TL;DR: The Elastic-Arm is a novel approach for incorporating haptic feedback in immersive virtual environments in a simple and cost-effective way and could pave the way for the design of new interaction techniques based on “human-scale” egocentric haptic Feedback.
Abstract: Haptic feedback is known to improve 3D interaction in virtual environments, but current haptic interfaces remain complex and tailored to desktop interaction. In this paper, we introduce the “Elastic-Arm”, a novel approach for incorporating haptic feedback in immersive virtual environments in a simple and cost-effective way. The Elastic-Arm is based on a body-mounted elastic armature that links the user's hand to her shoulder. As a result, a progressive resistance force is perceived when extending the arm. This haptic feedback can be incorporated with various 3D interaction techniques, and we illustrate the possibilities offered by our system through several use cases based on well-known examples such as the Bubble technique, Redirected Touching and pseudo-haptics. These illustrative use cases provide users with haptic feedback during selection and navigation tasks, but they also enhance their perception of the virtual environment. Taken together, these examples suggest that the Elastic-Arm can be transposed to numerous applications and various 3D interaction metaphors in which mobile haptic feedback can be beneficial. It could also pave the way for the design of new interaction techniques based on “human-scale” egocentric haptic feedback.
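The progressive resistance can be approximated as a simple spring between shoulder and hand; the rest length and stiffness below are illustrative guesses, not values from the paper:

import math

def elastic_resistance(hand_pos, shoulder_pos, rest_length=0.25, stiffness=40.0):
    # Linear-spring approximation of the body-mounted armature: force grows
    # with how far the hand is extended beyond the rest length (in metres).
    stretch = max(0.0, math.dist(hand_pos, shoulder_pos) - rest_length)
    return stiffness * stretch        # perceived resistance, in newtons

print(elastic_resistance(hand_pos=(0.6, 0.0, 0.0), shoulder_pos=(0.0, 0.0, 0.0)))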

Journal ArticleDOI
01 Nov 2015
TL;DR: Simulation and experimental results show the potential of the proposed system in rehabilitation therapy and virtual reality applications, and demonstrate that the SAFE Glove is capable of reliably modeling hand kinematics, measuring finger motion and assisting hand grasping motion.
Abstract: This paper presents the design, implementation and experimental validation of a novel robotic haptic exoskeleton device to measure the user's hand motion and assist hand motion while remaining portable and lightweight. The device consists of a five-finger mechanism actuated with miniature DC motors through antagonistically routed cables at each finger, which act as both active and passive force actuators. The SAFE Glove is a wireless and self-contained mechatronic system that mounts over the dorsum of a bare hand and provides haptic force feedback to each finger. The glove is adaptable to a wide variety of finger sizes without constraining the range of motion. This makes it possible to accurately and comfortably track the complex motion of the finger and thumb joints associated with common movements of hand functions, including grip and release patterns. The glove can be wirelessly linked to a computer for displaying and recording the hand status through a 3D graphical user interface (GUI) in real time. The experimental results demonstrate that the SAFE Glove is capable of reliably modeling hand kinematics, measuring finger motion and assisting hand grasping motion. Simulation and experimental results show the potential of the proposed system in rehabilitation therapy and virtual reality applications.

Patent
20 Mar 2015
TL;DR: In this article, a system and method of using a peripheral device for interfacing with a virtual reality scene generated by a computer for presentation on a head mounted display is described. But the haptic feedback controller is not considered.
Abstract: A system and method of using a peripheral device for interfacing with a virtual reality scene generated by a computer for presentation on a head mounted display. The peripheral device includes a haptic device capable of being placed in contact with a user and a haptic feedback controller for processing instructions for outputting a haptic signal to the haptic device. The haptic feedback controller receiving the instructions from the computer so that haptic feedback of the haptic device changes to correspond to a user's virtual interactions with a virtual object in the virtual reality scene as presented on the head mounted display.

Proceedings ArticleDOI
13 Jul 2015
TL;DR: A data-driven approach to incrementally acquire reference signals from experience and decide online when and to which successive behavior to switch, ensuring successful task execution and is robust against perturbation and sensor noise.
Abstract: One of the main challenges in autonomous manipulation is to generate appropriate multi-modal reference trajectories that enable feedback controllers to compute control commands that compensate for unmodeled perturbations and therefore to achieve the task at hand. We propose a data-driven approach to incrementally acquire reference signals from experience and decide online when and to which successive behavior to switch, ensuring successful task execution. We reformulate this online decision making problem as a pair of related classification problems. Both process the current sensor readings, composed from multiple sensor modalities, in real-time (at 30 Hz). Our approach exploits that movement generation can dictate sensor feedback. Thus, enforcing stereotypical behavior will yield stereotypical sensory events which can be accumulated and stored along with the movement plan. Such movement primitives, augmented with sensor experience, are called Associative Skill Memories (ASMs). Sensor experience consists of (real) sensors, including haptic, auditory information and visual information, as well as additional (virtual) features. We show that our approach can be used to teach dexterous tasks, e.g. a bimanual manipulation task on a real platform that requires precise manipulation of relatively small objects. Task execution is robust against perturbation and sensor noise, because our method decides online whether or not to switch to alternative ASMs due to unexpected sensory signals.
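A toy sketch of the 30 Hz online decision loop, with the pair of classifiers abstracted as callables; this illustrates the idea only and is not the authors' implementation:

import time

def run_asm(plan, read_sensors, failure_clf, next_asm_clf, rate_hz=30):
    # Execute a stereotypical movement plan while monitoring multi-modal
    # features; one classifier flags readings that deviate from the stored
    # sensor experience, the other names the successive ASM to switch to.
    for command in plan:
        features = read_sensors()          # haptic, auditory, visual, virtual
        if failure_clf(features):
            return next_asm_clf(features)  # switch to this behavior
        # (send `command` to the robot here)
        time.sleep(1.0 / rate_hz)
    return None                            # plan finished nominally

# Stub usage:
print(run_asm(plan=[0.0, 0.1, 0.2],
              read_sensors=lambda: {"force_z": 0.0},
              failure_clf=lambda f: abs(f["force_z"]) > 5.0,
              next_asm_clf=lambda f: "regrasp"))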

Journal ArticleDOI
TL;DR: Comparing continuous versus bandwidth haptic steering guidance in terms of lane-keeping behavior, aftereffects, and satisfaction is useful for designers of haptic guidance systems and support critical thinking about the costs and benefits of automation support systems.
Abstract: OBJECTIVE: The aim of this study was to compare continuous versus bandwidth haptic steering guidance in terms of lane-keeping behavior, aftereffects, and satisfaction. BACKGROUND: An important human factors question is whether operators should be supported continuously or only when tolerance limits are exceeded. We aimed to clarify this issue for haptic steering guidance by investigating costs and benefits of both approaches in a driving simulator. METHODS: Thirty-two participants drove five trials, each with a different level of haptic support: no guidance (Manual); guidance outside a 0.5-m bandwidth (Band1); a hysteresis version of Band1, which guided back to the lane center once triggered (Band2); continuous guidance (Cont); and Cont with double feedback gain (ContS). Participants performed a reaction time task while driving. Toward the end of each trial, the guidance was unexpectedly disabled to investigate aftereffects. RESULTS: All four guidance systems prevented large lateral errors (>0.7 m). Cont and especially ContS yielded smaller lateral errors and higher time to line crossing than Manual, Band1, and Band2. Cont and ContS yielded short-lasting aftereffects, whereas Band1 and Band2 did not. Cont yielded higher self-reported satisfaction and faster reaction times than Band1. CONCLUSIONS: Continuous and bandwidth guidance both prevent large driver errors. Continuous guidance yields improved performance and satisfaction over bandwidth guidance at the cost of aftereffects and variability in driver torque (indicating human-automation conflicts). APPLICATION: The presented results are useful for designers of haptic guidance systems and support critical thinking about the costs and benefits of automation support systems.
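The compared schemes can be written as simple torque laws on the lateral error; the gain and the bandwidth half-width below are placeholders rather than the study's tuned values:

def guidance_torque(lat_err, mode, engaged=False, gain=1.0, half_width=0.25):
    # Returns (steering torque, engaged flag). "continuous" always corrects,
    # "band1" corrects only outside the bandwidth, "band2" keeps correcting
    # until the car is back near the lane centre once triggered (hysteresis).
    if mode == "continuous":
        return -gain * lat_err, engaged
    if mode == "band1":
        return (-gain * lat_err if abs(lat_err) > half_width else 0.0), engaged
    if mode == "band2":
        if abs(lat_err) > half_width:
            engaged = True
        elif abs(lat_err) < 0.02:          # back at the lane centre
            engaged = False
        return (-gain * lat_err if engaged else 0.0), engaged
    return 0.0, engaged                    # manual driving: no guidance

print(guidance_torque(0.4, "band1"), guidance_torque(0.4, "continuous"))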

Patent
22 Jun 2015
TL;DR: In this article, a wearable haptic feedback device is used to provide haptic information to a user associated with an electronic game, the electronic game associated with a primary object and a distal secondary object.
Abstract: An electronic game feedback system includes a wearable haptic feedback device and a processing circuit. The wearable haptic feedback device includes a plurality of haptic elements configured to provide haptic feedback to a user. The processing circuit is configured to provide a display to the user associated with an electronic game, the electronic game associated with a primary object and a distal secondary object; receive first positional data regarding the primary object; receive second data regarding the secondary object; and control operation of the wearable haptic feedback device to provide the feedback to the user based on the first positional data and the second data.
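A speculative example of how the two data streams might drive the wearable's intensity; the distance rule is invented for illustration and is not claimed by the patent:

import math

def feedback_intensity(primary_pos, secondary_pos, max_range=10.0):
    # Hypothetical rule: drive the haptic elements harder as the distal
    # secondary object (e.g. a projectile) nears the primary object
    # (e.g. the player's avatar).
    d = math.dist(primary_pos, secondary_pos)
    return max(0.0, 1.0 - d / max_range)

print(feedback_intensity((0.0, 0.0, 0.0), (3.0, 0.0, 0.0)))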

Journal ArticleDOI
TL;DR: In this article, the authors used a paired comparison design to investigate the effectiveness of 12 potential eco-driving interfaces, including a visual dashboard display, a multimodal visual dashboard and auditory tone combination, or a haptic accelerator pedal.
Abstract: This high-fidelity driving simulator study used a paired comparison design to investigate the effectiveness of 12 potential eco-driving interfaces. Previous work has demonstrated fuel economy improvements through the provision of in-vehicle eco-driving guidance using a visual or haptic interface. This study uses an eco-driving assistance system that advises the driver of the most fuel efficient accelerator pedal angle, in real time. Assistance was provided to drivers through a visual dashboard display, a multimodal visual dashboard and auditory tone combination, or a haptic accelerator pedal. The style of advice delivery was varied within each modality. The effectiveness of the eco-driving guidance was assessed via subjective feedback, and objectively through the pedal angle error between system-requested and participant-selected accelerator pedal angle. Comparisons amongst the six haptic systems suggest that drivers are guided best by a force feedback system, where a driver experiences a step change in force applied against their foot when they accelerate inefficiently. Subjective impressions also identified this system as more effective than a stiffness feedback system involving a more gradual change in pedal feedback. For interfaces with a visual component, drivers produced smaller pedal errors with an in-vehicle visual display containing second order information on the required rate of change of pedal angle, in addition to current fuel economy information. This was supported by subjective feedback. The presence of complementary audio alerts improved eco-driving performance and reduced visual distraction from the roadway. The results of this study can inform the further development of an in-vehicle assistance system that supports ‘green’ driving.
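The two haptic pedal strategies compared (a step change in opposing force versus a stiffness increase) can be sketched as pedal-force laws; the numbers are placeholders, not the study's calibration:

def pedal_feedback(pedal_angle, target_angle, mode, step_force=15.0, extra_stiffness=300.0):
    # Feedback is applied only when the driver presses beyond the fuel-
    # efficient pedal angle advised by the assistance system.
    error = pedal_angle - target_angle        # positive = inefficient
    if error <= 0.0:
        return 0.0
    if mode == "force":                       # step change in opposing force
        return step_force
    if mode == "stiffness":                   # force grows gradually with error
        return extra_stiffness * error
    raise ValueError(f"unknown mode: {mode}")

print(pedal_feedback(0.30, 0.25, "force"), pedal_feedback(0.30, 0.25, "stiffness"))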

Proceedings ArticleDOI
07 Sep 2015
TL;DR: PneuHaptic is a pneumatically-actuated arm-worn haptic interface that triggers a range of tactile sensations on the arm by alternately pressurizing and depressurizing a series of custom molded silicone chambers.
Abstract: PneuHaptic is a pneumatically-actuated arm-worn haptic interface. The system triggers a range of tactile sensations on the arm by alternately pressurizing and depressurizing a series of custom molded silicone chambers. We detail the implementation of our functional prototype and explore the possibilities for interaction enabled by the system.

Proceedings ArticleDOI
05 Nov 2015
TL;DR: Mango is presented, an editing tool for animators, including its rendering pipeline and perceptually-optimized interpolation algorithm for sparse vibrotactile grids, and the tactile animation object is introduced, a directly manipulated phantom tactile sensation.
Abstract: Chairs, wearables, and handhelds have become popular sites for spatial tactile display. Visual animators, already expert in using time and space to portray motion, could readily transfer their skills to produce rich haptic sensations if given the right tools. We introduce the tactile animation object, a directly manipulated phantom tactile sensation. This abstraction has two key benefits: 1) efficient, creative, iterative control of spatiotemporal sensations, and 2) the potential to support a variety of tactile grids, including sparse displays. We present Mango, an editing tool for animators, including its rendering pipeline and perceptually-optimized interpolation algorithm for sparse vibrotactile grids. In our evaluation, professional animators found it easy to create a variety of vibrotactile patterns, with both experts and novices preferring the tactile animation object over controlling actuators individually.
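Phantom tactile sensations on sparse grids are often rendered by splitting energy between the nearest physical actuators; the 1-D sketch below uses that classic panning model only to illustrate the idea, not Mango's perceptually optimized interpolation:

import math

def phantom_1d(position, intensity, actuator_xs):
    # Amplitude per actuator for a virtual vibration at `position` on a
    # sparse 1-D strip, energy-panned between the two nearest actuators.
    amps = [0.0] * len(actuator_xs)
    right = next((i for i, x in enumerate(actuator_xs) if x >= position),
                 len(actuator_xs) - 1)
    left = max(0, right - 1)
    if left == right:
        amps[left] = intensity
        return amps
    span = actuator_xs[right] - actuator_xs[left]
    beta = min(1.0, max(0.0, (position - actuator_xs[left]) / span))
    amps[left] = math.sqrt(1.0 - beta) * intensity
    amps[right] = math.sqrt(beta) * intensity
    return amps

print([round(a, 2) for a in phantom_1d(0.3, 1.0, [0.0, 0.25, 0.5, 0.75, 1.0])])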

Proceedings ArticleDOI
22 Jun 2015
TL;DR: Results showed that participants performed the grasping task more precisely and with grasping forces closer to the expected natural behavior when the proposed device provided haptic feedback.
Abstract: A novel wearable haptic device for modulating skin stretch at the fingertip is presented. Rendering of skin stretch in 3 degrees of freedom (DoF), with contact/no-contact capability, was implemented through rigid parallel kinematics. The novel asymmetrical three revolute-spherical-revolute (3-RSR) configuration allowed compact dimensions with minimum encumbrance of the hand workspace and minimum inter-finger interference. A differential method for solving the non-trivial inverse kinematics is proposed and implemented in real time for controlling the position of the skin tactor. Experiments involving the grasping of a virtual object were conducted using two devices (on the thumb and index fingers) in a group of 4 subjects: results showed that participants performed the grasping task more precisely and with grasping forces closer to the expected natural behavior when the proposed device provided haptic feedback.
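Differential inverse-kinematics schemes of this kind integrate joint velocities obtained from the Jacobian until the tactor reaches the commanded position. The generic resolved-rate sketch below stands in for the device's actual 3-RSR kinematics, demonstrated here on a toy two-link mechanism:

import numpy as np

def resolved_rate_ik(q, x_des, fk, jac, dt=0.001, steps=200, gain=50.0):
    # Generic differential IK: step joint angles along the pseudo-inverse of
    # the Jacobian so the end point converges to the desired position x_des.
    for _ in range(steps):
        err = x_des - fk(q)
        q = q + np.linalg.pinv(jac(q)) @ (gain * err) * dt
    return q

# Toy 2-link planar mechanism standing in for the real parallel kinematics:
L1, L2 = 0.03, 0.03
fk = lambda q: np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                         L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])
jac = lambda q: np.array([[-L1 * np.sin(q[0]) - L2 * np.sin(q[0] + q[1]), -L2 * np.sin(q[0] + q[1])],
                          [ L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),  L2 * np.cos(q[0] + q[1])]])
print(resolved_rate_ik(np.array([0.3, 0.6]), np.array([0.04, 0.02]), fk, jac))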

Journal ArticleDOI
12 Jan 2015
TL;DR: The results in a small group of subjects suggest that providing haptic information in the VE did not affect the validity of reaching and grasping movement, and comparable kinematics between environments and conditions is encouraging for the incorporation of high quality VEs in rehabilitation programs aimed at improving upper limb recovery.
Abstract: Reaching and grasping parameters with and without haptic feedback were characterized in people with chronic post-stroke behaviors. Twelve (67 ± 10 years) individuals with chronic stroke and arm/hand paresis (Fugl-Meyer Assessment-Arm: ≥ 46/66 pts) participated. Three dimensional (3-D) temporal and spatial kinematics of reaching and grasping movements to three objects (can: cylindrical grasp; screwdriver: power grasp; pen: precision grasp) in a physical environment (PE) with and without additional haptic feedback and a 3-D virtual environment (VE) with haptic feedback were recorded.