
Showing papers on "Haptic technology" published in 2017


Journal ArticleDOI
TL;DR: This paper presents a taxonomy and review of wearable haptic systems for the fingertip and the hand, focusing on those systems directly addressing wearability challenges, and reports on the future perspectives of the field.
Abstract: In the last decade, we have witnessed a drastic change in the form factor of audio and vision technologies, from heavy and grounded machines to lightweight devices that naturally fit our bodies. However, only recently, haptic systems have started to be designed with wearability in mind. The wearability of haptic systems enables novel forms of communication, cooperation, and integration between humans and machines. Wearable haptic interfaces are capable of communicating with the human wearers during their interaction with the environment they share, in a natural and yet private way. This paper presents a taxonomy and review of wearable haptic systems for the fingertip and the hand, focusing on those systems directly addressing wearability challenges. The paper also discusses the main technological and design challenges for the development of wearable haptic interfaces, and it reports on the future perspectives of the field. Finally, the paper includes two tables summarizing the characteristics and features of the most representative wearable haptic systems for the fingertip and the hand.

473 citations


Journal ArticleDOI
TL;DR: A novel control scheme is developed for a teleoperation system, combining the radial basis function (RBF) neural networks (NNs) and wave variable technique to simultaneously compensate for the effects caused by communication delays and dynamics uncertainties.
Abstract: In this paper, a novel control scheme is developed for a teleoperation system, combining radial basis function (RBF) neural networks (NNs) and the wave variable technique to simultaneously compensate for the effects caused by communication delays and dynamics uncertainties. The teleoperation system is set up with a TouchX joystick as the master device and a simulated Baxter robot arm as the slave robot. Haptic feedback is provided to the human operator to sense the interaction force between the slave robot and the environment when manipulating the stylus of the joystick. To utilize the workspace of the telerobot as much as possible, a matching process is carried out between the master and the slave based on their kinematics models. The closed loop inverse kinematics (CLIK) method and the RBF NN approximation technique are seamlessly integrated in the control design. To overcome the potential instability problem in the presence of delayed communication channels, wave variables and their corrections are effectively embedded into the control system, and Lyapunov-based analysis is performed to theoretically establish the closed-loop stability. Comparative experiments have been conducted for a trajectory tracking task under various communication delay conditions. Experimental results show that, in terms of tracking performance and force reflection, the proposed control approach outperforms conventional methods.

297 citations
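
As a point of reference for the wave variable technique mentioned above, the sketch below shows the classic Niemeyer-Slotine wave transform on a toy master-slave loop with a constant delay. The wave impedance, delay length, and spring environment are assumptions chosen for illustration; the paper's RBF NN compensation and workspace matching are not reproduced here.

```python
"""Illustrative sketch (not the paper's exact controller): the classic
wave-variable transform that the abstract combines with RBF neural
networks.  All gains, the spring environment, and the delay value are
invented for the example."""
from collections import deque
import math

b = 1.0          # wave impedance (assumed)
T = 20           # communication delay in samples (assumed)
dt = 0.001
k_env = 200.0    # stiffness of the simulated remote environment (assumed)

m2s = deque([0.0] * T)   # master -> slave wave channel
s2m = deque([0.0] * T)   # slave -> master wave channel

x_s, F_m = 0.0, 0.0
for i in range(2000):
    t = i * dt
    xdot_m = 0.1 * math.sin(2 * math.pi * t)       # operator-commanded velocity

    # Master side: encode forward wave from local velocity and returned wave.
    v_m = s2m.popleft()
    u_m = math.sqrt(2 * b) * xdot_m - v_m
    F_m = b * xdot_m - math.sqrt(2 * b) * v_m       # force reflected to operator
    m2s.append(u_m)

    # Slave side: decode commanded velocity, interact with the environment.
    u_s = m2s.popleft()
    F_s = k_env * x_s                               # environment contact force
    xdot_s = (math.sqrt(2 * b) * u_s - F_s) / b
    x_s += xdot_s * dt
    v_s = (b * xdot_s - F_s) / math.sqrt(2 * b)
    s2m.append(v_s)

print(f"final slave position {x_s:.4f} m, reflected force {F_m:.3f} N")
```

Because the wave channel stores energy rather than raw force/velocity signals, it stays passive under constant delay, which is the property the paper's corrections and Lyapunov analysis build on.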


Journal ArticleDOI
TL;DR: In this article, the authors review some of the most stringent design challenges for the Tactile Internet and propose first avenues for specific solutions to enable the Tactile Internet revolution.
Abstract: Prior Internet designs encompassed the fixed, mobile, and lately the "things" Internet. In a natural evolution to these, the notion of the Tactile Internet is emerging, which allows one to transmit touch and actuation in real-time. With voice and data communications driving the designs of the current Internets, the Tactile Internet will enable haptic communications, which in turn will be a paradigm shift in how skills and labor are digitally delivered globally. Design efforts for both the Tactile Internet and the underlying haptic communications are in their infancy. The aim of this article is thus to review some of the most stringent design challenges, as well as propose first avenues for specific solutions to enable the Tactile Internet revolution.

266 citations


Proceedings ArticleDOI
02 May 2017
TL;DR: This work explores how to add haptics to walls and other heavy objects in virtual reality by creating a counter force that pulls the user's arm backwards when a user tries to push such an object, and accomplishes this in a wearable form factor.
Abstract: We explore how to add haptics to walls and other heavy objects in virtual reality. When a user tries to push such an object, our system actuates the user's shoulder, arm, and wrist muscles by means of electrical muscle stimulation, creating a counter force that pulls the user's arm backwards. Our device accomplishes this in a wearable form factor. In our first user study, participants wearing a head-mounted display interacted with objects provided with different types of EMS effects. The repulsion design (visualized as an electrical field) and the soft design (visualized as a magnetic field) received high scores on "prevented me from passing through" as well as "realistic". In a second study, we demonstrate the effectiveness of our approach by letting participants explore a virtual world in which all objects provide haptic EMS effects, including walls, gates, sliders, boxes, and projectiles.

221 citations


Journal ArticleDOI
TL;DR: A systematic literature review conducted to investigate the state-of-the-art VR/AR technology relevant to plastic surgery found 35 studies that were ultimately selected and were categorized into 3 representative topics: VR/ AR-based preoperative planning, navigation, and training.
Abstract: Recently, virtual reality (VR) and augmented reality (AR) have received increasing attention, with the development of VR/AR devices such as head-mounted displays, haptic devices, and AR glasses. Medicine is considered to be one of the most effective applications of VR/AR. In this article, we describe a systematic literature review conducted to investigate the state-of-the-art VR/AR technology relevant to plastic surgery. The 35 studies that were ultimately selected were categorized into 3 representative topics: VR/AR-based preoperative planning, navigation, and training. In addition, future trends of VR/AR technology associated with plastic surgery and related fields are discussed.

181 citations


Journal ArticleDOI
TL;DR: It is shown that Shifty can enhance the perception of virtual objects changing in shape, especially in length and thickness, and specific combinations of haptic, visual and auditory feedback during the pick-up interaction help to compensate for visual-haptic mismatch perceived during the shifting process.
Abstract: We define the concept of Dynamic Passive Haptic Feedback (DPHF) for virtual reality by introducing the weight-shifting physical DPHF proxy object Shifty. This concept combines actuators known from active haptics and physical proxies known from passive haptics to construct proxies that automatically adapt their passive haptic feedback. We describe the concept behind our ungrounded weight-shifting DPHF proxy Shifty and the implementation of our prototype. We then investigate, in two experiments, how Shifty can enhance the user's perception of the virtual objects being interacted with by automatically changing its internal weight distribution. In the first experiment, we show that Shifty can enhance the perception of virtual objects changing in shape, especially in length and thickness. Here, Shifty was shown to increase the user's fun and perceived realism significantly, compared to an equivalent passive haptic proxy. In the second experiment, Shifty is used to pick up virtual objects of different virtual weights. The results show that Shifty enhances the perception of weight and thus the perceived realism by adapting its kinesthetic feedback to the picked-up virtual object. In the same experiment, we additionally show that specific combinations of haptic, visual and auditory feedback during the pick-up interaction help to compensate for the visual-haptic mismatch perceived during the shifting process.

176 citations
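
To make the weight-shifting idea concrete, here is a back-of-the-envelope calculation of how relocating an internal mass along a handheld tube changes the rotational inertia felt at the grip, which is the kind of kinesthetic cue a Shifty-style proxy modulates. The tube-with-sliding-mass model and all numbers are assumptions for illustration and are not taken from the paper.

```python
"""Back-of-the-envelope sketch (values assumed, not from the paper): how
sliding an internal weight along a handheld tube changes the rotational
inertia felt about the handle end."""

def inertia_about_handle(m_tube, L, m_weight, d):
    """Rotational inertia about the handle end of a uniform tube of mass
    m_tube and length L, with a point mass m_weight located d metres from
    the handle (point-mass approximation for the movable weight)."""
    return m_tube * L**2 / 3.0 + m_weight * d**2

L = 0.5            # tube length in m (assumed)
m_tube = 0.15      # tube mass in kg (assumed)
m_weight = 0.10    # movable weight in kg (assumed)

near = inertia_about_handle(m_tube, L, m_weight, 0.05)
far = inertia_about_handle(m_tube, L, m_weight, 0.45)
print(f"weight near handle: {near * 1e3:.1f} g*m^2")
print(f"weight near tip:    {far * 1e3:.1f} g*m^2  ({far / near:.1f}x larger)")
```

Even though the total mass never changes, the felt inertia when wielding the proxy roughly doubles or triples between the two extremes, which is why weight shifting can stand in for objects of different length or weight.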


Proceedings ArticleDOI
20 Oct 2017
TL;DR: Grabity, a wearable haptic device designed to simulate kinesthetic pad opposition grip forces and weight for grasping virtual objects in VR, is evaluated, finding promising ability to simulate different levels of weight with convincing object rigidity.
Abstract: Ungrounded haptic devices for virtual reality (VR) applications lack the ability to convincingly render the sensations of a grasped virtual object's rigidity and weight. We present Grabity, a wearable haptic device designed to simulate kinesthetic pad opposition grip forces and weight for grasping virtual objects in VR. The device is mounted on the index finger and thumb and enables precision grasps with a wide range of motion. A unidirectional brake creates rigid grasping force feedback. Two voice coil actuators create virtual force tangential to each finger pad through asymmetric skin deformation. These forces can be perceived as gravitational and inertial forces of virtual objects. The rotational orientation of the voice coil actuators is passively aligned with the real direction of gravity through a revolute joint, causing the virtual forces to always point downward. This paper evaluates the performance of Grabity through two user studies, finding promising ability to simulate different levels of weight with convincing object rigidity. The first user study shows that Grabity can convey various magnitudes of weight and force sensations to users by manipulating the amplitude of the asymmetric vibration. The second user study shows that users can differentiate different weights in a virtual environment using Grabity.

170 citations
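
The asymmetric-vibration principle behind Grabity's weight rendering can be illustrated with a zero-mean drive signal whose pull phase is short and strong while its return phase is long and weak; scaling the amplitude scales the perceived pulling force. The waveform shape, 40 Hz rate, and sample rate below are assumptions, not the authors' published actuator parameters.

```python
"""Illustrative sketch of the asymmetric-vibration idea: a drive signal
with a sharp pulse in one direction and a gentle return produces a net
perceived pull even though its mean is zero.  Waveform and rates are
assumed, not taken from the Grabity paper."""
import numpy as np

def asymmetric_cycle(samples_per_cycle=200, pull_fraction=0.2, amplitude=1.0):
    """One cycle of a zero-mean drive command: short strong 'pull' phase,
    long weak 'return' phase that balances the mean."""
    n_pull = int(samples_per_cycle * pull_fraction)
    n_ret = samples_per_cycle - n_pull
    pull = np.full(n_pull, amplitude)                   # brief, strong
    ret = np.full(n_ret, -amplitude * n_pull / n_ret)   # long, weak
    return np.concatenate([pull, ret])

fs = 8000                        # output sample rate in Hz (assumed)
cycle = asymmetric_cycle(samples_per_cycle=fs // 40)    # ~40 Hz vibration
signal = np.tile(cycle, 40)      # one second of drive signal
print(f"mean = {signal.mean():.6f} (zero-mean drive command)")
print(f"peak ratio = {signal.max() / -signal.min():.1f} (asymmetry felt as a pull)")
```

In the study above, varying the amplitude of such an asymmetric vibration is what lets users distinguish different virtual weights.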


Proceedings ArticleDOI
01 Feb 2017
TL;DR: The aim is to delve deeper into some of the challenges faced and propose solutions that would result in the breakthrough of the Tactile Internet.
Abstract: The notion of the Tactile Internet, which permits the transmission of actuation and touch in real time, is burgeoning. While voice and data communications have driven the structure of the current network, the Tactile Internet will enable haptic communications, which in turn will fundamentally change the way skills are delivered globally. The Tactile Internet is still at the grassroots level, meaning that research and development in this field is still in progress. The aim of this paper is to delve deeper into some of the challenges faced and to propose solutions that would lead to the breakthrough of the Tactile Internet.

169 citations


Proceedings ArticleDOI
02 May 2017
TL;DR: This work proposes Sparse Haptic Proxy, a class of passive haptics consisting of a set of geometric primitives that simulate touch feedback in elaborate virtual reality scenes; the system predicts users' intentions during interaction in the virtual space by analyzing their gaze and hand motions, and consequently redirects their hand to a matching primitive of the proxy.
Abstract: We propose a class of passive haptics that we call Sparse Haptic Proxy: a set of geometric primitives that simulate touch feedback in elaborate virtual reality scenes. Unlike previous passive haptics that replicate the virtual environment in physical space, a Sparse Haptic Proxy simulates a scene's detailed geometry by redirecting the user's hand to a matching primitive of the proxy. To bridge the divergence of the scene from the proxy, we augment an existing Haptic Retargeting technique with on-the-fly target remapping: We predict users' intentions during interaction in the virtual space by analyzing their gaze and hand motions, and consequently redirect their hand to a matching part of the proxy. We conducted three user studies on the haptic retargeting technique and implemented a system based on three main results: 1) The maximum angle participants found acceptable for retargeting their hand is 40°, with an average rating of 4.6 out of 5. 2) Tracking participants' eye gaze reliably predicts their touch intentions (97.5%), even while simultaneously manipulating the user's hand-eye coordination for retargeting. 3) Participants preferred minimized retargeting distances over better-matching surfaces of our Sparse Haptic Proxy when receiving haptic feedback for single-finger touch input. We demonstrate our system with two virtual scenes: a flight cockpit and a room quest game. While their scene geometries differ substantially, both use the same sparse haptic proxy to provide haptic feedback to the user during task completion.

162 citations
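
The core of haptic retargeting, on which Sparse Haptic Proxy builds, is a warp that gradually offsets the rendered hand so the virtual target and the physical proxy coincide at the moment of contact. The sketch below shows a minimal linear version of that warp; the gaze-based target prediction and the authors' specific blending are not modeled, and all positions are invented.

```python
"""Minimal sketch of a hand-redirection warp (blend rule and positions are
made up; not the authors' exact mapping).  The rendered hand is offset so
that when the real hand reaches the physical proxy, the virtual hand
reaches the virtual surface."""
import numpy as np

def retargeted_hand(real_hand, start, physical_target, virtual_target):
    """Warp the real hand position into the rendered (virtual) hand position.

    The offset (virtual_target - physical_target) is blended in according
    to how far the hand has progressed from the reach start point toward
    the physical target."""
    total = np.linalg.norm(physical_target - start)
    progress = np.clip(np.dot(real_hand - start, physical_target - start)
                       / (total**2 + 1e-9), 0.0, 1.0)
    return real_hand + progress * (virtual_target - physical_target)

start = np.array([0.0, 0.0, 0.0])
physical = np.array([0.4, 0.0, 0.3])      # proxy primitive (assumed position)
virtual = np.array([0.3, 0.1, 0.35])      # virtual surface (assumed position)

for s in (0.0, 0.5, 1.0):
    hand = start + s * (physical - start)  # real hand moving toward the proxy
    print(s, retargeted_hand(hand, start, physical, virtual))
```

At the start of the reach the warp is zero, and at contact the rendered hand sits exactly on the virtual surface, which is what keeps the visual-haptic mismatch below the 40° acceptability threshold reported above.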


Journal ArticleDOI
TL;DR: This paper presents an extreme learning machine (ELM)-based control scheme for uncertain robot manipulators to perform haptic identification, using ELM to compensate for the unknown nonlinearity in the manipulator dynamics.
Abstract: This paper presents an extreme learning machine (ELM)-based control scheme for uncertain robot manipulators to perform haptic identification. ELM is used to compensate for the unknown nonlinearity in the manipulator dynamics. The ELM enhanced controller ensures that the closed-loop controlled manipulator follows a specified reference model, in which the reference point as well as the feedforward force is adjusted after each trial for haptic identification of geometry and stiffness of an unknown object. A neural learning law is designed to ensure finite-time convergence of the neural weight learning, such that exact matching with the reference model can be achieved after the initial iteration. The usefulness of the proposed method is tested and demonstrated by extensive simulation studies.

145 citations
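
For readers unfamiliar with extreme learning machines, the sketch below shows the core ELM step: hidden-layer weights are drawn at random and frozen, and only the output weights are solved in closed form. It fits a made-up scalar nonlinearity standing in for the unknown manipulator dynamics; the paper's controller instead uses an adaptive neural learning law with a finite-time convergence guarantee.

```python
"""Sketch of the basic ELM regression step the abstract builds on: random
hidden-layer weights are fixed, and only the output weights are solved by
least squares.  The target function is a toy stand-in for the unknown
dynamics term, chosen purely for illustration."""
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: f(q) stands in for the unknown nonlinearity.
q = np.linspace(-2.0, 2.0, 200).reshape(-1, 1)
f = np.sin(3 * q) + 0.3 * q**2

# ELM: random projection + sigmoid, then closed-form output weights.
n_hidden = 50
W = rng.normal(size=(1, n_hidden))          # fixed random input weights
b = rng.normal(size=n_hidden)               # fixed random biases
H = 1.0 / (1.0 + np.exp(-(q @ W + b)))      # hidden-layer activations
beta = np.linalg.pinv(H) @ f                # least-squares output weights

error = np.abs(H @ beta - f).max()
print(f"max approximation error over the training range: {error:.4f}")
```

Because only the linear output layer is learned, the compensation term can be computed or adapted very cheaply, which is what makes ELM attractive inside a real-time control loop.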


Proceedings ArticleDOI
01 May 2017
TL;DR: A wearable system to provide situational awareness for blind and visually impaired people using techniques from computer vision and motion planning to identify walkable space, plan step-by-step a safe motion trajectory in the space, and recognize and locate certain types of objects.
Abstract: This work introduces a wearable system to provide situational awareness for blind and visually impaired people. The system includes a camera, an embedded computer and a haptic device to provide feedback when an obstacle is detected. The system uses techniques from computer vision and motion planning to (1) identify walkable space; (2) plan step-by-step a safe motion trajectory in the space, and (3) recognize and locate certain types of objects, for example the location of an empty chair. These descriptions are communicated to the person wearing the device through vibrations. We present results from user studies with low- and high-level tasks, including walking through a maze without collisions, locating a chair, and walking through a crowded environment while avoiding people.

Proceedings ArticleDOI
02 May 2017
TL;DR: It is demonstrated that fingertip skin deformation devices can provide a compelling haptic experience appropriate for virtual reality scenarios involving object manipulation.
Abstract: One of the main barriers to immersivity during object manipulation in virtual reality is the lack of realistic haptic feedback. Our goal is to convey compelling interactions with virtual objects, such as grasping, squeezing, pressing, lifting, and stroking, without requiring a bulky, world-grounded kinesthetic feedback device (traditional haptics) or the use of predetermined passive objects (haptic retargeting). To achieve this, we use a pair of finger-mounted haptic feedback devices that deform the skin on the fingertips to convey cutaneous force information from object manipulation. We show that users can perceive differences in virtual object weight and that they apply increasing grasp forces when lifting virtual objects as rendered mass is increased. Moreover, we show how naive users perceive changes of a virtual object's physical properties when we use skin deformation to render objects with varying mass, friction, and stiffness. These studies demonstrate that fingertip skin deformation devices can provide a compelling haptic experience appropriate for virtual reality scenarios involving object manipulation.

Journal ArticleDOI
TL;DR: This work demonstrates how novel mid-air haptic technology can make art more emotionally engaging and stimulating, especially abstract art that is often open to interpretation.
Abstract: The use of the senses of vision and audition as interactive means has dominated the field of Human-Computer Interaction (HCI) for decades, even though nature has provided us with many more senses for perceiving and interacting with the world around us. That said, it has become attractive for HCI researchers and designers to harness touch, taste, and smell in interactive tasks and experience design. In this paper, we present research and design insights gained throughout an interdisciplinary collaboration on a six-week multisensory display – Tate Sensorium – exhibited at the Tate Britain art gallery in London, UK. This is a unique, first-of-its-kind case study on how to design art experiences whilst considering all the senses (i.e., vision, sound, touch, smell, and taste), in particular touch, which we exploited by capitalizing on a novel haptic technology, namely, mid-air haptics. We first describe the overall setup of Tate Sensorium and then move on to describing in detail the design process of the mid-air haptic feedback and its integration with sound for the Full Stop painting by John Latham (1961). This was the first time that mid-air haptic technology was used in a museum context over a prolonged period of time and integrated with sound to enhance the experience of visual art. As part of an interdisciplinary team of curators, sensory designers, and sound artists, we selected a total of three variations of the mid-air haptic experience (i.e., haptic patterns), which were alternated at dedicated times throughout the six-week exhibition. We collected questionnaire-based feedback from 2500 visitors and conducted 50 interviews to gain quantitative and qualitative insights on visitors' experiences and emotional reactions. Whilst the questionnaire results are generally very positive, with only a small variation of the visitors' arousal ratings across the three tactile experiences designed for the Full Stop painting, the interview data shed light on the differences in the visitors' subjective experiences. Our findings suggest that multisensory designers and art curators can strike a balance between surprising experiences and the possibility of free exploration for visitors. In addition, participants expressed that experiencing art with the combination of mid-air haptics and sound was immersive and provided an uplifting experience of touching without touch. We are convinced that the insights gained from this large-scale and real-world field exploration of multisensory experience design exploiting a new and emerging technology provide a solid starting point for the HCI community, creative industries, and art curators to think beyond conventional art experiences. Specifically, our work demonstrates how novel mid-air haptic technology can make art more emotionally engaging and stimulating, especially abstract art that is often open to interpretation.

Proceedings ArticleDOI
02 May 2017
TL;DR: In ThermoVR, five thermal feedback modules were integrated on the HMD to provide thermal feedback directly onto the user's face, demonstrating ThermoVR's directional cueing and immersive thermal experience.
Abstract: Head-mounted displays (HMDs) offer a promising opportunity for providing haptic feedback on the head for an enhanced immersive experience. In ThermoVR, we integrated five thermal feedback modules on the HMD to provide thermal feedback directly onto the user's face. We conducted evaluations with 15 participants using two approaches: firstly, we provided simultaneously actuated thermal stimulations (hot and cold) as directional cues and evaluated the accuracy of recognition; secondly, we evaluated the overall immersive thermal experience that users have when provided with thermal feedback on the face. Results indicated that the recognition accuracy for cold stimuli was approximately 89.5%, while the accuracy for hot stimuli was 68.6%. Also, participants reported that they felt a higher level of immersion when all modules on the face were actuated simultaneously (hot and cold). The presented applications demonstrate ThermoVR's directional cueing and immersive experience.

Journal ArticleDOI
TL;DR: A summary of the requirements for haptic communications is offered, followed by an overview of challenges in realizing the Tactile Internet, and possible solutions to these challenges are proposed and discussed.
Abstract: The Tactile Internet presently constitutes a vision of an Internet over which, in addition to current communications modalities, a sense of touch can be transported. In that case, people would no longer need to be physically near the systems they operate, but could control them remotely. The main problem that needs to be solved to realize the Tactile Internet is summarized by the “1 ms challenge.” If the response time of a system is below 1 ms, the end-user will not be able to tell the difference between controlling a system locally or from another location. This paper offers a summary of the requirements for haptic communications, followed by an overview of challenges in realizing the Tactile Internet. In addition, possible solutions to these challenges are proposed and discussed. For example, the development of the fifth generation mobile communication networks will provide a good foundation upon which a Tactile Internet could be built. This paper also describes the design of a modular testbed needed for testing of a wide variety of haptic system applications.
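
A quick back-of-the-envelope calculation shows why the 1 ms challenge mentioned above is so restrictive: even if only part of the budget is spent on propagation in optical fibre, the operator and the remote system can be at most a few tens of kilometres apart, which is why edge computing and 5G are viewed as enablers. The budget split below is an assumed illustration, not a figure from the paper.

```python
"""Back-of-the-envelope check of the '1 ms challenge'.  Only the physics
(signal speed in fibre, roughly 2/3 of c) uses standard values; the split
of the latency budget is an assumption for illustration."""
c_fibre_km_per_ms = 200.0      # ~2e5 km/s propagation speed in optical fibre

budget_ms = 1.0                # end-to-end round-trip target
processing_ms = 0.3            # assumed: codecs, rendering, protocol stacks
air_interface_ms = 0.3         # assumed: radio access on both ends

propagation_ms = budget_ms - processing_ms - air_interface_ms
one_way_km = propagation_ms * c_fibre_km_per_ms / 2   # there and back

print(f"propagation budget: {propagation_ms:.1f} ms")
print(f"max operator-to-robot distance: ~{one_way_km:.0f} km")
```

With these assumed numbers the physical separation is limited to roughly 40 km, which motivates placing computation close to the user rather than in distant data centres.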

Journal ArticleDOI
TL;DR: This survey provides an overview of work on haptic technology for social touch, reflecting on the current state of research into social touch technology, and providing suggestions for future research and applications.
Abstract: This survey provides an overview of work on haptic technology for social touch. Social touch has been studied extensively in psychology and neuroscience. With the development of new technologies, it is now possible to engage in social touch at a distance or engage in social touch with artificial social agents. Social touch research has inspired research into technology mediated social touch, and this line of research has found effects similar to actual social touch. The importance of haptic stimulus qualities, multimodal cues, and contextual factors in technology mediated social touch is discussed. This survey is concluded by reflecting on the current state of research into social touch technology, and providing suggestions for future research and applications.

Journal ArticleDOI
TL;DR: Providing haptic feedback through the considered wearable devices significantly improved performance in all tasks, and subjects significantly preferred the conditions providing wearable haptic feedback.
Abstract: Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games "Pokemon GO" and "Ingress" and the Google Translate real-time sign interpretation app. Even though AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real piece of cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved performance in all tasks. Moreover, subjects significantly preferred the conditions providing wearable haptic feedback.


Journal ArticleDOI
TL;DR: A model in which haptic information, provided by touch and proprioception, enables interacting individuals to estimate the partner’s movement goal and use it to improve their own motor performance is proposed.
Abstract: From a parent helping to guide their child during their first steps, to a therapist supporting a patient, physical assistance enabled by haptic interaction is a fundamental modus for improving motor abilities. However, what movement information is exchanged between partners during haptic interaction, and how this information is used to coordinate and assist others, remains unclear [1]. Here, we propose a model in which haptic information, provided by touch and proprioception [2], enables interacting individuals to estimate the partner's movement goal and use it to improve their own motor performance. We use an empirical physical interaction task [3] to show that our model can explain human behaviours better than existing models of interaction in the literature [4-8]. Furthermore, we experimentally verify our model by embodying it in a robot partner and checking that it induces the same improvements in motor performance and learning in a human individual as interacting with a human partner. These results promise collaborative robots that provide human-like assistance, and suggest that movement goal exchange is the key to physical assistance. Takagi and colleagues present a model of how human pairs learn movements through touch. Participants learn in the same way when the model is applied to a robotic partner. This is important for the development of physical assistance robotics.

Journal ArticleDOI
TL;DR: The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness).
Abstract: Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.
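
A schematic rendering loop in the spirit of the three-component model described above (friction, tapping transients, texture vibrations) might look like the sketch below. The coefficients, the decaying-sinusoid tap, and the noise-based texture term are placeholders rather than the data-driven models recorded from real surfaces in the paper.

```python
"""Schematic sketch of a three-component surface rendering step.  All
parameter values and the simplified tap/texture models are assumptions,
not the paper's recorded haptic models."""
import math, random

mu = 0.4        # Coulomb friction coefficient (assumed)
k_tap = 30.0    # tapping transient gain (assumed)
tap_freq = 300  # Hz, frequency of the decaying tap transient (assumed)

def render_step(t_since_contact, normal_force, tangential_velocity):
    """Return (friction_force, vibration_command) for one servo tick."""
    # 1) Friction: opposes tangential motion, scales with normal force.
    friction = -mu * normal_force * math.copysign(1.0, tangential_velocity)

    # 2) Tapping transient: brief decaying sinusoid right after impact.
    tap = (k_tap * normal_force * math.exp(-80.0 * t_since_contact)
           * math.sin(2 * math.pi * tap_freq * t_since_contact))

    # 3) Texture vibration: grows with sliding speed and pressing force.
    texture = 0.05 * normal_force * abs(tangential_velocity) * random.uniform(-1, 1)

    return friction, tap + texture

print(render_step(t_since_contact=0.002, normal_force=1.5, tangential_velocity=0.08))
```

The study's central finding maps directly onto these three terms: each one matters most for surfaces that are strongly slippery, hard, or rough, respectively.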

Journal ArticleDOI
TL;DR: Overall results show that participants better controlled interaction forces when the cutaneous feedback was active, with significant differences between the visual and visuo-haptic experimental conditions.
Abstract: A novel wearable haptic device for modulating contact forces at the fingertip is presented. Rendering of forces by skin deformation in three degrees of freedom (DoF), with contact/no-contact capabilities, was implemented through rigid parallel kinematics. The novel asymmetrical three revolute-spherical-revolute (3-RSR) configuration allowed compact dimensions with minimum encumbrance of the hand workspace. The device was designed to render constant to low frequency deformation of the fingerpad in three DoF, combining light weight with relatively high output forces. A differential method for solving the non-trivial inverse kinematics is proposed and implemented in real time for controlling the device. The first experimental activity evaluated discrimination of different fingerpad stretch directions in a group of five subjects. The second experiment, enrolling 19 subjects, evaluated cutaneous feedback provided in a virtual pick-and-place manipulation task. Stiffness of the fingerpad plus device was measured and used to calibrate the physics of the virtual environment. The third experiment with 10 subjects evaluated interaction forces in a virtual lift-and-hold task. Although with different performance in the two manipulation experiments, overall results show that participants better controlled interaction forces when the cutaneous feedback was active, with significant differences between the visual and visuo-haptic experimental conditions.
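
The differential inverse-kinematics approach mentioned in the abstract can be illustrated in generic form with a damped least-squares update, dq = J^T (J J^T + lambda I)^{-1} e. The sketch below applies it to a toy planar three-link chain, not to the paper's asymmetrical 3-RSR mechanism, whose Jacobian is not given in the abstract; link lengths, damping, and the target are assumptions.

```python
"""Generic sketch of a differential (Jacobian-based) inverse-kinematics
update.  The chain is a toy planar 3-link arm standing in for the real
device; all numeric values are assumed."""
import numpy as np

L = np.array([0.02, 0.02, 0.015])   # toy link lengths in metres (assumed)

def fk(q):
    """Planar forward kinematics: tip position of the 3-link chain."""
    a = np.cumsum(q)
    return np.array([np.sum(L * np.cos(a)), np.sum(L * np.sin(a))])

def jac(q):
    """2x3 position Jacobian of the same chain."""
    a = np.cumsum(q)
    J = np.zeros((2, 3))
    for j in range(3):
        J[0, j] = -np.sum(L[j:] * np.sin(a[j:]))
        J[1, j] = np.sum(L[j:] * np.cos(a[j:]))
    return J

def dls_ik_step(q, target, damping=1e-4):
    """One damped least-squares update: dq = J^T (J J^T + lambda*I)^-1 e."""
    e = target - fk(q)
    J = jac(q)
    dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), e)
    return q + dq

q = np.array([0.3, 0.4, 0.2])            # initial joint angles in rad (assumed)
target = np.array([0.03, 0.03])          # desired fingerpad contact point (assumed)
for _ in range(100):
    q = dls_ik_step(q, target)
print("remaining position error:", np.linalg.norm(fk(q) - target))
```

Iterating small Jacobian-based corrections like this is what makes real-time control feasible when the closed-form inverse kinematics of a parallel mechanism is non-trivial.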

Journal ArticleDOI
17 May 2017 - Sensors
TL;DR: The proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in virtual reality.
Abstract: This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in virtual reality.

Journal ArticleDOI
TL;DR: These experiments show that three DoF skin deformation enables both stiffness and friction discrimination capability in the absence of kinesthetic force feedback, demonstrating that users can perceive differences in surface friction without world-grounded kinesthetic forces.
Abstract: Virtual reality systems would benefit from a compelling force sensory substitute when workspace or stability limitations prevent the use of kinesthetic force feedback systems. We present a wearable fingertip haptic device with the ability to make and break contact in addition to rendering both shear and normal skin deformation to the fingerpad. A delta mechanism with a novel bias spring and tether actuator relocation method enables the use of high-end motors and encoders, allowing precise device control: 10 Hz bandwidth and 0.255 mm RMS tracking error were achieved during testing. In the first of two experiments, participants determined the orientation of a stiff region in a surrounding compliant virtual surface with an average angular error of 7.6 degrees, similar to that found in previous studies using traditional force feedback. In the second experiment, we evaluated participants' ability to interpret differences in friction. The Just Noticeable Difference (JND) of surface friction coefficient discrimination using our skin deformation device was 0.20, corresponding to a reference friction coefficient of 0.5. While higher than that found using kinesthetic feedback, this demonstrates that users can perceive differences in surface friction without world-grounded kinesthetic forces. These experiments show that three DoF skin deformation enables both stiffness and friction discrimination capability in the absence of kinesthetic force feedback.

Journal ArticleDOI
TL;DR: This study examines how users engage with interactive visual rotation and tactile simulation features while browsing fashion clothing products on touch screen devices and thus contributes to retail touch screen research that previously focused on in-store kiosks and window displays.

Journal ArticleDOI
TL;DR: It is found that haptic designers follow a familiar design process but face specific challenges when working with haptics; this paper captures and summarizes these challenges, makes concrete recommendations to conquer them, and presents a vision for the future of haptic experience design.
Abstract: From simple vibrations to roles in complex multisensory systems, haptic technology is often a critical, expected component of user experience, one face of the rapid progression towards blended physical-digital interfaces. Haptic experience design, which is woven together with other multisensory design efforts, is now becoming part of many designers' jobs. We can expect it to present unique challenges, and yet we know almost nothing of what it looks like in the wild due to the field's relative youth, its technical complexity, the multisensory interactions between haptics, sight, and sound, and the difficulty of accessing practitioners in professional and proprietary environments. In this paper, we analyze interviews with six professional haptic designers to document and articulate haptic experience design by observing designers' goals and processes and finding themes at three levels of scope: the multisensory nature of haptic experiences, a map of the collaborative ecosystem, and the cultural context of haptics. Our findings are augmented by feedback obtained in a recent design workshop at an international haptics conference. We find that haptic designers follow a familiar design process, but face specific challenges when working with haptics. We capture and summarize these challenges, make concrete recommendations to conquer them, and present a vision for the future of haptic experience design.

Journal ArticleDOI
31 Jan 2017
TL;DR: This work focuses on a prototypical task of tactile exploration over surface features such as edges or ridges, which is a principal exploratory procedure of humans to recognize object shape, and brings together active perception and haptic exploration as instantiations of a common active touch algorithm.
Abstract: A key unsolved problem in tactile robotics is how to combine tactile perception and control to interact robustly and intelligently with the surroundings. Here, we focus on a prototypical task of tactile exploration over surface features such as edges or ridges, which is a principal exploratory procedure of humans to recognize object shape. Our methods were adapted from an approach for biomimetic active touch that perceives stimulus location and identity while controlling location to aid perception. With minor modification to the control policy, to rotate the sensor to maintain a relative orientation and move tangentially (tactile servoing), the method applies also to tactile exploration. Robust exploratory tactile servoing is then attained over various two-dimensional objects, ranging from the edge of a circular disk, a volute laminar, and circular or spiral ridges. Conceptually, the approach brings together active perception and haptic exploration as instantiations of a common active touch algorithm, and has potential to generalize to more complex tasks requiring the flexibility and robustness of human touch.
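
Conceptually, the tactile servoing policy described above amounts to re-orienting the sensor so the perceived feature stays at a fixed bearing while stepping tangentially along it. The loop below sketches that idea with hand-picked gains and a stubbed-out perception step; it is not the authors' trained perception model or their exact control law.

```python
"""Conceptual sketch of a tactile-servoing loop: keep the sensed feature
at a fixed bearing relative to the sensor while stepping tangentially
along it.  Gains, step size, and the fake sensor readings are assumed."""
import math

def servo_step(pose, sensed_bearing_deg, sensed_offset_mm,
               step_mm=2.0, k_theta=0.5, k_r=0.3):
    """pose = (x_mm, y_mm, heading_deg).  Rotate to null the bearing error,
    correct the radial offset, then advance tangentially along the feature."""
    x, y, heading = pose
    heading += k_theta * sensed_bearing_deg          # re-orient toward the edge tangent
    radial = k_r * sensed_offset_mm                  # stay centred on the feature
    hx, hy = math.cos(math.radians(heading)), math.sin(math.radians(heading))
    # Tangential advance plus a small lateral correction along the normal.
    x += step_mm * hx - radial * hy
    y += step_mm * hy + radial * hx
    return (x, y, heading)

pose = (0.0, 0.0, 0.0)
for _ in range(5):
    # In the real system these two numbers come from the tactile perception stage.
    pose = servo_step(pose, sensed_bearing_deg=10.0, sensed_offset_mm=-0.5)
print(pose)
```

The point of the paper is that the same perception-plus-control structure, with only minor changes to the policy, covers both active perception and exploratory contour following.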

Journal ArticleDOI
19 Jan 2017
TL;DR: This is the first soft sensorized 3-D-printed gripper coupled with a soft fabric-based haptic glove, with the potential to improve robotic grasping manipulation by introducing haptic feedback to the users.
Abstract: This paper presents a hybrid tele-manipulation system, comprising a sensorized 3-D-printed soft robotic gripper and a soft fabric-based haptic glove, that aims at improving grasping manipulation and providing sensing feedback to the operators. The flexible 3-D-printed soft robotic gripper broadens what a robotic gripper can do, especially for grasping tasks where delicate objects, such as glassware, are involved. It consists of four pneumatic finger actuators, casings with through holes for housing the actuators, and an adjustable base. The grasping length and width can be configured easily to suit a variety of objects. The soft haptic glove is equipped with flex sensors and a soft pneumatic haptic actuator, which enables the users to control the grasping, to determine whether the grasp is successful, and to identify the grasped object shape. The fabric-based soft pneumatic haptic actuator can simulate haptic perception by producing force feedback to the users. Both the soft pneumatic finger actuator and the haptic actuator involve simple fabrication techniques, namely a 3-D-printing approach and a fabric-based approach, respectively, which reduce fabrication complexity compared to the steps involved in a traditional silicone-based approach. The sensorized soft robotic gripper is capable of picking up and holding a wide variety of objects in this study, ranging from lightweight delicate objects weighing less than 50 g to objects weighing 1100 g. The soft haptic actuator can produce forces of up to 2.1 N, which is more than the minimum force of 1.5 N needed to stimulate haptic perception. The subjects were able to differentiate two objects with significant shape differences in the pilot test. Compared to existing soft grippers, this is the first soft sensorized 3-D-printed gripper coupled with a soft fabric-based haptic glove, and it has the potential to improve robotic grasping manipulation by introducing haptic feedback to the users.

Proceedings ArticleDOI
01 Sep 2017
TL;DR: A step towards soft robot grippers capable of a complex range of motions and proprioception, which will help future robots better understand the environments with which they interact, and has the potential to increase physical safety in human-robot interaction.
Abstract: Robots are becoming increasingly prevalent in our society in forms where they are assisting or interacting with humans in a variety of environments, and thus they must have the ability to sense and detect objects by touch. An ongoing challenge for soft robots has been incorporating flexible sensors that can recognize complex motions and close the loop for tactile sensing. We present sensor skins that enable haptic object visualization when integrated on a soft robotic gripper that can twist an object. First, we investigate how the design of the actuator modules impact bend angle and motion. Each soft finger is molded using a silicone elastomer, and consists of three pneumatic chambers which can be inflated independently to achieve a range of complex motions. Three fingers are combined to form a soft robotic gripper. Then, we manufacture and attach modular, flexible sensory skins on each finger to measure deformation and contact. These sensor measurements are used in conjunction with an analytical model to construct 2D and 3D tactile object models. Our results are a step towards soft robot grippers capable of a complex range of motions and proprioception, which will help future robots better understand the environments with which they interact, and has the potential to increase physical safety in human-robot interaction. Please see the accompanying video for additional details.

Journal ArticleDOI
TL;DR: This paper describes how haptic technology works, along with its devices, applications, and disadvantages, and provides a description of some of its future applications and a few limitations of the technology.

Journal ArticleDOI
06 Sep 2017
TL;DR: A cost-effective, wireless, and wearable vibrotactile haptic device for stiffness perception during interaction with virtual objects is developed; according to the psychometric experiment results, the average Weber fraction of 0.39 for visual-only feedback improved to 0.25 when tactile feedback was added.
Abstract: In this paper, we discuss the development of a cost-effective, wireless, and wearable vibrotactile haptic device for stiffness perception during interaction with virtual objects. Our experimental setup consists of a haptic device with five vibrotactile actuators and a virtual reality environment built in Unity 3D, integrating the Oculus Rift head-mounted display (HMD) and the Leap Motion controller. The virtual environment is able to capture touch inputs from users. Interaction forces are then rendered at 500 Hz and fed back to the wearable setup, stimulating the fingertips with ERM vibrotactile actuators. The amplitude and frequency of vibrations are modulated proportionally to the interaction force to simulate the stiffness of a virtual object. A quantitative and qualitative study was conducted to compare the discrimination of stiffness of a virtual linear spring in three sensory modalities: visual-only feedback, tactile-only feedback, and their combination. A common psychophysics method, the Two Alternative Forced Choice (2AFC) approach, is used for quantitative analysis using Just Noticeable Differences (JND) and Weber fractions (WF). According to the psychometric experiment results, the average Weber fraction of 0.39 for visual-only feedback improved to 0.25 with the addition of tactile feedback.
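
To show where figures such as the Weber fractions of 0.39 and 0.25 come from, the sketch below extracts a JND from 2AFC data by locating the 75% correct point of the psychometric curve and divides it by the reference stimulus. The response proportions are fabricated for illustration; a full analysis would fit a cumulative-Gaussian psychometric function rather than interpolating linearly.

```python
"""Sketch of extracting a JND and Weber fraction from 2AFC data: find the
comparison stiffness at the 75 % correct point and divide the resulting
JND by the reference.  The data below are fabricated for illustration."""
import numpy as np

reference = 100.0                                     # reference stiffness in N/m (assumed)
comparisons = np.array([105, 115, 125, 135, 145.0])   # comparison stimuli (assumed)
p_correct = np.array([0.55, 0.62, 0.74, 0.86, 0.95])  # fraction judged 'stiffer' (fabricated)

# Linear interpolation of the 75 %-correct threshold (a common shortcut).
threshold = np.interp(0.75, p_correct, comparisons)
jnd = threshold - reference
print(f"JND ~= {jnd:.1f} N/m, Weber fraction ~= {jnd / reference:.2f}")
```

A smaller Weber fraction means finer stiffness discrimination, which is why the drop from 0.39 to 0.25 reported above indicates that the added vibrotactile feedback helps.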