
Showing papers on "Haptic technology published in 2016"


Proceedings ArticleDOI
07 May 2016
TL;DR: The study results indicate that all the haptic retargeting techniques improve the sense of presence when compared to typical wand-based 3D control of virtual objects, and a hybrid technique which combines both world and body manipulation achieves the highest satisfaction and presence scores.
Abstract: Manipulating a virtual object with appropriate passive haptic cues provides a satisfying sense of presence in virtual reality. However, scaling such experiences to support multiple virtual objects is a challenge as each one needs to be accompanied with a precisely-located haptic proxy object. We propose a solution that overcomes this limitation by hacking human perception. We have created a framework for repurposing passive haptics, called haptic retargeting, that leverages the dominance of vision when our senses conflict. With haptic retargeting, a single physical prop can provide passive haptics for multiple virtual objects. We introduce three approaches for dynamically aligning physical and virtual objects: world manipulation, body manipulation and a hybrid technique which combines both world and body manipulation. Our study results indicate that all our haptic retargeting techniques improve the sense of presence when compared to typical wand-based 3D control of virtual objects. Furthermore, our hybrid haptic retargeting achieved the highest satisfaction and presence scores while limiting the visible side-effects during interaction.
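The body-warping idea behind haptic retargeting can be pictured with a small sketch (hypothetical helper code, not the authors' released implementation): the rendered hand is offset in proportion to the real hand's progress toward the physical prop, so visual dominance hides the mismatch between prop and virtual target.

```python
# Body-warping sketch for haptic retargeting (illustrative, assumed geometry):
# as the real hand travels toward the physical prop, the rendered hand is
# shifted so it reaches the virtual target at the same moment.
import numpy as np

def warp_hand(real_hand, start, physical_prop, virtual_target):
    """Virtual hand position for a given real hand position (body warping)."""
    total = np.linalg.norm(physical_prop - start)
    progress = np.clip(np.linalg.norm(real_hand - start) / total, 0.0, 1.0)
    offset = virtual_target - physical_prop      # mismatch to hide from the user
    return real_hand + progress * offset         # ramp the offset in gradually

start = np.array([0.0, 0.0, 0.0])
prop = np.array([0.3, 0.0, 0.4])                 # the single physical prop
target = np.array([0.1, 0.0, 0.5])               # one of several virtual objects
print(warp_hand(prop, start, prop, target))      # equals `target`: touches coincide
```

At the prop, the accumulated offset equals the full prop-to-target vector, so the physical contact and the visual contact happen simultaneously.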

353 citations


Journal ArticleDOI
TL;DR: The future of robotic surgery involves cost reduction, development of new platforms and technologies, creation and validation of curriculum and virtual simulators, and conduction of randomized clinical trials to determine the best applications of robotics.
Abstract: The idea of reproducing himself with the use of a mechanical robot structure has been in man's imagination for the last 3000 years. However, the use of robots in medicine has only 30 years of history. The application of robots in surgery originates from the need of modern man to achieve two goals: telepresence and the performance of repetitive and accurate tasks. The first "robot surgeon" used on a human patient was the PUMA 200 in 1985. In the 1990s, scientists developed the concept of the "master-slave" robot, which consisted of a robot with remote manipulators controlled by a surgeon at a surgical workstation. Despite the lack of force and tactile feedback, technical advantages of robotic surgery, such as 3D vision, a stable and magnified image, EndoWrist instruments, physiologic tremor filtering, and motion scaling, have been considered fundamental to overcoming many of the limitations of laparoscopic surgery. Since the approval of the da Vinci(®) robot by international agencies, American, European, and Asian surgeons have proved its feasibility and safety for the performance of many different robot-assisted surgeries. Comparative studies of robotic and laparoscopic surgical procedures in general surgery have shown similar results with regard to perioperative, oncological, and functional outcomes. However, higher costs and the lack of haptic feedback represent the major limitations preventing current robotic technology from becoming the standard technique of minimally invasive surgery worldwide. Therefore, the future of robotic surgery involves cost reduction, development of new platforms and technologies, creation and validation of curricula and virtual simulators, and the conduction of randomized clinical trials to determine the best applications of robotics.

246 citations


Proceedings ArticleDOI
16 Oct 2016
TL;DR: It is found that haptic feedback significantly increases the accuracy of VR interaction, most effectively by rendering high-fidelity shape output as in the case of mechanically-actuated hand-held controllers.
Abstract: We present an investigation of mechanically-actuated hand-held controllers that render the shape of virtual objects through physical shape displacement, enabling users to feel 3D surfaces, textures, and forces that match the visual rendering. We demonstrate two such controllers, NormalTouch and TextureTouch, which are tracked in 3D and produce spatially-registered haptic feedback to a user's finger. NormalTouch haptically renders object surfaces and provides force feedback using a tiltable and extrudable platform. TextureTouch renders the shape of virtual objects including detailed surface structure through a 4×4 matrix of actuated pins. By moving our controllers around while keeping their finger on the actuated platform, users obtain the impression of a much larger 3D shape by cognitively integrating output sensations over time. Our evaluation compares the effectiveness of our controllers with the two de-facto standards in Virtual Reality controllers: device vibration and visual feedback only. We find that haptic feedback significantly increases the accuracy of VR interaction, most effectively by rendering high-fidelity shape output as in the case of our controllers.
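The pin-matrix rendering of TextureTouch can be pictured with a toy sketch (the pin pitch, travel, and command resolution below are assumptions, not the device's actual firmware): sample the virtual surface's height field at each pin location under the finger and quantize each sample to a pin-extension command.

```python
# Sketch of driving a 4x4 pin array from a virtual surface (assumed geometry):
# sample the object's height field around the tracked finger position and
# quantize each sample to a per-pin extension command.
import numpy as np

def pins_from_surface(height_fn, center, pitch=0.005, levels=256, max_h=0.01):
    """Sample a 4x4 grid of surface heights (meters) around `center`."""
    idx = (np.arange(4) - 1.5) * pitch           # pin offsets about the center
    xs, ys = np.meshgrid(center[0] + idx, center[1] + idx)
    h = np.clip(height_fn(xs, ys), 0.0, max_h)
    return np.round(h / max_h * (levels - 1)).astype(int)   # per-pin command

bump = lambda x, y: 0.01 * np.exp(-(x**2 + y**2) / 1e-4)    # toy surface feature
cmds = pins_from_surface(bump, center=(0.0, 0.0))
print(cmds.shape)   # (4, 4): one command per actuated pin
```

Moving `center` with the tracked controller while re-sampling each frame is what lets the user integrate a much larger shape over time, as the abstract describes.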

228 citations


Journal ArticleDOI
P. Geethanjali1
TL;DR: Myoelectric control-based prosthetic hands help restore amputees' activities of daily living and improve users' self-esteem; researchers are refining their operating features to better suit user requirements.
Abstract: Myoelectric signals (MES) have been used in various applications, in particular for identification of user intention, to potentially control assistive devices for amputees, orthotic devices, and exoskeletons in order to augment the capability of the user. MES are also used to estimate force and, hence, torque to actuate the assistive device. The application of MES is not limited to assistive devices; they also find potential applications in teleoperation of robots, haptic devices, virtual reality, and so on. The myoelectric control-based prosthetic hand helps restore amputees' activities of daily living and improve the self-esteem of the user. Not all myoelectric control-based prosthetic hands operate in the same way; they vary in sensing input, deciphering the signals, and actuating the prosthetic hand. Researchers are focusing on improving the functionality of the prosthetic hand to suit user requirements with different operating features. The myoelectric control differs in operation to accommodate various external factors. This article reviews the state of the art of myoelectric prosthetic hands, giving a description of each control strategy.

228 citations


Proceedings ArticleDOI
07 May 2016
TL;DR: Dexmo as mentioned in this paper is an exoskeleton system for motion capture and force feedback in virtual reality applications, which combines multiple types of sensors, actuation units and link rod structures to provide users with a pleasant virtual reality experience.
Abstract: We present Dexmo: an inexpensive and lightweight mechanical exoskeleton system for motion capture and force feedback in virtual reality applications. Dexmo combines multiple types of sensors, actuation units, and link rod structures to provide users with a pleasant virtual reality experience. The device tracks the user's motion and uniquely provides passive force feedback. In combination with a 3D-rendered graphical environment, Dexmo gives the user a realistic sensation of interaction when, for example, grasping an object. An initial evaluation with 20 participants demonstrates that the device works reliably and that the addition of force feedback results in a significant reduction in error rate. Informal comments by the participants were overwhelmingly positive.

206 citations


Proceedings ArticleDOI
16 May 2016
TL;DR: A purely visual haptic prediction model is proposed to enable a robot to "feel" without physical interaction, and it is demonstrated that using both visual and physical interaction signals together yields more accurate haptic classification.
Abstract: Robots which interact with the physical world will benefit from a fine-grained tactile understanding of objects and surfaces. Additionally, for certain tasks, robots may need to know the haptic properties of an object before touching it. To enable better tactile understanding for robots, we propose a method of classifying surfaces with haptic adjectives (e.g., compressible or smooth) from both visual and physical interaction data. Humans typically combine visual predictions and feedback from physical interactions to accurately predict haptic properties and interact with the world. Inspired by this cognitive pattern, we propose and explore a purely visual haptic prediction model. Purely visual models enable a robot to “feel” without physical interaction. Furthermore, we demonstrate that using both visual and physical interaction signals together yields more accurate haptic classification. Our models take advantage of recent advances in deep neural networks by employing a unified approach to learning features for physical interaction and visual observations. Even though we employ little domain specific knowledge, our model still achieves better results than methods based on hand-designed features.
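The fusion claim can be illustrated on synthetic data (a toy nearest-centroid stand-in for the paper's deep networks, with made-up one-dimensional "visual" and "haptic" features): each modality separates the two adjective classes only along its own axis, so concatenating them improves classification.

```python
# Toy illustration (synthetic data, not the paper's model) of why fusing
# visual and physical-interaction features beats either modality alone.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
vis = rng.normal(0.0, 1.0, (n, 1))             # "visual" feature
hap = rng.normal(0.0, 1.0, (n, 1))             # "physical interaction" feature
y = rng.integers(0, 2, n)
vis[y == 1] += 1.0                             # each modality separates the
hap[y == 1] += 1.0                             # classes only along its own axis

def centroid_acc(X, y):
    """Accuracy of a nearest-class-centroid classifier on (X, y)."""
    c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    pred = np.linalg.norm(X - c1, axis=1) < np.linalg.norm(X - c0, axis=1)
    return float((pred == y).mean())

a_vis = centroid_acc(vis, y)
a_hap = centroid_acc(hap, y)
a_both = centroid_acc(np.hstack([vis, hap]), y)
print(a_vis, a_hap, a_both)                    # fused accuracy is the highest
```

The class separation in the fused space grows by a factor of √2 here, which is the geometric intuition behind the paper's empirical result.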

199 citations


Proceedings ArticleDOI
01 Oct 2016
TL;DR: The Wolverine is a mobile, wearable haptic device designed for simulating the grasping of rigid objects in a virtual reality interface that renders a force directly between the thumb and three fingers to simulate objects held in pad opposition (precision) type grasps.
Abstract: The Wolverine is a mobile, wearable haptic device designed for simulating the grasping of rigid objects in a virtual reality interface. In contrast to prior work on wearable force feedback gloves, we focus on creating a low cost and lightweight device that renders a force directly between the thumb and three fingers to simulate objects held in pad opposition (precision) type grasps. Leveraging low-power brake-based locking sliders, the system can withstand over 100N of force between each finger and the thumb, and only consumes 0.24 mWh (0.87 joules) for each braking interaction. Integrated sensors are used both for feedback control and user input: time-of-flight sensors provide the position of each finger and an IMU provides overall orientation tracking. This paper describes the mechanical design, control strategy, and performance analysis of the Wolverine system and provides a comparison with several existing wearable haptic devices.
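The quoted per-interaction braking energy can be unit-checked directly (assuming the paper's 0.24 mWh and ~0.87 J figures describe the same quantity):

```python
# Unit check on the braking-energy figure: convert milliwatt-hours to joules.
mwh = 0.24
joules = mwh * 1e-3 * 3600        # 1 Wh = 3600 J
print(round(joules, 2))           # 0.86, consistent with the quoted ~0.87 J
```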

180 citations


Journal ArticleDOI
TL;DR: Objectives and challenges of deploying haptic technologies in surgical robotics are discussed, and a systematic review is performed on works that have studied the effects of providing haptic information to the users in major branches of robotic surgery.
Abstract: Robotic surgery is transforming the current surgical practice, not only by improving the conventional surgical methods but also by introducing innovative robot-enhanced approaches that broaden the capabilities of clinicians. Being mainly of man–machine collaborative type, surgical robots are seen as media that transfer pre- and intraoperative information to the operator and reproduce his/her motion, with appropriate filtering, scaling, or limitation, to physically interact with the patient. The field, however, is far from maturity and, more critically, is still a subject of controversy in medical communities. Limited or absent haptic feedback is reputed to be among the reasons that impede further spread of surgical robots. In this paper, objectives and challenges of deploying haptic technologies in surgical robotics are discussed, and a systematic review is performed on works that have studied the effects of providing haptic information to the users in major branches of robotic surgery. It attempts to encompass both classical works and the state-of-the-art approaches, aiming at delivering a comprehensive and balanced survey both for researchers starting their work in this field and for the experts.

163 citations


Journal ArticleDOI
TL;DR: This paper presents an alternative approach that enables the surgeon to feel fingertip contact deformations and vibrations while guaranteeing the teleoperator's stability, and implemented this solution on an Intuitive Surgical da Vinci Standard robot.
Abstract: Despite its expected clinical benefits, current teleoperated surgical robots do not provide the surgeon with haptic feedback largely because grounded forces can destabilize the system's closed-loop controller. This paper presents an alternative approach that enables the surgeon to feel fingertip contact deformations and vibrations while guaranteeing the teleoperator's stability. We implemented our cutaneous feedback solution on an Intuitive Surgical da Vinci Standard robot by mounting a SynTouch BioTac tactile sensor to the distal end of a surgical instrument and a custom cutaneous display to the corresponding master controller. As the user probes the remote environment, the contact deformations, dc pressure, and ac pressure (vibrations) sensed by the BioTac are directly mapped to input commands for the cutaneous device's motors using a model-free algorithm based on look-up tables. The cutaneous display continually moves, tilts, and vibrates a flat plate at the operator's fingertip to optimally reproduce the tactile sensations experienced by the BioTac. We tested the proposed approach by having eighteen subjects use the augmented da Vinci robot to palpate a heart model with no haptic feedback, only deformation feedback, and deformation plus vibration feedback. Fingertip deformation feedback significantly improved palpation performance by reducing the task completion time, the pressure exerted on the heart model, and the subject's absolute error in detecting the orientation of the embedded plastic stick. Vibration feedback significantly improved palpation performance only for the seven subjects who dragged the BioTac across the model, rather than pressing straight into it.
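The "model-free algorithm based on look-up tables" can be sketched in a few lines (the calibration pairs below are invented for illustration; the actual tables map BioTac readings to the cutaneous display's motor commands): each sensed value is linearly interpolated between calibration points.

```python
# Minimal look-up-table mapping sketch (table values are made up): a sensed
# pressure is converted to a display motor command by linear interpolation
# between calibration points, with clamping outside the calibrated range.
import numpy as np

pressure_pts = np.array([0.0, 0.5, 1.0, 2.0, 4.0])       # hypothetical sensor values
angle_pts    = np.array([0.0, 10.0, 18.0, 30.0, 45.0])   # hypothetical motor angles (deg)

def motor_command(p):
    """Interpolate the table; np.interp clamps at both ends."""
    return float(np.interp(p, pressure_pts, angle_pts))

print(motor_command(0.75))   # 14.0: halfway between the 0.5 and 1.0 entries
```

Because no contact model is fitted, the mapping stays cheap enough to run inside the teleoperation loop, which is the point of the model-free design.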

163 citations


Proceedings ArticleDOI
07 May 2016
TL;DR: Annexing Reality is a system that opportunistically annexes physical objects from a user's current physical environment to provide the best-available haptic sensation for virtual objects, allowing content creators to specify, a priori, haptic experiences that adapt to the user's current setting.
Abstract: Advances in display and tracking technologies hold the promise of increasingly immersive augmented-reality experiences. Unfortunately, the on-demand generation of haptic experiences is lagging behind these advances in other feedback channels. We present Annexing Reality, a system that opportunistically annexes physical objects from a user's current physical environment to provide the best-available haptic sensation for virtual objects. It allows content creators to specify, a priori, haptic experiences that adapt to the user's current setting. The system continuously scans the user's surroundings, selects physical objects that are similar to the given virtual objects, and overlays the virtual models onto the selected physical ones, reducing the visual-haptic mismatch. We describe the developer's experience with the Annexing Reality system and the techniques utilized in realizing it. We also present results of a developer study that validates the usability and utility of our method of defining haptic experiences.
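The object-selection step can be sketched as a similarity score over scanned candidates (the dimension names, weights, and scene below are invented for illustration; the actual system matches richer primitive descriptions): the physical object with the lowest mismatch is annexed for the virtual one.

```python
# Hypothetical matching step in the spirit of Annexing Reality: score each
# scanned physical object against a virtual object's primitive dimensions and
# annex the closest one. Dimensions and weights are assumptions.
def match_score(virtual, physical):
    """Lower is better: relative mismatch in radius and height."""
    dr = abs(virtual["radius"] - physical["radius"]) / virtual["radius"]
    dh = abs(virtual["height"] - physical["height"]) / virtual["height"]
    return dr + dh

virtual_mug = {"radius": 0.04, "height": 0.10}            # meters
scene = {"bottle": {"radius": 0.030, "height": 0.25},
         "can":    {"radius": 0.033, "height": 0.12},
         "ball":   {"radius": 0.100, "height": 0.10}}
best = min(scene, key=lambda name: match_score(virtual_mug, scene[name]))
print(best)   # 'can': the closest stand-in for the virtual mug
```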

155 citations


Journal ArticleDOI
TL;DR: The goal of this paper is to provide a state of the art review of recent medical simulators that use haptic devices and focuses on stitching, palpation, dental procedures, endoscopy, laparoscopy and orthopaedics.
Abstract: Medical procedures often involve the use of the tactile sense to manipulate organs or tissues by using special tools. Doctors require extensive preparation in order to perform them successfully; for example, research shows that a minimum of 750 operations are needed to acquire sufficient experience to perform medical procedures correctly. Haptic devices have become an important training alternative, as they let users interact with virtual environments by adding the sense of touch to the simulation. Previous articles in the field state that haptic devices enhance the learning of surgeons compared to current training environments used in medical schools (corpses, animals, or synthetic skin and organs). Consequently, virtual environments use haptic devices to improve realism. The goal of this paper is to provide a state-of-the-art review of recent medical simulators that use haptic devices. In particular, we focus on stitching, palpation, dental procedures, endoscopy, laparoscopy, and orthopaedics. These simulators are reviewed and compared from the viewpoint of the technology used, the number of degrees of freedom, degrees of force feedback, perceived realism, immersion, and feedback provided to the user. In the conclusion, several observations per area and suggestions for future work are provided.

Proceedings ArticleDOI
14 Feb 2016
TL;DR: Snake-charmer is an attempt to provide physical form to virtual objects by revisiting the concept of Robotic Graphics or Encountered-type Haptic interfaces with current commodity hardware and explores what it means to truly interact with an object.
Abstract: Augmented and virtual reality have the potential of being indistinguishable from the real world. Holographic displays, including head mounted units, support this vision by creating rich stereoscopic scenes, with objects that appear to float in thin air - often within arm's reach. However, one has but to reach out and grasp nothing but air to destroy the suspension of disbelief. Snake-charmer is an attempt to provide physical form to virtual objects by revisiting the concept of Robotic Graphics or Encountered-type Haptic interfaces with current commodity hardware. By means of a robotic arm, Snake-charmer brings physicality to a virtual scene and explores what it means to truly interact with an object. We go beyond texture and position simulation and explore what it means to have a physical presence inside a virtual scene. We demonstrate how to render surface characteristics beyond texture and position, including temperature; how to physically move objects; and how objects can physically interact with the user's hand. We analyze our implementation, present the performance characteristics, and provide guidance for the construction of future physical renderers.

Journal ArticleDOI
TL;DR: A psychophysics-based collision discrimination control scheme for teleoperated robot-assisted surgery is presented, and a human-operator-centered haptic interface design concept is introduced into actuator choice and design to address the lack of haptic sensation in the telesurgery scenario.
Abstract: In catheter minimally invasive neurosurgery (CMINS), detecting collisions between the catheter tip and the blood vessel during the surgical practice is important. Moreover, successful CMINS depends on the discrimination of such collisions, which a skilled surgeon performs in direct operation. In a teleoperated scenario, however, the surgeon is physically separated from the patient; the resulting lack of haptic sensation is therefore a major challenge for telesurgery. A human-operator-centered haptic interface is adopted to address this problem. In this paper, a teleoperated robot-assisted surgery and psychophysics-based safety operation consciousness theory is presented, and a human-operator-centered haptic interface design concept is introduced into actuator choice and design. A semiactive haptic interface was designed and fabricated by taking full advantage of MR (magnetorheological) fluids, and a mechanical (force/torque) model was established. In the absence of collision, transparency of the teleoperated system is realized; upon collision, a psychophysics-based collision discrimination control scheme provides safety operation consciousness. Experiments demonstrate the usability of the designed haptic interface and the correctness of the safety operation consciousness control scheme.

Journal ArticleDOI
TL;DR: The proposed approach does not require object exploration, re-grasping, grasp-release, or force modulation and works for arbitrary object start positions and orientations, so the technique may be integrated into practical robotic grasping scenarios without adding time or manipulation overheads.
Abstract: Classical robotic approaches to tactile object identification often involve rigid mechanical grippers, dense sensor arrays, and exploratory procedures (EPs). Though EPs are a natural method for humans to acquire object information, evidence also exists for meaningful tactile property inference from brief, non-exploratory motions (a 'haptic glance'). In this work, we implement tactile object identification and feature extraction techniques on data acquired during a single, unplanned grasp with a simple, underactuated robot hand equipped with inexpensive barometric pressure sensors. Our methodology utilizes two cooperating schemes based on an advanced machine learning technique (random forests) and parametric methods that estimate object properties. The available data is limited to actuator positions (one per two-link finger) and force sensor values (eight per finger). The schemes are able to work both independently and collaboratively, depending on the task scenario. When collaborating, the results of each method contribute to the other, improving the overall result in a synergistic fashion. Unlike prior work, the proposed approach does not require object exploration, re-grasping, grasp-release, or force modulation and works for arbitrary object start positions and orientations. Due to these factors, the technique may be integrated into practical robotic grasping scenarios without adding time or manipulation overheads.
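The random-forest scheme can be sketched on synthetic single-grasp data (toy feature values, not the paper's dataset; scikit-learn's `RandomForestClassifier` stands in for their implementation): each grasp yields a small feature vector of actuator positions and pressure readings, and the forest identifies which object was grasped.

```python
# Synthetic single-grasp identification sketch (toy data): 2 actuator
# positions + 16 pressure values per grasp; a random forest identifies
# which of two objects was grasped.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def grasp(obj_id, n):
    """Hypothetical grasp signatures: object 1 is stiffer -> larger readings."""
    base = np.array([0.4, 0.4] + [1.0] * 16) * (1.0 + 0.5 * obj_id)
    return base + rng.normal(0.0, 0.05, (n, 18))

X = np.vstack([grasp(0, 100), grasp(1, 100)])
y = np.array([0] * 100 + [1] * 100)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[::2], y[::2])
print(clf.score(X[1::2], y[1::2]))   # high accuracy on this easy toy set
```

Training on a single unplanned grasp per trial, with no exploratory motion, is what distinguishes this setup from EP-based pipelines.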

Journal ArticleDOI
TL;DR: It is shown that during natural interactions with ordinary objects, mechanical energy originating at finger contact propagates through the whole hand, and that vibration signals that are captured remotely contain sufficient information to discriminate between gestures and between the touched objects.
Abstract: We investigated the propagation patterns of cutaneous vibration in the hand during interactions with touched objects. Prior research has highlighted the importance of vibrotactile signals during haptic interactions, but little is known of how vibrations propagate throughout the hand. Furthermore, the extent to which the patterns of vibrations reflect the nature of the objects that are touched, and how they are touched, is unknown. Using an apparatus comprised of an array of accelerometers, we mapped and analyzed spatial distributions of vibrations propagating in the skin of the dorsal region of the hand during active touch, grasping, and manipulation tasks. We found these spatial patterns of vibration to vary systematically with touch interactions and determined that it is possible to use these data to decode the modes of interaction with touched objects. The observed vibration patterns evolved rapidly in time, peaking in intensity within a few milliseconds, fading within 20–30 ms, and yielding interaction-dependent distributions of energy in frequency bands that span the range of vibrotactile sensitivity. These results are consistent with findings in perception research that indicate that vibrotactile information distributed throughout the hand can transmit information regarding explored and manipulated objects. The results may further clarify the role of distributed sensory resources in the perceptual recovery of object attributes during active touch, may guide the development of approaches to robotic sensing, and could have implications for the rehabilitation of the upper extremity.
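The kind of interaction-dependent frequency-band feature the study describes can be illustrated on a synthetic skin-acceleration trace (the signal, sample rate, and band edges below are assumptions chosen to match the reported 20-30 ms transients and vibrotactile range):

```python
# Illustrative band-energy feature extraction (synthetic signal): energy of a
# skin acceleration trace in vibrotactile frequency bands, the kind of
# feature that can discriminate modes of interaction with touched objects.
import numpy as np

fs = 2000.0                                    # assumed accelerometer rate (Hz)
t = np.arange(0.0, 0.03, 1 / fs)               # a ~30 ms transient, as observed
sig = np.exp(-t / 0.01) * np.sin(2 * np.pi * 250 * t)   # decaying 250 Hz burst

spec = np.abs(np.fft.rfft(sig)) ** 2
freqs = np.fft.rfftfreq(len(sig), 1 / fs)
bands = [(0, 100), (100, 400), (400, 1000)]    # spans vibrotactile sensitivity
energy = [spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
print(int(np.argmax(energy)))                  # 1: energy concentrated at 100-400 Hz
```

A vector of such band energies per accelerometer is one plausible input for decoding the mode of interaction, as the abstract reports is possible.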

Journal ArticleDOI
TL;DR: Experimental results illustrate that the designed haptic catheter operation system can be used for teleoperation and for training inexperienced surgeons.
Abstract: Minimally invasive surgery and therapy are popularly used both for diagnosis and for surgery. Teleoperation, a promising approach, is used to protect the surgeon from X-ray radiation as well as to address the shortage of experienced surgeons in remote rural areas. However, the surgery success ratio must be considered because the surgeon is separated from the patient. One of the most effective ways to improve the success ratio is to design a haptic interface as a master console, which can provide an "immersive" operation to the surgeon. In this study, a haptic catheter operation system for teleoperation that exploits magnetorheological fluids is proposed to solve the safety problem. The haptic sensation is provided by varying the viscosity of the magnetorheological fluids through adjustment of the magnetic field, which depends on the force measured in the slave manipulator. Therefore, three parts of the haptic interface were designed and fabricated: magnetic field, magnetorheological fluids ...

Journal ArticleDOI
TL;DR: An overview of the recent achievements in affective haptics is presented and a thorough discussion about the effectiveness of using the haptic channel to communicate affective information through direct and mediated means is provided.
Abstract: Touch plays a prominent role in communicating emotions and intensifying interpersonal communication. Affective haptics is an emerging field, which focuses on the analysis, design, and evaluation of systems that can capture, process, or display emotions through the sense of touch. The objective of this paper is to present an overview of the recent achievements in affective haptics and to discuss how the sense of touch can elicit or influence human emotions. We first introduce a definition of the term affective haptics and describe its multidisciplinary nature as a field that integrates ideas from affective computing, haptic technology, and user experience. Second, we provide a thorough discussion about the effectiveness of using the haptic channel to communicate affective information through direct and mediated means. Third, we present a variety of applications in the area, ranging from interhuman social interaction systems to human-robot interaction applications. Finally, we discuss some of the key findings discerned from the various surveyed papers, and present some of the challenges and trends in this field. We extract the following conclusions pertaining to affective haptics: 1) haptic stimulation can be successfully used to achieve a higher level of emotional immersion during media consumption or emotional telepresence; 2) existing research has demonstrated that haptics is effective in communicating valence and arousal, and the emotions of happiness, sadness, anger, and fear, while less focus has been given to the communication of disgust and surprise; 3) haptic-based affect detection remains an understudied topic, whereas haptic-based affect display is a well-established subject; and 4) the interpretation of haptic stimulation by human beings is highly contextual.

Journal ArticleDOI
TL;DR: One of the conclusions of this research is that an enhanced portable framework is needed but does not yet exist, and that it would be beneficial to combine automation of core technologies into a reusable automation framework for VR training.

Proceedings ArticleDOI
16 May 2016
TL;DR: A novel decentralized algorithm coordinates the forces of a group of robots during a cooperative manipulation task; it is proved that all followers' forces synchronize to the direction of the force applied by a single leader, which may also be a human.
Abstract: This paper proposes a novel decentralized algorithm that coordinates the forces of a group of robots during a cooperative manipulation task. The highlight of our approach is that no communication is needed between any two robots. Our underlying intuition is that every follower robot can measure the direction of the movement of the object and then applies its force along that direction to reinforce the movement. We prove that using our algorithm, all followers' forces will synchronize to the direction of the force applied by one leader robot, who guides the robotic fleet to its destination. We first verify our algorithm by simulation in a physics engine, where 20 robots transport a chair collectively. We then validate our algorithm in hardware experiments by building four low-cost robots, equipped with force and velocity sensors, to transport a cardboard box in a laboratory environment. In addition, our algorithm allows the leader to be a human, and we also demonstrate the human-swarm cooperation in our manipulation experiments.
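The decentralized rule can be sketched with a simplified 2-D point-mass rollout (a stand-in for the paper's formal analysis; dynamics, damping, and force magnitudes are assumptions): each follower measures the object's motion direction and reapplies its force along it, while only the leader knows the goal direction.

```python
# Simplified rollout of the decentralized coordination rule: followers align
# their forces with the object's measured velocity; only the leader pushes
# toward the goal. The object's motion converges to the leader's direction.
import numpy as np

goal = np.array([1.0, 0.0])                    # goal direction, leader only
v = np.array([0.0, 0.1])                       # object initially drifting off-goal
follower_forces = [np.array([0.0, 1.0]) for _ in range(4)]

for _ in range(2000):                          # discrete-time point-mass rollout
    leader_force = goal                        # leader pushes toward the goal
    v_hat = v / np.linalg.norm(v)              # motion direction followers measure
    follower_forces = [np.linalg.norm(f) * v_hat for f in follower_forces]
    v = 0.9 * v + 0.1 * (leader_force + sum(follower_forces))  # damped update

cos = float(v @ goal / np.linalg.norm(v))
print(round(cos, 3))                           # approaches 1.0: motion aligns with goal
```

No follower ever reads the goal or communicates with another robot; alignment emerges purely from the shared, observable motion of the object, which is the intuition the abstract states.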

Proceedings ArticleDOI
08 Apr 2016
TL;DR: A perceptual experiment executed in a teleoperated environment with kinesthetic feedback showed that the addition of tactile feedback, provided through the Haptic Thimble, significantly improved performance of an exploratory task.
Abstract: This work presents the Haptic Thimble, a novel wearable haptic device for surface exploration. The Haptic Thimble combines rendering of surface orientation with fast-transient, wide-bandwidth tactile cues. Such features allow surface exploration with rich tactile feedback, including reactive contact/no-contact transitions and rendering of collisions, surface asperities, and textures. The above capabilities were obtained through a novel serial kinematics wrapped around the finger, actuated by a compact servo motor for orienting the last link and by a custom voice coil for actuating the plate in contact with the fingerpad. The performance of the voice coil was measured on the bench in static and dynamic conditions, assessing the capability of reproducing generic, wide-bandwidth (0-300 Hz) tactile cues. The overall usability of the Haptic Thimble was explored within a virtual environment involving exploration of virtual surfaces. Finally, a perceptual experiment executed in a teleoperated environment with kinesthetic feedback showed that the addition of tactile feedback, provided through the Haptic Thimble, significantly improved performance of an exploratory task.

Journal ArticleDOI
TL;DR: Experimental findings show that this ultra-thin SPA and the unique integration process of the discrete lead zirconate titanate (PZT) based piezoelectric sensors achieve high-resolution soft contact sensing as well as accurate control of vibrotactile feedback by closing the control loop.
Abstract: The latest wearable technologies demand more intuitive and sophisticated interfaces for communication, sensing, and feedback closer to the body. Evidently, such interfaces require flexibility and conformity without losing their functionality even on rigid surfaces. Although there have been various research efforts in creating tactile feedback to improve various haptic interfaces and master-slave manipulators, we are yet to see a comprehensive device that can both supply vibratory actuation and tactile sensing. This paper describes a soft pneumatic actuator (SPA) based prototype, SPA-skin, that allows bidirectional tactile information transfer to facilitate a simpler and more responsive wearable interface. We describe the design and fabrication of a 1.4 mm-thick vibratory SPA-skin that is integrated with piezoelectric sensors. We examine in detail the mechanical performance compared to the SPA model and the sensitivity of the sensors for the application in vibrotactile feedback. Experimental findings show that this ultra-thin SPA and the unique integration process of the discrete lead zirconate titanate (PZT) based piezoelectric sensors achieve high-resolution soft contact sensing as well as accurate control of vibrotactile feedback by closing the control loop.

Proceedings ArticleDOI
08 Apr 2016
TL;DR: A novel wearable cutaneous device for the proximal finger phalanx, called "hRing", which consists of two servo motors that move a belt placed in contact with the user's finger skin, and which improves the performance and perceived effectiveness of the considered task by 20% and 47%, respectively, with respect to not providing any force feedback.
Abstract: The wearable electronics business generated over $14 billion in 2014 and is estimated to exceed $70 billion by 2024. However, commercially-available wearable devices still provide very limited haptic feedback, mainly focusing on vibrotactile sensations. Towards a more realistic feeling of interacting with virtual and remote objects, we propose a novel wearable cutaneous device for the proximal finger phalanx, called "hRing". It consists of two servo motors that move a belt placed in contact with the user's finger skin. When the motors spin in opposite directions, the belt presses into the user's finger, while when the motors spin in the same direction, the belt applies a shear force to the skin. Its positioning on the proximal finger phalanx makes the device easy to use together with unobtrusive hand tracking systems, such as the LeapMotion controller and the Kinect sensor. The viability of the proposed approach is demonstrated through a pick-and-place experiment involving seven human subjects. Providing cutaneous feedback through the proposed device improved the performance and perceived effectiveness of the considered task by 20% and 47%, respectively, with respect to not providing any force feedback. All subjects found no difference in the quality of the tracking when carrying out the task wearing the device versus barehanded.
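The opposite-spin/same-spin mapping described above is easy to state as code. A minimal sketch of how desired normal and shear forces could be mixed into the two motor commands follows; the function name, gains, and sign conventions are illustrative assumptions, not the authors' controller:

```python
def belt_motor_commands(normal_force, shear_force, k_n=1.0, k_s=1.0):
    """Mix desired fingertip forces into commands for the two belt motors.
    A differential (opposite-direction) component tightens the belt and
    presses it into the finger; a common (same-direction) component drags
    the belt across the skin, producing shear. Gains are illustrative."""
    common = k_s * shear_force         # motors spin the same way -> shear
    differential = k_n * normal_force  # motors spin opposite ways -> pressure
    left = common + differential
    right = common - differential
    return left, right
```

A pure normal force yields equal-and-opposite motor commands, while a pure shear force yields identical commands, matching the two cases in the abstract.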

Proceedings ArticleDOI
07 May 2016
TL;DR: Several novel haptic interactions for the Haptic Edge Display are described, including dynamic physical affordances, shape display, non-dominant hand interactions, and also in-pocket "pull"-style haptic notifications.
Abstract: Current mobile devices do not leverage the rich haptic channel of information that our hands can sense, and instead focus primarily on touch based graphical interfaces. Our goal is to enrich the user experience of these devices through bi-directional haptic and tactile interactions (display and control) around the edge of hand-held devices. We propose a novel type of haptic interface, a Haptic Edge Display, consisting of actuated pins on the side of a display, to form a linear array of tactile pixels (taxels). These taxels are implemented using small piezoelectric actuators, which can be made cheaply and have ideal characteristics for mobile devices. We developed two prototype Haptic Edge Displays, one with 24 actuated pins (3.75mm in pitch) and a second with 40 pins (2.5mm in pitch). This paper describes several novel haptic interactions for the Haptic Edge Display including dynamic physical affordances, shape display, non-dominant hand interactions, and also in-pocket "pull"-style haptic notifications. In a laboratory experiment we investigated the limits of human perception for Haptic Edge Displays, measuring the just-noticeable difference for pin width and height changes for both in-hand and simulated in-pocket conditions.
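Just-noticeable-difference experiments like the pin-height study above are commonly run with an adaptive staircase procedure. The paper does not specify its psychophysical method, so the 2-down/1-up sketch below is only an assumption about one common choice:

```python
def two_down_one_up(responses, start, step):
    """Track stimulus levels of a 2-down/1-up adaptive staircase.
    Two consecutive correct responses decrease the stimulus difference;
    one incorrect response increases it. Returns the full level track."""
    level = start
    levels = [level]
    correct_streak = 0
    for correct in responses:
        if correct:
            correct_streak += 1
            if correct_streak == 2:   # two in a row: make the task harder
                level -= step
                correct_streak = 0
        else:                          # miss: make the task easier
            level += step
            correct_streak = 0
        levels.append(level)
    return levels
```

The 2-down/1-up rule converges near the 70.7%-correct point of the psychometric function, a common operating point for JND estimates; the reversal points of the track are typically averaged to obtain the threshold.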

Proceedings ArticleDOI
Adnan Aijaz1
03 Apr 2016
TL;DR: This paper investigates radio resource allocation for haptic communications in LTE-A networks and develops a novel heuristic algorithm to solve the resource allocation problem.
Abstract: The Tactile Internet will be able to transport touch and actuation in real-time. The primary application running over the Tactile Internet will be haptic communications. Design efforts for both the Tactile Internet and haptic communications are at a nascent stage. It is expected that the next generation (5G) wireless networks will play a key role in realizing the Tactile Internet. On the other hand, Long Term Evolution Advanced (LTE-A) networks would be an integral component of the 5G ecosystem. Therefore, exploring the potential of LTE-A networks for haptic communications would be an important step towards realizing the Tactile Internet. To this end, the main objective of this paper is to investigate radio resource allocation for haptic communications in LTE-A networks. The radio resource requirements of haptic communications have been translated into a unique resource allocation problem which becomes particularly challenging due to the specific constraints of multiple access schemes in LTE-A networks. A novel heuristic algorithm has been developed to solve the resource allocation problem. A performance evaluation has been conducted through simulation studies for a recently proposed 5G air-interface design.
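The paper's heuristic is not reproduced here, but one LTE-A-specific constraint any such allocator must respect is that uplink SC-FDMA grants each user a contiguous run of resource blocks (RBs). A hypothetical greedy sketch of contiguity-respecting allocation follows; the function name and the demand-first ordering are illustrative assumptions:

```python
def allocate_contiguous(demands, rate_per_rb, n_rb):
    """demands: {user: bits required}; rate_per_rb: {user: bits per RB}.
    Returns {user: (first_rb, last_rb)}, giving each served user a
    contiguous run of RBs as SC-FDMA uplink allocation requires."""
    free = [True] * n_rb
    alloc = {}
    # serve the most demanding users first (a simple illustrative priority)
    for user in sorted(demands, key=demands.get, reverse=True):
        need = -(-demands[user] // rate_per_rb[user])  # ceiling division
        run_start, run_len = None, 0
        for i in range(n_rb):
            if free[i]:
                if run_len == 0:
                    run_start = i
                run_len += 1
                if run_len == need:    # first contiguous run long enough
                    alloc[user] = (run_start, i)
                    for j in range(run_start, i + 1):
                        free[j] = False
                    break
            else:
                run_len = 0            # contiguity broken, restart the run
    return alloc
```

A real scheduler would also weigh per-user channel quality and the millisecond-scale latency budgets of haptic traffic; this sketch only captures the contiguity constraint mentioned above.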

Proceedings ArticleDOI
07 May 2016
TL;DR: A novel interactive system that mutually copies adjacent 3D environments optically and physically and realizes mutual user interactions through haptics without wearing any devices is proposed.
Abstract: In this paper, we propose a novel interactive system that mutually copies adjacent 3D environments optically and physically. The system realizes mutual user interactions through haptics without the user wearing any devices. A realistic volumetric image is displayed using a pair of micro-mirror array plates (MMAPs). The MMAP transmissively reflects the rays from an object, and a pair of them reconstructs the floating aerial image of the object. Our system can optically copy adjacent environments based on this technology. Haptic feedback is also provided by an airborne ultrasound tactile display (AUTD). Focused ultrasound can deliver force feedback in midair. Owing to the optical characteristics of the MMAPs, the cloned image and the user share an identical coordinate system. When a user touches the transferred clone image, the system gives force feedback so that the user can feel the mechanical contact and reality of the floating image.

Journal ArticleDOI
01 Dec 2016
TL;DR: A hand rehabilitation learning system, the SAFE Glove, a device that can be utilized to enhance the rehabilitation of subjects with disabilities and is able to learn fingertip motion and force for grasping different objects and then record and analyze the common movements of hand function.
Abstract: This paper presents a hand rehabilitation learning system, the SAFE Glove, a device that can be utilized to enhance the rehabilitation of subjects with disabilities. This system is able to learn fingertip motion and force for grasping different objects and then record and analyze the common movements of hand function, including grip and release patterns. The glove is then able to reproduce these movement patterns in playback fashion to assist a weakened hand to accomplish the movements, or to modulate the assistive level based on the user's or therapist's intent for the purpose of hand rehabilitation therapy. Preliminary data have been collected from healthy hands. To demonstrate the glove's ability to manipulate the hand, the glove was fitted on a wooden hand and the grasping of various objects was performed. To further show that hands can be safely driven by this haptic mechanism, readings from force sensors placed between each finger and the mechanism are plotted. These experimental results demonstrate the potential of the proposed system in rehabilitation therapy.

28 Mar 2016
TL;DR: In this paper, the authors investigated the propagation patterns of cutaneous vibration in the hand during interactions with touched objects and found that the observed vibration patterns evolved rapidly in time, peaking in intensity within a few milliseconds, fading within 20-30 ms, and yielding interaction-dependent distributions of energy in frequency bands that span the range of vibrotactile sensitivity.
Abstract: We investigated the propagation patterns of cutaneous vibration in the hand during interactions with touched objects. Prior research has highlighted the importance of vibrotactile signals during haptic interactions, but little is known of how vibrations propagate throughout the hand. Furthermore, the extent to which the patterns of vibrations reflect the nature of the objects that are touched, and how they are touched, is unknown. Using an apparatus composed of an array of accelerometers, we mapped and analyzed spatial distributions of vibrations propagating in the skin of the dorsal region of the hand during active touch, grasping, and manipulation tasks. We found these spatial patterns of vibration to vary systematically with touch interactions and determined that it is possible to use these data to decode the modes of interaction with touched objects. The observed vibration patterns evolved rapidly in time, peaking in intensity within a few milliseconds, fading within 20–30 ms, and yielding interaction-dependent distributions of energy in frequency bands that span the range of vibrotactile sensitivity. These results are consistent with findings in perception research that indicate that vibrotactile information distributed throughout the hand can transmit information regarding explored and manipulated objects. The results may further clarify the role of distributed sensory resources in the perceptual recovery of object attributes during active touch, may guide the development of approaches to robotic sensing, and could have implications for the rehabilitation of the upper extremity.
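The frequency-band analysis described above can be sketched as a band-energy feature extractor applied to each accelerometer channel. The band edges and function name below are illustrative; the authors' exact processing pipeline is not reproduced here:

```python
import numpy as np

def band_energies(signal, fs, bands=((20, 100), (100, 300), (300, 1000))):
    """Energy of a skin-acceleration signal in illustrative vibrotactile
    frequency bands, computed from the FFT power spectrum.
    signal: 1-D array of acceleration samples; fs: sample rate in Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2          # power per bin
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)       # bin frequencies
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in bands])
```

Concatenating such per-channel feature vectors across the accelerometer array yields a spatial-spectral descriptor that a standard classifier could use to decode the mode of interaction, in the spirit of the decoding result reported above.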

Journal ArticleDOI
TL;DR: This work deployed a social robot, the Haptic Creature, in an interaction designed to be calming: participants held the robot on their laps and stroked it as it was breathing, and reported themselves as calmer and happier.
Abstract: With advances in sensor and actuator design, intelligent computing techniques and personal care robotics, today's robots hold promise as fully interactive, therapeutic human companions. To achieve this ambitious goal, key interaction components must be identified and then systematically designed and evaluated. Based on successes of human-animal therapy, we propose affective touch as one such component. Delivering this adjunct in a controllable robot form allows us to examine its efficacy for therapeutic applications such as anxiety management. With an approach grounded in social cognitive theories for human-animal relations, we deployed a social robot, the Haptic Creature, in an interaction designed to be calming: participants held the robot on their laps and stroked it as it was breathing. As a result, their heart and respiration rates significantly decreased relative to stroking a non-breathing robot. They also reported themselves as calmer and happier.

Journal ArticleDOI
TL;DR: The design, development, and evaluation of Haptogram, a system designed to provide a point-cloud tactile display via acoustic radiation pressure; evaluation shows that all displayed tactile objects are perceivable by the human skin.
Abstract: Studies of the stimulating effect of ultrasound as a tactile display have recently become more intensive in the haptic domain. In this paper, we present the design, development, and evaluation of Haptogram, a system designed to provide a point-cloud tactile display via acoustic radiation pressure. A tiled 2-D array of ultrasound transducers is used to produce a focal point that is animated to produce arbitrary 2-D and 3-D tactile shapes. The switching speed is so high that humans feel the distributed points simultaneously. The Haptogram system comprises a software component and a hardware component. The software component enables users to author and/or select a tactile object, create a point-cloud representation, and generate a sequence of focal points to drive the hardware. The hardware component comprises a tiled 2-D array of ultrasound transducers, each driven by an FPGA. A quantitative analysis was conducted to measure the Haptogram's ability to display various tactile shapes, including a single point, 2-D shapes (a straight line and a circle), and a 3-D object (a hemisphere). Results show that all displayed tactile objects are perceivable by the human skin (an average of 2.65 kPa for 200 focal points). A usability study was also conducted to evaluate the ability of humans to recognize 2-D shapes. Results show that the recognition rate was well above the chance level (average of 59.44% and standard deviation of 12.75%), while the recognition time averaged 13.87 s (standard deviation of 3.92 s).
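Producing a focal point with a transducer array reduces to delay-and-sum focusing: each transducer is phase-shifted so that its wave arrives at the focal point in phase with all the others. A minimal sketch of that computation follows (the geometry, 40 kHz carrier, and function name are illustrative assumptions, not the Haptogram firmware):

```python
import math

def focal_phases(transducer_xy, focal_point, freq=40e3, c=343.0):
    """Per-transducer phase offsets (radians) that make the emitted waves
    arrive in phase at focal_point. Transducers lie in the z = 0 plane;
    the focal point is an (x, y, z) tuple. 40 kHz is a typical carrier
    for airborne ultrasound tactile displays; c is the speed of sound."""
    wavelength = c / freq
    phases = []
    for (x, y) in transducer_xy:
        d = math.dist((x, y, 0.0), focal_point)           # path length
        phases.append((2 * math.pi * d / wavelength) % (2 * math.pi))
    return phases
```

Rapidly recomputing these phases while sweeping the focal point through a stored point cloud is what yields the perceived 2-D and 3-D shapes described above: the switching outpaces tactile temporal resolution, so the discrete points fuse into a single shape.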

Journal ArticleDOI
TL;DR: To enable the identification of virtual 3D objects without visual feedback, a haptic display based on a vibrotactile glove and multiple points of contact gives users an enhanced sensation of touching a virtual object with their hands.
Abstract: The emergence of off-screen interaction devices is bringing the field of virtual reality to a broad range of applications where virtual objects can be manipulated without the use of traditional peripherals. However, to facilitate object interaction, other stimuli such as haptic feedback are necessary to improve the user experience. To enable the identification of virtual 3D objects without visual feedback, a haptic display based on a vibrotactile glove and multiple points of contact gives users an enhanced sensation of touching a virtual object with their hands. Experimental results demonstrate the capacity of this technology in practical applications.