Author

Olivier Bau

Bio: Olivier Bau is an academic researcher from Samsung. The author has contributed to research in the topics of Signal and Haptic technology. The author has an h-index of 17 and has co-authored 31 publications receiving 1,821 citations. Previous affiliations of Olivier Bau include the French Institute for Research in Computer Science and Automation (Inria) and the University of Paris-Sud.

Papers
Proceedings ArticleDOI
03 Oct 2010
TL;DR: The proposed technology is based on the electrovibration principle, uses no moving parts, and provides a wide range of tactile feedback sensations to fingers moving across a touch surface, enabling the design of a wide variety of interfaces that allow the user to feel virtual elements through touch.
Abstract: We present a new technology for enhancing touch interfaces with tactile feedback. The proposed technology is based on the electrovibration principle, does not use any moving parts and provides a wide range of tactile feedback sensations to fingers moving across a touch surface. When combined with an interactive display and touch input, it enables the design of a wide variety of interfaces that allow the user to feel virtual elements through touch. We present the principles of operation and an implementation of the technology. We also report the results of three controlled psychophysical experiments and a subjective user evaluation that describe and characterize users' perception of this technology. We conclude with an exploration of the design space of tactile touch screens using two comparable setups, one based on electrovibration and another on mechanical vibrotactile actuation.
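
As a rough illustration of the electrovibration principle in the abstract above: the sliding finger and the electrode beneath the insulated touch surface behave approximately as a parallel-plate capacitor, so the applied voltage adds an electrostatic component to the normal force and thereby modulates friction. The following first-order model is a common approximation, not necessarily the exact formulation used in the paper:

\[
F_e \approx \frac{\varepsilon_0 \varepsilon_r A V^2}{2 d^2},
\qquad
F_\text{lateral} = \mu \left( F_n + F_e \right)
\]

where V is the drive voltage, A the finger contact area, d and \varepsilon_r the thickness and relative permittivity of the insulating layer, F_n the finger's normal force, and \mu the friction coefficient. Because F_e grows with V^2, varying the amplitude and frequency of the drive signal changes the friction felt by a moving finger, which is what produces the texture sensations.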

740 citations

Proceedings ArticleDOI
19 Oct 2008
TL;DR: OctoPocus is described, an example of a dynamic guide that combines on-screen feedforward and feedback to help users learn, execute and remember gesture sets and can be applied to a wide range of single-stroke gestures and recognition algorithms.
Abstract: We describe OctoPocus, an example of a dynamic guide that combines on-screen feedforward and feedback to help users learn, execute and remember gesture sets. OctoPocus can be applied to a wide range of single-stroke gestures and recognition algorithms and helps users progress smoothly from novice to expert performance. We provide an analysis of the design space and describe the results of two experiments that show that OctoPocus is significantly faster and improves learning of arbitrary gestures, compared to conventional Help menus. It can also be adapted to a mark-based gesture set, significantly improving input time compared to a two-level, four-item Hierarchical Marking menu.
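
As a hypothetical sketch of how such a dynamic guide can work (the template set, the prefix-distance measure, and the weighting below are illustrative assumptions, not the algorithm described in the paper): each candidate gesture is scored by how closely the user's partial stroke matches the beginning of its template, and the remaining portion of each candidate's path would then be drawn with a prominence proportional to that score.

from math import hypot

def prefix_distance(prefix, template):
    # Mean point-to-point distance between the drawn prefix and the
    # equally long beginning of a gesture template (both lists of (x, y)).
    n = min(len(prefix), len(template))
    if n == 0:
        return float("inf")
    pairs = zip(prefix[:n], template[:n])
    return sum(hypot(px - tx, py - ty) for (px, py), (tx, ty) in pairs) / n

def guide_weights(prefix, templates, scale=50.0):
    # Map each gesture name to a 0..1 weight; the guide would render the
    # remaining path of each gesture with opacity proportional to its weight.
    return {name: max(0.0, 1.0 - prefix_distance(prefix, pts) / scale)
            for name, pts in templates.items()}

# Hypothetical two-gesture set and a partially drawn stroke.
templates = {
    "copy":  [(0, 0), (10, 0), (20, 0), (30, 0)],    # straight stroke to the right
    "paste": [(0, 0), (10, 10), (20, 20), (30, 30)], # diagonal stroke
}
print(guide_weights([(0, 0), (9, 1)], templates))    # "copy" scores higher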

276 citations

Journal ArticleDOI
01 Jul 2012
TL;DR: This paper expands tactile interfaces based on electrovibration beyond touch surfaces and brings them into the real world, demonstrating a broad range of application scenarios in which the technology can be used to enhance AR interaction with dynamic and unobtrusive tactile feedback.
Abstract: REVEL is an augmented reality (AR) tactile technology that allows the tactile feeling of real objects to be changed by augmenting them with virtual tactile textures using a device worn by the user. Unlike previous attempts to enhance AR environments with haptics, we neither physically actuate objects nor use any force- or tactile-feedback devices, nor do we require users to wear tactile gloves or other apparatus on their hands. Instead, we employ the principle of reverse electrovibration: we inject a weak electrical signal anywhere on the user's body, creating an oscillating electrical field around the user's fingers. When sliding his or her fingers over the surface of the object, the user perceives highly distinctive tactile textures augmenting the physical object. By tracking the objects and the location of the touch, we associate dynamic tactile sensations with the interaction context. REVEL is built upon our previous work on designing electrovibration-based tactile feedback for touch surfaces [Bau, et al. 2010]. In this paper we expand tactile interfaces based on electrovibration beyond touch surfaces and bring them into the real world. We demonstrate a broad range of application scenarios where our technology can be used to enhance AR interaction with dynamic and unobtrusive tactile feedback.
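
A minimal sketch of the interaction-context mapping described above, assuming that tracking reports an object identifier and a touched region; the texture table, parameter names, and the signal-generator stub are hypothetical, not the system described in the paper:

from dataclasses import dataclass

@dataclass
class TextureSignal:
    frequency_hz: float  # modulation frequency of the injected signal
    amplitude: float     # normalized drive amplitude, 0..1

# Hypothetical mapping from (tracked object, touched region) to a texture.
TEXTURES = {
    ("teapot", "handle"): TextureSignal(frequency_hz=80.0, amplitude=0.6),
    ("teapot", "body"):   TextureSignal(frequency_hz=240.0, amplitude=0.3),
}

def on_touch(object_id, region):
    # Called whenever tracking reports the finger touching a known object;
    # selects the texture for that context, or renders nothing if unknown.
    signal = TEXTURES.get((object_id, region))
    if signal is not None:
        # A real system would reprogram the waveform generator here.
        print(f"drive {signal.frequency_hz:.0f} Hz at amplitude {signal.amplitude}")

on_touch("teapot", "handle")   # -> drive 80 Hz at amplitude 0.6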

174 citations

Proceedings ArticleDOI
07 May 2011
TL;DR: Applications for the visually impaired to interpret and create 2D tactile information based on TeslaTouch are demonstrated and the technology's potential in supporting communication among visually impaired individuals is discussed.
Abstract: TeslaTouch is a technology that provides tactile sensation to moving fingers on touch screens. Based on TeslaTouch, we have developed applications for the visually impaired to interpret and create 2D tactile information. In this paper, we demonstrate these applications, present observations from the interaction, and discuss TeslaTouch's potential in supporting communication among visually impaired individuals.

114 citations

Patent
05 Sep 2012
TL;DR: In this article, an image capturing device is combined with a touch screen to generate a tactile map of an environment, which is then processed and used to correlate a point of user contact on the touch screen with a particular tactile sensation.
Abstract: An image capturing device may be combined with a touch screen to generate a tactile map of an environment. The image capturing device captures an image of the environment which is then processed and used to correlate a point of user contact on a touch screen to a particular tactile sensation. The touch screen may then generate an electric signal (i.e., tactile feedback) corresponding to the tactile sensation which is felt by a user contacting the touch screen. By using the electrical signal as tactile feedback (e.g., electrovibration), the user may determine relative spatial locations of the objects in the environment, the objects' physical characteristics, the distance from the objects to the image capturing device, and the like.
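
A minimal sketch of the idea in the abstract above, assuming the captured image has been processed into a per-pixel depth map and that object distance is conveyed by varying the electrovibration drive frequency; the depth map, the frequency mapping, and the function names are illustrative assumptions, not the patented implementation:

def depth_to_frequency(depth_m, near=0.5, far=5.0, f_near=250.0, f_far=40.0):
    # Map object distance in meters to a drive frequency in Hz; in this
    # hypothetical scheme, nearer objects produce a higher-frequency texture.
    t = min(max((depth_m - near) / (far - near), 0.0), 1.0)
    return f_near + t * (f_far - f_near)

def feedback_for_touch(touch_x, touch_y, depth_map):
    # Look up the processed depth under the touched pixel and return the
    # frequency of the tactile signal the screen would generate there.
    return depth_to_frequency(depth_map[touch_y][touch_x])

# Hypothetical 3x3 depth map (meters): near objects on the left,
# a far wall on the right. A touch at column 2, row 1 lands on the wall.
depth_map = [
    [1.0, 1.2, 4.8],
    [0.9, 2.5, 4.7],
    [0.8, 2.6, 4.6],
]
print(round(feedback_for_touch(2, 1, depth_map)))   # about 54 Hz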

113 citations


Cited by
Patent
Jong Hwan Kim
13 Mar 2015
TL;DR: In this article, a mobile terminal is described that includes a body; a touchscreen provided on the front and extending to a side of the body and configured to display content; and a controller configured to detect when one side of the body comes into contact with a side of an external terminal, and to display a first area on the touchscreen corresponding to the contact area between the body and the external terminal and a second area including the content.

Abstract: A mobile terminal including a body; a touchscreen provided on the front and extending to a side of the body and configured to display content; and a controller configured to detect when one side of the body comes into contact with one side of an external terminal, display a first area on the touchscreen corresponding to the contact area between the body and the external terminal and a second area including the content, receive an input moving the content displayed in the second area to the first area, display the content in the first area, and share the content in the first area with the external terminal.

1,441 citations

Journal ArticleDOI
13 May 2012
TL;DR: The current status of flexible electronics is reviewed and the future promise of these pervading technologies in healthcare, environmental monitoring, displays and human-machine interactivity, energy conversion, management and storage, and communication and wireless networks is predicted.
Abstract: Thin-film electronics in its myriad forms has underpinned much of the technological innovation in the fields of displays, sensors, and energy conversion over the past four decades. This technology also forms the basis of flexible electronics. Here we review the current status of flexible electronics and attempt to predict the future promise of these pervading technologies in healthcare, environmental monitoring, displays and human-machine interactivity, energy conversion, management and storage, and communication and wireless networks.

881 citations

01 Jan 2008
TL;DR: The concept of Reality-Based Interaction (RBI) as discussed by the authors has been proposed as a unifying concept that ties together a large subset of emerging interaction styles and provides a framework that can be used to understand, compare, and relate current paths of recent HCI research as well as to analyze specific interaction designs.
Abstract: We are in the midst of an explosion of emerging human-computer interaction techniques that redefine our understanding of both computers and interaction. We propose the notion of Reality-Based Interaction (RBI) as a unifying concept that ties together a large subset of these emerging interaction styles. Based on this concept of RBI, we provide a framework that can be used to understand, compare, and relate current paths of recent HCI research as well as to analyze specific interaction designs. We believe that viewing interaction through the lens of RBI provides insights for design and uncovers gaps or opportunities for future research.

708 citations

Proceedings ArticleDOI
06 Apr 2008
TL;DR: It is believed that viewing interaction through the lens of RBI provides insights for design and uncovers gaps or opportunities for future research.
Abstract: We are in the midst of an explosion of emerging human-computer interaction techniques that redefine our understanding of both computers and interaction. We propose the notion of Reality-Based Interaction (RBI) as a unifying concept that ties together a large subset of these emerging interaction styles. Based on this concept of RBI, we provide a framework that can be used to understand, compare, and relate current paths of recent HCI research as well as to analyze specific interaction designs. We believe that viewing interaction through the lens of RBI provides insights for design and uncovers gaps or opportunities for future research.

631 citations