Patent

Gestures for touch sensitive input devices

TL;DR: In this article, the authors describe a system for processing touch inputs that reads data from a multipoint sensing device and identifies at least one multipoint gesture based on the data from that device.
Abstract: Methods and systems for processing touch inputs are disclosed. The invention in one respect includes reading data from a multipoint sensing device such as a multipoint touch screen where the data pertains to touch input with respect to the multipoint sensing device, and identifying at least one multipoint gesture based on the data from the multipoint sensing device.
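For illustration only, here is a minimal Python sketch of how multipoint data might be classified into a gesture; the Touch type, the frame-pair interface, and the 10% thresholds are assumptions and are not taken from the patent.

from dataclasses import dataclass
from math import hypot

@dataclass
class Touch:
    x: float
    y: float

def identify_gesture(prev: list[Touch], curr: list[Touch]) -> str:
    """Classify two consecutive frames of multipoint data as a simple gesture."""
    if len(prev) != 2 or len(curr) != 2:
        return "none"                      # only the two-finger case is sketched here
    d_prev = hypot(prev[0].x - prev[1].x, prev[0].y - prev[1].y)
    d_curr = hypot(curr[0].x - curr[1].x, curr[0].y - curr[1].y)
    if d_curr > d_prev * 1.1:
        return "zoom-in"                   # fingers spreading apart
    if d_curr < d_prev * 0.9:
        return "zoom-out"                  # fingers pinching together
    return "two-finger-drag"               # spacing roughly constant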
Citations
Patent
06 Sep 2007
TL;DR: In this paper, a computer-implemented method for use in conjunction with a computing device with a touch screen display comprises: detecting one or more finger contacts with the touch screen, applying one or several heuristics to the finger contacts to determine a command for the device, and processing the command.
Abstract: A computer-implemented method for use in conjunction with a computing device with a touch screen display comprises: detecting one or more finger contacts with the touch screen display, applying one or more heuristics to the one or more finger contacts to determine a command for the device, and processing the command. The one or more heuristics comprise: a heuristic for determining that the one or more finger contacts correspond to a one-dimensional vertical screen scrolling command, a heuristic for determining that the one or more finger contacts correspond to a two-dimensional screen translation command, and a heuristic for determining that the one or more finger contacts correspond to a command to transition from displaying a respective item in a set of items to displaying a next item in the set of items.
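As a hedged illustration of how such heuristics might be expressed (not the patent's actual rules), a small Python sketch follows; the 3:1 dominance thresholds and command names are assumptions.

def classify_contact(dx: float, dy: float, finger_count: int) -> str:
    """Map a finger-contact movement to a command using simple heuristics."""
    if finger_count == 1 and abs(dy) > 3 * abs(dx):
        return "vertical-scroll"       # mostly vertical single-finger drag
    if finger_count == 1 and abs(dx) > 3 * abs(dy):
        return "next-item"             # mostly horizontal swipe to the next item
    return "2d-translation"            # otherwise translate the screen in both axes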

2,167 citations

Patent
03 Mar 2006
TL;DR: In this paper, a multi-functional hand-held device capable of configuring user inputs based on how the device is to be used is presented, along with a user-configurable GUI for each of the device's multiple functions.
Abstract: Disclosed herein is a multi-functional hand-held device capable of configuring user inputs based on how the device is to be used. Preferably, the multifunctional hand-held device has at most only a few physical buttons, keys, or switches so that its display size can be substantially increased. The multifunctional hand-held device also incorporates a variety of input mechanisms, including touch sensitive screens, touch sensitive housings, display actuators, audio input, etc. The device also incorporates a user-configurable GUI for each of the multiple functions of the device.

1,844 citations

Patent
11 Jan 2011
TL;DR: In this article, an intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions.
Abstract: An intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions. The system can be implemented using any of a number of different platforms, such as the web, email, smartphone, and the like, or any combination thereof. In one embodiment, the system is based on sets of interrelated domains and tasks, and employs additional functionally powered by external services with which the system can interact.
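Purely as an illustration of the domain-and-service idea described above, the following Python sketch routes an utterance to an external service; the domain names, the matching rule, and the service stubs are invented and do not reflect the patented system.

SERVICES = {
    "weather": lambda q: f"calling a weather service for: {q}",
    "restaurant": lambda q: f"calling a restaurant service for: {q}",
}

def handle_request(utterance: str) -> str:
    """Pick a domain mentioned in the utterance and invoke its external service."""
    for domain, service in SERVICES.items():
        if domain in utterance.lower():
            return service(utterance)
    return "ask a clarifying question"     # fall back to conversational dialog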

1,462 citations

Patent
Jong Hwan Kim
13 Mar 2015
TL;DR: In this article, a mobile terminal is described that includes a body; a touchscreen provided on the front and extending onto a side of the body and configured to display content; and a controller configured to detect when one side of the body comes into contact with a side of an external terminal and to display, on the touchscreen, a first area corresponding to the contact area between the body and the external terminal and a second area including the content.
Abstract: A mobile terminal including a body; a touchscreen provided on the front and extending to a side of the body and configured to display content; and a controller configured to detect when one side of the body comes into contact with one side of an external terminal, display a first area on the touchscreen corresponding to a contact area of the body and the external terminal and a second area including the content, receive an input of moving the content displayed in the second area to the first area, display the content in the first area, and share the content in the first area with the external terminal.
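The interaction flow described above can be pictured with a small Python sketch; every class, method, and data structure here is hypothetical and stands in for the controller, touchscreen, and device-to-device link named in the abstract.

class SharingController:
    def __init__(self):
        self.areas = {}            # what is shown where on the touchscreen
        self.shared = []           # items sent to the external terminal (stand-in for the link)

    def on_side_contact(self, contact_region, content):
        # Side-to-side contact detected: show a first area at the contact
        # region and keep the existing content in a second area.
        self.areas["first"] = {"region": contact_region, "items": []}
        self.areas["second"] = {"items": list(content)}

    def on_move_to_first_area(self, item):
        # Moving an item into the first area displays it there and shares it.
        self.areas["second"]["items"].remove(item)
        self.areas["first"]["items"].append(item)
        self.shared.append(item)   # stand-in for transmitting to the other device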

1,441 citations

Patent
19 Jul 2005
TL;DR: In this article, a user interface method is presented that detects a touch and then determines a user interface mode when the touch is detected. The method further includes activating one or more GUI elements based on the user interface mode and in response to the detected touch.
Abstract: A user interface method is disclosed. The method includes detecting a touch and then determining a user interface mode when a touch is detected. The method further includes activating one or more GUI elements based on the user interface mode and in response to the detected touch.
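A minimal Python sketch of the idea, assuming an invented set of mode names and a lookup table of GUI elements; none of these names come from the patent.

GUI_ELEMENTS = {
    "navigation": ["scroll_wheel"],
    "text_entry": ["virtual_keyboard"],
    "media": ["transport_controls", "volume_slider"],
}

def determine_mode(active_app: str) -> str:
    """Pick a user interface mode based on what is currently running."""
    if active_app == "music_player":
        return "media"
    if active_app == "notes":
        return "text_entry"
    return "navigation"

def on_touch(active_app: str) -> list[str]:
    """On a detected touch, determine the UI mode and return the GUI elements to activate."""
    mode = determine_mode(active_app)
    return GUI_ELEMENTS.get(mode, [])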

1,390 citations

References
Patent
04 Aug 1997
TL;DR: In this article, a touchpad assembly and method are presented for providing a signal to a computer indicative of the location and applied pressure of an object touching the touchpad assembly.
Abstract: A touchpad assembly and method for providing a signal to a computer indicative of the location and applied pressure of an object touching the touchpad assembly is provided. The touchpad assembly includes X and Y position and pressure sensitive semiconductor resistance sensor layers. The X and Y sensors have a pair of spaced apart X and Y conductive traces running across opposite ends such that a resistance RX connects the pair of X traces and a resistance RY connects the pair of Y traces. The X and Y sensors come into contact at a contact point when an object asserts a pressure on the touchpad. The contact point is connected to each trace by a variable pressure resistance RZ associated with the X and Y sensors and variable position resistances of the X and Y resistances. First and second pair of timing capacitors are connected to respective ones of the pairs of X and Y traces. A microprocessor controls and monitors charging time of the capacitors to determine the position and asserted pressure of the object touching the touchpad.
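To make the timing scheme concrete, here is a simplified Python sketch that turns measured charge times into a normalized position and a pressure estimate; the formulas are a loose reading of the described RC arrangement, not the patent's exact math.

def position_from_times(t_left: float, t_right: float) -> float:
    """Estimate a normalized X position from the charge times measured at the two X traces."""
    # A touch nearer one trace leaves less position resistance on that side,
    # so that capacitor charges faster; the ratio of times approximates the position.
    return t_left / (t_left + t_right)

def pressure_from_times(t_untouched: float, t_touched: float) -> float:
    """Harder presses lower the contact resistance RZ, shortening the measured charge time."""
    return max(0.0, (t_untouched - t_touched) / t_untouched)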

367 citations

Patent
12 Aug 1997
TL;DR: An apparatus and method for touchpad-based scroll control, comprising a data packet processor working in conjunction with a touchpad, are described in this article, although no particular user interface is specified.
Abstract: An apparatus for touchpad-based scroll control and method for scroll control comprising a data packet processor working in conjunction with a touchpad. A scroll zone, having a central axis, is defined on the touchpad. After detecting a user running a finger on the touchpad in a direction substantially parallel to an axis running the length of the scroll zone, the processor software sends scrolling messages to the operating system or application that owns an active window. The packet processing software is configured to not scroll on motions that are not substantially parallel to the axis of the scroll zone, thereby avoiding unwanted interference with normal program function, and also stops scrolling when the user lifts the scroll-activating finger or moves it in direction substantially perpendicular to the scroll zone.
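A rough Python sketch of the scroll-zone logic, assuming a normalized touchpad, an invented zone on the right edge, and an arbitrary parallelism test; none of these constants are from the patent.

SCROLL_ZONE_X = (0.9, 1.0)     # right-hand strip of a normalized touchpad

def scroll_delta(prev: tuple[float, float], curr: tuple[float, float]) -> int:
    """Return scroll lines for a motion inside the scroll zone, else 0."""
    in_zone = SCROLL_ZONE_X[0] <= curr[0] <= SCROLL_ZONE_X[1]
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    parallel_to_axis = abs(dy) > 2 * abs(dx)   # motion mostly along the zone's central axis
    if in_zone and parallel_to_axis:
        return int(dy * 100)                   # forward as scrolling messages to the active window
    return 0                                   # perpendicular motion or outside the zone: no scroll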

366 citations

Patent
09 Dec 1998
TL;DR: In this paper, a system and method are presented for manipulating virtual objects in a virtual environment, for drawing curves and ribbons in the virtual environment, and for selecting and executing commands for creating, deleting, moving, changing, and resizing virtual objects using intuitive hand gestures and motions.
Abstract: A system and method for manipulating virtual objects in a virtual environment, for drawing curves and ribbons in the virtual environment, and for selecting and executing commands for creating, deleting, moving, changing, and resizing virtual objects in the virtual environment using intuitive hand gestures and motions. The system is provided with a display for displaying the virtual environment and with a video gesture recognition subsystem for identifying motions and gestures of a user's hand. The system enables the user to manipulate virtual objects, to draw free-form curves and ribbons and to invoke various command sets and commands in the virtual environment by presenting particular predefined hand gestures and/or hand movements to the video gesture recognition subsystem.
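As a sketch only, the mapping from recognized gestures to virtual-environment commands might look like the Python below; the gesture labels and command names are invented for illustration and are not taken from the patent.

COMMANDS = {
    "pinch": "grab_object",
    "open_palm": "release_object",
    "point_and_drag": "draw_ribbon",
    "two_hand_stretch": "resize_object",
}

def dispatch(recognized_gesture: str) -> str:
    """Translate a gesture reported by the video recognition subsystem into a command name."""
    return COMMANDS.get(recognized_gesture, "no_op")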

365 citations

Patent
19 Aug 2004
TL;DR: In this article, a touch pad system is disclosed that includes mapping the touch pad into native sensor coordinates and producing native values of the native sensor coordinates when events occur on the touch pad.
Abstract: A touch pad system is disclosed. The system includes mapping the touch pad into native sensor coordinates. The system also includes producing native values of the native sensor coordinates when events occur on the touch pad. The system further includes filtering the native values of the native sensor coordinates based on the type of events that occur on the touch pad. The system additionally includes generating a control signal based on the native values of the native sensor coordinates when a desired event occurs on the touch pad.
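An illustrative Python sketch of the described pipeline (map to native sensor coordinates, filter by event type, emit a control signal); the event names, the smoothing filter, and the output format are assumptions.

_history: list[tuple[float, float]] = []

def process_event(raw_col: float, raw_row: float, event_type: str):
    """Map a raw reading to native sensor coordinates, filter by event type, emit a control signal."""
    native = (raw_col, raw_row)              # native sensor coordinates
    if event_type == "tap":
        filtered = native                    # taps pass through unfiltered
    elif event_type == "slide":
        _history.append(native)              # slides get a short moving average
        recent = _history[-4:]
        filtered = tuple(sum(c) / len(recent) for c in zip(*recent))
    else:
        return None                          # ignore events that are not of interest
    return {"signal": "move_cursor", "value": filtered}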

364 citations

Proceedings ArticleDOI
01 Oct 1997
TL;DR: This TechNote reports initial results on realizing a computer-augmented wall, called the HoloWall, using an infrared camera located behind the wall; the system allows a user to interact with the computerized wall using fingers, hands, their body, or even a physical object such as a document folder.
Abstract: This TechNote reports on our initial results of realizing a computer augmented wall called the HoloWall. Using an infrared camera located behind the wall, this system allows a user to interact with this computerized wall using fingers, hands, their body, or even a physical object such as a document folder.
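A rough Python sketch of how contacts might be picked out of the rear infrared camera image by simple thresholding; the threshold value and the NumPy-based image representation are assumptions, not details from the TechNote.

import numpy as np

def detect_contacts(ir_frame: np.ndarray, threshold: int = 200) -> list[tuple[int, int]]:
    """Return pixel coordinates whose infrared reflection is bright enough to count as contact."""
    # Objects close to or touching the wall reflect more infrared light, so
    # bright pixels in the rear-camera image mark fingers, hands, or objects.
    ys, xs = np.nonzero(ir_frame > threshold)
    return list(zip(xs.tolist(), ys.tolist()))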

359 citations