Patent

Gestures for touch sensitive input devices

TL;DR: In this patent, the authors describe methods for processing touch inputs by reading data from a multipoint sensing device and identifying at least one multipoint gesture based on that data.
Abstract: Methods and systems for processing touch inputs are disclosed. The invention in one respect includes reading data from a multipoint sensing device such as a multipoint touch screen where the data pertains to touch input with respect to the multipoint sensing device, and identifying at least one multipoint gesture based on the data from the multipoint sensing device.
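As a rough illustration only (not the patent's implementation), the sketch below reads two consecutive frames of touch points from a hypothetical multipoint device and labels a simple two-finger gesture; the names and thresholds are assumptions.

```python
# Illustrative sketch: identify a two-finger gesture from consecutive frames of
# touch data read from a hypothetical multipoint sensing device.
from dataclasses import dataclass
from math import hypot
from typing import List, Optional

@dataclass
class TouchPoint:
    x: float
    y: float

def identify_multipoint_gesture(prev: List[TouchPoint],
                                curr: List[TouchPoint],
                                threshold: float = 10.0) -> Optional[str]:
    """Classify a gesture from two consecutive frames of touch points."""
    if len(prev) != 2 or len(curr) != 2:
        return None  # only the two-finger case is sketched here
    d_prev = hypot(prev[0].x - prev[1].x, prev[0].y - prev[1].y)
    d_curr = hypot(curr[0].x - curr[1].x, curr[0].y - curr[1].y)
    if d_curr - d_prev > threshold:
        return "zoom-in"        # fingers spreading apart
    if d_prev - d_curr > threshold:
        return "zoom-out"       # fingers pinching together
    return "two-finger-drag"    # spacing roughly constant

# Example: two fingers spreading apart between frames
frame1 = [TouchPoint(100, 200), TouchPoint(140, 200)]
frame2 = [TouchPoint(90, 200), TouchPoint(160, 200)]
print(identify_multipoint_gesture(frame1, frame2))  # -> "zoom-in"
```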
Citations
Patent
06 Sep 2007
TL;DR: In this patent, a computer-implemented method for use with a computing device having a touch screen display comprises: detecting one or more finger contacts with the touch screen, applying one or more heuristics to the finger contacts to determine a command for the device, and processing the command.
Abstract: A computer-implemented method for use in conjunction with a computing device with a touch screen display comprises: detecting one or more finger contacts with the touch screen display, applying one or more heuristics to the one or more finger contacts to determine a command for the device, and processing the command. The one or more heuristics comprise: a heuristic for determining that the one or more finger contacts correspond to a one- dimensional vertical screen scrolling command, a heuristic for determining that the one or more finger contacts correspond to a two-dimensional screen translation command, and a heuristic for determining that the one or more finger contacts correspond to a command to transition from displaying a respective item in a set of items to displaying a next item in the set of items.
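A minimal, hypothetical sketch of the kind of heuristic dispatch the abstract describes, keyed only to finger count and movement direction; the thresholds, names, and rules are assumptions, not the patented heuristics.

```python
# Illustrative heuristic dispatcher: map a finger-contact gesture to one of the
# three commands named in the abstract (vertical scroll, 2-D translation, next item).
from dataclasses import dataclass

@dataclass
class ContactGesture:
    fingers: int
    dx: float   # horizontal displacement in pixels
    dy: float   # vertical displacement in pixels

def determine_command(g: ContactGesture, angle_tolerance: float = 0.3) -> str:
    if g.fingers == 1:
        # Mostly vertical single-finger motion -> one-dimensional vertical scroll
        if abs(g.dy) > 0 and abs(g.dx) / (abs(g.dy) + 1e-9) < angle_tolerance:
            return "vertical-scroll"
        # Predominantly horizontal swipe -> advance to the next item in a set
        if abs(g.dx) > abs(g.dy):
            return "next-item"
    # Otherwise treat the motion as free two-dimensional screen translation
    return "translate-2d"

print(determine_command(ContactGesture(1, dx=2, dy=-120)))   # vertical-scroll
print(determine_command(ContactGesture(1, dx=150, dy=10)))   # next-item
print(determine_command(ContactGesture(2, dx=40, dy=35)))    # translate-2d
```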

2,167 citations

Patent
03 Mar 2006
TL;DR: In this patent, a multi-functional hand-held device capable of configuring user inputs based on how the device is to be used is presented, along with a user-configurable GUI for each of the device's multiple functions.
Abstract: Disclosed herein is a multi-functional hand-held device capable of configuring user inputs based on how the device is to be used. Preferably, the multifunctional hand-held device has at most only a few physical buttons, keys, or switches so that its display size can be substantially increased. The multifunctional hand-held device also incorporates a variety of input mechanisms, including touch sensitive screens, touch sensitive housings, display actuators, audio input, etc. The device also incorporates a user-configurable GUI for each of the multiple functions of the devices.
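Purely as an illustration, a toy model of a device that reconfigures its on-screen GUI for each selected function; the function names and GUI elements are invented for the example, not taken from the patent.

```python
# Illustrative sketch: swap the touch-screen GUI layout per selected function.
class MultiFunctionDevice:
    GUI_LAYOUTS = {
        "phone": ["dial-pad", "call-button"],
        "music": ["scroll-wheel", "play-pause"],
        "pda":   ["virtual-keyboard", "stylus-canvas"],
    }

    def __init__(self):
        self.mode = "phone"

    def switch_function(self, mode: str):
        """Reconfigure the on-screen GUI for the selected function."""
        if mode not in self.GUI_LAYOUTS:
            raise ValueError(f"unknown function: {mode}")
        self.mode = mode
        return self.GUI_LAYOUTS[mode]

device = MultiFunctionDevice()
print(device.switch_function("music"))   # ['scroll-wheel', 'play-pause']
```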

1,844 citations

Patent
11 Jan 2011
TL;DR: In this patent, an intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions.
Abstract: An intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions. The system can be implemented using any of a number of different platforms, such as the web, email, smartphone, and the like, or any combination thereof. In one embodiment, the system is based on sets of interrelated domains and tasks, and employs additional functionally powered by external services with which the system can interact.
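For illustration only, a toy dispatcher that maps a request to a domain/task pair and calls a placeholder external service; the domains, tasks, and keyword matching are assumptions and not the patented system.

```python
# Illustrative sketch: classify a natural-language request into a domain/task
# pair, then invoke a placeholder external service for that pair.
def classify_request(text: str):
    text = text.lower()
    if "weather" in text:
        return ("weather", "get_forecast")
    if "reserve" in text or "table" in text:
        return ("dining", "book_reservation")
    return ("general", "web_search")

def invoke_external_service(domain: str, task: str, text: str) -> str:
    # Placeholder for calling a real external web service for this domain/task.
    return f"[{domain}:{task}] handling request: {text!r}"

request = "Reserve a table for two"
domain, task = classify_request(request)
print(invoke_external_service(domain, task, request))
```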

1,462 citations

Patent
Jong Hwan Kim1
13 Mar 2015
TL;DR: In this patent, a mobile terminal is described that includes a body; a touchscreen provided on the front and extending to a side of the body and configured to display content; and a controller configured to detect when one side of the body comes into contact with one side of an external terminal and to display a first area on the touchscreen, corresponding to the contact area between the body and the external terminal, alongside a second area containing the content.
Abstract: A mobile terminal including a body; a touchscreen provided on the front and extending to a side of the body and configured to display content; and a controller configured to detect when one side of the body comes into contact with one side of an external terminal, display a first area on the touchscreen corresponding to the contact area between the body and the external terminal and a second area including the content, receive an input moving the content displayed in the second area to the first area, display the content in the first area, and share the content in the first area with the external terminal.
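As a loose illustration of the described flow (detect side contact, expose a contact-area region, share content moved into it), a toy controller follows; all class and method names are assumptions.

```python
# Illustrative sketch: expose a drop zone when a side contact with an external
# terminal is detected, and share any content moved into that zone.
class SharingController:
    def __init__(self):
        self.contact_side = None
        self.shared = []

    def on_side_contact(self, side: str):
        """One side of the body comes into contact with an external terminal."""
        self.contact_side = side
        print(f"Showing drop zone along the {side} edge of the touchscreen")

    def move_content_to_contact_area(self, content: str):
        """Content dragged from the content area into the drop zone is shared."""
        if self.contact_side is None:
            raise RuntimeError("No external terminal in contact")
        self.shared.append(content)
        print(f"Shared '{content}' with the terminal on the {self.contact_side} side")

ctrl = SharingController()
ctrl.on_side_contact("right")
ctrl.move_content_to_contact_area("photo_0142.jpg")
```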

1,441 citations

Patent
19 Jul 2005
TL;DR: In this patent, a user interface method is presented that detects a touch and then determines a user interface mode when the touch is detected. The method further includes activating one or more GUI elements based on the user interface mode and in response to the detected touch.
Abstract: A user interface method is disclosed. The method includes detecting a touch and then determining a user interface mode when a touch is detected. The method further includes activating one or more GUI elements based on the user interface mode and in response to the detected touch.
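A minimal sketch of the touch-to-mode-to-GUI flow the abstract outlines; the mode names and GUI elements are assumptions.

```python
# Illustrative sketch: determine a UI mode when a touch is detected, then
# activate the GUI elements associated with that mode.
def determine_ui_mode(app_state: str) -> str:
    # The patent leaves the mode decision open; here it is keyed to app state.
    return {"playing": "playback", "browsing": "navigation"}.get(app_state, "default")

def activate_gui_elements(mode: str) -> list:
    elements = {
        "playback":   ["scrub-bar", "volume-slider"],
        "navigation": ["scroll-wheel", "back-button"],
        "default":    ["menu-button"],
    }
    return elements[mode]

def on_touch_detected(app_state: str) -> list:
    mode = determine_ui_mode(app_state)
    return activate_gui_elements(mode)

print(on_touch_detected("playing"))   # ['scrub-bar', 'volume-slider']
```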

1,390 citations

References
Patent
Cary Lee Bates1, Paul Reuben Day1
22 Nov 1999
TL;DR: In this patent, a navigation button corresponding to a displayed link is displayed in response to a predetermined event, such as changing the displayed portion of a hypertext document or touching the screen, especially when the touch falls outside all displayed links.
Abstract: A method, computer system, program product is provided for enhancing interaction with a hypertext document rendered by a browser on a touch screen. A navigation button is displayed corresponding to a displayed link in response to a predetermined event, such as changing the displayed portion of a hypertext document or touching the screen, especially if outside of all displayed links. When a plurality of links are present, prioritization of navigation buttons displayed is contemplated, based on those closest to the area touched and on a maximum allowable number of navigation buttons.
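A small illustrative sketch of the prioritization idea: keep navigation buttons for the links closest to the touched point, up to a maximum count; the data structures and distance metric are assumptions.

```python
# Illustrative sketch: choose which navigation buttons to show for a touch,
# preferring the rendered links nearest the touched point.
from math import hypot

def prioritize_navigation_buttons(links, touch_x, touch_y, max_buttons=3):
    """links: iterable of (name, x, y) tuples for rendered hyperlink positions."""
    by_distance = sorted(links, key=lambda l: hypot(l[1] - touch_x, l[2] - touch_y))
    return [name for name, _, _ in by_distance[:max_buttons]]

links = [("Home", 40, 30), ("Contact", 200, 310), ("News", 180, 295), ("About", 60, 400)]
print(prioritize_navigation_buttons(links, touch_x=190, touch_y=300))
# closest links first -> ['News', 'Contact', 'About']
```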

185 citations

Patent
21 May 1998
TL;DR: In this patent, the operating system monitors the sensors and prevents the display of non-driving-related windows to the driver when the vehicle is in motion or has the potential for motion.
Abstract: A vehicle computer system includes a display device that is configurable for viewing by a driver of a vehicle while the vehicle is moving. The display device is responsive to a processor that executes application programs in conjunction with an operating system. A plurality of sensors are used to indicate the position of the display device, to indicate vehicle motion, and to indicate the state of a chosen vehicle control such as a parking brake. Application programs open display windows in conjunction with the operating system. In opening a window, an application program can indicate whether the window will contain driving-related information. The operating system monitors the sensors and prevents the display of non-driving-related windows to the driver when the vehicle is in motion or when it has the potential for motion. Specifically, the operating system in such a situation hides any windows that have not been specified as being driving-related. Initially, all windows are assumed to contain non-driver related information; hence, they are hidden until it is determined what type of information they display.
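An illustrative toy rule in the spirit of the abstract: hide any window not flagged as driving-related whenever motion, or the potential for motion, is detected; the names and the brake logic are assumptions.

```python
# Illustrative sketch: hide non-driving-related windows when the vehicle is in
# motion or the parking brake is released (potential for motion).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Window:
    title: str
    driving_related: bool = False   # windows default to non-driving-related
    visible: bool = True

@dataclass
class VehicleDisplayManager:
    windows: List[Window] = field(default_factory=list)

    def update(self, in_motion: bool, parking_brake_set: bool):
        may_move = in_motion or not parking_brake_set
        for w in self.windows:
            w.visible = w.driving_related or not may_move

mgr = VehicleDisplayManager([Window("Navigation", driving_related=True),
                             Window("Email client")])
mgr.update(in_motion=True, parking_brake_set=False)
print([(w.title, w.visible) for w in mgr.windows])
# -> [('Navigation', True), ('Email client', False)]
```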

182 citations

Patent
04 May 2005
TL;DR: In this patent, a command pattern recognition system based on a virtual keyboard layout combines pattern recognition with a virtual, graphical, or on-screen keyboard to provide a command control method with relative ease of use.
Abstract: A command pattern recognition system based on a virtual keyboard layout combines pattern recognition with a virtual, graphical, or on-screen keyboard to provide a command control method with relative ease of use. The system allows the user to conveniently issue commands on pen-based computing or communication devices. The system supports a very large set of commands, including practically all commands needed for any application. By utilizing shortcut definitions it can work with any existing software without any modification. In addition, the system utilizes various techniques to achieve reliable recognition of a very large gesture vocabulary. Further, the system provides feedback and display methods to help the user effectively use and learn command gestures for commands.
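A purely illustrative sketch of gesture-to-command matching over a virtual keyboard layout, using a crude resampled point-to-point distance; the layout, templates, and matcher are assumptions, not the patented recognizer.

```python
# Illustrative sketch: match a pen stroke against command templates traced over
# the key centers of a toy virtual keyboard layout.
from math import hypot

ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEY_POS = {c: (x, y) for y, row in enumerate(ROWS) for x, c in enumerate(row)}

def template_for(word):
    """A command template is the polyline through its letters' key centers."""
    return [KEY_POS[c] for c in word]

def resample(points, n=16):
    return [points[round(i * (len(points) - 1) / (n - 1))] for i in range(n)]

def stroke_distance(stroke, template):
    a, b = resample(stroke), resample(template)
    return sum(hypot(p[0] - q[0], p[1] - q[1]) for p, q in zip(a, b)) / len(a)

def recognize_command(stroke, commands=("copy", "paste", "undo")):
    return min(commands, key=lambda w: stroke_distance(stroke, template_for(w)))

# A stroke roughly tracing c -> o -> p -> y is recognized as the "copy" command.
print(recognize_command(template_for("copy")))  # -> 'copy'
```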

181 citations

PatentDOI
TL;DR: In a preferred embodiment, each option of a menu is associated respectively with a selectable region displayed adjacent an edge of a display, forming a perimeter menu and leaving a region in the center of the perimeter menu for the output of an application program.
Abstract: The apparatus and method of the invention relate to data entry and menu selection. Applications include: (a) data entry for ideographic languages, including Chinese, Japanese and Korean; (b) fast food ordering; (c) correction of documents generated by optical character recognition; and (d) computer access and speech synthesis by persons temporarily or permanently lacking normal motor capabilities. In a preferred embodiment, each option of a menu is associated respectively with a selectable region displayed adjacent an edge of a display, forming a perimeter menu and leaving a region in the center of the perimeter menu for the output of an application program. Selectable regions may be on the display, outside the display, or both. A menu option may be selected by clicking on the associated selectable region, by dwelling on it for a selection threshold period, by a cursor path toward the selectable region, or by a combination thereof. Remaining dwell time required to select a selectable region is preferably indicated by the brightness of the selectable region. Submenus of a perimeter menu may also be perimeter menus, and the location of a submenu option may be foretold by the appearance of its parent menu option. Menu options may be ideographs sharing a sound, a structure or another characteristic. Ideographs, which may be homophones of one another, may be associated with colored indicating regions, and selection of an ideograph may be made by speaking the name of the associated color.
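An illustrative sketch of dwell-to-select with brightness feedback on a single perimeter-menu region; the timing values and class names are assumptions.

```python
# Illustrative sketch: a perimeter-menu region selected by dwelling on it, with
# brightness indicating how much dwell time remains.
class PerimeterMenuRegion:
    def __init__(self, label, dwell_threshold=1.0):
        self.label = label
        self.dwell_threshold = dwell_threshold  # seconds of dwell to select
        self.dwell_time = 0.0

    def tick(self, dt, cursor_inside):
        """Advance time; return the label once the dwell threshold is reached."""
        self.dwell_time = self.dwell_time + dt if cursor_inside else 0.0
        return self.label if self.dwell_time >= self.dwell_threshold else None

    def brightness(self):
        """Brighter as less dwell time remains (0.0 dim .. 1.0 full)."""
        return min(self.dwell_time / self.dwell_threshold, 1.0)

region = PerimeterMenuRegion("中")     # e.g. an ideograph menu option
for step in range(5):
    selected = region.tick(dt=0.25, cursor_inside=True)
    print(f"brightness={region.brightness():.2f} selected={selected}")
```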

181 citations

Patent
George Francis Destefano1
15 Dec 1997
TL;DR: In this patent, a computer system and method manipulate multiple windows or similar graphical user interface components using a proximity pointer that concurrently manipulates all windows at least partially disposed within a proximity range located proximate the pointer.
Abstract: A computer system and method manipulate multiple windows or similar graphical user interface components using a proximity pointer that concurrently manipulates windows that are at least partially disposed within a proximity range located proximate the pointer. Windows may be concurrently moved or resized in response to movement of the pointer. In the alternative, windows may be concurrently moved or resized either inwardly or outwardly along radial lines extending from a common origin located proximate the pointer.
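An illustrative sketch of a proximity pointer that concurrently moves every window intersecting a circular range around the pointer; the geometry and window representation are assumptions.

```python
# Illustrative sketch: move all windows at least partially inside a circular
# proximity range around the pointer by the same offset.
from dataclasses import dataclass

@dataclass
class Window:
    title: str
    x: float
    y: float
    w: float
    h: float

    def intersects_circle(self, cx, cy, r):
        # Clamp the circle center to the rectangle to find the nearest point.
        nx = min(max(cx, self.x), self.x + self.w)
        ny = min(max(cy, self.y), self.y + self.h)
        return (nx - cx) ** 2 + (ny - cy) ** 2 <= r ** 2

def proximity_move(windows, pointer_x, pointer_y, dx, dy, radius=150):
    """Concurrently move every window at least partially inside the range."""
    for win in windows:
        if win.intersects_circle(pointer_x, pointer_y, radius):
            win.x += dx
            win.y += dy

wins = [Window("Editor", 100, 100, 300, 200), Window("Console", 900, 700, 400, 150)]
proximity_move(wins, pointer_x=150, pointer_y=150, dx=30, dy=0)
print([(w.title, w.x) for w in wins])   # only 'Editor' moves
```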

181 citations