Patent

Gestures for touch sensitive input devices

TL;DR: In this article, the authors describe a system for processing touch inputs by reading data from a multipoint sensing device, where the data pertains to touch input with respect to that device, and identifying at least one multipoint gesture based on the data.
Abstract: Methods and systems for processing touch inputs are disclosed. The invention in one respect includes reading data from a multipoint sensing device such as a multipoint touch screen where the data pertains to touch input with respect to the multipoint sensing device, and identifying at least one multipoint gesture based on the data from the multipoint sensing device.
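The abstract describes identifying a multipoint gesture from sensing-device data. A minimal sketch of that idea, assuming two-finger frames of (x, y) points and a hypothetical distance threshold (neither is specified in the patent):

```python
import math

def classify_two_finger_gesture(prev, curr, threshold=10.0):
    """Classify a two-finger gesture from two consecutive touch frames.

    prev, curr: lists of two (x, y) contact points read from the
    multipoint sensing device. Returns 'pinch_out', 'pinch_in', or
    'none' based on how the spread between the contacts changes.
    The 10-pixel threshold is a hypothetical noise margin.
    """
    def spread(points):
        (x0, y0), (x1, y1) = points
        return math.hypot(x1 - x0, y1 - y0)

    delta = spread(curr) - spread(prev)
    if delta > threshold:
        return "pinch_out"   # fingers moving apart
    if delta < -threshold:
        return "pinch_in"    # fingers moving together
    return "none"
```

In practice a gesture recognizer of this kind would run per sensor frame and feed the classification into higher-level handlers (e.g., zoom).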
Citations
Patent
06 Sep 2007
TL;DR: In this paper, a computer-implemented method for use in conjunction with a computing device with a touch screen display comprises: detecting one or more finger contacts with the touch screen display, applying one or more heuristics to the finger contacts to determine a command for the device, and processing the command.
Abstract: A computer-implemented method for use in conjunction with a computing device with a touch screen display comprises: detecting one or more finger contacts with the touch screen display, applying one or more heuristics to the one or more finger contacts to determine a command for the device, and processing the command. The one or more heuristics comprise: a heuristic for determining that the one or more finger contacts correspond to a one- dimensional vertical screen scrolling command, a heuristic for determining that the one or more finger contacts correspond to a two-dimensional screen translation command, and a heuristic for determining that the one or more finger contacts correspond to a command to transition from displaying a respective item in a set of items to displaying a next item in the set of items.
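The abstract distinguishes a one-dimensional vertical scroll from a two-dimensional translation by applying heuristics to finger movement. One plausible heuristic is an angular cone around the vertical axis; the 27-degree tolerance below is a hypothetical value, not one stated in the patent:

```python
import math

def classify_drag(dx, dy, vertical_cone_deg=27.0):
    """Heuristic command classifier for a one-finger drag.

    A drag whose direction falls within a cone around vertical maps to
    a one-dimensional vertical scroll; anything else maps to a
    two-dimensional screen translation. The cone width is a
    hypothetical tuning parameter.
    """
    if dx == 0 and dy == 0:
        return "none"
    # Angle from the vertical axis: 0 degrees = perfectly vertical drag.
    angle = math.degrees(math.atan2(abs(dx), abs(dy)))
    if angle <= vertical_cone_deg:
        return "vertical_scroll"
    return "translate_2d"
```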

2,167 citations

Patent
03 Mar 2006
TL;DR: In this paper, a multi-functional hand-held device capable of configuring user inputs based on how the device is to be used is presented, along with a user-configurable GUI for each of the device's multiple functions.
Abstract: Disclosed herein is a multi-functional hand-held device capable of configuring user inputs based on how the device is to be used. Preferably, the multifunctional hand-held device has at most only a few physical buttons, keys, or switches so that its display size can be substantially increased. The multifunctional hand-held device also incorporates a variety of input mechanisms, including touch sensitive screens, touch sensitive housings, display actuators, audio input, etc. The device also incorporates a user-configurable GUI for each of the multiple functions of the devices.

1,844 citations

Patent
11 Jan 2011
TL;DR: In this article, an intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions.
Abstract: An intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions. The system can be implemented using any of a number of different platforms, such as the web, email, smartphone, and the like, or any combination thereof. In one embodiment, the system is based on sets of interrelated domains and tasks, and employs additional functionally powered by external services with which the system can interact.

1,462 citations

Patent
Jong Hwan Kim1
13 Mar 2015
TL;DR: In this article, a mobile terminal including a body; a touchscreen provided to a front and extending to side of the body and configured to display content; and a controller configured to detect one side of a body when it comes into contact with a side of an external terminal, display a first area on the touchscreen corresponding to a contact area of body and the external terminal and a second area including the content.
Abstract: A mobile terminal including a body; a touchscreen provided on the front and extending to a side of the body and configured to display content; and a controller configured to detect that one side of the body comes into contact with one side of an external terminal, display a first area on the touchscreen corresponding to the contact area of the body and the external terminal and a second area including the content, receive an input of moving the content displayed in the second area to the first area, display the content in the first area, and share the content in the first area with the external terminal.

1,441 citations

Patent
19 Jul 2005
TL;DR: In this article, a user interface method is presented that detects a touch, determines a user interface mode when the touch is detected, and activates one or more GUI elements based on that mode and in response to the detected touch.
Abstract: A user interface method is disclosed. The method includes detecting a touch and then determining a user interface mode when a touch is detected. The method further includes activating one or more GUI elements based on the user interface mode and in response to the detected touch.
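The flow in the abstract (touch detected, mode determined, mode-specific GUI elements activated) can be sketched as a simple lookup. The mode names and element names below are hypothetical illustrations, not taken from the patent:

```python
def activate_gui_elements(ui_mode, touch_pos):
    """Return the GUI elements to activate for a detected touch.

    ui_mode: the user interface mode determined when the touch was
    detected. touch_pos: (x, y) of the touch, used to anchor the
    elements near the contact. Modes and elements are hypothetical.
    """
    mode_elements = {
        "playback": ["scrub_bar", "volume_slider"],
        "navigation": ["scroll_wheel"],
    }
    elements = mode_elements.get(ui_mode, [])
    return [{"element": name, "anchor": touch_pos} for name in elements]
```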

1,390 citations

References
Book
01 Jan 1967
TL;DR: This book surveys the operations research modelling approach, covering linear programming and the Simplex Method, duality theory and sensitivity analysis, and further topics including queueing theory, inventory theory, forecasting, Markovian decision processes, decision analysis, and simulation.
Abstract: Overview of the operations research modelling approach; introduction to linear programming; solving linear programming problems: the Simplex Method; the theory of the Simplex Method; duality theory and sensitivity analysis; other algorithms for linear programming; the transportation and assignment problems; network analysis, including PERT-CPM; dynamic programming; game theory; integer programming; non-linear programming; Markov chains; queueing theory; the application of queueing theory; inventory theory; forecasting; Markovian decision processes and applications; decision analysis; simulation.

3,830 citations

Patent
25 Jan 1999
TL;DR: In this paper, a simple proximity transduction circuit is placed under each electrode to maximize the signal-to-noise ratio and to reduce wiring complexity, and segmentation processing of each proximity image constructs a group of electrodes corresponding to each distinguishable contact and extracts shape, position, and surface proximity features for each group.
Abstract: Apparatus and methods are disclosed for simultaneously tracking multiple finger (202-204) and palm (206, 207) contacts as hands approach, touch, and slide across a proximity-sensing, compliant, and flexible multi-touch surface (2). The surface consists of compressible cushion (32), dielectric electrode (33), and circuitry layers. A simple proximity transduction circuit is placed under each electrode to maximize the signal-to-noise ratio and to reduce wiring complexity. Scanning and signal offset removal on the electrode array produce low-noise proximity images. Segmentation processing of each proximity image constructs a group of electrodes corresponding to each distinguishable contact and extracts shape, position, and surface proximity features for each group. Groups in successive images which correspond to the same hand contact are linked by a persistent path tracker (245) which also detects individual contact touchdown and liftoff. Classification of intuitive hand configurations and motions enables unprecedented integration of typing, resting, pointing, scrolling, 3D manipulation, and handwriting into a versatile, ergonomic computer input device.
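The persistent path tracker links contact groups in successive proximity images and detects touchdown and liftoff. A greatly simplified stand-in is greedy nearest-neighbour linking with a gating radius; the 40-unit radius below is a hypothetical parameter, not the patent's:

```python
import math

def link_contacts(prev_paths, curr_points, max_dist=40.0):
    """Link contact groups across successive frames.

    prev_paths: {path_id: (x, y)} positions from the previous image.
    curr_points: {point_id: (x, y)} contact centroids in the current
    image. Returns (links, touchdowns, liftoffs): matched pairs,
    unmatched current points (new touchdowns), and unmatched previous
    paths (liftoffs). max_dist is a hypothetical gating radius.
    """
    unmatched = dict(curr_points)
    links, liftoffs = {}, []
    for pid, (px, py) in prev_paths.items():
        best, best_d = None, max_dist
        for cid, (cx, cy) in unmatched.items():
            d = math.hypot(cx - px, cy - py)
            if d < best_d:
                best, best_d = cid, d
        if best is None:
            liftoffs.append(pid)       # no nearby contact: finger lifted
        else:
            links[pid] = best
            del unmatched[best]
    touchdowns = list(unmatched)       # leftover contacts: new touches
    return links, touchdowns, liftoffs
```

A production tracker would resolve contested assignments globally (e.g., Hungarian matching) rather than greedily, but the frame-to-frame linking idea is the same.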

2,576 citations

Book
01 Jan 2002
TL;DR: A user interface is that portion of an interactive computer system that communicates with the user, as mentioned in this paper; its design includes any aspect of the system that is visible to the user, and keyboards, mice, and graphical displays are the most common interface hardware today.
Abstract: A user interface is that portion of an interactive computer system that communicates with the user. Design of the user interface includes any aspect of the system that is visible to the user. Once, all computer users were specialists in computing, and interfaces consisted of jumper wires in patch boards, punched cards (q.v.) prepared offline, and batch printouts. Today a wide range of nonspecialists use computers, and keyboards, mice, and graphical displays are the most common interface hardware. The user interface is becoming a larger and larger portion of the software in a computer system--and a more important portion, as broader groups of people use computers. As computers become more powerful, the critical bottleneck in applying computer-based systems to solve problems is more often in the user interface rather than in the computer hardware or software.

2,234 citations

Patent
06 Oct 1995
TL;DR: In this paper, the authors proposed a method of generating a signal comprising providing a capacitive touch sensor pad including a matrix of X and Y conductors, developing capacitance profiles in one of an X direction and a Y direction from the matrix, and determining an occurrence of a single gesture through an examination of the capacitance profiles, the single gesture including an application of at least two objects on the capacitive touch sensor pad.
Abstract: A method of generating a signal comprising providing a capacitive touch sensor pad including a matrix of X and Y conductors, developing capacitance profiles in one of an X direction and a Y direction from the matrix of X and Y conductors, determining an occurrence of a single gesture through an examination of the capacitance profiles, the single gesture including an application of at least two objects on the capacitive touch sensor pad, and generating a signal indicating the occurrence of the single gesture.
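Detecting "at least two objects" from a capacitance profile amounts to counting distinct peaks along one axis. A minimal sketch, assuming a 1D list of per-conductor capacitance readings and a hypothetical noise floor:

```python
def count_profile_peaks(profile, noise_floor=5):
    """Count local maxima in a 1D capacitance profile.

    profile: capacitance readings along the X (or Y) conductors.
    noise_floor is a hypothetical threshold separating real contacts
    from sensor noise.
    """
    peaks = 0
    for i in range(1, len(profile) - 1):
        if (profile[i] > noise_floor
                and profile[i] >= profile[i - 1]
                and profile[i] > profile[i + 1]):
            peaks += 1
    return peaks

def is_two_finger_gesture(x_profile):
    """Two distinguishable peaks suggest two objects on the pad."""
    return count_profile_peaks(x_profile) >= 2
```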

2,103 citations

Journal Article
TL;DR: This paper surveys the literature on computer vision-based interpretation of hand gestures for human-computer interaction, organized by the methods used for modeling, analyzing, and recognizing gestures.
Abstract: The use of hand gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction (HCI). In particular, visual interpretation of hand gestures can help in achieving the ease and naturalness desired for HCI. This has motivated a very active research area concerned with computer vision-based analysis and interpretation of hand gestures. We survey the literature on visual interpretation of hand gestures in the context of its role in HCI. This discussion is organized on the basis of the method used for modeling, analyzing, and recognizing gestures. Important differences in the gesture interpretation approaches arise depending on whether a 3D model of the human hand or an image appearance model of the human hand is used. 3D hand models offer a way of more elaborate modeling of hand gestures but lead to computational hurdles that have not been overcome given the real-time requirements of HCI. Appearance-based models lead to computationally efficient "purposive" approaches that work well under constrained situations but seem to lack the generality desirable for HCI. We also discuss implemented gestural systems as well as other potential applications of vision-based gesture recognition. Although the current progress is encouraging, further theoretical as well as computational advances are needed before gestures can be widely used for HCI. We discuss directions of future research in gesture recognition, including its integration with other natural modes of human-computer interaction.

1,973 citations