scispace - formally typeset
Author

Clifford A. Kushler

Bio: Clifford A. Kushler is an academic researcher. The author has contributed to research in topics: Interface (computing) & Computer keyboard. The author has an h-index of 4, has co-authored 4 publications, and has received 888 citations.

Papers
Patent
16 Jan 2004
TL;DR: In this article, a method and system for continuous stroke word-based text input, having a virtual keyboard (2104) appearing on a touch sensitive screen (2102), was presented.
Abstract: A method and system (2100) for continuous stroke word-based text input, having a virtual keyboard (2104) appearing on a touch sensitive screen (2102). The present invention allows a user to use the virtual keyboard to quickly enter words by drawing a continuous line that passes through or near the keys of each letter in a word in sequence without lifting the stylus. The user traces an input pattern (2200) for a word by contacting the virtual keyboard on or near the key of the first letter of the word, then tracing through each letter in sequence, lifting the stylus from the touch sensitive screen upon reaching the last letter. The present invention then analyzes each input pattern and presents a list of one or more possible words based on the input pattern, for selection by the user.
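The continuous-stroke matching described above can be illustrated with a simplified sketch. This is not the patented algorithm: it snaps each sampled stroke point to its nearest key, collapses repeated keys, and keeps dictionary words whose letters appear in order along the traced key sequence, anchored at the first and last keys touched. The key layout and matching rule are assumptions for illustration only.

```python
# Hypothetical one-row key layout: letter -> (x, y) position.
KEY_POS = {
    'q': (0, 0), 'w': (1, 0), 'e': (2, 0), 'r': (3, 0), 't': (4, 0),
    'y': (5, 0), 'u': (6, 0), 'i': (7, 0), 'o': (8, 0), 'p': (9, 0),
}

def nearest_key(point):
    """Snap a stroke sample to the closest key centre."""
    x, y = point
    return min(KEY_POS, key=lambda k: (KEY_POS[k][0] - x) ** 2 + (KEY_POS[k][1] - y) ** 2)

def key_sequence(stroke):
    """Convert a traced path into the sequence of distinct keys it passes."""
    seq = []
    for p in stroke:
        k = nearest_key(p)
        if not seq or seq[-1] != k:
            seq.append(k)
    return seq

def is_subsequence(word, seq):
    """True if the word's letters appear in order within the key sequence."""
    it = iter(seq)
    return all(ch in it for ch in word)

def candidates(stroke, dictionary):
    """Rank-free candidate list: words anchored at the first and last key
    whose letters occur in order along the traced keys."""
    seq = key_sequence(stroke)
    return [w for w in dictionary
            if w and w[0] == seq[0] and w[-1] == seq[-1]
            and is_subsequence(w, seq)]
```

A stroke passing through the keys t, r, i, p would yield "trip" (and "tip") as candidates, which the user would then confirm from the presented list, as the abstract describes.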

840 citations

Patent
16 May 2007
TL;DR: In this paper, a motion sensor is used to track an infrared target that is attached to an instrument or a body part of a user, and a click event occurs based on a predefined action of the target by the user.
Abstract: Systems and methods for a hands-free mouse include a motion sensor in communication with a standard computer such that the computer receives pointer control signals from the motion sensor. The motion sensor tracks an infrared target that is attached to an instrument or a body part of a user, allowing the user to continue a task while using either a body part or a held instrument to move a pointer on a computer screen. The movement of the pointer on the screen correlates with the position of the target in space. A click event occurs when the user performs a predefined action with the infrared target.
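The position-to-pointer mapping and the "predefined action" click can be sketched minimally. This is an assumption-laden illustration, not the patented implementation: it scales the sensed target position into screen coordinates and fires a click when the target dwells within a small radius for a preset time. All bounds and thresholds are invented for the example.

```python
import math

# Assumed sensor field of view and screen size (illustrative values).
SCREEN_W, SCREEN_H = 1920, 1080
FOV_W, FOV_H = 640, 480
DWELL_RADIUS = 8.0   # screen pixels of allowed jitter
DWELL_TIME = 1.0     # seconds of stillness that counts as a click

def to_screen(sensor_x, sensor_y):
    """Scale a sensor-space target position to a screen-space pointer position."""
    return (sensor_x * SCREEN_W / FOV_W, sensor_y * SCREEN_H / FOV_H)

class DwellClicker:
    """Fires a click when the pointer stays nearly still for DWELL_TIME."""

    def __init__(self):
        self.anchor = None  # (x, y, t) where the current dwell started

    def update(self, x, y, t):
        """Feed pointer samples; returns True when a dwell-click fires."""
        if self.anchor is None:
            self.anchor = (x, y, t)
            return False
        ax, ay, at = self.anchor
        if math.hypot(x - ax, y - ay) > DWELL_RADIUS:
            self.anchor = (x, y, t)  # moved too far: restart the dwell timer
            return False
        if t - at >= DWELL_TIME:
            self.anchor = (x, y, t)  # fired: reset for the next click
            return True
        return False
```

A dwell click is only one possible "predefined action"; the patent's claims cover other target gestures as well.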

22 citations

Patent
18 Sep 2004
TL;DR: In this article, a method and system are presented in which graphical elements can be displayed to a communication partner to enhance communication beyond words and synthesized speech.
Abstract: A method and system of the present invention are distinguished by the fact that graphical elements can be displayed to a communication partner to enhance communication beyond words and synthesized speech. Extensive research in the field of augmentative communications has focused on using graphical elements, such as pictures and icons, to help a non-vocal user encode a message more quickly than typing it letter by letter. But in spite of the well-known axiom "a picture's worth a thousand words", none of these techniques has used pictures, animations, video, or other graphical elements to output the message as well. The present invention corrects this oversight by providing two touch-sensitive, graphical, dynamic displays: one for the operator and one for the interlocutor (communication partner).

19 citations

Patent
05 Nov 2004
TL;DR: In this paper, a hardware interface device controlled by assistive technology software residing on a computer posts and intercepts external keyboard and mouse events in a manner such that the commands received by the computer are indistinguishable by the operating system from those received from standard mouse and keyboard hardware.
Abstract: A hardware interface device controlled by assistive technology software residing on a computer. The hardware interface device posts and intercepts external keyboard and mouse events. The hardware interface device sends the keyboard and mouse commands into the computer from the hardware interface device in a manner such that the keyboard and mouse commands received by the computer are indistinguishable by the operating system from those received from standard mouse and keyboard hardware.

7 citations


Cited by
Patent
09 May 2008
TL;DR: In this article, the authors describe a system for processing touch inputs with respect to a multipoint sensing device and identifying at least one multipoint gesture based on the data from the multipoint sensing device.
Abstract: Methods and systems for processing touch inputs are disclosed. The invention in one respect includes reading data from a multipoint sensing device such as a multipoint touch screen where the data pertains to touch input with respect to the multipoint sensing device, and identifying at least one multipoint gesture based on the data from the multipoint sensing device.
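One concrete instance of "identifying a multipoint gesture from the data" is pinch classification. The sketch below is an illustration, not the patented method: given two concurrent touch points sampled over two frames, it classifies the motion as pinching in or out from the change in finger spacing. The threshold value is an assumption.

```python
import math

def spread(p1, p2):
    """Distance between two touch points."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def classify_pinch(prev_pair, cur_pair, threshold=5.0):
    """Compare finger spacing across two frames.
    threshold (in touch-surface units) is an illustrative assumption."""
    delta = spread(*cur_pair) - spread(*prev_pair)
    if delta > threshold:
        return "pinch out"   # fingers moving apart, e.g. zoom in
    if delta < -threshold:
        return "pinch in"    # fingers moving together, e.g. zoom out
    return "none"
```

A real gesture recognizer would track touch identities across frames and accumulate deltas over time; this single-step comparison only shows the core spacing test.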

2,584 citations

Patent
11 Jan 2011
TL;DR: In this article, an intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions.
Abstract: An intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions. The system can be implemented using any of a number of different platforms, such as the web, email, smartphone, and the like, or any combination thereof. In one embodiment, the system is based on sets of interrelated domains and tasks, and employs additional functionally powered by external services with which the system can interact.

1,462 citations

Patent
19 Jul 2005
TL;DR: In this article, a user interface method is presented that detects a touch and determines a user interface mode when the touch is detected. The method further includes activating one or more GUI elements based on the user interface mode and in response to the detected touch.
Abstract: A user interface method is disclosed. The method includes detecting a touch and then determining a user interface mode when a touch is detected. The method further includes activating one or more GUI elements based on the user interface mode and in response to the detected touch.

1,390 citations

Patent
30 Sep 2005
TL;DR: In this article, proximity-based systems and methods implemented on an electronic device are disclosed, where the method includes sensing an object spaced away and in close proximity to the electronic device.
Abstract: Proximity-based systems and methods that are implemented on an electronic device are disclosed. The method includes sensing an object spaced away and in close proximity to the electronic device. The method also includes performing an action in the electronic device when an object is sensed.
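The "perform an action when an object is sensed" step can be sketched as a simple threshold trigger. This toy example is not the patented method; the distances are assumed values, and hysteresis is added so a hovering object does not re-trigger on sensor noise.

```python
# Illustrative thresholds (millimetres) -- assumptions, not from the patent.
ENTER_MM = 30.0   # fire when the object comes closer than this
EXIT_MM = 40.0    # re-arm only after the object retreats past this

class ProximityTrigger:
    """Fires once when an object enters range; re-arms after it leaves."""

    def __init__(self):
        self.armed = True

    def sample(self, distance_mm):
        """Feed distance readings; returns True when the action should fire."""
        if self.armed and distance_mm < ENTER_MM:
            self.armed = False
            return True
        if not self.armed and distance_mm > EXIT_MM:
            self.armed = True
        return False
```

The hysteresis gap between ENTER_MM and EXIT_MM is the standard way to debounce a proximity sensor: a single threshold would toggle repeatedly when the object hovers right at the boundary.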

1,337 citations

Patent
30 Jan 2007
TL;DR: In this paper, methods and systems for implementing gestures with multipoint sensing devices are described.
Abstract: Methods and systems for implementing gestures with sensing devices are disclosed. More particularly, methods and systems related to gesturing with multipoint sensing devices are disclosed.

1,045 citations