Topic

User interface

About: User interface is a research topic. Over its lifetime, 85,402 publications have been published within this topic, receiving 1,728,377 citations. The topic is also known as: UI & input method.


Papers
Journal ArticleDOI
TL;DR: Through careful observation and analysis of user behavior, a mail interface unusable by novices evolved into one that let novices do useful work within minutes.
Abstract: Many human-computer interfaces are designed with the assumption that the user must adapt to the system, that users must be trained and their behavior altered to fit a given interface. The research presented here proceeds from the alternative assumption: Novice behavior is inherently sensible, and the computer system can be made to adapt to it. Specifically, a measurably easy-to-use interface was built to accommodate the actual behavior of novice users. Novices attempted an electronic mail task using a command-line interface containing no help, no menus, no documentation, and no instruction. A hidden operator intercepted commands when necessary, creating the illusion of an interactive session. The software was repeatedly revised to recognize users' new commands; in essence, the interface was derived from user behavior. This procedure was used on 67 subjects. The first version of the software could recognize only 7 percent of all the subjects' spontaneously generated commands; the final version could recognize 76 percent of these commands. This experience contradicts the idea that user input is irrelevant to the design of command languages. Through careful observation and analysis of user behavior, a mail interface unusable by novices evolved into one that let novices do useful work within minutes.

289 citations
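
Illustrative sketch (not the study's actual software): the procedure above amounts to growing a lexicon that maps observed novice phrasings onto a small set of canonical mail actions, with a hidden operator handling whatever the lexicon misses. The action names, lexicon format, and fallback hook below are assumptions for illustration, written in Python.

# Sketch: derive the command set from observed user behavior.
CANONICAL_ACTIONS = {"read", "send", "delete", "quit"}

# Observed phrasings -> canonical actions; extended after each round of
# observation, mirroring the paper's "interface derived from user behavior".
lexicon = {
    "read": "read", "show": "read", "open": "read",
    "send": "send", "mail": "send",
    "delete": "delete", "erase": "delete", "remove": "delete",
    "quit": "quit", "exit": "quit", "bye": "quit",
}

def interpret(command_line, operator_fallback=None):
    """Return a canonical action, deferring to a 'hidden operator' hook
    when the lexicon does not cover the user's command."""
    words = command_line.strip().split()
    verb = words[0].lower() if words else ""
    if verb in lexicon:
        return lexicon[verb]
    if operator_fallback is not None:
        action = operator_fallback(command_line)   # human decides the intent
        if action in CANONICAL_ACTIONS and verb:
            lexicon[verb] = action                 # the software adapts to the user
        return action
    return None

def coverage(observed_commands):
    """Fraction of observed commands the current lexicon recognizes."""
    hits = sum(1 for c in observed_commands
               if c.strip() and c.strip().split()[0].lower() in lexicon)
    return hits / len(observed_commands) if observed_commands else 0.0

Recomputing coverage after each revision gives the figure the study reports rising from 7 percent to 76 percent.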

Journal ArticleDOI
TL;DR: State-of-the-art multimodal speech and gesture systems now process complex gestural input other than pointing, and new systems have been extended to process different mode combinations—the most noteworthy being speech and pen input, and speech and lip movements.
Abstract: … more transparent experience than ever before. Our voice, hands, and entire body, once augmented by sensors such as microphones and cameras, are becoming the ultimate transparent and mobile multimodal input devices. The area of multimodal systems has expanded rapidly during the past five years. Since Bolt's [1] original "Put That There" concept demonstration, which processed speech and manual pointing during object manipulation, significant achievements have been made in developing more general multimodal systems. State-of-the-art multimodal speech and gesture systems now process complex gestural input other than pointing, and new systems have been extended to process different mode combinations—the most noteworthy being speech and pen input [9], and speech and lip movements [10]. As a foundation for advancing new multimodal systems, proactive empirical work has generated predictive information on human-computer multimodal interaction, which is being used to …

289 citations
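
Illustrative sketch (assumed event formats and time window, not any cited system's API): a minimal "Put That There"-style fusion step binds each spoken deictic word to the pointing gesture closest to it in time, written in Python.

# Sketch: late fusion of speech and pointing by temporal proximity.
DEICTICS = {"this", "that", "here", "there"}

def fuse(speech_tokens, pointing_events, max_gap=1.0):
    """speech_tokens: list of (time_sec, word);
    pointing_events: list of (time_sec, target);
    returns (word, target) bindings within max_gap seconds."""
    bindings = []
    for t_word, word in speech_tokens:
        if word.lower() not in DEICTICS or not pointing_events:
            continue
        gap, target = min((abs(t_word - t_point), target)
                          for t_point, target in pointing_events)
        if gap <= max_gap:
            bindings.append((word, target))
    return bindings

# "put that there" with two pointing gestures:
speech = [(0.2, "put"), (0.5, "that"), (1.4, "there")]
points = [(0.6, "red block"), (1.5, "table corner")]
print(fuse(speech, points))   # [('that', 'red block'), ('there', 'table corner')]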

Patent
06 Feb 1996
TL;DR: In this patent, a video control user interface is provided for use in an interactive television system; the interface includes a remote control handset with a multi-purpose, multi-direction actuation pad and a set-top box configured to operate in different modes, including a movie-on-demand mode.
Abstract: A video control user interface is provided for use in an interactive television system. The video control user interface includes a remote control handset with a multi-purpose, multi-direction actuation pad and a set-top box configured to operate in different modes, including a movie-on-demand mode. In this mode, the set-top box receives digitally transmitted video data streams of a selected movie from a centralized head end server. During display of a video movie, the set-top box can cause, at the viewer's request, the television to display an icon representing a physical layout of the actuation pad on the remote control handset and one or more symbols arranged at locations relative to the icon. The symbols relate to shuttle controls for controlling viewing of the video movie. This user interface presents an intuitive visual mapping of the shuttle controls about the depicted icon onto physical actuation positions of the multi-direction pad on the remote control handset. When the viewer wishes to change the viewing mode (such as from "play" to "pause"), the viewer simply depresses the pad at an actuation position that corresponds to a desired shuttle control symbol arranged at approximately the same location relative to the pad-resembling icon that is displayed on the screen. This user interface provides intuitive video control using a multi-purpose actuator, thereby eliminating the need for dedicated shuttle control buttons on the remote control handset.

289 citations
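
Illustrative sketch (the layout, symbols, and player interface are assumptions, not the patent's implementation): a single table can both dispatch pad actuation positions to shuttle commands and supply the symbols drawn around the on-screen pad icon, so the overlay mirrors the physical control. Python sketch:

# Sketch: one table maps pad positions to (command, on-screen symbol).
SHUTTLE_LAYOUT = {
    "up":     ("play",         "►"),
    "down":   ("pause",        "❚❚"),
    "left":   ("rewind",       "◄◄"),
    "right":  ("fast_forward", "►►"),
    "center": ("stop",         "■"),
}

class DemoPlayer:
    """Stand-in for the set-top box's playback engine."""
    def play(self):         print("playing")
    def pause(self):        print("paused")
    def rewind(self):       print("rewinding")
    def fast_forward(self): print("fast-forwarding")
    def stop(self):         print("stopped")

def render_overlay():
    """Symbols to draw around the pad icon, one per actuation position."""
    return {pos: symbol for pos, (_, symbol) in SHUTTLE_LAYOUT.items()}

def on_pad_press(position, player):
    """Dispatch an actuation position to the matching shuttle command."""
    command, symbol = SHUTTLE_LAYOUT[position]
    getattr(player, command)()              # e.g. player.pause()
    return f"selected {symbol} ({command})"

print(render_overlay())
print(on_pad_press("down", DemoPlayer()))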

Patent
14 Sep 2004
TL;DR: In this patent, an entertainment head-end provides broadcast programming, video-on-demand services, and HTML-based interactive programming through a distribution network to client terminals in subscribers' homes.
Abstract: An entertainment head-end provides broadcast programming, video-on-demand services, and HTML-based interactive programming through a distribution network to client terminals in subscribers' homes. A number of different features are provided, including novel user interfaces, enhanced video-on-demand controls, a variety of interactive services (personalized news, jukebox, games, celebrity chat), and techniques that combine to provide user experiences evocative of conventional television.

289 citations

Patent
13 Jan 2014
TL;DR: In this patent, a management module's processor is configured to display (i) a physical view of the connections between the management module, the intermediate devices, and the edge devices, and (ii) a user-defined view of those same components.
Abstract: A system for managing edge devices, such as scanner devices, typically includes a management module having a processor. The processor is typically communicatively coupled to a user interface that includes a visual display and a plurality of edge devices. The processor may also be communicatively coupled to a plurality of intermediate devices. The management module enables a user of the user interface to manage various aspects of the system. The processor is typically configured for displaying with the visual display (i) a physical view of the connections between the management module, the intermediate devices, and the edge devices and (ii) a first user-defined view of the management module, the intermediate devices, and the scanner devices. The processor is typically also configured for managing operational information generated by connected edge devices, including for efficiently storing operational information and intelligently querying the operational information to assist the user in matching compatible plug-in applications with particular edge devices.

289 citations
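
Illustrative sketch (invented device records and capability fields, not the claimed system's data model): the two views described above reduce to walking a small topology of management module, intermediate devices, and edge devices, plus a capability match between edge devices and plug-in applications. Python sketch:

# Sketch: topology bookkeeping and plug-in matching for edge devices.
from dataclasses import dataclass, field

@dataclass
class EdgeDevice:
    name: str
    kind: str                     # e.g. "scanner"
    upstream: str                 # intermediate device it connects through
    capabilities: set = field(default_factory=set)

def physical_view(intermediates, edges):
    """Connection paths: management module -> intermediate -> edge device."""
    return [("management-module", e.upstream, e.name)
            for e in edges if e.upstream in intermediates]

def compatible_plugins(device, plugins):
    """Plug-in applications whose requirements the device's capabilities satisfy."""
    return [name for name, required in plugins.items()
            if required <= device.capabilities]

# Example with invented data:
edges = [EdgeDevice("scanner-01", "scanner", "switch-A", {"tiff", "duplex"}),
         EdgeDevice("scanner-02", "scanner", "switch-B", {"tiff"})]
plugins = {"duplex-batch": {"duplex"}, "tiff-export": {"tiff"}}
print(physical_view({"switch-A", "switch-B"}, edges))
print(compatible_plugins(edges[0], plugins))   # ['duplex-batch', 'tiff-export']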


Network Information
Related Topics (5)
Mobile computing: 51.3K papers, 1M citations (87% related)
Software: 130.5K papers, 2M citations (87% related)
Server: 79.5K papers, 1.4M citations (85% related)
Software development: 73.8K papers, 1.4M citations (85% related)
Graph (abstract data type): 69.9K papers, 1.2M citations (83% related)
Performance Metrics
No. of papers in the topic in previous years
Year    Papers
2023    211
2022    526
2021    1,630
2020    3,004
2019    3,233
2018    3,024