
Showing papers by "Jacob Eisenstein published in 2001"


Proceedings ArticleDOI
01 Jan 2001
TL;DR: It is claimed that without some abstract description of the UI, it is likely that the design and development of user-interfaces for mobile computing will be very time-consuming, error-prone, or even doomed to failure.
Abstract: Mobile computing poses a series of unique challenges for user interface design and development: user interfaces must now accommodate the capabilities of various access devices and be suitable for different contexts of use, while preserving consistency and usability. We propose a set of techniques that will aid UI designers who are working in the domain of mobile computing. These techniques will allow designers to build UIs across several platforms, while respecting the unique constraints posed by each platform. In addition, these techniques will help designers to recognize and accommodate the unique contexts in which mobile computing occurs. Central to our approach is the development of a user-interface model that serves to isolate those features that are common to the various contexts of use, and to specify how the user-interface should adjust when the context changes. We claim that without some abstract description of the UI, it is likely that the design and development of user-interfaces for mobile computing will be very time-consuming, error-prone, or even doomed to failure.

271 citations
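The abstract UI description this paper argues for can be pictured with a minimal sketch: one declarative model, rendered differently for each access device. The model layout, platform names, and `render` function below are hypothetical illustrations, not the paper's actual formalism.

```python
# Hypothetical sketch of an abstract UI model rendered per platform.
# The structure and names are illustrative; the paper does not specify this API.

ABSTRACT_UI = {
    "title": "Inbox",
    "widgets": [
        {"kind": "list", "items": ["msg 1", "msg 2"]},
        {"kind": "button", "label": "Compose"},
    ],
}

def render(ui, platform):
    """Render the same abstract model for different access devices."""
    if platform == "desktop":
        lines = [ui["title"].upper()]
        for w in ui["widgets"]:
            if w["kind"] == "list":
                lines += ["  * " + item for item in w["items"]]
            else:
                lines.append("  [ %s ]" % w["label"])
    elif platform == "phone":  # small screen: abbreviate, drop decoration
        lines = [ui["title"]]
        for w in ui["widgets"]:
            if w["kind"] == "list":
                lines.append("%d items" % len(w["items"]))
            else:
                lines.append(w["label"])
    else:
        raise ValueError("unknown platform: %s" % platform)
    return "\n".join(lines)
```

The point of the sketch is that only `render` knows about platform constraints; the model itself stays device-independent, which is what lets the UI "adjust when the context changes."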


Proceedings ArticleDOI
02 May 2001
TL;DR: This study builds on previous work with audio and video servers and explores haptic data in support of touch and motor skills, investigating the use of clustering techniques to recognize hand signs from haptic data.
Abstract: The term multimedia has a different meaning to different communities. The computer industry uses this term to refer to a system that can display audio and video clips. Generally speaking, a multimedia system supports multiple presentation modes to convey information. Humans have five senses: sight, hearing, touch, smell and taste. In theory, a system based on this generalized definition must be able to convey information in support of all senses. This would be a step towards virtual environments that facilitate total recall of an experience. This study builds on our previous work with audio and video servers and explores haptic data in support of touch and motor skills. It investigates the use of clustering techniques to recognize hand signs using haptic data. An application of these results is communication devices for the hearing impaired.

20 citations
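The clustering step described in this abstract can be sketched as follows. The paper does not name a particular algorithm, so plain k-means over synthetic two-joint glove readings is used here purely as an assumption for illustration.

```python
import random

# Illustrative k-means clustering of haptic glove readings. The paper
# investigates clustering for hand-sign recognition but does not specify
# the algorithm; k-means on synthetic joint-angle vectors is a sketch only.

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # assign each reading to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # recompute centroids as cluster means (keep old centroid if empty)
        centroids = [tuple(sum(d) / len(c) for d in zip(*c)) if c else centroids[j]
                     for j, c in enumerate(clusters)]
    return centroids, clusters

# Synthetic glove data: two joint angles per reading, two distinct "signs"
# (mostly-flexed vs mostly-extended fingers).
rng = random.Random(1)
flexed = [(80 + rng.uniform(-5, 5), 75 + rng.uniform(-5, 5)) for _ in range(20)]
extended = [(5 + rng.uniform(-5, 5), 10 + rng.uniform(-5, 5)) for _ in range(20)]
centroids, clusters = kmeans(flexed + extended, k=2)
```

With well-separated sign classes like these, the two recovered clusters line up with the two gestures, which is the property a recognizer for the hearing impaired would rely on.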


Proceedings ArticleDOI
05 Oct 2001
TL;DR: This study introduces a framework to store and retrieve "moving sensors" data that advocates physical data independence and software reuse, and investigates alternative representations for the storage and retrieval of data in support of query processing.
Abstract: Moving sensors refers to an emerging class of data-intensive applications that impacts disciplines such as communication, health care, and scientific applications. These applications consist of a fixed number of sensors that move and produce streams of data as a function of time. They may require the system to match these streams against stored streams to retrieve relevant data (patterns). With communication, for example, a speaking-impaired individual might utilize a haptic glove that translates hand signs into written (spoken) words. The glove consists of sensors for different finger joints. These sensors report their location and values as a function of time, producing streams of data. These streams are matched against a repository of spatio-temporal streams to retrieve the corresponding English character or word. The contributions of this study are twofold. First, it introduces a framework to store and retrieve "moving sensors" data. The framework advocates physical data independence and software reuse. Second, we investigate alternative representations for the storage and retrieval of data in support of query processing. We quantify the tradeoffs associated with these alternatives using empirical data from RoboCup soccer matches.

8 citations
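The stream-matching query in this abstract can be sketched as nearest-neighbor retrieval of a query stream against a repository of stored streams. The paper studies alternative storage representations for such queries rather than prescribing a distance measure, so the Euclidean distance over aligned samples below is an assumption, and the repository contents are toy data.

```python
# Illustrative matching of a query sensor stream against stored streams.
# The distance measure and repository layout are assumptions, not the
# paper's actual representation.

def stream_distance(a, b):
    """Sum of squared differences between two equal-length sample streams."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def match(query, repository):
    """Return the label of the stored stream closest to the query."""
    return min(repository, key=lambda label: stream_distance(query, repository[label]))

# Toy repository: each "stream" is one joint sensor's values over time,
# labeled with the sign it encodes.
repository = {
    "A": [0.1, 0.4, 0.9, 0.9],
    "B": [0.9, 0.8, 0.2, 0.1],
}
result = match([0.2, 0.5, 0.8, 0.9], repository)  # closer to the stored "A" stream
```

A real system would index many such streams (one per sensor, per sign) and would need a representation that keeps this matching efficient, which is exactly the tradeoff the paper quantifies.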


Proceedings Article
01 Jan 2001
TL;DR: A spectrum of preference relations is described, and a new syntax for modeling preference is proposed that allows decision trees to be tightly integrated into the user-interface model itself, enhancing their flexibility and power.
Abstract: The incorporation of plastic and adaptive user-interfaces into the model-based paradigm requires a new, more flexible modeling formalism. Rather than modeling the user-interface as a set of static structures and mappings, the UI should be modeled as a set of design preferences. Preferences are frequently many-to-one or many-to-many relationships that elude conventional UI modeling, which has largely focused on one-to-one mappings. In this paper, a spectrum of preference relations is described, and a new syntax for modeling preference is proposed. This spectrum extends from simple one-to-one bindings to complex design guidelines that can be structured together to implement decision trees. This new representation allows decision trees to be tightly integrated into the user-interface model itself, enhancing their flexibility and power.
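The idea of design preferences structured into a decision tree can be pictured with a small sketch. The rule conditions, widget names, and function below are hypothetical illustrations, not the paper's proposed syntax: an abstract "choice" element is mapped to a concrete widget depending on context.

```python
# Hypothetical decision tree over design preferences: the same abstract
# "choice among N options" element maps to different concrete widgets
# depending on option count and screen size. Names are illustrative only.

def choose_widget(n_options, screen):
    """Walk a small decision tree of design preferences."""
    if screen == "small":
        return "dropdown"        # small screens always prefer compact widgets
    if n_options <= 4:
        return "radio_buttons"   # few options: show them all at once
    if n_options <= 30:
        return "dropdown"        # moderate count: collapse into a menu
    return "searchable_list"     # many options: provide search
```

Each branch here is a many-to-one preference (many contexts map to one widget), which is the kind of relationship the paper argues eludes one-to-one mappings.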