Showing papers by "Morten Fjeld published in 2007"


Proceedings ArticleDOI
29 Apr 2007
TL;DR: The empirical evaluation described in this paper compares learning effectiveness and user acceptance of Augmented Chemistry (AC) versus the more traditional ball-and-stick model (BSM); learning effectiveness results were almost the same for both learning environments.
Abstract: Augmented Chemistry (AC) is an application that utilizes a tangible user interface (TUI) for organic chemistry education. The empirical evaluation described in this paper compares learning effectiveness and user acceptance of AC versus the more traditional ball-and-stick model (BSM). Learning effectiveness results were almost the same for both learning environments. User preference and rankings, using NASA-TLX and SUMI, showed more differences, and it was therefore decided to focus mainly on improving these aspects in a re-design of the AC system. For enhanced interaction, keyboard-free system configuration, and internal/external database (DB) access, a graphical user interface (GUI) has been incorporated into the TUI. Three-dimensional (3D) rendering has also been improved using shadows and related effects, thereby enhancing depth perception. The re-designed AC system was then compared to the old system by means of a small qualitative user study. This user study showed an improvement in subjective opinions about the system's ease of use and ease of learning.

102 citations


Proceedings ArticleDOI
19 Nov 2007
TL;DR: Ortholumen is a light-pen-based tabletop interaction system that can employ all the pen's spatial degrees of freedom and can be expanded to track multiple pens of the same or different colors; built only from low-cost parts, the system is affordable to home users.
Abstract: Ortholumen is a light-pen-based tabletop interaction system that can employ all the pen's spatial degrees of freedom (DOF). The pen's light is projected from above onto a horizontal translucent screen and tracked by a webcam sitting underneath, facing upwards; system output is projected back onto the same screen. The elliptic light spot cast by the pen informs the system of pen position, orientation, and direction. While this adds up to six DOFs, we have used up to four at a time. In order to better separate input and output light, we employ polarizing filters on the webcam and on the projector lens. Two applications, painting and map navigation, are presented. Ortholumen can be expanded to track multiple pens of the same or different colors. This would enable bi-manual input, collaboration, and placed pens as external memory. Visible light, as opposed to infrared or radio, may be perceived more directly by users. Ortholumen employs only low-cost parts, making the system affordable to home users.

14 citations
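
The abstract explains that the elliptic light spot alone yields pen position, orientation, and direction, but gives no implementation details. Below is a minimal sketch, in Python with OpenCV, of how such a camera pipeline could look; the intensity threshold, camera index, and elongation-based tilt cue are assumptions made for illustration, not details from the paper.

```python
import cv2

# Hypothetical sketch of an Ortholumen-style camera pipeline: isolate the
# bright light spot cast by the pen, fit an ellipse to it, and read pen
# position and direction from the ellipse. Threshold value, camera index,
# and tilt cue are assumptions, not taken from the paper.
cap = cv2.VideoCapture(0)  # webcam facing the translucent screen from below

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # The pen's spot is much brighter than the back-projected output
    # (helped in the paper by polarizing filters), so a simple intensity
    # threshold is used here to isolate it.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        spot = max(contours, key=cv2.contourArea)
        if len(spot) >= 5:  # cv2.fitEllipse needs at least five points
            (cx, cy), (w, h), angle = cv2.fitEllipse(spot)
            major, minor = max(w, h), max(min(w, h), 1e-6)
            # Center -> pen position; long-axis angle -> pen direction;
            # elongation (major/minor) -> a rough cue for pen tilt.
            print(f"pos=({cx:.0f},{cy:.0f}) dir={angle:.0f}deg elongation={major / minor:.2f}")

    cv2.imshow("spot mask", mask)
    if cv2.waitKey(1) == 27:  # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The polarizing filters mentioned in the abstract are what make such a plain intensity threshold plausible: they keep the projector's output dim in the camera image while the pen's light stays bright.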


Proceedings ArticleDOI
03 Sep 2007
TL;DR: Hierarchical Task Analysis is used as a potential explanation, and three other theories that relate to these findings are presented: mental models, habit errors, and emotional attachment.
Abstract: The present study explores potential usability gaps when users switch from a familiar to an unfamiliar mobile phone interface. A within-subject experiment was performed in which nine users familiar with Sony-Ericsson T630 and nine familiar with Nokia 7250 performed tasks on both phones. On average, test subjects spent more time on finishing tasks with an unfamiliar phone than with a familiar one. For two of the four tasks, there was a significant difference in completion time between the first-time Nokia users and the first-time Sony-Ericsson users. The tasks of adding a contact to the address book and sending an SMS to a contact in the address book were performed more quickly by new Nokia users than by new Sony-Ericsson users. The subjective difficulty ranking also showed that first-time Nokia users found the new phone easier to use than first-time Sony-Ericsson users did. Hierarchical Task Analysis is used as a potential explanation, and three other theories that relate to these findings are presented: mental models, habit errors, and emotional attachment.

9 citations


01 Jan 2007
TL;DR: The purpose of this project is to create a library that will allow its users to control 3D applications with one or both of their hands, using palm and finger coding together with a single webcam to reduce the processing power required for feasible real-time 3D interaction.
Abstract: The purpose of this project is to create a library that will allow its users to control 3D applications by using one or both of their hands. The final product could easily be incorporated into 3D applications, each customized to utilize a set of poses. Even though off-the-shelf motion capture gloves have reached lower prices in recent years, they are still expensive for home users. The algorithm suggested is based only on a single webcam combined with coded palm and fingers. Users should be able to code one or more of the fingers. One webcam is still somewhat constraining as two should ideally be used for 3D mapping of the hand, but by additionally using palm and finger coding we can greatly improve precision and, most importantly, reduce the processing power required for feasible real-time 3D interaction.

9 citations
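
The abstract sketches a single-webcam approach in which the palm and fingers carry codes, but it does not specify the coding scheme or the tracking algorithm. As a rough illustration of how such a pipeline might be put together, the Python/OpenCV sketch below segments hypothetical color-coded patches (one for the palm, one for a finger) in HSV space and reduces each to a centroid; the colors, thresholds, and pose cues are assumptions made for the example, not the project's actual scheme.

```python
import cv2
import numpy as np

# Illustrative single-webcam, color-coded hand tracker: each coded region
# (palm, one finger) is assumed to be a distinctly colored patch, segmented
# in HSV space and reduced to a centroid. Colors, HSV ranges, and pose cues
# are assumptions for this example only.
MARKERS = {
    "palm":   ((100, 120, 80), (130, 255, 255)),  # assumed blue patch on the palm
    "finger": ((40, 120, 80),  (80, 255, 255)),   # assumed green patch on a fingertip
}

def find_markers(frame_bgr):
    """Return {name: (x, y)} centroids of the coded patches found in the frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    centroids = {}
    for name, (lo, hi) in MARKERS.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        m = cv2.moments(mask)
        if m["m00"] > 1e3:  # enough matching pixels to trust the detection
            centroids[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return centroids

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    marks = find_markers(frame)
    if "palm" in marks and "finger" in marks:
        (px, py), (fx, fy) = marks["palm"], marks["finger"]
        # Simple 2D pose cues: the palm-to-finger vector gives hand direction,
        # and its length shrinks as the finger bends toward the camera.
        angle = np.degrees(np.arctan2(fy - py, fx - px))
        print(f"palm=({px:.0f},{py:.0f}) hand direction={angle:.0f}deg")
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) == 27:  # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```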


01 Jan 2007
TL;DR: A device is designed to control, in real time, a motor integrated into a slider system, with accuracy and latency values sufficient for productive interaction.
Abstract: This paper examines the use of motorized physical sliders with position and force as input and output parameters for tangible computer interaction. We have designed a device that controls, in real time, a motor integrated into a slider system, with accuracy and latency values sufficient for productive interaction. This was accomplished with a microcontroller that handles the I2C protocol for communication with a master device that centralizes the sliders' information. The system is modular: a single mainboard uses the I2C protocol to communicate with the sliders. The mainboard interacts with the computer through a USB connection. The mainboard also controls the sliders, each sitting on a slider board. This paper also presents the designs and realizations of the mainboard and slider-board hardware components. Finally, the paper envisions future applications of force-feedback sliders, such as mapping GUI sliders onto physical sliders, polling user impressions based on non-verbal selection cues, and remote controls with haptic feedback.

8 citations
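
The abstract fixes the architecture (slider boards on an I2C bus, one mainboard, a USB link to the computer) but not the communication protocol. The sketch below shows how a host program might poll one slider and drive another over the USB link; the serial port name, command bytes, and value ranges are hypothetical placeholders, not part of the paper's design.

```python
import time
import serial  # pyserial

# Host-side sketch only: the paper states that the mainboard relays slider
# state over I2C and talks to the computer over USB, but it does not publish
# a wire format. The port name, command bytes, and 0-255 value range below
# are hypothetical placeholders invented for this illustration.
PORT = "/dev/ttyUSB0"      # assumed USB-serial device name
CMD_READ_POSITION = 0x01   # hypothetical "read slider position" command
CMD_SET_TARGET = 0x02      # hypothetical "drive slider to target" command

def read_position(link, slider_id):
    """Ask the mainboard for one slider's current position (0-255), or None on timeout."""
    link.write(bytes([CMD_READ_POSITION, slider_id]))
    reply = link.read(1)
    return reply[0] if reply else None

def set_target(link, slider_id, target):
    """Tell the mainboard to move a slider to a target position (0-255)."""
    link.write(bytes([CMD_SET_TARGET, slider_id, target & 0xFF]))

if __name__ == "__main__":
    with serial.Serial(PORT, 115200, timeout=0.05) as link:
        # Example use: mirror slider 0 onto slider 1, a minimal version of the
        # "GUI slider mapped onto a physical slider" idea from the abstract.
        while True:
            pos = read_position(link, 0)
            if pos is not None:
                set_target(link, 1, pos)
            time.sleep(0.01)  # avoid hammering the serial link
```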


01 Jan 2007
TL;DR: While mobile browser access to any web page is fundamental, it is argued that it is even more important to offer mobile-context-specific web-based services that create user benefit by initially aiming for the mobile context before catering to the desktop context.
Abstract: The user experience of the mobile internet is usually inferior to that of the desktop internet. This is often explained by the mobile internet's cumbersome user interface. This paper argues that the real reason is the business community's incomplete understanding of the mobile context, preventing the creation of killer applications. While mobile browser access to any web page is fundamental, we argue that it is even more important to offer mobile-context-specific web-based services. These services would create user benefit by initially aiming for the mobile context before catering to the desktop context.

1 citation


Proceedings ArticleDOI
03 Sep 2007
TL;DR: A case study examining prototyping as a method in re-designing a user interface (UI) found paper prototyping to be an efficient method for gaining user feedback on usability issues, and showed that a low-fidelity prototype does not automatically mean low-effort testing.
Abstract: This paper presents the results of a case study examining prototyping as a method in re-designing a user interface (UI). In the case presented, a web-based room booking system was re-designed. Running on a university web site, the existing system has drawn much criticism from its users. Their expectations for a new UI were increased ease of use, less effort required, and less time consumed. We prototyped a new UI using Visio and tested it with a small number of experienced and novice users. Our results partly favor the existing system and partly the new one. To our surprise, experienced users performed relatively poorly with the new UI, considering their critique of the existing one. We found paper prototyping to be an efficient method for gaining user feedback on usability issues, and that a low-fidelity prototype does not automatically mean low-effort testing. We observed that visible-state UI elements can be demanding to test through paper prototyping.

1 citation


01 Jan 2007
TL;DR: The objective behind Tangible User Interfaces is to allow users to interact with computers through familiar tangible objects, thereby taking advantage of the richness of the tactile world combined with the power of computer-based simulations.
Abstract: The objective behind Tangible User Interfaces (TUIs) is to allow users to interact with computers through familiar tangible objects, thereby taking advantage of the richness of the tactile world combined with the power of computer-based simulations. TUIs give physical form to digital information, employing physical artifacts both as representations and controls for computational media. They lend themselves well to collaboration around intelligent tables, or what we call tabletop interaction. At the t2i Lab at Chalmers, we are expanding the boundaries of interactive technology. We do this primarily by constructing TUIs and tabletop, large-display user interfaces (UIs). These can be used in creative problem solving, collaborative work, and science education. Fields of knowledge at the t2i Lab include software (SW) for multimodal UIs, sensors and actuators, analogue and digital hardware (HW), and vision-based tracking systems utilizing infrared (IR) and visible light. Further areas of investigation are six-degrees-of-freedom (6DOF) UIs, automatic user analysis, and cognitive-perceptual issues.