
Showing papers by "Raimund Dachselt published in 2018"


Proceedings ArticleDOI
19 Apr 2018
TL;DR: A conceptual framework is proposed to enable analysts to explore data items, track interaction histories, and alter visualization configurations through mechanisms using both devices in combination to support visual data analysis.
Abstract: We explore the combination of smartwatches and a large interactive display to support visual data analysis. These two extremes of interactive surfaces are increasingly popular, but feature different characteristics: display and input modalities, personal/public use, performance, and portability. In this paper, we first identify possible roles for both devices and the interplay between them through an example scenario. We then propose a conceptual framework to enable analysts to explore data items, track interaction histories, and alter visualization configurations through mechanisms using both devices in combination. We validate an implementation of our framework through a formative evaluation and a user study. The results show that this device combination, compared to just a large display, allows users to develop complex insights more fluidly by leveraging the roles of the two devices. Finally, we report on the interaction patterns and interplay between the devices for visual exploration as observed during our study.

67 citations


Journal ArticleDOI
TL;DR: VisTiles is a conceptual framework that uses a set of mobile devices to distribute and coordinate visualization views for the exploration of multivariate data and presents a web-based prototype implementation as a specific instance of the concepts.
Abstract: We present VisTiles, a conceptual framework that uses a set of mobile devices to distribute and coordinate visualization views for the exploration of multivariate data. In contrast to desktop-based interfaces for information visualization, mobile devices offer the potential to provide a dynamic and user-defined interface supporting co-located collaborative data exploration with different individual workflows. As part of our framework, we contribute concepts that enable users to interact with coordinated & multiple views (CMV) that are distributed across several mobile devices. The major components of the framework are: (i) dynamic and flexible layouts for CMV focusing on the distribution of views and (ii) an interaction concept for smart adaptations and combinations of visualizations utilizing explicit side-by-side arrangements of devices. As a result, users can benefit from the possibility to combine devices and organize them in meaningful spatial layouts. Furthermore, we present a web-based prototype implementation as a specific instance of our concepts. This implementation provides a practical application case enabling users to explore a multivariate data collection. We also illustrate the design process including feedback from a preliminary user study, which informed the design of both the concepts and the final prototype.

58 citations


Book ChapterDOI
16 Oct 2018
TL;DR: This chapter briefly review the development of natural user interfaces and discusses their role in providing human-computer interaction that is immersive in various ways, and suggests some interaction design guidelines for immersive analytics.
Abstract: In this chapter, we briefly review the development of natural user interfaces and discuss their role in providing human-computer interaction that is immersive in various ways. Then we examine some opportunities for how these technologies might be used to better support data analysis tasks. Specifically, we review and suggest some interaction design guidelines for immersive analytics. We also review some hardware setups for data visualization that are already archetypal. Finally, we look at some emerging system designs that suggest future directions.

46 citations


Proceedings ArticleDOI
20 Apr 2018
TL;DR: This workshop will bring together researchers, designers, and practitioners from relevant application and research fields, including visualization, personal informatics, and data journalism to work on identifying a research agenda for mobile data visualization.
Abstract: As mobile visualization is increasingly used and new mobile device form factors and hardware capabilities continuously emerge, it is timely to reflect on what has been discovered to date and to look into the future. This workshop will bring together researchers, designers, and practitioners from relevant application and research fields, including visualization, personal informatics, and data journalism. We will work on identifying a research agenda for mobile data visualization as well as on collecting and propagating practical guidance for mobile visualization design. Our overarching goal is to bring us closer to making effective use of ubiquitous mobile devices as data visualization platforms.

26 citations


Proceedings ArticleDOI
01 Mar 2018
TL;DR: This paper proposes a conceptual framework for Reality-Based Information Retrieval, which combines the classic Information retrieval process with Augmented Reality technologies to provide context-dependent search cues and situated visualizations of the query and the results.
Abstract: Today, the widespread use of mobile devices allows users to search information "on the go", whenever and wherever they want, no longer confining Information Retrieval to classic desktop interfaces. We believe that technical advances in Augmented Reality will allow Information Retrieval to go even further, making use of both the users' surroundings and their abilities to interact with the physical world. In this paper, we present the fundamental concept of Reality-Based Information Retrieval, which combines the classic Information Retrieval process with Augmented Reality technologies to provide context-dependent search cues and situated visualizations of the query and the results. With information needs often stemming from real-world experiences, this novel combination has the potential to better support both Just-in-time Information Retrieval and serendipity. Based on extensive literature research, we propose a conceptual framework for Reality-Based Information Retrieval. We illustrate and discuss this framework and present two prototypical implementations, which we tested in small user studies. They demonstrate the feasibility of our concepts and inspired our discussion of notable challenges for further research in this novel and promising area.

21 citations


Proceedings ArticleDOI
20 Apr 2018
TL;DR: This paper presents a set of cord-based interaction techniques for browsing menus, selecting items, adjusting continuous values & ranges and solving advanced tasks in AR, and presents the current implementation including different touch-enabled cords, its data transmission and AR visualization.
Abstract: Research on wearable controllers has shown that body-worn cords have many interesting physical affordances that make them powerful as a novel input device to control mobile applications in an unobtrusive manner. With this paper, we want to extend the interaction and application repertoire of body-worn cords by contributing the concept of visually augmented interactive cords using state-of-the-art augmented reality (AR) glasses. This novel combination of simultaneous input and output on a cord has the potential to create rich AR user interfaces that seamlessly support direct interaction and reduce cognitive burden by providing visual and tactile feedback. As a main contribution, we present a set of cord-based interaction techniques for browsing menus, selecting items, adjusting continuous values & ranges and solving advanced tasks in AR. In addition, we present our current implementation including different touch-enabled cords, its data transmission and AR visualization. Finally, we conclude with future challenges.

14 citations


Proceedings ArticleDOI
20 Apr 2018
TL;DR: The DebugAR approach shows a representation of the current system state, message provenance, and the lifetime of participating nodes and offers layout techniques, and provides a screen that shows a traditional text log, to bridge the gap to conventional tools.
Abstract: Distributed systems are very complex and, in case of errors, hard to debug. The high number of messages with non-deterministic delivery timings, as well as message losses, data corruption, and node crashes cannot be efficiently analyzed with traditional GUI tools. We propose to use immersive technologies in a multi-display environment to tackle these shortcomings. Our DebugAR approach shows a representation of the current system state, message provenance, and the lifetime of participating nodes, and offers layout techniques. By providing a screen that shows a traditional text log, we bridge the gap to conventional tools. Additionally, we propose an interactive 3D visualization of the message flow, combining an interactive tabletop with augmented reality using a head-mounted display. We are confident that our proposed solution can be used not only to analyze distributed systems, but also other time-dependent networks.

7 citations


Proceedings ArticleDOI
11 Oct 2018
TL;DR: A three-stage fabrication pipeline is contributed that describes the production and assembly of thin, bendable and highly customizable membrane dome switches on the basis of prototyping methods with different skill levels, making the approach suitable for technology-enthusiastic makers, researchers, fab labs and others who require custom membrane switches in small quantities.
Abstract: Momentary switches are important building blocks to prototype novel physical user interfaces and enable tactile, explicit and eyes-free interactions. Unfortunately, typical representatives, such as push-buttons or pre-manufactured membrane switches, often do not fulfill individual design requirements and lack customization options for rapid prototyping. With this work, we present Pushables, a DIY fabrication approach for producing thin, bendable and highly customizable membrane dome switches. To this end, we contribute a three-stage fabrication pipeline that describes the production and assembly on the basis of prototyping methods with different skill levels, making our approach suitable for technology-enthusiastic makers, researchers, fab labs and others who require custom membrane switches in small quantities. To demonstrate the wide applicability of Pushables, we present application examples from ubiquitous, mobile and wearable computing.

4 citations


Proceedings ArticleDOI
29 May 2018
TL;DR: The current web-based prototype of the conceptual VisTiles framework runs on commodity devices and is able to determine the spatial device arrangement by either a cross-device pinch gesture or an external tracking system.
Abstract: We demonstrate the prototype of the conceptual VisTiles framework. VisTiles allows exploring multivariate data sets by using multiple coordinated views that are distributed across a set of mobile devices. This setup allows users to benefit from dynamic and user-defined interface arrangements and to easily initiate co-located data exploration sessions. The current web-based prototype runs on commodity devices and is able to determine the spatial device arrangement by either a cross-device pinch gesture or an external tracking system. Multiple data sets are provided that can be explored by different visualizations (e.g., scatterplots, parallel coordinate plots, stream graphs). With this demonstration, we showcase the general concepts of VisTiles and discuss ideas for enhancements as well as the potential for application cases beyond data analysis.

2 citations


Proceedings ArticleDOI
20 Apr 2018
TL;DR: This work explores the combination of smartwatches and a large interactive display to support visual data analysis and presents a conceptual framework and its implementation, which enables analysts to explore data items using both devices in combination.
Abstract: We explore the combination of smartwatches and a large interactive display to support visual data analysis. These two extremes of interactive surfaces are increasingly popular, but feature different characteristics: display and input modalities, personal/public use, performance, and portability. With this demonstration, we present our conceptual framework and its implementation, which enables analysts to explore data items using both devices in combination. Building on an analysis scenario of crimes in Baltimore, our demonstration gives an impression of how the device combination can allow users to develop complex insights more fluidly by leveraging the device roles.

1 citation


Journal ArticleDOI
TL;DR: This article provides a comprehensive overview of the final TouchNoise concept and its approach to mapping and interaction, from which a variety of unique sonic capabilities derives.
Abstract: TouchNoise is a multitouch interface for creative work with noise. It allows direct and indirect manipulation of sound particles, which are added together in a panning and frequency space. Based on the mechanics of a multiagent system and flocking algorithms, novel possibilities for the creation and modulation of noise and harmonic spectra are supported. TouchNoise underwent extensive revisions and extensions throughout a three-year, iterative development process. This article provides a comprehensive overview of the final TouchNoise concept and its approach to mapping and interaction, from which a variety of unique sonic capabilities derives. This article is based on our experiences with a fully functional prototype implementation, and focuses on the systematic exploration and discussion of these new sonic capabilities and corresponding playing techniques, which differ strongly from traditional synthesis interfaces.

DOI
01 Jan 2018
TL;DR: A novel graspable device, called HANDle, is introduced that is developed to train wrist agility, finger strength, and finger coordination in a motivational game, together with a therapy game that combines different physiotherapeutic motion and grasp exercises, supporting custom-defined levels that match patients' needs.
Abstract: Today’s working environments are characterized by, and at the same time highly dependent on, many repetitive hand movements, such as typing or assembly tasks. The physical health of hands is thereby becoming increasingly valuable. Guided by the idea of Tangible User Interfaces (TUIs), we introduce a novel graspable device, called HANDle, that we developed to train wrist agility, finger strength, and finger coordination in a motivational game. To this end, we iteratively prototyped a fully functional controller that senses multiple finger forces and its relative position in space, and that provides visual and vibro-tactile feedback. In addition, we implemented a therapy game that combines different physiotherapeutic motion and grasp exercises, supporting custom-defined levels that match patients' needs.

Proceedings ArticleDOI
20 Apr 2018
TL;DR: Reality-Based Information Retrieval augments the classic Information retrieval process with context-dependent search cues and situated query and result visualizations using Augmented Reality technologies.
Abstract: With this work, we demonstrate our concept of Reality-Based Information Retrieval. Our principal idea is to bring Information Retrieval closer to the real world, for a new class of future, immersive IR interfaces. Technological advances in computer vision and machine learning will allow mobile Information Retrieval to make even better use of the people's surroundings and their ability to interact with the physical world. Reality-Based Information Retrieval augments the classic Information Retrieval process with context-dependent search cues and situated query and result visualizations using Augmented Reality technologies. We briefly describe our concept as an extension of the Information Retrieval pipeline and present two prototype implementations that showcase the potential of Reality-Based Information Retrieval.

DOI
01 Jan 2018
TL;DR: The design and evaluation of a smartwatch-based mid-air pointing and clicking interaction technique called Twist, Point, and Tap, or short TPT, is presented, which aims to provide a fast and low-error pointing approach that can easily be deployed to existing environments with a shared display.
Abstract: In this work, we present the design and evaluation of a smartwatch-based mid-air pointing and clicking interaction technique called Twist, Point, and Tap, or short TPT. Incorporating only commodity devices, we aim to provide a fast and low-error pointing approach that can easily be deployed to existing environments with a shared display, e.g., meeting rooms or public info points. Detected by internal sensors, TPT maps horizontal forearm movements as well as wrist rotation to relative cursor movements on a nearby large display. Left- and right-click interactions are supported through tapping on the smartwatch’s touchscreen. In a Fitts’s law study, we compared our TPT concept against an existing smartwatch-based pointing technique called Watchpoint (Katsuragawa et al., 2016). The study revealed that the TPT concept has a smaller error rate while maintaining comparable performance.
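The abstract mentions a Fitts's law study comparing TPT against Watchpoint. The paper's exact analysis is not given here, but Fitts's law evaluations conventionally summarize pointing trials using the Shannon formulation of the index of difficulty and a derived throughput. A minimal sketch of that standard computation (the trial values below are hypothetical, not taken from the paper):

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts's index of difficulty, in bits:
    ID = log2(D / W + 1), where D is the distance to the target and
    W is the target width."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput in bits per second for a single pointing trial:
    TP = ID / MT."""
    return index_of_difficulty(distance, width) / movement_time_s

# Hypothetical trial: a target 512 px away and 64 px wide, acquired in 1.5 s.
id_bits = index_of_difficulty(512, 64)  # log2(9) ≈ 3.17 bits
tp = throughput(512, 64, 1.5)           # ≈ 2.11 bits/s
```

Averaging throughput over trials per condition is what lets two techniques like TPT and Watchpoint be compared on a single performance number, alongside the error rate the study reports.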