
Raimund Dachselt

Researcher at Dresden University of Technology

Publications -  198
Citations -  4319

Raimund Dachselt is an academic researcher from Dresden University of Technology. The author has contributed to research on topics including Augmented Reality and Visualization. The author has an h-index of 35 and has co-authored 183 publications receiving 3,570 citations. Previous affiliations of Raimund Dachselt include the Blekinge Institute of Technology and the Association for Computing Machinery.

Papers
Proceedings Article

Augmented Displays: Seamlessly Extending Interactive Surfaces With Head-Mounted Augmented Reality

TL;DR: A new class of display systems is presented that combines high-resolution interactive surfaces with head-coupled Augmented Reality, extending the screen real estate beyond the display and enabling AR content to be placed directly at the display's borders or within the real environment.
Proceedings Article

Blended interaction: envisioning future collaborative interactive spaces

TL;DR: This workshop will bring together leading experts in cognitive theories and post-WIMP designs and technologies to create a unified view of Blended Interaction through a multidisciplinary approach.

Eine deklarative Komponentenarchitektur und Interaktionsbausteine für dreidimensionale multimediale Anwendungen (A Declarative Component Architecture and Interaction Building Blocks for Three-Dimensional Multimedia Applications)

TL;DR: 3D graphical user interfaces (3D GUIs) have so far been developed primarily in the field of Virtual Reality (VR). This contribution summarizes the dissertation of the same title, which has also been published as a book [Da04].

Contigra - Towards a Document-based Approach to 3D Components

TL;DR: The CONTIGRA architecture is introduced as a declarative, high-level 3D component framework based entirely on XML documents, and a set of technical and authoring requirements for three-dimensional component architectures is derived.
Proceedings Article

AvatAR: An Immersive Analysis Environment for Human Motion Data Combining Interactive 3D Avatars and Trajectories

TL;DR: AvatAR, as discussed by the authors, is an immersive analysis environment for the in-situ visualization of human motion data that combines 3D trajectories with virtual avatars showing people's detailed movement and posture.