Daniel Leithinger
Researcher at University of Colorado Boulder
Publications - 57
Citations - 2560
Daniel Leithinger is an academic researcher at the University of Colorado Boulder. He has contributed to research on haptic technology and computer science. He has an h-index of 22 and has co-authored 50 publications receiving 1,878 citations. His previous affiliations include the Massachusetts Institute of Technology.
Papers
Proceedings Article
inFORM: dynamic physical affordances and constraints through shape and object actuation
TL;DR: This work outlines potential interaction techniques and introduces Dynamic Physical Affordances and Constraints with the inFORM system. Built on top of a state-of-the-art shape display, the system provides variable stiffness rendering and real-time user input through direct touch and tangible interaction.
Proceedings Article
Jamming user interfaces: programmable particle stiffness and sensing for malleable and shape-changing devices
TL;DR: This work enables jamming structures to sense input and function as interaction devices through two contributed methods for high-resolution shape sensing: 1) index-matched particles and fluids, and 2) capacitive and electric field sensing.
Proceedings Article
Relief: a scalable actuated shape display
Daniel Leithinger, Hiroshi Ishii +1 more
TL;DR: Relief is an actuated tabletop display that renders and animates three-dimensional shapes with a malleable surface. It is controlled by a low-cost, scalable platform built on open-source hardware and software tools.
Proceedings Article
Physical telepresence: shape capture and display for embodied, computer-mediated remote collaboration
TL;DR: A preliminary evaluation shows how users are able to manipulate remote objects; observations of several different manipulation techniques that highlight the system's expressive nature are also reported.
Proceedings Article
Direct and gestural interaction with relief: a 2.5D shape display
TL;DR: This work argues why input modalities beyond direct touch are required and proposes combining freehand gestures with direct touch, which provides additional degrees of freedom and resolves input ambiguities while keeping the locus of interaction on the shape output.