
Christian Theobalt

Researcher at Max Planck Society

Publications: 508
Citations: 34,680

Christian Theobalt is an academic researcher from the Max Planck Society. The author has contributed to research in the topics of motion capture and computer science. The author has an h-index of 89 and has co-authored 450 publications receiving 25,487 citations. Previous affiliations of Christian Theobalt include Stanford University and Facebook.

Papers
Posted Content

High-Fidelity Neural Human Motion Transfer from Monocular Video

TL;DR: In this article, a recurrent neural network (RNN) is used to generate intermediate representations from 2D poses and their temporal derivatives, which are then used to synthesize results with plausible dynamics and pose-dependent detail.
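The pipeline summarized above maps a sequence of 2D poses and their temporal derivatives to per-frame intermediate features through a recurrent network. The sketch below illustrates such a mapping; the module name PoseSequenceEncoder, the joint count, and the feature dimension are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the authors' code): a recurrent network that maps a
# sequence of 2D poses and their temporal derivatives to per-frame
# intermediate features, which a downstream image generator would consume.
import torch
import torch.nn as nn

class PoseSequenceEncoder(nn.Module):
    def __init__(self, num_joints=25, feat_dim=256):
        super().__init__()
        # Input per frame: 2D joint positions plus their temporal derivatives.
        self.rnn = nn.GRU(input_size=num_joints * 2 * 2,
                          hidden_size=feat_dim, batch_first=True)

    def forward(self, poses_2d):
        # poses_2d: (batch, time, num_joints, 2)
        velocities = poses_2d[:, 1:] - poses_2d[:, :-1]          # temporal derivative
        velocities = torch.cat([torch.zeros_like(velocities[:, :1]), velocities], dim=1)
        x = torch.cat([poses_2d, velocities], dim=-1)            # (B, T, J, 4)
        x = x.flatten(start_dim=2)                               # (B, T, J*4)
        features, _ = self.rnn(x)                                # (B, T, feat_dim)
        return features
```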
Posted Content

EgoRenderer: Rendering Human Avatars from Egocentric Camera Images

TL;DR: EgoRenderer, as described in this paper, uses a parametric body model to estimate body pose from the egocentric view and then synthesizes an external free-viewpoint image by projecting the parametric model to the user-specified target viewpoint.
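The projection step mentioned in the summary can be illustrated with a short sketch: posed 3D body-model vertices are mapped into a user-specified external target viewpoint. The function name and the simple pinhole camera model are assumptions for illustration only; the paper's actual rendering pipeline is more involved.

```python
# Hedged sketch of projecting posed body-model vertices to a target viewpoint
# with a pinhole camera; names and the camera model are illustrative.
import numpy as np

def project_to_target_view(vertices_world, R, t, K):
    """vertices_world: (N, 3) posed body-model vertices in world coordinates.
    R (3x3), t (3,): extrinsics of the target viewpoint; K (3x3): intrinsics."""
    cam = vertices_world @ R.T + t          # world -> target camera coordinates
    uv = cam @ K.T                          # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:3]           # perspective divide -> pixel coordinates
```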

Scene-aware Egocentric 3D Human Pose Estimation **Supplementary Material**

TL;DR: In this article, the authors estimate the transformation matrix Mhead2ego between the calibration board and the fisheye camera with hand-eye calibration, and then estimate the relative pose Mego2calib between the egocentric camera and the external calibration board.
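The two calibration transforms named in the summary can be combined once both are known. The sketch below shows the usual way such 4x4 homogeneous transforms are composed; the frame names mirror the TL;DR, the placeholder values and the composition order are assumptions rather than the paper's exact procedure.

```python
# Illustrative composition of two rigid transforms as 4x4 homogeneous matrices.
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

# M_head2ego: head/device frame -> egocentric (fisheye) camera frame
# M_ego2calib: egocentric camera frame -> external calibration-board frame
M_head2ego = make_transform(np.eye(3), np.array([0.0, 0.05, 0.10]))   # placeholder values
M_ego2calib = make_transform(np.eye(3), np.array([0.0, 0.00, 1.50]))  # placeholder values

# Chaining them maps the head frame directly into the calibration-board frame.
M_head2calib = M_ego2calib @ M_head2ego
```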

3D Image Analysis and Synthesis at MPI Informatik.

TL;DR: In this article, a model-based system that nicely illustrates the concept of 3D Image Analysis and Synthesis was developed to acquire, reconstruct, and render free-viewpoint videos of human actors.

Supplementary of NeuRIS

TL;DR: In this paper, the authors remove faces of a (predicted) mesh that are not observed in its corresponding ground-truth mesh or not visible to all cameras, and filter out faces at empty regions of the GT mesh using 2D masks rendered from the GT mesh for each input view.
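The visibility-based face filtering described above can be sketched as follows: a predicted-mesh face is kept only if its projection falls inside the 2D mask rendered from the ground-truth mesh in every input view. Function and variable names are illustrative assumptions; the paper's actual filtering details may differ.

```python
# Rough sketch of mask-based face filtering for mesh evaluation; not the
# authors' code. A face is kept only if its centroid projects inside the
# ground-truth mask in every view.
import numpy as np

def filter_faces_by_masks(vertices, faces, cameras, gt_masks):
    """vertices: (V, 3); faces: (F, 3) vertex indices;
    cameras: list of (R, t, K) per view; gt_masks: list of (H, W) boolean masks
    rendered from the ground-truth mesh."""
    keep = np.ones(len(faces), dtype=bool)
    centroids = vertices[faces].mean(axis=1)                   # (F, 3) face centroids
    for (R, t, K), mask in zip(cameras, gt_masks):
        cam = centroids @ R.T + t                              # world -> camera coordinates
        uv = cam @ K.T
        uv = (uv[:, :2] / uv[:, 2:3]).round().astype(int)      # pixel coordinates
        h, w = mask.shape
        inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
        observed = np.zeros(len(faces), dtype=bool)
        observed[inside] = mask[uv[inside, 1], uv[inside, 0]]
        keep &= observed                                       # must be observed in every view
    return faces[keep]
```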