scispace - formally typeset

Klaus-Robert Müller

Researcher at Technical University of Berlin

Publications - 799
Citations - 98394

Klaus-Robert Müller is an academic researcher at the Technical University of Berlin. He has contributed to research on topics including artificial neural networks and computer science. He has an h-index of 129 and has co-authored 764 publications receiving 79391 citations. Previous affiliations of Klaus-Robert Müller include Korea University and the University of Tokyo.

Papers
Proceedings ArticleDOI

The New MPEG-4/FAMC Standard for Animated 3D Mesh Compression

TL;DR: This paper presents a new compression technique for dynamic 3D meshes, referred to as FAMC (Frame-based Animated Mesh Compression), recently promoted within the MPEG-4 standard as Amendment 2 of Part 16 (AFX - Animation Framework eXtension).
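FAMC itself combines skinning-based motion compensation with transform coding, but the temporal coherence it exploits can be illustrated with a much simpler scheme: frame-to-frame delta coding of vertex positions with uniform quantization. This is a hypothetical sketch, not the FAMC codec; all names and parameters are illustrative.

```python
import numpy as np

def encode_deltas(frames, step=0.01):
    """Quantize frame-to-frame vertex displacements.
    frames: array (n_frames, n_vertices, 3)."""
    deltas = np.diff(frames, axis=0)
    # Store the first frame exactly, then integer-quantized deltas
    return frames[0], np.round(deltas / step).astype(np.int32)

def decode_deltas(first, q, step=0.01):
    """Reconstruct the animation by accumulating dequantized deltas."""
    return np.concatenate([first[None], first + np.cumsum(q * step, axis=0)], axis=0)

# Toy "animation": vertices drifting smoothly over 10 frames
rng = np.random.default_rng(0)
frames = np.cumsum(rng.standard_normal((10, 5, 3)) * 0.05, axis=0)

first, q = encode_deltas(frames)
recon = decode_deltas(first, q)
# Per-frame error is bounded by (n_frames - 1) * step / 2
```

Because deltas between consecutive frames are small for coherent motion, the quantized integers are cheap to entropy-code; the price is that quantization error accumulates across frames, which real codecs counter with periodic intra-coded frames.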
Proceedings ArticleDOI

Brain-computer interfacing in discriminative and stationary subspaces

TL;DR: It is shown that learning in a discriminative and stationary subspace is advantageous for BCI applications and outperforms the standard stationary subspace analysis (SSA) method.
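The paper's method jointly optimizes for discriminability and stationarity; as a much-simplified illustration of the stationarity half only (not the actual algorithm), one can rank signal components by how much their power drifts across epochs and keep the most stationary ones. Everything here is a toy construction.

```python
import numpy as np

def stationarity_scores(epochs):
    """Crude stationarity proxy: components whose epoch-wise mean power
    varies least across epochs are treated as most stationary.
    epochs: array (n_epochs, n_samples, n_channels)."""
    power = np.mean(epochs ** 2, axis=1)   # mean power per epoch and channel
    return np.var(power, axis=0)           # low score = more stationary

# Toy data: channel 0 is stationary, channel 1 has a drifting gain
rng = np.random.default_rng(0)
epochs = rng.standard_normal((20, 100, 2))
epochs[..., 1] *= np.linspace(1.0, 3.0, 20)[:, None]

scores = stationarity_scores(epochs)
# scores[0] should be much smaller than scores[1]
```

Nonstationarities like the gain drift above (electrode impedance changes, fatigue) degrade classifiers trained on early sessions, which is why restricting learning to a stationary subspace helps.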
Journal ArticleDOI

Multiscale temporal neural dynamics predict performance in a complex sensorimotor task

TL;DR: It is shown that Long-Range Temporal Correlations (LRTCs), estimated from the amplitude of EEG oscillations over a range of time scales, predict performance in a complex sensorimotor task based on brain-computer interfacing (BCI).
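LRTCs in oscillation amplitudes are commonly quantified with detrended fluctuation analysis (DFA), whose scaling exponent measures how fluctuations grow with window size. The paper's pipeline is more involved, but a minimal DFA sketch (function names illustrative) looks like this:

```python
import numpy as np

def dfa(signal, scales):
    """Detrended Fluctuation Analysis: estimate the scaling exponent alpha.
    alpha ~ 0.5 for uncorrelated noise; 0.5 < alpha < 1 indicates LRTCs."""
    # Cumulative sum (profile) of the mean-centered signal
    profile = np.cumsum(signal - np.mean(signal))
    fluctuations = []
    for s in scales:
        n_windows = len(profile) // s
        # Non-overlapping windows of length s
        windows = profile[:n_windows * s].reshape(n_windows, s)
        x = np.arange(s)
        rms = []
        for w in windows:
            # Remove the linear trend within each window
            coeffs = np.polyfit(x, w, 1)
            detrended = w - np.polyval(coeffs, x)
            rms.append(np.sqrt(np.mean(detrended ** 2)))
        fluctuations.append(np.mean(rms))
    # alpha is the slope of log F(s) versus log s
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha

# White noise has no long-range correlations, so alpha should be near 0.5
rng = np.random.default_rng(0)
alpha = dfa(rng.standard_normal(10000), scales=[16, 32, 64, 128, 256])
```

In EEG studies the input would be the band-limited amplitude envelope (e.g. of alpha oscillations) rather than raw noise, and the exponent is then related to behavioral performance.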
Journal ArticleDOI

Ensemble Learning of Coarse-Grained Molecular Dynamics Force Fields with a Kernel Approach

TL;DR: This work proposes a two-layer training scheme that enables GDML to learn an effective coarse-grained (CG) model from all-atom simulation data in a sample-efficient manner. The approach yields a smaller free-energy error than neural networks when the training set is small, and comparably high accuracy when the training set is sufficiently large.
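GDML learns forces in the gradient domain with analytically differentiated kernels; as a heavily simplified stand-in, plain Gaussian kernel ridge regression on a toy one-dimensional energy surface shows the kernel-method ingredient. All names and data below are illustrative, not the paper's model.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise squared distances between rows of X and Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_krr(X, y, sigma=1.0, lam=1e-6):
    """Solve (K + lam*I) alpha = y for the dual coefficients alpha."""
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict_krr(X_train, alpha, X_new, sigma=1.0):
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy "coarse-grained coordinate" with a double-well energy surface
rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, (50, 1))
y = (X[:, 0] ** 2 - 1.0) ** 2          # energy minima at x = +/-1

alpha = fit_krr(X, y, sigma=0.5)
pred = predict_krr(X, alpha, np.array([[0.0]]), sigma=0.5)
# The barrier height at x = 0 is 1.0, which the model should recover
```

Kernel models like this interpolate smoothly between training points, which is one reason they can outperform neural networks in the small-data regime the TL;DR describes.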
Posted Content

Understanding and Comparing Deep Neural Networks for Age and Gender Classification

TL;DR: In this paper, the authors compare four popular neural network architectures, study the effect of pretraining, evaluate the robustness of the considered alignment preprocessing methods via cross-method test-set swapping, and intuitively visualize the models' prediction strategies under the given preprocessing conditions using the recent Layer-wise Relevance Propagation (LRP) algorithm.