
Erich Kobler

Researcher at Graz University of Technology

Publications - 38
Citations - 2022

Erich Kobler is an academic researcher from Graz University of Technology. The author has contributed to research in topics including computer science and iterative reconstruction. The author has an h-index of 8 and has co-authored 27 publications receiving 1202 citations. Previous affiliations of Erich Kobler include Johannes Kepler University of Linz.

Papers
Journal Article

Learning a variational network for reconstruction of accelerated MRI data.

TL;DR: In this paper, a variational network approach is proposed to reconstruct accelerated MRI data from a clinical knee imaging protocol for different acceleration factors and sampling patterns, evaluated on retrospectively and prospectively undersampled data.
Posted Content

Learning a Variational Network for Reconstruction of Accelerated MRI Data

TL;DR: The aim is to allow fast and high-quality reconstruction of clinical accelerated multi-coil MR data by learning a variational network that combines the mathematical structure of variational models with deep learning.
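
Schematically, a variational network of the kind described above unrolls a fixed number of gradient-like steps on a regularized reconstruction energy, with filter kernels, activation functions, and data-term weights learned separately for each step. The following update rule is a sketch in the spirit of this line of work rather than the paper's exact parametrization:

x^{t+1} = x^{t} - \sum_{i=1}^{N_k} (K_i^{t})^{\top} \phi_i^{t}\!\left(K_i^{t} x^{t}\right) - \lambda^{t} A^{H}\!\left(A x^{t} - y\right), \quad t = 0, \dots, T-1,

where y is the acquired undersampled multi-coil k-space data, A the MRI encoding operator, K_i^{t} learned convolution kernels, \phi_i^{t} learned activation functions, \lambda^{t} a learned data-fidelity weight, and x^{T} the final reconstruction.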
Journal Article

Assessment of the generalization of learned image reconstruction and the potential for transfer learning

TL;DR: In this paper, the authors evaluated the generalization ability of learned image reconstruction with respect to deviations in the acquisition settings between training and testing, and provided an outlook on the potential of transfer learning to fine-tune a trained reconstruction to a particular target application using only a small number of training cases.
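
To make the transfer-learning outlook concrete, the following is a minimal, hypothetical PyTorch-style sketch of fine-tuning a pre-trained reconstruction network on a handful of cases from a new target acquisition setting. The network, checkpoint name, and data are placeholders and do not reproduce the authors' model or code.

import torch
import torch.nn as nn

# Placeholder network standing in for a pre-trained reconstruction model
# (the actual learned-reconstruction architecture is not reproduced here).
model = nn.Sequential(
    nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 2, 3, padding=1),
)
# model.load_state_dict(torch.load("pretrained_weights.pt"))  # hypothetical checkpoint

# Small target-domain training set: a few (zero-filled input, reference image) pairs,
# represented here by random tensors purely as stand-ins.
target_cases = [(torch.randn(1, 2, 64, 64), torch.randn(1, 2, 64, 64)) for _ in range(10)]

# A small learning rate adapts the pre-trained weights rather than retraining from scratch.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
loss_fn = nn.MSELoss()

for epoch in range(20):
    for zero_filled, reference in target_cases:
        optimizer.zero_grad()
        reconstruction = model(zero_filled)
        loss = loss_fn(reconstruction, reference)
        loss.backward()
        optimizer.step()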
Book Chapter

Variational Networks: Connecting Variational Methods and Deep Learning

TL;DR: Surprisingly, in numerical experiments on image reconstruction problems it turns out that giving up exact minimization leads to a consistent performance increase, in particular in the case of convex models.
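
Written out, the comparison behind this observation contrasts exact minimization of a variational model with a truncated, unrolled scheme (the notation here is schematic, not quoted from the chapter):

exact minimization: x^{*} = \arg\min_{x} \; D(x, y) + R(x)

truncated scheme: x^{t+1} = x^{t} - \alpha^{t} \nabla_{x}\big( D^{t}(x^{t}, y) + R^{t}(x^{t}) \big), \quad t = 0, \dots, T-1,

with a small fixed number of steps T and parameters that may vary per step; the reported performance gain comes from training these per-step parameters end-to-end instead of running a single model to convergence.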
Proceedings Article

Total Deep Variation for Linear Inverse Problems

TL;DR: This paper proposes a novel learnable general-purpose regularizer exploiting recent architectural design patterns from deep learning and casts the learning problem as a discrete sampled optimal control problem, for which the adjoint state equations and an optimality condition are derived.
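
As a schematic of the kind of sampled optimal control formulation referred to in the summary (the symbols are chosen for illustration and are not copied from the paper): the reconstruction is generated by a gradient flow driven by a data term D and the learned regularizer R with parameters \theta, and \theta is trained through a terminal loss \ell,

\min_{\theta} \; \mathbb{E}\big[ \ell\big(x(T), x_{\mathrm{gt}}\big) \big] \quad \text{s.t.} \quad \dot{x}(t) = -\nabla_{x}\big( D(x(t), y) + R(x(t), \theta) \big), \qquad x(0) = x_{0},

discretized with a fixed number of time steps; differentiating this constrained problem yields the adjoint state equations and the optimality condition mentioned in the abstract.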