Book Chapter
Variational Networks: Connecting Variational Methods and Deep Learning
Erich Kobler, Teresa Klatzer, Kerstin Hammernik, Thomas Pock
pp. 281–293
TLDR
Surprisingly, in numerical experiments on image reconstruction problems it turns out that giving up exact minimization leads to a consistent performance increase, in particular in the case of convex models.

Abstract
In this paper, we introduce variational networks (VNs) for image reconstruction. VNs are fully learned models based on the framework of incremental proximal gradient methods. They provide a natural transition between classical variational methods and state-of-the-art residual neural networks. Due to their incremental nature, VNs are very efficient, but only approximately minimize the underlying variational model. Surprisingly, in our numerical experiments on image reconstruction problems it turns out that giving up exact minimization leads to a consistent performance increase, in particular in the case of convex models.
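As a rough illustration of the incremental structure described in the abstract, the sketch below runs a few proximal-gradient-style stages on a 1-D denoising problem. The single fixed filter, tanh nonlinearity, step size, and stage count are illustrative stand-ins for the learned per-stage parameters, not the authors' actual parameterization.

```python
import numpy as np

# One VN-style stage for a denoising data term 0.5*||x - y||^2 and a
# regularizer gradient of the form K^T act(K x), with K a convolution.
# Kernel, activation, and step size here are illustrative assumptions.
def vn_stage(x, y, kernel, step, act):
    Kx = np.convolve(x, kernel, mode="same")
    reg_grad = np.convolve(act(Kx), kernel[::-1], mode="same")
    # Gradient of the data term 0.5*||x - y||^2 is (x - y).
    return x - step * (reg_grad + (x - y))

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 4 * np.pi, 128))
noisy = clean + 0.3 * rng.standard_normal(128)

x = noisy.copy()
for _ in range(8):  # a fixed, small number of incremental stages
    x = vn_stage(x, noisy, kernel=np.array([-1.0, 2.0, -1.0]) / 4,
                 step=0.5, act=np.tanh)
```

Note that the loop runs a fixed, small number of stages rather than iterating to convergence, mirroring the approximate-minimization viewpoint of the paper.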
Citations
Journal Article
Learning a variational network for reconstruction of accelerated MRI data.
Kerstin Hammernik, Teresa Klatzer, Erich Kobler, Michael P. Recht, Daniel K. Sodickson, Thomas Pock, Florian Knoll
TL;DR: In this paper, a variational network approach is proposed to reconstruct accelerated MRI data from a clinical knee imaging protocol for different acceleration factors and sampling patterns, evaluated on retrospectively and prospectively undersampled data.
Book Chapter
RAFT: Recurrent All-Pairs Field Transforms for Optical Flow
Zachary Teed, Jia Deng
TL;DR: RAFT extracts per-pixel features, builds multi-scale 4D correlation volumes for all pairs of pixels, and iteratively updates a flow field through a recurrent unit that performs lookups on the correlation volumes.
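The all-pairs correlation volume at the core of this construction can be sketched as a single tensor contraction. The feature-map shapes and the normalization by feature dimension are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

# Sketch of an all-pairs correlation volume, assuming per-pixel feature
# maps of shape (H, W, D). Every pixel of f1 is dotted with every pixel
# of f2, yielding a 4D volume of shape (H, W, H, W).
def correlation_volume(f1, f2):
    _, _, D = f1.shape
    return np.einsum("ijd,kld->ijkl", f1, f2) / np.sqrt(D)

rng = np.random.default_rng(0)
f1 = rng.standard_normal((4, 5, 8))
corr = correlation_volume(f1, f1)
print(corr.shape)  # (4, 5, 4, 5)
```

In RAFT the recurrent update unit then looks up local windows of this volume at multiple scales rather than consuming it whole.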
Journal Article
Solving inverse problems using data-driven models
TL;DR: This survey paper aims to give an account of some of the main contributions in data-driven inverse problems.
Journal Article
A gentle introduction to deep learning in medical image processing
TL;DR: A gentle introduction to deep learning in medical image processing is given, proceeding from theoretical foundations to applications, and discussing general reasons for the popularity of deep learning, including several major breakthroughs in computer science.
Journal Article
Modern Regularization Methods for Inverse Problems
Martin Benning, Martin Burger
TL;DR: In this article, the authors provide a reasonably comprehensive overview of the shift towards modern nonlinear regularization methods, covering their analysis, applications, and open issues for future research, and linking them to other fields of recent interest such as image processing and compressed sensing.
References
Proceedings Article
Deep Residual Learning for Image Recognition
TL;DR: In this paper, the authors propose a residual learning framework that eases the training of networks substantially deeper than those used previously; an ensemble of such residual networks won 1st place in the ILSVRC 2015 classification task.
Proceedings Article
ImageNet Classification with Deep Convolutional Neural Networks
TL;DR: A large, deep convolutional neural network, consisting of five convolutional layers (some followed by max-pooling layers) and three fully-connected layers with a final 1000-way softmax, achieves state-of-the-art classification performance on ImageNet.
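To make the layer stack concrete, the sketch below traces spatial sizes through an AlexNet-style stack, using the commonly cited kernel/stride/padding values for a 227×227 input; this is an illustration of the architecture's shape arithmetic, not the original implementation.

```python
# Spatial output size of a convolution or pooling layer.
def out_size(size, kernel, stride, pad=0):
    return (size + 2 * pad - kernel) // stride + 1

size = 227  # commonly cited input resolution
for name, k, s, p in [
    ("conv1 11x11/4", 11, 4, 0),
    ("pool1  3x3/2",   3, 2, 0),
    ("conv2  5x5/1 pad2", 5, 1, 2),
    ("pool2  3x3/2",   3, 2, 0),
    ("conv3  3x3/1 pad1", 3, 1, 1),
    ("conv4  3x3/1 pad1", 3, 1, 1),
    ("conv5  3x3/1 pad1", 3, 1, 1),
    ("pool5  3x3/2",   3, 2, 0),
]:
    size = out_size(size, k, s, p)
    print(f"{name}: {size}x{size}")
# The 6x6 pool5 output is flattened into three fully-connected
# layers ending in a 1000-way softmax.
```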
Book
Convex Optimization
Stephen Boyd, Lieven Vandenberghe
TL;DR: This book gives a comprehensive introduction to the subject, with a focus on recognizing convex optimization problems and then finding the most appropriate technique for solving them.
Book Chapter
Microsoft COCO: Common Objects in Context
Tsung-Yi Lin, Michael Maire, Serge Belongie, James Hays, Pietro Perona, Deva Ramanan, Piotr Dollár, C. Lawrence Zitnick
TL;DR: A new dataset that aims to advance the state of the art in object recognition by placing it in the context of the broader question of scene understanding, gathering images of complex everyday scenes containing common objects in their natural context.
Journal Article
Learning representations by back-propagating errors
TL;DR: Back-propagation repeatedly adjusts the connection weights in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, internal hidden units come to represent important features of the task domain.
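The weight-adjustment loop described above can be sketched for a one-hidden-layer network fitted to a toy regression target. Layer sizes, learning rate, and iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((32, 3))
target = X @ np.array([[1.0], [-2.0], [0.5]])  # toy linear target

W1 = 0.5 * rng.standard_normal((3, 8))
W2 = 0.5 * rng.standard_normal((8, 1))

def loss():
    return np.mean((np.tanh(X @ W1) @ W2 - target) ** 2)

initial_loss = loss()
for _ in range(1000):
    h = np.tanh(X @ W1)              # forward pass
    err = h @ W2 - target            # d(loss)/d(output), up to a constant
    gW2 = h.T @ err / len(X)         # backward through the output layer
    gh = (err @ W2.T) * (1 - h**2)   # backward through tanh
    gW1 = X.T @ gh / len(X)
    W1 -= 0.1 * gW1                  # adjust weights by gradient descent
    W2 -= 0.1 * gW2
final_loss = loss()
print(initial_loss, final_loss)
```

The backward pass reuses the forward activations `h`, which is the key computational saving of back-propagation over perturbing each weight individually.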