Andreas M. Tillmann

Researcher at RWTH Aachen University

Publications: 33
Citations: 979

Andreas M. Tillmann is an academic researcher from RWTH Aachen University. The author has contributed to research on topics including underdetermined systems and computational complexity theory. The author has an h-index of 10 and has co-authored 31 publications receiving 857 citations. Previous affiliations of Andreas M. Tillmann include Braunschweig University of Technology and Technische Universität Darmstadt.

Papers
Journal Article (DOI)

The Computational Complexity of the Restricted Isometry Property, the Nullspace Property, and Related Concepts in Compressed Sensing

TL;DR: Conjectures about the intractability of these properties are confirmed by showing that, for a given matrix A and positive integer k, computing the best constants for which the RIP or NSP holds is, in general, NP-hard.
Posted Content

The Computational Complexity of the Restricted Isometry Property, the Nullspace Property, and Related Concepts in Compressed Sensing

TL;DR: In this article, it was shown that, for a given matrix A and positive integer k, computing the best constants for which the restricted isometry property (RIP) or nullspace property (NSP) holds is NP-hard.
Journal Article (DOI)

DOLPHIn—Dictionary Learning for Phase Retrieval

TL;DR: This work proposes a new algorithm to learn a dictionary for reconstructing and sparsely encoding signals from phaseless measurements; it jointly reconstructs the unknown signal and learns a dictionary such that each patch of the estimated image can be sparsely represented.
Journal Article (DOI)

On the Computational Intractability of Exact and Approximate Dictionary Learning

TL;DR: In this paper, it was shown that learning a dictionary with which a given set of training signals can be represented as sparsely as possible is indeed an NP-hard problem; hardness of approximating the solution to within large factors of the optimal sparsity level was also established.