Open Access

Low-Dimensional Models for Dimensionality Reduction and Signal Recovery: A Geometric Perspective

Low-dimensional signal models allow data acquisition, analysis, and processing to be performed more efficiently and accurately than generic techniques that ignore a signal's underlying structure.

TLDR
This work compares and contrasts, from a geometric perspective, a number of low-dimensional signal models that support stable information-preserving dimensionality reduction, and points out a common misconception related to probabilistic compressible signal models.
Abstract
We compare and contrast from a geometric perspective a number of low-dimensional signal models that support stable information-preserving dimensionality reduction. We consider sparse and compressible signal models for deterministic and random signals, structured sparse and compressible signal models, point clouds, and manifold signal models. Each model has a particular geometrical structure that enables signal information to be stably preserved via a simple linear and nonadaptive projection to a much lower-dimensional space; in each case the projection dimension is independent of the signal's ambient dimension at best or grows logarithmically with it at worst. As a bonus, we point out a common misconception related to probabilistic compressible signal models by showing that the oft-used generalized Gaussian and Laplacian models do not support stable linear dimensionality reduction.
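The projection behavior described in the abstract can be illustrated with a short sketch. All specifics here (the dimensions, the sparsity level, and the Gaussian projection matrix) are illustrative assumptions, not the paper's construction: a random linear, nonadaptive M×N map approximately preserves the distance between two sparse signals even when M is far smaller than the ambient dimension N.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, k = 1000, 80, 5          # ambient dim, projection dim, sparsity

def sparse_signal():
    """A k-sparse signal in R^N with random support and Gaussian entries."""
    x = np.zeros(N)
    x[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
    return x

# A random linear, nonadaptive projection to a much lower-dimensional space.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

x, y = sparse_signal(), sparse_signal()
ratio = np.linalg.norm(Phi @ (x - y)) / np.linalg.norm(x - y)
print(ratio)                    # close to 1: distances are roughly preserved
```

The difference x − y is at most 2k-sparse, so the projected and ambient distances concentrate around each other even though M ≪ N.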


Citations
Posted Content

A survey of dimensionality reduction techniques.

TL;DR: In this review, the plethora of dimension reduction techniques available are categorized and the mathematical insight behind them are given.
Journal ArticleDOI

Compressed Sensing, Sparsity, and Dimensionality in Neuronal Information Processing and Data Analysis

TL;DR: Recent mathematical advances that provide ways to combat dimensionality in specific situations are reviewed, shedding light on two dual questions in neuroscience: how do brains themselves process information in their intrinsically high-dimensional patterns of neural activity, and how do they learn meaningful, generalizable models of the external world from limited experience?
Book

Block-Based Compressed Sensing of Images and Video

TL;DR: For multiple-image scenarios, including video and multiview imagery, motion and disparity compensation is employed to exploit frame-to-frame redundancies due to object motion and parallax, resulting in residual frames which are more compressible and thus more easily reconstructed from compressed-sensing measurements.
Journal ArticleDOI

A Compressed Sensing Approach for Array Diagnosis From a Small Set of Near-Field Measurements

TL;DR: A technique for array diagnosis using a small number of measured data acquired by a near-field system is proposed; it gives satisfactory results in terms of failure detection with a reduction in the number of data of two orders of magnitude compared to the standard back-propagation technique and of one order of magnitude compared to the number of elements of the array.
Journal ArticleDOI

Should Penalized Least Squares Regression be Interpreted as Maximum A Posteriori Estimation?

TL;DR: It is shown that for any prior PX, the minimum mean-square error (MMSE) estimator is the solution of a penalized least squares problem with some penalty φ, which can be interpreted as the MAP estimator with the prior C·exp(−φ(x)).
References

Journal ArticleDOI

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
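As a sketch of the idea behind the lasso (hypothetical data throughout; iterative soft-thresholding is one standard solver for the equivalent penalized form, not necessarily the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 20
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]           # a sparse coefficient vector
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam = 5.0                                  # weight on the l1 penalty
L = np.linalg.norm(X, 2) ** 2              # Lipschitz constant of the gradient
beta = np.zeros(p)
for _ in range(500):
    z = beta - X.T @ (X @ beta - y) / L    # gradient step on the RSS term
    beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold

print(np.flatnonzero(np.abs(beta) > 1e-6))  # recovered support; should include 0, 1, 2
```

The l1 constraint (or, equivalently, the penalty) shrinks coefficients exactly to zero, which is why the lasso performs variable selection as well as shrinkage.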

Journal ArticleDOI

A tutorial on hidden Markov models and selected applications in speech recognition

TL;DR: In this paper, the authors provide an overview of the basic theory of hidden Markov models (HMMs) as originated by L.E. Baum and T. Petrie (1966) and give practical details on methods of implementation of the theory along with a description of selected applications of HMMs to distinct problems in speech recognition.
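To make the basic HMM computation concrete, here is a minimal sketch of the forward algorithm with hypothetical two-state parameters (not taken from the tutorial), which computes the likelihood of an observation sequence under the model:

```python
import numpy as np

A = np.array([[0.7, 0.3],      # state transition probabilities
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],      # per-state emission probabilities
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])      # initial state distribution
obs = [0, 1, 0]                # observed symbol indices

# Forward recursion: alpha[i] = P(observations so far, current state = i).
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]
print(alpha.sum())             # P(obs | model), about 0.0994
```

The recursion sums over all state paths in O(T·S²) time rather than enumerating the S^T paths explicitly, which is the key practical point of the forward algorithm.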
Book

Compressed sensing

TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients, and a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
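A small sketch of the Basis Pursuit idea (the sizes and the Gaussian measurement matrix are illustrative assumptions, not Donoho's construction): the l1-minimization is cast as a linear program by splitting the unknown into positive and negative parts, and with enough random measurements the sparse signal is recovered.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
N, n, k = 60, 30, 3                        # ambient dim, measurements, sparsity
x_true = np.zeros(N)
x_true[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((n, N)) / np.sqrt(n)
b = A @ x_true

# Split x = u - v with u, v >= 0 so that ||x||_1 = sum(u + v) at the optimum:
# min sum(u + v)  subject to  A(u - v) = b.
c = np.ones(2 * N)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]
print(np.max(np.abs(x_hat - x_true)))      # should be near zero: exact recovery
```

The LP has 2N variables and n equality constraints, so recovery remains tractable even when enumerating the sparse supports directly would not be.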