Open Access · Posted Content

A Theoretical Analysis of Joint Manifolds

TLDR
This work proposes a new joint manifold framework for data ensembles that captures dependencies among multiple sensors and shows that simple algorithms can exploit the joint manifold structure to improve their performance on standard signal processing applications.
Abstract
The emergence of low-cost sensor architectures for diverse modalities has made it possible to deploy sensor arrays that capture a single event from many vantage points and in multiple modalities. In many scenarios, these sensors acquire very high-dimensional data such as audio signals, images, and video. To cope with such high-dimensional data, we typically rely on low-dimensional models. Manifold models are particularly powerful, capturing the structure of high-dimensional data when it is governed by a low-dimensional set of parameters. However, these models typically do not take into account dependencies among multiple sensors. We thus propose a new joint manifold framework for data ensembles that exploits such dependencies. We show that simple algorithms can exploit the joint manifold structure to improve their performance on standard signal processing applications. Additionally, recent results concerning dimensionality reduction for manifolds enable us to formulate a network-scalable data compression scheme that uses random projections of the sensed data. This scheme efficiently fuses the data from all sensors through the addition of such projections, regardless of the data modalities and dimensions.
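The fusion step described in the abstract admits a compact linear-algebra view: if each sensor applies its own random projection to its local signal, summing the projected vectors at a fusion center is equivalent to applying a single block random projection to the concatenated (joint) signal. The sketch below illustrates this with Gaussian projection matrices; the dimensions, matrices, and variable names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the authors' code): each of J sensors observes a signal
# x_j of dimension N_j, applies its own random projection Phi_j (M x N_j), and
# transmits y_j = Phi_j @ x_j. Summing the y_j at the fusion center equals one
# random projection of the concatenated joint signal, regardless of per-sensor
# dimension or modality.
import numpy as np

rng = np.random.default_rng(0)
M = 64                      # common projection dimension (assumed)
dims = [256, 1024, 4096]    # per-sensor signal dimensions (illustrative)

signals = [rng.standard_normal(n) for n in dims]
mats = [rng.standard_normal((M, n)) / np.sqrt(M) for n in dims]

# Each sensor transmits only its M-dimensional projection.
projections = [Phi @ x for Phi, x in zip(mats, signals)]
fused = np.sum(projections, axis=0)

# Equivalent view: a single projection of the concatenated joint signal.
joint_Phi = np.hstack(mats)             # M x (N_1 + ... + N_J)
joint_x = np.concatenate(signals)
assert np.allclose(fused, joint_Phi @ joint_x)
```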



Citations
Journal Article

Low-Dimensional Models for Dimensionality Reduction and Signal Recovery: A Geometric Perspective

TL;DR: This work compares and contrasts, from a geometric perspective, a number of low-dimensional signal models that support stable, information-preserving dimensionality reduction, including sparse and compressible signal models for deterministic and random signals, and points out a common misconception related to probabilistic compressible signal models.
Journal Article

Joint Manifolds for Data Fusion

TL;DR: It is shown that joint manifold structure can improve the performance of a variety of signal processing algorithms for applications including classification and manifold learning, and this structure is leveraged to formulate a scalable and universal dimensionality reduction scheme that efficiently fuses the data from all sensors.
Dissertation

Compressive sensing for signal ensembles

TL;DR: Compressive Sensing for Signal Ensembles is presented as a probabilistic method to estimate the strength of the response of individual components of a signal ensemble.
Proceedings Article

High-dimensional data fusion via joint manifold learning

TL;DR: It is shown that joint manifold structure can lead to improved performance for manifold learning, and this structure is leveraged to formulate a universal, network-scalable dimensionality reduction scheme that efficiently fuses the data from all sensors.
References
Journal Article

Compressed sensing

TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements that allow reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients, and a good approximation to those N coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
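As a concrete illustration of the Basis Pursuit step mentioned above, the sketch below recovers a sparse vector from a few random measurements by solving min ||x||_1 subject to Ax = y, rewritten as a linear program via the standard split x = u - v with u, v >= 0. The problem sizes and the Gaussian measurement matrix are illustrative assumptions, not the setting of the cited paper.

```python
# Basis pursuit sketch: minimize ||x||_1 subject to A x = y, posed as an LP.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n_meas, dim, sparsity = 40, 120, 5      # illustrative sizes

x_true = np.zeros(dim)
support = rng.choice(dim, sparsity, replace=False)
x_true[support] = rng.standard_normal(sparsity)
A = rng.standard_normal((n_meas, dim)) / np.sqrt(n_meas)
y = A @ x_true

c = np.ones(2 * dim)                    # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])               # equality constraint: A(u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:dim] - res.x[dim:]
print("recovery error:", np.linalg.norm(x_hat - x_true))
```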
Journal Article

Nonlinear dimensionality reduction by locally linear embedding.

TL;DR: Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs and learns the global structure of nonlinear manifolds.
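For readers who want to try the algorithm summarized above, scikit-learn ships an independent implementation of LLE; the swiss-roll data and parameter choices below are illustrative, not taken from the cited paper.

```python
# Usage sketch of locally linear embedding via scikit-learn.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, color = make_swiss_roll(n_samples=1500, random_state=0)
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
Y = lle.fit_transform(X)        # neighborhood-preserving 2-D embedding
print(Y.shape)                  # (1500, 2)
```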
Journal Article

Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information

TL;DR: In this paper, the authors consider the model problem of reconstructing an object from incomplete frequency samples and show that, with probability at least 1 - O(N^-M), f can be reconstructed exactly as the solution to the ℓ1 minimization problem.
Journal Article

Eigenfaces for recognition

TL;DR: A near-real-time computer system is described that can locate and track a subject's head and then recognize the person by comparing characteristics of the face to those of known individuals; the approach is easy to implement using a neural network architecture.
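The eigenfaces approach summarized above amounts to projecting face images onto the leading principal components of a training set and matching in that low-dimensional "face space". The sketch below reproduces that idea with scikit-learn stand-ins (PCA plus a nearest-neighbor classifier on the Olivetti faces dataset, downloaded on first use); it is not the original system, and the component count and data split are illustrative.

```python
# Eigenfaces-style recognition sketch: PCA projection + nearest-neighbor match.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

faces = fetch_olivetti_faces()
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, test_size=0.25, random_state=0, stratify=faces.target)

pca = PCA(n_components=50, whiten=True).fit(X_train)        # "eigenfaces"
knn = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(X_train), y_train)
print("accuracy:", knn.score(pca.transform(X_test), y_test))
```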
Journal Article

A global geometric framework for nonlinear dimensionality reduction.

TL;DR: An approach to dimensionality reduction is described that uses easily measured local metric information to learn the underlying global geometry of a data set; it efficiently computes a globally optimal solution and is guaranteed to converge asymptotically to the true structure.
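The method summarized above is Isomap; scikit-learn provides an independent implementation, shown below on synthetic S-curve data. The dataset and parameters are illustrative assumptions, not taken from the cited paper.

```python
# Usage sketch of Isomap via scikit-learn.
from sklearn.datasets import make_s_curve
from sklearn.manifold import Isomap

X, color = make_s_curve(n_samples=1500, random_state=0)
iso = Isomap(n_neighbors=10, n_components=2)
Y = iso.fit_transform(X)        # embedding preserving estimated geodesic distances
print(Y.shape)                  # (1500, 2)
```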