scispace - formally typeset

Sparse approximation

About: Sparse approximation is a research topic. Over its lifetime, 18,037 publications have been published within this topic, receiving 497,739 citations.


Papers
Journal ArticleDOI
TL;DR: The novelty of this work lies in presenting a framework of spatial-spectral KSRC and in measuring spatial similarity by neighborhood filtering in the kernel feature space, which opens a wide field for future developments in which filtering methods can easily be incorporated.
Abstract: Kernel sparse representation classification (KSRC), a nonlinear extension of sparse representation classification, shows its good performance for hyperspectral image classification. However, KSRC only considers the spectra of unordered pixels, without incorporating information on the spatially adjacent data. This paper proposes a neighboring filtering kernel to spatial-spectral kernel sparse representation for enhanced classification of hyperspectral images. The novelty of this work consists in: 1) presenting a framework of spatial-spectral KSRC; and 2) measuring the spatial similarity by means of neighborhood filtering in the kernel feature space. Experiments on several hyperspectral images demonstrate the effectiveness of the presented method, and the proposed neighboring filtering kernel outperforms the existing spatial-spectral kernels. In addition, the proposed spatial-spectral KSRC opens a wide field for future developments in which filtering methods can be easily incorporated.
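The paper above builds on sparse representation classification (SRC): a test sample is approximated as a sparse linear combination of labeled dictionary atoms, and the class whose atoms yield the smallest reconstruction residual wins. A minimal sketch of that base idea (not the paper's kernelized, spatial-spectral variant), using a simple orthogonal matching pursuit as the sparse solver:

```python
import numpy as np

def omp(D, y, k):
    """Greedy orthogonal matching pursuit: select up to k atoms of D to approximate y."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares refit on the selected atoms, then update the residual
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

def src_classify(D, labels, y, k=2):
    """Classify y by the class whose atoms give the smallest residual."""
    x = omp(D, y, k)
    best, best_err = None, np.inf
    for c in set(labels):
        # keep only the coefficients belonging to class c
        xc = np.where(np.array(labels) == c, x, 0.0)
        err = np.linalg.norm(y - D @ xc)
        if err < best_err:
            best, best_err = c, err
    return best
```

The function names and the choice of OMP are illustrative assumptions; the paper itself solves the sparse coding problem in a kernel feature space with a spatial neighborhood filter.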

164 citations

Journal ArticleDOI
TL;DR: The main lesson learned is that, depending on the performance measure, greedy approaches and iteratively reweighted algorithms are the most efficient algorithms in terms of computational complexity, sparsity recovery, or mean-square error.

164 citations

Proceedings ArticleDOI
01 Dec 2013
TL;DR: This paper casts tracking as a novel multi-task multi-view sparse learning problem and exploits cues from multiple views, including various types of visual features such as intensity, color, and edge, where each feature observation can be sparsely represented by a linear combination of atoms from an adaptive feature dictionary.
Abstract: Combining multiple observation views has proven beneficial for tracking. In this paper, we cast tracking as a novel multi-task multi-view sparse learning problem and exploit the cues from multiple views including various types of visual features, such as intensity, color, and edge, where each feature observation can be sparsely represented by a linear combination of atoms from an adaptive feature dictionary. The proposed method is integrated in a particle filter framework where every view in each particle is regarded as an individual task. We jointly consider the underlying relationship between tasks across different views and different particles, and tackle it in a unified robust multi-task formulation. In addition, to capture the frequently emerging outlier tasks, we decompose the representation matrix to two collaborative components which enable a more robust and accurate approximation. We show that the proposed formulation can be efficiently solved using the Accelerated Proximal Gradient method with a small number of closed-form updates. The presented tracker is implemented using four types of features and is tested on numerous benchmark video sequences. Both the qualitative and quantitative results demonstrate the superior performance of the proposed approach compared to several state-of-the-art trackers.

164 citations

Proceedings ArticleDOI
19 Apr 2009
TL;DR: A new sparse representation for acoustic signals is presented which is based on a mixing model defined in the complex-spectrum domain (where additivity holds), and allows us to extract recurrent patterns of magnitude spectra that underlie observed complex spectra and the phase estimates of constituent signals.
Abstract: This paper presents a new sparse representation for acoustic signals which is based on a mixing model defined in the complex-spectrum domain (where additivity holds), and allows us to extract recurrent patterns of magnitude spectra that underlie observed complex spectra and the phase estimates of constituent signals. An efficient iterative algorithm is derived, which reduces to the multiplicative update algorithm for non-negative matrix factorization developed by Lee under a particular condition.

164 citations

Book ChapterDOI
28 May 2001
TL;DR: Experience indicates that for matrices arising in scientific simulations, register-level optimizations are critical; this work focuses on the optimization and parameter-selection techniques used in Sparsity for register-level optimizations.
Abstract: Sparse matrix-vector multiplication is an important computational kernel that tends to perform poorly on modern processors, largely because of its high ratio of memory operations to arithmetic operations. Optimizing this algorithm is difficult, both because of the complexity of memory systems and because the performance is highly dependent on the nonzero structure of the matrix. The Sparsity system is designed to address these problems by allowing users to automatically build sparse matrix kernels that are tuned to their matrices and machines. The most difficult aspect of optimizing these algorithms is selecting among a large set of possible transformations and choosing parameters, such as block size. In this paper we discuss the optimization of two operations: a sparse matrix times a dense vector and a sparse matrix times a set of dense vectors. Our experience indicates that for matrices arising in scientific simulations, register-level optimizations are critical, and we focus here on the optimizations and parameter selection techniques used in Sparsity for register-level optimizations. We demonstrate speedups of up to 2× for the single vector case and 5× for the multiple vector case.
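The kernel being tuned above is easiest to see in its unoptimized form. A minimal sketch of sparse matrix-vector multiplication over the standard compressed sparse row (CSR) layout, which makes the memory-bound access pattern explicit (Sparsity's register blocking reorganizes exactly these inner loops; the function name here is illustrative):

```python
import numpy as np

def csr_matvec(data, indices, indptr, x):
    """y = A @ x for A stored in CSR format: nonzero values, their column
    indices, and row pointers delimiting each row's slice."""
    n_rows = len(indptr) - 1
    y = np.zeros(n_rows)
    for i in range(n_rows):
        start, end = indptr[i], indptr[i + 1]
        # dot product over the nonzeros of row i only; x is gathered
        # through the index array, which is what stresses the memory system
        y[i] = data[start:end] @ x[indices[start:end]]
    return y
```

Each nonzero contributes one multiply-add but requires loading a value, a column index, and an indirectly addressed element of x, which is the high memory-to-arithmetic ratio the abstract refers to.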

164 citations


Network Information
Related Topics (5)
Feature extraction
111.8K papers, 2.1M citations
93% related
Image segmentation
79.6K papers, 1.8M citations
92% related
Convolutional neural network
74.7K papers, 2M citations
92% related
Deep learning
79.8K papers, 2.1M citations
90% related
Image processing
229.9K papers, 3.5M citations
89% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    193
2022    454
2021    641
2020    924
2019    1,208
2018    1,371