Topic

Sparse approximation

About: Sparse approximation is a research topic. Over its lifetime, 18,037 publications have been published on this topic, receiving 497,739 citations.


Papers
Journal Article
TL;DR: The proposed sketch-photo synthesis method works at patch level and is composed of two steps: sparse neighbor selection (SNS) for an initial estimate of the pseudoimage (pseudosketch or pseudophoto) and sparse-representation-based enhancement (SRE) for further improving the quality of the synthesized image.
Abstract: Sketch-photo synthesis plays an important role in sketch-based face photo retrieval and photo-based face sketch retrieval systems. In this paper, we propose an automatic sketch-photo synthesis and retrieval algorithm based on sparse representation. The proposed sketch-photo synthesis method works at patch level and is composed of two steps: sparse neighbor selection (SNS) for an initial estimate of the pseudoimage (pseudosketch or pseudophoto) and sparse-representation-based enhancement (SRE) for further improving the quality of the synthesized image. SNS can find closely related neighbors adaptively and then generate an initial estimate for the pseudoimage. In SRE, a coupled sparse representation model is first constructed to learn the mapping between sketch patches and photo patches, and a patch-derivative-based sparse representation method is subsequently applied to enhance the quality of the synthesized photos and sketches. Finally, four retrieval modes, namely, sketch-based, photo-based, pseudosketch-based, and pseudophoto-based retrieval are proposed, and a retrieval algorithm is developed by using sparse representation. Extensive experimental results illustrate the effectiveness of the proposed face sketch-photo synthesis and retrieval algorithms.
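
A minimal sketch of the coupled sparse-coding idea behind this kind of patch-level synthesis (not the authors' exact SNS/SRE pipeline): code a sketch patch sparsely over a sketch dictionary and reuse the same coefficients over the paired photo dictionary. The dictionaries, patch size, and sparsity level below are illustrative assumptions.

```python
# Coupled sparse coding for patch-level sketch-to-photo synthesis (sketch of the
# general idea only; dictionaries are random stand-ins for ones built from
# aligned training sketch/photo patches).
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)

n_atoms, patch_dim = 256, 64                  # e.g. 8x8 patches, 256 atoms (assumed)
D_sketch = rng.standard_normal((patch_dim, n_atoms))
D_photo = rng.standard_normal((patch_dim, n_atoms))
D_sketch /= np.linalg.norm(D_sketch, axis=0)  # unit-norm atoms
D_photo /= np.linalg.norm(D_photo, axis=0)

def synthesize_photo_patch(sketch_patch, k=10):
    """Code the sketch patch sparsely over D_sketch, then reuse the same sparse
    coefficients over the paired D_photo to estimate the pseudo-photo patch."""
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
    omp.fit(D_sketch, sketch_patch)
    return D_photo @ omp.coef_

pseudo_photo_patch = synthesize_photo_patch(rng.standard_normal(patch_dim))
print(pseudo_photo_patch.shape)               # (64,)
```

In a full pipeline, overlapping synthesized patches would be averaged into the pseudo-image, which is the stage the abstract's enhancement step then refines.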

164 citations

Proceedings Article
01 Aug 2008
TL;DR: It is shown that the proposed approximation framework can successfully determine multiple target locations by using linear dimensionality-reducing projections of sensor measurements, ameliorating the communication requirements.
Abstract: We propose an approximation framework for distributed target localization in sensor networks. We represent the unknown target positions on a location grid as a sparse vector, whose support encodes the multiple target locations. The location vector is linearly related to multiple sensor measurements through a sensing matrix, which can be locally estimated at each sensor. We show that we can successfully determine multiple target locations by using linear dimensionality-reducing projections of sensor measurements. The overall communication bandwidth requirement per sensor is logarithmic in the number of grid points and linear in the number of targets, ameliorating the communication requirements. Simulation results demonstrate the performance of the proposed framework.
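
A minimal sketch of the sparse-recovery step described above, assuming the composition of the sensing matrix and the dimensionality-reducing projections behaves like a random Gaussian matrix; the grid size, target count, and number of projections are illustrative, not the paper's setup.

```python
# Recover a grid-sparse location vector from compressed sensor projections.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
n_grid, n_targets, n_proj = 400, 3, 60        # 20x20 grid, 3 targets, 60 projections

x_true = np.zeros(n_grid)                     # sparse location vector
x_true[rng.choice(n_grid, n_targets, replace=False)] = 1.0

# Combined matrix: random projections composed with the sensing matrix (assumed Gaussian).
Phi_A = rng.standard_normal((n_proj, n_grid)) / np.sqrt(n_proj)
y = Phi_A @ x_true                            # compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_targets, fit_intercept=False)
omp.fit(Phi_A, y)
print(sorted(np.flatnonzero(omp.coef_)), sorted(np.flatnonzero(x_true)))
```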

163 citations

Journal Article
TL;DR: The proposed NJCRC-LAD method is tested on three HSIs, and the experimental results suggest that the proposed algorithm outperforms the corresponding sparsity-based algorithms and the classical support vector machine hyperspectral classifier.
Abstract: Sparse representation has been widely used in image classification. Sparsity-based algorithms are, however, known to be time consuming. Meanwhile, recent work has shown that it is the collaborative representation (CR) rather than the sparsity constraint that determines the performance of the algorithm. We therefore propose a nonlocal joint CR classification method with a locally adaptive dictionary (NJCRC-LAD) for hyperspectral image (HSI) classification. This paper focuses on the working mechanism of CR and builds the joint collaboration model (JCM). The joint-signal matrix is constructed with the nonlocal pixels of the test pixel. A subdictionary is utilized, which is adaptive to the nonlocal signal matrix instead of the entire dictionary. The proposed NJCRC-LAD method is tested on three HSIs, and the experimental results suggest that the proposed algorithm outperforms the corresponding sparsity-based algorithms and the classical support vector machine hyperspectral classifier.
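
For reference, a minimal sketch of plain collaborative-representation classification, the l2-regularized coding plus class-wise residual rule the abstract builds on; it is not the paper's nonlocal joint model or locally adaptive dictionary, and the regularization weight and toy data are assumptions.

```python
# Collaborative-representation classification: l2-regularized coding over the
# whole training dictionary, then assignment by class-wise reconstruction residual.
import numpy as np

def crc_classify(D, labels, y, lam=0.01):
    """D: (n_bands, n_train) column-normalized training spectra; labels: class per
    column; y: (n_bands,) test spectrum. lam is an assumed regularization weight."""
    alpha = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
    def residual(c):
        return np.linalg.norm(y - D[:, labels == c] @ alpha[labels == c])
    return min(np.unique(labels), key=residual)

# Toy usage with random stand-in spectra (two classes, 50 bands).
rng = np.random.default_rng(2)
D = rng.standard_normal((50, 40))
D /= np.linalg.norm(D, axis=0)
labels = np.repeat([0, 1], 20)
y = D[:, 5] + 0.05 * rng.standard_normal(50)  # noisy copy of a class-0 atom
print(crc_classify(D, labels, y))             # expected: 0
```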

163 citations

Journal Article
Xuefeng Chen, Zhaohui Du, Jimeng Li, Xiang Li, Han Zhang
TL;DR: A new scheme, Sparse Extraction of Impulse by Adaptive Dictionary (SpaEIAD), is proposed to extract impulse components; it relies on the sparse model of compressed sensing, involving sparse dictionary learning and redundant representation over the learned dictionary.
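
A minimal sketch of the general dictionary-learning-plus-sparse-coding idea named here (not SpaEIAD itself): learn atoms from overlapping signal segments, then keep only a sparse reconstruction as the extracted impulsive part. The synthetic signal, segment length, atom count, and sparsity level are assumptions.

```python
# Dictionary learning and sparse coding to extract impulsive components.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(3)

# Synthetic signal: periodic decaying impulses buried in noise.
n, seg = 4096, 64
signal = 0.3 * rng.standard_normal(n)
for start in range(100, n - seg, 500):
    signal[start:start + seg] += np.exp(-0.1 * np.arange(seg)) * np.sin(0.9 * np.arange(seg))

# Overlapping segments serve as training samples for the dictionary.
starts = range(0, n - seg, seg // 2)
segments = np.stack([signal[i:i + seg] for i in starts])

dico = MiniBatchDictionaryLearning(n_components=32, transform_algorithm="omp",
                                   transform_n_nonzero_coefs=3, random_state=0)
codes = dico.fit(segments).transform(segments)
recon_segments = codes @ dico.components_     # sparse (redundant) reconstruction

# Average the overlapped reconstructions to obtain the extracted component.
extracted = np.zeros(n)
weight = np.zeros(n)
for k, i in enumerate(starts):
    extracted[i:i + seg] += recon_segments[k]
    weight[i:i + seg] += 1.0
extracted /= np.maximum(weight, 1.0)
print(extracted.shape)
```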

162 citations

Journal Article
01 May 2014
TL;DR: The DBCSR (Distributed Block Compressed Sparse Row) library for scalable sparse matrix–matrix multiplication and its use in the CP2K program for linear-scaling quantum-chemical calculations is presented.
Abstract: Efficient parallel multiplication of sparse matrices is key to enabling many large-scale calculations. This article presents the DBCSR (Distributed Block Compressed Sparse Row) library for scalable sparse matrix–matrix multiplication and its use in the CP2K program for linear-scaling quantum-chemical calculations. The library combines several approaches to implement sparse matrix multiplication in a way that performs well and is demonstrably scalable. Parallel communication has well-defined limits: data volume decreases as O(1/√P) with increasing process count P, and every process communicates with at most O(√P) others. Local sparse matrix multiplication is handled efficiently using a combination of techniques: blocking elements together in an application-relevant way, an autotuning library for small matrix multiplications, cache-oblivious recursive multiplication, and multithreading. Additionally, on-the-fly filtering not only increases sparsity but also avoids performing calculations that fall below the filtering threshold. We demonstrate and analyze the performance of the DBCSR library and its various scaling behaviors.
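
A single-node sketch of the block-compressed-sparse-row storage and sparse matrix–matrix product the library is built around, using SciPy's BSR format; it does not reflect DBCSR's distributed communication scheme, autotuned kernels, or threading, and the block size and filtering threshold are assumptions.

```python
# Block-CSR storage and sparse matrix-matrix multiplication with block filtering.
import numpy as np
from scipy.sparse import random as sparse_random

block = 4                                     # application-relevant block size (assumed)
A = sparse_random(64, 64, density=0.05, random_state=0).tobsr(blocksize=(block, block))
B = sparse_random(64, 64, density=0.05, random_state=1).tobsr(blocksize=(block, block))

C = (A @ B).tobsr(blocksize=(block, block))   # sparse matrix-matrix product

# Rough analog of on-the-fly filtering: drop blocks whose Frobenius norm falls
# below a threshold, which increases the sparsity of the product.
eps = 1e-3
small = np.linalg.norm(C.data, axis=(1, 2)) < eps
C.data[small] = 0.0
C.eliminate_zeros()
print(C.shape, C.nnz)
```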

162 citations


Network Information
Related Topics (5)
Feature extraction: 111.8K papers, 2.1M citations, 93% related
Image segmentation: 79.6K papers, 1.8M citations, 92% related
Convolutional neural network: 74.7K papers, 2M citations, 92% related
Deep learning: 79.8K papers, 2.1M citations, 90% related
Image processing: 229.9K papers, 3.5M citations, 89% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    193
2022    454
2021    641
2020    924
2019    1,208
2018    1,371