scispace - formally typeset
Topic

Sparse matrix

About: Sparse matrix is a research topic. Over the lifetime, 13,025 publications have been published within this topic, receiving 393,290 citations. The topic is also known as: sparse array.


Papers
Journal ArticleDOI
TL;DR: An ANSI C code for sparse LU factorization is presented that combines a column pre-ordering strategy with a right-looking unsymmetric-pattern multifrontal numerical factorization, and an upper bound on fill-in, work, and memory usage is computed.
Abstract: An ANSI C code for sparse LU factorization is presented that combines a column pre-ordering strategy with a right-looking unsymmetric-pattern multifrontal numerical factorization. The pre-ordering and symbolic analysis phase computes an upper bound on fill-in, work, and memory usage during the subsequent numerical factorization. User-callable routines are provided for ordering and analyzing a sparse matrix, computing the numerical factorization, solving a system with the LU factors, transposing and permuting a sparse matrix, and converting between sparse matrix representations. The simple user interface shields the user from the details of the complex sparse factorization data structures by returning simple handles to opaque objects. Additional user-callable routines are provided for printing and extracting the contents of these opaque objects. An even simpler way to use the package is through its MATLAB interface. UMFPACK is incorporated as a built-in operator in MATLAB 6.5 as x = A\b when A is sparse and unsymmetric.

1,469 citations
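The factor-then-solve workflow the abstract describes can be sketched with SciPy's sparse LU interface. A caveat on this sketch: SciPy's `splu` calls the SuperLU backend rather than UMFPACK (UMFPACK itself is reachable through the separate scikit-umfpack package), but the usage pattern — factor once, then solve against the stored LU factors — is analogous to MATLAB's `x = A\b`:

```python
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import splu

# Small sparse, unsymmetric system A x = b (illustrative values)
A = csc_matrix(np.array([[4.0, 1.0, 0.0],
                         [2.0, 3.0, 1.0],
                         [0.0, 1.0, 2.0]]))
b = np.array([1.0, 2.0, 3.0])

lu = splu(A)      # symbolic analysis + numeric LU factorization (SuperLU backend)
x = lu.solve(b)   # reuse the stored factors for one or many right-hand sides
```

Keeping the factor object around, as the abstract's "solving a system with the LU factors" routine suggests, amortizes the expensive factorization over repeated solves.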

Journal ArticleDOI
TL;DR: This work analyzes the behavior of ℓ1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern of a vector β* based on observations contaminated by noise, and establishes precise conditions on the problem dimension p, the number k of nonzero elements in β*, and the number of observations n.
Abstract: The problem of consistently estimating the sparsity pattern of a vector β* ∈ ℝ^p based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of ℓ1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result is to establish precise conditions on the problem dimension p, the number k of nonzero elements in β*, and the number of observations n that are necessary and sufficient for sparsity pattern recovery using the Lasso. We first analyze the case of observations made using deterministic design matrices and sub-Gaussian additive noise, and provide sufficient conditions for support recovery and ℓ∞-error bounds, as well as results showing the necessity of incoherence and bounds on the minimum value. We then turn to the case of random designs, in which each row of the design is drawn from a N(0, Σ) ensemble. For a broad class of Gaussian ensembles satisfying mutual incoherence conditions, we compute explicit values of thresholds 0 < θ_ℓ(Σ) ≤ θ_u(Σ) < +∞ with the following properties: for any δ > 0, if n > 2(θ_u + δ) k log(p − k), then the Lasso succeeds in recovering the sparsity pattern with probability converging to one for large problems, whereas for n < 2(θ_ℓ − δ) k log(p − k), the probability of successful recovery converges to zero. For the special case of the uniform Gaussian ensemble (Σ = I_{p×p}), we show that θ_ℓ = θ_u = 1, so that the threshold n = 2k log(p − k) is sharp.

1,438 citations
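The paper's sample-size regime n > 2k log(p − k) can be illustrated numerically. The sketch below solves the Lasso with ISTA (iterative soft-thresholding), a standard proximal-gradient solver not taken from the paper; the dimensions, random seed, and regularization level are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 32, 3                      # n comfortably above 2k log(p - k)
support = np.array([3, 10, 25])           # true sparsity pattern of beta*
beta = np.zeros(p)
beta[support] = 1.0
X = rng.standard_normal((n, p)) / np.sqrt(n)   # ~unit-norm Gaussian columns
y = X @ beta + 0.05 * rng.standard_normal(n)   # noisy observations

# ISTA: proximal gradient descent on (1/2)||y - Xb||^2 + lam * ||b||_1
lam = 0.1
t = 1.0 / np.linalg.norm(X, 2) ** 2       # step size 1/L with L = ||X||_2^2
b = np.zeros(p)
for _ in range(500):
    g = X.T @ (X @ b - y)                 # gradient of the quadratic term
    z = b - t * g
    b = np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)   # soft-threshold

recovered = set(np.argsort(-np.abs(b))[:k])   # k largest coefficients
```

In this well-conditioned regime the k largest Lasso coefficients land on the true support; shrinking n below the threshold makes recovery fail with growing probability, which is exactly the phase transition the paper characterizes.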

Proceedings ArticleDOI
20 Jun 2009
TL;DR: This work proposes a method based on sparse representation (SR) to cluster data drawn from multiple low-dimensional linear or affine subspaces embedded in a high-dimensional space and applies this method to the problem of segmenting multiple motions in video.
Abstract: We propose a method based on sparse representation (SR) to cluster data drawn from multiple low-dimensional linear or affine subspaces embedded in a high-dimensional space. Our method is based on the fact that each point in a union of subspaces has an SR with respect to a dictionary formed by all other data points. In general, finding such an SR is NP-hard. Our key contribution is to show that, under mild assumptions, the SR can be obtained exactly by using ℓ1 optimization. The segmentation of the data is obtained by applying spectral clustering to a similarity matrix built from this SR. Our method can handle noise, outliers, as well as missing data. We apply our subspace clustering algorithm to the problem of segmenting multiple motions in video. Experiments on 167 video sequences show that our approach significantly outperforms state-of-the-art methods.

1,411 citations
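A toy version of the pipeline — sparse self-representation of each point over the others, then spectral clustering on the resulting affinity — can be sketched in NumPy. This is an illustrative reconstruction under strong simplifying assumptions (noiseless points on two orthogonal lines, the ℓ1 problem solved by plain ISTA), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
# 10 points on each of two orthogonal lines through the origin in R^3
P = np.hstack([np.outer([1.0, 0.0, 0.0], rng.uniform(1.0, 3.0, 10)),
               np.outer([0.0, 1.0, 0.0], rng.uniform(1.0, 3.0, 10))])
P /= np.linalg.norm(P, axis=0)            # normalize each point (column)
N = P.shape[1]

def ista(D, x, lam=0.01, iters=300):
    """Solve min_c (1/2)||D c - x||^2 + lam ||c||_1 by soft-thresholding."""
    t = 1.0 / np.linalg.norm(D, 2) ** 2
    c = np.zeros(D.shape[1])
    for _ in range(iters):
        z = c - t * (D.T @ (D @ c - x))
        c = np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)
    return c

# Sparse representation of each point over all the other points
C = np.zeros((N, N))
for i in range(N):
    idx = [j for j in range(N) if j != i]
    C[idx, i] = ista(P[:, idx], P[:, i])

W = np.abs(C) + np.abs(C).T               # symmetric affinity matrix
L = np.diag(W.sum(axis=1)) - W            # graph Laplacian
_, V = np.linalg.eigh(L)
# Two disconnected blocks -> the two nullspace eigenvectors are constant
# on each cluster, so identical (rounded) rows identify the segments.
_, labels = np.unique(np.round(V[:, :2], 6), axis=0, return_inverse=True)
```

Because each point is representable by points on its own line only, the affinity is block diagonal and spectral clustering recovers the two subspaces exactly; the paper's contribution is showing when this sparse representation is still subspace-preserving under noise, outliers, and missing data.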

Journal ArticleDOI
TL;DR: This work surveys the quadratic eigenvalue problem, treating its many applications, its mathematical properties, and a variety of numerical solution techniques.
Abstract: We survey the quadratic eigenvalue problem, treating its many applications, its mathematical properties, and a variety of numerical solution techniques. Emphasis is given to exploiting both the structure of the matrices in the problem (dense, sparse, real, complex, Hermitian, skew-Hermitian) and the spectral properties of the problem. We classify numerical methods and catalogue available software.

1,369 citations
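The survey's central object, the quadratic eigenvalue problem (λ²M + λC + K)x = 0, is typically reduced to an ordinary eigenvalue problem of twice the size by linearization. A minimal NumPy sketch of the first companion linearization, assuming M is invertible (the matrices here are made-up examples, not from the survey):

```python
import numpy as np

# Quadratic eigenvalue problem: (lam^2 M + lam C + K) x = 0
M = np.eye(2)
C = np.array([[1.0, 0.2],
              [0.2, 1.0]])
K = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
n = M.shape[0]

Minv = np.linalg.inv(M)
# First companion form: A z = lam z with z = [lam*x; x]
A = np.block([[-Minv @ C, -Minv @ K],
              [np.eye(n), np.zeros((n, n))]])
lams, Z = np.linalg.eig(A)        # 2n eigenvalues of the quadratic problem
X = Z[n:, :]                      # bottom block of each z holds x

# Residual check: each eigenpair should satisfy the quadratic problem
res = [np.linalg.norm((l**2 * M + l * C + K) @ X[:, i]) / np.linalg.norm(X[:, i])
       for i, l in enumerate(lams)]
```

Substituting z = [λx; x] into A z = λ z reproduces λ²Mx + λCx + Kx = 0, which is why the 2n eigenvalues of A are exactly those of the quadratic problem; the survey discusses structure-preserving alternatives to this naive linearization.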

Book
15 Sep 2006
TL;DR: This book presents direct methods for solving sparse linear systems, covering basic algorithms, triangular solves, Cholesky, orthogonal, and LU factorization, fill-reducing orderings, and the concise CSparse software package that accompanies the text.
Abstract: Preface 1. Introduction 2. Basic algorithms 3. Solving triangular systems 4. Cholesky factorization 5. Orthogonal methods 6. LU factorization 7. Fill-reducing orderings 8. Solving sparse linear systems 9. CSparse 10. Sparse matrices in MATLAB Appendix: Basics of the C programming language Bibliography Index.

1,366 citations
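The book's early chapters build everything on the compressed sparse column (CSC) storage that CSparse uses. A minimal Python sketch of that data structure and its matrix-vector product — an illustration of the format in the spirit of CSparse's cs_gaxpy routine, not the book's actual C code:

```python
import numpy as np

# CSC storage of A = [[4, 0, 0],
#                     [1, 3, 0],
#                     [0, 1, 2]]
# For column j, the nonzeros live in positions colptr[j]:colptr[j+1].
colptr = np.array([0, 2, 4, 5])           # column pointers (length ncols + 1)
rowind = np.array([0, 1, 1, 2, 2])        # row index of each stored entry
values = np.array([4.0, 1.0, 3.0, 1.0, 2.0])

def csc_matvec(colptr, rowind, values, x):
    """Compute y = A @ x column by column over the stored nonzeros."""
    y = np.zeros(len(colptr) - 1)         # square matrix in this example
    for j in range(len(x)):
        for p in range(colptr[j], colptr[j + 1]):
            y[rowind[p]] += values[p] * x[j]
    return y

y = csc_matvec(colptr, rowind, values, np.array([1.0, 2.0, 3.0]))
```

Storing only the nonzeros plus one pointer array per column is what makes the factorization and ordering algorithms in the later chapters scale to large, mostly-zero matrices.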


Network Information
Related Topics (5)
Optimization problem
96.4K papers, 2.1M citations
86% related
Artificial neural network
207K papers, 4.5M citations
85% related
Feature extraction
111.8K papers, 2.1M citations
85% related
Convolutional neural network
74.7K papers, 2M citations
85% related
Deep learning
79.8K papers, 2.1M citations
84% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    103
2022    312
2021    595
2020    668
2019    710
2018    880