Open Access Book
Sketching as a Tool for Numerical Linear Algebra
TL;DR: A survey of linear sketching algorithms for numerical linear algebra, covering least squares and robust regression, low-rank approximation, and graph sparsification.
Abstract:
This survey highlights the recent advances in algorithms for numerical linear algebra that have come from the technique of linear sketching, whereby given a matrix, one first compresses it to a much smaller matrix by multiplying it by a (usually) random matrix with certain properties. Much of the expensive computation can then be performed on the smaller matrix, thereby accelerating the solution for the original problem. In this survey we consider least squares as well as robust regression problems, low rank approximation, and graph sparsification. We also discuss a number of variants of these problems. Finally, we discuss the limitations of sketching methods.
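The core idea in the abstract, replacing a tall n-row problem with a much smaller m-row sketched problem, can be illustrated on least squares. The following is a minimal demonstration (not code from the survey), using a dense Gaussian sketching matrix and NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall least-squares problem: minimize ||Ax - b||_2 over x.
n, d = 2000, 10
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Gaussian sketch: compress the n-row problem to m << n rows.
m = 200
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Solve the small sketched problem instead of the original one.
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

rel_err = np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact)
```

With m = 200 sketched rows in place of n = 2000, the sketched solution typically agrees with the exact one to within a few percent; the survey's subspace-embedding guarantees make this trade-off precise.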
Citations
Posted Content
Federated Learning: Strategies for Improving Communication Efficiency
Jakub Konečný, H. Brendan McMahan, Felix X. Yu, Peter Richtárik, Ananda Theertha Suresh, Dave Bacon
TL;DR: Two ways to reduce the uplink communication costs are proposed: structured updates, where the user directly learns an update from a restricted space parametrized using a smaller number of variables, e.g. either low-rank or a random mask; and sketched updates, which learn a full model update and then compress it using a combination of quantization, random rotations, and subsampling.
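As a toy illustration of the sketched-update idea above (not the paper's actual protocol), one can subsample a random mask of a model update's coordinates and rescale by the keep probability so that the reconstruction is unbiased in expectation:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical model update to be sent from a client to the server.
update = rng.standard_normal(1000)

# Random mask: keep a fraction p of the coordinates, rescale the kept
# entries by 1/p so the reconstructed update is unbiased in expectation.
p = 0.1
mask = rng.random(1000) < p
compressed = np.where(mask, update / p, 0.0)

# Only the kept coordinates (about p * 1000 of them) need transmitting.
n_kept = int(np.count_nonzero(compressed))
```

The paper combines such subsampling with quantization and random rotations; this sketch shows only the subsampling component.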
Book
High-Dimensional Statistics: A Non-Asymptotic Viewpoint
TL;DR: This book provides a self-contained introduction to the area of high-dimensional statistics, aimed at the first-year graduate level, and includes chapters that are focused on core methodology and theory - including tail bounds, concentration inequalities, uniform laws and empirical process, and random matrices.
Book
An Introduction to Matrix Concentration Inequalities
TL;DR: This monograph offers an introduction to matrix concentration inequalities, a family of results with applications in many areas of theoretical, applied, and computational mathematics, with a focus on the analysis of sums of independent random matrices.
Journal ArticleDOI
Low-Rank Approximation and Regression in Input Sparsity Time
TL;DR: A new distribution over m × n matrices S is designed so that, for any fixed n × d matrix A of rank r, with probability at least 9/10, ∥SAx∥2 = (1 ± ε)∥Ax∥2 simultaneously for all x ∈ Rd.
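The subspace-embedding guarantee ∥SAx∥2 = (1 ± ε)∥Ax∥2 can be checked empirically. The sketch below is an illustrative CountSketch-style sparse embedding (not the authors' code), where each input coordinate is hashed to one of m rows with a random sign, so applying S costs time proportional to the number of nonzeros of A:

```python
import numpy as np

rng = np.random.default_rng(1)

n, d, m = 5000, 5, 500
A = rng.standard_normal((n, d))

# Sparse embedding: each of the n input coordinates is hashed to one of
# m rows and multiplied by a random sign; S @ A is formed implicitly.
rows = rng.integers(0, m, size=n)
signs = rng.choice([-1.0, 1.0], size=n)
SA = np.zeros((m, d))
np.add.at(SA, rows, signs[:, None] * A)

# Compare ||SAx|| to ||Ax|| on random directions x.
ratios = []
for _ in range(20):
    x = rng.standard_normal(d)
    ratios.append(np.linalg.norm(SA @ x) / np.linalg.norm(A @ x))
```

For these dimensions the ratios concentrate near 1, as the embedding guarantee predicts.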
Proceedings ArticleDOI
Dimensionality Reduction for k-Means Clustering and Low Rank Approximation
TL;DR: In this paper, the authors show how to approximate a data matrix A with a much smaller sketch ~A that can be used to solve a general class of constrained k-rank approximation problems to within (1 + ε) error.
References
Journal ArticleDOI
Latent dirichlet allocation
TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Proceedings Article
Algorithms for Non-negative Matrix Factorization
Daniel D. Lee, H. Sebastian Seung
TL;DR: Two different multiplicative algorithms for non-negative matrix factorization are analyzed and one algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence.
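The multiplicative updates for the least-squares objective can be written in a few lines. The following is an illustrative NumPy sketch of the Lee–Seung updates (not the authors' reference implementation), factoring a non-negative matrix V ≈ WH:

```python
import numpy as np

rng = np.random.default_rng(2)

# Non-negative data matrix with an exact rank-k factorization V = W0 @ H0.
n, m, k = 30, 20, 4
V = rng.random((n, k)) @ rng.random((k, m))

# Random non-negative initialization.
W = rng.random((n, k))
H = rng.random((k, m))

eps = 1e-9  # guard against division by zero
for _ in range(500):
    # Lee–Seung multiplicative updates for the least-squares error:
    # each factor is scaled elementwise, so non-negativity is preserved.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because the updates are purely multiplicative, a zero entry stays zero; this is the mechanism by which the algorithm keeps both factors non-negative without projections.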
Book
Linear and nonlinear programming
David G. Luenberger, Yinyu Ye
TL;DR: This textbook covers linear and nonlinear programming, including unconstrained optimization and the convergence properties of standard and accelerated steepest descent methods.
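As a toy illustration of the steepest descent convergence theory covered in such texts (this example is not from the book), the method with exact line search on a strongly convex quadratic converges linearly at a rate governed by the condition number:

```python
import numpy as np

# Steepest descent on f(x) = 0.5 * x^T Q x - b^T x.
Q = np.diag([1.0, 10.0])       # condition number kappa = 10
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(Q, b)  # exact minimizer

x = np.zeros(2)
for _ in range(100):
    g = Q @ x - b                       # gradient of f at x
    alpha = (g @ g) / (g @ Q @ g)       # exact line search for a quadratic
    x -= alpha * g

err = np.linalg.norm(x - x_star)
```

The classical bound contracts the error by a factor of about (kappa - 1)/(kappa + 1) per step, so with kappa = 10 the iterate is essentially at the minimizer after 100 steps.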