Book

Handbook Of Linear Algebra

01 Jan 2010
TL;DR: This handbook collects expert-written survey chapters spanning core linear algebra, combinatorial matrix theory, numerical linear algebra (including eigenvalue and singular value computation), applications of linear algebra in other disciplines, and computational software.
Abstract: Contents:

Linear Algebra: Vectors, Matrices, and Systems of Linear Equations (Jane Day); Linear Independence, Span, and Bases (Mark Mills); Linear Transformations (Francesco Barioli); Determinants and Eigenvalues (Luz M. DeAlba); Inner Product Spaces, Orthogonal Projection, Least Squares, and Singular Value Decomposition (Lixing Han and Michael Neumann); Canonical Forms (Leslie Hogben); Other Canonical Forms (Roger A. Horn and Vladimir V. Sergeichuk); Unitary Similarity, Normal Matrices, and Spectral Theory (Helene Shapiro); Hermitian and Positive Definite Matrices (Wayne Barrett); Nonnegative and Stochastic Matrices (Uriel G. Rothblum); Partitioned Matrices (Robert Reams).

Topics in Linear Algebra: Schur Complements (Roger A. Horn and Fuzhen Zhang); Quadratic, Bilinear, and Sesquilinear Forms (Raphael Loewy); Multilinear Algebra (J.A. Dias da Silva and Armando Machado); Tensors and Hypermatrices (Lek-Heng Lim); Matrix Equalities and Inequalities (Michael Tsatsomeros); Functions of Matrices (Nicholas J. Higham); Matrix Polynomials (Jörg Liesen and Christian Mehl); Matrix Equations (Beatrice Meini); Invariant Subspaces (G.W. Stewart); Matrix Perturbation Theory (Ren-Cang Li); Special Types of Matrices (Albrecht Böttcher and Ilya Spitkovsky); Pseudospectra (Mark Embree); Singular Values and Singular Value Inequalities (Roy Mathias); Numerical Range (Chi-Kwong Li); Matrix Stability and Inertia (Daniel Hershkowitz); Generalized Inverses of Matrices (Yimin Wei); Inverse Eigenvalue Problems (Alberto Borobia); Totally Positive and Totally Nonnegative Matrices (Shaun M. Fallat); Linear Preserver Problems (Peter Šemrl); Matrices over Finite Fields (J.D. Botha); Matrices over Integral Domains (Shmuel Friedland); Similarity of Families of Matrices (Shmuel Friedland); Representations of Quivers and Mixed Graphs (Roger A. Horn and Vladimir V. Sergeichuk); Max-Plus Algebra (Marianne Akian, Ravindra Bapat, and Stéphane Gaubert); Matrices Leaving a Cone Invariant (Bit-Shun Tam and Hans Schneider); Spectral Sets (Catalin Badea and Bernhard Beckermann).

Combinatorial Matrix Theory and Graphs
Combinatorial Matrix Theory: Combinatorial Matrix Theory (Richard A. Brualdi); Matrices and Graphs (Willem H. Haemers); Digraphs and Matrices (Jeffrey L. Stuart); Bipartite Graphs and Matrices (Bryan L. Shader); Sign Pattern Matrices (Frank J. Hall and Zhongshan Li).
Topics in Combinatorial Matrix Theory: Permanents (Ian M. Wanless); D-Optimal Matrices (Michael G. Neubauer and William Watkins); Tournaments (T.S. Michael); Minimum Rank, Maximum Nullity, and Zero Forcing Number of Graphs (Shaun M. Fallat and Leslie Hogben); Spectral Graph Theory (Steve Butler and Fan Chung); Algebraic Connectivity (Steve Kirkland); Matrix Completion Problems (Luz M. DeAlba, Leslie Hogben, and Amy Wangsness Wehe).

Numerical Methods
Numerical Methods for Linear Systems: Vector and Matrix Norms, Error Analysis, Efficiency, and Stability (Ralph Byers and Biswa Nath Datta); Matrix Factorizations and Direct Solution of Linear Systems (Christopher Beattie); Least Squares Solution of Linear Systems (Per Christian Hansen and Hans Bruun Nielsen); Sparse Matrix Methods (Esmond G. Ng); Iterative Solution Methods for Linear Systems (Anne Greenbaum).
Numerical Methods for Eigenvalues: Symmetric Matrix Eigenvalue Techniques (Ivan Slapničar); Unsymmetric Matrix Eigenvalue Techniques (David S. Watkins); The Implicitly Restarted Arnoldi Method (D.C. Sorensen); Computation of the Singular Value Decomposition (Alan Kaylor Cline and Inderjit S. Dhillon); Computing Eigenvalues and Singular Values to High Relative Accuracy (Zlatko Drmač); Nonlinear Eigenvalue Problems (Heinrich Voss).
Topics in Numerical Linear Algebra: Fast Matrix Multiplication (Dario A. Bini); Fast Algorithms for Structured Matrix Computations (Michael Stewart); Structured Eigenvalue Problems: Structure-Preserving Algorithms, Structured Error Analysis (Heike Fassbender); Large-Scale Matrix Computations (Roland W. Freund).

Linear Algebra in Other Disciplines
Applications to Physical and Biological Sciences: Linear Algebra and Mathematical Physics (Lorenzo Sadun); Linear Algebra in Biomolecular Modeling (Zhijun Wu); Linear Algebra in Mathematical Population Biology and Epidemiology (Fred Brauer and Carlos Castillo-Chavez).
Applications to Optimization: Linear Programming (Leonid N. Vaserstein); Semidefinite Programming (Henry Wolkowicz).
Applications to Probability and Statistics: Random Vectors and Linear Statistical Models (Simo Puntanen and George P.H. Styan); Multivariate Statistical Analysis (Simo Puntanen, George A.F. Seber, and George P.H. Styan); Markov Chains (Beatrice Meini).
Applications to Computer Science: Coding Theory (Joachim Rosenthal and Paul Weiner); Quantum Computation (Zijian Diao); Operator Quantum Error Correction (Chi-Kwong Li, Yiu-Tung Poon, and Nung-Sing Sze); Information Retrieval and Web Search (Amy N. Langville and Carl D. Meyer); Signal Processing (Michael Stewart).
Applications to Analysis: Differential Equations and Stability (Volker Mehrmann and Tatjana Stykel); Dynamical Systems and Linear Algebra (Fritz Colonius and Wolfgang Kliemann); Control Theory (Peter Benner); Fourier Analysis (Kenneth Howell).
Applications to Geometry: Geometry (Mark Hunacek); Some Applications of Matrices and Graphs in Euclidean Geometry (Miroslav Fiedler).
Applications to Algebra: Matrix Groups (Peter J. Cameron); Group Representations (Randall R. Holmes and Tin-Yau Tam); Nonassociative Algebras (Murray R. Bremner, Lúcia I. Murakami, and Ivan P. Shestakov); Lie Algebras (Robert Wilson).

Computational Software
Interactive Software for Linear Algebra: MATLAB (Steven J. Leon); Linear Algebra in Maple (David J. Jeffrey and Robert M. Corless); Mathematica (Heikki Ruskeepää); Sage (Robert A. Beezer, Robert Bradshaw, Jason Grout, and William Stein).
Packages of Subroutines for Linear Algebra: BLAS (Jack Dongarra, Victor Eijkhout, and Julien Langou); LAPACK (Zhaojun Bai, James Demmel, Jack Dongarra, Julien Langou, and Jenny Wang); Use of ARPACK and EIGS (D.C. Sorensen); Summary of Software for Linear Algebra Freely Available on the Web (Jack Dongarra, Victor Eijkhout, and Julien Langou).

Glossary; Notation Index; Index.
Citations
Journal ArticleDOI
TL;DR: This Letter interprets the process of encoding inputs in a quantum state as a nonlinear feature map that maps data to quantum Hilbert space and shows how it opens up a new avenue for the design of quantum machine learning algorithms.
Abstract: A basic idea of quantum computing is surprisingly similar to that of kernel methods in machine learning, namely, to efficiently perform computations in an intractably large Hilbert space. In this Letter we explore some theoretical foundations of this link and show how it opens up a new avenue for the design of quantum machine learning algorithms. We interpret the process of encoding inputs in a quantum state as a nonlinear feature map that maps data to quantum Hilbert space. A quantum computer can now analyze the input data in this feature space. Based on this link, we discuss two approaches for building a quantum model for classification. In the first approach, the quantum device estimates inner products of quantum states to compute a classically intractable kernel. The kernel can be fed into any classical kernel method such as a support vector machine. In the second approach, we use a variational quantum circuit as a linear model that classifies data explicitly in Hilbert space. We illustrate these ideas with a feature map based on squeezing in a continuous-variable system, and visualize the working principle with two-dimensional minibenchmark datasets.
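The kernel construction in the first approach can be simulated classically for small examples. The sketch below is illustrative only: it uses a toy product-of-angles feature map (a stand-in for quantum state preparation, not the squeezing map from the Letter) and builds the kernel K[i, j] = |⟨φ(x_i)|φ(x_j)⟩|², the state overlap a quantum device would estimate:

```python
import numpy as np

def feature_map(x):
    # Toy nonlinear feature map standing in for quantum state
    # preparation: |phi(x)> as a normalized amplitude vector.
    phi = np.array([np.cos(x[0]) * np.cos(x[1]),
                    np.cos(x[0]) * np.sin(x[1]),
                    np.sin(x[0]) * np.cos(x[1]),
                    np.sin(x[0]) * np.sin(x[1])])
    return phi / np.linalg.norm(phi)

def quantum_kernel(X):
    # K[i, j] = |<phi(x_i)|phi(x_j)>|^2, the overlap a quantum
    # device would estimate; here we compute it classically.
    Phi = np.array([feature_map(x) for x in X])
    return np.abs(Phi @ Phi.T) ** 2

X = np.array([[0.1, 0.5], [1.2, -0.3], [-0.7, 0.9]])
K = quantum_kernel(X)
# K is symmetric positive semidefinite with unit diagonal, so it can
# be fed to any classical kernel method, e.g. an SVM that accepts a
# precomputed Gram matrix.
```

Positive semidefiniteness follows because K is the entrywise square of the PSD Gram matrix of the feature vectors, and a Schur product of PSD matrices is PSD.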

852 citations

Book
27 Nov 2007
TL;DR: An essential, one-of-a-kind book for graduate-level courses in advanced statistical studies, including linear and nonlinear models, multivariate analysis, and statistical computing; it also serves as an excellent self-study guide for statistical researchers.
Abstract: A comprehensive, must-have handbook of matrix methods with a unique emphasis on statistical applications. This timely book, A Matrix Handbook for Statisticians, provides a comprehensive, encyclopedic treatment of matrices as they relate to both statistical concepts and methodologies. Written by an experienced authority on matrices and statistical theory, this handbook is organized by topic rather than by mathematical development and includes numerous references to both the theory behind the methods and the applications of the methods. A uniform approach is applied to each chapter, which contains four parts: a definition followed by a list of results; a short list of references to related topics in the book; one or more references to proofs; and references to applications. The use of extensive cross-referencing to topics within the book and external referencing to proofs allows definitions to be located easily and interrelationships among subject areas to be recognized.

A Matrix Handbook for Statisticians addresses the need for matrix theory topics to be presented together in one book and features a collection of topics not found elsewhere under one cover. These topics include:

* Complex matrices
* A wide range of special matrices and their properties
* Special products and operators, such as the Kronecker product
* Partitioned and patterned matrices
* Matrix analysis and approximation
* Matrix optimization
* Majorization
* Random vectors and matrices
* Inequalities, such as probabilistic inequalities

Additional topics, such as rank, eigenvalues, determinants, norms, generalized inverses, linear and quadratic equations, differentiation, and Jacobians, are also included. The book assumes a fundamental knowledge of vectors and matrices, maintains a reasonable level of abstraction when appropriate, and provides a comprehensive compendium of linear algebra results with use or potential use in statistics.
A Matrix Handbook for Statisticians is an essential, one-of-a-kind book for graduate-level courses in advanced statistical studies including linear and nonlinear models, multivariate analysis, and statistical computing. It also serves as an excellent self-study guide for statistical researchers.
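To illustrate one of the listed topics, here is a minimal NumPy sketch of two standard Kronecker-product identities that underpin many results on linear statistical models; the matrices are random and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C, D = (rng.standard_normal((2, 2)) for _ in range(4))

# Mixed-product property: (A kron B)(C kron D) = (AC) kron (BD)
lhs = np.kron(A, B) @ np.kron(C, D)
rhs = np.kron(A @ C, B @ D)
assert np.allclose(lhs, rhs)

# Vectorization identity: vec(A X B) = (B^T kron A) vec(X),
# where vec stacks columns (Fortran order).
X = rng.standard_normal((2, 2))
vec = lambda M: M.flatten(order="F")
assert np.allclose(vec(A @ X @ B), np.kron(B.T, A) @ vec(X))
```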

502 citations

Journal ArticleDOI
TL;DR: The aim is to provide an overview of the major algorithmic developments that have taken place over the past few decades in the numerical solution of this and related problems, which are producing reliable numerical tools in the formulation and solution of advanced mathematical models in engineering and scientific computing.
Abstract: Given the square matrices $A, B, D, E$ and the matrix $C$ of conforming dimensions, we consider the linear matrix equation $A{\mathbf X} E+D{\mathbf X} B = C$ in the unknown matrix ${\mathbf X}$. Our aim is to provide an overview of the major algorithmic developments that have taken place over the past few decades in the numerical solution of this and related problems, which are producing reliable numerical tools in the formulation and solution of advanced mathematical models in engineering and scientific computing.
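For small square systems, the vectorization identity vec(AXE) = (Eᵀ ⊗ A) vec(X) reduces this matrix equation to an ordinary n² × n² linear system. The sketch below (a naive baseline, not one of the survey's scalable algorithms, which avoid forming the Kronecker product) assumes all matrices are n × n:

```python
import numpy as np

def solve_axe_dxb(A, E, D, B, C):
    """Solve A X E + D X B = C by Kronecker vectorization.

    vec(A X E) = (E^T kron A) vec(X) and vec(D X B) = (B^T kron D) vec(X),
    so vec(X) solves a single n^2 x n^2 linear system.  Fine for small n;
    large-scale problems need specialized methods.
    """
    n = A.shape[0]
    M = np.kron(E.T, A) + np.kron(B.T, D)
    x = np.linalg.solve(M, C.flatten(order="F"))  # column-stacked vec(C)
    return x.reshape((n, n), order="F")

# Build a random solvable instance and recover the known solution.
rng = np.random.default_rng(1)
A, B, D, E = (rng.standard_normal((3, 3)) for _ in range(4))
X_true = rng.standard_normal((3, 3))
C = A @ X_true @ E + D @ X_true @ B
X = solve_axe_dxb(A, E, D, B, C)
```

This costs O(n⁶) and is exactly what the specialized Sylvester-type solvers discussed in the survey are designed to avoid.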

451 citations

Journal ArticleDOI
TL;DR: This paper provides an overview of modern techniques for exploiting low-rank structure to perform matrix recovery in these settings, providing a survey of recent advances in this rapidly-developing field.
Abstract: Low-rank matrices play a fundamental role in modeling and computational methods for signal processing and machine learning. In many applications where low-rank matrices arise, these matrices cannot be fully sampled or directly observed, and one encounters the problem of recovering the matrix given only incomplete and indirect observations. This paper provides an overview of modern techniques for exploiting low-rank structure to perform matrix recovery in these settings, providing a survey of recent advances in this rapidly-developing field. Specific attention is paid to the algorithms most commonly used in practice, the existing theoretical guarantees for these algorithms, and representative practical applications of these techniques.
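A minimal sketch of one common recovery strategy for matrix completion: alternate a truncated-SVD rank projection with re-imposing the observed entries. The function name, masking model, and iteration count are illustrative choices, not a specific algorithm from the survey:

```python
import numpy as np

def complete_matrix(M_obs, mask, rank, n_iters=200):
    """Recover a low-rank matrix from a subset of its entries.

    Simple alternating-projection scheme: project onto rank-`rank`
    matrices via truncated SVD, then restore the observed entries.
    """
    X = np.where(mask, M_obs, 0.0)
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # best rank-r approx
        X = np.where(mask, M_obs, X)               # keep observed entries
    return X

rng = np.random.default_rng(2)
L, R = rng.standard_normal((8, 2)), rng.standard_normal((2, 8))
M = L @ R                              # rank-2 ground truth
mask = rng.random(M.shape) < 0.7       # observe ~70% of the entries
X = complete_matrix(np.where(mask, M, 0.0), mask, rank=2)
```

Algorithms of this flavor come with the incoherence-based recovery guarantees that the survey reviews; in practice convex (nuclear-norm) and nonconvex (factored) formulations are both widely used.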

424 citations

Journal ArticleDOI
TL;DR: This work explores dimension reduction techniques as one of the emerging approaches for data integration, and how these can be applied to increase the understanding of biological systems in normal physiological function and disease.
Abstract: State-of-the-art next-generation sequencing, transcriptomics, proteomics and other high-throughput 'omics' technologies enable the efficient generation of large experimental data sets. These data may yield unprecedented knowledge about molecular pathways in cells and their role in disease. Dimension reduction approaches have been widely used in exploratory analysis of single omics data sets. This review will focus on dimension reduction approaches for simultaneous exploratory analyses of multiple data sets. These methods extract the linear relationships that best explain the correlated structure across data sets and the variability both within and between variables (or observations), and may highlight data issues such as batch effects or outliers. We explore dimension reduction techniques as one of the emerging approaches for data integration, and how these can be applied to increase our understanding of biological systems in normal physiological function and disease.
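As a toy illustration of the idea (a bare-bones stand-in, not any specific method from the review such as MFA or canonical correlation), the sketch below centers and scales two simulated omics blocks measured on the same samples, concatenates them, and extracts shared low-dimensional sample scores with an SVD:

```python
import numpy as np

def joint_pca(blocks, n_components=2):
    """Exploratory joint dimension reduction of several data sets
    sharing the same samples (rows): scale each block, concatenate,
    and factor the combined matrix by SVD."""
    scaled = []
    for Xb in blocks:
        Z = Xb - Xb.mean(axis=0)           # center each variable
        Z = Z / (Z.std(axis=0) + 1e-12)    # scale; guard against 0 std
        scaled.append(Z)
    Z = np.hstack(scaled)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U[:, :n_components] * s[:n_components]  # sample embedding

# Two synthetic "omics" blocks driven by the same 2-D latent signal.
rng = np.random.default_rng(3)
latent = rng.standard_normal((50, 2))             # shared biology
omics1 = latent @ rng.standard_normal((2, 100))   # e.g. transcriptomics
omics2 = latent @ rng.standard_normal((2, 40))    # e.g. proteomics
scores = joint_pca([omics1, omics2])
```

Plotting the resulting sample scores is the typical first exploratory step, and is where batch effects or outliers tend to become visible.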

280 citations