
Matrix differential equation

About: Matrix differential equation is a research topic. Over its lifetime, 3,219 publications on this topic have been published, receiving 61,794 citations.


Papers
Journal ArticleDOI
TL;DR: In this paper, it was shown that the statistical properties of low-lying eigenvalues of the Dirac operator can be described by a random matrix theory with the global symmetries of the QCD partition function.
Abstract: Random matrix theory is a powerful way to describe universal correlations of eigenvalues of complex systems. It also may serve as a schematic model for disorder in quantum systems. In this review, we discuss both types of applications of chiral random matrix theory to the QCD partition function. We show that constraints imposed by chiral symmetry and its spontaneous breaking determine the structure of low-energy effective partition functions for the Dirac spectrum. We thus derive exact results for the low-lying eigenvalues of the QCD Dirac operator. We argue that the statistical properties of these eigenvalues are universal and can be described by a random matrix theory with the global symmetries of the QCD partition function. The total number of such eigenvalues increases with the square root of the Euclidean four-volume. The spectral density for larger eigenvalues (but still well below a typical hadronic mass scale) also follows from the same low-energy effective partition function. The valid...
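As a minimal illustration of the universal eigenvalue statistics the abstract refers to, the following numpy sketch samples a plain Gaussian Unitary Ensemble (GUE) matrix — not the chiral ensembles of the paper — and checks that its suitably scaled eigenvalues concentrate on the Wigner semicircle support [-2, 2]. The matrix size and scaling convention here are illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustration of random-matrix eigenvalue universality with a plain GUE
# matrix (NOT the chiral ensembles of the paper): eigenvalues of a large
# Hermitian matrix with Gaussian entries fill the Wigner semicircle [-2, 2].
rng = np.random.default_rng(0)
N = 300
A = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
H = (A + A.conj().T) / 2               # Hermitian; Gaussian real/imag parts
evals = np.linalg.eigvalsh(H) / np.sqrt(N)   # semicircle scaling
print(evals.min(), evals.max())        # close to -2 and +2 for large N
```

For this convention (diagonal variance 1, off-diagonal complex variance 1), the empirical spectral density converges to the semicircle law on [-2, 2] as N grows.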

316 citations

Book
01 Jan 1971
TL;DR: A particular computation algorithm for the method without reorthogonalization is shown to have remarkably good error properties, and this suggests that this variant of the Lanczos process is likely to become an extremely useful algorithm for finding several extreme eigenvalues, and their eigenvectors if needed, of very large sparse symmetric matrices.
Abstract: Several methods are available for computing eigenvalues and eigenvectors of large sparse matrices, but as yet no outstandingly good algorithm is generally known. For the symmetric matrix case one of the most elegant algorithms theoretically is the method of minimized iterations developed by Lanczos in 1950. This method reduces the original matrix to tri-diagonal form from which the eigensystem can easily be found. The method can be used iteratively, and here the convergence properties and different possible eigenvalue intervals are first considered assuming infinite precision computation. Next rounding error analyses are given for the method both with and without re-orthogonalization. It is shown that the method has been unjustly neglected, in fact a particular computation algorithm for the method without reorthogonalization is shown to have remarkably good error properties. As well as this the algorithm is very fast and can be programmed to require very little store compared with other comparable methods, and this suggests that this variant of the Lanczos process is likely to become an extremely useful algorithm for finding several extreme eigenvalues, and their eigenvectors if needed, of very large sparse symmetric matrices.
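The Lanczos reduction described above can be sketched in a few lines of numpy. This is a generic textbook version of the process without reorthogonalization; the function name, the test matrix, and the breakdown tolerance are illustrative choices, not taken from the book.

```python
import numpy as np

def lanczos(A, v0, k):
    """Reduce symmetric A to a k-by-k tridiagonal matrix T via the Lanczos
    process (no reorthogonalization). Extreme eigenvalues of T approximate
    extreme eigenvalues of A."""
    n = A.shape[0]
    Q = np.zeros((n, k))           # Lanczos vectors (columns)
    alpha = np.zeros(k)            # diagonal of T
    beta = np.zeros(k)             # off-diagonal of T
    q = v0 / np.linalg.norm(v0)
    q_prev = np.zeros(n)
    b = 0.0
    for j in range(k):
        Q[:, j] = q
        w = A @ q - b * q_prev     # three-term recurrence
        alpha[j] = q @ w
        w -= alpha[j] * q
        b = np.linalg.norm(w)
        beta[j] = b
        if b < 1e-12:              # breakdown: invariant subspace found
            break
        q_prev, q = q, w / b
    T = np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)
    return T, Q

# Demo: a matrix with one well-separated top eigenvalue (2.0); the largest
# Ritz value converges to it in far fewer iterations than the matrix size.
rng = np.random.default_rng(0)
d = np.linspace(0.0, 1.0, 100)
d[-1] = 2.0
T, Q = lanczos(np.diag(d), rng.standard_normal(100), 20)
print(np.linalg.eigvalsh(T).max())   # approximates the extreme eigenvalue 2.0
```

This rapid convergence to extreme eigenvalues at small k, with only matrix-vector products and O(n) extra storage, is exactly the property the abstract highlights for very large sparse symmetric matrices.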

310 citations

Journal ArticleDOI
TL;DR: In this paper, an implicit eigenvalue equation may be transformed into an ordinary eigenvalue problem by generalizing the Lagrange formula to operators, and a method is given to build a constant operator h which has the same eigenvalues and eigenvectors as the original equation.

297 citations

Journal ArticleDOI
TL;DR: In this paper, a numerical method for computing all eigenvalues (and the corresponding eigenvectors) of a nonlinear holomorphic eigenvalue problem that lie within a given contour in the complex plane is proposed.

284 citations

Journal ArticleDOI
TL;DR: This work proposes to use a fundamental result in random matrix theory, the Marcenko–Pastur equation, to better estimate the eigenvalues of large-dimensional covariance matrices, and suggests a change of viewpoint for estimating high-dimensional vectors: rather than estimating the vectors directly, estimate a probability measure that describes them.
Abstract: Estimating the eigenvalues of a population covariance matrix from a sample covariance matrix is a problem of fundamental importance in multivariate statistics; the eigenvalues of covariance matrices play a key role in many widely used techniques, in particular in principal component analysis (PCA). In many modern data analysis problems, statisticians are faced with large datasets where the sample size, n, is of the same order of magnitude as the number of variables p. Random matrix theory predicts that in this context, the eigenvalues of the sample covariance matrix are not good estimators of the eigenvalues of the population covariance. We propose to use a fundamental result in random matrix theory, the Marcenko–Pastur equation, to better estimate the eigenvalues of large dimensional covariance matrices. The Marcenko–Pastur equation holds in very wide generality and under weak assumptions. The estimator we obtain can be thought of as “shrinking” in a nonlinear fashion the eigenvalues of the sample covariance matrix to estimate the population eigenvalues. Inspired by ideas of random matrix theory, we also suggest a change of point of view when thinking about estimation of high-dimensional vectors: we do not try to estimate directly the vectors but rather a probability measure that describes them. We think this is a theoretically more fruitful way to think about these problems. Our estimator gives fast and good or very good results in extended simulations. Our algorithmic approach is based on convex optimization. We also show that the proposed estimator is consistent.
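A quick numpy demonstration of the phenomenon the abstract describes — this is a toy experiment with an identity population covariance, not the paper's nonlinear shrinkage estimator. When the number of variables p is comparable to the sample size n, the sample covariance eigenvalues spread over the Marcenko–Pastur support instead of concentrating at the true value 1. The sizes below are illustrative assumptions.

```python
import numpy as np

# Toy experiment (not the paper's estimator): with p/n non-negligible, sample
# covariance eigenvalues are poor estimates of the population eigenvalues.
rng = np.random.default_rng(1)
n, p = 400, 200                        # aspect ratio c = p/n = 0.5
X = rng.standard_normal((n, p))        # population covariance = identity
S = X.T @ X / n                        # sample covariance matrix
evals = np.linalg.eigvalsh(S)

c = p / n
# Marcenko-Pastur support for an identity population covariance:
lo, hi = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
print(evals.min(), evals.max())        # spread over roughly [lo, hi], not near 1
```

Every population eigenvalue here equals 1, yet the sample eigenvalues range over roughly [0.09, 2.91] for c = 0.5 — the dispersion that the paper's Marcenko–Pastur-based estimator is designed to undo.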

278 citations


Network Information
Related Topics (5)
Differential equation: 88K papers, 2M citations, 87% related
Partial differential equation: 70.8K papers, 1.6M citations, 85% related
Linear system: 59.5K papers, 1.4M citations, 84% related
Bounded function: 77.2K papers, 1.3M citations, 83% related
Matrix (mathematics): 105.5K papers, 1.9M citations, 83% related
Performance Metrics
Number of papers in the topic in previous years:
Year   Papers
2023   4
2022   13
2021   11
2020   22
2019   10
2018   30