Topic

Square matrix

About: Square matrix is a research topic. Over its lifetime, 5,000 publications have been published within this topic, receiving 92,428 citations.


Papers
Journal ArticleDOI
TL;DR: In this article, it is shown that a weak version of the Hawkins-Simon condition is satisfied by any real square matrix which is inverse-positive after a suitable permutation of columns or rows, and one more characterization of inverse-positive matrices is given concerning the Le Chatelier-Braun principle.
Abstract: Dedicated to the late Professors David Hawkins and Hukukane Nikaido. It is shown that (a weak version of) the Hawkins-Simon condition is satisfied by any real square matrix which is inverse-positive after a suitable permutation of columns or rows. One more characterization of inverse-positive matrices is given concerning the Le Chatelier-Braun principle. The proofs are all simple and elementary.

97 citations
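The two notions in this abstract are easy to check numerically. Below is a minimal sketch (not the paper's argument): the classical Hawkins-Simon condition requires every leading principal minor of the matrix to be positive, and inverse-positivity means the inverse exists and has no negative entries. The function names and the Leontief-style example are illustrative.

```python
import numpy as np

def leading_principal_minors(A):
    """Return det(A[:k, :k]) for k = 1..n."""
    n = A.shape[0]
    return [np.linalg.det(A[:k, :k]) for k in range(1, n + 1)]

def satisfies_hawkins_simon(A, tol=1e-12):
    """Classical Hawkins-Simon condition: all leading principal minors are positive."""
    return all(m > tol for m in leading_principal_minors(A))

def is_inverse_positive(A, tol=1e-12):
    """Inverse-positive: A is nonsingular and every entry of its inverse is >= 0."""
    try:
        inv = np.linalg.inv(A)
    except np.linalg.LinAlgError:
        return False
    return bool(np.all(inv >= -tol))

# Illustrative example: a Leontief-type matrix I - C for a productive consumption matrix C.
C = np.array([[0.2, 0.3],
              [0.4, 0.1]])
A = np.eye(2) - C
print(satisfies_hawkins_simon(A))  # True
print(is_inverse_positive(A))      # True
```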

Journal ArticleDOI
01 Oct 1958
TL;DR: In this paper, it was shown that if a real square matrix P fulfils certain rather general conditions then there exists a real diagonal matrix D such that the characteristic equation of DP is stable and furthermore, aperiodic.
Abstract: In this note we show that if a real square matrix P fulfils certain rather general conditions then there exists a real diagonal matrix D such that the characteristic equation of DP is stable and, furthermore, aperiodic. (A characteristic equation is called stable if the real parts of its roots are all negative. If the roots are all real and simple the equation is said to be aperiodic; see Fuller(3).)

96 citations
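The abstract's definitions translate directly into a numerical check. The sketch below does not construct the diagonal matrix D (that is the paper's contribution); it only tests, for a given diagonal D and matrix P, whether the characteristic equation of DP is stable (all roots have negative real parts) and aperiodic (all roots real and simple). The example matrices are hypothetical.

```python
import numpy as np

def is_stable_and_aperiodic(D, P, tol=1e-9):
    """Check the characteristic equation of DP in the sense of the abstract:
    stable    -> every root has negative real part,
    aperiodic -> every root is real and simple (checked numerically)."""
    roots = np.linalg.eigvals(D @ P)
    stable = bool(np.all(roots.real < 0))
    real = np.all(np.abs(roots.imag) < tol)
    sorted_re = np.sort(roots.real)
    simple = np.all(np.diff(sorted_re) > tol) if roots.size > 1 else True
    return stable, bool(real and simple)

# Hypothetical P and a trial diagonal scaling D.
P = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
D = np.diag([1.0, 2.0])
print(is_stable_and_aperiodic(D, P))  # (True, True) for this example
```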

Posted Content
TL;DR: In this article, the authors studied the empirical measure of the eigenvalues of non-normal square matrices of the form A_n = U_n D_n V_n, with U_n and V_n independent and Haar distributed on the unitary group and D_n real diagonal.
Abstract: We study the empirical measure $L_{A_n}$ of the eigenvalues of non-normal square matrices of the form $A_n=U_nD_nV_n$, with $U_n,V_n$ independent Haar distributed on the unitary group and $D_n$ real diagonal. We show that when the empirical measure of the eigenvalues of $D_n$ converges, and $D_n$ satisfies some technical conditions, $L_{A_n}$ converges towards a rotationally invariant measure on the complex plane whose support is a single ring. In particular, we provide a complete proof of the Feinberg-Zee single ring theorem [FZ]. We also consider the case where $U_n,V_n$ are independent Haar distributed on the orthogonal group.

96 citations
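The single-ring phenomenon can be observed empirically. The sketch below is an illustration, not part of the proof: it samples A_n = U_n D_n V_n with U_n, V_n Haar distributed on the unitary group (via QR of a complex Gaussian matrix, with the usual phase correction) and a diagonal D_n whose entries are drawn uniformly from [1, 2] purely as an example, then computes the eigenvalue moduli, which concentrate in an annulus away from the origin.

```python
import numpy as np

def haar_unitary(n, rng):
    """Sample an n x n Haar-distributed unitary via QR of a complex Gaussian matrix."""
    Z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    Q, R = np.linalg.qr(Z)
    # Normalize the phases of diag(R) so that Q is exactly Haar distributed.
    d = np.diag(R)
    return Q * (d / np.abs(d))

rng = np.random.default_rng(0)
n = 500
U, V = haar_unitary(n, rng), haar_unitary(n, rng)
D = np.diag(rng.uniform(1.0, 2.0, size=n))  # example choice of the real diagonal D_n
A = U @ D @ V
eig = np.linalg.eigvals(A)
# The eigenvalue moduli stay inside a single ring: both bounds are well away from 0.
print(np.min(np.abs(eig)), np.max(np.abs(eig)))
```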

Journal ArticleDOI
TL;DR: In this paper, the authors present a new algorithm for the calculation of the eigenvalues of real square matrices of orders up to 100, which is directly applicable to complex matrices as well.
Abstract: We present a new algorithm for the calculation of the eigenvalues of real square matrices of orders up to 100. The basic method is directly applicable to complex matrices as well and, in both cases, with each eigenvalue λ of A a vector v is produced for which (A - λI)v is null except for a small last element. This vector is not always an approximation to the eigenvector for λ, and this algorithm claims only to find eigenvalues. The main concern has been to use only single-precision arithmetic, although the effect of using an accumulated inner-product procedure in one part of the program is shown in the results in Section 15. The method consists of two parts. First, the given matrix A is reduced to almost triangular (Hessenberg) form H by elementary similarity transformations. Direct reduction of H to sparser forms requires extra precision in practice and even then is not without difficulties, so the second stage is the iterative search for the eigenvalues of H. A natural extension of Hyman's method [13] may be used to evaluate p(z) = det(H - zI) and any number of derivatives in an accurate and stable way. However, each evaluation requires approximately n² real multiplications and n² real additions for an n × n matrix and complex z, so the viability of this approach depends on finding each eigenvalue with a small number of evaluations. Results so far with the method developed here indicate an average of just less than 9 evaluations (3 iterations) per eigenvalue on a wide variety of matrices of orders from 8 to 100. The iterations of Muller, Newton, and Bairstow converge quickly once a fair approximation to an eigenvalue has been found, but they do not seem so satisfactory at the beginning of a search. Laguerre's method [2], [5], [6] was designed for polynomials with real zeros, and when these are distinct it gives strong convergence from any starting value. The method can be extended to the complex plane; convergence is then no longer certain for every starting value, but in practice the complex iteration seems as powerful as the real one on all examples so far considered. One Laguerre step requires more calculation than one step of any of the methods mentioned above, but when there are no a priori approximations to the zeros available, the reduction in the number of iterations with Laguerre more than compensates for the extra calculation at each step. In addition, when an eigenvalue has been found there is enough information available to take one Newton step towards the next eigenvalue. This paper is mainly a detailed discussion of the practical application of the method and of techniques for keeping the number of iterations to a minimum. A description of the program is given in ALGOL 60 [1], together with some results obtained.

96 citations
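As a rough sketch of the two ingredients described in this abstract (a reconstruction from the text, not the paper's ALGOL 60 program), the code below reduces a matrix to Hessenberg form and then evaluates p(z) = det(H - zI) at a complex point with Hyman's recurrence, which costs on the order of n² operations per evaluation. It assumes H is unreduced (nonzero subdiagonal).

```python
import numpy as np
from scipy.linalg import hessenberg

def hyman_det(H, z):
    """Evaluate p(z) = det(H - zI) for an unreduced upper Hessenberg H by Hyman's method.

    Set x[n-1] = 1, use rows n..2 of (H - zI)x = alpha*e_1 to back-solve the remaining
    entries of x, read alpha off row 1, and then
    det(H - zI) = (-1)**(n-1) * (product of subdiagonal entries) * alpha.
    """
    n = H.shape[0]
    x = np.zeros(n, dtype=complex)
    x[n - 1] = 1.0
    for i in range(n - 1, 0, -1):
        s = H[i, i:] @ x[i:] - z * x[i]      # row i of (H - zI)x, excluding the x[i-1] term
        x[i - 1] = -s / H[i, i - 1]
    alpha = H[0, :] @ x - z * x[0]
    subdiag = np.prod(np.diag(H, -1))
    return (-1) ** (n - 1) * subdiag * alpha

A = np.array([[4.0, 1.0, 2.0],
              [2.0, 3.0, 1.0],
              [1.0, 1.0, 5.0]])
H = hessenberg(A)                  # stage 1: similarity reduction to Hessenberg form
z = 1.5 + 0.5j
print(hyman_det(H, z))             # agrees with det(A - zI) up to rounding
print(np.linalg.det(A - z * np.eye(3)))
```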

Journal ArticleDOI
TL;DR: In this paper, an explicit test for the rank of an arbitrary rectangular or square matrix A and a related test of (semi) definiteness of A were derived based on the Gaussian elimination Lower-Diagonal-Upper triangular (LDU) decomposition.
Abstract: Consider any consistent, asymptotically normal estimate Â of an arbitrary rectangular or square matrix A. This article derives an explicit test for the rank of A and a related test of (semi)definiteness of A. Potential applications include testing for identification of structural models, testing for the number of state variables in state-space models (including tests for the order of autoregressive moving average (ARMA) processes), consumer demand analysis applications, and testing for the number of factors in factor analysis and related procedures. The test is based on the Gaussian elimination Lower-Diagonal-Upper triangular (LDU) decomposition. The test is illustrated with an empirical application to testing the order of ARMA processes.

96 citations
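The full statistical test requires the asymptotic covariance of the estimator and is beyond a short snippet. The fragment below only sketches the LDU ingredient it builds on: pivoted Gaussian elimination applied to the estimated matrix, counting pivots whose magnitude exceeds a tolerance as a rank determination. The tolerance and the example are illustrative, not the article's test statistic.

```python
import numpy as np
from scipy.linalg import lu

def ldu_rank(A_hat, tol=1e-8):
    """Count pivots of the pivoted LU (LDU) factorization of A_hat above tol."""
    _, _, U = lu(np.asarray(A_hat, dtype=float))
    pivots = np.abs(np.diag(U))
    return int(np.sum(pivots > tol))

# Example: a noisy estimate of a rank-1 matrix.
rng = np.random.default_rng(1)
A = np.outer([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])    # true rank 1
A_hat = A + 1e-10 * rng.standard_normal(A.shape)   # tiny estimation noise
print(ldu_rank(A_hat))  # 1 with this tolerance
```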


Network Information
Related Topics (5)
Matrix (mathematics): 105.5K papers, 1.9M citations (84% related)
Polynomial: 52.6K papers, 853.1K citations (84% related)
Eigenvalues and eigenvectors: 51.7K papers, 1.1M citations (81% related)
Bounded function: 77.2K papers, 1.3M citations (80% related)
Hilbert space: 29.7K papers, 637K citations (79% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    22
2022    44
2021    115
2020    149
2019    134
2018    145