
Square matrix

About: Square matrix is a research topic. Over its lifetime, 5,000 publications have been published within this topic, receiving 92,428 citations.


Papers
Book
02 Jun 2013
TL;DR: In this book, the authors present a software environment for Dimensioned Linear Algebra (DLA) and a set of methods for representing and manipulating dimensioned scalars, vectors, and matrices (a minimal sketch of the scalar representation appears below).
Abstract: Table of contents:
0. Introductory: 0.1 Physical Dimensions; 0.2 Mathematical Dimensions; 0.3 Overview; Exercises.
1. The Mathematical Foundations of Science and Engineering: 1.1 The Inadequacy of Real Numbers (1.1.1 The Error of Substitution; 1.1.2 The Problem with Linear Spaces; 1.1.3 Nondimensionalization; 1.1.4 Dimensioned Algebras); 1.2 The Mathematics of Dimensioned Quantities (1.2.1 Axiomatic Development; 1.2.2 Constructive Approach; 1.2.3 Constraints on Exponentiation; 1.2.4 The Dimensional Basis; 1.2.5 Dimensional Logarithms; 1.2.6 The Basis-Independence Principle; 1.2.7 Symmetries of Dimensioned Quantities; 1.2.8 Images); 1.3 Conclusions; Exercises.
2. Dimensioned Linear Algebra: 2.1 Vector Spaces and Linear Transformations; 2.2 Terminology and Dimensional Inversion; 2.3 Dimensioned Scalars; 2.4 Dimensioned Vectors; 2.5 Dimensioned Matrices; Exercises.
3. The Theory of Dimensioned Matrices: 3.1 The Dimensional Freedom of Multipliable Matrices; 3.2 Endomorphic Matrices and the Matrix Exponential; 3.3 Square Matrices, Inverses, and the Determinant; 3.4 Squarable Matrices and Eigenstructure; 3.5 Dimensionally Symmetric Multipliable Matrices; 3.6 Dimensionally Hankel and Toeplitz Matrices; 3.7 Uniform, Half Uniform, and Dimensionless Matrices; 3.8 Conclusions; 3.A Appendix: The n + m − 1 Theorem; Exercises.
4. Norms, Adjoints, and Singular Value Decomposition: 4.1 Norms for Dimensioned Spaces (4.1.1 Wand Norms; 4.1.2 Extrinsic Norms); 4.2 Dimensioned Singular Value Decomposition (DSVD); 4.3 Adjoints; 4.4 Norms for Nonuniform Matrices; 4.5 A Control Application; 4.6 Factorization of Symmetric Matrices; Exercises.
5. Aspects of the Theory of Systems: 5.1 Differential and Difference Equations; 5.2 State-Space Forms; 5.3 Canonical Forms; 5.4 Transfer Functions and Impulse Responses; 5.5 Duals and Adjoints; 5.6 Stability; 5.7 Controllability, Observability, and Grammians; 5.8 Expectations and Probability Densities; Exercises.
6. Multidimensional Computational Methods: 6.1 Computers and Engineering (6.1.1 A Software Environment for Dimensioned Linear Algebra; 6.1.2 Overview); 6.2 Representing and Manipulating Dimensioned Scalars (6.2.1 The Numeric and Dimensional Components of a Scalar; 6.2.2 The Dimensional Basis; 6.2.3 Numerical Representations and Uniqueness; 6.2.4 Scalar Operations; 6.2.5 Input String Conversion; 6.2.6 Output and Units Conversion; 6.2.7 Binary Relations; 6.2.8 Summary of Scalar Methods); 6.3 Dimensioned Vectors (6.3.1 Dimensioned Vectors and Dimension Vectors; 6.3.2 Representing Dimensioned Vectors; 6.3.3 Vector Operations; 6.3.4 Summary of Vectors); 6.4 Representing Dimensioned Matrices (6.4.1 Arrays versus Matrices; 6.4.2 The Domain/Range Matrix Representation; 6.4.3 Allowing Geometric and Matrix Algebra Interpretations; 6.4.4 Input Conversion; 6.4.5 Output Conversion; 6.4.6 Special Classes of Dimensioned Matrices; 6.4.7 Identity and Zero Matrices; 6.4.8 Scalar and Vector Conversion to Matrices; 6.4.9 Summary of the Matrix Representation); 6.5 Operations on Dimensioned Matrices (6.5.1 Matrix Addition, Subtraction, Similarity, and Equality; 6.5.2 Block Matrices; 6.5.3 Matrix Multiplication; 6.5.4 Gaussian Elimination; 6.5.5 The Determinant and Singularity; 6.5.6 The Trace; 6.5.7 Matrix Inverse; 6.5.8 Matrix Transpose; 6.5.9 Eigenstructure Decomposition; 6.5.10 Singular Value Decomposition); 6.6 Conclusions; Exercises.
7. Forms of Multidimensional Relationships: 7.1 Goals; 7.2 Operations; 7.3 Procedure; Exercises.
8. Concluding Remarks.
9. Solutions to Odd-Numbered Exercises.
References.

30 citations
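Chapter 6 of the book above describes representing a dimensioned scalar as a numeric component plus a vector of exponents over a dimensional basis (Sections 6.2.1–6.2.4). Here is a minimal Python sketch of that idea; the class, the basis, and the operations are illustrative assumptions, not the book's actual software environment:

```python
from dataclasses import dataclass

# Illustrative dimensional basis (the book lets the basis be user-chosen).
BASIS = ("m", "kg", "s")

@dataclass(frozen=True)
class DimScalar:
    """A dimensioned scalar: numeric value + exponents over BASIS."""
    value: float
    dims: tuple  # exponent per basis dimension, e.g. (1, 0, -1) means m/s

    def __mul__(self, other):
        # Multiplication multiplies values and adds dimension exponents.
        return DimScalar(self.value * other.value,
                         tuple(a + b for a, b in zip(self.dims, other.dims)))

    def __add__(self, other):
        # Addition is defined only for quantities of identical dimension.
        if self.dims != other.dims:
            raise TypeError("cannot add quantities of different dimensions")
        return DimScalar(self.value + other.value, self.dims)

speed = DimScalar(3.0, (1, 0, -1))  # 3 m/s
time = DimScalar(2.0, (0, 0, 1))    # 2 s
print(speed * time)                  # DimScalar(value=6.0, dims=(1, 0, 0)), i.e. 6 m
```

The dimension check in `__add__` is the point of the whole apparatus: physically meaningless sums are rejected at the level of the data structure.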

Journal ArticleDOI
TL;DR: In this article, the real symmetric and real skew-symmetric cases are treated in detail, with an elementary proof of a result of Ilyushechkin that expands the discriminant of the characteristic polynomial as a sum of squares (a worked 2 × 2 check appears below).

30 citations
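The sum-of-squares expansion in the entry above can be verified directly in the simplest case: for a real symmetric 2 × 2 matrix, the discriminant of the characteristic polynomial is (a − c)² + 4b², manifestly a sum of squares. A quick SymPy check (using SymPy is an assumption here, not the paper's method):

```python
import sympy as sp

a, b, c, t = sp.symbols('a b c t', real=True)
A = sp.Matrix([[a, b], [b, c]])          # generic real symmetric 2x2 matrix
p = A.charpoly(t).as_expr()              # t**2 - (a + c)*t + (a*c - b**2)
disc = sp.discriminant(p, t)
print(sp.expand(disc))                   # a**2 - 2*a*c + c**2 + 4*b**2
print(sp.expand(disc - ((a - c)**2 + (2*b)**2)))  # 0: a sum of two squares
```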

Journal ArticleDOI
01 Feb 1969
TL;DR: In this article, it was shown that every complex matrix with real determinant is the product of four hermitian matrices; the special case that every real square matrix is the product of two real hermitian matrices was already known, and the four-factor result was motivated by a theorem of Halmos and Kakutani (a numerical check of the real-determinant condition appears below).
Abstract: It is the main purpose of this note to prove that every complex matrix with real determinant is the product of four hermitian matrices; Theorem 2 is an actually stronger result. Every real square matrix is the product of two real hermitian matrices [1]; this is a special case of our Theorem 1, which is of interest in itself, if it is indeed new. Theorem 3 was motivated by a theorem of Halmos and Kakutani [3] who proved that every unitary operator on an infinite-dimensional Hilbert space is the product of four symmetries (i.e., operators that are hermitian and unitary). We also show that the number of factors in these results cannot be reduced in general.

30 citations
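One direction of the theorem above is elementary: a hermitian matrix has real determinant, and determinants multiply, so any product of hermitian matrices must have real determinant; this is why the real-determinant hypothesis cannot be dropped. A NumPy sketch of that necessity check (random matrices, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hermitian(n):
    # H = M + M* is hermitian for any complex square M.
    m = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return m + m.conj().T

# det(H1 H2 H3 H4) = det(H1)...det(H4), and each factor is real,
# so the product's determinant has (numerically) zero imaginary part.
prod = np.linalg.multi_dot([random_hermitian(4) for _ in range(4)])
print(np.linalg.det(prod))  # imaginary part ~ 1e-13, i.e. roundoff only
```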

Journal ArticleDOI
TL;DR: In this paper, it was shown that every nilpotent n × n matrix over an entire antiring can be written as a sum of ⌈log₂ n⌉ square-zero matrices; the paper also determines the number of square-zero summands needed for an arbitrary trace-zero matrix to be expressible as such a sum (a small field-case illustration appears below).

30 citations
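Over a field, the bound ⌈log₂ n⌉ in the entry above is easy to see concretely: the 4 × 4 nilpotent Jordan block is the sum of ⌈log₂ 4⌉ = 2 square-zero matrices obtained by splitting its superdiagonal into alternating entries. A NumPy illustration (this splitting is a standard field-case trick, not the paper's antiring construction):

```python
import numpy as np

N = np.diag([1.0, 1.0, 1.0], k=1)  # 4x4 nilpotent Jordan block

A = np.zeros((4, 4)); A[0, 1] = A[2, 3] = 1.0  # odd superdiagonal entries
B = np.zeros((4, 4)); B[1, 2] = 1.0            # even superdiagonal entries

assert np.allclose(A @ A, 0)  # A is square-zero
assert np.allclose(B @ B, 0)  # B is square-zero
assert np.allclose(A + B, N)  # N is their sum: 2 = ceil(log2(4)) summands
print("4x4 Jordan block = sum of 2 square-zero matrices")
```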

Journal ArticleDOI
01 Dec 1983
TL;DR: In this paper, it was shown that the coefficients of linear prediction for a random process and the prediction error variances are related to the covariance matrix through triangular decomposition: if the covariance matrix is factored as LDL*, where L is lower triangular with unit diagonal and D is diagonal, then the rows of L⁻¹ are the prediction coefficients and the elements of D are the prediction error variances (a numerical sketch appears below).
Abstract: It is shown that the coefficients of linear prediction for a random process and the prediction error variances are related to the covariance matrix through triangular decomposition. In particular, if the covariance matrix is written in the product form LDL*, where L is lower triangular with unit diagonal and D is diagonal, then the rows of L⁻¹ are the coefficients of linear prediction and the elements of D are the prediction error variances.

30 citations
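The decomposition in the entry above is easy to reproduce numerically. The sketch below factors an AR(1)-style covariance matrix as LDLᵀ via a Cholesky factorization (one standard way to obtain L and D; the covariance model is an assumption for illustration): the rows of L⁻¹ then hold the prediction-error filter coefficients, and the diagonal of D holds the error variances.

```python
import numpy as np

# Covariance of a real stationary process with correlation r^|i-j| (AR(1)-like).
r, n = 0.8, 4
C = r ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# LDL^T from Cholesky: C = Lc Lc^T with Lc = L sqrt(D).
Lc = np.linalg.cholesky(C)
d = np.diag(Lc) ** 2   # prediction error variances
L = Lc / np.diag(Lc)   # unit-diagonal lower triangular factor

Linv = np.linalg.inv(L)
# Row k of L^{-1} holds the order-k prediction-error filter coefficients.
print(Linv[1])  # approximately [-0.8, 1.0, 0.0, 0.0]
print(d)        # approximately [1.0, 0.36, 0.36, 0.36]
```

For this covariance, row 1 recovers the order-1 predictor xhat1 = 0.8·x0 with error variance 1 − 0.8² = 0.36, matching the identification described in the abstract.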


Network Information
Related Topics (5)
Matrix (mathematics)
105.5K papers, 1.9M citations
84% related
Polynomial
52.6K papers, 853.1K citations
84% related
Eigenvalues and eigenvectors
51.7K papers, 1.1M citations
81% related
Bounded function
77.2K papers, 1.3M citations
80% related
Hilbert space
29.7K papers, 637K citations
79% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    22
2022    44
2021    115
2020    149
2019    134
2018    145