Topic: Square matrix

About: Square matrix is a research topic. Over its lifetime, 5,000 publications have been published within this topic, receiving 92,428 citations.


Papers
Journal ArticleDOI
TL;DR: In this note, it is shown that Definition 1 and Theorem 1, already known to apply to problems of matrix differentiation, can also be applied to a more general class of linear matrix equations, including linear matrix differential equations.
Abstract: where $B'$ is the transpose of $B$. It has been shown that Definition 1 and Theorem 1 can fruitfully be applied to problems of matrix differentiation [2]. In this note it will be shown that they can be applied to a more general class of linear matrix equations, including linear matrix differential equations. Firstly, four standard properties of Kronecker products have to be recalled, all of which may be proved in an elementary fashion [1, p. 223 ff.]. The matrices involved can have any appropriate orders. In Property 4 it is assumed that $A$ and $B$ are square of order $m$ and $s$, respectively. (The same order assumption will be made in Theorems 2 and 3, which will be presented further on.)
PROPERTY 1. $(A \otimes B)(C \otimes D) = (AC) \otimes (BD)$.
PROPERTY 2. $(A \otimes B)' = A' \otimes B'$.
PROPERTY 3. $(A + B) \otimes (C + D) = A \otimes C + A \otimes D + B \otimes C + B \otimes D$.
PROPERTY 4. If $A$ has characteristic roots $\alpha_i$, $i = 1, \ldots, m$, and if $B$ has characteristic roots $\beta_j$, $j = 1, \ldots, s$, then $A \otimes B$ has characteristic roots $\alpha_i \beta_j$. Further, $I_s \otimes A + B \otimes I_m$ has characteristic roots $\alpha_i + \beta_j$.
For the treatment of differential equations we need the matrix exponential

106 citations
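
The Kronecker-product properties above translate directly into small numerical checks. The following sketch (NumPy, assuming `np.kron` and the column-stacking vec convention; variable names are illustrative only) verifies Property 4 and the vec identity $\mathrm{vec}(AXB) = (B' \otimes A)\,\mathrm{vec}(X)$ that underlies the matrix-equation applications.

```python
import numpy as np

rng = np.random.default_rng(0)
m, s = 4, 3
A = rng.standard_normal((m, m))   # square of order m
B = rng.standard_normal((s, s))   # square of order s

# Property 4: eigenvalues of A (x) B are the products alpha_i * beta_j, and
# eigenvalues of I_s (x) A + B (x) I_m are the sums alpha_i + beta_j.
alpha = np.linalg.eigvals(A)
beta = np.linalg.eigvals(B)
print(np.allclose(np.sort_complex(np.linalg.eigvals(np.kron(A, B))),
                  np.sort_complex(np.outer(alpha, beta).ravel())))
print(np.allclose(np.sort_complex(np.linalg.eigvals(np.kron(np.eye(s), A) + np.kron(B, np.eye(m)))),
                  np.sort_complex(np.add.outer(beta, alpha).ravel())))

# vec identity used for linear matrix equations: vec(AXB) = (B' (x) A) vec(X),
# with vec stacking columns (Fortran order in NumPy).
X = rng.standard_normal((m, s))
vec = lambda M: M.flatten(order="F")
print(np.allclose(vec(A @ X @ B), np.kron(B.T, A) @ vec(X)))
```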

Journal ArticleDOI
TL;DR: In this article, it is shown that the usual companion matrix of a polynomial of degree n can be factored into a product of n matrices: n − 1 of them are identity matrices in which a 2×2 identity submatrix in two consecutive rows (and columns) is replaced by an appropriate 2×2 matrix, and the remaining one is an identity matrix whose last diagonal entry is replaced by a possibly different entry.

106 citations
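
The factorization described in the TL;DR can be checked numerically. The sketch below uses a Fiedler-style construction; the exact sign convention and factor ordering are assumptions, since the paper itself is not quoted here. It builds the n factors and compares their product with the first-row companion matrix.

```python
import numpy as np

def companion_top_row(c):
    """Companion matrix (first-row form) of x^n + c[0] x^(n-1) + ... + c[n-1]."""
    n = len(c)
    C = np.zeros((n, n))
    C[0, :] = -np.asarray(c, dtype=float)
    C[1:, :-1] = np.eye(n - 1)
    return C

def fiedler_factors(c):
    """The n factors described in the abstract: n-1 identities with a 2x2 block
    [[-c_k, 1], [1, 0]] on consecutive rows/columns, plus one identity whose
    last diagonal entry is replaced by -c_n (sign conventions vary by author)."""
    n = len(c)
    factors = []
    for k in range(n - 1):
        M = np.eye(n)
        M[k:k+2, k:k+2] = [[-c[k], 1.0], [1.0, 0.0]]
        factors.append(M)
    M = np.eye(n)
    M[-1, -1] = -c[-1]
    factors.append(M)
    return factors

c = [2.0, -7.0, 0.5, 3.0]                       # x^4 + 2x^3 - 7x^2 + 0.5x + 3
prod = np.linalg.multi_dot(fiedler_factors(c))
print(np.allclose(prod, companion_top_row(c)))  # True
```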

Journal ArticleDOI
TL;DR: In this article, the authors studied the solutions of the complex matrix equations $X - AXB = C$ and $X - A\bar{X}B = C$, and obtained explicit solutions of the equations by the method of characteristic polynomials and a method of real representation of a complex matrix, respectively.

105 citations
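
For the first of these equations, a quick numerical cross-check is possible without the paper's characteristic-polynomial or real-representation machinery: the column-stacking vec identity turns $X - AXB = C$ into an ordinary linear system. The sketch below is only that generic reduction, not the authors' method, and assumes the equation has a unique solution.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# X - A X B = C  <=>  (I - B^T (x) A) vec(X) = vec(C), with column-stacking vec.
vec = lambda M: M.flatten(order="F")
x = np.linalg.solve(np.eye(n * n) - np.kron(B.T, A), vec(C))
X = x.reshape((n, n), order="F")

print(np.allclose(X - A @ X @ B, C))  # True when I - B^T (x) A is nonsingular
```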

Posted Content
TL;DR: In this paper, the non-square matrix sensing problem under restricted isometry property (RIP) assumptions is considered, and it is shown that matrix factorization does not introduce any spurious local minima, under RIP.
Abstract: We consider the non-square matrix sensing problem, under restricted isometry property (RIP) assumptions. We focus on the non-convex formulation, where any rank-$r$ matrix $X \in \mathbb{R}^{m \times n}$ is represented as $UV^\top$, where $U \in \mathbb{R}^{m \times r}$ and $V \in \mathbb{R}^{n \times r}$. In this paper, we complement recent findings on the non-convex geometry of the analogous PSD setting [5], and show that matrix factorization does not introduce any spurious local minima, under RIP.

104 citations
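
As a rough illustration of the factored formulation (not the paper's analysis or guarantees), the sketch below runs plain gradient descent on $f(U, V) = \tfrac{1}{2}\lVert \mathcal{A}(UV^\top) - y\rVert^2$ with random Gaussian measurements; the measurement model, step size, and iteration count are all assumptions chosen for the toy sizes used.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r, p = 8, 6, 2, 200                       # p random Gaussian measurements

# Ground-truth rank-r matrix and linear measurements y_i = <A_i, X*>.
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
A_ops = rng.standard_normal((p, m, n)) / np.sqrt(p)
y = np.einsum("pij,ij->p", A_ops, X_true)

# Plain gradient descent on the factored objective f(U, V) = 0.5 * ||A(UV^T) - y||^2.
U = 0.1 * rng.standard_normal((m, r))
V = 0.1 * rng.standard_normal((n, r))
step = 0.02
for _ in range(5000):
    resid = np.einsum("pij,ij->p", A_ops, U @ V.T) - y   # A(UV^T) - y
    G = np.einsum("p,pij->ij", resid, A_ops)             # adjoint of A applied to residual
    U, V = U - step * (G @ V), V - step * (G.T @ U)

# Relative recovery error; typically small for this well-measured toy setup.
print(np.linalg.norm(U @ V.T - X_true) / np.linalg.norm(X_true))
```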

Journal ArticleDOI
TL;DR: An improved method is developed for eigenvalue and eigenvector placement of a closed-loop control system using either state or output feedback and is formulated in real arithmetic for efficient implementation.
Abstract: An improved method is developed for eigenvalue and eigenvector placement of a closed-loop control system using either state or output feedback. The method basically consists of three steps. First, a singular value or QR decomposition is used to generate an orthonormal basis that spans the admissible eigenvector space corresponding to each assigned eigenvalue. Secondly, given a unitary matrix, the eigenvector set that best approximates the given matrix in the least-squares sense while still satisfying the eigenvalue constraints is determined. Thirdly, a unitary matrix is sought that minimizes the error between the unitary matrix and the assignable eigenvector matrix. Two matrices, namely the open-loop eigenvector matrix and its closest unitary matrix, are proposed for use as the desired eigenvector set. The latter matrix generally encourages both minimum conditioning and minimum control gains. In addition, the algorithm is formulated in real arithmetic for efficient implementation. To illustrate the basic concepts, numerical examples are included.

103 citations
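
A bare-bones version of the admissible-eigenvector idea behind the first step can be sketched as follows: for each assigned eigenvalue s, any vector [v; w] in the null space of [A - sI, B] yields a closed-loop eigenvector v with input direction w = Fv, and stacking one choice per eigenvalue gives the gain from FV = W. The code below (NumPy/SciPy; the use of `null_space` and the naive column choice are assumptions, and none of the paper's least-squares or unitary-matrix refinements are included) illustrates only this basic mechanism for state feedback.

```python
import numpy as np
from scipy.linalg import null_space

def place_eigenstructure(A, B, desired):
    """Naive state-feedback placement via admissible eigenvector subspaces:
    for each assigned eigenvalue s, any [v; w] in null([A - sI, B]) gives a
    closed-loop eigenvector v with w = F v; solving F V = W yields the gain."""
    n, m = B.shape
    V, W = [], []
    for s in desired:
        N = null_space(np.hstack([A - s * np.eye(n), B]))  # (n + m) x m basis
        V.append(N[:n, 0])                                 # one admissible eigenvector
        W.append(N[n:, 0])                                 # the matching input direction
    V, W = np.column_stack(V), np.column_stack(W)
    return W @ np.linalg.inv(V)                            # F = W V^{-1}

rng = np.random.default_rng(3)
n, m = 4, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
desired = np.array([-1.0, -2.0, -3.0, -4.0])

F = place_eigenstructure(A, B, desired)
print(np.sort(np.linalg.eigvals(A + B @ F).real))          # approximately [-4, -3, -2, -1]
```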


Network Information
Related Topics (5)
Matrix (mathematics): 105.5K papers, 1.9M citations, 84% related
Polynomial: 52.6K papers, 853.1K citations, 84% related
Eigenvalues and eigenvectors: 51.7K papers, 1.1M citations, 81% related
Bounded function: 77.2K papers, 1.3M citations, 80% related
Hilbert space: 29.7K papers, 637K citations, 79% related
Performance Metrics
Number of papers in the topic in previous years:
2023: 22
2022: 44
2021: 115
2020: 149
2019: 134
2018: 145