Topic
Square matrix
About: Square matrix is a research topic. Over its lifetime, 5000 publications have been published within this topic, receiving 92428 citations.
Papers published on a yearly basis
Papers
TL;DR: In this paper, the determinant and Pfaffian identities were used to evaluate Kuperberg's determinants, and express the round partition functions in terms of irreducible characters of classical groups.
Abstract: An alternating sign matrix is a square matrix with entries 1, 0 and −1 such that the sum of the entries in each row and each column is equal to 1 and the nonzero entries alternate in sign along each row and each column. To some of the symmetry classes of alternating sign matrices and their variations, G. Kuperberg associated square ice models with appropriate boundary conditions and gave determinant and Pfaffian formulae for the partition functions. In this paper, we utilize several determinant and Pfaffian identities to evaluate Kuperberg's determinants and Pfaffians, and express the round partition functions in terms of irreducible characters of classical groups. In particular, we settle a conjecture on the number of vertically and horizontally symmetric alternating sign matrices (VHSASMs).
117 citations
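The abstract above leans on the definition of an alternating sign matrix. As a small illustration (the function name is our own), the defining conditions can be checked in plain Python:

```python
def is_alternating_sign_matrix(m):
    """Check the alternating sign matrix conditions: entries in {1, 0, -1},
    every row and every column sums to 1, and the nonzero entries along
    each row and column alternate in sign."""
    n = len(m)
    if any(len(row) != n for row in m):
        return False  # must be square
    # Treat rows and columns uniformly.
    lines = [list(row) for row in m] + [list(col) for col in zip(*m)]
    for line in lines:
        if sum(line) != 1:
            return False
        nonzero = [x for x in line if x != 0]
        if any(x not in (1, -1) for x in nonzero):
            return False
        # Alternation: adjacent nonzero entries must differ in sign.
        if any(a == b for a, b in zip(nonzero, nonzero[1:])):
            return False
    return True
```

Note that alternation together with the row/column sum being 1 already forces each line's nonzero entries to start and end with +1, so no separate check is needed.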
TL;DR: In this paper, it was shown that each square centrohermitian matrix is similar to a matrix with real entries, and each square skew-centrohermitian matrix is similar to a matrix with pure imaginary entries.
116 citations
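The centrohermitian property mentioned above says that A equals J·conj(A)·J, where J is the exchange (anti-identity) matrix; entrywise this reads A[n−1−i][n−1−j] = conj(A[i][j]). A minimal check in plain Python, with a function name of our choosing:

```python
def is_centrohermitian(a):
    """True when the square matrix a satisfies a = J * conj(a) * J,
    i.e. a[n-1-i][n-1-j] == conj(a[i][j]) for all i, j."""
    n = len(a)
    return all(
        complex(a[n - 1 - i][n - 1 - j]) == complex(a[i][j]).conjugate()
        for i in range(n)
        for j in range(n)
    )
```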
TL;DR: It is proved that these pencils are linearizations even when $P(\lambda)$ is a singular square matrix polynomial, and it is shown explicitly how to recover the left and right minimal indices and minimal bases of the polynomial from the minimal indices and bases of these linearizations.
Abstract: A standard way of dealing with a matrix polynomial $P(\lambda)$ is to convert it into an equivalent matrix pencil—a process known as linearization. For any regular matrix polynomial, a new family of linearizations generalizing the classical first and second Frobenius companion forms has recently been introduced by Antoniou and Vologiannidis, extending some linearizations previously defined by Fiedler for scalar polynomials. We prove that these pencils are linearizations even when $P(\lambda)$ is a singular square matrix polynomial, and show explicitly how to recover the left and right minimal indices and minimal bases of the polynomial $P(\lambda)$ from the minimal indices and bases of these linearizations. In addition, we provide a simple way to recover the eigenvectors of a regular polynomial from those of any of these linearizations, without any computational cost. The existence of an eigenvector recovery procedure is essential for a linearization to be relevant for applications.
115 citations
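The Fiedler pencils discussed above generalize the classical first Frobenius companion form. For a monic scalar polynomial, that companion form and the identity det(λI − C) = p(λ) can be sketched in plain Python (helper names are ours, and the cofactor determinant is only meant for tiny matrices):

```python
def companion(coeffs):
    """First Frobenius companion matrix of the monic polynomial
    p(x) = x^n + coeffs[n-1]*x^(n-1) + ... + coeffs[0]."""
    n = len(coeffs)
    c = [[0.0] * n for _ in range(n)]
    for i in range(1, n):
        c[i][i - 1] = 1.0            # subdiagonal of ones
    for i in range(n):
        c[i][n - 1] = -coeffs[i]     # last column carries the coefficients
    return c

def det(m):
    """Determinant by cofactor expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum(
        ((-1) ** j) * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
        for j in range(n)
    )

def char_poly_at(c, lam):
    """Evaluate det(lam*I - C), which equals p(lam) for the companion C."""
    n = len(c)
    m = [[(lam if i == j else 0.0) - c[i][j] for j in range(n)] for i in range(n)]
    return det(m)
```

For example, with p(x) = x³ − 6x² + 11x − 6 = (x − 1)(x − 2)(x − 3), the characteristic polynomial of `companion([-6.0, 11.0, -6.0])` vanishes at 1, 2 and 3.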
TL;DR: In this paper, it has been shown that the inverses of these matrices are the operators that perform the Gaussian elimination steps used to compute the Cholesky factorization.
114 citations
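For context on the TL;DR above, the Cholesky factorization it refers to produces a lower triangular L with A = L·Lᵀ for a symmetric positive definite A, via successive elimination steps. A minimal sketch in plain Python (names are our own):

```python
import math

def cholesky(a):
    """Lower-triangular L with A = L * L^T, computed by the standard
    elimination / square-root recurrence. Assumes a is symmetric
    positive definite (no pivoting or error checking)."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                l[i][j] = math.sqrt(a[i][i] - s)   # diagonal entry
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]  # below-diagonal entry
    return l
```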
TL;DR: It is shown that the linear complementarity problem of finding a z in Rⁿ such that Mz + q ⩾ 0, z ⩾ 0 and zᵀ(Mz + q) = 0 can be solved by a single linear program in some important special cases, including those when M or its inverse is a Z-matrix, that is, a real square matrix with nonpositive off-diagonal elements.
Abstract: It is shown that the linear complementarity problem of finding a z in Rⁿ such that Mz + q ⩾ 0, z ⩾ 0 and zᵀ(Mz + q) = 0 can be solved by a single linear program in some important special cases, including those when M or its inverse is a Z-matrix, that is, a real square matrix with nonpositive off-diagonal elements. As a consequence, certain problems in mechanics, certain problems of finding the least element of a polyhedral set, and certain quadratic programming problems can each be solved by a single linear program.
114 citations
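The Z-matrix condition from the abstract above (nonpositive off-diagonal entries, with no restriction on the diagonal) amounts to a one-line check. A small sketch, with a function name of our own:

```python
def is_z_matrix(m):
    """True when the real square matrix m is a Z-matrix, i.e. every
    off-diagonal entry is nonpositive; diagonal entries may be anything."""
    n = len(m)
    return all(m[i][j] <= 0 for i in range(n) for j in range(n) if i != j)
```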