Topic

Unitary matrix

About: Unitary matrix is a research topic. Over the lifetime, 2,656 publications have been published within this topic, receiving 53,127 citations.


Papers
Journal ArticleDOI
TL;DR: In this article, the authors study the distribution of eigenvalues for two sets of random Hermitian matrices and one set of random unitary matrices, a problem motivated by the energy spectra of disordered systems.
Abstract: In this paper we study the distribution of eigenvalues for two sets of random Hermitian matrices and one set of random unitary matrices. The statement of the problem as well as its method of investigation go back originally to the work of Dyson [1] and I. M. Lifsic [2], [3] on the energy spectra of disordered systems, although in their probability character our sets are more similar to sets studied by Wigner [4]. Since the approaches to the sets we consider are the same, we present in detail only the most typical case. The corresponding results for the other two cases are presented without proof in the last section of the paper. §1. Statement of the problem and survey of results. We shall consider, as acting in the $N$-dimensional unitary space $H_N$, a selfadjoint operator $B_N$ of the form
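The following is a minimal numerical sketch (not taken from the paper; it assumes NumPy) of the two kinds of ensembles the abstract refers to: it samples a random Hermitian matrix and a Haar-random unitary matrix and extracts their eigenvalues, the empirical objects whose limiting distributions papers in this area study.

```python
# Minimal sketch (not from the paper; assumes NumPy): sample the two kinds of
# random matrices the abstract mentions and look at their eigenvalues.
import numpy as np

rng = np.random.default_rng(0)
N = 500

# Random Hermitian matrix: real eigenvalues, bulk support roughly [-2*sqrt(N), 2*sqrt(N)]
# for this normalization (a Wigner-semicircle-type law in the large-N limit).
X = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
H = (X + X.conj().T) / 2.0
hermitian_eigs = np.linalg.eigvalsh(H)

# Haar-random unitary matrix (QR of a complex Gaussian, with the phases of R's
# diagonal absorbed): eigenvalues lie on the unit circle, so we keep their angles.
Q, R = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))
Q = Q @ np.diag(np.diag(R) / np.abs(np.diag(R)))
unitary_phases = np.angle(np.linalg.eigvals(Q))

print(hermitian_eigs.min(), hermitian_eigs.max())   # close to -2*sqrt(N), 2*sqrt(N)
print(unitary_phases.min(), unitary_phases.max())   # phases spread over (-pi, pi]
```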

2,594 citations

Journal ArticleDOI
TL;DR: An algorithmic proof that any discrete finite-dimensional unitary operator can be constructed in the laboratory using optical devices is given, and optical experiments with any type of radiation exploring higher-dimensional discrete quantum systems become feasible.
Abstract: An algorithmic proof that any discrete finite-dimensional unitary operator can be constructed in the laboratory using optical devices is given. Our recursive algorithm factorizes any $N \times N$ unitary matrix into a sequence of two-dimensional beam splitter transformations. The experiment is built from the corresponding devices. This also permits the measurement of the observable corresponding to any discrete Hermitian matrix. Thus optical experiments with any type of radiation (photons, atoms, etc.) exploring higher-dimensional discrete quantum systems become feasible.
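As a minimal sketch of the underlying linear algebra (assuming NumPy; this is not the authors' optical mesh of beam splitters and phase shifters, just the same factorization idea), the snippet below reduces an arbitrary unitary matrix to a diagonal phase matrix by a sequence of two-dimensional complex Givens rotations, then rebuilds it from those 2×2 factors:

```python
# Minimal sketch (assumes NumPy; not the authors' optical layout, but the same linear
# algebra): complex Givens rotations, each acting on only two components, reduce any
# unitary matrix to a diagonal phase matrix, so the matrix factors into 2x2 blocks.
import numpy as np

def givens(a, b):
    """Return a 2x2 unitary G with G @ [a, b] = [r, 0] and r >= 0 real."""
    r = np.hypot(abs(a), abs(b))
    if r == 0.0:
        return np.eye(2, dtype=complex)
    return np.array([[np.conj(a), np.conj(b)],
                     [-b, a]], dtype=complex) / r

def decompose_unitary(U):
    """Factor U into embedded 2x2 rotations plus a diagonal phase matrix."""
    n = U.shape[0]
    W = U.astype(complex).copy()
    rotations = []                        # (top row index, 2x2 block), in application order
    for j in range(n - 1):                # eliminate the sub-diagonal, column by column
        for i in range(n - 1, j, -1):     # work upward so earlier zeros are preserved
            G = givens(W[i - 1, j], W[i, j])
            W[i - 1:i + 1, :] = G @ W[i - 1:i + 1, :]
            rotations.append((i - 1, G))
    # W is now upper triangular and unitary, hence diagonal with unit-modulus entries.
    return rotations, np.diag(np.diag(W))

def reconstruct(rotations, D):
    """Rebuild U = G_1^† G_2^† ... G_K^† D from the stored factors."""
    n = D.shape[0]
    U = D.copy()
    for p, G in reversed(rotations):
        full = np.eye(n, dtype=complex)
        full[p:p + 2, p:p + 2] = G.conj().T
        U = full @ U
    return U

# Sanity check on a Haar-random unitary (QR of a complex Gaussian matrix).
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5))
Q, _ = np.linalg.qr(A)
rots, D = decompose_unitary(Q)
assert np.allclose(reconstruct(rots, D), Q)
```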

1,699 citations

Journal ArticleDOI
TL;DR: In this article, a numerically stable and fairly fast scheme is described to compute the unitary matrices U and V which transform a given matrix A into a diagonal form $\Sigma = U^* A V$, thus exhibiting A's singular values on $\Sigma$'s diagonal.
Abstract: A numerically stable and fairly fast scheme is described to compute the unitary matrices U and V which transform a given matrix A into a diagonal form $\Sigma = U^* A V$, thus exhibiting A's singular values on $\Sigma$'s diagonal. The scheme first transforms A to a bidiagonal matrix J, then diagonalizes J. The scheme described here is complicated but does not suffer from the computational difficulties which occasionally afflict some previously known methods. Some applications are mentioned, in particular the use of the pseudo-inverse $A^I = V \Sigma^I U^*$ to solve least squares problems in a way which dampens spurious oscillation and cancellation.
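A short illustration of the least-squares application (assuming NumPy, and using the library SVD rather than the paper's own bidiagonalization scheme): the pseudo-inverse assembled from the SVD factors solves min ||Ax - b|| and agrees with the reference solver.

```python
# Small illustration (assumes NumPy; uses the library SVD rather than the paper's own
# bidiagonalization): the pseudo-inverse assembled from the SVD factors solves the
# least-squares problem min ||A x - b|| and agrees with the reference solver.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(8, 3))                    # overdetermined system
b = rng.normal(size=8)

U, s, Vh = np.linalg.svd(A, full_matrices=False)   # A = U @ diag(s) @ Vh
x_svd = Vh.conj().T @ ((U.conj().T @ b) / s)       # x = V Sigma^{-1} U* b

x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)      # library least-squares solution
assert np.allclose(x_svd, x_ref)
```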

1,683 citations

Book ChapterDOI
01 Jan 1993
TL;DR: In this paper, the author presents a new simplified derivation of the system of nonlinear completely integrable equations (with the $a_j$'s as the independent variables) that were first derived by Jimbo, Miwa, Mori, and Sato in 1980.
Abstract: Here $I = \bigcup_j (a_{2j-1}, a_{2j})$ and $\chi_I(y)$ is the characteristic function of the set $I$. In the Gaussian Unitary Ensemble (GUE) the probability that no eigenvalues lie in $I$ is equal to $\tau(a)$. Also $\tau(a)$ is a tau-function and we present a new simplified derivation of the system of nonlinear completely integrable equations (the $a_j$'s are the independent variables) that were first derived by Jimbo, Miwa, Mori, and Sato in 1980. In the case of a single interval these equations are reducible to a Painlevé V equation. For large $s$ we give an asymptotic formula for $E_2(n;s)$, which is the probability in the GUE that exactly $n$ eigenvalues lie in an interval of length $s$. These notes provide an introduction to that aspect of the theory of random matrices dealing with the distribution of eigenvalues. To first orient the reader, we present in Sec. II some numerical experiments that illustrate some of the basic aspects of the subject. In Sec. III we introduce the invariant measures for the three "circular ensembles" involving unitary matrices. We also define the level spacing distributions and express these distributions in terms of a particular Fredholm determinant. In Sec. IV we explain how these measures are modified for the orthogonal polynomial ensembles. In Sec. V we discuss the universality of these level spacing distribution functions in a particular scaling limit. The discussion up to this point (with the possible exception of Sec. V) follows the well-known path pioneered by Hua, Wigner, Dyson, Mehta and others who first developed this theory (see, e.g., the reprint volume of Porter (34) and Hua (17)). This, and much more, is discussed in Mehta's book (25), the classic reference in the subject. An important development in random matrices was the discovery by Jimbo, Miwa, Mori, and Sato (21) (hereafter referred to as JMMS) that the basic Fredholm determinant mentioned above is a $\tau$-function in the sense of the Kyoto School. Though it has been some twelve years since (21) was published, these results are not widely appreciated by the practitioners of random matrices. This is due no doubt to the complexity of their paper. The methods of JMMS are methods of discovery; but now that we know the result, simpler proofs can be constructed. In Sec. VI we give such a proof of the JMMS equations. Our proof is a simplification and generalization of Mehta's (27) simplified proof of the single interval case. Also our methods build on the earlier work of Its, Izergin, Korepin, and Slavnov (18) and Dyson (12). We include in this section a discussion of the connection between the JMMS equations and the integrable Hamiltonian systems that appear in the geometry of quadrics and spectral theory as developed by Moser (31). This section concludes with a discussion of the case of a single interval (viz., the probability that exactly $n$ eigenvalues lie in a given interval). In this case the JMMS equations can be reduced to a single ordinary differential equation, the Painlevé V equation. Finally, in Sec. VII we discuss the asymptotics in the case of a large single interval of the various level spacing distribution functions (4,38,28). In this analysis both the Painlevé representation and new results in Toeplitz/Wiener-Hopf theory are needed to produce these asymptotics. We also give an approach based on the asymptotics of the eigenvalues of the basic linear integral operator (14,25,35). These results are then compared with the continuum model calculations of Dyson (12).
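In the spirit of the numerical experiments mentioned for Sec. II (the sketch below is not from the notes; it assumes NumPy and uses the GUE Wigner surmise as a stand-in for the exact sine-kernel spacing law), one can sample GUE-type matrices, collect nearest-neighbour spacings at the centre of the spectrum, and compare the rescaled histogram with $p(s) = \frac{32}{\pi^2} s^2 e^{-4 s^2/\pi}$:

```python
# Rough illustration (not from the notes; assumes NumPy): nearest-neighbour level
# spacings at the centre of the GUE spectrum, rescaled to unit mean spacing, versus
# the Wigner surmise p(s) = (32/pi^2) s^2 exp(-4 s^2 / pi), a close approximation
# to the exact sine-kernel spacing law discussed in the text.
import numpy as np

rng = np.random.default_rng(3)
N, trials = 200, 300
spacings = []
for _ in range(trials):
    X = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
    eigs = np.linalg.eigvalsh((X + X.conj().T) / 2.0)   # ascending eigenvalues of a GUE-type matrix
    centre = eigs[N // 2 - 10 : N // 2 + 10]            # stay where the eigenvalue density is flat
    spacings.extend(np.diff(centre))

s = np.asarray(spacings) / np.mean(spacings)            # crude "unfolding" to unit mean spacing
hist, edges = np.histogram(s, bins=30, range=(0.0, 3.0), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
wigner = (32 / np.pi**2) * mid**2 * np.exp(-4 * mid**2 / np.pi)
print(np.max(np.abs(hist - wigner)))   # modest deviation: the histogram tracks the surmise up to sampling noise
```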

717 citations

Proceedings Article
19 Jun 2016
TL;DR: This work constructs an expressive unitary weight matrix by composing several structured matrices that act as building blocks with parameters to be learned, and demonstrates the potential of this architecture by achieving state-of-the-art results in several hard tasks involving very long-term dependencies.
Abstract: Recurrent neural networks (RNNs) are notoriously difficult to train. When the eigenvalues of the hidden-to-hidden weight matrix deviate from absolute value 1, optimization becomes difficult due to the well-studied issue of vanishing and exploding gradients, especially when trying to learn long-term dependencies. To circumvent this problem, we propose a new architecture that learns a unitary weight matrix, with eigenvalues of absolute value exactly 1. The challenge we address is that of parametrizing unitary matrices in a way that does not require expensive computations (such as eigendecomposition) after each weight update. We construct an expressive unitary weight matrix by composing several structured matrices that act as building blocks with parameters to be learned. Optimization with this parameterization becomes feasible only when considering hidden states in the complex domain. We demonstrate the potential of this architecture by achieving state-of-the-art results in several hard tasks involving very long-term dependencies.
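A minimal sketch of such a composition (assuming NumPy; the particular ordering, the use of dense matrices, and the parameter shapes are illustrative assumptions rather than the paper's reference implementation): diagonal phase matrices, complex Householder reflections, a fixed permutation, and the unitary DFT are each exactly unitary for any parameter values, so their product is too, with no eigendecomposition or reprojection needed after a parameter update.

```python
# Minimal sketch (assumes NumPy; ordering, dense matrices, and parameter shapes are
# illustrative, not the paper's reference implementation): a unitary matrix composed
# from structured unitary factors, each cheap to apply and unitary for any parameters.
import numpy as np

def diag_phase(theta):
    """Diagonal matrix of unit-modulus phases exp(i*theta): unitary for any real theta."""
    return np.diag(np.exp(1j * theta))

def householder(v):
    """Complex Householder reflection I - 2 v v* / ||v||^2: unitary for any nonzero v."""
    v = v / np.linalg.norm(v)
    return np.eye(len(v), dtype=complex) - 2.0 * np.outer(v, v.conj())

def structured_unitary(theta1, theta2, theta3, v1, v2, perm):
    """Compose phases, reflections, a permutation, and the unitary DFT into one unitary matrix."""
    n = len(theta1)
    F = np.fft.fft(np.eye(n)) / np.sqrt(n)   # unitary DFT matrix; F.conj().T is its inverse
    P = np.eye(n)[perm]                      # permutation matrix
    return (diag_phase(theta3) @ householder(v2) @ F.conj().T
            @ diag_phase(theta2) @ P @ householder(v1) @ F @ diag_phase(theta1))

rng = np.random.default_rng(2)
n = 8
W = structured_unitary(rng.uniform(-np.pi, np.pi, n),
                       rng.uniform(-np.pi, np.pi, n),
                       rng.uniform(-np.pi, np.pi, n),
                       rng.normal(size=n) + 1j * rng.normal(size=n),
                       rng.normal(size=n) + 1j * rng.normal(size=n),
                       rng.permutation(n))
assert np.allclose(W.conj().T @ W, np.eye(n))   # unitary by construction, for any parameter draw
```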

630 citations

Network Information
Related Topics (5)
Eigenvalues and eigenvectors: 51.7K papers, 1.1M citations, 85% related
Upper and lower bounds: 56.9K papers, 1.1M citations, 83% related
Matrix (mathematics): 105.5K papers, 1.9M citations, 83% related
Bounded function: 77.2K papers, 1.3M citations, 81% related
Hamiltonian (quantum mechanics): 48.6K papers, 1M citations, 81% related
Performance
Metrics
No. of papers in the topic in previous years:
Year  Papers
2023  30
2022  81
2021  104
2020  113
2019  99
2018  93