scispace - formally typeset
Topic

Generalized eigenvector

About: Generalized eigenvector is a research topic. Over the lifetime, 767 publications have been published within this topic receiving 13532 citations.
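For concreteness (an illustration added here, not drawn from the papers below): when a matrix is defective, its ordinary eigenvectors do not span the space, and a generalized eigenvector v satisfies (A − λI)v = x for an ordinary eigenvector x, so that (A − λI)²v = 0. A minimal sketch:

```python
import numpy as np

# A defective matrix: eigenvalue 2 has algebraic multiplicity 2
# but only one ordinary eigenvector.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
N = A - lam * np.eye(2)

x = np.array([1.0, 0.0])                  # ordinary eigenvector: N @ x = 0
v = np.linalg.lstsq(N, x, rcond=None)[0]  # generalized eigenvector: N @ v = x
```

Together x and v form a Jordan chain for the eigenvalue 2.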


Papers
Journal ArticleDOI
TL;DR: A simplified procedure is presented for determining the derivatives of eigenvectors of nth-order algebraic eigensystems; it is applicable to symmetric or nonsymmetric systems and requires knowledge of only one eigenvalue and its associated right and left eigenvectors.
Abstract: A simplified procedure is presented for the determination of the derivatives of eigenvectors of nth-order algebraic eigensystems. The method is applicable to symmetric or nonsymmetric systems, and requires knowledge of only one eigenvalue and its associated right and left eigenvectors. In the procedure, the matrix of the original eigensystem, of rank (n − 1), is modified to convert it to a matrix of rank n, which is then solved directly for a vector that, together with the eigenvector, gives the eigenvector derivative to within an arbitrary constant. The norm of the eigenvector is used to determine this constant and complete the calculation. The method is simple, since the modified rank-n matrix is formed without matrix multiplication or extensive manipulation. Since the matrix has the same bandedness as the original eigensystem, it can be treated efficiently using the same banded equation-solution algorithms that are used to find the eigenvectors.
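The procedure described above can be sketched as follows (a hedged illustration of the Nelson-style approach, assuming a real simple eigenvalue; the pinned-component trick replaces the singular row and column to restore full rank):

```python
import numpy as np

def eigvec_derivative(A, dA, lam, x, y):
    """Derivative of a simple eigenpair of A(p) with respect to p.

    A   : matrix at the current parameter value
    dA  : elementwise derivative dA/dp
    lam : the (simple) eigenvalue; x / y its right / left eigenvectors,
          with x normalized so that x @ x = 1.
    Returns (dlam, dx)."""
    n = A.shape[0]
    dlam = (y @ dA @ x) / (y @ x)          # eigenvalue derivative
    F = A - lam * np.eye(n)                # singular, rank n - 1
    f = -(dA - dlam * np.eye(n)) @ x
    k = np.argmax(np.abs(x))               # pin the largest component of x
    F[k, :] = 0.0
    F[:, k] = 0.0
    F[k, k] = 1.0                          # modified matrix now has rank n
    f[k] = 0.0
    v = np.linalg.solve(F, f)              # particular solution with v[k] = 0
    c = -x @ v                             # fix the arbitrary constant: x @ dx = 0
    return dlam, v + c * x
```

The final step uses the norm condition x @ x = 1 (hence x @ dx = 0) to determine the arbitrary constant, matching the abstract.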

878 citations

Journal ArticleDOI
TL;DR: In this paper, the problem is formulated as an ill-posed matrix equation, and general criteria are established for constructing an inverse matrix; the solution is defined in terms of a set of generalized eigenvectors of the matrix and may be chosen to optimize the resolution provided by the data.
Abstract: Many problems in physical science involve the estimation of a number of unknown parameters which bear a linear or quasi-linear relationship to a set of experimental data. The data may be contaminated by random errors, insufficient to determine the unknowns, redundant, or all of the above. This paper presents a method of optimizing the conclusions drawn from such a data set. The problem is formulated as an ill-posed matrix equation, and general criteria are established for constructing an 'inverse' matrix. The 'solution' to the problem is defined in terms of a set of generalized eigenvectors of the matrix, and may be chosen to optimize the resolution provided by the data, the expected error in the solution, the fit to the data, the proximity of the solution to an arbitrary function, or any combination of the above. The classical least-squares solution is discussed as a special case.
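The eigenvector-expansion 'inverse' described above is commonly realized with a truncated singular-value decomposition; the following is a hedged sketch of the least-squares special case (function name mine; the paper's full framework also trades off resolution and expected error, which this omits):

```python
import numpy as np

def truncated_svd_solve(G, d, k):
    """Minimum-norm least-squares solution of G m = d retaining only
    the k largest singular values. Discarding poorly constrained
    eigenvector directions stabilizes an ill-posed problem."""
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    inv_s = np.zeros_like(s)
    inv_s[:k] = 1.0 / s[:k]           # damp/drop small singular values
    return Vt.T @ (inv_s * (U.T @ d))
```

With k equal to the full rank this reproduces the classical least-squares solution; smaller k trades fit for stability.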

766 citations

Journal ArticleDOI
TL;DR: In this paper, the problem of finding the stationary values of a quadratic form subject to linear constraints and determining the eigenvalues of a matrix which is modified by a matrix of rank one is considered.
Abstract: We consider the numerical calculation of several matrix eigenvalue problems which require some manipulation before the standard algorithms may be used. This includes finding the stationary values of a quadratic form subject to linear constraints and determining the eigenvalues of a matrix which is modified by a matrix of rank one. We also consider several inverse eigenvalue problems. This includes the problem of determining the coefficients for the Gauss–Radau and Gauss–Lobatto quadrature rules. In addition, we study several eigenvalue problems which arise in least squares.
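The first manipulation mentioned above (stationary values of a quadratic form xᵀAx on the unit sphere subject to linear constraints Cx = 0) reduces to an ordinary eigenproblem on the null space of C; a minimal sketch of that reduction (function name mine, A assumed symmetric):

```python
import numpy as np

def constrained_stationary_values(A, C):
    """Stationary values of x^T A x subject to x^T x = 1 and C x = 0:
    restrict A to an orthonormal basis Z of the null space of C and
    solve the resulting ordinary symmetric eigenproblem."""
    _, s, Vt = np.linalg.svd(C)
    rank = int(np.sum(s > 1e-12 * s[0]))
    Z = Vt[rank:].T                 # columns: orthonormal null-space basis
    return np.linalg.eigvalsh(Z.T @ A @ Z)
```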

615 citations

Book
01 Mar 1990
TL;DR: A starting point; Formal problems in linear algebra; The singular-value decomposition and its use to solve least-squares problems; Handling larger problems; Some comments on the formation of the cross-product matrix AᵀA.
Abstract: Contents: A starting point; Formal problems in linear algebra; The singular-value decomposition and its use to solve least-squares problems; Handling larger problems; Some comments on the formation of the cross-product matrix AᵀA; Linear equations: a direct approach; The Choleski decomposition; The symmetric positive definite matrix again; The generalized algebraic eigenvalue problem; Real symmetric matrices; The generalized symmetric matrix eigenvalue problem; Optimization and nonlinear equations; One-dimensional problems; Direct search methods; Descent to a minimum I: variable metric algorithms; Descent to a minimum II: conjugate gradients; Minimizing a nonlinear sum of squares; Leftovers; The conjugate gradients method applied to problems in linear algebra; Appendices; Bibliography; Index
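The generalized symmetric matrix eigenvalue problem listed in the contents is classically reduced to a standard one via the Choleski decomposition, also in the contents; a minimal sketch of that reduction (A x = λ B x with A symmetric and B symmetric positive definite):

```python
import numpy as np

def generalized_symmetric_eig(A, B):
    """Solve A x = lam B x (A symmetric, B symmetric positive definite)
    via the Choleski factor B = L L^T: the standard symmetric problem
    C y = lam y with C = L^{-1} A L^{-T}, then x = L^{-T} y."""
    L = np.linalg.cholesky(B)
    Linv = np.linalg.inv(L)
    C = Linv @ A @ Linv.T
    lam, Y = np.linalg.eigh(C)
    X = np.linalg.solve(L.T, Y)     # back-transform the eigenvectors
    return lam, X
```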

451 citations

Journal ArticleDOI
TL;DR: It is proved that the generalized eigenequations on the optimal discriminant plane are stable with respect to the eigenvalues, and that the generalized eigenvectors are indeed the optimal discriminant directions, provided the perturbation satisfies certain conditions.
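For context (an illustration added here, not the paper's perturbation analysis): the optimal discriminant directions in such work are generalized eigenvectors of the between-class and within-class scatter matrices, S_b w = λ S_w w. A minimal two-class Fisher sketch, where the leading generalized eigenvector reduces to w ∝ S_w⁻¹(m1 − m2):

```python
import numpy as np

def fisher_direction(X1, X2):
    """Two-class Fisher discriminant direction: the leading generalized
    eigenvector of S_b w = lam S_w w, computed as S_w^{-1} (m1 - m2)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = (X1 - m1).T @ (X1 - m1)    # within-class scatter, class 1
    S2 = (X2 - m2).T @ (X2 - m2)    # within-class scatter, class 2
    w = np.linalg.solve(S1 + S2, m1 - m2)
    return w / np.linalg.norm(w)
```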

267 citations


Network Information
Related Topics (5)
Eigenvalues and eigenvectors
51.7K papers, 1.1M citations
75% related
Matrix (mathematics)
105.5K papers, 1.9M citations
74% related
Differential equation
88K papers, 2M citations
73% related
Iterative method
48.8K papers, 1.2M citations
73% related
Rate of convergence
31.2K papers, 795.3K citations
73% related
Performance
Metrics
No. of papers in the topic in previous years
Year   Papers
2023   2
2022   5
2021   12
2020   22
2019   6
2018   20