Journal · ISSN: 0196-5204
SIAM Journal on Scientific and Statistical Computing
Society for Industrial and Applied Mathematics
About: SIAM Journal on Scientific and Statistical Computing is an academic journal. It publishes primarily in the area(s): Boundary value problem & Matrix (mathematics). Over its lifetime, the journal has published 857 papers, which have received 72,701 citations.
Topics: Boundary value problem, Matrix (mathematics), Iterative method, Nonlinear system, Partial differential equation
Papers
TL;DR: An iterative method for solving linear systems that minimizes, at every step, the norm of the residual vector over a Krylov subspace.
Abstract: We present an iterative method for solving linear systems, which has the property of minimizing at every step the norm of the residual vector over a Krylov subspace. The algorithm is derived from t...
10,907 citations
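The property this entry summarizes, minimizing the residual norm over a Krylov subspace at every step, is the defining idea of GMRES. A minimal full-recurrence sketch in NumPy follows; this is illustrative only (production codes restart after m steps and update a QR factorization of the Hessenberg matrix with Givens rotations instead of re-solving a least-squares problem each step), and all names here are our own, not from the paper:

```python
import numpy as np

def gmres_full(A, b, tol=1e-10, max_iter=50):
    """Full (unrestarted) GMRES sketch: at step k, minimize ||b - A x||
    over the k-dimensional Krylov subspace span{b, Ab, ..., A^(k-1) b}."""
    n = len(b)
    Q = np.zeros((n, max_iter + 1))         # orthonormal Krylov basis (Arnoldi)
    H = np.zeros((max_iter + 1, max_iter))  # upper Hessenberg projection of A
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta                      # initial guess x0 = 0, so r0 = b
    x = np.zeros(n)
    for k in range(max_iter):
        # Arnoldi step: orthogonalize A q_k against the current basis
        v = A @ Q[:, k]
        for j in range(k + 1):
            H[j, k] = Q[:, j] @ v
            v -= H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(v)
        if H[k + 1, k] > 1e-14:
            Q[:, k + 1] = v / H[k + 1, k]
        # The small least-squares problem min_y ||beta*e1 - H y|| yields the
        # iterate of minimal residual norm over the current Krylov subspace.
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)[0]
        x = Q[:, :k + 1] @ y
        if np.linalg.norm(b - A @ x) < tol:
            break
    return x
```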
TL;DR: A new variant of Bi-CG, named Bi-CGSTAB, avoids the irregular convergence of CG-S (where rounding errors can cause severe cancellation effects in the solution); numerical experiments indicate that Bi-CGSTAB is often much more efficient than CG-S.
Abstract: Recently the Conjugate Gradients-Squared (CG-S) method has been proposed as an attractive variant of the Bi-Conjugate Gradients (Bi-CG) method. However, it has been observed that CG-S may lead to a rather irregular convergence behaviour, so that in some cases rounding errors can even result in severe cancellation effects in the solution. In this paper, another variant of Bi-CG is proposed which does not seem to suffer from these negative effects. Numerical experiments indicate also that the new variant, named Bi-CGSTAB, is often much more efficient than CG-S.
4,722 citations
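Bi-CGSTAB itself is short enough to sketch directly. Below is a textbook unpreconditioned version in NumPy, assuming a well-conditioned system; it omits the breakdown safeguards (near-zero rho or omega) a robust implementation would need:

```python
import numpy as np

def bicgstab(A, b, tol=1e-10, max_iter=500):
    """Textbook unpreconditioned Bi-CGSTAB sketch (no breakdown safeguards)."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x                  # initial residual (x0 = 0)
    r_hat = r.copy()               # fixed shadow residual
    rho = alpha = omega = 1.0
    p = np.zeros(n)
    v = np.zeros(n)
    for _ in range(max_iter):
        rho_new = r_hat @ r
        beta = (rho_new / rho) * (alpha / omega)
        rho = rho_new
        p = r + beta * (p - omega * v)
        v = A @ p
        alpha = rho / (r_hat @ v)
        s = r - alpha * v          # intermediate residual after the Bi-CG step
        t = A @ s
        omega = (t @ s) / (t @ t)  # local residual-minimizing ("stabilizing") step
        x = x + alpha * p + omega * s
        r = s - omega * t
        if np.linalg.norm(r) < tol:
            break
    return x
```

The omega step is the "STAB" part: instead of squaring the Bi-CG polynomial as CG-S does, each half-iteration applies a one-dimensional residual minimization, which is what smooths the convergence.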
TL;DR: The use of partial least squares (PLS) for handling collinearities among the independent variables X in multiple regression is discussed; successive estimates are obtained using the residuals from the previous rank as a new dependent variable y.
Abstract: The use of partial least squares (PLS) for handling collinearities among the independent variables X in multiple regression is discussed. Consecutive estimates (rank 1, 2, …) are obtained using the residuals from the previous rank as a new dependent variable y. The PLS method is equivalent to the conjugate gradient method used in numerical analysis for related problems. To estimate the "optimal" rank, cross validation is used. Jackknife estimates of the standard errors are thereby obtained with no extra computation. The PLS method is compared with ridge regression and principal components regression on a chemical example of modelling the relation between the measured biological activity and variables describing the chemical structure of a set of substituted phenethylamines.
2,290 citations
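The rank-by-rank scheme the abstract describes, where each component is fit to the residuals left by the previous rank, can be sketched as a NIPALS-style PLS1 loop. All function and variable names below are our own illustration, not from the paper:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """NIPALS-style PLS1 sketch: each component is extracted from the
    deflated (residual) X and y of the previous rank."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean        # centered working residuals
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr                      # weight: direction of covariance with y
        w = w / np.linalg.norm(w)
        t = Xr @ w                         # score vector
        tt = t @ t
        p = Xr.T @ t / tt                  # X loading
        c = (yr @ t) / tt                  # y loading
        Xr = Xr - np.outer(t, p)           # deflate: the residuals become
        yr = yr - c * t                    # the next rank's data
        W.append(w)
        P.append(p)
        q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    # regression coefficients on centered X: B = W (P^T W)^{-1} q
    B = W @ np.linalg.solve(P.T @ W, q)
    return B, x_mean, y_mean

# prediction for new data: y_hat = (X_new - x_mean) @ B + y_mean
```

With the number of components equal to the rank of X, this reduces to ordinary least squares; the point of PLS is that a smaller rank, chosen here by cross validation, gives stabler estimates under collinearity.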
TL;DR: VODE is a new initial value ODE solver for stiff and nonstiff systems that uses variable-coefficient Adams-Moulton and Backward Differentiation Formula methods in Nordsieck form, treating the Jacobian as full or banded.
Abstract: VODE is a new initial value ODE solver for stiff and nonstiff systems. It uses variable-coefficient Adams-Moulton and Backward Differentiation Formula (BDF) methods in Nordsieck form, as taken from the older solvers EPISODE and EPISODEB, treating the Jacobian as full or banded. Unlike the older codes, VODE has a highly flexible user interface that is nearly identical to that of the ODEPACK solver LSODE. In the process, several algorithmic improvements have been made in VODE, aside from the new user interface. First, a change in stepsize and/or order that is decided upon at the end of one successful step is not implemented until the start of the next step, so that interpolations performed between steps use the more correct data. Second, a new algorithm for setting the initial stepsize has been included, which iterates briefly to estimate the required second derivative vector. Efficiency is often greatly enhanced by an added algorithm for saving and reusing the Jacobian matrix J, as it occurs in the Newton m...
1,601 citations
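VODE's variable-coefficient, variable-order BDF machinery is far too large to reproduce here, but the inner structure of any stiff BDF code, an implicit step solved by Newton iteration with the ODE Jacobian, can be illustrated with the order-1 BDF (backward Euler). This is a fixed-stepsize teaching sketch, not VODE; in particular it re-forms and re-factors the Jacobian every Newton iteration, exactly the cost VODE's J-saving strategy avoids:

```python
import numpy as np

def backward_euler(f, jac, t0, y0, t_end, h):
    """Order-1 BDF (backward Euler) sketch: each step solves the implicit
    equation y_new = y + h*f(t_new, y_new) by Newton iteration, using the
    ODE Jacobian J = df/dy, as stiff BDF codes do at higher orders."""
    t = t0
    y = np.atleast_1d(np.asarray(y0, dtype=float))
    n = y.size
    while t < t_end - 1e-12:
        h_step = min(h, t_end - t)         # shorten the last step to hit t_end
        t_new = t + h_step
        y_new = y.copy()                   # predictor: previous value
        for _ in range(10):                # Newton iteration on F(y_new) = 0
            F = y_new - y - h_step * f(t_new, y_new)
            J = np.eye(n) - h_step * jac(t_new, y_new)
            dy = np.linalg.solve(J, -F)
            y_new = y_new + dy
            if np.linalg.norm(dy) < 1e-12:
                break
        t, y = t_new, y_new
    return y
```

Because the Newton matrix I - h*J is solved at every step, stiff problems (where explicit methods would need a tiny h for stability) can be integrated with step sizes set by accuracy alone, which is the economy VODE builds on.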
TL;DR: An algorithm for the problem of minimizing a quadratic function subject to an ellipsoidal constraint is proposed and it is shown that this algorithm is guaranteed to produce a nearly optimal solution in a finite number of iterations.
Abstract: We propose an algorithm for the problem of minimizing a quadratic function subject to an ellipsoidal constraint and show that this algorithm is guaranteed to produce a nearly optimal solution in a finite number of iterations. We also consider the use of this algorithm in a trust region Newton's method. In particular, we prove that under reasonable assumptions the sequence generated by Newton's method has a limit point which satisfies the first and second order necessary conditions for a minimizer of the objective function. Numerical results for GQTPAR, which is a Fortran implementation of our algorithm, show that GQTPAR is quite successful in a trust region method. In our tests a call to GQTPAR only required 1.6 iterations on the average.
1,434 citations
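The subproblem treated here, minimizing g'x + (1/2) x'Bx subject to ||x|| <= delta, has a solution characterized by (B + lam*I) x = -g with lam >= 0 and lam*(delta - ||x||) = 0. A simplified dense sketch follows, using an eigendecomposition and Newton iteration on the secular equation 1/||x(lam)|| = 1/delta; unlike GQTPAR, it does not handle the "hard case" (g orthogonal to the smallest eigenvector), and all names are illustrative:

```python
import numpy as np

def trust_region_subproblem(B, g, delta, tol=1e-10, max_iter=100):
    """Sketch of min g@x + 0.5*x@B@x subject to ||x|| <= delta for small
    dense symmetric B. Finds lam >= 0 with (B + lam*I) x = -g and
    ||x(lam)|| = delta by Newton iteration on 1/||x(lam)|| - 1/delta = 0,
    the nearly linear reformulation that makes the iteration fast."""
    evals, V = np.linalg.eigh(B)
    gt = V.T @ g                          # g in the eigenbasis of B
    # interior solution: B positive definite and the Newton step fits
    if evals[0] > 0:
        x = -gt / evals
        if np.linalg.norm(x) <= delta:
            return V @ x
    lam_lo = max(0.0, -evals[0])          # keep B + lam*I positive definite
    lam = lam_lo + 1e-8
    for _ in range(max_iter):
        d = evals + lam
        nrm = np.linalg.norm(gt / d)      # ||x(lam)||
        phi = 1.0 / nrm - 1.0 / delta
        if abs(phi) < tol:
            break
        dphi = np.sum(gt**2 / d**3) / nrm**3   # phi'(lam)
        lam = max(lam - phi / dphi, lam_lo + 1e-12)
    return V @ (-gt / (evals + lam))
```

The "1.6 iterations on the average" figure in the abstract refers to this kind of lam-iteration: because 1/||x(lam)|| is nearly linear in lam, very few Newton steps are needed per trust region solve.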