Iterative Methods for Optimization
Citations
1,652 citations
1,436 citations
Cites methods from "Iterative Methods for Optimization"
...Recently, Arnold in his Ph.D. thesis (Arnold, 2001) extensively tested numerous optimization methods under noise, including: (1) the direct pattern search algorithm of Hooke and Jeeves (Hooke and Jeeves, 1961), (2) the simplex method of Nelder and Mead (Nelder and Mead, 1965), (3) the multi-directional search algorithm of Torczon (Torczon, 1989), (4) the implicit filtering algorithm of Gilmore and Kelley (Gilmore and Kelley, 1995; Kelley, 1999) that is based on explicitly approximating the local gradient of the objective function by means of finite differencing, (5) the simultaneous perturbation stochastic approximation algorithm due to Spall (Spall, 1992; Spall, 1998a; Spall, 1998b), (6) the evolutionary gradient search algorithm of Salomon (Salomon, 1998), (7) the evolution strategy with cumulative mutation strength adaptation mechanism by Hansen and Ostermeier (Hansen, 1998; Hansen and Ostermeier, 2001)....
[...]
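The simultaneous perturbation idea in item (5) above is compact enough to sketch: every coordinate is perturbed at once by a random ±1 vector, so the gradient estimate costs two function evaluations per step regardless of dimension. The sketch below is a minimal illustration, not Spall's full scheme; the test function and the gain sequences a and c are illustrative choices.

```python
import numpy as np

def spsa_step(f, x, a, c, rng):
    # Perturb every coordinate at once with a random +/-1 (Rademacher) vector,
    # so the gradient estimate needs only two evaluations of f in any dimension.
    delta = rng.choice([-1.0, 1.0], size=x.shape)
    g_hat = (f(x + c * delta) - f(x - c * delta)) / (2.0 * c * delta)
    return x - a * g_hat

# Usage: minimize a quadratic observed under small additive noise.
rng = np.random.default_rng(0)
def f(x):
    return float(x @ x) + 0.01 * rng.standard_normal()

x = np.array([2.0, -1.0])
for k in range(1, 501):
    # Illustrative decaying gain sequences; Spall's papers give tuned exponents.
    x = spsa_step(f, x, a=0.2 / k, c=0.1 / k ** 0.25, rng=rng)
print(x)  # near the minimizer at the origin
```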
1,183 citations
Cites background from "Iterative Methods for Optimization"
...Recent works on the subject have led to significant progress by providing convergence proofs [5,9,31,34,76,80, 85,88,134], incorporating the use of surrogate models [22,24,127,131], and offering the first textbook that is exclusively devoted to this topic [35]....
[...]
901 citations
853 citations
Cites background or methods from "Iterative Methods for Optimization"
...be judged. In the special case of bound-constrained optimization, gradient-projection methods [3,4,24,32,38] or coordinate descent methods [8,23,30,33,42] can be effective....
[...]
...– if J = N and P is given by (2), then d is a scaled gradient-projection direction for bound-constrained minimization [4,24,38,41]; – if f is quadratic and we choose H = ∇²f(x), then d is a (block) coordinate descent direction [4,41,49,54,55]....
[...]
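The gradient-projection direction mentioned in these excerpts takes a particularly simple form for box constraints, where the projection P is componentwise clipping. A minimal sketch; the step length alpha and the test problem are illustrative, not taken from the cited papers:

```python
import numpy as np

def projected_gradient_step(grad_f, x, lower, upper, alpha):
    # For box constraints the projection P is componentwise clipping, so the
    # gradient-projection direction is d = P(x - alpha * grad f(x)) - x.
    d = np.clip(x - alpha * grad_f(x), lower, upper) - x
    return x + d

# Usage on f(x) = 0.5 * ||x - t||^2 with a target t partly outside the box.
t = np.array([2.0, -3.0, 0.5])
grad_f = lambda x: x - t
x = np.zeros(3)
for _ in range(50):
    x = projected_gradient_step(grad_f, x, lower=-1.0, upper=1.0, alpha=0.5)
print(x)  # converges to clip(t, -1, 1) = [1.0, -1.0, 0.5]
```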
References
41,772 citations
[...]
34,729 citations
"Iterative Methods for Optimization" refers background or methods in this paper
...32), [115], [116], [127]....
[...]
...The minimum norm solution can be expressed in terms of the singular value decomposition [127], [249] of A, A = UΣVᵀ....
[...]
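Written out, the minimum-norm solution inverts only the singular values above a small tolerance. A short sketch assuming NumPy; the matrix, right-hand side, and tolerance rule are illustrative choices, not prescribed by the text:

```python
import numpy as np

# Minimum-norm least-squares solution from the SVD A = U @ diag(s) @ Vt:
# invert only the singular values above a tolerance, then x = V s^+ U^T b.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rank one, so a plain solve would fail
b = np.array([1.0, 2.0])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
tol = max(A.shape) * np.finfo(float).eps * s[0]
s_inv = np.where(s > tol, 1.0 / s, 0.0)   # pseudo-inverse of the diagonal
x = Vt.T @ (s_inv * (U.T @ b))

assert np.allclose(x, np.linalg.pinv(A) @ b)
print(x)  # [0.2, 0.4], the shortest of the infinitely many solutions
```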
...11) by computing the Cholesky factorization [249], [127] of H, H = LLᵀ,...
[...]
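The Cholesky route solves the symmetric positive definite system with two triangular solves rather than an explicit inverse. A minimal sketch assuming SciPy; H and g are illustrative data:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Solve H s = -g through the factorization H = L L^T: two triangular
# solves instead of forming an inverse, and the factorization doubles
# as a test that H is symmetric positive definite.
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])
g = np.array([1.0, 2.0])

factor = cho_factor(H)          # raises LinAlgError if H is not SPD
s = cho_solve(factor, -g)
assert np.allclose(H @ s, -g)
print(s)
```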
...We refer the reader to [11], [15], [127], and [154] for more discussion of preconditioners and their construction....
[...]
...The material in texts such as [127] and [264] is sufficient....
[...]
28,888 citations
"Iterative Methods for Optimization" refers methods in this paper
...The Levenberg–Marquardt method [172], [183] addresses these issues by adding a regularization parameter ν > 0 to R′(xc)ᵀR′(xc) to obtain x₊ = xc + s where...
[...]
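The step s implied by this excerpt solves the regularized normal equations (R′(xc)ᵀR′(xc) + νI)s = −R′(xc)ᵀR(xc). A minimal sketch with a fixed ν; practical implementations adjust ν between steps, and the residual and Jacobian below are an illustrative test problem:

```python
import numpy as np

def lm_step(R, J, xc, nu):
    # Solve the regularized normal equations
    #   (J(xc)^T J(xc) + nu * I) s = -J(xc)^T R(xc),
    # which stay solvable even where J is rank deficient.
    Jc, Rc = J(xc), R(xc)
    s = np.linalg.solve(Jc.T @ Jc + nu * np.eye(xc.size), -Jc.T @ Rc)
    return xc + s

# Usage on a Rosenbrock-type residual with root (and minimizer) at (1, 1).
R = lambda x: np.array([x[0] - 1.0, 10.0 * (x[1] - x[0] ** 2)])
J = lambda x: np.array([[1.0, 0.0], [-20.0 * x[0], 10.0]])
x = np.array([-1.2, 1.0])
for _ in range(30):
    x = lm_step(R, J, x, nu=1e-3)
print(x)  # close to (1, 1)
```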
27,271 citations
"Iterative Methods for Optimization" refers background or methods in this paper
...Description and Implementation: The Nelder–Mead [204] simplex algorithm maintains a simplex S of approximations to an optimal point....
[...]
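The algorithm described in this excerpt is available off the shelf; for instance, SciPy's implementation maintains the simplex of n+1 points internally and needs only function values, never derivatives. A brief usage sketch on an illustrative test function:

```python
import numpy as np
from scipy.optimize import minimize

# Nelder-Mead needs only evaluations of f, making it a natural choice
# when gradients are unavailable or unreliable.
f = lambda x: (x[0] - 1.0) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

res = minimize(f, x0=np.array([-1.2, 1.0]), method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 2000})
print(res.x)  # close to the minimizer (1, 1)
```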
...We will focus on three such methods: the Nelder–Mead simplex algorithm [204], the multidirectional search method [85], [261], [262], and the Hooke–Jeeves algorithm [145]....
[...]
...The ideas in this section were originally used in [155] to analyze the Nelder–Mead [204] algorithm, which we discuss in §8....
[...]
...However [204], the ability of the Nelder–Mead simplices to drastically vary their shape is an important feature of the algorithm and looking at the simplex condition alone would lead to poor results....
[...]
...Even though some of the methods, such as the Nelder–Mead [204] and Hooke–Jeeves [145] algorithms, are classic, most of the convergence analysis in this part of the book was done after 1990....
[...]