Open Access · Journal Article · DOI

Multivariate stochastic approximation using a simultaneous perturbation gradient approximation

James C. Spall
- 01 Mar 1992 -
- IEEE Transactions on Automatic Control, Vol. 37, Iss. 3, pp. 332-341
TLDR
The paper presents an SA algorithm based on a simultaneous perturbation gradient approximation instead of the standard finite-difference approximation of Kiefer-Wolfowitz type procedures; the new algorithm can be significantly more efficient than standard algorithms in large-dimensional problems.
Abstract
The problem of finding a root of the multivariate gradient equation that arises in function minimization is considered. When only noisy measurements of the function are available, a stochastic approximation (SA) algorithm of the general Kiefer-Wolfowitz type is appropriate for estimating the root. The paper presents an SA algorithm that is based on a simultaneous perturbation gradient approximation instead of the standard finite-difference approximation of Kiefer-Wolfowitz type procedures. Theory and numerical experience indicate that the algorithm can be significantly more efficient than the standard algorithms in large-dimensional problems.
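To make the gradient approximation concrete, here is a minimal Python sketch of simultaneous perturbation SA. The function names, gain constants, and the ±1 Bernoulli perturbation distribution are illustrative assumptions (the exponents 0.602 and 0.101 follow common later implementation guidance rather than this paper); the essential point is that each iteration uses only two noisy loss measurements, regardless of the dimension of the parameter vector.

```python
import numpy as np

def spsa_minimize(loss, theta0, n_iter=1000, a=0.1, c=0.1,
                  A=100.0, alpha=0.602, gamma=0.101, rng=None):
    """Minimal SPSA sketch: two noisy loss measurements per iteration,
    regardless of the dimension of the parameter vector."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.array(theta0, dtype=float)
    for k in range(n_iter):
        ak = a / (k + 1 + A) ** alpha                      # step-size gain
        ck = c / (k + 1) ** gamma                          # perturbation gain
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # symmetric Bernoulli +/-1
        y_plus = loss(theta + ck * delta)                  # one noisy measurement ...
        y_minus = loss(theta - ck * delta)                 # ... and the other
        ghat = (y_plus - y_minus) / (2.0 * ck * delta)     # simultaneous perturbation gradient estimate
        theta -= ak * ghat                                 # SA step toward a root of the gradient
    return theta

# Illustrative use on a noisy 10-dimensional quadratic:
rng = np.random.default_rng(0)
noisy_loss = lambda th: float(np.sum(th ** 2) + 0.01 * rng.standard_normal())
print(spsa_minimize(noisy_loss, theta0=np.ones(10), rng=rng))
```

By contrast, a finite-difference Kiefer-Wolfowitz approximation would need 2p measurements per iteration for a p-dimensional parameter, which is the source of the efficiency gain in large-dimensional problems.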

Citations
Journal Article · DOI

A stochastic simplex approximate gradient (StoSAG) for optimization under uncertainty

TL;DR: In this article, the stochastic simplex approximate gradient (StoSAG) algorithm is proposed, which estimates an approximate gradient of the objective from an ensemble of randomly chosen control vectors.
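A hedged Python sketch of the ensemble (simplex) gradient idea follows. The function and parameter names are hypothetical, and this simplified version evaluates a single deterministic objective; the actual StoSAG formulation additionally pairs each perturbed control vector with a model realization to handle uncertainty.

```python
import numpy as np

def ensemble_gradient(J, u, n_ens=20, sigma=0.05, rng=None):
    """Sketch of a stochastic simplex gradient: perturb the 1-D control
    vector u with an ensemble of random samples and solve
    dU @ g ~= dJ for g in the least-squares sense."""
    rng = np.random.default_rng() if rng is None else rng
    dU = sigma * rng.standard_normal((n_ens, u.size))  # ensemble of control perturbations
    dJ = np.array([J(u + du) for du in dU]) - J(u)     # corresponding objective changes
    g, *_ = np.linalg.lstsq(dU, dJ, rcond=None)        # least-squares gradient estimate
    return g
```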
Journal Article · DOI

An analog VLSI recurrent neural network learning a continuous-time trajectory

TL;DR: This work presents an alternative implementation in analog VLSI, which employs a stochastic perturbation algorithm to observe the gradient of the error index directly on the network in random directions of the parameter space, thereby avoiding the tedious task of deriving the gradient from an explicit model of the network dynamics.
Journal Article · DOI

Comparative study of stochastic algorithms for system optimization based on gradient approximations

TL;DR: In this paper, the authors studied three types of stochastic approximation algorithms in a multivariate Kiefer-Wolfowitz setting that uses only noisy measurements of the loss function (i.e., no loss function gradient measurements).
Journal Article · DOI

Sequential minimal optimization for quantum-classical hybrid algorithms

TL;DR: A sequential minimal optimization method for quantum-classical hybrid algorithms is proposed; it converges faster, is robust against statistical error, and is hyperparameter-free, substantially outperforming existing optimization algorithms and converging to a solution almost independent of the initial choice of parameters.

Monkey Algorithm for Global Numerical Optimization

TL;DR: The computational results show that the monkey algorithm can find optimal or near-optimal solutions to problems with large dimensions and very large numbers of local optima.
References
Journal Article · DOI

Stochastic Estimation of the Maximum of a Regression Function

TL;DR: In this article, the authors give a scheme whereby, starting from an arbitrary point $x_1$, one obtains successively $x_2, x_3, \cdots$ such that $x_n$ converges in probability, as $n \rightarrow \infty$, to the point at which the regression function attains its maximum.
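For context, the recursion in this reference takes the standard one-dimensional Kiefer-Wolfowitz form (a sketch of the usual statement of the procedure; some presentations place $2c_n$ in the denominator):

$$x_{n+1} = x_n + a_n \, \frac{y(x_n + c_n) - y(x_n - c_n)}{c_n},$$

where $y(\cdot)$ denotes a noisy measurement of the regression function and the gain sequences typically satisfy $c_n \rightarrow 0$, $\sum a_n = \infty$, $\sum a_n c_n < \infty$, and $\sum a_n^2 c_n^{-2} < \infty$.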
Journal Article · DOI

Multidimensional Stochastic Approximation Methods

TL;DR: In this paper, a multidimensional stochastic approximation scheme is presented, and conditions are given for these schemes to converge a.s. to the solutions of $k$ stochastic equations in $k$ unknowns.
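The scheme can be sketched in the standard multidimensional Robbins-Monro form (notation assumed here, not taken from the paper):

$$\mathbf{x}_{n+1} = \mathbf{x}_n - a_n \mathbf{Y}_n, \qquad E[\mathbf{Y}_n \mid \mathbf{x}_n] = \mathbf{f}(\mathbf{x}_n),$$

with $\sum a_n = \infty$ and $\sum a_n^2 < \infty$, so that $\mathbf{x}_n$ converges a.s. to the root of $\mathbf{f}(\mathbf{x}) = \mathbf{0}$ under suitable regularity conditions.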
Journal Article · DOI

Accelerated Stochastic Approximation

TL;DR: In this article, the Robbins-Monro procedure and the Kiefer-Wolfowitz procedure are considered, for which the magnitude of the $n$th step depends on the number of changes of sign in $(X_i - X_{i-1})$ for $i = 2, \cdots, n$.
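Kesten's acceleration can be paraphrased as follows (a hedged restatement, not the paper's exact notation): the gain sequence is advanced only when successive increments change sign,

$$a(n) = a_{s(n)}, \qquad s(n) = 1 + \sum_{i=3}^{n} \mathbf{1}\{(X_i - X_{i-1})(X_{i-1} - X_{i-2}) < 0\},$$

so the step size stays large while the iterates move monotonically toward the root and shrinks once they begin to oscillate around it.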