Journal ArticleDOI
Asymptotic global behavior for stochastic approximation and diffusions with slowly decreasing noise effects: Global minimization via Monte Carlo
TL;DR: The authors study the asymptotic behavior of systems in which objective-function values can only be sampled via Monte Carlo, so that the discrete algorithm is a combination of stochastic approximation and simulated annealing.
Abstract:
The asymptotic behavior of the systems $X_{n+1} = X_n + a_n b(X_n, \xi_n) + a_n \sigma(X_n)\psi_n$ and $dy = \bar b(y)\,dt + \sqrt{a(t)}\,\sigma(y)\,dw$ is studied, where $\{\psi_n\}$ is i.i.d. Gaussian, $\{\xi_n\}$ is a (correlated) bounded sequence of random variables, and $a_n \approx A_0/\log(A_1 + n)$. Without $\{\xi_n\}$, such algorithms are versions of the "simulated annealing" method for global optimization. When the objective function values can only be sampled via Monte Carlo, the discrete algorithm is a combination of stochastic approximation and simulated annealing. Our forms are appropriate. The $\{\psi_n\}$ are the "annealing" variables, and $\{\xi_n\}$ is the sampling noise. For large $A_0$, a full asymptotic analysis is presented via the theory of large deviations: mean escape time (after arbitrary time $n$) from neighborhoods of stable sets of the algorithm, mean transition times (after arbitrary time $n$) from a neighborhood of one stable set to another, a…
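The discrete recursion in the abstract can be sketched in a few lines. This is an illustrative toy, not the paper's experimental setup: the quartic objective, the constants $A_0$, $A_1$, and the noise scales below are all assumptions chosen to make the dynamics visible.

```python
import math
import random

def annealed_step(x, n, b, sigma, A0=0.2, A1=10.0):
    """One step of X_{n+1} = X_n + a_n * b(X_n, xi_n) + a_n * sigma(X_n) * psi_n,
    with the slowly decreasing gain a_n = A0 / log(A1 + n)."""
    a_n = A0 / math.log(A1 + n)
    psi_n = random.gauss(0.0, 1.0)   # i.i.d. Gaussian "annealing" variable
    return x + a_n * b(x) + a_n * sigma(x) * psi_n

# Illustrative drift: a noisy negative gradient of U(x) = (x^2 - 1)^2,
# where the bounded uniform term plays the role of the sampling noise xi_n.
def noisy_neg_grad(x):
    xi_n = random.uniform(-0.1, 0.1)
    return -4.0 * x * (x * x - 1.0) + xi_n

x = 0.0
for n in range(1, 20000):
    x = annealed_step(x, n, noisy_neg_grad, sigma=lambda _: 0.3)
# x typically settles near one of the minima of U at +/-1
```

Because the gain $a_n$ decays only logarithmically, the injected noise $a_n \sigma(X_n)\psi_n$ shrinks slowly enough to let the iterate escape shallow local minima early on, which is exactly the annealing mechanism the abstract describes.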
Citations
Journal ArticleDOI
Sparse calibration of subsurface flow models using nonlinear orthogonal matching pursuit and an iterative stochastic ensemble method
TL;DR: The proposed NOMP, a nonlinear orthogonal matching pursuit for sparse calibration of subsurface flow models, is the first ensemble-based algorithm that tackles the sparse nonlinear parameter-estimation problem.
Journal ArticleDOI
Chemical distance geometry: Current realization and future projection
TL;DR: In this article, the authors present a review of the state of the art in distance geometry and molecular conformation in chemical applications, with a focus on the problem of determining macromolecular conformation.
Proceedings ArticleDOI
Efficient global optimization using SPSA
J.L. Maryak, D.C. Chin, et al.
TL;DR: It is argued that, in some cases, the naturally occurring error in the SPSA gradient approximation effectively introduces injected noise that promotes convergence of the algorithm to a global optimum (obviating the necessity for injecting extra noise).
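The SPSA gradient approximation mentioned here perturbs all coordinates at once along a single random ±1 direction, so each estimate costs only two loss evaluations regardless of dimension; the residual error in the estimate acts like injected noise. A minimal sketch (the quadratic loss is a hypothetical example, not from the cited work):

```python
import random

def spsa_gradient(f, x, c=0.1):
    """Simultaneous-perturbation gradient estimate: one random +/-1
    direction delta, two function evaluations in total."""
    delta = [random.choice((-1.0, 1.0)) for _ in x]
    x_plus = [xi + c * di for xi, di in zip(x, delta)]
    x_minus = [xi - c * di for xi, di in zip(x, delta)]
    diff = (f(x_plus) - f(x_minus)) / (2.0 * c)
    return [diff / di for di in delta]   # unbiased to O(c^2) for smooth f

# For a 1-D quadratic the estimate is exact: f(x) = x^2 has gradient 6 at x = 3,
# whichever random sign delta takes.
g = spsa_gradient(lambda v: v[0] ** 2, [3.0])
```

In higher dimensions the per-coordinate estimate is noisy but unbiased (to second order in c), and it is this approximation noise that the cited paper argues can promote convergence to a global optimum.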
Proceedings ArticleDOI
Weight Space Probability Densities in Stochastic Learning: II. Transients and Basin Hopping Times
Genevieve Orr,Todd K. Leen +1 more
TL;DR: Theoretical predictions of the time required for noise-induced hopping between basins of different optima are compared with simulations of large ensembles of networks for simple problems in supervised and unsupervised learning.
Adaptive Online Learning of Bayesian Network Parameters
TL;DR: The paper establishes convergence properties of Voting EM with a constant learning rate and uses them to formulate an error-driven scheme for adapting the learning rate.
References
Journal ArticleDOI
Optimization by Simulated Annealing
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
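The annealing analogy can be made concrete with a minimal Metropolis-style loop: accept every downhill move, accept uphill moves with probability exp(-ΔU/T), and cool T slowly. This is a generic sketch under assumed constants (the double-well objective, proposal width, and cooling schedule are illustrative), not the procedure of any cited paper.

```python
import math
import random

def simulated_annealing(U, x0, steps=5000, T0=1.0):
    """Metropolis-style annealing on a 1-D objective U: uphill moves are
    accepted with probability exp(-dU / T), and T cools logarithmically."""
    x, best = x0, x0
    for k in range(1, steps + 1):
        T = T0 / math.log(1.0 + k)          # slow, logarithmic cooling
        cand = x + random.gauss(0.0, 0.5)   # Gaussian proposal
        dU = U(cand) - U(x)
        if dU <= 0 or random.random() < math.exp(-dU / T):
            x = cand
        if U(x) < U(best):                  # track the best point visited
            best = x
    return best

# Hypothetical double-well objective whose right-hand well is slightly deeper.
U = lambda x: (x * x - 1.0) ** 2 + 0.05 * (x - 2.0) ** 2
best = simulated_annealing(U, x0=-1.0)
```

The high early temperature lets the chain cross the barrier between wells; as T falls, uphill moves become rare and the iterate freezes near a minimum.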
Journal ArticleDOI
Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
Stuart Geman, Donald Geman, et al.
TL;DR: The analogy between images and statistical-mechanics systems is made, and the analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, creating a highly parallel "relaxation" algorithm for MAP estimation.
Book
Large Deviations and Applications
TL;DR: The large-deviation problem for empirical distributions of Markov processes is studied and applied to the Wiener sausage problem.
Journal ArticleDOI
Diffusions for global optimization
Stuart Geman, Chii-Ruey Hwang, et al.
TL;DR: In this article, the authors seek a global minimum of $U:[0,1]^n \to \mathbb{R}$ via an annealing-type diffusion driven by Brownian motion.
Journal ArticleDOI
The averaging principle and theorems on large deviations
TL;DR: In this article, the averaging principle for stochastic differential equations is used to describe the behavior of the system over large time intervals, and the probability of large deviations from the averaged system is analyzed.