
Topic

Maxima and minima

About: Maxima and minima is a research topic. Over its lifetime, 6,170 publications have been published within this topic, receiving 170,062 citations. The topic is also known as: minima and maxima & variable's extreme values.


Papers
Journal ArticleDOI
Abstract: Linear registration and motion correction are important components of structural and functional brain image analysis. Most modern methods optimize some intensity-based cost function to determine the best registration. To date, little attention has been focused on the optimization method itself, even though the success of most registration methods hinges on the quality of this optimization. This paper examines the optimization process in detail and demonstrates that the commonly used multiresolution local optimization methods can, and do, get trapped in local minima. To address this problem, two approaches are taken: (1) to apodize the cost function and (2) to employ a novel hybrid global-local optimization method. This new optimization method is specifically designed for registering whole brain images. It substantially reduces the likelihood of producing misregistrations due to being trapped by local minima. The increased robustness of the method, compared to other commonly used methods, is demonstrated by a consistency test. In addition, the accuracy of the registration is demonstrated by a series of experiments with motion correction. These motion correction experiments also investigate how the results are affected by different cost functions and interpolation methods.

7,937 citations
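The trapping problem described in this abstract is generic: a purely local optimiser converges to whichever basin its starting point lies in. As a minimal illustration of the global-plus-local idea (not the paper's actual apodisation or hybrid scheme; the function and parameter names here are hypothetical), one can restart a simple local search from many sampled points and keep the best result:

```python
import random

def multistart_local_search(f, dim, bounds, n_starts=20, step=0.1,
                            iters=200, seed=0):
    """Run a coordinate-descent local search from many random starting
    points and keep the best result. A single local search gets trapped
    in the basin it starts in; restarting samples many basins globally."""
    rng = random.Random(seed)
    lo, hi = bounds
    best_x, best_f = None, float("inf")
    for _ in range(n_starts):
        x = [rng.uniform(lo, hi) for _ in range(dim)]
        fx = f(x)
        for _ in range(iters):
            improved = False
            for d in range(dim):
                for delta in (step, -step):
                    cand = x[:]
                    cand[d] += delta
                    fc = f(cand)
                    if fc < fx:
                        x, fx, improved = cand, fc, True
            if not improved:  # local minimum on the step grid
                break
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Hypothetical stand-in cost function; a real registration would minimise
# an intensity-based similarity cost over transformation parameters.
sphere = lambda x: sum(v * v for v in x)
best, best_f = multistart_local_search(sphere, dim=2, bounds=(-10.0, 10.0))
```

In registration practice the restarts are usually combined with a coarse-to-fine multiresolution pyramid, so the global sampling happens on cheap, smoothed versions of the cost.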

Journal ArticleDOI
TL;DR: A powerful method for exploring the properties of the multidimensional free energy surfaces of complex many-body systems by means of coarse-grained non-Markovian dynamics in the space defined by a few collective coordinates is introduced.
Abstract: We introduce a powerful method for exploring the properties of the multidimensional free energy surfaces (FESs) of complex many-body systems by means of coarse-grained non-Markovian dynamics in the space defined by a few collective coordinates. A characteristic feature of these dynamics is the presence of a history-dependent potential term that, in time, fills the minima in the FES, allowing the efficient exploration and accurate determination of the FES as a function of the collective coordinates. We demonstrate the usefulness of this approach in the case of the dissociation of a NaCl molecule in water and in the study of the conformational changes of a dialanine in solution.

3,998 citations
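The mechanism this abstract describes, a history-dependent bias that progressively fills the minima the system visits, can be sketched in one dimension. This is a toy sketch under simplifying assumptions (overdamped Langevin dynamics on a double-well potential standing in for a real free energy surface; all names and parameter values are illustrative), not the authors' implementation:

```python
import math
import random

def metadynamics_1d(grad_f, s0, n_steps=10000, dt=0.01, temp=0.1,
                    hill_height=0.15, hill_width=0.3, deposit_every=50,
                    seed=0):
    """Toy 1-D metadynamics: Langevin dynamics on f(s) plus a
    history-dependent bias built from deposited Gaussian hills. The hills
    gradually fill the minima the walker visits, pushing it over barriers;
    the negated accumulated bias approximates the free energy surface."""
    rng = random.Random(seed)
    s = s0
    hills = []  # centres of deposited Gaussian hills
    sigma = math.sqrt(2.0 * temp * dt)  # thermal noise amplitude
    for step in range(n_steps):
        if step % deposit_every == 0:
            hills.append(s)
        # Repulsive force from the accumulated bias (sum of Gaussians).
        bias_force = sum(
            hill_height * (s - c) / hill_width**2
            * math.exp(-((s - c) ** 2) / (2.0 * hill_width**2))
            for c in hills
        )
        s += dt * (-grad_f(s) + bias_force) + sigma * rng.gauss(0.0, 1.0)
    return hills

# Double-well "FES" f(s) = (s^2 - 1)^2: minima at s = +/-1, barrier at 0.
grad = lambda s: 4.0 * s * (s * s - 1.0)
hills = metadynamics_1d(grad, s0=-1.0)
```

Starting in the left well, the deposited hills first pile up around s = -1 until the walker is pushed over the barrier into the right well, so the hill centres end up spanning both minima.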

Journal ArticleDOI
TL;DR: The current knowledge about numerical instabilities such as checkerboards, mesh-dependence and local minima occurring in applications of the topology optimization method are summarized and the methods with which they can be avoided are listed.
Abstract: In this paper we seek to summarize the current knowledge about numerical instabilities such as checkerboards, mesh-dependence and local minima occurring in applications of the topology optimization method. The checkerboard problem refers to the formation of regions of alternating solid and void elements ordered in a checkerboard-like fashion. The mesh-dependence problem refers to obtaining qualitatively different solutions for different mesh-sizes or discretizations. The local minima problem refers to obtaining different solutions to the same discretized problem when choosing different algorithmic parameters. We review the current knowledge on why and when these problems appear, and we list the methods with which they can be avoided and discuss their advantages and disadvantages.

1,555 citations

Journal ArticleDOI
TL;DR: A new global optimization algorithm for functions of continuous variables is presented, derived from the “Simulated Annealing” algorithm recently introduced in combinatorial optimization, which is quite costly in terms of function evaluations, but its cost can be predicted in advance, depending only slightly on the starting point.
Abstract: A new global optimization algorithm for functions of continuous variables is presented, derived from the “Simulated Annealing” algorithm recently introduced in combinatorial optimization. The algorithm is essentially an iterative random search procedure with adaptive moves along the coordinate directions. It permits uphill moves under the control of a probabilistic criterion, thus tending to avoid the first local minima encountered. The algorithm has been tested against the Nelder and Mead simplex method and against a version of Adaptive Random Search. The test functions were Rosenbrock valleys and multiminima functions in 2, 4, and 10 dimensions. The new method proved to be more reliable than the others, being always able to find the optimum, or at least a point very close to it. It is quite costly in terms of function evaluations, but its cost can be predicted in advance, depending only slightly on the starting point.

1,550 citations
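The acceptance rule this abstract describes, uphill moves permitted under a probabilistic (Metropolis-style) criterion with a decreasing temperature, can be sketched as follows. The function names, cooling schedule, and test function are illustrative assumptions, not the paper's exact algorithm:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=2.0, cooling=0.999,
                        iters=5000, seed=0):
    """Minimise f over continuous variables by random coordinate moves.
    Uphill moves are accepted with probability exp(-delta/T), which lets
    the search escape the first local minima it encounters; T decreases
    geometrically so the walk settles down over time."""
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    best_x, best_f = list(x), fx
    t = t0
    for _ in range(iters):
        # Propose a random move along one coordinate direction.
        i = rng.randrange(len(x))
        cand = list(x)
        cand[i] += rng.uniform(-step, step)
        fc = f(cand)
        delta = fc - fx
        # Always accept downhill moves; accept uphill probabilistically.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f

# A standard multiminima test function (global minimum 0 at the origin).
def rastrigin(x):
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0
               for xi in x)

x, fval = simulated_annealing(rastrigin, [3.0, -3.0])
```

The cost is dominated by the fixed number of function evaluations (one per iteration), which is why, as the abstract notes, it can be predicted in advance almost independently of the starting point.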

Dissertation
01 Jan 2002
TL;DR: This thesis presents a theoretical model that can be used to describe the long-term behaviour of the Particle Swarm Optimiser and results are presented to support the theoretical properties predicted by the various models, using synthetic benchmark functions to investigate specific properties.
Abstract: Many scientific, engineering and economic problems involve the optimisation of a set of parameters. These problems include examples like minimising the losses in a power grid by finding the optimal configuration of the components, or training a neural network to recognise images of people's faces. Numerous optimisation algorithms have been proposed to solve these problems, with varying degrees of success. The Particle Swarm Optimiser (PSO) is a relatively new technique that has been empirically shown to perform well on many of these optimisation problems. This thesis presents a theoretical model that can be used to describe the long-term behaviour of the algorithm. An enhanced version of the Particle Swarm Optimiser is constructed and shown to have guaranteed convergence on local minima. This algorithm is extended further, resulting in an algorithm with guaranteed convergence on global minima. A model for constructing cooperative PSO algorithms is developed, resulting in the introduction of two new PSO-based algorithms. Empirical results are presented to support the theoretical properties predicted by the various models, using synthetic benchmark functions to investigate specific properties. The various PSO-based algorithms are then applied to the task of training neural networks, corroborating the results obtained on the synthetic benchmark functions.

1,463 citations
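The basic global-best PSO update that such convergence analyses model can be sketched as below. This is a generic textbook variant with commonly used inertia and acceleration coefficients, not the thesis's guaranteed-convergence extension; the names and parameter values are illustrative:

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200,
        w=0.7298, c1=1.49618, c2=1.49618, seed=0):
    """Basic global-best Particle Swarm Optimiser. Each particle's
    velocity mixes inertia (w) with attraction toward its personal best
    (c1) and the swarm's global best (c2)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

sphere = lambda x: sum(v * v for v in x)
best, best_f = pso(sphere, dim=5, bounds=(-10.0, 10.0))
```

With w < 1 and these acceleration coefficients the velocities are damped, which is the regime the theoretical long-term-behaviour models predict to converge.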


Network Information
Related Topics (5)

Matrix (mathematics): 105.5K papers, 1.9M citations, 83% related
Nonlinear system: 208.1K papers, 4M citations, 83% related
Scattering: 152.3K papers, 3M citations, 80% related
Cluster analysis: 146.5K papers, 2.9M citations, 80% related
Artificial neural network: 207K papers, 4.5M citations, 79% related
Performance Metrics

No. of papers in the topic in previous years:

Year    Papers
2022    2
2021    247
2020    299
2019    303
2018    261
2017    212