Open Access · Journal Article · DOI

A New Method for Minimising a Sum of Squares without Calculating Gradients

G. Peckham
01 Nov 1970 · Vol. 13, Iss. 4, pp. 418–420
Abstract
A new method for minimising a sum of squares of non-linear functions is described and is shown to be more efficient than other methods in that fewer function values are required.
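The general idea behind such methods can be illustrated with a finite-difference Gauss-Newton iteration: the Jacobian of the residual vector is estimated from extra function evaluations rather than analytic gradients. This is a generic sketch, not Peckham's specific method; the function names, step size, and test problem are assumptions for illustration.

```python
import numpy as np

def gauss_newton_fd(residuals, x0, h=1e-6, tol=1e-10, max_iter=50):
    """Minimise sum(residuals(x)**2) with a Gauss-Newton iteration whose
    Jacobian is estimated by forward differences (no analytic gradients)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = np.asarray(residuals(x))
        # Forward-difference Jacobian: one extra evaluation per variable.
        J = np.empty((r.size, x.size))
        for j in range(x.size):
            xp = x.copy()
            xp[j] += h
            J[:, j] = (residuals(xp) - r) / h
        # Gauss-Newton step via linear least squares.
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

# Illustrative test problem: Rosenbrock-style residuals, minimum at (1, 1).
res = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
x_min = gauss_newton_fd(res, [-1.2, 1.0])
```

The cost of this approach is n extra residual evaluations per iteration for the difference Jacobian; the derivative-free methods surveyed in the citations below aim to reduce exactly that cost.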



Citations
Journal Article · DOI

Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods

TL;DR: This review begins by briefly summarizing the history of direct search methods and considering the special properties of problems for which they are well suited, then turns to a broad class of methods for which the underlying principles allow generalization to handle bound constraints and linear constraints.
Journal Article · DOI

A semiautomatic algorithm for rutherford backscattering analysis

TL;DR: In this paper, nonlinear least squares techniques have been applied to Rutherford backscattering spectrometry (RBS), allowing routine multivariable fits of simulated spectra to experimental data.
Journal Article · DOI

Identification of systems containing linear dynamic and static nonlinear elements

TL;DR: It is shown that systems composed of cascade, feedforward, feedback and multiplicative connections of linear dynamic and zero memory nonlinear elements can be identified in terms of the individual component subsystems from measurements of the system input and output only.
Journal Article · DOI

Dud, A Derivative-Free Algorithm for Nonlinear Least Squares

TL;DR: The performance of the new Gauss-Newton-like algorithm, called Dud for “doesn't use derivatives”, is evaluated on a number of standard test problems from the literature and it competes favorably with even the best derivative-based algorithms.
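The secant idea behind Dud can be sketched as follows: keep n + 1 parameter points with their residual vectors, interpolate the residuals linearly through those points, and take a Gauss-Newton-like step from the best point. This is a simplified illustration in the spirit of Dud, not the published algorithm (Dud's point-replacement and safeguarding rules are more elaborate), and all names and parameters are assumptions.

```python
import numpy as np

def dud_like(residuals, x0, spread=0.1, tol=1e-8, max_iter=100):
    """Simplified Dud-style secant iteration: hold n+1 points, model the
    residual vector linearly through them, step without any derivatives."""
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    # Initial point set: x0 plus n coordinate perturbations.
    pts = [x0] + [x0 + spread * np.eye(n)[j] for j in range(n)]
    rs = [np.asarray(residuals(p)) for p in pts]
    for _ in range(max_iter):
        # Order points so pts[0] has the smallest sum of squares.
        order = np.argsort([r @ r for r in rs])
        pts = [pts[i] for i in order]
        rs = [rs[i] for i in order]
        # Secant differences play the role of a Jacobian.
        dX = np.stack([pts[j + 1] - pts[0] for j in range(n)], axis=1)
        dR = np.stack([rs[j + 1] - rs[0] for j in range(n)], axis=1)
        # Linear model r(pts[0] + dX @ a) ~ rs[0] + dR @ a; minimise its norm.
        a, *_ = np.linalg.lstsq(dR, -rs[0], rcond=None)
        x_new = pts[0] + dX @ a
        if np.linalg.norm(x_new - pts[0]) < tol:
            return x_new
        # Replace the worst point (Dud itself retires the oldest point).
        pts[-1], rs[-1] = x_new, np.asarray(residuals(x_new))
    return pts[0]

# Illustrative linear residuals with solution (1, 1): the secant model is
# then exact, so the iteration lands on the answer almost immediately.
res = lambda x: np.array([2.0 * x[0] - 2.0, 3.0 * x[1] - 3.0])
x_fit = dud_like(res, [0.0, 0.0])
```

After the start-up points, each iteration costs only one new residual evaluation, which is why secant schemes of this kind can beat finite differencing when evaluations are expensive.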
Journal Article · DOI

Derivative-free optimization methods

TL;DR: A review of derivative-free methods for non-convex optimization problems is given in this paper, with an emphasis on recent developments and on unifying treatment of such problems in the non-linear optimization and machine learning literature.
References
Journal Article · DOI

A simplex method for function minimization

TL;DR: A method is described for the minimization of a function of n variables which depends on the comparison of function values at the (n + 1) vertices of a general simplex, followed by the replacement of the vertex with the highest value by another point.
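The vertex-replacement scheme summarised above can be sketched in a few lines. This is a minimal Nelder-Mead-style illustration (simplified to an inside-only contraction), not the authors' original code; the coefficients, tolerances, and test function are assumptions.

```python
import numpy as np

def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Minimal simplex search: n+1 vertices, replace the worst by
    reflection / expansion / contraction, shrink as a last resort."""
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    simplex = [x0] + [x0 + step * np.eye(n)[i] for i in range(n)]
    fvals = [f(p) for p in simplex]
    for _ in range(max_iter):
        order = np.argsort(fvals)
        simplex = [simplex[i] for i in order]
        fvals = [fvals[i] for i in order]
        if abs(fvals[-1] - fvals[0]) < tol:
            return simplex[0]
        centroid = np.mean(simplex[:-1], axis=0)     # all vertices but worst
        xr = centroid + (centroid - simplex[-1])     # reflect worst vertex
        fr = f(xr)
        if fr < fvals[0]:
            xe = centroid + 2.0 * (centroid - simplex[-1])   # try expansion
            fe = f(xe)
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        elif fr < fvals[-2]:
            simplex[-1], fvals[-1] = xr, fr          # accept reflection
        else:
            xc = centroid + 0.5 * (simplex[-1] - centroid)   # contraction
            fc = f(xc)
            if fc < fvals[-1]:
                simplex[-1], fvals[-1] = xc, fc
            else:                                    # shrink toward best
                simplex = [simplex[0]] + [
                    simplex[0] + 0.5 * (p - simplex[0]) for p in simplex[1:]
                ]
                fvals = [fvals[0]] + [f(p) for p in simplex[1:]]
    return simplex[int(np.argmin(fvals))]

# Illustrative quadratic with minimum at (3, -1).
x_min = nelder_mead(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2, [0.0, 0.0])
```

Like the paper under review, this uses only function comparisons, so it needs no gradient information at all; the price is slower convergence than gradient-exploiting least-squares methods.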
Journal Article · DOI

Sequential Application of Simplex Designs in Optimisation and Evolutionary Operation

TL;DR: A technique for empirical optimisation is presented in which a sequence of experimental designs, each in the form of a regular or irregular simplex, is used; each simplex has all vertices but one in common with the preceding simplex and is completed by one new point.
Journal Article · DOI

A New Method of Constrained Optimization and a Comparison With Other Methods

TL;DR: A new method for finding the maximum of a general non-linear function of several variables within a constrained region is described, and shown to be efficient compared with existing methods when the required optimum lies on one or more constraints.