
Showing papers on "Metaheuristic published in 1976"



Journal ArticleDOI
TL;DR: The following hypothesis is made: given an appropriate measure of nearness for subjective probability functions, the probability function the individual should adopt is the one closest to his original which is consistent with the new information.
Abstract: In a paper entitled ‘Toward an Optimization Procedure For Applying Minimum Change Principles in Probability Kinematics’ [1], the problem of adjusting an individual’s degree-of-belief function in response to new information is analyzed as a mathematical programming problem in an infinite-dimensional space. The following hypothesis is made: given an appropriate measure of nearness for subjective probability functions, the probability function the individual should adopt is the one closest to his original which is consistent with the new information. Mathematically the problem is formulated as follows: let d(·, ·) be a metric on the set of probability functions. Suppose that p₀ is an individual’s original probability function, and that a certain experience leads him to change his degree of belief in a proposition b from its original value p₀(b) to some new value y. Then the individual’s new probability function p⁺ should be one which minimizes d(p₀, p) over the set of all probability functions p for which p(b) = y. The metric used in [1] results from an embedding of A, the set of all subjective probability functions, into a Banach space. On A, d(·, ·) takes the form …
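
The constrained minimum-change formulation above lends itself to a small numerical illustration. The sketch below is an assumption-laden toy version: a finite outcome space and the Euclidean metric stand in for the paper's Banach-space metric, and SciPy's SLSQP solver plays the role of the optimization procedure; none of these choices come from the paper itself.

```python
# Toy sketch: find the probability vector closest to p0 (Euclidean distance)
# among those assigning probability y to the event b. Assumptions: finite
# outcome space, Euclidean metric, SciPy's SLSQP solver.
import numpy as np
from scipy.optimize import minimize

def revise(p0, b_mask, y):
    n = len(p0)
    constraints = [
        {"type": "eq", "fun": lambda p: p.sum() - 1.0},        # total mass is 1
        {"type": "eq", "fun": lambda p: p[b_mask].sum() - y},  # p(b) = y
    ]
    bounds = [(0.0, 1.0)] * n                                   # each p_i in [0, 1]
    res = minimize(lambda p: np.sum((p - p0) ** 2), x0=p0,
                   bounds=bounds, constraints=constraints, method="SLSQP")
    return res.x

# Example: four outcomes, b = {0, 1}; revise p(b) from 0.5 up to 0.8.
p0 = np.array([0.25, 0.25, 0.25, 0.25])
b_mask = np.array([True, True, False, False])
print(revise(p0, b_mask, 0.8))   # approximately [0.4, 0.4, 0.1, 0.1]
```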

6 citations


01 Nov 1976
TL;DR: A new class of random search algorithms for stochastic optimization is presented, and the designer has the option to employ a learning memory in order to reduce the cost of the optimization process measured in terms of the number of observations.
Abstract: A new class of random search algorithms for stochastic optimization is presented. The designer has the option to employ a learning memory in order to reduce the cost of the optimization process measured in terms of the number of observations. The asymptotic properties of the procedure are discussed, and new probability-theoretic techniques are used in the proof of convergence.
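
The abstract gives no pseudocode, but the idea of trading extra observations against a learning memory can be sketched as below. Everything concrete here (the shrinking step schedule, the running-average memory, the noisy test objective) is an illustrative assumption, not the paper's algorithm.

```python
# Hedged sketch of random search with a simple "learning memory":
# repeated noisy observations at the incumbent point are averaged,
# so promising points are refined rather than judged on one sample.
import random
import math

def noisy_objective(x):
    # True objective (x - 2)^2 observed through additive Gaussian noise.
    return (x - 2.0) ** 2 + random.gauss(0.0, 0.1)

def random_search(n_iter=500, step=1.0, seed=0):
    random.seed(seed)
    x_best = 0.0
    memory = {"sum": noisy_objective(x_best), "count": 1}  # running estimate at x_best
    for k in range(1, n_iter + 1):
        candidate = x_best + random.gauss(0.0, step / math.sqrt(k))  # shrinking radius
        y = noisy_objective(candidate)
        if y < memory["sum"] / memory["count"]:
            x_best = candidate                      # accept the candidate point
            memory = {"sum": y, "count": 1}         # restart the memory there
        else:
            # Spend the observation refining the estimate at x_best;
            # this is the role the learning memory plays in the sketch.
            memory["sum"] += noisy_objective(x_best)
            memory["count"] += 1
    return x_best

print(random_search())  # drifts toward the true minimizer x = 2
```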

2 citations



Journal ArticleDOI
TL;DR: The solution of a combined static and dynamic optimization problem is considered; by means of a feasible decomposition procedure, the combined problem is first decomposed into a two-level optimization problem.
Abstract: The solution of a combined static and dynamic optimization problem is considered in this paper. By means of a feasible decomposition procedure this combined problem is first decomposed into a two-level optimization problem. In this new problem the first-level subproblem is found to be one of conventional optimal control and mathematical programming which can be solved using known techniques. For the solution of the second-level problem a computational algorithm is developed using the gradient projection method. The existence of the solution is investigated. To illustrate the applicability of the method, a simplified gantry crane yard design problem is considered and some computational results are given.
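
As a rough illustration of the gradient projection step used at the second level, the sketch below takes a gradient step and projects back onto a feasible set. The box constraints and quadratic objective are stand-in assumptions; the paper's coordination variables and crane-yard model are not reproduced here.

```python
# Minimal gradient projection sketch: gradient step, then Euclidean
# projection onto a box. Objective and constraints are illustrative.
import numpy as np

def project_box(z, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(z, lo, hi)

def gradient_projection(grad, x0, lo, hi, step=0.1, n_iter=200):
    x = x0.copy()
    for _ in range(n_iter):
        x = project_box(x - step * grad(x), lo, hi)
    return x

# Toy second-level problem: minimize ||x - c||^2 over the box [0, 1]^2.
c = np.array([1.5, -0.3])
grad = lambda x: 2.0 * (x - c)
print(gradient_projection(grad, x0=np.zeros(2), lo=0.0, hi=1.0))  # ~[1.0, 0.0]
```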

2 citations