Journal Article (DOI)

Decomposition Principle for Linear Programs

George B. Dantzig, Philip Wolfe
01 Feb 1960, Vol. 8, Iss. 1, pp. 101-111
Abstract
A technique is presented for the decomposition of a linear program that permits the problem to be solved by alternate solutions of linear sub-programs representing its several parts and a coordinating program that is obtained from the parts by linear transformations. The coordinating program generates at each cycle new objective forms for each part, and each part generates in turn, from its optimal basic feasible solutions, new activity columns for the interconnecting program. Viewed as an instance of a “generalized programming problem” whose columns are drawn freely from given convex sets, such a problem can be studied by an appropriate generalization of the duality theorem for linear programming, which permits a sharp distinction to be made between those constraints that pertain only to a part of the problem and those that connect its parts. This leads to a generalization of the Simplex Algorithm, of which the decomposition procedure is a special case. Besides holding promise for the efficient computation of large-scale systems, the principle yields a certain rationale for the “decentralized decision process” in the theory of the firm. Formally, the prices generated by the coordinating program cause the manager of each part to look for a “pure” sub-program, the analogue of a pure strategy in game theory, which he proposes to the coordinator as the best he can do. The coordinator finds the optimum “mix” of pure sub-programs, using the new proposals and earlier ones, consistent with over-all demands and supply, and thereby generates new prices that in turn generate new proposals from each of the parts, and so on. The iterative process is finite.
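
The cycle described in the abstract — the coordinator posts prices, each part proposes an extreme point of its own sub-program, and the coordinator finds the best “mix” of proposals — is the Dantzig–Wolfe column-generation loop. The following is a minimal sketch of that loop in Python using scipy.optimize.linprog (HiGHS) for both the restricted master and the pricing subproblems; the two-part problem data, variable names, and tolerances are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: two independent "parts" share one coupling resource.
# Each part k has local constraints D_k x_k <= d_k, 0 <= x_k <= 3,
# and contributes A_k x_k to the coupling row  sum_k A_k x_k <= b.
c_blocks = [np.array([-3.0, -2.0]), np.array([-2.0, -4.0])]  # minimize (profits negated)
A_blocks = [np.array([[1.0, 1.0]]), np.array([[1.0, 1.0]])]  # coupling coefficients
b_couple = np.array([5.0])                                   # shared resource limit
D_blocks = [np.array([[1.0, 1.0]]), np.array([[1.0, 1.0]])]  # local constraints
d_blocks = [np.array([4.0]), np.array([4.0])]
var_bounds = [(0.0, 3.0)] * 2                                # parts are bounded polytopes

K, m = len(c_blocks), len(b_couple)
# Proposal pool: known extreme points of each part (the origin is assumed feasible).
points = [[np.zeros(2)] for _ in range(K)]

def solve_master():
    """Restricted master: best convex 'mix' of the proposals made so far."""
    costs, coupling_cols, owner = [], [], []
    for k in range(K):
        for x in points[k]:
            costs.append(c_blocks[k] @ x)          # cost of the proposal
            coupling_cols.append(A_blocks[k] @ x)  # its use of the shared resource
            owner.append(k)
    n = len(costs)
    A_ub = np.array(coupling_cols).T               # m x n coupling rows
    A_eq = np.zeros((K, n))                        # one convexity row per part
    for j, k in enumerate(owner):
        A_eq[k, j] = 1.0
    res = linprog(costs, A_ub=A_ub, b_ub=b_couple, A_eq=A_eq, b_eq=np.ones(K),
                  bounds=[(0, None)] * n, method="highs")
    # HiGHS marginals are d(objective)/d(rhs); they play the role of the prices.
    return res.fun, res.ineqlin.marginals, res.eqlin.marginals

for _ in range(20):
    obj, pi, sigma = solve_master()
    improved = False
    for k in range(K):
        # Pricing: part k re-optimizes its own sub-program under the prices pi.
        sub_c = c_blocks[k] - A_blocks[k].T @ pi
        sub = linprog(sub_c, A_ub=D_blocks[k], b_ub=d_blocks[k],
                      bounds=var_bounds, method="highs")
        if sub.fun - sigma[k] < -1e-9:   # negative reduced cost: a better proposal exists
            points[k].append(sub.x)
            improved = True
    if not improved:                     # no part can improve: the process is finite
        break

print("coordinated optimum:", obj)
```

Each pricing subproblem sees only its own constraints plus the coordinator's prices pi; a new proposal is added to the master only when its reduced cost is negative, and the loop stops when no part can improve, mirroring the finite iterative process described above.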


Citations
Journal Article (DOI)

EVPI-Based Importance Sampling Solution Procedures for Multistage Stochastic Linear Programmes on Parallel MIMD Architectures

TL;DR: A nodal parallelization technique is presented in this article, as it balances load more efficiently than its alternative; it achieves near-linear speedup on a variety of multicomputing platforms, although problems with few variables and constraints per node do not gain from this parallelization.
Journal Article (DOI)

Dantzig–Wolfe decomposition and plant-wide MPC coordination

TL;DR: This work studies the feasibility of applying Dantzig–Wolfe decomposition to provide an on-line solution for coordinating decentralized MPC, and proposes a framework for designing such a coordination system that requires only minor modification to the current MPC layer.
Journal Article (DOI)

Genetic Algorithms for Optimal Reservoir Dispatching

TL;DR: It is concluded that, with the three basic operators of selection, crossover, and mutation, a genetic algorithm can find the optimal or a near-optimal solution to a complex water resources problem.
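
The three operators named in this summary are easy to illustrate. Below is a minimal, self-contained sketch of a genetic algorithm using tournament selection, single-point crossover, and random-reset mutation on a toy continuous objective; the objective function, population size, and rates are illustrative assumptions and are not the reservoir model of the cited work.

```python
import random

# Toy problem: maximize f(x) = -sum((x_i - 0.5)^2) over [0, 1]^4.
# All parameters below are illustrative.
DIM, POP, GENS, MUT_RATE = 4, 30, 100, 0.1

def fitness(x):
    return -sum((xi - 0.5) ** 2 for xi in x)

def select(pop):
    """Tournament selection: keep the fitter of two random individuals."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) > fitness(b) else b

def crossover(p1, p2):
    """Single-point crossover of two parent vectors."""
    cut = random.randrange(1, DIM)
    return p1[:cut] + p2[cut:]

def mutate(x):
    """Reset each gene to a random value with small probability."""
    return [random.random() if random.random() < MUT_RATE else xi for xi in x]

pop = [[random.random() for _ in range(DIM)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP)]

best = max(pop, key=fitness)
print("best solution:", best, "fitness:", fitness(best))
```

An application such as reservoir dispatching would replace the toy fitness function with a cost or simulation model of the system being optimized.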
Journal Article (DOI)

Grid-Enabled Optimization with GAMS

TL;DR: A number of new features of the GAMS modeling system provide a lightweight, portable, and powerful framework for optimization on a grid, facilitating the widely used master-worker model of computing.
Book Chapter (DOI)

Computational Integer Programming and Cutting Planes

TL;DR: This article describes components and recent developments found in many solvers based on linear-programming relaxation methods, including Lagrangean, Dantzig–Wolfe, and Benders' decomposition, and their interrelations.