Journal ISSN: 0364-765X

Mathematics of Operations Research 

Institute for Operations Research and the Management Sciences
About: Mathematics of Operations Research is an academic journal published by the Institute for Operations Research and the Management Sciences. The journal publishes primarily in the areas of mathematics and Markov decision processes. It has the ISSN identifier 0364-765X. Over its lifetime, 2,499 publications have been published, receiving 154,861 citations.


Papers
Journal Article
TL;DR: Optimal auctions are derived for a wide class of auction design problems when the seller has imperfect information about how much the buyers might be willing to pay for the object.
Abstract: This paper considers the problem faced by a seller who has a single object to sell to one of several possible buyers, when the seller has imperfect information about how much the buyers might be willing to pay for the object. The seller's problem is to design an auction game which has a Nash equilibrium giving him the highest possible expected utility. Optimal auctions are derived in this paper for a wide class of auction design problems.
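As a hedged illustration of the flavour of result in this paper: in the symmetric case with regular value distributions, the optimal auction reduces to a second-price auction with a reserve price set where a bidder's virtual value crosses zero. The sketch below computes that reserve for Uniform[0,1] values; the distribution and the function names are our own illustrative choices, not the paper's.

```python
# Minimal sketch (assumptions: symmetric bidders with i.i.d. "regular" values;
# Uniform[0,1] chosen purely for illustration).  The optimal reserve price r*
# solves phi(r) = r - (1 - F(r)) / f(r) = 0, i.e. the zero of the virtual value.
from scipy.optimize import brentq

def virtual_value(v, F=lambda v: v, f=lambda v: 1.0):
    """Virtual value phi(v) = v - (1 - F(v)) / f(v); defaults are Uniform[0,1]."""
    return v - (1.0 - F(v)) / f(v)

r_star = brentq(virtual_value, 0.0, 1.0)   # sign change on [0, 1]
print(f"Optimal reserve for Uniform[0,1] values: {r_star:.3f}")   # ~0.500
```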

6,003 citations

Journal Article
TL;DR: The ratio between the greedy heuristic's objective value and the true optimum grows at most logarithmically in the largest column sum of A; when all the components of cT are equal, the result reduces to a theorem established previously by Johnson and Lovász.
Abstract: Let A be a binary matrix of size m × n, let cT be a positive row vector of length n and let e be the column vector all of whose m components are ones. The set-covering problem is to minimize cTx subject to Ax ≥ e and x binary. We compare the value of the objective function at a feasible solution found by a simple greedy heuristic to the true optimum. It turns out that the ratio between the two grows at most logarithmically in the largest column sum of A. When all the components of cT are the same, our result reduces to a theorem established previously by Johnson and Lovász.
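A hedged sketch of the greedy heuristic analysed above, with the columns of A viewed as subsets of the rows and c giving their costs; the data at the bottom is illustrative only.

```python
# Greedy set-covering heuristic: repeatedly pick the column (set) with the
# lowest cost per newly covered row until all rows are covered.  The paper's
# result bounds the resulting cost by a factor logarithmic in the largest
# column sum (i.e. the largest set size) times the optimum.
def greedy_set_cover(columns, costs):
    """columns: list of sets of row indices; costs: positive column costs."""
    uncovered = set().union(*columns)
    chosen = []
    while uncovered:
        j = min(
            (j for j in range(len(columns)) if columns[j] & uncovered),
            key=lambda j: costs[j] / len(columns[j] & uncovered),
        )
        chosen.append(j)
        uncovered -= columns[j]
    return chosen

print(greedy_set_cover([{0, 1}, {1, 2}, {2, 3}, {0, 3}], [1.0, 1.0, 1.0, 1.0]))
```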

2,645 citations

Journal Article
TL;DR: If U is an ellipsoidal uncertainty set, then for some of the most important generic convex optimization problems (linear programming, quadratically constrained programming, semidefinite programming and others) the corresponding robust convex program is either exactly or approximately a tractable problem that lends itself to efficient algorithms such as polynomial-time interior point methods.
Abstract: We study convex optimization problems for which the data is not specified exactly and is only known to belong to a given uncertainty set U, yet the constraints must hold for all possible values of the data from U. The ensuing optimization problem is called robust optimization. In this paper we lay the foundation of robust convex optimization. In the main part of the paper we show that if U is an ellipsoidal uncertainty set, then for some of the most important generic convex optimization problems (linear programming, quadratically constrained programming, semidefinite programming and others) the corresponding robust convex program is either exactly or approximately a tractable problem that lends itself to efficient algorithms such as polynomial-time interior point methods.
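To make the ellipsoidal case concrete (notation ours, not taken from the paper): a single linear constraint whose coefficient vector ranges over an ellipsoid is equivalent to one explicit second-order cone constraint, which is why the robust counterpart of a linear program remains tractable.

```latex
% Hedged illustration; \bar a, P and u are our own notation.
\[
  a^\top x \le b \quad \text{for all } a \in U
    = \{\, \bar a + P u : \lVert u \rVert_2 \le 1 \,\}
  \;\Longleftrightarrow\;
  \bar a^\top x + \lVert P^\top x \rVert_2 \le b .
\]
```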

2,501 citations

Journal Article
TL;DR: The results are strong in that they hold whether the problem size is measured by the number of tasks, the number of bits required to express the task lengths, or the sum of the task lengths.
Abstract: NP-complete problems form an extensive equivalence class of combinatorial problems for which no nonenumerative algorithms are known. Our first result shows that determining a shortest-length schedule in an m-machine flowshop is NP-complete for m ≥ 3. For m = 2, there is an efficient algorithm for finding such schedules. The second result shows that determining a minimum mean-flow-time schedule in an m-machine flowshop is NP-complete for every m ≥ 2. Finally we show that the shortest-length schedule problem for an m-machine jobshop is NP-complete for every m ≥ 2. Our results are strong in that they hold whether the problem size is measured by number of tasks, number of bits required to express the task lengths, or by the sum of the task lengths.
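For context on the efficient m = 2 case mentioned above: the classical two-machine flowshop makespan problem is solved by Johnson's rule. The abstract does not spell the algorithm out; the sketch below is our own illustration of it.

```python
# Johnson's rule for the 2-machine flowshop: jobs with p1 < p2 go first in
# increasing order of p1, the rest follow in decreasing order of p2.
def johnsons_rule(jobs):
    """jobs: list of (p1, p2) processing times; returns a job order."""
    first = sorted((j for j in range(len(jobs)) if jobs[j][0] < jobs[j][1]),
                   key=lambda j: jobs[j][0])
    rest = sorted((j for j in range(len(jobs)) if jobs[j][0] >= jobs[j][1]),
                  key=lambda j: jobs[j][1], reverse=True)
    return first + rest

def makespan(jobs, order):
    t1 = t2 = 0
    for j in order:
        t1 += jobs[j][0]                 # machine 1 finishes job j
        t2 = max(t1, t2) + jobs[j][1]    # machine 2 starts once both are ready
    return t2

jobs = [(3, 2), (1, 4), (2, 5)]          # illustrative data only
order = johnsons_rule(jobs)
print(order, makespan(jobs, order))      # [1, 2, 0] 12
```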

2,351 citations

Journal Article
TL;DR: Two general convergence proofs for random search algorithms are given, and it is shown how they extend results previously available for specific variants of the conceptual algorithm studied here.
Abstract: We give two general convergence proofs for random search algorithms. We review the literature and show how our results extend those available for specific variants of the conceptual algorithm studied here. We then exploit the convergence results to examine convergence rates and to actually design implementable methods. Finally we report on some computational experience.
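A hedged sketch of the kind of conceptual random search algorithm such convergence results cover: sample candidate points at random and keep the best one found so far. The objective and sampling domain below are our own illustrative choices.

```python
import random

def random_search(f, sample, n_iter=10_000):
    """Pure random search: keep the incumbent with the lowest objective value."""
    best_x = sample()
    best_f = f(best_x)
    for _ in range(n_iter):
        x = sample()
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Minimise a simple quadratic over the box [-5, 5]^2.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
sample = lambda: (random.uniform(-5, 5), random.uniform(-5, 5))
print(random_search(f, sample))          # approaches (1, -2) as n_iter grows
```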

1,550 citations

Performance Metrics

No. of papers from the journal in previous years:

Year    Papers
2023    92
2022    178
2021    125
2020    67
2019    52
2018    49