
Showing papers in "Annals of Operations Research in 1989"


Journal ArticleDOI
TL;DR: A state-of-the-art survey of the results, applications, algorithms and implementations for dynamic network flows.
Abstract: Dynamic network flow models describe network-structured, decision-making problems over time. They are of interest because of their numerous applications and intriguing dynamic structure. The dynamic models are specially structured problems that can be solved with known general methods. However, specialized techniques have been developed to exploit the underlying dynamic structure. Here, we present a state-of-the-art survey of the results, applications, algorithms and implementations for dynamic network flows.

302 citations


Journal ArticleDOI
TL;DR: In this paper, the authors provide an axiomatization of the class of betweenness-conforming utility theories and apply it to the second-price auction mechanism, where they show that its demand-revelation property under expected utility is not robust with respect to a class of preferences.
Abstract: This paper focuses on the betweenness property of expected utility theory. We provide an axiomatization of the class of betweenness-conforming utility theories. Subclasses of betweenness-conforming preferences are axiomatized with ‘substitution’ axioms of intermediate generality. The latter axioms incorporate specifically the effects of replacing a certain outcome with a lottery that is indifferent to it. Our representation is applied to the second-price auction mechanism where we show that its demand-revelation property under expected utility is not robust with respect to the class of betweenness-conforming preferences.

232 citations


Journal ArticleDOI
TL;DR: In this article, a modified version of the auction algorithm was proposed for solving linear transportation problems, where the authors converted the transportation problem into an assignment problem and then modified the auction to exploit the special structure of this problem.
Abstract: The auction algorithm is a parallel relaxation method for solving the classical assignment problem. It resembles a competitive bidding process whereby unassigned persons bid simultaneously for objects, thereby raising their prices. Once all bids are in, objects are awarded to the highest bidder. This paper generalizes the auction algorithm to solve linear transportation problems. The idea is to convert the transportation problem into an assignment problem, and then to modify the auction algorithm to exploit the special structure of this problem. Computational results show that this modified version of the auction algorithm is very efficient for certain types of transportation problems.
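The bidding process described above can be sketched for the underlying assignment problem. The following is a minimal naive auction in Python; the function name, the toy benefit matrix, and the fixed bidding increment `eps` are our own illustrative choices, and the paper's actual contribution (the extension to transportation problems) is not reproduced here.

```python
# Illustrative sketch of the auction algorithm for the n x n assignment
# problem: unassigned persons bid for their best object, raising its price.

def auction_assignment(benefit, eps=0.01):
    """Assign each person to one object, maximizing total benefit.

    benefit[i][j] = value person i places on object j.
    eps: bidding increment; with integer benefits, n*eps < 1
    guarantees the final assignment is optimal.
    """
    n = len(benefit)
    prices = [0.0] * n          # current price of each object
    owner = [None] * n          # owner[j] = person holding object j
    unassigned = list(range(n))

    while unassigned:
        i = unassigned.pop()
        # Net values of all objects for person i at current prices.
        values = [benefit[i][j] - prices[j] for j in range(n)]
        best = max(range(n), key=lambda j: values[j])
        second = max(values[j] for j in range(n) if j != best)
        # Person i raises the price of its best object by its bid.
        prices[best] += values[best] - second + eps
        if owner[best] is not None:     # previous holder is displaced
            unassigned.append(owner[best])
        owner[best] = i
    return owner, prices

benefit = [[10, 5, 8],
           [7, 9, 6],
           [8, 7, 4]]
owner, prices = auction_assignment(benefit)
total = sum(benefit[owner[j]][j] for j in range(3))
```

The resemblance to a real auction is direct: each bid exceeds the bidder's second-best net value by `eps`, so prices rise until every object is held by a highest bidder.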

186 citations


Journal ArticleDOI
Uzi Segal
TL;DR: In this paper, it was shown that a preference relation over lotteries can be represented by a measure of the area above the distribution function of each lottery, which is called the anticipated utility functional.
Abstract: This paper presents axioms which imply that a preference relation over lotteries can be represented by a measure of the area above the distribution function of each lottery. A special case of this family is the anticipated utility functional. One additional axiom implies this theory. This functional is then extended for the case of vectorial prizes.
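The "area above the distribution function" representation can be illustrated numerically for a discrete lottery. In the sketch below, the rank-dependent functional is standard, but the concave utility `u` and convex probability weighting `g` are illustrative assumptions of ours, not taken from the paper.

```python
# A small numeric illustration of the rank-dependent ("anticipated
# utility") functional: outcomes are weighted by transformed
# decumulative probabilities rather than raw probabilities.

def anticipated_utility(lottery, u, g):
    """lottery: list of (outcome, probability) pairs summing to 1."""
    # Sort outcomes from best to worst so decumulative probabilities
    # P(X >= x) can be accumulated rank by rank.
    lottery = sorted(lottery, reverse=True)
    value, p_at_least = 0.0, 0.0
    for x, p in lottery:
        w = g(p_at_least + p) - g(p_at_least)   # decision weight of this rank
        value += u(x) * w
        p_at_least += p
    return value

u = lambda x: x ** 0.5          # concave utility (illustrative)
g = lambda p: p ** 2            # convex, "pessimistic" weighting (illustrative)
L = [(100, 0.5), (0, 0.5)]
v = anticipated_utility(L, u, g)
# With the identity weighting g(p) = p, the functional reduces to
# ordinary expected utility: 0.5 * u(100) = 5.0.
```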

177 citations



Journal ArticleDOI
TL;DR: In this article, the authors describe and compare stochastic network optimization models for investment planning under uncertainty, focusing on multiperiod asset allocation and active portfolio management problems.
Abstract: We describe and compare stochastic network optimization models for investment planning under uncertainty. Emphasis is placed on multiperiod asset allocation and active portfolio management problems. Myopic as well as multiple period models are considered. In the case of multiperiod models, the uncertainty in asset returns filters into the constraint coefficient matrix, yielding a multi-scenario program formulation. Different scenario generation procedures are examined. The use of utility functions to reflect risk bearing attitudes results in nonlinear stochastic network models. We adopt a newly proposed decomposition procedure for solving these multiperiod stochastic programs. The performance of the models in simulations based on historical data is discussed.

123 citations


Journal ArticleDOI
TL;DR: This article examines the risk propensities of experienced executives in the oil and gas industry faced with a hypothetical risky business decision involving significant gains and losses; over pure losses the executives were more risk taking than risk averse, consistent with the prediction of prospect theory.
Abstract: This paper examines the risk propensities of experienced executives in the oil and gas industry faced with a hypothetical risky business decision that involves significant gains and losses. The executives were asked to provide the minimum price their firm should accept before selling their share of a joint exploration venture whose future prospects were systematically varied to include gains only, losses only, and mixed gains and losses. In addition, they were asked to provide a single probability equivalence for a mixed gain/loss situation in lieu of breaking even for sure. The executives were more risk taking than risk averse over pure losses, consistent with the prediction of prospect theory. Over pure gains, however, there was as much risk taking as risk aversion, with more risk taking occurring when the chance of breaking even was higher. The relationship between risk propensity over pure gains and over pure losses was insignificant, indicating very different attitudes in these two domains. Although the reflection effect did occur in some cases, it was not pervasive. There was a tendency for certainty equivalences to show greater risk taking than probability equivalences in mixed gain/loss situations, which was consistent with a reframing effect. Risk propensity over mixed gains and losses was closer to that expressed in the losses only domain than to risk propensity over pure gains. More than half of the executives gave responses that were fully consistent with expected utility, and an additional quarter of executives were consistent within a 10% margin of error in their responses. However, one out of five executives did not satisfy the stochastic dominance relationships among the certainty equivalences. Systematic inconsistencies occurred most frequently in the mixed situations where the certainty equivalences for some subjects were biased toward the outcome that had the predominant chance of occurring.

60 citations


Journal ArticleDOI
TL;DR: In this article, the authors treat the same decision maker at different decision nodes as different agents, and apply the Bayesian-Nash equilibrium of the resulting game to a finite ascending bid auction.
Abstract: Decision makers whose preferences do not satisfy the independence axiom of expected utility theory will, when faced with sequential decisions, act in a dynamically inconsistent manner. In order to avoid this inconsistency and maintain nonexpected utility, we suggest the idea of behavioral consistency. We implement this notion by regarding the same decision maker at different decision nodes as different agents, and then taking the Bayesian-Nash equilibrium of this game. This idea is applied to a finite ascending bid auction game. We show the condition for the existence of an equilibrium of this game, and we also characterize the equilibrium in those cases when it exists. In particular, when the utility functionals are both quasi-concave and quasi-convex, there is an equilibrium in dominant strategies where each bidder continues to bid if and only if the prevailing price is smaller than his value. In the case of quasi-concavity it is shown that, in equilibrium, each bidder has a value such that he continues with positive probability up to it, and withdraws after that.

53 citations


Journal ArticleDOI
TL;DR: The computational results show that, in each case, the advantage of the adaptive version (as measured by total number of permanent labels) grows with the problem size.
Abstract: In this paper, we examine the problems of finding the k-shortest paths, the k-shortest paths without repeated nodes, and the shortest path with a single side constraint between an origin and destination pair. Distances are assumed to be non-negative, but networks may be either directed or undirected. Two versions of algorithms for all these problems are compared. The computational results show that, in each case, the advantage of the adaptive version (as measured by total number of permanent labels) grows with the problem size.
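The first of the three problems (k-shortest paths, node repetition allowed) and the "permanent labels" measure mentioned in the results can be illustrated with a compact label-setting search: a Dijkstra-style loop that makes up to k permanent labels per node. This is a generic textbook-style sketch with our own toy graph, not the paper's adaptive algorithm.

```python
# k shortest s-t path lengths (paths may repeat nodes) via a
# Dijkstra-like search allowing up to k permanent labels per node.
import heapq

def k_shortest_path_lengths(graph, s, t, k):
    """graph: {node: [(neighbor, nonneg_length), ...]}.
    Returns the lengths of up to k shortest s-t paths."""
    counts = {v: 0 for v in graph}      # permanent labels made at each node
    heap = [(0, s)]
    lengths = []
    while heap and len(lengths) < k:
        d, v = heapq.heappop(heap)
        if counts[v] >= k:              # node already has k permanent labels
            continue
        counts[v] += 1                  # d becomes a permanent label at v
        if v == t:
            lengths.append(d)
        for w, c in graph[v]:
            if counts[w] < k:
                heapq.heappush(heap, (d + c, w))
    return lengths

G = {'a': [('b', 1), ('c', 4)],
     'b': [('c', 1), ('d', 5)],
     'c': [('d', 1)],
     'd': []}
# Shortest a-d lengths in this graph: a-b-c-d = 3, a-c-d = 5, a-b-d = 6.
paths = k_shortest_path_lengths(G, 'a', 'd', 3)
```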

53 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a heuristic to obtain an approximate solution for the targeting problem in multiproduct manufacturing systems, which is a nonlinear integer program that is not easy to solve.
Abstract: We consider multiproduct manufacturing systems modeled by open networks of queues with general distributions for arrival patterns and service times. Since exact solutions are not available for measuring mean number of jobs in these systems, we rely on approximate analyses based on the decomposition approach developed, among others, by Reiser and Kobayashi [16], Kuehn [14], Shanthikumar and Buzacott [19], Whitt [29], and extensions by Bitran and Tirupati [2]. The targeting problem (TP) presented in this paper addresses capacity planning issues in multiproduct manufacturing systems. Since TP is a nonlinear integer program that is not easy to solve, we present a heuristic to obtain an approximate solution. We also provide bounds on the performance of this heuristic and illustrate our approach by means of a numerical example.

49 citations


Journal ArticleDOI
TL;DR: In this article, the authors identify three objectives for developing a satisfactory procedure: (1) the capability of validating expressed preference differences by actual choices among naturally occurring options, (2) compatibility with the existing problem structure, and (3) no confounding of extraneous factors in the measurement of preference intensity.
Abstract: The concept of preference intensity has been criticized over the past sixty years for having no substantive meaning. Much of the controversy stems from the inadequacy of measurement procedures. In reviewing the shortcomings of existing procedures, we identify three objectives for developing a satisfactory procedure: (1) the capability of validating expressed preference differences by actual choices among naturally occurring options, (2) compatibility with the existing problem structure, and (3) no confounding of extraneous factors in the measurement of preference intensity. Several recently developed measurement procedures are criticized for failing one or more of these objectives. We then examine three different approaches for measuring preference intensity based on multiple perspectives. The replication approach emerges as a promising way of satisfying the three objectives above. This methodology applies to problems where an attribute can be replicated by “parallel components” that are independent, identical copies of the attribute. We illustrate the approach with two applications reported in the decision analysis literature. We also offer guidance on how to construct parallel components satisfying the requisite properties.

Journal ArticleDOI
TL;DR: Important features of object-oriented computing and the relevance of such an approach in modelling and developing software for manufacturing systems are presented.
Abstract: With the recognition of the importance of Computer Integrated Manufacturing (CIM) in improving manufacturing productivity, there is a pressing need for good software modelling approaches to support efficient design and control of manufacturing systems. Software design concepts based on Object-Oriented Programming (OOP) are emerging as powerful techniques for developing large scale software systems. This paper presents important features of object-oriented computing and the relevance of such an approach in modelling and developing software for manufacturing systems.

Journal ArticleDOI
TL;DR: A descriptive model of decision making under ambiguity based on principles of behavior, i.e. principles that describe how people behave as opposed to how they should behave, is derived and compared to other models of ambiguity proposed in the literature.
Abstract: Contrary to most formal models of decision making under risk and uncertainty that are built on the basis of prescriptive behavioral principles or axioms, this paper derives a descriptive model of decision making under ambiguity based on principles of behavior, i.e. principles that describe how people behave as opposed to how they should behave. The model assumes that people evaluate the impact of ambiguous probabilities by first anchoring on a given value of the unknown probability and then adjusting this by the net effect of imagining or “trying out” other values the probability could take. The mental simulation process incorporates giving differential weight to the ranges of probability values above and below the anchor where such weight reflects individual and situational variables. In particular, the assumption that people are cautious as opposed to reckless in making decisions leads to attributing more weight to possible values of probabilities below the anchor when considering potential gains, and the reverse when faced with potential losses. Two experiments test implications of the model for situations involving competitive decisions. The first concerns legal decision making, the second involves the purchase and sale of industrial equipment. Both experiments validate predictions that (a) the effects of ambiguity depend on both the sign of payoffs and the level of probabilities, and (b) two parties to a negotiation or transaction may be differentially sensitive to ambiguity. Finally, the model is compared to other models of ambiguity proposed in the literature, and directions are suggested for future work.

Journal ArticleDOI
TL;DR: The author argues for a major redirection of effort toward PDS by the decision science community, and suggests promising directions for its development with illustrations from his company's work.
Abstract: Descriptive decision science says how people do make up their minds (e.g. as psychological and organizational theory). Normative decision science says how ideal people would make up their minds (e.g. as statistical decision theory). Prescriptive decision science (PDS) says how people should make up their minds (including a distinctive fusion of the descriptive and normative). PDS supports the development and validation of decision-aiding technology, to make it appropriate for specific circumstances, balancing considerations of feasible input, useful output, logical coherence, and cost of implementation. The author argues for a major redirection of effort toward PDS by the decision science community, and suggests promising directions for its development with illustrations from his company's work.

Journal ArticleDOI
S. X. C. Lou, Garrett van Ryzin
TL;DR: In this paper, the authors develop control rules for job shop scheduling based on the Flow Rate Control model and derive optimal control results for job shops with work stations in series (transfer lines).
Abstract: In this paper, we develop the control rules for job shop scheduling based on the Flow Rate Control model. We derive optimal control results for job shops with work stations in series (transfer lines). We use these results to derive rules which are suboptimal, robust against random events, and easy to implement and expand.

Journal ArticleDOI
TL;DR: It is shown that some forms of tables with subtotals always have a controlled rounding solution, while other table structures cannot be guaranteed such a solution under “zero-restrictedness”.
Abstract: The problem of rounding in statistical tables to protect confidentiality is an important problem in the area of data publication, especially for official statistics. Controlled rounding involves rounding the table data to a prespecified base while ensuring additivity to totals. Previous research provided a formulation of the controlled rounding problem of a simple two-way table as a transportation problem. This paper extends that work to tables with subtotals by using a capacitated transshipment formulation. It is shown that some forms of tables with subtotals always have a controlled rounding solution. Other table structures cannot be guaranteed such a solution under “zero-restrictedness”. Initial computational experience suggests that the method is viable for use in practical situations.
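The "zero-restricted" condition (each value is rounded to an adjacent multiple of the base while the table stays additive) can be made concrete for a tiny two-way table. The exhaustive search below is only for intuition; the paper solves the problem at scale via transportation and transshipment formulations, which are not reproduced here.

```python
# Brute-force enumeration of zero-restricted controlled roundings of a
# small two-way table: every cell, row total, column total, and grand
# total must land on an adjacent multiple of the base, and the rounded
# table must remain additive.
from itertools import product

def controlled_roundings(table, base):
    rows, cols = len(table), len(table[0])

    def options(v):
        # The admissible rounded values for v: the adjacent multiples of base.
        lo = (v // base) * base
        return [lo] if v % base == 0 else [lo, lo + base]

    solutions = []
    cell_opts = [options(v) for row in table for v in row]
    for flat in product(*cell_opts):
        r = [list(flat[i * cols:(i + 1) * cols]) for i in range(rows)]
        row_ok = all(sum(r[i]) in options(sum(table[i])) for i in range(rows))
        col_ok = all(sum(r[i][j] for i in range(rows))
                     in options(sum(table[i][j] for i in range(rows)))
                     for j in range(cols))
        tot_ok = sum(map(sum, r)) in options(sum(map(sum, table)))
        if row_ok and col_ok and tot_ok:
            solutions.append(r)
    return solutions

T = [[3, 7],
     [6, 1]]
sols = controlled_roundings(T, 5)   # round to base 5
```

For this table the search finds four valid roundings, confirming that a simple two-way table (no subtotals) admits a zero-restricted solution, as the paper's transportation-based results guarantee.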

Journal ArticleDOI
TL;DR: In this paper, an axiomatization is provided for the implicit rank-dependent mean value, a class that includes a number of mean values that have appeared in statistics, in utility theory, and in the theory of income inequality measurement.
Abstract: This paper provides an axiomatization for the implicit rank-dependent mean value, a class that includes a number of mean values that have appeared in statistics, in utility theory and in the theory of income inequality measurement.

Journal ArticleDOI
TL;DR: In this article, an overview of lexicographic choice under conditions of uncertainty is presented, where the usual Archimedean axiom is weakened and the role of these variants in explaining some well-known paradoxes of choice theory is reviewed.
Abstract: This overview focuses on lexicographic choice under conditions of uncertainty. First, lexicographic versions of traditional (von Neumann-Morgenstern) expected utility theory are described where the usual Archimedean axiom is weakened. The role of these lexicographic variants in explaining some well-known “paradoxes” of choice theory is reviewed. Next, the significance of lexicographic choice for game theory is discussed. Finally, some lexicographic extensions of the classical maximin decision rule are described.

Journal ArticleDOI
Peter C. Fishburn
TL;DR: In this article, the authors developed theories that generalize the von Neumann-Morgenstern theory of preference under risk and Savage's theory of preference under uncertainty to accommodate systematic and predictable violations of previous theories while not giving up too much of the mathematical elegance of their expected utility representations.
Abstract: This paper reviews recently developed theories that generalize the von Neumann-Morgenstern theory of preference under risk and Savage's theory of preference under uncertainty. The new theories are designed to accommodate systematic and predictable violations of previous theories while not giving up too much of the mathematical elegance of their expected utility representations. The material in the paper is adapted from the author's book Nonlinear Preference and Utility Theory.

Journal ArticleDOI
TL;DR: In this article, the duality problems in expected utility theory, raised by the introduction of non-additive probabilities, are examined, and it is shown that these problems do not arise if the probability measure is symmetric, i.e. has the property of complementary additivity.
Abstract: Some duality problems in expected utility theory, raised by the introduction of non-additive probabilities, are examined. These problems do not arise if the probability measure is symmetric; i.e. has the property of complementary additivity. Additional, mild properties of coherence of conditional probabilities imply full additivity of the unconditional measure.

Journal ArticleDOI
TL;DR: An FMS scheduling method is described that treats an FMS as a group of problem-solving agents cooperating to perform manufacturing jobs; its strengths include the ability to handle dynamically changing production conditions, explicit treatment of the communication method, improved reliability, and the use of distributed control.
Abstract: This paper describes an FMS scheduling method that treats an FMS as a group of problem-solving agents cooperating to perform manufacturing jobs. The main thrusts of such a method include the ability to handle the dynamically changing production conditions, its taking into account the communication method, the improved reliability, and the use of distributed control. The paper emphasizes research issues associated with various aspects of the cooperative problem-solving method, including: (1) dynamic task assignments, (2) the coordination mechanism, and (3) knowledge-based scheduling as problem solving. A simulation study which compares the performance of the cooperative problem solving approach with that of the more traditional scheduling approaches is also reported.

Journal ArticleDOI
Robert F. Nau
TL;DR: In this paper, the decision maker's degree of belief in the occurrence of an event is represented by a unimodal (in fact, concave) function on the unit interval, whose parameters are elicited in terms of lower and upper probabilities with attached confidence weights.
Abstract: This paper presents a new method of modeling indeterminate and incoherent probability judgments in decision analysis problems. The decision maker's degree of belief in the occurrence of an event is represented by a unimodal (in fact, concave) function on the unit interval, whose parameters are elicited in terms of lower and upper probabilities with attached confidence weights. This is shown to provide a unified framework for performing sensitivity analysis, reconciling incoherence, and combining expert judgments.

Journal ArticleDOI
TL;DR: The dynamic facility location problem is solved using both dynamic programming and a branch and bound approach based on state space relaxation, with the objective of minimising the total costs of operating the system.
Abstract: Dynamic facility location is concerned with developing a location decision plan over a given planning horizon during which changes in the market and in costs are expected to occur. The objective is to select from a list of predetermined possible facility sites the locations of the facilities to use in each period of the planning horizon to minimise the total costs of operating the system. The costs considered here include not only transport and operation/maintenance charges but also relocation costs arising from the opening and closing of facilities as required by the plan.
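The structure of the plan (a sequence of open-facility sets, with relocation charges for openings and closings between periods) lends itself to a small dynamic program over states. The sketch below uses made-up toy data and exhaustive states; it illustrates the problem structure only, not the paper's branch and bound method.

```python
# Toy multi-period facility location: choose which candidate sites are
# open in each period to minimize serving costs plus relocation
# (opening/closing) costs between consecutive periods.
from itertools import combinations

SITES = ('A', 'B')
PERIODS = 2
OPEN_COST = {'A': 4, 'B': 3}        # cost to open a site (illustrative)
CLOSE_COST = {'A': 1, 'B': 1}       # cost to close a site (illustrative)
# Per-period operating + transport cost for each nonempty set of open
# sites (made-up numbers; demand shifts toward B in period 2).
SERVE = [{('A',): 10, ('B',): 12, ('A', 'B'): 9},
         {('A',): 14, ('B',): 8,  ('A', 'B'): 11}]

def relocation(prev, cur):
    """Cost of moving from open set prev to open set cur."""
    opened = sum(OPEN_COST[s] for s in cur if s not in prev)
    closed = sum(CLOSE_COST[s] for s in prev if s not in cur)
    return opened + closed

def cheapest_plan():
    states = [c for r in (1, 2) for c in combinations(SITES, r)]
    # cost[s] = cheapest total so far for plans ending period 0 with s open
    cost = {s: relocation((), s) + SERVE[0][s] for s in states}
    for t in range(1, PERIODS):
        cost = {s: min(cost[p] + relocation(p, s) for p in states)
                   + SERVE[t][s]
                for s in states}
    return min(cost.values())

total = cheapest_plan()
```

Here the optimum opens only site B throughout: relocating from A to B would pay off serving costs in period 2 but not the extra open/close charges.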


Journal ArticleDOI
TL;DR: In this article, the authors show that differences between models of choice under uncertainty may be derived primarily from different assumptions about the appropriate ways in which states of the world may be compared and combined.
Abstract: The primary purpose of this paper is to show that differences between models of choice under uncertainty may be derived primarily from different assumptions about the “appropriate” ways in which states of the world may be compared and combined. It considers different concepts of stochastic dominance arising from different permitted transformations on the ordering of prizes during a comparison of two lotteries. These concepts imply various forms of the Independence axiom and correspond to various non-expected utility theories.


Journal ArticleDOI
TL;DR: The paper analyzes production planning architectures by means of a unifying mathematical formulation of the production plan optimization problem, identifying the main features of existing planning approaches and comparing their usefulness in different manufacturing processes.
Abstract: Recent trends in automated manufacturing call for hierarchical decision architectures for production planning, suitable for integration with part flow controls. Different design approaches are currently adopted for implementing production planning architectures, depending either on the objective of defining a centralized production plan for the whole manufacturing system (as in the case of MRP and OPT), or on the desire of coordinating local plans for the component work cells (as for JIT). The paper analyzes such approaches by use of a unifying mathematical formulation of the production plan optimization problem, to recognize the main features of the existing planning approaches, and compare their usefulness in different manufacturing processes.

Journal ArticleDOI
TL;DR: In this paper, the authors characterize ambiguity aversion in terms of the Subjectively Weighted Linear Utility (SWLU) model parameters and show that ambiguity content may reasonably be regarded as residing in the decision maker's subjective probability distribution of induced utility.
Abstract: The Subjectively Weighted Linear Utility (SWLU) model for decision making under uncertainty can accommodate non-neutral attitudes toward ambiguity. We first characterize ambiguity aversion in terms of the SWLU model parameters. In addition, we show that ambiguity content may reasonably be regarded as residing in the decision maker's subjective probability distribution of induced utility. In particular, (a) a special kind of mean preserving spread of the induced utility distribution will always increase ambiguity content, and (b) utility distributions which are more shiftable by new information have higher ambiguity content.


Journal ArticleDOI
TL;DR: The history of both aggregate planning and learning models, the various combined models, and their appropriateness to a given environment are discussed.
Abstract: Production managers employ numerous aggregate planning models to smooth work loads and minimize labor and inventory costs. Some of the more recently developed models incorporate the learning that occurs during repetitive work. This article discusses the history of both aggregate planning and learning models, the various combined models, and their appropriateness to a given environment.