Journal ArticleDOI

Decision-theoretic foundations of qualitative possibility theory

TL;DR: This paper justifies two qualitative counterparts of the expected utility criterion for decision under uncertainty, which require only bounded, linearly ordered valuation sets for expressing uncertainty and preferences, and proposes an operationally testable description of possibility theory.
About: This article was published in the European Journal of Operational Research on 2001-02-01 and has received 273 citations to date. The article focuses on the topics: Uncertainty theory & Decision theory.
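
The two qualitative criteria at stake are the pessimistic and optimistic possibilistic utilities. Below is a minimal Python sketch, assuming possibility degrees and utilities mapped onto a common ordered scale represented here by [0, 1] (the commensurability assumption); function names and data are illustrative, not taken from the paper.

# Pessimistic and optimistic qualitative utilities on a common finite
# ordered scale, represented here by floats in [0, 1].

def pessimistic_utility(pi, u):
    # U-(f) = min over states s of max(n(pi(s)), u(f(s))),
    # with the order-reversing map n(x) = 1 - x on the unit scale.
    return min(max(1.0 - pi[s], u[s]) for s in pi)

def optimistic_utility(pi, u):
    # U+(f) = max over states s of min(pi(s), u(f(s))).
    return max(min(pi[s], u[s]) for s in pi)

# Possibility distribution over three states, and the utility of the
# act's outcome in each state, on the same ordered scale.
pi = {"s1": 1.0, "s2": 0.6, "s3": 0.2}
u  = {"s1": 0.4, "s2": 1.0, "s3": 0.0}

print(pessimistic_utility(pi, u))  # 0.4: cautious, guaranteed level
print(optimistic_utility(pi, u))   # 0.6: best plausible level
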
Citations
Posted Content
TL;DR: The main advances regarding the use of the Choquet and Sugeno integrals in multi-criteria decision aid over the last decade are reviewed in this paper; they concern mainly a bipolar extension of both the Choquet integral and the Sugeno integral.
Abstract: The main advances regarding the use of the Choquet and Sugeno integrals in multi-criteria decision aid over the last decade are reviewed. They concern mainly a bipolar extension of both the Choquet integral and the Sugeno integral, interesting particular submodels, new learning techniques, a better interpretation of the models and a better use of the Choquet integral in multi-criteria decision aid. Parallel to these theoretical works, the Choquet integral has been applied to many new fields, and several software packages and libraries dedicated to this model have been developed.

449 citations

Journal ArticleDOI
TL;DR: The main advances regarding the use of the Choquet and Sugeno integrals in multi-criteria decision aid over the last decade are reviewed, which concern mainly a bipolar extension of both the Choquet integral and the Sugeno integral.
Abstract: The main advances regarding the use of the Choquet and Sugeno integrals in multi-criteria decision aid over the last decade are reviewed. They concern mainly a bipolar extension of both the Choquet integral and the Sugeno integral, interesting particular submodels, new learning techniques, a better interpretation of the models and a better use of the Choquet integral in multi-criteria decision aid. Parallel to these theoretical works, the Choquet integral has been applied to many new fields, and several software packages and libraries dedicated to this model have been developed.

439 citations
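
Both integrals have simple discrete forms; the following Python sketch assumes a capacity (a monotone set function with mu(empty) = 0 and mu(all) = 1) listed explicitly over a two-criteria set. Names and data are illustrative.

def choquet(f, mu):
    # Discrete Choquet integral: sort criteria by increasing score and
    # accumulate score increments weighted by the capacity of the set
    # of criteria scoring at least that high.
    xs = sorted(f, key=f.get)
    total, prev = 0.0, 0.0
    for i, x in enumerate(xs):
        total += (f[x] - prev) * mu[frozenset(xs[i:])]
        prev = f[x]
    return total

def sugeno(f, mu):
    # Discrete Sugeno integral: max over thresholds of min(score,
    # capacity of the criteria scoring at least that high).
    xs = sorted(f, key=f.get)
    return max(min(f[x], mu[frozenset(xs[i:])]) for i, x in enumerate(xs))

# A capacity on the subsets of {a, b}.
mu = {frozenset(): 0.0, frozenset("a"): 0.3,
      frozenset("b"): 0.5, frozenset("ab"): 1.0}
f = {"a": 0.8, "b": 0.4}  # scores of one alternative on criteria a and b

print(choquet(f, mu))  # 0.4 * 1.0 + (0.8 - 0.4) * 0.3 = 0.52
print(sugeno(f, mu))   # max(min(0.4, 1.0), min(0.8, 0.3)) = 0.4
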

Journal ArticleDOI
TL;DR: An overview of some fuzzy set-based approaches to scheduling is proposed, emphasizing two distinct uses of fuzzy sets: representing preference profiles and modelling uncertainty distributions; a possibility-theoretic counterpart of PERT is also described.

303 citations


Cites background from "Decision-theoretic foundations of qualitative possibility theory"

  • ...See Fargier and Galvagnon (1999), Fargier et al. (2000), Dubois et al. (2003) where preliminary work for computing fuzzy latest starting times and fuzzy floats is described, especially in the case of series–parallel graphs....

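As a purely illustrative sketch (not the algorithm of the cited papers), a forward pass of a possibilistic PERT can propagate triangular fuzzy durations (min, mode, max) by component-wise addition and maximum; the component-wise maximum is a common approximation here.

# Forward pass on an activity-on-node graph with triangular fuzzy
# durations. Graph, durations, and helper names are illustrative.

def f_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def f_max(a, b):
    return tuple(max(x, y) for x, y in zip(a, b))

durations = {"A": (2, 3, 5), "B": (1, 2, 4), "C": (2, 2, 3)}
preds = {"A": [], "B": [], "C": ["A", "B"]}

earliest = {}
for node in ["A", "B", "C"]:  # nodes listed in topological order
    start = (0, 0, 0)
    for p in preds[node]:
        start = f_max(start, earliest[p])
    earliest[node] = f_add(start, durations[node])

print(earliest["C"])  # (4, 5, 8): fuzzy earliest finish time of C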

Journal ArticleDOI
TL;DR: The role of the existing body of fuzzy set aggregation operations is discussed for various kinds of problems in which the fusion of items coming from several sources is central.

288 citations


Cites background from "Decision-theoretic foundations of qualitative possibility theory"

  • ...When preserving commensurability between uncertainty and utility on finite chains, Sugeno integral has been proposed as a counterpart to expected utility [32]....


  • ...[32], which use weighted versions of minimum and maximum operations....


  • ...More recently, qualitative decision theory has promoted the notion of utility functions with values in finite ordered scales [32]....

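The weighted versions of minimum and maximum mentioned in these excerpts have standard possibilistic forms; a minimal Python sketch on the unit scale, with illustrative weights and scores.

def weighted_min(w, x):
    # min_i max(1 - w_i, x_i): a low-weight criterion cannot drag the
    # global score down.
    return min(max(1.0 - wi, xi) for wi, xi in zip(w, x))

def weighted_max(w, x):
    # max_i min(w_i, x_i): a low-weight criterion cannot push the
    # global score up.
    return max(min(wi, xi) for wi, xi in zip(w, x))

w = [1.0, 0.7, 0.3]  # importance weights, normalized so max w_i = 1
x = [0.8, 0.2, 0.0]  # satisfaction degrees of one alternative

print(weighted_min(w, x))  # min(0.8, 0.3, 0.7) = 0.3
print(weighted_max(w, x))  # max(0.8, 0.2, 0.0) = 0.8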

Journal ArticleDOI
TL;DR: A tentative assessment of the role of fuzzy sets in decision analysis is provided and a critical standpoint on the state-of-the-art is taken, in order to highlight the actual achievements and question what is often considered debatable by decision scientists observing the fuzzy decision analysis literature.

262 citations


Cites background from "Decision-theoretic foundations of qualitative possibility theory"

  • ...In this section, we outline such a research program....


References
Book
01 Jan 1944
TL;DR: Theory of Games and Economic Behavior is the classic work upon which modern-day game theory is based; it has been widely used to analyze a host of real-world phenomena, from arms races to optimal policy choices of presidential candidates, from vaccination policy to major league baseball salary negotiations.
Abstract: This is the classic work upon which modern-day game theory is based. What began more than sixty years ago as a modest proposal that a mathematician and an economist write a short paper together blossomed, in 1944, when Princeton University Press published "Theory of Games and Economic Behavior." In it, John von Neumann and Oskar Morgenstern conceived a groundbreaking mathematical theory of economic and social organization, based on a theory of games of strategy. Not only would this revolutionize economics, but the entirely new field of scientific inquiry it yielded--game theory--has since been widely used to analyze a host of real-world phenomena from arms races to optimal policy choices of presidential candidates, from vaccination policy to major league baseball salary negotiations. And it is today established throughout both the social sciences and a wide range of other sciences.

19,337 citations

Book
01 Jan 1976
TL;DR: This book develops an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order of infinity," and Dempster's rule for combining such set functions.
Abstract: Both in science and in practical affairs we reason by combining facts only inconclusively supported by evidence. Building on an abstract understanding of this process of combination, this book constructs a new theory of epistemic probability. The theory draws on the work of A. P. Dempster but diverges from Dempster's viewpoint by identifying his "lower probabilities" as epistemic probabilities and taking his rule for combining "upper and lower probabilities" as fundamental. The book opens with a critique of the well-known Bayesian theory of epistemic probability. It then proceeds to develop an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order of infinity," and Dempster's rule for combining such set functions. This rule, together with the idea of "weights of evidence," leads to both an extensive new theory and a better understanding of the Bayesian theory. The book concludes with a brief treatment of statistical inference and a discussion of the limitations of epistemic probability. Appendices contain mathematical proofs, which are relatively elementary and seldom depend on mathematics more advanced than the binomial theorem.

14,565 citations
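
Dempster's rule of combination, which the book takes as fundamental, has a direct computational form; a minimal Python sketch for mass functions over a small frame of discernment (data are illustrative).

# Dempster's rule for two mass functions on a finite frame. Masses map
# frozensets (focal elements) to weights summing to 1.

def dempster_combine(m1, m2):
    joint, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            a = b & c
            if a:
                joint[a] = joint.get(a, 0.0) + mb * mc
            else:
                conflict += mb * mc  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {a: v / (1.0 - conflict) for a, v in joint.items()}

frame = frozenset({"rain", "sun"})
m1 = {frozenset({"rain"}): 0.6, frame: 0.4}  # first source
m2 = {frozenset({"sun"}): 0.5, frame: 0.5}   # second source

print(dempster_combine(m1, m2))
# conflict = 0.6 * 0.5 = 0.3; m({rain}) = 0.3 / 0.7 ≈ 0.43, etc.
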

Journal ArticleDOI
TL;DR: The theory of possibility described in this paper is related to the theory of fuzzy sets by defining the concept of a possibility distribution as a fuzzy restriction which acts as an elastic constraint on the values that may be assigned to a variable.

8,918 citations
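
A possibility distribution in this sense induces a possibility measure and a dual necessity measure on events; a minimal Python sketch with illustrative data.

# Pi(A) = max of pi over A; N(A) = 1 - Pi(complement of A).

def possibility(pi, event):
    return max((pi[x] for x in event), default=0.0)

def necessity(pi, event):
    return 1.0 - possibility(pi, set(pi) - set(event))

pi = {"low": 0.2, "medium": 1.0, "high": 0.5}  # elastic constraint on a variable
A = {"medium", "high"}

print(possibility(pi, A))  # 1.0: A is fully possible
print(necessity(pi, A))    # 0.8: A is fairly certain, 1 - pi("low")
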

Book
01 Jan 1954

7,545 citations

Journal ArticleDOI
Daniel Ellsberg
TL;DR: Frank Knight's distinction between measurable "risk" and unmeasurable "uncertainty" is revisited, together with the persistent feeling that people tend to behave "as though" they assigned numerical probabilities, or "degrees of belief," to the events impinging on their actions.
Abstract: Are there uncertainties that are not risks? There has always been a good deal of skepticism about the behavioral significance of Frank Knight's distinction between “measurable uncertainty” or “risk”, which may be represented by numerical probabilities, and “unmeasurable uncertainty” which cannot. Knight maintained that the latter “uncertainty” prevailed – and hence that numerical probabilities were inapplicable – in situations when the decision-maker was ignorant of the statistical frequencies of events relevant to his decision; or when a priori calculations were impossible; or when the relevant events were in some sense unique; or when an important, once-and-for-all decision was concerned. Yet the feeling has persisted that, even in these situations, people tend to behave “as though” they assigned numerical probabilities, or “degrees of belief,” to the events impinging on their actions. However, it is hard either to confirm or to deny such a proposition in the absence of precisely-defined procedures for measuring these alleged “degrees of belief.” What might it mean operationally, in terms of refutable predictions about observable phenomena, to say that someone behaves “as if” he assigned quantitative likelihoods to events: or to say that he does not? An intuitive answer may emerge if we consider an example proposed by Shackle, who takes an extreme form of the Knightian position that statistical information on frequencies within a large, repetitive class of events is strictly irrelevant to a decision whose outcome depends on a single trial.

7,005 citations
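
The operational question raised here is classically illustrated by Ellsberg's urn: 30 red balls and 60 black or yellow balls in unknown proportion. The typical preference pattern, betting on red rather than black but on black-or-yellow rather than red-or-yellow, fits no single probability assignment, as the following illustrative Python check makes explicit.

# Search all candidate compositions of the urn for a probability of
# black that reproduces both modal choices; none exists.

P_RED = 30 / 90
consistent = []
for black in range(61):  # candidate counts of black balls among 60
    p_black = black / 90
    p_yellow = (60 - black) / 90
    prefers_red = P_RED > p_black                        # first choice
    prefers_by = p_black + p_yellow > P_RED + p_yellow   # second choice
    if prefers_red and prefers_by:
        consistent.append(p_black)

print(consistent)  # []: no probability assignment fits both preferences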