
Showing papers by "Henri Prade published in 2003"


01 Jan 2003
TL;DR: It is proved that the obtained possibility distribution is the least biased representation of the agent's state of knowledge compatible with the observed betting behaviour; the problem of finding the least informative belief function having a given pignistic probability is posed and solved.
Abstract: Based on the setting of exchangeable bets, this paper proposes a subjectivist view of numerical possibility theory. It relies on the assumption that when an agent constructs a probability measure by assigning prices to lotteries, this probability measure is actually induced by a belief function representing the agent's actual state of knowledge. We also assume that the probability measure proposed by the agent in the course of the elicitation procedure is constructed via the so-called pignistic transformation (mathematically equivalent to the Shapley value in game theory). We pose and solve the problem of finding the least informative belief function having a given pignistic probability. We prove that it is unique and consonant, thus induced by a possibility distribution. This result exploits a simple informational ordering, in agreement with partial orderings between belief functions, comparing their information content. The obtained possibility distribution is subjective in the same sense as in the subjectivist school in probability theory. However, we claim that it is the least biased representation of the agent's state of knowledge compatible with the observed betting behaviour.
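The least informative consonant transformation discussed above admits a compact closed form: given a pignistic probability p on a finite set, the possibility degree of x is the sum over y of min(p(x), p(y)). A minimal sketch (illustrative only; the function name is ours):

```python
def pignistic_to_possibility(p):
    """Least informative (consonant) possibility distribution whose
    pignistic transform is the probability vector p:
    pi(x) = sum over y of min(p(x), p(y))."""
    return [sum(min(px, py) for py in p) for px in p]

# A uniform probability yields the vacuous possibility distribution
# (all degrees equal to 1), reflecting total ignorance.
```

For p = (0.5, 0.3, 0.2) this yields possibility degrees (1.0, 0.8, 0.6); applying the pignistic transformation to that distribution recovers p.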

99 citations


Book ChapterDOI
TL;DR: In this paper, a theoretical basis of fuzzy association rules is proposed by generalizing the classification of the data stored in a database into positive, negative, and irrelevant examples of a rule.
Abstract: Several approaches generalizing association rules to fuzzy association rules have been proposed so far. While the formal specification of fuzzy associations is more or less straightforward, the evaluation of such rules by means of appropriate quality measures assumes an understanding of the semantic meaning of a fuzzy rule. In this respect, most existing proposals can be considered ad hoc to some extent. In this paper, we suggest a theoretical basis of fuzzy association rules by generalizing the classification of the data stored in a database into positive, negative, and irrelevant examples of a rule.
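The positive/negative/irrelevant trichotomy generalizes naturally to fuzzy rules. One common min-based instance (a sketch in the spirit of, but not copied from, the paper) grades each database record as follows:

```python
def classify_example(mu_a, mu_b):
    """Degrees to which a record is a positive example, a
    counterexample, or irrelevant for the rule A -> B, given its
    memberships mu_a in A and mu_b in B (one standard min-based
    generalization; other t-norms are possible)."""
    positive = min(mu_a, mu_b)          # satisfies both antecedent and consequent
    negative = min(mu_a, 1.0 - mu_b)    # satisfies antecedent, violates consequent
    irrelevant = 1.0 - mu_a             # does not satisfy the antecedent
    return positive, negative, irrelevant
```

With crisp memberships (0 or 1) this reduces exactly to the classical classification of records with respect to an association rule.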

76 citations


Journal Article
TL;DR: The paper provides a detailed presentation of the calculus of fuzzy Allen relations (including the composition table of these relations), and discusses the patterns for propagating uncertainty about (fuzzy) Allen relations in a possibilistic way.
Abstract: This paper proposes a general discussion of the handling of imprecise and uncertain information in temporal reasoning in the framework of fuzzy sets and possibility theory. The introduction of fuzzy features in temporal reasoning can be related to different issues. First, it can be motivated by the need of a gradual, linguistic-like description of temporal relations even in the face of complete information. An extension of Allen relational calculus is proposed, based on fuzzy comparators expressing linguistic tolerance. Fuzzy Allen relations are defined from a fuzzy partition made by three possible fuzzy relations between dates (approximately equal, clearly smaller, and clearly greater). Second, the handling of fuzzy or incomplete information leads to pervade classical Allen relations, and more generally fuzzy Allen relations, with uncertainty. The paper provides a detailed presentation of the calculus of fuzzy Allen relations (including the composition table of these relations). Moreover, the paper discusses the patterns for propagating uncertainty about (fuzzy) Allen relations in a possibilistic way.

73 citations


Book ChapterDOI
25 May 2003
TL;DR: An overview of possibility theory is provided, emphasizing its historical roots and recent developments; quantitative possibility theory is presented as the simplest framework for statistical reasoning with imprecise probabilities.
Abstract: This paper provides an overview of possibility theory, emphasizing its historical roots and its recent developments. Possibility theory lies at the crossroads between fuzzy sets, probability and nonmonotonic reasoning. Possibility theory can be cast either in an ordinal or in a numerical setting. Qualitative possibility theory is closely related to belief revision theory, and commonsense reasoning with exception-tainted knowledge in Artificial Intelligence. It has been axiomatically justified in a decision-theoretic framework in the style of Savage, thus providing a foundation for qualitative decision theory. Quantitative possibility theory is the simplest framework for statistical reasoning with imprecise probabilities. As such it has close connections with random set theory and confidence intervals, and can provide a tool for uncertainty propagation with limited statistical or subjective information.
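The two set functions at the core of possibility theory are immediate to compute from a possibility distribution: Π(A) is the maximum of π over A, and N(A) = 1 − Π(complement of A). A small illustration (the example universe is ours):

```python
def possibility(pi, event):
    """Possibility of an event: the best case it contains."""
    return max(pi[x] for x in event) if event else 0.0

def necessity(pi, event):
    """Necessity of an event: 1 minus the possibility of its complement."""
    complement = set(pi) - set(event)
    return 1.0 - (max(pi[x] for x in complement) if complement else 0.0)
```

Note the characteristic asymmetry: an event can be fully possible (Π = 1) while not necessary at all (N = 0), which is how the theory represents ignorance.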

65 citations


Journal ArticleDOI
TL;DR: The notion of an “if …, then …” rule is examined in the context of positive and negative information and a new compositional rule of inference adapted to conjunctive rules, specific to positive information, is proposed.
Abstract: This article expresses the idea that information encoded on a computer may have a negative or positive emphasis. Negative information corresponds to the statement that some situations are impossible. Often, it is the case for pieces of background knowledge expressed in a logical format. Positive information corresponds to observed cases. It is encountered often in data-driven mathematical models, learning, etc. The notion of an “if …, then …” rule is examined in the context of positive and negative information. It is shown that it leads to the three-valued representation of a rule, after De Finetti, according to which a given state of the world is an example of the rule, a counterexample to the rule, or is irrelevant for the rule. This view also sheds light on the typology of fuzzy rules. It explains the difference between a fuzzy rule modeled by a many-valued implication and expressing negative information and a fuzzy rule modeled by a conjunction (a la Mamdani) and expressing positive information. A new compositional rule of inference adapted to conjunctive rules, specific to positive information, is proposed. Consequences of this framework on interpolation between sparse rules are also presented. © 2003 Wiley Periodicals, Inc.
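The contrast between the two kinds of rules can be made concrete: a conjunction-based (Mamdani-style) rule accumulates positive information from below, while an implication-based rule constrains from above. A toy sketch on a discretized output domain (Kleene-Dienes is chosen here as one example of a many-valued implication):

```python
def mamdani_output(alpha, mu_b):
    """Conjunctive (Mamdani) rule fired at degree alpha: positive
    information, output = min(alpha, B) pointwise (a lower bound on
    what is guaranteed possible)."""
    return [min(alpha, b) for b in mu_b]

def implicative_output(alpha, mu_b):
    """Implication-based rule (Kleene-Dienes: a -> b = max(1 - a, b)):
    negative information, output is an upper constraint on what
    remains possible."""
    return [max(1.0 - alpha, b) for b in mu_b]
```

Pointwise, the conjunctive output always lies below the implicative one, consistent with the reading that observed (positive) information should not exceed what the constraints (negative information) allow.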

51 citations


Journal ArticleDOI
TL;DR: A set of sound inference rules involving the tolerance parameters is provided, in full accordance with the combination/projection principle underlying the approximate reasoning method of Zadeh; these rules ensure a local propagation of fuzzy closeness and negligibility relations.
Abstract: This paper proposes a fuzzy set-based approach for handling relative orders of magnitude stated in terms of closeness and negligibility relations. At the semantic level, these relations are represented by means of fuzzy relations controlled by tolerance parameters. A set of sound inference rules, involving the tolerance parameters, is provided, in full accordance with the combination/projection principle underlying the approximate reasoning method of Zadeh. These rules ensure a local propagation of fuzzy closeness and negligibility relations. A numerical semantics is then attached to the symbolic computation process. Required properties of the tolerance parameter are investigated, in order to preserve the validity of the produced conclusions. The effect of the chaining of rules in the inference process can be controlled through the gradual deterioration of closeness and negligibility relations involved in the produced conclusions. Finally, qualitative reasoning based on fuzzy closeness and negligibility relations is used for simplifying equations and solving them in an approximate way, as often done by engineers who reason about a mathematical model. The problem of handling qualitative probabilities in reasoning under uncertainty is also investigated in this perspective.
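A plausible concrete instance of such tolerance-parameterized relations (our own simplification, not the paper's exact definitions): closeness grades how near the ratio x/y is to 1, and negligibility of x with respect to y can be reduced to closeness of x + y to y.

```python
def closeness(x, y, eps):
    """Graded closeness Cl(x, y): degree to which x / y lies near 1,
    using a triangular tolerance of half-width eps around 1
    (eps is the tolerance parameter; this shape is illustrative)."""
    r = x / y
    return max(0.0, 1.0 - abs(r - 1.0) / eps)

def negligibility(x, y, eps):
    """x negligible w.r.t. y, read as: x + y is close to y."""
    return closeness(x + y, y, eps)
```

Tightening eps makes both relations stricter, which mirrors the paper's point that the validity of chained conclusions degrades gradually with the tolerance parameters.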

45 citations


Journal ArticleDOI
TL;DR: A principled approach to multicriteria decision making (MCDM) where the worth of decisions along attributes is not supposed to be quantified, as in multiattribute utility theory, or even measured on a unique scale is proposed.
Abstract: This article proposes a principled approach to multicriteria decision making (MCDM) where the worth of decisions along attributes is not supposed to be quantified, as in multiattribute utility theory, or even measured on a unique scale. This approach actually generalizes additive concordance rules a la Electre and is rigorously justified in an axiomatic way by representation theorems. We indeed show that the use of a generalized concordance (GC) rule is the only possible approach when in a purely ordinal framework and that the satisfaction of very simple principles forces the use of possibility theory as the unique way of expressing the importance of coalitions of criteria. © 2003 Wiley Periodicals, Inc.

42 citations


Journal ArticleDOI
TL;DR: This paper presents an implemented information system (applied to a database describing houses to let), based on an approach developed in the fuzzy set and possibility theory setting that provides a unified framework for expressing users' preferences about what they are looking for.
Abstract: Queries to a database can be made more powerful by allowing flexibility in the specification of what has to be retrieved, and by referring to cases either for expressing the request, or for computing the answer. In this paper, we present an implemented information system (applied to a database describing houses to let), based on an approach developed in the fuzzy set and possibility theory setting. This provides a unified framework for expressing users' preferences about what they are looking for, for weighting the importance of requirements, for referring to examples that they like and/or counterexamples that they dislike, and for making case-based predictions. Thus information querying goes beyond the retrieving of items from a database, and involves associated tools which help the user to figure out the actual contents of the database.

42 citations


Journal ArticleDOI
TL;DR: This Special Issue is meant to bridge the gap between mainstream AI and current fuzzy set research, and to provide an organized view of recent works by gathering representative applications of fuzzy set and possibility theory-based methods to AI problems.

36 citations


Journal Article
TL;DR: A new logic is presented that encompasses possibilistic logic and quasi-classical logic, and preserves the merits of both logics, and can handle plain conflicts taking place at the same level of certainty.
Abstract: Possibilistic logic and quasi-classical logic are two logics that were developed in artificial intelligence for coping with inconsistency in different ways, yet preserving the main features of classical logic. This paper presents a new logic, called quasi-possibilistic logic, that encompasses possibilistic logic and quasi-classical logic, and preserves the merits of both. Indeed, it can handle plain conflicts taking place at the same level of certainty (as in quasi-classical logic), and take advantage of the stratification of the knowledge base into certainty layers for introducing gradedness in conflict analysis (as in possibilistic logic). When querying knowledge bases, it may be of interest to evaluate the extent to which the relevant available information is precise and consistent. The paper reviews measures of (im)precision and inconsistency/conflict existing in possibilistic logic and quasi-classical logic, and proposes generalized measures in the unified framework.

30 citations


Journal ArticleDOI
TL;DR: The paper studies more closely the representation of such fuzzy rules as a convex combination of gradual rules, a special type of implication-based fuzzy rule inducing a crisp relation; this representation is shown to be unique provided that the implication operator used for modeling the fuzzy rule does not satisfy a special kind of strict monotonicity condition.

Journal ArticleDOI
01 Sep 2003
TL;DR: This paper deals with the extraction of default rules from a database of examples based on a special kind of probability distributions, called "big-stepped probabilities", which are known to provide a semantics for non-monotonic reasoning.
Abstract: This paper deals with the extraction of default rules from a database of examples. The proposed approach is based on a special kind of probability distributions, called "big-stepped probabilities", which are known to provide a semantics for non-monotonic reasoning. The rules which are learnt are genuine default rules, which could be used (under some conditions) in a non-monotonic reasoning system and can be encoded in possibilistic logic.
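Big-stepped probabilities obey a lexicographic-like condition: once the atom probabilities are sorted in decreasing order, each one must exceed the sum of all the smaller ones. A minimal checker (illustrative; the strictness tolerance is ours):

```python
def is_big_stepped(p, tol=1e-12):
    """Check the big-stepped condition: with the probabilities sorted
    in decreasing order, every value strictly exceeds the sum of all
    smaller values."""
    q = sorted(p, reverse=True)
    return all(q[i] > sum(q[i + 1:]) + tol for i in range(len(q) - 1))
```

Under such a distribution, "generally A implies B" can be read as P(B | A) > P(not-B | A), which is what makes the extracted rules genuine default rules usable in a non-monotonic reasoning system.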

Book ChapterDOI
22 Sep 2003
TL;DR: Different types of first-order fuzzy rules are described, together with a method for learning each type; the interest of each type is discussed on a benchmark example.
Abstract: The interest of introducing fuzzy predicates when learning rules is twofold. When dealing with numerical data, it enables us to avoid arbitrary discretization. Moreover, it enlarges the expressive power of what is learned by considering different types of fuzzy rules, which may describe gradual behaviors of related attributes or uncertainty pervading conclusions. This paper describes different types of first-order fuzzy rules and a method for learning each type. Finally, we discuss the interest of each type of rules on a benchmark example.

Book ChapterDOI
TL;DR: Extensions of the classical confidence measure based on the α-cut decompositions of the fuzzy sets are proposed to address the problems associated with the normalization in scalar-valued generalizations of confidence.
Abstract: This paper investigates techniques to identify and evaluate associations in a relational database that are expressed by fuzzy if-then rules. Extensions of the classical confidence measure based on the α-cut decompositions of the fuzzy sets are proposed to address the problems associated with the normalization in scalar-valued generalizations of confidence. An analysis by α-level differentiates strongly and weakly supported associations and identifies robustness in an association. In addition, a method is proposed to assess the validity of a fuzzy association based on the ratio of examples to counterexamples.
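The α-cut view of confidence reduces, at each level α, to an ordinary ratio of supporting records; scanning several levels then separates strongly from weakly supported associations. A sketch under the assumption that records are given as parallel membership lists (names and representation are ours):

```python
def alpha_confidence(mu_a, mu_b, alpha):
    """Crisp confidence of the association A -> B at level alpha,
    computed on the alpha-cuts of the fuzzy sets A and B.
    Returns None when no record reaches level alpha in A."""
    support = [i for i, a in enumerate(mu_a) if a >= alpha]
    if not support:
        return None
    hits = sum(1 for i in support if mu_b[i] >= alpha)
    return hits / len(support)
```

Plotting this value against α differentiates robust associations (confidence stays high across levels) from ones that hold only at low membership thresholds.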

Book ChapterDOI
TL;DR: An algorithm based on Inductive Logic Programming for inducing first-order Horn clauses involving fuzzy predicates from a database is presented; it relies on a probabilistic processing of fuzzy predicates, in agreement with the handling of probabilities in first-order logic.
Abstract: The paper presents an algorithm based on Inductive Logic Programming for inducing first-order Horn clauses involving fuzzy predicates from a database. For this, a probabilistic processing of fuzzy predicates is used, in agreement with the handling of probabilities in first-order logic. This technique is illustrated on an experimental application. The interest of learning fuzzy first-order logic expressions is emphasized.

01 Jan 2003
TL;DR: This introductory article shows the interest of a bipolar approach to the representation of knowledge or preferences, which makes it possible to distinguish between negative information and positive information.
Abstract: This introductory article shows the interest of a bipolar approach to the representation of knowledge or preferences, which makes it possible to distinguish between negative and positive information. This distinction also proves fruitful for the treatment of conditional statements.

Book ChapterDOI
02 Jul 2003
TL;DR: A method for inducing first-order rules with fuzzy predicates from a database allowing for some tolerance with respect to the interpretative scope of the predicates, and fuzzy rules aiming at expressing a set of ordinary rules in a global way are described.
Abstract: The paper describes a method for inducing first-order rules with fuzzy predicates from a database. First, the paper makes a distinction between fuzzy rules allowing for some tolerance with respect to the interpretative scope of the predicates, and fuzzy rules aiming at expressing a set of ordinary rules in a global way. Moreover the paper only considers the induction of Horn-like implicative-based fuzzy rules. Specific confidence degrees are associated with each kind of fuzzy rules in the inductive process. This technique is illustrated on an experimental application. The interest of learning various types of fuzzy first-order logic expressions is emphasized.


Book ChapterDOI
TL;DR: Conditions are provided under which the relation between quantities estimated in terms of fuzzy absolute labels can be expressed in terms of fuzzy relative orders of magnitude.
Abstract: Fuzzy absolute and fuzzy relative orders of magnitude models are recalled. Then, the problem of the consistency of these models is addressed. The paper provides conditions under which the relation between quantities estimated in terms of fuzzy absolute labels can be expressed in terms of fuzzy relative orders of magnitude. Conversely, possible estimates in terms of fuzzy absolute labels are obtained from information about fuzzy relative and fuzzy absolute orders of magnitude.

Proceedings Article
09 Aug 2003
TL;DR: An approach to the approximate description of univariate real-valued functions in terms of precise or imprecise reference points, and interpolation between these points, is presented; it relies on gradual rules expressing that the closer the variable is to the abscissa of a reference point, the closer the value of the function is to the ordinate of this reference point.
Abstract: This paper presents an approach to the approximate description of univariate real-valued functions in terms of precise or imprecise reference points and interpolation between these points. It is achieved by means of gradual rules which express that the closer the variable to the abscissa of a reference point, the closer the value of the function to the ordinate of this reference point. Gradual rules enable us to specify sophisticated gauges, under the form of connected areas, inside of which the function belonging to the class under consideration should remain. This provides a simple and efficient tool for categorizing signals. This tool can be further improved by making the gauge flexible by means of fuzzy gradual rules. This is illustrated on a benchmark example.
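The gauge idea can be caricatured with a band that is pinned at the reference points and widens in between; the actual construction in the paper uses gradual rules, but the following toy version (our own simplification, with a made-up widening parameter) conveys the shape of the constraint:

```python
def gauge_interval(x, p0, p1, widen=0.5):
    """Toy imprecise-interpolation gauge between two reference points
    p0 = (x0, y0) and p1 = (x1, y1): the admissible band is tight at
    the reference points and widens with the distance to the nearest
    one. Returns (lower, upper) bounds on f(x)."""
    (x0, y0), (x1, y1) = p0, p1
    t = (x - x0) / (x1 - x0)
    center = y0 + t * (y1 - y0)          # ordinary linear interpolation
    slack = widen * min(x - x0, x1 - x)  # zero at the endpoints
    return center - slack, center + slack
```

A signal is accepted by the gauge when it stays inside the band over the whole domain, which is the categorization mechanism the abstract alludes to.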

Book ChapterDOI
04 Dec 2003
TL;DR: A new approach based on possibility theory is proposed, which integrates both the merits of argumentation-based negotiation and of heuristic methods looking for making trade-offs.
Abstract: Negotiation plays a key role as a means for sharing information and resources with the aim of looking for a common agreement. This paper proposes a new approach based on possibility theory, which integrates both the merits of argumentation-based negotiation and of heuristic methods looking for making trade-offs. This unified setting proves to be convenient not only for representing the mental states of the agents (beliefs possibly pervaded with uncertainty, and prioritized goals), but also for revising the belief bases and for selecting a new offer.

Journal ArticleDOI
TL;DR: A general approach for fusing prioritized bases, where priorities are encoded in the possibilistic logic framework, is presented; it provides a syntactic counterpart to the merging of propositional bases.
Abstract: This paper presents a general approach, both syntactic and semantic, for fusing prioritized bases where priorities are encoded in the possibilistic logic framework. First, a survey of different classes of possibilistic merging operators is provided, analyzing the behaviors of conjunctive and disjunctive operators, having or not a reinforcement effect. We then show that approaches which have been recently proposed for fusing propositional bases can be easily encoded in the possibilistic setting. This has two major advantages: first, the result of merging is a base with explicit priorities, hence the merging process can be iterated in a coherent way; moreover, it provides a syntactic counterpart to the merging of propositional bases, which is generally defined in a semantic way.

01 Jan 2003
TL;DR: This paper suggests a theoretical basis of fuzzy association rules by generalizing the classification of the data stored in a database into positive, negative, and irrelevant examples of a rule.

Proceedings Article
01 Sep 2003
TL;DR: The paper discusses a method for introducing membership degrees inside the interpolation graph by means of gradual rules for delimiting areas where the function may lie between known points.
Abstract: Functional laws may be known only at a finite number of points, and then the function can be completed by interpolation techniques obeying some smoothness conditions. We rather propose here to specify constraints by means of gradual rules for delimiting areas where the function may lie between known points. Such an approach results in an imprecise interpolation graph whose shape is controlled by tuning the fuzziness attached to the reference points. However, the graph so built is still crisp, which means that different possible paths between the interpolation points cannot be distinguished according to their plausibility. The paper discusses a method for introducing membership degrees inside the interpolation graph. The developed formalism relies on the use of weighted nested graphs. It amounts to handling level-2 gradual rules for specifying a family of flexible constraints on the reference points. The proposed approach is compared with that of extending gradual rules for dealing with type-2 fuzzy reference points.

Book ChapterDOI
TL;DR: This paper presents an alternative to precise analytical modelling, by means of imprecise interpolative models, based on gradual rules that express constraints that govern the interpolation mechanism to the classification of time series.
Abstract: This paper presents an alternative to precise analytical modelling, by means of imprecise interpolative models. The model specification is based on gradual rules that express constraints that govern the interpolation mechanism. The modelling strategy is applied to the classification of time series. In this context, it is shown that good recognition performance can be obtained with models that are highly imprecise.


Book ChapterDOI
01 Jan 2003
TL;DR: This work formalizes case-based decision making within the framework of fuzzy sets and possibility theory and proposes a reasonable relaxation of the original decision principle, namely to look for acts which have yielded good results, not necessarily for all, but at least for most cases in the past.
Abstract: The idea of case-based decision making has recently been proposed as an alternative to expected utility theory. It combines concepts and principles from both decision theory and case-based reasoning. Loosely speaking, a case-based decision maker learns by storing already experienced decision problems, along with a rating of the results. Whenever a new problem needs to be solved, possible actions are assessed on the basis of experience from similar situations in which these actions have already been applied. We formalize case-based decision making within the framework of fuzzy sets and possibility theory. The basic idea underlying this approach is to give preference to acts which have always led to good results for problems which are similar to the current one. We also propose two extensions of the basic model. Firstly, we deal separately with situations where an agent has made very few, if any, observations. Obviously, such situations are difficult to handle for a case-based approach. Secondly, we propose a reasonable relaxation of the original decision principle, namely to look for acts which have yielded good results, not necessarily for all, but at least for most cases in the past.
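The basic decision principle ("an act is good insofar as it gave good results in all similar past cases") has a standard possibilistic reading: the pessimistic evaluation of an act is the minimum, over stored cases, of max(1 − similarity, utility). A sketch with hypothetical similarity and utility functions:

```python
def case_based_utility(problem, memory, sim, utility):
    """Pessimistic possibilistic case-based evaluation of an act.
    memory: list of (past_problem, outcome) pairs observed for the act;
    sim and utility map into [0, 1]. A dissimilar case (sim near 0)
    imposes no constraint, while a very similar case with a bad
    outcome drags the evaluation down."""
    return min(max(1.0 - sim(problem, past), utility(outcome))
               for past, outcome in memory)
```

The relaxation mentioned at the end of the abstract would replace this min (good results in *all* similar cases) with a softer, quantifier-like aggregation (good results in *most* similar cases).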

Book ChapterDOI
01 Jan 2003
TL;DR: This paper discusses the fusion of multiple-source information in this setting; different classes of merging operators are considered, at the semantic and the syntactic level, including conjunctive, disjunctive, reinforcement, adaptive and averaging operators.
Abstract: The problem of merging or combining information from multiple sources is central in many information processing areas such as database integration, expert opinion pooling, preference aggregation, etc. Possibilistic logic offers a qualitative framework for representing pieces of information associated with levels of uncertainty or priority. This paper discusses the fusion of multiple-source information in this setting. Different classes of merging operators are considered, at the semantic and the syntactic level, including conjunctive, disjunctive, reinforcement, adaptive and averaging operators. This framework appears to be the syntactic counterpart of the pointwise aggregation of possibility distributions or fuzzy sets.
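At the semantic level, the basic operators are pointwise combinations of possibility distributions: min gives the idempotent conjunctive mode and max the disjunctive one (reinforcement and adaptive operators refine these). A minimal sketch over a shared finite universe:

```python
def merge_conjunctive(dists):
    """Idempotent conjunctive merging of possibility distributions:
    pointwise minimum (assumes all sources are reliable)."""
    return {x: min(d[x] for d in dists) for x in dists[0]}

def merge_disjunctive(dists):
    """Disjunctive merging: pointwise maximum (assumes at least one
    source is reliable, without knowing which)."""
    return {x: max(d[x] for d in dists) for x in dists[0]}
```

When the conjunctive result is subnormalized (no world gets degree 1), the sources conflict; adaptive operators switch toward the disjunctive mode in exactly that situation.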