
Showing papers presented at "Uncertainty in Artificial Intelligence" in 1991


Book ChapterDOI
13 Jul 1991
TL;DR: In this paper, the problem of theory refinement under uncertainty is reviewed in the context of Bayesian statistics, a theory of belief revision, and reduced to an incremental learning task: the learning system is initially primed with a partial theory supplied by a domain expert, and thereafter maintains its own internal representation of alternative theories, which can be interrogated by the domain expert and incrementally refined from data.
Abstract: Theory refinement is the task of updating a domain theory in the light of new cases, either automatically or with some expert assistance. The problem of theory refinement under uncertainty is reviewed here in the context of Bayesian statistics, a theory of belief revision. The problem is reduced to an incremental learning task as follows: the learning system is initially primed with a partial theory supplied by a domain expert, and thereafter maintains its own internal representation of alternative theories, which can be interrogated by the domain expert and incrementally refined from data. Algorithms for refinement of Bayesian networks are presented to illustrate what is meant by "partial theory", "alternative theory representation", etc. The algorithms are incremental variants of batch learning algorithms from the literature, and so work well in both batch and incremental mode.
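The incremental scheme sketched in this abstract lends itself to a compact illustration. The following is a minimal, hypothetical sketch (not the authors' code): the expert's partial theory enters as Dirichlet-style prior counts over one node's conditional distribution, and each new case updates those counts, so processing cases one at a time or in a batch yields the same estimates.

```python
from collections import defaultdict

class IncrementalCPT:
    """One node's conditional distribution, refined case by case.

    The expert's partial theory supplies prior pseudo-counts; each
    observed case adds one real count, so batch and incremental
    processing of the same cases give identical estimates.
    """

    def __init__(self, prior_counts):
        # prior_counts: {(parent_state, node_value): pseudo-count}
        self.counts = defaultdict(float, prior_counts)

    def refine(self, parent_state, node_value):
        self.counts[(parent_state, node_value)] += 1.0

    def prob(self, node_value, parent_state, values):
        total = sum(self.counts[(parent_state, v)] for v in values)
        return self.counts[(parent_state, node_value)] / total

# Expert's prior: rain makes wet grass likely (8 vs 2 pseudo-counts).
cpt = IncrementalCPT({("rain", "wet"): 8.0, ("rain", "dry"): 2.0})
cpt.refine("rain", "wet")                       # one new observed case
print(cpt.prob("wet", "rain", ["wet", "dry"]))  # 9/11 ≈ 0.818
```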

748 citations


Book ChapterDOI
13 Jul 1991
TL;DR: In this paper, a Bayesian method for constructing Bayesian belief networks from a database of cases is presented, which can be used for automated hypothesis testing, automated scientific discovery, and automated construction of probabilistic expert systems.
Abstract: This paper presents a Bayesian method for constructing Bayesian belief networks from a database of cases. Potential applications include computer-assisted hypothesis testing, automated scientific discovery, and automated construction of probabilistic expert systems. Results are presented of a preliminary evaluation of an algorithm for constructing a belief network from a database of cases. We relate the methods in this paper to previous work, and we discuss open problems.
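The heart of such a construction method is a closed-form score for how well a candidate parent set explains the database. The sketch below computes that marginal-likelihood score for a single node under uniform Dirichlet priors, in log space; it illustrates the scoring idea under stated assumptions (discrete variables, child values coded 0..r-1) and is not the paper's implementation.

```python
from math import lgamma
from collections import Counter

def log_score(data, child, parents, arity):
    """log P(child's column of the database | parent set), assuming
    uniform Dirichlet priors over the child's conditional distributions.

    Uses the closed form  prod_j [(r-1)!/(N_j+r-1)!] prod_k N_jk!
    with factorials computed via lgamma(n+1) = log n!.
    """
    r = arity[child]
    n_jk = Counter((tuple(case[p] for p in parents), case[child])
                   for case in data)
    n_j = Counter(tuple(case[p] for p in parents) for case in data)
    score = 0.0
    for j, n in n_j.items():
        score += lgamma(r) - lgamma(n + r)     # (r-1)! / (N_j + r - 1)!
        for k in range(r):
            score += lgamma(n_jk[(j, k)] + 1)  # N_jk!
    return score

cases = [{"a": 0, "b": 0}, {"a": 0, "b": 0}, {"a": 1, "b": 1}]
# Compare candidate structures: is "a" a useful parent of "b"?
print(log_score(cases, "b", ["a"], {"a": 2, "b": 2}))  # ln(1/6)
print(log_score(cases, "b", [],    {"a": 2, "b": 2}))  # ln(1/12)
```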

259 citations


Book ChapterDOI
13 Jul 1991
TL;DR: In this paper, the authors discuss representing and reasoning with knowledge about the time-dependent utility of an agent's actions, present a semantics for time-dependent utility, and describe the use of time-dependent information in decision contexts.
Abstract: We discuss representing and reasoning with knowledge about the time-dependent utility of an agent's actions. Time-dependent utility plays a crucial role in the interaction between computation and action under bounded resources. We present a semantics for time-dependent utility and describe the use of time-dependent information in decision contexts. We illustrate our discussion with examples of time-pressured reasoning in Protos, a system constructed to explore the ideal control of inference by reasoners with limited abilities.
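The role of time-dependent utility in controlling inference can be made concrete with a toy sketch: the comprehensive value of acting at time t is the intrinsic value of the best result available then, discounted by the cost of the time spent computing it. The linear cost model and the numbers below are illustrative assumptions, not the paper's model.

```python
def net_utility(intrinsic, t, cost_rate=0.1):
    """Comprehensive utility: intrinsic value of the result minus
    a cost that accrues while the agent deliberates."""
    return intrinsic - cost_rate * t

# Quality of the best plan found so far, improving with think time.
quality = [0.0, 0.45, 0.65, 0.75, 0.80, 0.82]

# A time-pressured reasoner acts when time-discounted utility peaks,
# not when raw solution quality peaks.
best_t = max(range(len(quality)), key=lambda t: net_utility(quality[t], t))
print(best_t, net_utility(quality[best_t], best_t))  # 2 0.45
```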

106 citations


Book ChapterDOI
13 Jul 1991
TL;DR: This paper presents a general approach based on linear constraint systems that naturally generates alternative explanations in an orderly and highly efficient manner that is applied to cost-based abduction problems as well as belief revision in Bayesian networks.
Abstract: In general, the best explanation for a given observation makes no promises on how good it is with respect to other alternative explanations. A major deficiency of message-passing schemes for belief revision in Bayesian networks is their inability to generate alternatives beyond the second best. In this paper, we present a general approach based on linear constraint systems that naturally generates alternative explanations in an orderly and highly efficient manner. This approach is then applied to cost-based abduction problems as well as belief revision in Bayesian networks.

101 citations


Book ChapterDOI
13 Jul 1991
TL;DR: In this paper, an approach to reasoning with default rules where the proportion of exceptions, or more generally the probability of encountering an exception, can be at least roughly assessed is presented.
Abstract: An approach to reasoning with default rules where the proportion of exceptions, or more generally the probability of encountering an exception, can be at least roughly assessed is presented. It is based on local uncertainty propagation rules which provide the best bracketing of a conditional probability of interest from the knowledge of the bracketing of some other conditional probabilities. A procedure that uses two such propagation rules repeatedly is proposed in order to estimate any simple conditional probability of interest from the available knowledge. The iterative procedure, which does not require independence assumptions, looks promising in comparison with the linear programming method. Improved bounds for conditional probabilities are given when independence assumptions hold.
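One local propagation rule of this kind can be written down directly. Given a bracket on P(B|A) and a bracket on P(C|A∧B), chaining yields a valid bracket on P(C|A) with no independence assumption, by letting the unknown P(C|A∧¬B) range over [0, 1]. The code below is a generic interval-chaining illustration, not the paper's specific rule set.

```python
def chain_bounds(b_given_a, c_given_ab):
    """Bracket P(C|A) from brackets on P(B|A) and P(C|A,B), using
    P(C|A) = P(B|A) P(C|A,B) + P(not-B|A) P(C|A,not-B)
    with P(C|A,not-B) unknown in [0, 1]."""
    (lb, ub), (lc, uc) = b_given_a, c_given_ab
    lower = lb * lc                  # worst case: P(C|A,not-B) = 0
    upper = 1.0 - lb * (1.0 - uc)    # best case:  P(C|A,not-B) = 1
    return lower, upper

# P(B|A) in [0.95, 1.0], P(C|A,B) in [0.9, 0.98]:
print(chain_bounds((0.95, 1.0), (0.9, 0.98)))  # approx (0.855, 0.981)
```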

85 citations


Book ChapterDOI
13 Jul 1991
TL;DR: Qualitative probabilistic analysis is employed to obtain bounds on the relative probability of partial hypotheses for the BN2O class of networks and a generalization replacing the noisy-OR assumption by negative synergy.
Abstract: Since exact probabilistic inference is intractable in general for large multiply connected belief nets, approximate methods are required. A promising approach is to use heuristic search among hypotheses (instantiations of the network) to find the most probable ones, as in the TopN algorithm. Search is based on the relative probabilities of hypotheses, which are efficient to compute. Given upper and lower bounds on the relative probability of partial hypotheses, it is possible to obtain bounds on the absolute probabilities of hypotheses. Best-first search aimed at reducing the maximum error progressively narrows the bounds as more hypotheses are examined. Here, qualitative probabilistic analysis is employed to obtain bounds on the relative probability of partial hypotheses for the BN2O class of networks and a generalization replacing the noisy-OR assumption by negative synergy. The approach is illustrated by application to a very large belief network, QMR-BN, which is a reformulation of the Internist-1 system for diagnosis in internal medicine.
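The noisy-OR assumption that defines the BN2O class reduces each finding's conditional probability to a product: the finding is absent only if every present cause independently fails to produce it. A minimal sketch with made-up link strengths:

```python
def noisy_or(present_causes, link_prob, leak=0.0):
    """P(finding | present causes) under the noisy-OR model: the
    finding is absent only if the leak and every present cause
    independently fail to produce it."""
    p_absent = 1.0 - leak
    for cause in present_causes:
        p_absent *= 1.0 - link_prob[cause]
    return 1.0 - p_absent

links = {"flu": 0.8, "cold": 0.4}          # illustrative strengths
print(noisy_or({"flu"}, links))            # 0.8
print(noisy_or({"flu", "cold"}, links))    # 1 - 0.2*0.6 = 0.88
```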

70 citations


Book ChapterDOI
13 Jul 1991
TL;DR: In this paper, a branching time possible-worlds model for representing and reasoning about beliefs, goals, intentions, time, actions, probabilities, and payoffs is presented.
Abstract: Deliberation plays an important role in the design of rational agents embedded in the real world. In particular, deliberation leads to the formation of intentions, i.e., plans of action that the agent is committed to achieving. In this paper, we present a branching time possible-worlds model for representing and reasoning about beliefs, goals, intentions, time, actions, probabilities, and payoffs. We compare this possible-worlds approach with the more traditional decision tree representation and provide a transformation from decision trees to possible worlds. Finally, we illustrate how an agent can perform deliberation using a decision-tree representation and then use a possible-worlds model to form and reason about its intentions.

44 citations


Book ChapterDOI
13 Jul 1991
TL;DR: In this paper, a simple framework for Horn clause abduction, with probabilities associated with hypotheses, is presented, which can represent any probabilistic knowledge representable in a Bayesian belief network and can be used as a basis for a new way to implement Bayesian networks that allows for approximations to the value of the posterior probabilities.
Abstract: This paper presents a simple framework for Horn clause abduction, with probabilities associated with hypotheses. It is shown how this representation can represent any probabilistic knowledge representable in a Bayesian belief network. The main contributions are in finding a relationship between logical and probabilistic notions of evidential reasoning. This can be used as a basis for a new way to implement Bayesian networks that allows for approximations to the value of the posterior probabilities, and also points to a way that Bayesian networks can be extended beyond a propositional language.
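The flavor of the framework can be conveyed by a tiny interpreter: an explanation is a set of assumable hypotheses that, together with the Horn rules, entails the observation, and (under an independence assumption) its probability is the product of its hypotheses' priors. This is a hypothetical illustration, not the paper's system; it omits, for example, the handling of mutually exclusive hypotheses.

```python
# Horn rules (head <- alternative bodies) plus assumable hypotheses.
rules = {"wet_grass": [["rain"], ["sprinkler"]]}
prior = {"rain": 0.3, "sprinkler": 0.2}

def explanations(goal):
    """Enumerate hypothesis sets that entail the goal."""
    if goal in prior:
        return [frozenset([goal])]
    expls = []
    for body in rules.get(goal, []):
        partial = [frozenset()]
        for atom in body:
            partial = [e | e2 for e in partial for e2 in explanations(atom)]
        expls.extend(partial)
    return expls

def prob(expl):
    p = 1.0
    for h in expl:
        p *= prior[h]
    return p

for e in explanations("wet_grass"):
    print(sorted(e), prob(e))  # ['rain'] 0.3  /  ['sprinkler'] 0.2
```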

37 citations


Book ChapterDOI
13 Jul 1991
TL;DR: Conceptual relations that synthesize utilitarian and logical concepts, extending the logics of preference of Rescher, are introduced; they permit the combination and aggregation of goal-specific quality measures into global measures of utility.
Abstract: This paper introduces conceptual relations that synthesize utilitarian and logical concepts, extending the logics of preference of Rescher. We define first, in the context of a possible-worlds model, constraint-dependent measures that quantify the relative quality of alternative solutions of reasoning problems or the relative desirability of various policies in control, decision, and planning problems. We show that these measures may be interpreted as truth values in a multi-valued logic and propose mechanisms for the representation of complex constraints as combinations of simpler restrictions. These extended logical operations also permit the combination and aggregation of goal-specific quality measures into global measures of utility. We also identify relations that represent differential preferences between alternative solutions and relate them to the previously defined desirability measures. Extending conventional modal logic formulations, we introduce structures for the representation of ignorance about the utility of alternative solutions. Finally, we examine relations between these concepts and similarity-based semantic models of fuzzy logic.

35 citations


Book ChapterDOI
13 Jul 1991
TL;DR: In this article, a Monte-Carlo algorithm for the calculation of Dempster-Shafer belief functions is described, which can be used to improve the complexity of the Shenoy Shafer algorithm on Markov trees.
Abstract: A very computationally-efficient Monte-Carlo algorithm for the calculation of Dempster-Shafer belief is described. If Bel is the combination using Dempster's Rule of belief functions Bel1,..., Belm, then, for subset b of the frame Θ, Bel(b) can he calculated in time linear in |Θ| and m (given that the weight of conflict is bounded). The algorithm can also be used to improve the complexity of the Shenoy-Shafer algorithms on Markov trees, and be generalised to calculate Dempster-Shafer Belief over other logics.
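The sampling scheme behind such an algorithm is straightforward to sketch: each trial draws one focal element from each source's mass function, rejects the trial if the intersection is empty (that rejection rate is where a large weight of conflict hurts), and otherwise votes on whether the intersection lies inside b. A minimal sketch under these assumptions, not necessarily the paper's exact implementation:

```python
import random

def mc_bel(mass_fns, b, frame, trials=20000, rng=random.Random(0)):
    """Monte-Carlo estimate of Bel(b) for the Dempster combination
    of several mass functions {frozenset: mass}."""
    hits = kept = 0
    for _ in range(trials):
        inter = frozenset(frame)
        for m in mass_fns:  # one focal element per source, by mass
            inter &= rng.choices(list(m), weights=list(m.values()))[0]
        if inter:           # empty intersection = conflict: reject
            kept += 1
            hits += inter <= b
    return hits / kept

frame = {"measles", "flu", "cold"}
m1 = {frozenset({"measles", "flu"}): 0.7, frozenset(frame): 0.3}
m2 = {frozenset({"flu", "cold"}): 0.6, frozenset(frame): 0.4}
print(mc_bel([m1, m2], frozenset({"flu"}), frame))  # exact value: 0.42
```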

35 citations



Book ChapterDOI
13 Jul 1991
TL;DR: This paper proposes a new method for solving Bayesian decision problems: represent the problem as a valuation-based system and apply a fusion algorithm, a hybrid of local computational methods for computing marginals of joint probability distributions and local computational methods for discrete optimization problems.
Abstract: This paper proposes a new method for solving Bayesian decision problems. The method consists of representing a Bayesian decision problem as a valuation-based system and applying a fusion algorithm for solving it. The fusion algorithm is a hybrid of local computational methods for computing marginals of joint probability distributions and local computational methods for discrete optimization problems.
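The fusion step can be seen in miniature on a one-decision, one-chance-variable problem: probability valuations and utility valuations are combined pointwise, chance variables are summed out, and decision variables are maxed out. The umbrella example below is an illustrative assumption, not from the paper.

```python
# Chance variable W (weather) and decision variable D (umbrella).
p_w = {"rain": 0.3, "sun": 0.7}                       # probability valuation
u = {("rain", "take"): 0.0, ("rain", "leave"): -10.0,
     ("sun",  "take"): -1.0, ("sun",  "leave"): 5.0}  # utility valuation

# Fusion: combine the valuations and sum out the chance variable W...
expected = {d: sum(p_w[w] * u[(w, d)] for w in p_w)
            for d in ("take", "leave")}
# ...then max out the decision variable D.
best = max(expected, key=expected.get)
print(best, expected)  # leave {'take': -0.7, 'leave': 0.5}
```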

Book ChapterDOI
13 Jul 1991
TL;DR: An efficient implementation of belief function propagation based on the local computation technique is described, which avoids the redundant computations in the propagation process and so reduces the computational complexity relative to other existing implementations.
Abstract: The local computation technique (Shafer et al. 1987, Shafer and Shenoy 1988, Shenoy and Shafer 1986) is used for propagating belief functions in a so-called Markov tree. In this paper, we describe an efficient implementation of belief function propagation on the basis of the local computation technique. The presented method avoids all the redundant computations in the propagation process, and so reduces the computational complexity relative to other existing implementations (Hsia and Shenoy 1989, Zarley et al. 1988). We also give a combined algorithm for both propagation and re-propagation which makes the re-propagation process more efficient when one or more of the prior belief functions are changed.

Proceedings Article
01 Jan 1991
TL;DR: A representation capable of making this intradistribution structure explicit, and an extension to the SPI algorithm capable of utilizing this structural information to improve the efficiency of inference are presented.
Abstract: A Bayesian belief net is a factored representation for a joint probability distribution over a set of variables. This factoring is made possible by the conditional independence relationships among variables made evident in the sparseness of the graphical level of the net. There is, however, another source of factoring available which cannot be directly represented in this graphical structure. This source is the microstructure within an individual marginal or conditional distribution. We present a representation capable of making this intradistribution structure explicit, and an extension to the SPI algorithm capable of utilizing this structural information to improve the efficiency of inference. We discuss the expressivity of the local expression language, and present early experimental results showing the efficacy of the approach.

Book ChapterDOI
13 Jul 1991
TL;DR: The Dempster-Shafer theory of belief functions has been studied in this article, where the authors define the most cautious conjunction of beliefs and prove the existence of such a conjunction in the subcategory of separable belief functions.
Abstract: The categorial approach to evidential reasoning can be seen as a combination of the probability kinematics approach of Richard Jeffrey (1965) and the maximum (cross-) entropy inference approach of E. T. Jaynes (1957). As a consequence of that viewpoint, it is well known that category theory provides natural definitions for logical connectives. In particular, disjunction and conjunction are modelled by general categorial constructions known as products and coproducts. In this paper, I focus mainly on the Dempster-Shafer theory of belief functions, for which I introduce a category I call Dempster's category. I prove the existence of and give explicit formulas for conjunction and disjunction in the subcategory of separable belief functions. In Dempster's category, the newly defined conjunction can be seen as the most cautious conjunction of beliefs, and thus no assumption about distinctness (of the sources) of beliefs is needed, as opposed to Dempster's rule of combination, which calls for distinctness (of the sources) of beliefs.

Book ChapterDOI
13 Jul 1991
TL;DR: Several theorems are presented that state conditions under which it is possible to reliably infer the causal relation between two measured variables, regardless of whether latent variables are acting or not.
Abstract: The presence of latent variables can greatly complicate inferences about causal relations between measured variables from statistical data. In many cases, the presence of latent variables makes it impossible to determine for two measured variables A and B, whether A causes B, B causes A, or there is some common cause. In this paper I present several theorems that state conditions under which it is possible to reliably infer the causal relation between two measured variables, regardless of whether latent variables are acting or not.

Book ChapterDOI
John Fox1, Paul Krause1
13 Jul 1991
TL;DR: This paper argues that the concentration on theoretical techniques for the evaluation and selection of decision options has distracted attention from many of the wider issues in decision making, and describes work which is under way towards providing a theoretical framework for symbolic decision procedures.
Abstract: The ability to reason under uncertainty and with incomplete information is a fundamental requirement of decision support technology. In this paper we argue that the concentration on theoretical techniques for the evaluation and selection of decision options has distracted attention from many of the wider issues in decision making. Although numerical methods of reasoning under uncertainty have strong theoretical foundations, they are representationally weak and only deal with a small part of the decision process. Knowledge-based systems, on the other hand, offer greater flexibility but have not been accompanied by a clear decision theory. We describe here work which is under way towards providing a theoretical framework for symbolic decision procedures. A central proposal is an extended form of inference which we call argumentation: reasoning for and against decision options from generalised domain theories. The approach has been successfully used in several decision support applications, but it is argued that a comprehensive decision theory must cover autonomous decision making, where the agent can formulate questions as well as take decisions. A major theoretical challenge for this theory is to capture the idea of reflection to permit decision agents to reason about their goals, what they believe and why, and what they need to know or do in order to achieve their goals.

Book ChapterDOI
13 Jul 1991
TL;DR: In this article, the graphoid axioms for conditional independence have been used for probabilistic reasoning without resorting to their numerical definition, and a representation for independence statements using multiple undirected graphs and some simple graphical transformations has been proposed.
Abstract: The graphoid axioms for conditional independence, originally described by Dawid [1979], are fundamental to probabilistic reasoning [Pearl, 1988]. Such axioms provide a mechanism for manipulating conditional independence assertions without resorting to their numerical definition. This paper explores a representation for independence statements using multiple undirected graphs and some simple graphical transformations. The independence statements derivable in this system are equivalent to those obtainable by the graphoid axioms; the system therefore provides a purely graphical proof technique for conditional independence.
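For reference, the axioms in question are the following, writing I(X, Z, Y) for "X is independent of Y given Z" (intersection requires a strictly positive distribution):

```latex
\begin{align*}
\text{Symmetry:}      \quad & I(X,Z,Y) \Rightarrow I(Y,Z,X) \\
\text{Decomposition:} \quad & I(X,Z,Y \cup W) \Rightarrow I(X,Z,Y) \\
\text{Weak union:}    \quad & I(X,Z,Y \cup W) \Rightarrow I(X,Z \cup W,Y) \\
\text{Contraction:}   \quad & I(X,Z,Y) \wedge I(X,Z \cup Y,W) \Rightarrow I(X,Z,Y \cup W) \\
\text{Intersection:}  \quad & I(X,Z \cup W,Y) \wedge I(X,Z \cup Y,W) \Rightarrow I(X,Z,Y \cup W)
\end{align*}
```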

Book ChapterDOI
13 Jul 1991
TL;DR: A probabilistic analysis is used to justify a method for quickly identifying and rejecting useless paths and key conditions and assumptions necessary for marker-passing to perform well are identified.
Abstract: Useless paths are a chronic problem for marker-passing techniques. We use a probabilistic analysis to justify a method for quickly identifying and rejecting useless paths. Using the same analysis, we identify key conditions and assumptions necessary for marker-passing to perform well.

Book ChapterDOI
13 Jul 1991
TL;DR: A follow-up to a sensitivity-analysis study of Pathfinder suggests a viable extension of a common decision analytic sensitivity analysis to the larger, more complex settings generally encountered in artificial intelligence.
Abstract: At last year's Uncertainty in AI Conference, we reported the results of a sensitivity analysis study of Pathfinder. Our findings were quite unexpected--slight variations to Pathfinder's parameters appeared to lead to substantial degradations in system performance. A careful look at our first analysis, together with the valuable feedback provided by the participants of last year's conference, led us to conduct a follow-up study. Our follow-up differs from our initial study in two ways: (i) the probabilities 0.0 and 1.0 remained unchanged, and (ii) the variations to the probabilities that are close to both ends (0.0 or 1.0) were less than the ones close to the middle (0.5). The results of the follow-up study look more reasonable--slight variations to Pathfinder's parameters now have little effect on its performance. Taken together, these two sets of results suggest a viable extension of a common decision analytic sensitivity analysis to the larger, more complex settings generally encountered in artificial intelligence.
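The follow-up study's perturbation scheme is easy to state in code: 0.0 and 1.0 are fixed points, and noise shrinks as a parameter approaches either extreme. Scaling the noise by min(p, 1-p), as below, is one plausible reading of the description, not necessarily the authors' exact scheme.

```python
import random

def perturb(p, scale=0.1, rng=random.Random(0)):
    """Vary a probability for sensitivity analysis: 0.0 and 1.0 stay
    fixed, and values near the extremes move less than values near 0.5."""
    if p in (0.0, 1.0):
        return p
    noise = rng.gauss(0.0, scale) * min(p, 1.0 - p)
    return min(max(p + noise, 0.0), 1.0)

for p in (0.0, 0.01, 0.5, 0.99, 1.0):
    print(p, round(perturb(p), 4))
```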

Book ChapterDOI
13 Jul 1991
TL;DR: In this paper, a new probabilistic network construction system, DYNASTY, is proposed for diagnostic reasoning given variables whose probabilities change over time, and is formulated as a sequential stochastic process.
Abstract: A new probabilistic network construction system, DYNASTY, is proposed for diagnostic reasoning given variables whose probabilities change over time. Diagnostic reasoning is formulated as a sequential stochastic process, and is modeled using influence diagrams. Given a set O of observations, DYNASTY creates an influence diagram in order to devise the best action given O. Sensitivity analyses are conducted to determine if the best network has been created, given the uncertainty in network parameters and topology. DYNASTY uses an equivalence-class approach to provide decision thresholds for the sensitivity analysis. This equivalence-class approach to diagnostic reasoning differentiates diagnoses only if the required actions are different. A set of network-topology updating algorithms is proposed for dynamically updating the network when necessary.


Book ChapterDOI
13 Jul 1991
TL;DR: The general approach is to eliminate aspects of heuristic rules which directly or indirectly include assumptions regarding the user's attitude towards risk, and replace them with explicit, user-specified probabilistic multiattribute utility and probability distribution functions.
Abstract: In mechanical design, there is often unavoidable uncertainty in estimates of design performance. Evaluation of design alternatives requires consideration of the impact of this uncertainty. Expert heuristics embody assumptions regarding the designer's attitude towards risk and uncertainty that might be reasonable in most cases but inaccurate in others. We present a technique to allow designers to incorporate their own unique attitude towards uncertainty as opposed to those assumed by the domain expert's rules. The general approach is to eliminate aspects of heuristic rules which directly or indirectly include assumptions regarding the user's attitude towards risk, and replace them with explicit, user-specified probabilistic multiattribute utility and probability distribution functions. We illustrate the method in a system for material selection for automobile bumpers.
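The substitution the paper describes, removing a rule's built-in risk attitude and supplying the user's own utility function over uncertain performance, can be sketched with an exponential utility family. The material names and numbers are illustrative assumptions, not the paper's data.

```python
import math

def exp_utility(x, risk_tolerance):
    """Exponential utility: small risk_tolerance = strongly risk-averse;
    large risk_tolerance approaches risk-neutral."""
    return 1.0 - math.exp(-x / risk_tolerance)

def expected_utility(outcomes, risk_tolerance):
    # outcomes: list of (probability, performance score) pairs
    return sum(p * exp_utility(x, risk_tolerance) for p, x in outcomes)

designs = {
    "steel":   [(1.0, 6.0)],                # predictable performance
    "polymer": [(0.5, 10.0), (0.5, 3.0)],   # uncertain performance
}

for rt in (2.0, 50.0):                      # two user risk attitudes
    choice = max(designs, key=lambda d: expected_utility(designs[d], rt))
    print(rt, choice)   # 2.0 -> steel (risk-averse), 50.0 -> polymer
```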

Book ChapterDOI
13 Jul 1991
TL;DR: This paper outlines a methodology for analyzing the representational support for knowledge-based decision-modeling in a broad domain, and some insights are gained into a design approach for integrating categorical and uncertain knowledge in a context-sensitive manner.
Abstract: This paper outlines a methodology for analyzing the representational support for knowledge-based decision-modeling in a broad domain. A relevant set of inference patterns and knowledge types are identified. By comparing the analysis results to existing representations, some insights are gained into a design approach for integrating categorical and uncertain knowledge in a context-sensitive manner.

Book ChapterDOI
13 Jul 1991
TL;DR: In this article, the authors study the uses and the semantics of non-monotonic negation in probabilistic deductive data bases and introduce the notion of stable formula, functions.
Abstract: In this paper we study the uses and the semantics of non-monotonic negation in probabilistic deductive data bases. Based on the stable semantics for classical logic programming, we introduce the notion of stable formula, functions. We show that stable formula, functions are minimal fixpoints of operators associated with probabilistic deductive databases with negation. Furthermore, since a. probabilistic deductive database may not necessarily have a stable formula function, we provide a stable class semantics for such databases. Finally, we demonstrate that the proposed semantics can handle default reasoning naturally in the context of probabilistic deduction.

Book ChapterDOI
13 Jul 1991
TL;DR: A general architecture for the monitoring and diagnosis of large scale sensor-based systems with real time diagnostic constraints is presented, combining a single monitoring level based on statistical methods with two model based diagnostic levels.
Abstract: We present a general architecture for the monitoring and diagnosis of large scale sensor-based systems with real time diagnostic constraints. This architecture is multileveled, combining a single monitoring level based on statistical methods with two model based diagnostic levels. At each level, sources of uncertainty are identified, and integrated methodologies for uncertainty management are developed. The general architecture was applied to the monitoring and diagnosis of a specific nuclear physics detector at Lawrence Berkeley National Laboratory that contained approximately 5000 components and produced over 500 channels of output data. The general architecture is scalable, and work is ongoing to apply it to detector systems one and two orders of magnitude more complex.

Book ChapterDOI
13 Jul 1991
TL;DR: In this article, a method for determining the variances in inferred probabilities is obtained under the assumption that a posterior distribution on the uncertainty variables can be approximated by the prior distribution.
Abstract: The belief network is a well-known graphical structure for representing independences in a joint probability distribution. Methods that perform probabilistic inference in belief networks often treat the conditional probabilities stored in the network as certain values. However, if one takes either a subjectivistic or a limiting frequency approach to probability, one can never be certain of probability values. An algorithm should not only be capable of reporting the probabilities of the alternatives of remaining nodes when other nodes are instantiated; it should also be capable of reporting the uncertainty in these probabilities relative to the uncertainty in the probabilities which are stored in the network. In this paper a method for determining the variances in inferred probabilities is obtained under the assumption that a posterior distribution on the uncertainty variables can be approximated by the prior distribution. It is shown that this assumption is plausible if there is a reasonable amount of confidence in the probabilities which are stored in the network. Furthermore, a surprising upper bound for the prior variances in the probabilities of the alternatives of all nodes is obtained in the case where the probability distributions of the probabilities of the alternatives are beta distributions. It is shown that the prior variance in the probability at an alternative of a node is bounded above by the largest variance in an element of the conditional probability distribution for that node.
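The beta-distribution result rests on a standard identity. If a probability parameter is distributed as Beta(a, b), then

```latex
\mathbb{E}[p] = \frac{a}{a+b}, \qquad
\operatorname{Var}(p) = \frac{ab}{(a+b)^{2}(a+b+1)},
```

and the upper bound reported here says that the prior variance in the probability of an alternative of a node never exceeds the largest such variance among the elements of that node's stored conditional probability distribution.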

Book ChapterDOI
13 Jul 1991
TL;DR: It is conjectured that whenever an agent entertains the belief that E will occur with c degree of confidence, the agent will be surprised (to the extent c) upon realizing that E did not occur.
Abstract: We motivate and describe a theory of belief in this paper. This theory is developed with the following view of human belief in mind. Consider the belief that an event E will occur (or has occurred or is occurring). An agent either entertains this belief or does not entertain this belief (i.e., there is no "grade" in entertaining the belief). If the agent chooses to exercise "the will to believe" and entertain this belief, he/she/it is entitled to a degree of confidence c (1 ≥ c > 0) in doing so. Adopting this view of human belief, we conjecture that whenever an agent entertains the belief that E will occur with c degree of confidence, the agent will be surprised (to the extent c) upon realizing that E did not occur.

Book ChapterDOI
13 Jul 1991
TL;DR: The concept of movable evidence masses that flow from supersets to subsets as specified by experts represents a suitable framework for reasoning under uncertainty as mentioned in this paper, where the mass flow is controlled by specialization matrices.
Abstract: The concept of movable evidence masses that flow from supersets to subsets as specified by experts represents a suitable framework for reasoning under uncertainty. The mass flow is controlled by specialization matrices. New evidence is integrated into the frame of discernment by conditioning or revision (Dempster's rule of conditioning), for which special specialization matrices exist. Even some aspects of non-monotonic reasoning can be represented by certain specialization matrices.
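In matrix form (notation assumed here for illustration), a mass function on the frame Θ is a vector m indexed by subsets, and a specialization matrix S moves mass only downward in the subset lattice:

```latex
m' = S\,m, \qquad S_{a,b} \geq 0, \qquad
S_{a,b} = 0 \ \text{unless}\ a \subseteq b, \qquad
\sum_{a \subseteq b} S_{a,b} = 1 \ \text{for each}\ b \subseteq \Theta,
```

so each column distributes the mass of b among its subsets. Dempster's rule of conditioning on evidence E corresponds to the particular specialization matrix that sends the mass of each b to b ∩ E, followed by renormalization of the mass assigned to the empty set.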

Book ChapterDOI
13 Jul 1991
TL;DR: A stochastic version of the EM-algorithm is adapted to probabilistic neural networks describing the associative dependency of variables. These networks have a probability distribution that is a special case of the distribution generated by probabilistic inference networks, so the two network types can be combined, allowing probabilistic rules as well as unspecified associations to be integrated in a sound way.
Abstract: The EM-algorithm is a general procedure for obtaining maximum likelihood estimates when part of the observations on the variables of a network are missing. In this paper a stochastic version of the algorithm is adapted to probabilistic neural networks describing the associative dependency of variables. These networks have a probability distribution which is a special case of the distribution generated by probabilistic inference networks. Hence both types of networks can be combined, allowing probabilistic rules as well as unspecified associations to be integrated in a sound way. The resulting network may have a number of interesting features including cycles of probabilistic rules, hidden 'unobservable' variables, and uncertain and contradictory evidence.
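As background for the adaptation, the E/M alternation itself fits in a few lines. The sketch below runs plain EM on a two-coin mixture where the missing data are the coin labels; it illustrates the generic algorithm the paper builds on, not its stochastic, network-based variant.

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def em_coin_mixture(counts, n, steps=200, pi=0.5, pA=0.3, pB=0.7):
    """EM for a two-coin mixture: each observation is the number of
    heads in n flips of one coin; which coin was used is missing."""
    for _ in range(steps):
        # E-step: posterior responsibility of coin B for each count.
        r = [pi * binom_pmf(k, n, pB)
             / (pi * binom_pmf(k, n, pB) + (1 - pi) * binom_pmf(k, n, pA))
             for k in counts]
        # M-step: re-estimate parameters from expected counts.
        pi = sum(r) / len(counts)
        pB = sum(ri * k for ri, k in zip(r, counts)) / (n * sum(r))
        pA = (sum((1 - ri) * k for ri, k in zip(r, counts))
              / (n * (len(counts) - sum(r))))
    return pi, pA, pB

heads = [9, 8, 2, 1, 9, 2, 8, 1]       # heads out of 10 flips per run
print(em_coin_mixture(heads, n=10))    # ≈ (0.5, 0.15, 0.85)
```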