Author

Glenn Shafer

Bio: Glenn Shafer is an academic researcher at Rutgers University. He has contributed to research on probability interpretations and Dempster–Shafer theory, has an h-index of 41, and has co-authored 148 publications receiving 14,006 citations. His previous affiliations include Princeton University and Royal Holloway, University of London.


Papers
Journal ArticleDOI
TL;DR: Three heuristics that are employed in making judgements under uncertainty are described: representativeness, which is usually employed when people are asked to judge the probability that an object or event belongs to a class or process; availability of instances or scenarios, which is often employed when people are asked to assess the frequency of a class or the plausibility of a particular development; and adjustment from an anchor, which is usually employed in numerical prediction when a relevant value is available.

5,935 citations

Journal Article
TL;DR: This tutorial presents a self-contained account of the theory of conformal prediction, which can be applied to any method for producing ŷ under the model that successive examples are sampled independently from the same distribution, and works through several numerical examples.
Abstract: Conformal prediction uses past experience to determine precise levels of confidence in new predictions. Given an error probability e, together with a method that makes a prediction ŷ of a label y, it produces a set of labels, typically containing ŷ, that also contains y with probability 1 – e. Conformal prediction can be applied to any method for producing ŷ: a nearest-neighbor method, a support-vector machine, ridge regression, etc. Conformal prediction is designed for an on-line setting in which labels are predicted successively, each one being revealed before the next is predicted. The most novel and valuable feature of conformal prediction is that if the successive examples are sampled independently from the same distribution, then the successive predictions will be right 1 – e of the time, even though they are based on an accumulating data set rather than on independent data sets. In addition to the model under which successive examples are sampled independently, other on-line compression models can also use conformal prediction. The widely used Gaussian linear model is one of these. This tutorial presents a self-contained account of the theory of conformal prediction and works through several numerical examples. A more comprehensive treatment of the topic is provided in Algorithmic Learning in a Random World, by Vladimir Vovk, Alex Gammerman, and Glenn Shafer (Springer, 2005).
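The construction is simple enough to sketch. The Python below implements a split-conformal variant rather than the tutorial's full on-line protocol, using absolute residuals of a ridge regression as the nonconformity measure; the function, the choice of predictor, and the data are illustrative assumptions, not material from the tutorial.

import numpy as np

def split_conformal_interval(X_train, y_train, X_cal, y_cal, x_new, e=0.1):
    # Fit any point predictor on the training split; here, ridge regression
    # solved via the normal equations.
    lam = 1.0
    d = X_train.shape[1]
    w = np.linalg.solve(X_train.T @ X_train + lam * np.eye(d),
                        X_train.T @ y_train)

    # Nonconformity score: absolute residual on a held-out calibration split.
    scores = np.abs(y_cal - X_cal @ w)

    # Finite-sample (1 - e) quantile of the calibration scores.
    n = len(scores)
    k = min(int(np.ceil((n + 1) * (1 - e))), n)
    q = np.sort(scores)[k - 1]

    # Interval that contains the true label with probability at least 1 - e
    # whenever the examples are drawn i.i.d. from one distribution.
    y_hat = x_new @ w
    return y_hat - q, y_hat + q

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=200)
print(split_conformal_interval(X[:100], y[:100], X[100:], y[100:],
                               rng.normal(size=3)))

Because the guarantee rests only on the examples being drawn from one distribution, the same wrapper works unchanged around a nearest-neighbor method or a support-vector machine.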

648 citations

Book
01 Jan 2007
TL;DR: Algorithmic Learning in a Random World describes recent theoretical and experimental developments in building computable approximations to Kolmogorov's algorithmic notion of randomness, and shows how several important machine learning problems cannot be solved if the only assumption is randomness.
Abstract: Algorithmic Learning in a Random World describes recent theoretical and experimental developments in building computable approximations to Kolmogorov's algorithmic notion of randomness. Based on these approximations, a new set of machine learning algorithms has been developed that can be used to make predictions and to estimate their confidence and credibility in high-dimensional spaces under the usual assumption that the data are independent and identically distributed (the assumption of randomness). Another aim of this unique monograph is to outline some limits of prediction: the approach based on the algorithmic theory of randomness allows for proofs that prediction is impossible in certain situations. The book describes how several important machine learning problems, such as density estimation in high-dimensional spaces, cannot be solved if the only assumption is randomness.
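To make confidence and credibility concrete, the toy Python sketch below computes conformal p-values with a nearest-neighbor nonconformity score under the randomness assumption; the score, the data, and the names are illustrative assumptions rather than the book's exact algorithms.

import numpy as np

def conformal_p_value(X, y, x_new, label):
    # p-value measuring how well (x_new, label) conforms with the bag (X, y).
    Xs = np.vstack([X, x_new])
    ys = np.append(y, label)

    def score(i):
        d = np.linalg.norm(Xs - Xs[i], axis=1)
        d[i] = np.inf                    # ignore the point itself
        same = d[ys == ys[i]].min()      # nearest same-label neighbor
        other = d[ys != ys[i]].min()     # nearest other-label neighbor
        return same / other              # a large ratio marks a strange example

    a = np.array([score(i) for i in range(len(ys))])
    return np.mean(a >= a[-1])           # rank of the new example's strangeness

X = np.array([[0.0], [0.1], [1.0], [1.1]])
y = np.array([0, 0, 1, 1])
p = {lab: conformal_p_value(X, y, np.array([0.05]), lab) for lab in (0, 1)}
print(p)  # label 0 conforms well, label 1 does not

Credibility is the largest of these p-values and confidence is one minus the second largest, which is how such algorithms hedge their predictions in high-dimensional spaces.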

636 citations

01 Jan 2008
TL;DR: This paper describes an abstract framework and axioms under which exact local computation of marginals is possible and shows how the problem of computing marginals of joint probability distributions and joint belief functions fits the general framework.
Abstract: In this paper, we describe an abstract framework and axioms under which exact local computation of marginals is possible. The primitive objects of the framework are variables and valuations. The primitive operators of the framework are combination and marginalization. These operate on valuations. We state three axioms for these operators and we derive the possibility of local computation from the axioms. Next, we describe a propagation scheme for computing marginals of a valuation when we have a factorization of the valuation on a hypertree. Finally we show how the problem of computing marginals of joint probability distributions and joint belief functions fits the general framework.
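The probability instance of the framework makes the two primitive operators concrete. In the Python sketch below (an illustration under the paper's definitions, not code from it), a valuation on binary variables is a table from configurations to numbers; combination multiplies tables pointwise on the union of their domains, and marginalization sums variables out.

from itertools import product

def combine(vars1, t1, vars2, t2):
    # Combination: pointwise product on the union of the two domains.
    allvars = list(dict.fromkeys(vars1 + vars2))
    out = {}
    for cfg in product([0, 1], repeat=len(allvars)):   # binary variables only
        env = dict(zip(allvars, cfg))
        out[cfg] = (t1[tuple(env[v] for v in vars1)]
                    * t2[tuple(env[v] for v in vars2)])
    return allvars, out

def marginalize(vars1, t1, keep):
    # Marginalization: sum out every variable not listed in `keep`.
    keep = [v for v in vars1 if v in keep]
    out = {}
    for cfg, val in t1.items():
        env = dict(zip(vars1, cfg))
        key = tuple(env[v] for v in keep)
        out[key] = out.get(key, 0.0) + val
    return keep, out

# P(X) and P(Y | X) as valuations; combining and then marginalizing
# yields the marginal P(Y).
pX = (["X"], {(0,): 0.7, (1,): 0.3})
pYgX = (["X", "Y"], {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
vars_xy, joint = combine(*pX, *pYgX)
print(marginalize(vars_xy, joint, ["Y"]))   # P(Y): {(0,): 0.69, (1,): 0.31}

Belief functions fit the same two operators, with Dempster's rule playing the role of combination, which is why one propagation scheme covers both cases.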

579 citations

Book ChapterDOI
01 Jun 1990
TL;DR: In this article, an abstract framework and axioms under which exact local computation of marginals is possible are presented; the primitive objects of the framework are variables and valuations, and its primitive operators are combination and marginalization.
Abstract: In this paper, we describe an abstract framework and axioms under which exact local computation of marginals is possible. The primitive objects of the framework are variables and valuations. The primitive operators of the framework are combination and marginalization. These operate on valuations. We state three axioms for these operators and we derive the possibility of local computation from the axioms. Next, we describe a propagation scheme for computing marginals of a valuation when we have a factorization of the valuation on a hypertree. Finally we show how the problem of computing marginals of joint probability distributions and joint belief functions fits the general framework.

521 citations


Cited by
Journal ArticleDOI
TL;DR: In this paper, Shulman surveys the history of teacher evaluations, noting that by the 1980s teacher evaluation seemed to concern itself with content knowledge about as much as the previous century had concerned itself with pedagogy.
Abstract: This article was a presidential address delivered at the 1985 meeting of the American Educational Research Association in Chicago. Curious about why the public often holds a low opinion of teachers' knowledge, Shulman surveys the history of teacher examinations. In the second half of the 1800s, examinations for prospective teachers were based almost entirely on content. By 1985, when the author was writing, assessment was completely different: rather than focusing on content, it focused on topics such as lesson planning, cultural awareness, and other aspects of teaching behavior. While these topics usually had roots in research, they clearly do not represent the broad spectrum of skills and knowledge that a teacher needs to be effective. More pointedly, by the 1980s teacher evaluation seemed to concern itself with content knowledge about as much as the previous century had concerned itself with pedagogy.

15,740 citations

Book
01 Jan 1988
TL;DR: Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty, providing a coherent explication of probability as a language for reasoning with partial belief.
Abstract: From the Publisher: Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty. The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic. The author distinguishes syntactic and semantic approaches to uncertainty, and offers techniques, based on belief networks, that provide a mechanism for making semantics-based systems operational. Specifically, network-propagation techniques serve as a mechanism for combining the theoretical coherence of probability theory with modern demands of reasoning-systems technology: modular declarative inputs, conceptually meaningful inferences, and parallel distributed computation. Application areas include diagnosis, forecasting, image interpretation, multi-sensor fusion, decision support systems, plan recognition, planning, and speech recognition; in short, almost every task requiring that conclusions be drawn from uncertain clues and incomplete information. Probabilistic Reasoning in Intelligent Systems will be of special interest to scholars and researchers in AI, decision theory, statistics, logic, philosophy, cognitive psychology, and the management sciences. Professionals in the areas of knowledge-based systems, operations research, engineering, and statistics will find theoretical and computational tools of immediate practical use. The book can also be used as an excellent text for graduate-level courses in AI, operations research, or applied probability.
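A toy computation conveys the flavor of network propagation. The Python sketch below pushes marginal beliefs forward along a two-edge chain; it is a minimal illustration of belief-network semantics with invented numbers, not Pearl's message-passing algorithm itself.

# Chain-structured belief network: Burglary -> Alarm -> NeighborCalls.
p_burglary = 0.01
p_alarm_given_b = {True: 0.95, False: 0.02}
p_call_given_a = {True: 0.70, False: 0.05}

# Propagate marginals node by node along the chain.
p_alarm = sum((p_burglary if b else 1 - p_burglary) * p_alarm_given_b[b]
              for b in (True, False))
p_call = sum((p_alarm if a else 1 - p_alarm) * p_call_given_a[a]
             for a in (True, False))
print(p_alarm, p_call)   # 0.0293 and roughly 0.069

Each node needs only the belief arriving from its parent, which is the modular, locally computable style of inference the book develops in full generality.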

15,671 citations

Journal ArticleDOI
TL;DR: Cumulative prospect theory applies to uncertain as well as to risky prospects with any number of outcomes and allows different weighting functions for gains and for losses; two principles, diminishing sensitivity and loss aversion, are invoked to explain the characteristic curvature of the value function and the weighting functions.
Abstract: We develop a new version of prospect theory that employs cumulative rather than separable decision weights and extends the theory in several respects. This version, called cumulative prospect theory, applies to uncertain as well as to risky prospects with any number of outcomes, and it allows different weighting functions for gains and for losses. Two principles, diminishing sensitivity and loss aversion, are invoked to explain the characteristic curvature of the value function and the weighting functions. A review of the experimental evidence and the results of a new experiment confirm a distinctive fourfold pattern of risk attitudes: risk aversion for gains and risk seeking for losses of high probability; risk seeking for gains and risk aversion for losses of low probability. Expected utility theory reigned for several decades as the dominant normative and descriptive model of decision making under uncertainty, but it has come under serious question in recent years. There is now general agreement that the theory does not provide an adequate description of individual choice: a substantial body of evidence shows that decision makers systematically violate its basic tenets. Many alternative models have been proposed in response to this empirical challenge (for reviews, see Camerer, 1989; Fishburn, 1988; Machina, 1987). Some time ago we presented a model of choice, called prospect theory, which explained the major violations of expected utility theory in choices between risky prospects with a small number of outcomes (Kahneman and Tversky, 1979; Tversky and Kahneman, 1986). The key elements of this theory are 1) a value function that is concave for gains, convex for losses, and steeper for losses than for gains, and 2) a nonlinear transformation of the probability scale, which overweights small probabilities and underweights moderate and high probabilities.
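The functional forms are compact enough to run. The Python sketch below uses the power value function and inverse-S weighting function with the parameter estimates reported in the paper (alpha = beta = 0.88, lambda = 2.25, gamma = 0.61 for gains and 0.69 for losses); the prospect evaluated is an invented example.

def v(x, alpha=0.88, lam=2.25):
    # Value function: concave for gains, convex and steeper for losses.
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def w(p, gamma):
    # Inverse-S weighting: overweights small probabilities and
    # underweights moderate and high ones.
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def cpt_value(gain, p_gain, loss, p_loss, g_plus=0.61, g_minus=0.69):
    # For a two-outcome mixed prospect, cumulative weighting reduces to
    # weighting the gain and the loss separately.
    return w(p_gain, g_plus) * v(gain) + w(p_loss, g_minus) * v(loss)

print(cpt_value(100, 0.5, -100, 0.5))   # about -34.6: the fair coin flip is rejected

The negative value for a symmetric 50/50 gamble is loss aversion at work; shrinking p_gain toward zero lets the overweighting in w make long shots attractive instead, reproducing the fourfold pattern.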

13,433 citations

Journal ArticleDOI
TL;DR: In this article, the authors focus on the linkages between the industry analysis framework, the resource-based view of the firm, behavioral decision biases and organizational implementation issues, and connect the concept of Strategic Industry Factors at the market level with the notion of Strategic Assets at the firm level.
Abstract: We build on an emerging strategy literature that views the firm as a bundle of resources and capabilities, and examine conditions that contribute to the realization of sustainable economic rents. Because of (1) resource-market imperfections and (2) discretionary managerial decisions about resource development and deployment, we expect firms to differ (in and out of equilibrium) in the resources and capabilities they control. This asymmetry in turn can be a source of sustainable economic rent. The paper focuses on the linkages between the industry analysis framework, the resource-based view of the firm, behavioral decision biases and organizational implementation issues. It connects the concept of Strategic Industry Factors at the market level with the notion of Strategic Assets at the firm level. Organizational rent is shown to stem from imperfect and discretionary decisions to develop and deploy selected resources and capabilities, made by boundedly rational managers facing high uncertainty, complexity, and intrafirm conflict.

8,121 citations

Journal ArticleDOI
TL;DR: This study of market efficiency investigates whether people tend to "overreact" to unexpected and dramatic news events and whether such behavior affects stock prices; the empirical evidence, based on CRSP monthly return data, is consistent with the overreaction hypothesis.
Abstract: Research in experimental psychology suggests that, in violation of Bayes' rule, most people tend to "overreact" to unexpected and dramatic news events. This study of market efficiency investigates whether such behavior affects stock prices. The empirical evidence, based on CRSP monthly return data, is consistent with the overreaction hypothesis. Substantial weak form market inefficiencies are discovered. The results also shed new light on the January returns earned by prior "winners" and "losers." Portfolios of losers experience exceptionally large January returns as late as five years after portfolio formation. As economists interested in both market behavior and the psychology of individual decision making, we have been struck by the similarity of two sets of empirical findings. Both classes of behavior can be characterized as displaying overreaction. This study was undertaken to investigate the possibility that these phenomena are related by more than just appearance. We begin by describing briefly the individual and market behavior that piqued our interest. The term overreaction carries with it an implicit comparison to some degree of reaction that is considered to be appropriate. What is an appropriate reaction? One class of tasks that has a well-established norm is probability revision problems, for which Bayes' rule prescribes the correct reaction to new information. It has now been well established that Bayes' rule is not an apt characterization of how individuals actually respond to new data (Kahneman et al. [14]). In revising their beliefs, individuals tend to overweight recent information and underweight prior (or base rate) data. People seem to make predictions according to a simple matching rule: "The predicted value is selected so that the standing of the case in the distribution of outcomes matches its standing in the distribution of impressions" (Kahneman and Tversky [14, p. 416]). This rule of thumb, an instance of what Kahneman and Tversky call the representativeness heuristic, violates the basic statistical principle that the extremeness of predictions must be moderated by considerations of predictability. Grether [12] has replicated this finding under incentive compatible conditions. There is also considerable evidence that the actual expectations of professional security analysts and economic forecasters display the same overreaction bias (for a review, see De Bondt [7]). One of the earliest observations about overreaction in markets was made by J. M. Keynes: "... day-to-day fluctuations in the profits of existing investments,
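The base-rate logic behind the Bayesian norm is easy to exhibit with invented numbers. In the Python sketch below, Bayes' rule moderates the inference that a firm which just beat its earnings forecast is a "star" by the low base rate of stars, while a representativeness-style matching prediction ignores the base rate, overweighting the recent news exactly as described above; every number here is hypothetical.

# Hypothetical base rate: 10% of firms are "stars". Stars beat forecasts
# 80% of the time, other firms 40% of the time. A firm just beat.
p_star = 0.10
p_beat = {True: 0.80, False: 0.40}

# Bayes' rule: the posterior is pulled down by the 10% base rate.
joint = {s: (p_star if s else 1 - p_star) * p_beat[s] for s in (True, False)}
p_star_given_beat = joint[True] / sum(joint.values())   # 0.08 / 0.44, about 0.18

# Matching / representativeness: judge the firm a star because beating is
# representative of stars, effectively ignoring the base rate.
overreaction_estimate = p_beat[True]                    # 0.80, far above 0.18
print(p_star_given_beat, overreaction_estimate)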

7,032 citations