
Showing papers on "Von Neumann–Morgenstern utility theorem published in 2006"


Journal ArticleDOI
TL;DR: This article developed a model of reference-dependent preferences and loss aversion where the gain-loss utility is derived from standard consumption utility and the reference point is determined endogenously by the economic environment.
Abstract: We develop a model of reference-dependent preferences and loss aversion where “gain‐loss utility” is derived from standard “consumption utility” and the reference point is determined endogenously by the economic environment. We assume that a person’s reference point is her rational expectations held in the recent past about outcomes, which are determined in a personal equilibrium by the requirement that they must be consistent with optimal behavior given expectations. In deterministic environments, choices maximize consumption utility, but gain‐loss utility influences behavior when there is uncertainty. Applying the model to consumer behavior, we show that willingness to pay for a good is increasing in the expected probability of purchase and in the expected prices conditional on purchase. In within-day labor-supply decisions, a worker is less likely to continue work if income earned thus far is unexpectedly high, but more likely to show up as well as continue work if expected income is high.
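The two-part utility the abstract describes can be sketched numerically. The parameter values below (weight η, loss-aversion coefficient λ, linear consumption utility) are illustrative assumptions, not the paper's calibration:

```python
# Sketch of reference-dependent utility in the spirit of the model above:
# total utility = consumption utility m(c) plus gain-loss utility mu(m(c) - m(r)),
# where r is the (expectations-based) reference point. Parameter values are
# illustrative assumptions, not taken from the paper.

ETA = 1.0    # weight on gain-loss utility (assumed)
LAM = 2.25   # loss-aversion coefficient lambda > 1 (assumed)

def m(c):
    """Consumption utility; linear for simplicity (an assumption)."""
    return float(c)

def mu(z):
    """Gain-loss utility: losses loom larger than equal-sized gains."""
    return z if z >= 0 else LAM * z

def utility(c, r):
    return m(c) + ETA * mu(m(c) - m(r))

# With the reference point at r = 1, a one-unit loss hurts more than a
# one-unit gain helps -- the defining feature of loss aversion.
gain = utility(2, 1) - utility(1, 1)   # move from reference to a gain
loss = utility(1, 1) - utility(0, 1)   # move from reference to a loss
print(gain, loss)
```

Because the reference point is pinned down by recent expectations rather than the status quo, the same payoff can register as a gain or a loss depending on what the agent expected, which is what drives the consumer and labor-supply results above.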

2,079 citations


Journal ArticleDOI
TL;DR: In this article, a model of random choice and random expected utility is developed and analyzed, and it is shown that a random choice rule maximizes some random utility function if and only if it is mixture continuous, monotone (the probability that a lottery is chosen does not increase when other lotteries are added to the decision problem), extreme (lotteries that are not extreme points of the problem are chosen with probability 0), and linear (satisfies the independence axiom).
Abstract: We develop and analyze a model of random choice and random expected utility. A decision problem is a finite set of lotteries that describe the feasible choices. A random choice rule associates with each decision problem a probability measure over choices. A random utility function is a probability measure over von Neumann-Morgenstern utility functions. We show that a random choice rule maximizes some random utility function if and only if it is mixture continuous, monotone (the probability that a lottery is chosen does not increase when other lotteries are added to the decision problem), extreme (lotteries that are not extreme points of the decision problem are chosen with probability 0), and linear (satisfies the independence axiom).
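The monotonicity axiom can be illustrated in a small simulation: drawing a vNM utility at random and choosing the expected-utility-maximizing lottery induces a random choice rule, and adding a lottery to the decision problem cannot raise another lottery's choice probability. The uniform draw over utilities and the specific lotteries are illustrative assumptions, not the paper's construction:

```python
# Monte Carlo sketch of a random choice rule induced by a random vNM utility.
# A utility over 3 outcomes is a vector u; a lottery p is a probability vector;
# expected utility is the dot product u . p. Drawing u at random induces
# choice probabilities over any finite decision problem.
import random

random.seed(0)

def choice_prob(target, problem, n=20000):
    """Probability that lottery `target` (an index into `problem`) maximizes u . p."""
    wins = 0
    for _ in range(n):
        u = [random.random() for _ in range(3)]          # random vNM utility
        scores = [sum(ui * pi for ui, pi in zip(u, p)) for p in problem]
        if scores[target] == max(scores):
            wins += 1
    return wins / n

A = (1.0, 0.0, 0.0)
B = (0.0, 1.0, 0.0)
C = (0.3, 0.3, 0.4)

p_small = choice_prob(0, [A, B])        # P(A chosen from {A, B})
p_large = choice_prob(0, [A, B, C])     # P(A chosen from {A, B, C})
print(p_small, p_large)
```

Adding C can only divert probability away from A (monotonicity), and A and B, as degenerate lotteries, are extreme points chosen with positive probability while interior mixtures are not.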

163 citations


Journal ArticleDOI
TL;DR: The authors calibrate the relationship between risk attitudes over small-stakes and large-stakes gambles and find that rejecting small gambles is consistent with expected utility, contrary to a recent literature that concludes that expected utility is fundamentally unfit to explain decisions under uncertainty. Paradoxical behavior is obtained only when calibrations are made in a region of the parameter space that is not empirically relevant.

75 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider several theories for decision under uncertainty: the classical expected utility paradigm, Yaari's dual approach, maximin expected utility theory, Choquet expected utility theory and Quiggin's rank-dependent utility theory.
Abstract: Risk measures have been studied for several decades in the actuarial literature, where they appeared under the guise of premium calculation principles. Risk measures and properties that risk measures should satisfy have recently received considerable attention in the financial mathematics literature. Mathematically, a risk measure is a mapping from a class of random variables to the real line. Economically, a risk measure should capture the preferences of the decision-maker. This paper complements the study initiated in Denuit, Dhaene & Van Wouwe (1999) and considers several theories for decision under uncertainty: the classical expected utility paradigm, Yaari's dual approach, maximin expected utility theory, Choquet expected utility theory and Quiggin's rank-dependent utility theory. Building on the actuarial equivalent utility pricing principle, broad classes of risk measures are generated, of which most classical risk measures appear to be particular cases. This approach shows that most risk measures studied recently in the financial mathematics literature disregard the utility concept (i.e., correspond to linear utilities), restricting their applicability. Some alternatives proposed in the literature are discussed.
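A minimal instance of the equivalent-utility pricing principle mentioned above is the exponential premium principle, where the utility is u(x) = -exp(-a x) and the premium has a closed form. The loss distribution and risk-aversion values below are illustrative assumptions:

```python
# Sketch of the actuarial equivalent-utility (zero-utility) premium for
# exponential utility u(x) = -exp(-a*x), which yields the classical
# exponential premium principle pi(X) = (1/a) * ln E[exp(a*X)].
import math

def exponential_premium(outcomes, probs, a):
    """Premium for loss X under exponential utility with risk aversion a > 0."""
    mgf = sum(p * math.exp(a * x) for x, p in zip(outcomes, probs))
    return math.log(mgf) / a

# A loss of 100 with probability 0.1, else 0 (assumed).
outcomes, probs = [100.0, 0.0], [0.1, 0.9]
expected_loss = sum(p * x for x, p in zip(outcomes, probs))   # 10.0

pi_low  = exponential_premium(outcomes, probs, a=0.001)  # nearly risk-neutral
pi_high = exponential_premium(outcomes, probs, a=0.01)   # more risk-averse
print(expected_loss, pi_low, pi_high)
```

The premium exceeds the expected loss and grows with risk aversion; as a tends to 0 (linear utility) it collapses to the expected value, illustrating the abstract's point that risk measures built on linear utilities discard the utility concept.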

66 citations


Journal ArticleDOI
TL;DR: In this paper, the Allais paradox is used to explain choice shifts in group decision-making, showing that a failure of expected utility is equivalent to a particular configuration of choice shifts.
Abstract: The phenomenon of choice shifts in group decision-making has received much attention in the social psychology literature. Faced with a choice between a “safe” and “risky” decision, group members appear to move to one extreme or the other, relative to the choices each member might have made on her own. Both risky and cautious shifts have been identified in different situations. This paper demonstrates that from an individual decision-making perspective, choice shifts may be viewed as a systematic violation of expected utility theory. We propose a model in which a well-known failure of expected utility — captured by the Allais paradox — is equivalent to a particular configuration of choice shifts. Thus, our results imply a connection between two well-known behavioral regularities, one in individual decision theory and another in the social psychology of groups.
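The Allais failure the paper builds on is a common-consequence violation of the independence axiom; the arithmetic can be checked directly. The prize values below are the illustrative textbook figures, not taken from the paper:

```python
# The Allais common-consequence lotteries. For ANY utility function u,
# EU(A) - EU(B) equals EU(C) - EU(D), so expected utility cannot rank
# A over B while ranking D over C -- yet that is the typical choice pattern.
import math

# Each lottery: list of (probability, prize).
A = [(1.00, 1_000_000)]
B = [(0.10, 5_000_000), (0.89, 1_000_000), (0.01, 0)]
C = [(0.11, 1_000_000), (0.89, 0)]
D = [(0.10, 5_000_000), (0.90, 0)]

def eu(lottery, u):
    return sum(p * u(x) for p, x in lottery)

for u in (math.sqrt, lambda x: x, lambda x: math.log1p(x)):
    gap_AB = eu(A, u) - eu(B, u)
    gap_CD = eu(C, u) - eu(D, u)
    assert abs(gap_AB - gap_CD) < 1e-6   # identical for every u
print("independence forces the same ranking in both pairs")
```

Both differences reduce to 0.11 u(1M) - 0.10 u(5M) - 0.01 u(0), so the typical pattern A over B together with D over C cannot be rationalized by any expected-utility maximizer.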

62 citations


Reference EntryDOI
15 Jan 2006
TL;DR: The standard theory of individual choice under uncertainty consists of the joint hypothesis of expected utility risk preferences and probabilistic beliefs. Experimental work by both psychologists and economists has uncovered systematic departures from both hypotheses, and has led to the development of alternative, usually more general, models, as discussed by the authors.
Abstract: The standard theory of individual choice under uncertainty consists of the joint hypothesis of expected utility risk preferences and probabilistic beliefs. Experimental work by both psychologists and economists has uncovered systematic departures from both hypotheses, and has led to the development of alternative, usually more general, models. Keywords: risk; uncertainty; expected utility; ambiguity; decision making

61 citations


Journal ArticleDOI
TL;DR: In this paper, the authors studied the differentiability of the value functions of the primal and dual optimization problems that appear in the setting of expected utility maximization in incomplete markets, and showed that the key conditions for the results to hold true are that the relative risk aversion coefficient of the utility function is uniformly bounded away from zero and infinity, and that the prices of traded securities are sigma-bounded under the numeraire given by the optimal wealth process.
Abstract: We study the two-times differentiability of the value functions of the primal and dual optimization problems that appear in the setting of expected utility maximization in incomplete markets. We also study the differentiability of the solutions to these problems with respect to their initial values. We show that the key conditions for the results to hold true are that the relative risk aversion coefficient of the utility function is uniformly bounded away from zero and infinity, and that the prices of traded securities are sigma-bounded under the numeraire given by the optimal wealth process.

58 citations


Journal ArticleDOI
TL;DR: In this article, the authors develop an axiomatic theory of decision-making under uncertainty that dispenses with the state-space and develops subjective expected utility models with unique, action-dependent, subjective probabilities, and a utility function defined over wealth-effect pairs.

53 citations


Journal ArticleDOI
Byung Jin Kang, Tong Suk Kim
TL;DR: In this article, the authors extend the analysis to more general cases by assuming wider classes of utility functions, and evaluate the forecasting ability of RN-PDFs and subjective PDFs with five assumed utility functions.

52 citations


Journal ArticleDOI
TL;DR: It is argued that each of Savage's P3 and P4 is incompatible with the Strong Pareto property, and a representation theorem is provided for social preferences satisfying Pareto indifference and conforming to the state-dependent expected utility model.

45 citations


Posted Content
TL;DR: In this article, a single period asset allocation problem of the investor who maximizes the expected utility with respect to non-additive beliefs is studied, and the explicit form solutions for the bounds of no-transaction regions are provided.
Abstract: We study single period asset allocation problems of the investor who maximizes the expected utility with respect to non-additive beliefs. The non-additive beliefs of the investor model the presence of an uncertainty and they are assumed to be consistent with the Maxmin expected utility theory of Gilboa and Schmeidler (1989). The proportional transaction costs are incorporated into the model. We provide the explicit form solutions for the bounds of no-transaction regions which completely determine the optimal policy of the investor.

01 Jan 2006
TL;DR: In this paper, the authors consider the implications of allowing subjects to have natural reference points, in the sense that they derive from "homegrown" expectations about earnings in a dynamic task, rather than from a frame presumed by the observer.
Abstract: Reference points play a major role in differentiating theories of choice under uncertainty. Under expected utility theory the reference point is implicit in the assumptions made about asset integration, which is the same thing as assuming different arguments of the utility function. Under prospect theory the reference point differentiates gains and losses, and the manner in which prospects are evaluated. We consider the implications for both models of allowing subjects to have natural reference points, in the sense that they derive from "homegrown" expectations about earnings in a dynamic task, rather than from a frame presumed by the observer. We elicit initial beliefs about expected earnings, and implement a dynamic decision process in which subjects could win or lose money, and even go bankrupt. In short, we cultivate a fertile and natural breeding ground for the effects of reference points to emerge. To characterize the latent data-generating process in a flexible statistical manner we assume that some observations are generated by an expected utility model and that some observations are generated by a cumulative prospect theory model. This specification leads to a finite mixture model in which reference points may be different from the frame presented by the lottery prizes. We report several striking findings. First, expected utility theory accounts for a large fraction of the observations, despite this setting providing a seemingly more natural breeding ground for prospect theory. Assuming homogeneous, representative agents for each model type, expected utility theory accounts for b of the observations. With demographic heterogeneity controlled for it still accounts for ½ of the observations. Second, the expected utility theory subjects appear to have utility functions defined over their cumulative income over the sequence of tasks, rather than as defined over the prizes in each individual lottery choice.
Finally, we identify demographic characteristics which differentiate the probability that a decision-maker used expected utility theory. Men are much more likely to use expected utility theory than women, racial minorities are more likely than others, those with higher grades are more likely, and those from poorer households are more likely. Thus our results provide insights into the domain of applicability of each of the major choice models, rather than claiming one to be the sole, true model.

Journal ArticleDOI
TL;DR: A procedure which selects a non-dominated strategy that realizes a compromise between the decision maker’s discordant goals at the different decision nodes, and confirms the computational tractability of the model.

Proceedings Article
02 Jun 2006
TL;DR: This work presents an alternate foundation for decision making, in which the primitive objects of choice are syntactic programs, and a representation theorem is proved in the spirit of standard representation theorems, showing that if the DM's preference relation on programs satisfies appropriate axioms, then the state space and outcome space are subjective.
Abstract: In almost all current approaches to decision making, it is assumed that a decision problem is described by a set of states and set of outcomes, and the decision maker (DM) has preferences over a rather rich set of acts, which are functions from states to outcomes. However, most interesting decision problems do not come with a state space and an outcome space. Indeed, in complex problems it is often far from clear what the state and outcome spaces would be. We present an alternate foundation for decision making, in which the primitive objects of choice are syntactic programs. A program can be given semantics as a function from states to outcomes, but does not necessarily have to be described this way. A representation theorem is proved in the spirit of standard representation theorems, showing that if the DM's preference relation on programs satisfies appropriate axioms, then there exist a set S of states, a set O of outcomes, a way of viewing program as functions from S to O, a probability on S, and a utility function on O, such that the DM prefers program a to program b if and only if the expected utility of a is higher than that of b. Thus, the state space and outcome space are subjective, just like the probability and utility; they are not part of the description of the problem. A number of benefits of this approach are discussed.

Journal ArticleDOI
TL;DR: In this paper, the authors consider an economic agent with dynamic preferences over a set of uncertain monetary payoffs and show that the agent's preferences are given by utility functions, which are updated in a time-consistent way as more information becomes available.
Abstract: We consider an economic agent with dynamic preferences over a set of uncertain monetary payoffs. We assume that the agent’s preferences are given by utility functions, which are updated in a time-consistent way as more information becomes available. Our main result is that the agent’s indifference prices are time-consistent if and only if his preferences can be represented with utility functions that are additive with respect to cash. We call such utility functions monetary. The proof is based on a characterization of time-consistency of dynamic utility functions in terms of indifference sets. As a special case, we obtain the result that expected utility leads to time-consistent indifference prices if and only if it is based on a linear or exponential function.
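The cash-additivity ("monetary utility") property singled out above can be seen numerically for exponential utility, whose certainty equivalent shifts one-for-one with sure cash. The payoff distribution and risk-aversion coefficient are illustrative assumptions:

```python
# Numeric sketch of "monetary" utility: for exponential utility the certainty
# equivalent satisfies CE(X + c) = CE(X) + c (additivity with respect to cash),
# the property the paper links to time-consistent indifference prices.
import math

A = 0.5  # risk-aversion coefficient (assumed)

def cert_equiv(outcomes, probs, a=A):
    """Certainty equivalent under u(x) = -exp(-a*x)."""
    eu = sum(p * -math.exp(-a * x) for x, p in zip(outcomes, probs))
    return -math.log(-eu) / a

payoff = [0.0, 1.0, 4.0]
probs = [0.25, 0.5, 0.25]
c = 3.0  # a sure cash amount

ce = cert_equiv(payoff, probs)
ce_shifted = cert_equiv([x + c for x in payoff], probs)
print(ce, ce_shifted)
```

Analytically CE(X + c) = -(1/a) ln E[exp(-a(X + c))] = c + CE(X), so the shift drops out of the expectation; a strictly concave non-exponential, non-linear utility would fail this check.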

Journal ArticleDOI
TL;DR: The analysis shows that although interpreting utility as a cdf and thinking about achieving targets works fine in the case of a single attribute, this approach should be used with caution in the multiattribute case, with cdf representations requiring more caution than target-oriented representations.
Abstract: Targets are used quite often as a management tool, and it has been argued that thinking in terms of targets may be more natural than thinking in terms of utilities. The standard expected-utility framework with a single attribute (such as money) and nondecreasing, bounded utility is equivalent to a target-oriented setting. A utility function, properly scaled, can be expressed as a cumulative distribution function (cdf) and related to the probability of meeting a target value. We consider whether the equivalence of the two approaches extends to the case of multiattribute utility. Our analysis shows that a multiattribute utility function cannot always be expressed in the form of a cumulative distribution function and, furthermore, cannot always be expressed in the form of a target-oriented utility function. However, in each case equivalence does hold for certain well-known classes of utility functions. In general, our results imply that although interpreting utility as a cdf and thinking about achieving targets works fine in the case of a single attribute, this approach should be used with caution in the multiattribute case, with cdf representations requiring more caution than target-oriented representations.
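The single-attribute equivalence described above can be checked by Monte Carlo: a bounded nondecreasing utility scaled to [0, 1] can be read as the cdf of a random target T, and then E[u(X)] = P(X ≥ T) for X independent of T. The choices u(x) = 1 - exp(-x) (the Exp(1) cdf) and X ~ Uniform(0, 2) are illustrative assumptions:

```python
# Monte Carlo check that expected utility equals the probability of meeting
# a random target when utility is read as the target's cdf.
import math
import random

random.seed(1)
N = 200_000

def u(x):
    return 1.0 - math.exp(-x)   # bounded, nondecreasing, scaled to [0, 1]

# Left side: expected utility of X ~ Uniform(0, 2).
xs = [random.uniform(0.0, 2.0) for _ in range(N)]
expected_utility = sum(u(x) for x in xs) / N

# Right side: probability of meeting the random target T ~ Exp(1),
# drawn independently of X.
hits = sum(1 for x in xs if random.expovariate(1.0) <= x)
target_prob = hits / N

print(expected_utility, target_prob)
```

The paper's point is that this identification, automatic with one attribute, can break down for multiattribute utility functions, which need not be expressible as a joint cdf at all.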

Posted Content
TL;DR: This paper showed that the utility function under preference homogeneity obeys an additional and important restriction that was not noted by Tversky and Kahneman (1992), which simplifies the use of prospect theory by reducing the number of free parameters by one.
Abstract: Prospect theory is the main behavioral alternative to expected utility. Tversky and Kahneman (1992) motivate the utility function for gains and losses under prospect theory by using the axiom of preference homogeneity. However, they do not provide the formal proof. We provide the relevant proof. Furthermore, we show that the utility function under preference homogeneity obeys an additional and important restriction that is not noted by Tversky and Kahneman (1992). This simplifies the use of prospect theory by reducing the number of free parameters by one.
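A sketch of why preference homogeneity pins down a power function, via the standard Cauchy functional-equation argument (this is the textbook route, not necessarily the paper's proof):

```latex
\begin{aligned}
  &u(kx) = f(k)\,u(x) \qquad (k, x > 0)\\[2pt]
  &\text{let } w(t) = \ln u(e^{t}),\ \varphi(s) = \ln f(e^{s}):
    \qquad w(s+t) = \varphi(s) + w(t)\\[2pt]
  &\Rightarrow\ \varphi(s) = w(s) - w(0)
    \ \Rightarrow\ w(s+t) + w(0) = w(s) + w(t)\\[2pt]
  &\Rightarrow\ w \text{ affine (under continuity)}
    \ \Rightarrow\ u(x) = u(1)\,x^{\gamma}.
\end{aligned}
```

With the normalization u(1) = 1 this is the power value function x^γ of Tversky and Kahneman (1992), applied separately to gains and losses, which is where the reduction in free parameters comes from.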

Posted ContentDOI
TL;DR: In the television show Affari Tuoi, an individual faces a sequence of binary choices between a risky lottery with equiprobable prizes of up to half a million euros and a monetary amount for certain, as described in this paper.
Abstract: In the television show Affari Tuoi an individual faces a sequence of binary choices between a risky lottery with equiprobable prizes of up to half a million euros and a monetary amount for certain. The decisions of 114 show participants are used to test the predictions of ten decision theories: risk neutrality, expected utility theory, the fanning-out hypothesis (weighted utility theory, transitive skew-symmetric bilinear utility theory), (cumulative) prospect theory, regret theory, rank-dependent expected utility theory, Yaari’s dual model, prospective reference theory and disappointment aversion theory. Assumptions of risk neutrality and loss aversion are clearly violated, respectively, by 55% and 46% of all contestants. There appears to be no evidence of nonlinear probability weighting or disappointment aversion. Observed decisions are generally consistent with the assumption of regret aversion and there is strong evidence for the fanning-out hypothesis. Nevertheless, we find no behavioral patterns that cannot be reconciled within the expected utility framework (or prospective reference theory, which gives identical predictions).

DissertationDOI
01 Jan 2006
TL;DR: In this article, the authors consider the problem of long-term evaluation with a focus on time and uncertainty structure and introduce the concept of intertemporal risk aversion, which takes up an important concern of the precautionary principle regarding a higher willingness to undergo preventive measures to avoid a threat of harm.
Abstract: The present work is dedicated to theoretical aspects of long-term evaluation with a focus on time and uncertainty structure. Motivated along the lines of global warming, the analysis renders contributions to the fields of environmental economics, decision theory, the economics of sustainability and cost benefit analysis. The thesis is structured in three parts. The first part examines the relation between the concepts of weak and strong sustainability and the weight given to future consumption. The second part introduces a generalized evaluation model and a new concept of risk aversion. The latter concept, termed intertemporal risk aversion, takes up an important concern of the precautionary principle. The third part extends the underlying model and analyzes the interaction with other characteristics of intertemporal decision making. The latter include an implied preference for the timing of uncertainty resolution as well as different stationarity assumptions. The first part of the thesis relates to the sustainability debate and the concepts of weak versus strong sustainability. While the advocates of the weak sustainability concept consider man made goods and capital a fair substitute for environmental goods and capital, the advocates of the strong sustainability concept judge such substitutability as highly limited. I show in a stylized growth model, how social discount rates generally fall for a weak sustainability specification of welfare, while they grow for a strong sustainability specification. It turns out that under the given assumptions a strong sustainability specification of welfare implies a lower weight given to future consumption streams than a weak sustainability specification. The second part of the thesis introduces the concept of intertemporal risk aversion in a didactically simplified two period framework. 
The concept takes up an important concern of the precautionary principle regarding a higher willingness to undergo preventive measures in order to avoid a threat of harm. I show that the concern is substantiated as well by von Neumann and Morgenstern's widespread axioms for choice under uncertainty when carefully integrated into a temporal setting. In such a generalized framework, the standard model of intertemporally additive expected utility corresponds to intertemporal risk neutrality. In contrast to the classical concept of (atemporal) risk aversion, the concept of intertemporal risk aversion can be applied immediately to the multi-commodity setting. For the one-commodity special case, the concept closely relates to the attempts to disentangle atemporal risk aversion from intertemporal substitutability. The third part of the thesis extends the model to an arbitrary finite time horizon with generalized preferences and elaborates the corresponding axiomatic and functional characterizations of intertemporal risk aversion. Moreover, I identify different assumptions that simplify the model structure. On the one hand, these assumptions are concerned with a stationary evaluation of certain and uncertain consumption plans. On the other hand, they relate to a deduced preference for the timing of uncertainty resolution. The resulting simplifications make it possible to characterize intertemporal risk aversion by a single parameter, as well as to disentangle atemporal risk aversion from intertemporal substitutability in a non-recursive evaluation structure. Finally, I show that a normatively motivated combination of the assumptions implies that a time-consistent, intertemporally risk-averse decision maker has to choose a zero rate of pure time preference. Instead of devaluing the future for reasons of sheer impatience, such a decision maker is only allowed to give reduced weight to future welfare if uncertainty increases over time.
The major implications of the present work can be divided into two fields. The first field relates to the sustainability debate and the evaluation of the long run. In this regard, the analysis in the first part of the thesis shows that the characterization of weak and strong sustainability through the degree of substitutability between environmental and produced goods stands in a surprising and possibly unwanted relation to the sustainability demand in the sense of a stronger commitment to future consumption streams. The analysis carried out in the last part of the thesis implies that a zero rate of pure time preference can be founded not only on moral considerations, but also on assumptions concerning a time-consistent evaluation of uncertainty. The second field of implications concerns the handling of uncertainty. In particular, the concept of intertemporal risk aversion mediates between the advocates and the opponents of the precautionary principle. On the one hand, it takes up the concern regarding a higher willingness to undergo preventive action than implied by the standard model. On the other hand, intertemporal risk aversion formalizes this concern and reconciles it with the standard assumptions underlying economic evaluation. In that way, it encounters the critique of the precautionary principle as being vague, arbitrary and, thus, paralyzing.

Posted ContentDOI
TL;DR: In this paper, an individual makes random errors when evaluating the expected utility of a risky lottery, but these errors are symmetrically distributed around zero as long as an individual does not make transparent mistakes such as choosing a riskier lottery over its highest possible outcome for certain.
Abstract: An individual makes random errors when evaluating the expected utility of a risky lottery. Errors are symmetrically distributed around zero as long as an individual does not make transparent mistakes such as choosing a risky lottery over its highest possible outcome for certain. This stochastic decision theory explains many well-known violations of expected utility theory such as the fourfold pattern of risk attitudes, the discrepancy between certainty equivalent and probability equivalent elicitation methods, the preference reversal phenomenon, the generalized common consequence effect (the Allais paradox), the common ratio effect and the violations of the betweenness.

Posted Content
TL;DR: In this article, the authors extend the theory of decision-making under uncertainty from a classical environment into a non-classical one and provide representation theorems for qualitative measures and expected utility.
Abstract: In this paper we extend Savage’s theory of decision-making under uncertainty from a classical environment into a non-classical one. We formulate the corresponding axioms and provide representation theorems for qualitative measures and expected utility. We also propose an application in a simple game context in the spirit of Harsanyi.

Journal ArticleDOI
TL;DR: In this paper, the authors show how returns on a stock and prices of call options written on that stock can be used jointly to recover utility of wealth function of the marginal investor in the stock.
Abstract: What do investor utility functions look like? We show how returns on a stock and prices of call options written on that stock can be used jointly to recover the utility-of-wealth function of the marginal investor in the stock. We study whether non-standard preferences have an impact sufficiently large that it is present in stock prices. Using options on the stocks in the Dow Jones Index, we show support for non-concave utility functions with reference points proposed by Kahneman and Tversky, Friedman and Savage, and Markowitz. The evidence for the Kahneman and Tversky prospect theory value function, and the Friedman and Savage and Markowitz utility functions, is much stronger than the support for the standard concave utility function. Together, the utility functions with convex regions and with reference points account for 80% of the market capitalization of the sample stocks. This is the first study to report findings of these utility functions using the prices of individual stocks (nonexperimental data). We also investigate a closely related question of whether different assets reflect different risk preferences. We find evidence showing that different stocks reflect different types of investor utility function.

Book ChapterDOI
01 Jan 2006
TL;DR: In this article, it is shown that the robust-satisficing decision maker may appear paradoxical or irrational if the behavior is modeled from the perspective of expected-utility theory, but perfectly rational if modeled with the info-gap robust-satisficing paradigm.
Abstract: This chapter describes the Ellsberg and Allais “paradoxes,” which are empirical results challenging the classical theory of expected utility, on the basis of info-gap decision theory. It demonstrates that both the Ellsberg and the Allais observations are consistent with info-gap robust-satisficing behavior and applies the info-gap robust-satisficing paradigm to the model of Arrow-Pratt risk aversion as it is developed in the expected-utility framework. In a broad range of situations, robust-satisficing is a better bet, even without knowing the relevant probability distributions, than the direct optimization strategy. The chapter also presents an info-gap robust-satisficing explanation of the equity premium puzzle in financial economics. The robust-satisficing decision maker may appear “paradoxical” or “irrational” if the behavior is modeled from the perspective of expected-utility theory, but perfectly rational if modeled with the info-gap robust-satisficing paradigm. From the info-gap perspective, it is not only the shape of the utility curve which determines risk orientation but also the interaction between the utility curve and the uncertainty perception. Evidence from competitive biological and economic systems, as well as from the Allais “paradox”, seems to suggest that robust-satisficing may be, in some rather general situations, a better bet than direct optimization.
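A toy version of robust-satisficing can make the "paradox" concrete: the robustness of an option is the largest uncertainty horizon at which its worst-case payoff still meets a critical requirement, and ranking by robustness can reverse the ranking by nominal payoff. The interval uncertainty model and all numbers below are illustrative assumptions, not taken from the chapter:

```python
# Toy info-gap robust-satisficing sketch. An option has uncertain return
# rho in [nominal - s*h, nominal + s*h] at uncertainty horizon h, where s
# scales how fast uncertainty grows. The robustness is the largest h at
# which even the worst-case return still meets the requirement r_c.

def robustness(nominal, s, r_c):
    """Largest h with worst-case return nominal - s*h >= r_c (0 if none)."""
    return max(0.0, (nominal - r_c) / s)

r_c = 0.05                                        # satisficing threshold
h_A = robustness(nominal=0.10, s=0.05, r_c=r_c)   # better nominal, vaguer info
h_B = robustness(nominal=0.07, s=0.01, r_c=r_c)   # worse nominal, tighter info

# The nominal-return optimizer picks A, but the robust-satisficer picks B:
# "paradoxical" only when viewed from the optimization side.
print(h_A, h_B)
```

This is the structure behind the chapter's claim: risk orientation depends on the interaction between the payoff (utility) model and the uncertainty perception, not on the utility curve alone.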

Posted Content
TL;DR: In this article, it is shown that the extension of expected utility theory to multiple periods destroys the axiomatic base by introducing timing contradictions in what the chooser knows at a single time point.
Abstract: It is here shown that the extension of expected utility theory to multiple periods destroys the axiomatic base by introducing timing contradictions in what the chooser knows at a single time point. It is shown that some of these timing contradictions remain even if, as Samuelson (1952) proposed, no segment of the outcome space (to which utility attaches) commences before all risk is passed.

01 Jan 2006
TL;DR: In this article, a general n-player link between non-cooperative bargaining and the Nash solution was established, and the equivalence of stationary equilibria of the unanimity bargaining game and the stable set solution was demonstrated.
Abstract: We establish a general n-player link between non-cooperative bargaining and the Nash solution. Non-cooperative bargaining is captured in a reduced form through the von Neumann-Morgenstern (1944) stability concept. A stable set always exists. Moreover, if the utility set has a smooth surface, then any stable set converges to the Nash bargaining solution. Finally, the equivalence of stationary equilibria of the unanimity bargaining game and the stable set solution is demonstrated. JEL Classification: C71, C78
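The Nash bargaining solution that the stable sets converge to is the point of the utility set maximizing the product of gains over the disagreement point. A quick numeric sketch, with an assumed smooth frontier u2 = 1 - u1² and disagreement point (0, 0):

```python
# Grid-search sketch of the two-player Nash bargaining solution:
# maximize the Nash product (u1 - d1)*(u2 - d2) over the utility frontier.

def nash_solution(frontier, d=(0.0, 0.0), steps=100_000):
    """Return the frontier point maximizing the Nash product."""
    best_u1, best_val = 0.0, float("-inf")
    for i in range(steps + 1):
        u1 = i / steps
        u2 = frontier(u1)
        val = (u1 - d[0]) * (u2 - d[1])
        if val > best_val:
            best_u1, best_val = u1, val
    return best_u1, frontier(best_u1)

u1_star, u2_star = nash_solution(lambda u1: 1.0 - u1 ** 2)
# Analytically: maximize u1*(1 - u1^2)  =>  u1* = 1/sqrt(3), u2* = 2/3.
print(u1_star, u2_star)
```

Smoothness of the frontier is exactly the condition the paper needs for stable sets to converge to this point, so the example also marks where the result can fail.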

Journal ArticleDOI
TL;DR: Two alternative ways of resolving the decision problem whenever the outcome is sequence sensitive are proposed, and one way yields a rationalizable choice set that is equivalent to von Neumann–Morgenstern’s stable set.
Abstract: Describing a procedure in which choice proceeds in a sequence, we propose two alternative ways of resolving the decision problem whenever the outcome is sequence sensitive. One way yields a rationalizable choice set, and the other way produces a weakly rationalizable choice set that is equivalent to von Neumann–Morgenstern’s stable set. It is shown that for quasi-transitive rationalization, the maximal set must coincide with its stable set.

Proceedings ArticleDOI
08 May 2006
TL;DR: In the framework of MAUT (multi-attribute utility theory), several methods have been proposed to aggregate attribute utilities and represent a decision maker's (DM's) utility function; the Choquet integral permits modeling preference structures whose attributes are interdependent.
Abstract: In the framework of MAUT (multi-attribute utility theory), several methods have been proposed to aggregate attribute utilities and represent a decision maker's (DM's) utility function. Most of them are additive and assume that the decision attributes are independent of each other. Such additive methods cannot guarantee finding a utility function coherent with the available information, since they do not allow additional information, such as interactions among criteria, to be included. Utility functions expressed in terms of a fuzzy measure and the Choquet integral were therefore proposed, which permit modeling preference structures whose attributes are interdependent. In this paper, the properties of the Choquet integral are first summarized, leading to the conclusion that the Choquet integral suits the aggregation requirements of MAUT; then non-additive utility functions in the Choquet-integral framework are compared with the utility axioms advanced by von Neumann and Morgenstern; finally, the reason why utility functions expressed in terms of the Choquet integral can be consistent with those axioms is given.
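The non-additive aggregation described above can be made concrete with the discrete Choquet integral; a minimal sketch, where the two-criterion fuzzy measure with positive interaction is an illustrative assumption:

```python
def choquet_integral(x, mu):
    """Discrete Choquet integral of attribute utilities x (attribute -> value)
    with respect to a fuzzy measure mu (frozenset of attributes -> weight),
    where mu[frozenset()] = 0 and mu[set of all attributes] = 1."""
    attrs = sorted(x, key=x.get)          # attributes in ascending utility order
    total, prev = 0.0, 0.0
    for i, a in enumerate(attrs):
        coalition = frozenset(attrs[i:])  # attributes scoring at least x[a]
        total += (x[a] - prev) * mu[coalition]
        prev = x[a]
    return total

# Two positively interacting criteria: the pair is worth more than the
# sum of its singletons, so unbalanced profiles are penalized.
mu = {frozenset(): 0.0, frozenset({1}): 0.3,
      frozenset({2}): 0.3, frozenset({1, 2}): 1.0}
choquet_integral({1: 0.6, 2: 0.2}, mu)  # ≈ 0.32, vs. 0.4 for an additive 0.5/0.5 weighting
```

When the fuzzy measure is additive, the coalition weights telescope and the Choquet integral collapses to the ordinary weighted sum, which is the sense in which it generalizes the additive MAUT methods mentioned in the abstract.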

Proceedings Article
13 Jul 2006
TL;DR: In this paper, two axiomatizations of algebraic expected utility, which is a particular generalized expected utility in a von Neumann-Morgenstern setting, are provided.
Abstract: In this paper, we provide two axiomatizations of algebraic expected utility, a particular generalized expected utility, in a von Neumann-Morgenstern setting; i.e., the uncertainty representation is taken as given and is described here by a plausibility measure valued on a semiring, which may be partially ordered. We show that axioms identical to those for expected utility entail that preferences are represented by an algebraic expected utility. This algebraic approach allows many previous proposals (expected utility, binary possibilistic utility, ...) to be unified in the same general framework and proves that the obtained utility enjoys the same desirable features as expected utility: linearity, dynamic consistency, autoduality of the underlying uncertainty representation, autoduality of the decision criterion, and the possibility of modeling the decision maker's attitude toward uncertainty.
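The semiring parameterization can be illustrated by instantiating a single evaluation routine with two different (oplus, otimes) pairs; a sketch under the assumption that plausibilities and utilities share the semiring's value domain:

```python
from functools import reduce

def algebraic_eu(plausibilities, utilities, oplus, otimes):
    """Generalized expected utility over a semiring: combine each outcome's
    plausibility and utility with otimes, then aggregate with oplus."""
    return reduce(oplus, (otimes(p, u) for p, u in zip(plausibilities, utilities)))

# Probabilistic semiring (+, *): recovers standard expected utility.
algebraic_eu([0.25, 0.75], [1.0, 0.0],
             lambda a, b: a + b, lambda a, b: a * b)   # → 0.25
# Possibilistic semiring (max, min): recovers optimistic possibilistic utility.
algebraic_eu([0.4, 1.0], [1.0, 0.2], max, min)          # → 0.4
```

The same three lines of aggregation code thus cover both criteria, which is the unification the abstract claims for the algebraic framework.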

Journal ArticleDOI
TL;DR: In this paper, the authors show that typical risk experimental results are impossible to reconcile with conventional dynamic consumption theory under risk, where people are time consistent and integrate all sources of income perfectly.
Abstract: Recent papers by Cox and Sadiraj (2006) and Rubinstein (2006) have pointed out that expected utility theory is more general than has sometimes been acknowledged, and hence cannot be refuted as easily by means of experiments. While acknowledging this fact, this note nevertheless demonstrates that typical risk-experiment results are impossible to reconcile with conventional dynamic consumption theory under risk, where people are time consistent and integrate all sources of income perfectly.

Journal ArticleDOI
TL;DR: In this article, a new approach for solving the dynamic portfolio selection prob- lem, also known as the Merton (1969) problem, is introduced, based on the dual expected utility (DEU) theory which is a particular class of non-expected utility theory presented in Yaari (1987).
Abstract: In this paper the dynamic portfolio selection problem is studied for the first time in a dual utility theory framework. The Wang transform is used as the distortion function, and well-diversified optimal portfolios result both with and without short sales allowed. In this paper a new approach for solving the dynamic portfolio selection problem, also known as the Merton (1969) problem, is introduced. This approach is based on the dual expected utility (DEU) theory, a particular class of non-expected utility theory presented in Yaari (1987). Unlike the classical expected utility (EU) theory of von Neumann and Morgenstern (1944), the DEU theory overcomes some paradoxes such as Allais (1953) and Ellsberg (1961), as shown in Quiggin (1993). In the DEU framework, "attitudes toward risks are characterized by a distortion applied to probability distribution functions, in contrast to expected utility in which attitudes toward risks are characterized by a utility function of wealth" (Wang-Young (1998)). As far as the authors know, there is no work in the financial and economic literature concerned with the application of the DEU theory to dynamic selection of an asset portfolio. This may be due to the fact that Yaari (1987) shows that DEU theory leads to undiversified portfolios when the decision maker has only two assets available, one risky and one not. However, Hadar-Kun Seo (1995) shows that in the presence of many risky assets DEU theory
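The distortion approach described in the abstract can be sketched for a discrete lottery: Yaari's dual utility weights outcomes by distorted decumulative probabilities, here with the Wang transform g(p) = Phi(Phi^{-1}(p) + lambda) as the distortion. The example lottery and the value of lambda are illustrative assumptions:

```python
from statistics import NormalDist

_N = NormalDist()

def wang_transform(p, lam):
    """Wang distortion g(p) = Phi(Phi^{-1}(p) + lam) of a probability p."""
    if p <= 0.0:
        return 0.0
    if p >= 1.0:
        return 1.0
    return _N.cdf(_N.inv_cdf(p) + lam)

def dual_utility(outcomes, probs, lam):
    """Yaari dual utility of a discrete lottery: outcomes weighted by
    increments of the distorted decumulative probability."""
    pairs = sorted(zip(outcomes, probs), key=lambda t: -t[0])  # best outcome first
    total, s_prev = 0.0, 0.0
    for x, p in pairs:
        s = s_prev + p  # probability of getting at least x
        total += x * (wang_transform(s, lam) - wang_transform(s_prev, lam))
        s_prev = s
    return total
```

With lam = 0 the distortion is the identity and the dual utility reduces to the plain expectation; a nonzero lam reweights probabilities rather than bending a utility-of-wealth function, which is exactly the contrast with von Neumann-Morgenstern EU drawn in the abstract.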