
Showing papers by "Paris Dauphine University" published in 2006


Journal ArticleDOI
TL;DR: The deviance information criterion is reassessed for missing data models, testing the behaviour of various extensions in the cases of mixture and random effect models.
Abstract: The deviance information criterion (DIC) introduced by Spiegelhalter et al. (2002) is directly inspired by linear and generalised linear models, but it is not so naturally defined for missing data models. In this paper, we reassess the criterion for such models, testing the behaviour of various extensions in the cases of mixture and random effect models.
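For reference, the baseline definition that these extensions modify is the classical DIC (a textbook formula, not one specific to this paper): writing $D(\theta) = -2\log p(y\mid\theta)$ for the deviance,

$$
\mathrm{DIC} = \overline{D(\theta)} + p_D, \qquad p_D = \overline{D(\theta)} - D(\bar{\theta}),
$$

where the bar denotes a posterior expectation and $\bar{\theta}$ is the posterior mean. The difficulty for missing data models, and the source of the extensions studied here, is that both the deviance and the plug-in estimate $\bar{\theta}$ can be defined in several ways once latent variables are present.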

860 citations


Journal ArticleDOI
TL;DR: Lasry et al. as mentioned in this paper introduce a general approach for modelling games with a very large number of players; they consider N-player Nash equilibria for stochastic problems over long time horizons and rigorously derive the "mean field"-type equations as N tends to infinity.

802 citations


Journal ArticleDOI
TL;DR: Lasry et al. as mentioned in this paper considered the case of Nash equilibria for stochastic control type problems in finite horizon and presented general existence and uniqueness results for the systems of partial differential equations that they introduced.

776 citations


Journal ArticleDOI
TL;DR: In this article, the authors established quantitative concentration estimates for the empirical measure of many independent variables, in transportation distances, and provided some error bounds for particle simulations in a model mean field problem.
Abstract: We establish quantitative concentration estimates for the empirical measure of many independent variables, in transportation distances. As an application, we provide some error bounds for particle simulations in a model mean field problem. The tools include coupling arguments, as well as regularity and moment estimates for solutions of certain diffusive partial differential equations.

272 citations


Book ChapterDOI
01 Jan 2006
TL;DR: In this paper, a dual characterization of law invariant coherent risk measures satisfying the Fatou property was given, and it was shown that the hypothesis of the Fatou property may actually be dropped, as it is automatically implied by the hypothesis of law invariance.
Abstract: S. Kusuoka [K01, Theorem 4] gave an interesting dual characterization of law invariant coherent risk measures, satisfying the Fatou property. The latter property was introduced by F. Delbaen [D 02]. In the present note we extend Kusuoka’s characterization in two directions, the first one being rather standard, while the second one is somewhat surprising. Firstly we generalize — similarly as M. Frittelli and E. Rosazza Gianin [FG 05] — from the notion of coherent risk measures to the more general notion of convex risk measures as introduced by H. Föllmer and A. Schied [FS 04]. Secondly — and more importantly — we show that the hypothesis of the Fatou property may actually be dropped as it is automatically implied by the hypothesis of law invariance.
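For background, the notion of convex risk measure referred to above comes with a standard dual representation (in the Föllmer–Schied sense; quoted here as general context rather than as a result of this note): for a convex risk measure $\rho$ with the Fatou property,

$$
\rho(X) = \sup_{Q \ll P}\ \big( \mathbb{E}_Q[-X] - \alpha(Q) \big),
$$

for some penalty function $\alpha$ on probability measures; coherent risk measures correspond to penalty functions taking only the values $0$ and $+\infty$.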

253 citations


Proceedings ArticleDOI
17 Jun 2006
TL;DR: This paper presents a low-level system for boundary extraction and segmentation of natural images; a quantitative evaluation of its performance shows that the system significantly outperforms two widely used hierarchical segmentation techniques, as well as the state of the art in local edge detection.
Abstract: This paper presents a low-level system for boundary extraction and segmentation of natural images and the evaluation of its performance. We study the problem in the framework of hierarchical classification, where the geometric structure of an image can be represented by an ultrametric contour map, the soft boundary image associated to a family of nested segmentations. We define generic ultrametric distances by integrating local contour cues along the region boundaries and combining this information with region attributes. Then, we evaluate quantitatively our results with respect to ground-truth segmentation data, proving that our system significantly outperforms two widely used hierarchical segmentation techniques, as well as the state of the art in local edge detection.

230 citations


Journal ArticleDOI
TL;DR: It is found that the rise in health care expenditures due to ageing is relatively small, while the impact of changes in practices is 3.8 times larger than the increase in spending due to population ageing.
Abstract: In this paper we evaluate the respective effects of demographic change, changes in morbidity and changes in practices on growth in health care expenditures. We use microdata, i.e. representative samples of 3441 and 5003 French individuals observed in 1992 and 2000. Our data provide detailed information about morbidity and allow us to observe three components of expenditures: ambulatory care, pharmaceutical and hospital expenditures. We propose an original microsimulation method to identify the components of the drift observed between 1992 and 2000 in the health expenditure age profile. On the one hand, we find empirical evidence of health improvement at a given age: changes in morbidity induce a downward drift of the profile. On the other hand, the drift due to changes in practices is upward and sizeable. Detailed analysis attributes most of this drift to technological innovation. After applying our results at the macroeconomic level, we find that the rise in health care expenditures due to ageing is relatively small. The impact of changes in practices is 3.8 times larger. Furthermore, changes in morbidity induce savings which more than offset the increase in spending due to population ageing.

219 citations


Journal ArticleDOI
TL;DR: In this article, the authors compare the impact of reduced fines and positive rewards and argue that rewarding individuals, including firm employees, can deter collusion more effectively. They also explore explanations for the puzzling fact that managers keep incriminating evidence, and argue that reward programs actually provide additional incentives for keeping such evidence.

216 citations


Posted Content
TL;DR: In this article, the authors evaluate the respective effects of demographic change, changes in morbidity and changes in practices on growth in health care expenditures, and find that the drift due to changes in practice is upward and sizeable.
Abstract: In this paper we evaluate the respective effects of demographic change, changes in morbidity and changes in practices on growth in health care expenditures. We use microdata, i.e. representative samples of 3441 and 5003 French individuals observed in 1992 and 2000. Our data provide detailed information about morbidity and allow us to observe three components of expenditures: ambulatory care, pharmaceutical and hospital expenditures. We propose an original microsimulation method to identify the components of the drift observed between 1992 and 2000 in the health expenditure age profile. On the one hand, we find empirical evidence of health improvement at a given age: changes in morbidity induce a downward drift of the profile. On the other hand, the drift due to changes in practices is upward and sizeable. Detailed analysis attributes most of this drift to technological innovation. After applying our results at the macroeconomic level, we find that the rise in health care expenditures due to ageing is relatively small. The impact of changes in practices is 3.8 times larger. Furthermore, changes in morbidity induce savings which more than offset the increase in spending due to population ageing.

209 citations


Journal ArticleDOI
TL;DR: This paper studies an abstract negotiation framework where agents can agree on multilateral deals to exchange bundles of indivisible resources and shows how certain classes of deals are both sufficient and necessary to guarantee that a socially optimal allocation of resources will be reached eventually.
Abstract: A multiagent system may be thought of as an artificial society of autonomous software agents and we can apply concepts borrowed from welfare economics and social choice theory to assess the social welfare of such an agent society. In this paper, we study an abstract negotiation framework where agents can agree on multilateral deals to exchange bundles of indivisible resources. We then analyse how these deals affect social welfare for different instances of the basic framework and different interpretations of the concept of social welfare itself. In particular, we show how certain classes of deals are both sufficient and necessary to guarantee that a socially optimal allocation of resources will be reached eventually.
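As an illustration of the welfare notions involved (a minimal sketch with made-up agents, resources and additive utilities, not the paper's formal framework), the utilitarian social welfare of an allocation of indivisible resources can be computed and compared before and after a deal:

```python
# Minimal sketch: utilitarian social welfare for allocations of indivisible
# resources, and a check of whether a multilateral deal raises it.
# Agent names, resources and utility functions below are illustrative only.

def social_welfare(allocation, utilities):
    """Sum of each agent's utility for the bundle it currently holds."""
    return sum(utilities[agent](bundle) for agent, bundle in allocation.items())

def improves_welfare(before, after, utilities):
    """True if the deal (before -> after) strictly increases utilitarian welfare."""
    return social_welfare(after, utilities) > social_welfare(before, utilities)

# Example: two agents, three resources, additive utilities (an assumption).
utilities = {
    "a": lambda bundle: sum({"r1": 5, "r2": 1, "r3": 2}[r] for r in bundle),
    "b": lambda bundle: sum({"r1": 2, "r2": 4, "r3": 3}[r] for r in bundle),
}
before = {"a": {"r2", "r3"}, "b": {"r1"}}
after  = {"a": {"r1"}, "b": {"r2", "r3"}}   # agents swap their bundles

print(social_welfare(before, utilities))           # 3 + 2 = 5
print(improves_welfare(before, after, utilities))  # True (welfare rises to 12)
```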

157 citations


Posted Content
TL;DR: By increasing, in a regular way, the number of observed choices from the authors' class of budget sets, one can fully identify the underlying preference relation; this yields testable implications of rational behavior for a wide class of economic environments and a constructive method to derive individual preferences from observed choices.
Abstract: Afriat (1967) showed the equivalence of the strong axiom of revealed preference and the existence of a solution to a set of linear inequalities. From this solution he constructed a utility function rationalizing the choices of a competitive consumer. We extend Afriat's theorem to a class of nonlinear budget sets. We obtain testable implications of rational behavior for a wide class of economic environments, and a constructive method to derive individual preferences from observed choices. In an application to market games, we identify a set of observable restrictions characterizing Nash equilibrium outcomes.
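For background, the linear inequalities in Afriat's classical result (the linear-budget case that the paper generalizes) ask for numbers $U_k$ and $\lambda_k > 0$, one pair per observation $(p_k, x_k)$, such that

$$
U_l \le U_k + \lambda_k\, p_k \cdot (x_l - x_k) \qquad \text{for all observations } k, l;
$$

any solution yields the concave utility $u(x) = \min_k \{\, U_k + \lambda_k\, p_k \cdot (x - x_k) \,\}$, which rationalizes the observed choices.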

Journal ArticleDOI
TL;DR: For the spatially homogeneous Boltzmann equation with hard potentials and Grad's cutoff (e.g. hard spheres), this paper gave quantitative estimates of exponential convergence to equilibrium, and showed that the rate of exponential decay is governed by the spectral gap.
Abstract: For the spatially homogeneous Boltzmann equation with hard potentials and Grad's cutoff (e.g. hard spheres), we give quantitative estimates of exponential convergence to equilibrium, and we show that the rate of exponential decay is governed by the spectral gap for the linearized equation, on which we provide a lower bound. Our approach is based on establishing spectral gap-like estimates valid near the equilibrium, and then connecting the latter to the quantitative nonlinear theory. This leads us to an explicit study of the linearized Boltzmann collision operator in functional spaces larger than the usual linearization setting.
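For reference, the spatially homogeneous Boltzmann equation discussed here has the standard form (written in dimension 3; hard potentials with Grad's cutoff mean a collision kernel $B(|v-v_*|,\cos\theta) = |v-v_*|^\gamma\, b(\cos\theta)$ with $\gamma \in (0,1]$ and $b$ integrable):

$$
\partial_t f(t,v) = Q(f,f)(v) = \int_{\mathbb{R}^3}\int_{S^2} B(|v-v_*|,\cos\theta)\,\big(f'\,f'_* - f\, f_*\big)\, d\sigma\, dv_*,
$$

with the usual shorthand $f = f(t,v)$, $f_* = f(t,v_*)$ and primes denoting post-collisional velocities.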

Journal ArticleDOI
TL;DR: This paper proposes mathematical programs to infer veto-related parameters, first considering only one criterion, then all criteria simultaneously, using the original version of the Electre outranking relation and two variants, and shows that these inference procedures lead to linear programming, 0–1 linear programming, or separable programming problems, depending on the case.

Journal ArticleDOI
TL;DR: IBS has a significant impact on HRQOL of patients, and specific characteristics such as gender, symptom severity and time since onset of symptoms are predictive of more impaired health-related quality of life.
Abstract: Objective: To assess the impact of irritable bowel syndrome (IBS) on patients' quality of life (QOL). Method: Two QOL scales were administered by telephone to a sample of 253 French patients with IBS recruited from the general population. IBS was diagnosed using the Manning, Rome I and Rome II criteria. Patients with an organic disease were excluded from the study. A generic scale, the SF-36, and a specific scale, the IBSQOL, were used. Results: Among patients with IBS, QOL scores were significantly lower (p …). Conclusion: IBS has a strong impact on patients' QOL. Specific characteristics such as female gender, symptom severity and the duration of symptoms can predict an even more impaired quality of life.

Journal ArticleDOI
TL;DR: A new stochastic algorithm for Bayesian-optimal design in nonlinear and high-dimensional contexts is proposed, together with a formalization of the problem in the framework of Bayesian decision theory that takes into account physicians' knowledge and motivations.
Abstract: We propose a new stochastic algorithm for Bayesian-optimal design in nonlinear and high-dimensional contexts. Following Peter Müller, we solve an optimization problem by exploring the expected utility surface through Markov chain Monte Carlo simulations. The optimal design is the mode of this surface considered as a probability distribution. Our algorithm relies on a “particle” method to efficiently explore high-dimensional multimodal surfaces, with simulated annealing to concentrate the samples near the modes. We first test the method on an optimal allocation problem for which the explicit solution is available, to compare its efficiency with a simpler algorithm. We then apply our method to a challenging medical case study in which an optimal protocol treatment needs to be determined. For this case, we propose a formalization of the problem in the framework of Bayesian decision theory, taking into account physicians' knowledge and motivations. We also briefly review further improvements and alternatives.
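As a rough illustration of this family of algorithms (a toy one-dimensional design problem with an assumed prior and utility, not the paper's particle method or its medical case study), the expected utility surface can be explored by a Metropolis chain whose target is the expected utility raised to an increasing power:

```python
# Minimal sketch of a Muller-style stochastic search for a Bayesian-optimal
# design: expected utility is estimated by Monte Carlo under the prior, and a
# random-walk Metropolis chain on the design targets (expected utility)^J,
# with the power J increased over iterations (simulated annealing) so that
# samples concentrate near the optimal design. The prior, utility and tuning
# constants below are made up for illustration.
import math
import random

random.seed(0)

def expected_utility(d, n_sim=200):
    """Monte Carlo estimate of E_theta[u(d, theta)] for the toy problem."""
    total = 0.0
    for _ in range(n_sim):
        theta = random.gauss(1.0, 0.5)        # prior on the unknown "sensitivity"
        total += math.exp(-(d - theta) ** 2)  # toy utility, maximal when d ~ theta
    return total / n_sim

d = -2.0                                      # initial design
for step in range(3000):
    J = 1.0 + step / 300.0                    # annealing schedule
    d_new = d + random.gauss(0.0, 0.2)        # random-walk proposal
    log_accept = J * (math.log(expected_utility(d_new))
                      - math.log(expected_utility(d)))
    if math.log(random.random()) < log_accept:
        d = d_new

print("approximate optimal design:", d)       # should end up near 1.0
```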

Journal ArticleDOI
TL;DR: In this article, the authors considered the cell division equation which describes the continuous growth of cells and their division in two pieces and gave general assumptions on the coefficient so that they can prove the existence of a solution (λ, N, ϕ) to the related eigenproblem.
Abstract: We consider the cell division equation which describes the continuous growth of cells and their division into two pieces. Growth conserves the total number of cells while division conserves the total mass of the system but increases the number of cells. We give general assumptions on the coefficient so that we can prove the existence of a solution (λ, N, ϕ) to the related eigenproblem. We also prove that the solution can be obtained as the sum of an explicit series. Our motivation, besides its applications to biology and fragmentation, is that the eigenelements allow us to prove a priori estimates and long-time asymptotics through the General Relative Entropy.
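For concreteness, a standard form of the equal-mitosis cell division equation and of the associated eigenproblem is the following (written with unit growth rate, a simplification of this sketch rather than an assumption of the paper):

$$
\partial_t n(t,x) + \partial_x n(t,x) + B(x)\, n(t,x) = 4 B(2x)\, n(t,2x), \qquad x > 0,
$$

and the eigenelements $(\lambda, N, \varphi)$ solve

$$
\partial_x N(x) + (\lambda + B(x)) N(x) = 4 B(2x) N(2x), \qquad
-\partial_x \varphi(x) + (\lambda + B(x)) \varphi(x) = 2 B(x) \varphi(x/2),
$$

with $N, \varphi \ge 0$, $\int_0^\infty N = 1$ and $\int_0^\infty N\varphi = 1$.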

Journal ArticleDOI
TL;DR: In this article, a general version of the super-replication theorem has been proved for the case of a matrix-valued cadlag bid-ask process evolving in continuous time.
Abstract: We prove a general version of the super-replication theorem, which applies to Kabanov's model of foreign exchange markets under proportional transaction costs. The market is described by a matrix-valued càdlàg bid-ask process $(\Pi_t)_{t\in[0,T]}$ evolving in continuous time. We propose a new definition of admissible portfolio processes as predictable (not necessarily right or left continuous) processes of finite variation related to the bid-ask process by economically meaningful relations. Under the assumption of existence of a Strictly Consistent Price System (SCPS), we prove a closure property for the set of attainable vector-valued contingent claims. We then obtain the super-replication theorem as a consequence of that property, thus generalizing to possibly discontinuous bid-ask processes analogous results obtained by Kabanov (11), Kabanov and Last (12) and Kabanov and Stricker (15). Rasonyi's counter-example (16) served as an important motivation for our approach.

Journal ArticleDOI
TL;DR: In this article, the authors examine the difficulties involved in the refurbishment of large complex organizations and examine these difficulties empirically empirically using the case of an Australian public sector agency subject to "corporatization".
Abstract: Purpose – Modern bureaucracies are under reconstruction, bureaucracy being no longer “modern”; they are becoming “post”-bureaucratic. Defining the post‐bureaucratic organization as a hybrid form provides insight into the intrinsic difficulties involved in the refurbishment of large complex organizations. The purpose of this paper is to examine these difficulties empirically. Design/methodology/approach – The paper describes the case of an Australian public sector agency subject to “corporatization” – a metamorphosis from a strictly public sector outlook to one that was imputedly more commercial. It focuses on the transition from personnel management to strategic HRM in the HR function. Findings – A series of difficulties affected these changes: difficulties in inventing a new identity; differences in perception of that identity; organizational philosophy towards strategic HRM; unsuitability of extant networks; and identity conflicts. Two factors emerge as the core explanation for the difficulties encountered.

Journal ArticleDOI
TL;DR: In this paper, the authors examine how buyers control their suppliers in asymmetric interfirm transactional relationships and propose a conceptual framework based on transaction cost economics and the relational exchange view.

Posted Content
TL;DR: In this paper, the authors analyze the impact of heterogeneous beliefs in an otherwise standard competitive complete markets discrete time economy and show that the construction of a consensus belief, as well as a consensus consumer are valid modulo a predictable aggregation bias, which takes the form of a discount factor.
Abstract: The aim of the paper is to analyze the impact of heterogeneous beliefs in an otherwise standard competitive complete markets discrete time economy. The construction of a consensus belief, as well as a consensus consumer are shown to be valid modulo a predictable aggregation bias, which takes the form of a discount factor. We use our construction of a consensus consumer to investigate the impact of beliefs heterogeneity on the CCAPM and on the expression of the risk free rate. We focus on the pessimism/doubt of the consensus consumer and we study their impact on the equilibrium characteristics (market price of risk, risk free rate). We finally analyze how pessimism and doubt at the aggregate level result from pessimism and doubt at the individual level.

Journal ArticleDOI
TL;DR: In this article, the authors extend Mousseau et al. (2003) to incorporate information about the confidence attached to each assignment example, hence providing inconsistency resolutions that the DMs are most likely to accept.
Abstract: Sorting models consist in assigning alternatives evaluated on several criteria to ordered categories. To implement such models it is necessary to set the values of the preference parameters used in the model. Rather than fixing the values of these parameters directly, a usual approach is to infer these values from assignment examples provided by the decision maker (DM), i.e., alternatives for which (s)he specifies a required category. However, assignment examples provided by DMs can be inconsistent, i.e., may not match the sorting model. In such situations, it is necessary to support the DMs in the resolution of this inconsistency. In this paper, we extend algorithms from Mousseau et al. (2003) that calculate different ways to remove assignment examples so that the information can be represented in the sorting model. The extension concerns the possibility to relax (rather than to delete) assignment examples. These algorithms incorporate information about the confidence attached to each assignment example, hence providing inconsistency resolutions that the DMs are most likely to accept.

Journal ArticleDOI
TL;DR: In this paper, a Markov chain game with lack of information on one side is considered, where only Player 1 is informed of the current state, then the corresponding matrix game is played, and the actions chosen are observed by both players before proceeding to the next stage.
Abstract: We consider a two-player zero-sum game, given by a Markov chain over a finite set of states and a family of matrix games indexed by states. The sequence of states follows the Markov chain. At the beginning of each stage, only Player 1 is informed of the current state, then the corresponding matrix game is played, and the actions chosen are observed by both players before proceeding to the next stage. We call such a game a Markov chain game with lack of information on one side. This model generalizes the model of Aumann and Maschler of zero-sum repeated games with lack of information on one side (which corresponds to the case where the transition matrix of the Markov chain is the identity matrix). We generalize the proof of Aumann and Maschler and, from the definition and the study of appropriate nonrevealing auxiliary games with infinitely many stages, show the existence of the uniform value. An important difference with Aumann and Maschler's model is that here the notions for Player 1 of using the information and revealing a relevant information are distinct.

Posted Content
TL;DR: In this article, it was shown that the ex ante incentive compatible core of an exchange economy with private information can be empty, even if utility functions are quasi-linear, whereas it is always non-empty in a private information version of Shapley and Scarf's economies with indivisible goods.
Abstract: The ex ante incentive compatible core of an exchange economy with private information is the (standard) core of a socially designed characteristic function, which expresses the fact that coalitions allocate goods by means of random incentive compatible mechanisms. We first survey some results in the case of perfectly divisible goods. Examples then show that the ex ante incentive compatible core can be empty, even if utility functions are quasi-linear. If, in addition to quasi-linearity, further assumptions are made (like independent private values), the non-emptiness of the core follows nevertheless from d'Aspremont and Gerard-Varet's construction of incentive compatible, ex post efficient mechanisms. We also introduce a private information version of Shapley and Scarf's economies with indivisible goods, and prove that the ex ante incentive compatible core is always non-empty in this framework.

Journal ArticleDOI
TL;DR: In this paper, the authors examine open market stock repurchases in France and find a positive average market reaction to the repurchase announcement, however, the magnitude of the price reaction is found to depend on a number of corporate governance structure measures.
Abstract: This paper examines open market stock repurchases in France. We find a positive average market reaction to the repurchase announcement. However, the magnitude of the price reaction is found to depend on a number of corporate governance structure measures. The positive aspects of the announcement only appear for a company with a low likelihood of being taken over, and with a low risk of minority shareholder expropriation. Specifically, stock repurchase programs are good news when the firm is supported by foreign institutional investors, and in the case of controlled firms, when the firm has a second large shareholder, which guarantees an effective balance of power for the controlling shareholders.

Journal ArticleDOI
TL;DR: In this article, a new methodology for modeling intraday volume which allows for a reduction of the execution risk in VWAP (Volume Weighted Average Price) orders is presented, based on decomposition of traded volume into two parts: one reflects volume changes due to market evolutions; the second describes the stock specific volume pattern.
Abstract: In this paper, we present a new methodology for modelling intraday volume which allows for a reduction of the execution risk in VWAP (Volume Weighted Average Price) orders. The results are obtained for all stocks included in the CAC40 index at the beginning of September 2004. The models considered are based on the decomposition of traded volume into two parts: one reflects volume changes due to market evolutions; the second describes the stock-specific volume pattern. The dynamics of the specific part of volume are described by ARMA and SETAR models. The implementation of VWAP strategies imposes some dynamical adjustments within the day.
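As a rough illustration of the decomposition idea (synthetic data, an arbitrary market-volume proxy and arbitrary ARMA orders, none of which come from the paper), one can strip out a market-driven component and fit an ARMA model to the stock-specific remainder using statsmodels:

```python
# Sketch: decompose a stock's intraday volume into a market-driven part and a
# stock-specific part, then fit an ARMA model to the specific part.
# Synthetic data and model orders are illustrative assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
n_bins = 500
market_volume = np.abs(rng.normal(1000, 200, n_bins))        # market-wide activity
specific = 50 * np.sin(np.linspace(0, 20, n_bins)) + rng.normal(0, 10, n_bins)
stock_volume = 0.3 * market_volume + specific                 # observed stock volume

# Market-driven component: proportional to market volume (a simple proxy here).
beta = np.dot(stock_volume, market_volume) / np.dot(market_volume, market_volume)
specific_part = stock_volume - beta * market_volume

# ARMA(1,1) dynamics for the stock-specific part (order chosen arbitrarily).
model = ARIMA(specific_part, order=(1, 0, 1)).fit()
forecast = model.forecast(steps=10)            # next 10 intraday bins
print(forecast)
```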

Journal ArticleDOI
TL;DR: In this article, it was shown that there exists a positive constant $C(\gamma)$ such that, if $\gamma > d/2$, then $\sum_{i\in\mathbb{N}^*} \big(\lambda_i(V)\big)^{-\gamma} \le C(\gamma) \int_{\mathbb{R}^d} V^{\,d/2-\gamma}\, dx$ $(*)$.

Journal ArticleDOI
TL;DR: In this paper, it was shown that for a Dirac operator, with no resonance at thresholds nor eigenvalue at thresholds, the propagator satisfies propagation and dispersive estimates, when this linear operator has only two simple eigenvalues sufficiently close to each other.
Abstract: We prove that for a Dirac operator, with no resonance at thresholds nor eigenvalue at thresholds, the propagator satisfies propagation and dispersive estimates. When this linear operator has only two simple eigenvalues sufficiently close to each other, we study an associated class of nonlinear Dirac equations which have stationary solutions. As an application of our decay estimates, we show that these solutions have stable directions which are tangent to the subspaces associated with the continuous spectrum of the Dirac operator. This result is the analogue, in the Dirac case, of a theorem by Tsai and Yau about the Schrödinger equation. To our knowledge, the present work is the first mathematical study of the stability problem for a nonlinear Dirac equation.

Posted Content
01 Jan 2006
TL;DR: In this article, the authors investigate the latent structures in the volatilities of the business cycle and stock market valuations by estimating a Markov switching stochastic volatility model and propose a sequential Monte Carlo technique for the Bayesian inference on both the unknown parameters and the latent variables of the hidden Markov model.
Abstract: The recently observed decline of business cycle variability suggests that broad macroeconomic risk may have fallen as well. This may in turn have some impact on equity risk premia. We investigate the latent structures in the volatilities of the business cycle and stock market valuations by estimating a Markov switching stochastic volatility model. We propose a sequential Monte Carlo technique for the Bayesian inference on both the unknown parameters and the latent variables of the hidden Markov model. Sequential importance sampling is used for filtering the latent variables, and a kernel estimator with multiple bandwidths is employed to reconstruct the parameter posterior distribution. We find that the switch to lower variability has occurred in both business cycle and stock market variables along similar patterns.
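As an illustration of the filtering side of such a method (fixed, made-up parameters and a plain bootstrap filter; the paper's algorithm additionally handles parameter estimation with a multiple-bandwidth kernel step), a basic particle filter for a two-regime Markov-switching stochastic volatility model looks like this:

```python
# Sketch of a bootstrap particle filter for a two-regime Markov-switching
# stochastic volatility model. Only the latent states are filtered here, for
# assumed parameter values; this is an illustration of the filtering idea only.
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([-1.0, 1.0])       # regime-specific mean log-volatility (assumed)
phi, sigma = 0.95, 0.2           # persistence and noise of log-volatility (assumed)
P = np.array([[0.98, 0.02],      # regime transition matrix (assumed)
              [0.02, 0.98]])

def particle_filter(y, n_particles=2000):
    """Return filtered P(regime = 1 | data up to t) for each t."""
    s = rng.integers(0, 2, n_particles)              # regime particles
    h = rng.normal(mu[s], 0.5)                       # log-volatility particles
    prob_high = np.zeros(len(y))
    for t, yt in enumerate(y):
        # Propagate regimes: stay with probability P[s, s], otherwise switch.
        stay = rng.random(n_particles) < P[s, s]
        s = np.where(stay, s, 1 - s)
        # Propagate log-volatility around the regime-specific mean.
        h = mu[s] + phi * (h - mu[s]) + sigma * rng.normal(size=n_particles)
        # Weight by the observation density y_t ~ N(0, exp(h_t)).
        logw = -0.5 * (h + yt ** 2 * np.exp(-h))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        prob_high[t] = np.sum(w * (s == 1))
        # Multinomial resampling.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        s, h = s[idx], h[idx]
    return prob_high

# Usage on synthetic returns (stand-ins for stock market or business cycle data).
y = rng.standard_normal(300) * np.exp(rng.normal(-0.5, 0.5, 300) / 2)
print(particle_filter(y)[:10])
```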

Proceedings Article
02 Jun 2006
TL;DR: The aim of this paper is to establish correspondences between certain types of weighted formulas and well-known classes of utility functions (such as monotonic, concave or k-additive functions) and to obtain results on the comparative succinctness of different types of weighted formulas for representing the same class of utility functions.
Abstract: As proposed in various places, a set of propositional formulas, each associated with a numerical weight, can be used to model the preferences of an agent in combinatorial domains. If the range of possible choices can be represented by the set of possible assignments of propositional symbols to truth values, then the utility of an assignment is given by the sum of the weights of the formulas it satisfies. Our aim in this paper is twofold: (1) to establish correspondences between certain types of weighted formulas and well-known classes of utility functions (such as monotonic, concave or k-additive functions); and (2) to obtain results on the comparative succinctness of different types of weighted formulas for representing the same class of utility functions.
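To make the representation concrete (a minimal sketch; the propositional symbols, formulas and weights are made up), the utility of a truth assignment is simply the sum of the weights of the formulas it satisfies:

```python
# Sketch: utility of a truth assignment under a set of weighted propositional
# formulas. Formulas are represented as Python predicates over the assignment;
# the example weights and formulas are illustrative only.

def utility(assignment, weighted_formulas):
    """Sum of weights of the formulas satisfied by the assignment."""
    return sum(w for formula, w in weighted_formulas if formula(assignment))

# Goods x, y, z; the agent's preferences over bundles:
weighted_formulas = [
    (lambda a: a["x"], 5.0),                    # having x is worth 5
    (lambda a: a["y"] and a["z"], 3.0),         # y and z together are worth 3
    (lambda a: not (a["x"] and a["y"]), 1.0),   # a mild clause against holding both x and y
]

print(utility({"x": True, "y": False, "z": True}, weighted_formulas))   # 5 + 1 = 6
print(utility({"x": True, "y": True, "z": True}, weighted_formulas))    # 5 + 3 = 8
```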

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the relationship between the trading activity of a crossing network and the liquidity of a traditional dealer market by comparing data from the SEAQ quote-driven segment of the London Stock Exchange (LSE) and internal data from the POSIT crossing network.
Abstract: This article provides new insights into market competition between traditional exchanges and alternative trading systems in Europe. It investigates the relationship between the trading activity of a crossing network (CN) and the liquidity of a traditional dealer market (DM) by comparing data from the SEAQ quote-driven segment of the London Stock Exchange (LSE) and internal data from the POSIT crossing network. A cross-sectional analysis of bid-ask spreads shows that DM spreads are negatively related to CN executions. Risk-sharing benefits from CN trading dominate fragmentation and cream-skimming costs. Further, risk-sharing gains are found to be related to dealer trading in the CN.