
Showing papers by "Paris Dauphine University published in 2007"


Journal ArticleDOI
TL;DR: In this paper, the authors present three examples of the mean-field approach to modelling in economics and finance (or other related subjects) and show that these nonlinear problems are essentially well-posed problems with unique solutions.
Abstract: We survey here some recent studies concerning what we call mean-field models by analogy with Statistical Mechanics and Physics. More precisely, we present three examples of our mean-field approach to modelling in Economics and Finance (or other related subjects...). Roughly speaking, we are concerned with situations that involve a very large number of “rational players” with limited information (or visibility) on the “game”. Each player chooses his optimal strategy in view of the global (or macroscopic) information that is available to him and that results from the actions of all players. In the three examples we mention here, we derive a mean-field problem which consists of nonlinear differential equations. These equations are of a new type and our main goal here is to study them and establish their links with various fields of Analysis. We show in particular that these nonlinear problems are essentially well-posed problems, i.e., have unique solutions. In addition, we give various limiting cases, examples and possible extensions. And we mention many open problems.
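The mean-field systems alluded to couple a backward Hamilton-Jacobi-Bellman equation for a representative player's value function $u$ with a forward Kolmogorov/Fokker-Planck equation for the population density $m$. In its now-standard form (notation follows the Lasry-Lions literature and is not taken verbatim from the abstract):

$$-\partial_t u - \nu \Delta u + H(x, \nabla u) = V[m], \qquad u(T,\cdot) \text{ given},$$

$$\partial_t m - \nu \Delta m - \operatorname{div}\!\left(m\,\frac{\partial H}{\partial p}(x, \nabla u)\right) = 0, \qquad m(0,\cdot) \text{ given}.$$

Well-posedness here means existence and uniqueness of the pair $(u, m)$, typically under a monotonicity condition on the coupling $V$.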

2,385 citations


Journal ArticleDOI
TL;DR: In this article, a measure of the contribution of unequal opportunities to earnings inequality is proposed, which is based on the distinction between "circumstance" and "effort" variables in John Roemer's work on equality of opportunity.
Abstract: This paper proposes a measure of the contribution of unequal opportunities to earnings inequality. Drawing on the distinction between "circumstance" and "effort" variables in John Roemer's work on equality of opportunity, we associate inequality of opportunities with five observed circumstances which lie beyond the control of the individual--father's and mother's education; father's occupation; race; and region of birth. The paper provides a range of estimates of the importance of these opportunity-forming circumstances in accounting for earnings inequality in one of the world's most unequal countries. We also decompose the effect of opportunities into a direct effect on earnings and an indirect component, which works through the "effort" variables. The decomposition is applied to the distribution of male earnings in urban Brazil, in 1996. The five observed circumstances are found to account for between 10 and 37 percent of the Theil index, depending on cohort and allowing for the possibility of biased coefficient estimates due to unobserved correlates. On average, 60 percent of this impact operates through the direct effect on earnings. Parental education is the most important circumstance affecting earnings, but the occupation of the father and race also play a role.
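The kind of decomposition described can be sketched numerically: the Theil T index splits exactly into a between-group and a within-group term, so the share of inequality attributable to a circumstance-defined partition is the between-group share. The formula is the standard Theil decomposition, not the paper's full estimator, and the earnings figures below are invented:

```python
import math

def theil(y):
    """Theil T index: mean of (y_i / mu) * ln(y_i / mu)."""
    mu = sum(y) / len(y)
    return sum((v / mu) * math.log(v / mu) for v in y) / len(y)

def theil_between(groups):
    """Between-group component: sum over groups of s_g * ln(mu_g / mu),
    where s_g is the group's share of total earnings."""
    pooled = [v for g in groups for v in g]
    mu, total = sum(pooled) / len(pooled), sum(pooled)
    return sum((sum(g) / total) * math.log((sum(g) / len(g)) / mu)
               for g in groups)

# Hypothetical earnings split by one binary "circumstance"
groups = [[5.0, 7.0, 6.0], [12.0, 15.0, 18.0]]
pooled = [v for g in groups for v in g]
share = theil_between(groups) / theil(pooled)
print(f"share of the Theil index explained by the circumstance: {share:.2f}")
```

The decomposition is exact: the total Theil index equals the between-group term plus the earnings-share-weighted sum of within-group Theil indices.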

496 citations


Journal ArticleDOI
TL;DR: This work introduces a semi‐implicit coupling scheme which remains stable for a reasonable range of the discretization parameters and proves (conditional) stability of the scheme for a fully discrete formulation.
Abstract: We address the numerical simulation of fluid-structure systems involving an incompressible viscous fluid. This issue is particularly difficult to face when the fluid added-mass acting on the structure is strong, as happens in hemodynamics for example. Indeed, several works have shown that, in such situations, implicit coupling seems to be necessary in order to avoid numerical instabilities. Although significant improvements have been achieved in recent years, solving implicit coupling often exhibits a prohibitive computational cost. In this work, we introduce a semi-implicit coupling scheme which remains stable for a reasonable range of the discretization parameters. The first idea consists in treating the added-mass effect implicitly, whereas the other contributions (geometrical non-linearities, viscous and convective effects) are treated explicitly. The second idea relies on the fact that this kind of explicit-implicit splitting can be naturally performed using a Chorin-Temam projection scheme in the fluid. We prove (conditional) stability of the scheme for a fully discrete formulation. Several numerical experiments point out the efficiency of the present scheme compared to several implicit approaches.
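In the fluid, the Chorin-Temam projection splitting that the scheme builds on can be sketched as follows (discretisation notation assumed, not taken from the abstract): a tentative velocity step treated explicitly with respect to the geometry,

$$\frac{\tilde{u}^{n+1} - u^n}{\delta t} + (u^n \cdot \nabla)\,\tilde{u}^{n+1} - \nu \Delta \tilde{u}^{n+1} = 0,$$

followed by a projection step enforcing incompressibility,

$$\frac{u^{n+1} - \tilde{u}^{n+1}}{\delta t} + \nabla p^{n+1} = 0, \qquad \operatorname{div} u^{n+1} = 0.$$

Only the projection step, where the added-mass effect enters through the pressure, is solved implicitly together with the structure, which is what keeps the cost below that of a fully implicit scheme.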

273 citations


Journal ArticleDOI
TL;DR: Women with acute myocardial infarction have a higher hospital mortality rate than men; this difference has been ascribed to their older age, more frequent comorbidities, and less freque...
Abstract: Background— Women with acute myocardial infarction have a higher hospital mortality rate than men. This difference has been ascribed to their older age, more frequent comorbidities, and less freque...

259 citations


Book ChapterDOI
20 Jan 2007
TL;DR: This short paper gives a general introduction to computational social choice, by proposing a taxonomy of the issues addressed by this discipline, together with some illustrative examples and an (incomplete) bibliography.
Abstract: Computational social choice is an interdisciplinary field of study at the interface of social choice theory and computer science, promoting an exchange of ideas in both directions. On the one hand, it is concerned with the application of techniques developed in computer science, such as complexity analysis or algorithm design, to the study of social choice mechanisms, such as voting procedures or fair division algorithms. On the other hand, computational social choice is concerned with importing concepts from social choice theory into computing. For instance, the study of preference aggregation mechanisms is also very relevant to multiagent systems. In this short paper we give a general introduction to computational social choice, by proposing a taxonomy of the issues addressed by this discipline, together with some illustrative examples and an (incomplete) bibliography.
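As a concrete instance of the voting procedures the survey discusses, the Borda rule is simple to state computationally. The rule itself is standard; the code and example profile are illustrative, not from the paper:

```python
def borda_scores(profile, candidates):
    """Borda count: with m candidates, a candidate ranked in position p
    (0 = top) receives m - 1 - p points from that voter."""
    m = len(candidates)
    scores = {c: 0 for c in candidates}
    for ranking in profile:
        for pos, c in enumerate(ranking):
            scores[c] += m - 1 - pos
    return scores

# Three voters ranking candidates a, b, c
profile = [["a", "b", "c"], ["a", "c", "b"], ["b", "a", "c"]]
scores = borda_scores(profile, ["a", "b", "c"])
winner = max(scores, key=scores.get)
print(scores, winner)  # a: 5, b: 3, c: 1 → winner "a"
```

Complexity questions of the kind the survey mentions ask, for instance, how hard it is to manipulate such a rule, or to compute a winner for rules where winner determination is not polynomial.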

255 citations


Journal ArticleDOI
TL;DR: An axiomatic analysis of the partitions of alternatives into two categories that can be obtained using what are called “noncompensatory sorting models”, which have strong links with the pessimistic version of ELECTRE TRI.

153 citations


Journal ArticleDOI
TL;DR: In this article, the impact of open market share repurchases on the liquidity of French firms is investigated; the results show that corporate share repurchases have a significant adverse effect on liquidity as measured by bid–ask spread or depth.
Abstract: Research on the impact of open market share repurchases has been hindered by the lack of data available on actual share repurchases in many countries, including the US. Using a previously unused database containing detailed information on 36,848 repurchases made by 352 French firms, we show that corporate share repurchases have a significant adverse effect on liquidity as measured by bid–ask spread or depth. Our results also indicate that share repurchases largely reflect contrarian trading rather than managerial timing ability.

144 citations


Journal ArticleDOI
TL;DR: The attached file is also published in the Cahiers de la Chaire "Les Particuliers face aux Risques" of the Institut de Finance de Dauphine, cahier n°7, March 2007.
Abstract: The attached file is also published in the Cahiers de la Chaire "Les Particuliers face aux Risques" of the Institut de Finance de Dauphine, cahier n°7, March 2007.

137 citations


Journal ArticleDOI
TL;DR: The paper presents the decision aiding process as an extension of the decision process, analysing the types of activities occurring between a “client” and an “analyst” who are both engaged in a decision process.
Abstract: The paper presents the concept of the decision aiding process as an extension of the decision process. The aim of the paper is to analyse the types of activities occurring between a “client” and an “analyst” who are both engaged in a decision process. The decision aiding process is analysed from both a cognitive and an operational point of view, i.e. considering the “products”, or cognitive artifacts, that the process will deliver at the end. Finally, the decision aiding process is considered as a reasoning process for which the update and revision problems hold.

131 citations


Journal ArticleDOI
TL;DR: In this paper, the authors provide an axiomatic analysis of non-compensatory sorting models, with or without veto effects, which contain the pessimistic version of ELECTRE TRI as a particular case.

130 citations


Journal ArticleDOI
TL;DR: This article reviews bandlet approaches to geometric image representations, which use an adaptive segmentation and a local geometric flow well suited to capture the anisotropic regularity of edge structures, leading to state-of-the-art results for image denoising and super-resolution.
Abstract: This article reviews bandlet approaches to geometric image representations. Orthogonal bandlets use an adaptive segmentation and a local geometric flow well suited to capture the anisotropic regularity of edge structures. They are constructed with a “bandletization”, which is a local orthogonal transformation applied to wavelet coefficients. The approximation in these bandlet bases exhibits an asymptotically optimal decay for images that are regular outside a set of regular edges. These bandlets can be used to perform image compression and noise removal. More flexible orthogonal bandlets with fewer vanishing moments are constructed with orthogonal grouplets that group wavelet coefficients along a multiscale association field. Applying a translation-invariant grouplet transform over a translation-invariant wavelet frame leads to state-of-the-art results for image denoising and super-resolution.

Journal ArticleDOI
TL;DR: In this article, the authors systematically and rigorously investigate various stochastic volatility models used in Mathematical Finance and obtain necessary and sufficient conditions on the parameters, such as correlation, of these models in order to have integrable or L p solutions.
Abstract: We investigate here, systematically and rigorously, various stochastic volatility models used in Mathematical Finance. Mathematically, such models involve coupled stochastic differential equations with coefficients that do not obey the natural and classical conditions required to make these models “well-posed”. We obtain necessary and sufficient conditions on the parameters, such as the correlation, of these models in order to have integrable or $L^p$ solutions (for $1 \le p < \infty$).
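A representative member of this class of models (the specific form below is an assumption for illustration; the abstract does not fix one) is a price process with random volatility driven by correlated Brownian motions:

$$dS_t = \sigma_t\, S_t\, dW_t, \qquad d\sigma_t = b(\sigma_t)\,dt + a(\sigma_t)\,dZ_t, \qquad d\langle W, Z\rangle_t = \rho\, dt,$$

and the question studied is for which parameter ranges, in particular which values of the correlation $\rho$, the solution $S_t$ is integrable or admits $L^p$ moments.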

Journal ArticleDOI
TL;DR: In this article, a mixture of importance functions, called a D-kernel, can be iteratively optimized to achieve the minimum asymptotic variance for a function of interest among all possible mixtures.
Abstract: Variance reduction has always been a central issue in Monte Carlo experiments. Population Monte Carlo can be used to this effect, in that a mixture of importance functions, called a D-kernel, can be iteratively optimized to achieve the minimum asymptotic variance for a function of interest among all possible mixtures. The implementation of this iterative scheme is illustrated for the computation of the price of a European option in the Cox-Ingersoll-Ross model. A Central Limit theorem as well as moderate deviations are established for the D-kernel Population Monte Carlo methodology.
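The mixture-of-proposals idea can be illustrated with a toy one-dimensional sketch. The kernels, target, and update below are simplified stand-ins for the D-kernel scheme, and the Cox-Ingersoll-Ross application is not reproduced; each round, every kernel's mixture weight is simply updated to its share of the total normalised importance weight:

```python
import math, random

def npdf(x, mu, sd):
    """Density of N(mu, sd^2)."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def pmc(target, mus, sds, n=4000, iters=4, seed=1):
    """Importance sampling with an adapted mixture of Gaussian kernels:
    sample from the current mixture, weight by target/mixture, then give
    each kernel a mixture weight equal to its share of the total weight."""
    rng = random.Random(seed)
    D = len(mus)
    alpha = [1.0 / D] * D
    est = 0.0
    for _ in range(iters):
        ds = [rng.choices(range(D), weights=alpha)[0] for _ in range(n)]
        xs = [rng.gauss(mus[d], sds[d]) for d in ds]
        dens = [[npdf(x, mus[d], sds[d]) for d in range(D)] for x in xs]
        mix = [sum(a * q for a, q in zip(alpha, row)) for row in dens]
        w = [target(x) / m for x, m in zip(xs, mix)]
        tot = sum(w)
        est = sum(wi * x for wi, x in zip(w, xs)) / tot  # E[X] under the target
        alpha = [sum(wi * alpha[d] * row[d] / m
                     for wi, row, m in zip(w, dens, mix)) / tot
                 for d in range(D)]
    return est, alpha

# Target: an unnormalised standard normal; kernels start at -3 and 0
mean, alpha = pmc(lambda x: math.exp(-0.5 * x * x), [-3.0, 0.0], [1.0, 1.0])
print(round(mean, 2), [round(a, 2) for a in alpha])
```

The adaptation drives nearly all of the mixture weight onto the kernel centred at 0, which matches the target, reducing the variance of subsequent rounds.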

Journal ArticleDOI
01 Aug 2007
TL;DR: This paper shows how IRIS may be used to help the group to iteratively reach an agreement on how to sort one or a few actions at a time, preserving the consistency of these sorting examples both at the individual level and at the collective level.
Abstract: This paper addresses the situation where a group wishes to cooperatively develop a common multicriteria evaluation model to sort actions (projects, candidates) into classes. It is based on an aggregation/disaggregation approach for the ELECTRE TRI method, implemented on the Decision Support System IRIS. We provide a methodology in which the group discusses how to sort some exemplary actions (possibly fictitious ones), instead of discussing what values the model parameters should take. This paper shows how IRIS may be used to help the group to iteratively reach an agreement on how to sort one or a few actions at a time, preserving the consistency of these sorting examples both at the individual level and at the collective level. The computation of information that may guide the discussion among the group members is also suggested. We provide an illustrative example and discuss some paths for future research motivated by this work.
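The pessimistic ELECTRE TRI assignment that IRIS works with can be sketched in a simplified, concordance-only form. Vetoes and thresholds are omitted, and the profiles, weights, and cutting level below are invented for illustration:

```python
def outranks(a, profile, weights, lam):
    """a outranks a profile when the weighted share of criteria on which a
    is at least as good meets the cutting level lam (no vetoes in this sketch)."""
    support = sum(w for ai, pi, w in zip(a, profile, weights) if ai >= pi)
    return support / sum(weights) >= lam

def pessimistic_sort(a, profiles, weights, lam):
    """profiles[k] = lower limit of category k, ordered from best (0) to worst;
    assign a to the first (best) category whose lower profile it outranks."""
    for k, p in enumerate(profiles):
        if outranks(a, p, weights, lam):
            return k
    return len(profiles)  # below every profile: worst category

profiles = [[16, 16, 16], [10, 10, 10]]  # illustrative lower limits
weights = [1, 1, 1]
print(pessimistic_sort([17, 18, 9], profiles, weights, lam=0.6))   # → 0
print(pessimistic_sort([11, 11, 11], profiles, weights, lam=0.6))  # → 1
```

The aggregation/disaggregation approach works in the other direction: group members state sorting examples like the two above, and the system infers profiles and weights consistent with them.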

Journal ArticleDOI
TL;DR: In this article, qualitative properties of efficient insurance contracts in the presence of background risk are examined for all strictly risk-averse expected utility maximizers, using the concept of stochastic increasingness.

Journal ArticleDOI
TL;DR: Blanc et al. present some variants of stochastic homogenization theory for scalar elliptic equations of the form $-\operatorname{div}[A(x/\varepsilon, \omega)\,\nabla u(x,\omega)] = f$.

Journal ArticleDOI
TL;DR: It is shown that the limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density and its derivative are, under comparable smoothness assumptions, the same (up to sign) as in the convex density estimation problem.
Abstract: We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, that is, a density of the form $f_0=\exp\varphi_0$ where $\varphi_0$ is a concave function on $\mathbb{R}$. The pointwise limiting distributions depend on the second and third derivatives at 0 of $H_k$, the "lower invelope" of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of $\varphi_0=\log f_0$ at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode $M(f_0)$ and establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both rate of convergence and dependence of constants on population values.

Journal ArticleDOI
TL;DR: Masmoudi et al. prove the global existence of weak solutions for the co-rotational FENE dumbbell model and for the Doi model (also called the rod model), based on propagation of compactness.

Journal ArticleDOI
TL;DR: The authors assess how policies and institutions affect private returns to investment in tertiary human capital, the ability of individuals to finance this investment, and the institutional characteristics of tertiary education systems.
Abstract: This paper assesses how policies and institutions affect private returns to invest in tertiary human capital, the ability of individuals to finance this investment and the institutional characteristics of tertiary education systems. Focusing on core tertiary education services, the paper presents new measures of private returns to tertiary education, the institutional setting for supplying tertiary education and the availability of individual financing in OECD countries. Using a panel of 19 countries, the number of new tertiary graduates (a proxy for investment in tertiary education) is regressed on these new proposed measures, as well as other standard determinants of investment in tertiary education. The resulting estimates are used to assess empirically the relative importance of several education, taxation and social policies affecting investment in tertiary education. Several avenues for reform and the trade-offs they present for public policy are discussed.

Journal ArticleDOI
TL;DR: In this paper, an overview of some mathematical results, which provide elements of rigorous basis for some multiscale computations in materials science, is presented, focusing on atomistic to continuum limits for crystalline materials.
Abstract: The present article is an overview of some mathematical results, which provide elements of rigorous basis for some multiscale computations in materials science. The emphasis is laid upon atomistic to continuum limits for crystalline materials. Various mathematical approaches are addressed. The setting is stationary. The relation to existing techniques used in the engineering literature is investigated.

Journal ArticleDOI
TL;DR: A new method to approximate the Pareto front based on an evolutionary algorithm with local search is proposed, which solves large-size instances in reasonable CPU time and generates high-quality solutions.
Abstract: Districting problems are of high importance in many different fields. Real-life decision situations are by their very nature multidimensional, so multiple criteria models seem a more adequate representation of districting problems in real-world situations. This paper deals with the problem of partitioning a territory into “homogeneous” zones. Each zone is composed of a set of elementary territorial units. A district map is formed by partitioning the set of elementary units into connected zones without inclusions. When multiple criteria are considered, the problem of enumerating all the efficient solutions for such a model is known to be NP-hard, which is why we decided to avoid using exact methods to solve large-size instances. In this paper, we propose a new method to approximate the Pareto front based on an evolutionary algorithm with local search. The algorithm uses a new solution representation and dedicated crossover/mutation operators. Its main features are the following: it deals with multiple criteria; it solves large-size instances in a reasonable CPU time; and it generates high-quality solutions. The algorithm was applied to a real-world problem, that of public transportation in the Paris region. The results will be used for a discussion about the reform of its current pricing system.
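The notion of efficient (Pareto-optimal) solutions that the algorithm approximates can be made concrete with a small dominance filter. This is an illustrative sketch assuming maximisation on every criterion, not the paper's evolutionary algorithm:

```python
def dominates(u, v):
    """u dominates v if u is no worse on every criterion
    and strictly better on at least one (maximisation)."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

def pareto_front(points):
    """Keep exactly the points that no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Criterion vectors of four hypothetical district maps
pts = [(3, 4), (5, 1), (2, 2), (4, 4)]
print(pareto_front(pts))  # → [(5, 1), (4, 4)]
```

An evolutionary algorithm maintains a population of solutions and repeatedly applies such a filter to its archive, since enumerating the exact front is NP-hard.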

Journal ArticleDOI
TL;DR: In this paper, a case study of a large French group, carried out through an interpretativist approach conducted by way of 41 semi-structured interviews, allowed the conceptualization of the problematique of perceptual evaluation of IS in a particular field study.
Abstract: This research work concerns the perceptual evaluation of the performance of information systems (IS) and, more particularly, the construct of user satisfaction. Faced with the difficulty of obtaining objective measures for the success of IS, user satisfaction appeared as a substitutive measure of IS performance (DeLone & McLean, 1992). Some researchers have indeed shown that the evaluation of an IS cannot happen without an analysis of the feelings and perceptions of the individuals who make use of it. Consequently, the concept of satisfaction has been considered as a guarantee of the performance of an IS, and it has become necessary to examine the drivers of user satisfaction. The analysis of models and measurement tools for satisfaction, together with the adoption of a contingency perspective, has allowed the description of the principal dimensions that have a more or less direct impact on user perceptions. The case study of a large French group, carried out through an interpretativist approach by way of 41 semi-structured interviews, allowed the conceptualization of the problematique of perceptual evaluation of IS in a particular field study. This study led us to confirm the impact of certain factors (such as perceived usefulness, participation, the quality of relations with the IS Function and its resources, and the fit of IS with user needs). By contrast, other dimensions regarded as fundamental receive no consideration, or see their influence nuanced, in the case studied (the properties of the IS, ease of use, the quality of information). Lastly, this study allowed for the identification of the influence of certain contingency and contextual variables on user satisfaction and, above all, for a description of the importance of the interactions between the IS Function and the users.

Proceedings ArticleDOI
15 Apr 2007
TL;DR: This work proposes a scalable distributed data structure (SDDS) called SD-Rtree, which uses a distributed balanced binary spatial tree that scales with insertions to potentially any number of storage servers through splits of the overloaded ones.
Abstract: We propose a scalable distributed data structure (SDDS) called SD-Rtree. We intend our structure for point and window queries over possibly large spatial datasets distributed on clusters of interconnected servers. SD-Rtree generalizes the well-known Rtree structure. It uses a distributed balanced binary spatial tree that scales with insertions to potentially any number of storage servers through splits of the overloaded ones. A user/application manipulates the structure from a client node. The client addresses the tree through its image, which splits can make outdated. This may generate addressing errors, which are resolved by forwarding among the servers. Specific messages toward the clients incrementally correct the outdated images.
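A centralised, in-memory toy version of the underlying idea, a binary spatial tree whose leaves split on overflow, can be sketched as follows. This is illustrative code only; the distribution across servers, client images, and forwarding protocol of SD-Rtree are deliberately omitted, and the capacity and split policy are invented:

```python
CAP = 4  # leaf capacity before a split (assumed, not from the paper)

class Node:
    def __init__(self, depth=0):
        self.depth = depth
        self.points = []       # non-None only at leaves
        self.split = None      # (axis, value) at internal nodes
        self.left = self.right = None

def insert(node, p):
    while node.points is None:                     # descend to a leaf
        axis, val = node.split
        node = node.left if p[axis] < val else node.right
    node.points.append(p)
    if len(node.points) > CAP:                     # overflow: split the leaf
        axis = node.depth % 2                      # alternate split axis
        node.points.sort(key=lambda q: q[axis])
        val = node.points[len(node.points) // 2][axis]
        left = [q for q in node.points if q[axis] < val]
        right = [q for q in node.points if q[axis] >= val]
        if left and right:                         # guard against duplicates
            node.split = (axis, val)
            node.left, node.right = Node(node.depth + 1), Node(node.depth + 1)
            node.left.points, node.right.points = left, right
            node.points = None

def window(node, lo, hi):
    """All stored points p with lo[i] <= p[i] <= hi[i] on both axes."""
    if node.points is not None:
        return [p for p in node.points
                if all(lo[i] <= p[i] <= hi[i] for i in range(2))]
    axis, val = node.split
    out = []
    if lo[axis] < val:
        out += window(node.left, lo, hi)
    if hi[axis] >= val:
        out += window(node.right, lo, hi)
    return out

root = Node()
for p in [(1, 1), (2, 5), (3, 3), (4, 2), (5, 4), (6, 6), (7, 1), (0, 0)]:
    insert(root, p)
print(sorted(window(root, (1, 1), (4, 4))))  # → [(1, 1), (3, 3), (4, 2)]
```

In the distributed setting each leaf corresponds to a server, a split migrates half the data to a new server, and a client routing with a stale copy of the internal nodes is corrected by forwarding.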

Book ChapterDOI
30 May 2007
TL;DR: A best basis extension of compressed sensing recovery is proposed that makes use of sparsity in a tree-structured dictionary of orthogonal bases and improves the recovery with respect to fixed sparsity priors.
Abstract: This paper proposes an extension of compressed sensing that allows the sparsity prior to be expressed in a dictionary of bases. This enables the random sampling strategy of compressed sensing to be used together with an adaptive recovery process that adapts the basis to the structure of the sensed signal. A fast greedy scheme is used during reconstruction to estimate the best basis through iterative refinement. Numerical experiments on sounds and geometrical images show that adaptivity is indeed crucial to capture the structures of complex natural signals.

Journal ArticleDOI
TL;DR: Blanchet et al. systematically study weighted Poincaré-type inequalities, which are closely connected with Hardy-type inequalities, and establish the form of the optimal constants in some cases; such inequalities are then used to relate entropy with entropy production and to obtain intermediate asymptotics results for fast diffusion equations.

Proceedings Article
06 Jan 2007
TL;DR: This work shows how to set up a distributed negotiation framework that will allow a group of agents to reach an allocation of goods that is both efficient and envy-free.
Abstract: Mechanisms for dividing a set of goods amongst a number of autonomous agents need to balance efficiency and fairness requirements. A common interpretation of fairness is envy-freeness, while efficiency is usually understood as yielding maximal overall utility. We show how to set up a distributed negotiation framework that will allow a group of agents to reach an allocation of goods that is both efficient and envy-free.
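For a given allocation under additive utilities, envy-freeness is straightforward to check, which is what the negotiation process must preserve or restore. A minimal sketch with invented agent names and valuations; the paper's negotiation framework itself is not reproduced:

```python
def envy_free(allocation, utility):
    """allocation: agent -> bundle (set of goods); utility: agent -> good -> value.
    Envy-free: no agent strictly prefers another agent's bundle to its own
    (utilities are additive over goods)."""
    def val(i, bundle):
        return sum(utility[i][g] for g in bundle)
    agents = list(allocation)
    return all(val(i, allocation[i]) >= val(i, allocation[j])
               for i in agents for j in agents)

# Hypothetical two-agent instance: each agent receives the good it values more
utility = {"ann": {"x": 3, "y": 1}, "bob": {"x": 1, "y": 2}}
alloc = {"ann": {"x"}, "bob": {"y"}}
print(envy_free(alloc, utility))  # → True
```

Swapping the two goods makes both agents envious, so the same check returns False; a distributed negotiation moves through deals until a state passing this test (and maximising overall utility) is reached.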

Journal ArticleDOI
TL;DR: This paper investigates, for the first time in the literature, the approximation of min–max (regret) versions of classical problems like shortest path, minimum spanning tree, and knapsack, using dynamic programming and classical trimming techniques to establish fully polynomial-time approximation schemes.

Journal ArticleDOI
TL;DR: Improved approximation algorithms and hardness results are presented for MinLST and MinLP, where the goal in MinLP is to identify an s–t path minimizing the combined cost of its labels.
Abstract: Let G=(V,E) be a connected multigraph, whose edges are associated with labels specified by an integer-valued function ℒ:E→ℕ. In addition, each label l∈ℕ has a non-negative cost c(l). The minimum label spanning tree problem (MinLST) asks to find a spanning tree in G that minimizes the overall cost of the labels used by its edges. Equivalently, we aim at finding a minimum cost subset of labels I⊆ℕ such that the edge set {e∈E:ℒ(e)∈I} forms a connected subgraph spanning all vertices. Similarly, in the minimum label s – t path problem (MinLP) the goal is to identify an s–t path minimizing the combined cost of its labels. The main contributions of this paper are improved approximation algorithms and hardness results for MinLST and MinLP.
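The MinLST objective can be made concrete with a small greedy heuristic: repeatedly pick the unused label with the lowest cost per newly merged component until the graph is connected. This is hypothetical illustration code for the problem statement, not the paper's improved algorithms:

```python
class DSU:
    """Union-find over vertices, used to track connected components."""
    def __init__(self, n):
        self.p = list(range(n))
    def find(self, x):
        while self.p[x] != x:
            self.p[x] = self.p[self.p[x]]
            x = self.p[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        self.p[ra] = rb
        return True

def greedy_minlst(n, edges, cost):
    """edges: list of (u, v, label); cost: label -> non-negative cost.
    Returns a set of labels whose edges connect all n vertices, or None."""
    chosen, dsu, comps = set(), DSU(n), n
    while comps > 1:
        best = None
        for lab in set(cost) - chosen:
            trial = DSU(n)
            trial.p = dsu.p[:]          # snapshot of current components
            merged = sum(trial.union(u, v) for u, v, l in edges if l == lab)
            if merged and (best is None or cost[lab] / merged < best[0]):
                best = (cost[lab] / merged, lab)
        if best is None:
            return None                  # graph cannot be connected
        chosen.add(best[1])
        for u, v, l in edges:
            if l == best[1] and dsu.union(u, v):
                comps -= 1
    return chosen

# Four vertices: two cheap labels "r" and "b" suffice; "g" is never needed
edges = [(0, 1, "r"), (1, 2, "r"), (2, 3, "b"), (0, 3, "g")]
print(sorted(greedy_minlst(4, edges, {"r": 1, "b": 1, "g": 5})))  # → ['b', 'r']
```

Greedy heuristics of this flavour appear throughout the MinLST literature; the paper's contribution is tighter approximation guarantees and matching hardness bounds.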

Journal ArticleDOI
TL;DR: It is proved that it is NP-complete to decide whether a bipartite graph of maximum degree three on nk vertices can be partitioned into n paths of length k, and some approximation and inapproximability results are proposed for several related problems.

Journal ArticleDOI
TL;DR: Emerging from the ‘Futures of Learning: New Learning Paradigms Conference’ in Paris, this article argues that the conceptualisation of the Digital Divide into ‘haves’ and ‘have-nots’, with the economically developed world seen as ‘high tech’ and the developing and underdeveloped worlds as ‘low tech’, is no longer tenable.
Abstract: This article emerged from a series of debates and workshops on the impact of the Digital Divide on educational practice at the ‘Futures of Learning: New Learning Paradigms Conference’ in Paris. The conceptualisation of the Digital Divide into the ‘haves’ and the ‘have-nots’, with a perception of the economically developed world as ‘high tech’ and the developing and underdeveloped worlds as ‘low tech’, is no longer tenable. Building on the recognition, based on mounting evidence, that old perceptions of the Digital Divide are simplistic and that the Divide encompasses not one but many discontinuities, the nature of such a discontinuity between student and tutor becomes the focus of the argument presented here. Many have argued that increased use and availability of digital technologies in schools bring important benefits and opportunities for learning and teaching strategies, but are staff and students able to work together to ensure positive outcomes? If not, why might this be the case? In examining the implications of the student/teacher Digital Divide, some questions concerning the future direction of education emerge.