
Showing papers by "Paris Dauphine University" published in 2005


Journal ArticleDOI
TL;DR: An algorithm is constructed to split an image into a sum u + v of a bounded variation component and a component containing the textures and the noise, inspired by a recent work of Y. Meyer.
Abstract: We construct an algorithm to split an image into a sum u + v of a bounded variation component and a component containing the textures and the noise. This decomposition is inspired by a recent work of Y. Meyer. We find this decomposition by minimizing a convex functional which depends on the two variables u and v, alternately in each variable. Each minimization is based on a projection algorithm to minimize the total variation. We carry out the mathematical study of our method. We present some numerical results. In particular, we show how the u component can be used in nontextured SAR image restoration.

369 citations
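The core building block of such a u + v decomposition is a projection algorithm for total variation minimization. The sketch below is a loose illustration, not the authors' exact alternating scheme: it implements Chambolle's fixed-point projection for the ROF model min_u lam·TV(u) + ½‖u − f‖², after which the oscillatory part is v = f − u. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def tv(u):
    """Anisotropic total variation, used only for monitoring."""
    return np.abs(np.diff(u, axis=0)).sum() + np.abs(np.diff(u, axis=1)).sum()

def _grad(u):
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]   # forward differences, Neumann boundary
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def _div(px, py):
    """Negative adjoint of _grad, so that <grad u, p> = -<u, div p>."""
    d = np.zeros_like(px)
    d[:-1, :] += px[:-1, :]; d[1:, :] -= px[:-1, :]
    d[:, :-1] += py[:, :-1]; d[:, 1:] -= py[:, :-1]
    return d

def denoise_tv(f, lam=0.25, n_iter=200, tau=0.125):
    """Chambolle's projection iteration for min_u lam*TV(u) + 0.5*||u - f||^2."""
    px = np.zeros_like(f); py = np.zeros_like(f)
    for _ in range(n_iter):
        gx, gy = _grad(_div(px, py) - f / lam)
        norm = np.sqrt(gx ** 2 + gy ** 2)
        px = (px + tau * gx) / (1.0 + tau * norm)
        py = (py + tau * gy) / (1.0 + tau * norm)
    return f - lam * _div(px, py)          # u: the geometric (BV) part

rng = np.random.default_rng(0)
f = np.zeros((32, 32)); f[:, 16:] = 1.0    # piecewise-constant step image
f += 0.1 * rng.standard_normal(f.shape)    # noise, which should end up in v
u = denoise_tv(f)
v = f - u                                  # texture + noise component
```

By construction u + v = f, and the total variation of u is below that of the noisy input, which is the behaviour the decomposition relies on.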


Journal ArticleDOI
TL;DR: In this paper, the authors introduce the notion of the General Relative Entropy (GRE) inequality for several linear PDEs and give several types of applications: a priori estimates and existence of solutions, long-time convergence to a steady state, and attraction to periodic solutions under periodic forcing.

276 citations


Book ChapterDOI
01 Jan 2005
TL;DR: In this paper, the authors present an overview of the main operational approaches of multicriteria decision aiding (MCDA) and discuss some more philosophical aspects of MCDA: the path of realism, which leads to the quest for a description for discovering; the axiomatic path, often associated with the quest for norms for prescribing; and the path of constructivism, which goes hand in hand with the quest for working hypotheses for recommending.
Abstract: The purpose of this introductory part is to present an overall view of what MCDA is today. In Section 1, I will attempt to bring answers to questions such as: what is it reasonable to expect from MCDA? Why is decision aiding more often multicriteria than monocriterion? What are the main limitations to objectivity? Section 2 will be devoted to a presentation of the conceptual architecture that constitutes the main keys for analyzing and structuring problem situations. Decision aiding cannot and must not be envisaged jointly with a hypothesis of perfect knowledge. Different ways of apprehending the various sources of imperfect knowledge will be introduced in Section 3. A robustness analysis is necessary in most cases. The crucial question of how we can take all criteria into account comprehensively in order to compare potential actions to one another will be tackled in Section 4. In this introductory part, I will only present a general framework for positioning the main operational approaches that exist today. In Section 5, I will discuss some more philosophical aspects of MCDA. To provide some aid in a decision context, we have to choose which among different paths seems the most appropriate, or how to combine some of them: the path of realism, which leads to the quest for a description for discovering; the axiomatic path, which is often associated with the quest for norms for prescribing; or the path of constructivism, which goes hand in hand with the quest for working hypotheses for recommending.

275 citations


Journal ArticleDOI
TL;DR: The starting point is a selection-mutation equation describing the adaptive dynamics of a quantitative trait under the influence of an ecological feedback loop; under the assumption of small (but frequent) mutations, a Hamilton-Jacobi equation is derived from it.

258 citations


Journal ArticleDOI
TL;DR: In this article, a dual characterization of law invariant coherent risk measures, satisfying the Fatou property, was given, and the notion of the Lebesgue property of a convex risk measure was introduced.
Abstract: S. Kusuoka [K 01, Theorem 4] gave an interesting dual characterization of law invariant coherent risk measures, satisfying the Fatou property. The latter property was introduced by F. Delbaen [D 02]. In the present note we extend Kusuoka's characterization in two directions, the first one being rather standard, while the second one is somewhat surprising. Firstly we generalize — similarly as M. Frittelli and E. Rosazza Gianin [FG05] — from the notion of coherent risk measures to the more general notion of convex risk measures as introduced by H. Föllmer and A. Schied [FS 04]. Secondly — and more importantly — we show that the hypothesis of the Fatou property may actually be dropped, as it is automatically implied by the hypothesis of law invariance. We also introduce the notion of the Lebesgue property of a convex risk measure, where the inequality in the definition of the Fatou property is replaced by an equality, and give some dual characterizations of this property.

252 citations


Journal ArticleDOI
TL;DR: In this paper, the existence of weak solutions for an unsteady fluid-structure interaction problem with a flexible elastic plate located on one part of the fluid boundary is studied.
Abstract: The purpose of this work is to study the existence of solutions for an unsteady fluid-structure interaction problem. We consider a three-dimensional viscous incompressible fluid governed by the Navier–Stokes equations, interacting with a flexible elastic plate located on one part of the fluid boundary. The fluid domain evolves according to the structure’s displacement, itself resulting from the fluid force. We prove the existence of at least one weak solution as long as the structure does not touch the fixed part of the fluid boundary. The same result also holds for a two-dimensional fluid interacting with a one-dimensional membrane.

214 citations


Journal ArticleDOI
TL;DR: The ability of the algorithm to track intermittent objects both in sequences of synthetic data and in experimental measurements obtained with individual QD-tagged receptors in the membrane of live neurons is demonstrated.
Abstract: Semiconductor quantum dots (QDs) are new fluorescent probes with great promise for ultrasensitive biological imaging. When detected at the single-molecule level, QD-tagged molecules can be observed and tracked in the membrane of live cells over unprecedented durations. The motion of these individual molecules, recorded in sequences of fluorescence images, can reveal aspects of the dynamics of cellular processes that remain hidden in conventional ensemble imaging. Due to the complex optical properties of QDs, such as fluorescence intermittency, the quantitative analysis of these sequences is, however, challenging and requires advanced algorithms. We present here a novel approach, which, instead of a frame-by-frame analysis, is based on perceptual grouping in a spatiotemporal volume. By applying a detection process based on an image fluorescence model, we first obtain an unstructured set of points. Individual molecular trajectories are then considered as minimal paths in a Riemannian metric derived from the fluorescence image stack. These paths are computed with a variant of the fast marching method, and only a few parameters are required. We demonstrate the ability of our algorithm to track intermittent objects both in sequences of synthetic data and in experimental measurements obtained with individual QD-tagged receptors in the membrane of live neurons. While developed for tracking QDs, this method can be used with any fluorescent probe.

161 citations
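The minimal-path idea can be illustrated in a discrete setting: in a 2D stack I[t, x] (time × position), give each pixel a cost that is low where fluorescence is high, and find the path that advances one frame at a time (with |Δx| ≤ 1) at least total cost by dynamic programming. This is a toy analogue of the paper's fast-marching computation in a Riemannian metric, not the authors' algorithm; all names and the cost function are illustrative assumptions.

```python
import numpy as np

def track_min_path(stack):
    """Minimum-cost trajectory through a (T, X) image stack.
    Cost is low where intensity is high, so the optimal path follows the
    bright spot, bridging frames where the emitter blinks off."""
    cost = 1.0 / (1.0 + stack)           # bright pixels become cheap pixels
    T, X = cost.shape
    dp = np.full((T, X), np.inf)
    arg = np.zeros((T, X), dtype=int)    # predecessor position per pixel
    dp[0] = cost[0]
    for t in range(1, T):
        for x in range(X):
            lo, hi = max(0, x - 1), min(X, x + 2)   # moves with |dx| <= 1
            best = lo + int(np.argmin(dp[t - 1, lo:hi]))
            dp[t, x] = cost[t, x] + dp[t - 1, best]
            arg[t, x] = best
    # backtrack from the cheapest endpoint in the last frame
    path = [int(np.argmin(dp[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(arg[t, path[-1]]))
    return path[::-1]

# synthetic blinking emitter moving along the diagonal x = t
T = X = 12
stack = np.zeros((T, X))
for t in range(T):
    if t != 6:                            # frame 6: the emitter "blinks" off
        stack[t, t] = 10.0
path = track_min_path(stack)
```

Even with the dark frame at t = 6, the recovered path stays on the diagonal, which is the behaviour that makes perceptual grouping robust to intermittency.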


Journal ArticleDOI
TL;DR: In this article, the authors characterize the calibrability of bounded convex sets in Open image in new window by the mean curvature of its boundary, extending the known analogous result in dimension 2.
Abstract: The main purpose of this paper is to characterize the calibrability of bounded convex sets in Open image in new window by the mean curvature of its boundary, extending the known analogous result in dimension 2. As a by-product of our analysis we prove that any bounded convex set C of class C1,1 has a convex calibrable set K in its interior, and and for any volume V ∈ [|K|,|C|] the solution of the perimeter minimizing problem with fixed volume V in the class of sets contained in C is a convex set. As a consequence we describe the evolution of convex sets in Open image in new window by the minimizing total variation flow.

149 citations


Journal ArticleDOI
TL;DR: In this paper, the authors compare the model values of derivatives on quadratic variation for the two types of models and find substantial differences in the value of derivatives for both models.
Abstract: Models which hypothesize that returns are pure jump processes with independent increments have been shown to be capable of capturing the observed variation of market prices of vanilla stock options across strike and maturity. In this paper, these models are employed to derive in closed form the prices of derivatives written on future realized quadratic variation. Other work on pricing derivatives on quadratic variation has instead assumed that the underlying returns process is continuous over time. We compare the model values of derivatives on quadratic variation for the two types of models and find substantial differences.

119 citations
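For a discretely sampled price path, the realized quadratic variation such derivatives are written on is simply the sum of squared log-returns over the observation dates. A minimal illustration follows; the variance-swap payoff convention and all names are assumptions for the sketch, not taken from the paper.

```python
import math

def realized_quadratic_variation(prices):
    """Sum of squared log-returns over consecutive observation dates."""
    logs = [math.log(p) for p in prices]
    return sum((b - a) ** 2 for a, b in zip(logs, logs[1:]))

def variance_swap_payoff(prices, strike, notional=1.0):
    """Toy payoff linear in realized QV: notional * (realized QV - strike)."""
    return notional * (realized_quadratic_variation(prices) - strike)

prices = [100.0, 101.0, 99.5, 100.5]
rqv = realized_quadratic_variation(prices)
```

The modelling question the paper addresses is how the *distribution* of this quantity, and hence the value of claims on it, differs between pure-jump and continuous return processes.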


Journal ArticleDOI
01 Dec 2005
TL;DR: A generic query language is defined to retrieve objects that match mobility patterns describing a sequence of moves; a subset of restrictions to this language expresses only deterministic queries, and evaluation techniques to incrementally maintain query results are discussed.
Abstract: We present a data model for tracking mobile objects and reporting the result of queries. The model relies on a discrete view of the spatio-temporal space, where the 2D space and the time axis are respectively partitioned in a finite set of user-defined areas and in constant-size intervals. We define a generic query language to retrieve objects that match mobility patterns describing a sequence of moves. We also identify a subset of restrictions to this language in order to express only deterministic queries for which we discuss evaluation techniques to maintain incrementally the result of queries. The model is conceptually simple, efficient, and constitutes a practical and effective solution to the problem of continuously tracking moving objects with sequence queries.

103 citations
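The flavour of such pattern queries can be sketched as follows: collapse an object's sampled positions into its sequence of visited zones, then test whether a pattern of moves occurs as a contiguous run. This toy sketch ignores the paper's constant-size time intervals and incremental evaluation; zone names and function names are illustrative.

```python
def visited_zones(samples):
    """Collapse consecutive duplicates: the object's sequence of moves."""
    zones = []
    for z in samples:
        if not zones or zones[-1] != z:
            zones.append(z)
    return zones

def matches_pattern(samples, pattern):
    """Does the trajectory contain the pattern as a contiguous run of moves?"""
    zones = visited_zones(samples)
    n, m = len(zones), len(pattern)
    return any(zones[i:i + m] == pattern for i in range(n - m + 1))

# positions sampled at successive time intervals, one user-defined zone each
traj = ["home", "home", "road", "office", "office", "road", "home"]
```

An incremental evaluator would update `visited_zones` and partial pattern matches as each new sample arrives, instead of rescanning the whole trajectory.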


Posted Content
TL;DR: In this paper, it was shown that weakly upper semicontinuous concave Schur concave functions are the infimum of nonnegative affine combinations of Choquet integrals with respect to a convex continuous distortion of the underlying probability.
Abstract: A representation result is provided for concave Schur concave functions on L∞(Ω). In particular, it is proven that any monotone concave Schur concave weakly upper semicontinuous function is the infimum of a family of nonnegative affine combinations of Choquet integrals with respect to a convex continuous distortion of the underlying probability. The method of proof is based on the concave Fenchel transform and on Hardy and Littlewood's inequality. Under the assumption that the probability space is nonatomic, concave, weakly upper semicontinuous, law-invariant functions are shown to coincide with weakly upper semicontinuous concave Schur concave functions. A representation result is, thus, obtained for weakly upper semicontinuous concave law-invariant functions.

Journal ArticleDOI
TL;DR: This paper investigates the complexity of the min-max and min-max regret assignment problems, both in the discrete scenario and interval data cases, and shows that these problems are strongly NP-hard for an unbounded number of scenarios.

Journal ArticleDOI
TL;DR: In this paper, the impact of regulation on efforts to minimize costs has been widely discussed, but remains difficult to measure, since different firms can be under different regulatory regimes (price cap or rate-of-return) in the same geographical area at the same time.
Abstract: The impact of regulation on efforts to minimize costs has been widely discussed, but remains difficult to measure. The sophisticated regulation of water utilities in Wisconsin allows us to attempt such an assessment, since different firms can be under different regulatory regimes (price cap or rate-of-return) in the same geographical area at the same time. To measure the impact of regulation on efficiency, we use a stochastic cost frontier approach defining the unobservable efficiency of a water utility as a function of exogenous variables. Using a panel of 211 water utilities observed from 1998 to 2000, we show that their efficiency scores can be partly explained by the regulatory framework.

Journal ArticleDOI
TL;DR: In this paper, a new class of modified logarithmic Sobolev inequalities, interpolating between the Poincaré and logarithmic Sobolev inequalities, is presented, suitable for measures of the type $\exp(-|x|^\alpha)$ or $\exp(-|x|^\alpha\log^\beta(2+|x|))$, leading to new concentration inequalities.
Abstract: We present a new class of modified logarithmic Sobolev inequalities, interpolating between the Poincaré and logarithmic Sobolev inequalities, suitable for measures of the type $\exp(-|x|^\alpha)$ or $\exp(-|x|^\alpha\log^\beta(2+|x|))$ ($\alpha\in]1,2[$ and $\beta\in\mathbb{R}$), which lead to new concentration inequalities. These modified inequalities share common properties with the usual logarithmic Sobolev inequalities, such as tensorisation or perturbation, and also imply the Poincaré inequality. We also study the link between these new modified logarithmic Sobolev inequalities and transportation inequalities.

Journal ArticleDOI
TL;DR: In this paper, the authors proved the existence of a global minimizer of the Bogoliubov-Dirac-Fock model for any cut-off Λ and without any constraint on the external field.
Abstract: We study the Bogoliubov–Dirac–Fock model introduced by Chaix and Iracane (1989 J. Phys. B: At. Mol. Opt. Phys. 22 3791–814) which is a mean-field theory deduced from no-photon QED. The associated functional is bounded from below. In the presence of an external field, a minimizer, if it exists, is interpreted as the polarized vacuum and it solves a self-consistent equation. In a recent paper, we proved the convergence of the iterative fixed-point scheme naturally associated with this equation to a global minimizer of the BDF functional, under some restrictive conditions on the external potential, the ultraviolet cut-off Λ and the bare fine structure constant α. In the present work, we improve this result by showing the existence of the minimizer by a variational method, for any cut-off Λ and without any constraint on the external field. We also study the behaviour of the minimizer as Λ goes to infinity and show that the theory is 'nullified' in that limit, as predicted first by Landau: the vacuum totally cancels the external potential. Therefore, the limit case of an infinite cut-off makes no sense both from a physical and mathematical point of view. Finally, we perform a charge and density renormalization scheme applying simultaneously to all orders of the fine structure constant α, on a simplified model where the exchange term is neglected.

Journal ArticleDOI
TL;DR: In this article, the problem of optimal portfolios of zero-coupon bonds is solved for general utility functions, under a condition of no-arbitrage in the zero coupon market, and a mutual fund theorem is proved in the case of deterministic volatilities.
Abstract: We introduce a bond portfolio management theory based on foundations similar to those of stock portfolio management. A general continuous-time zero-coupon market is considered. The problem of optimal portfolios of zero-coupon bonds is solved for general utility functions, under a condition of no-arbitrage in the zero-coupon market. A mutual fund theorem is proved, in the case of deterministic volatilities. Explicit expressions are given for the optimal solutions for several utility functions.


Journal ArticleDOI
TL;DR: In this article, the authors study the control by electromagnetic fields of molecular alignment and orientation in a linear, rigid-rotor model and find that the optimal field is in the microwave part of the spectrum and acts by resonantly exciting the rotation of the molecule progressively from the ground state, i.e., by rotational ladder climbing.
Abstract: We study the control by electromagnetic fields of molecular alignment and orientation in a linear, rigid-rotor model. With the help of a monotonically convergent algorithm, we find that the optimal field is in the microwave part of the spectrum and acts by resonantly exciting the rotation of the molecule progressively from the ground state, i.e., by rotational ladder climbing. This mechanism is present not only when maximizing orientation or alignment, but also when using prescribed target states that simultaneously optimize the efficiency of orientation/alignment and its duration. The extension of the optimization method to consider a finite rotational temperature is also presented.

Journal ArticleDOI
TL;DR: In this paper, a high-availability scalable distributed data structure (LH*RS) is proposed, in which the value of k transparently grows with the file to offset the reliability decline, and only the number of storage nodes potentially limits the file growth.
Abstract: LH*RS is a high-availability scalable distributed data structure (SDDS). An LH*RS file is hash partitioned over the distributed RAM of a multicomputer, for example, a network of PCs, and supports the unavailability of any k ≥ 1 of its server nodes. The value of k transparently grows with the file to offset the reliability decline. Only the number of the storage nodes potentially limits the file growth. The high-availability management uses a novel parity calculus that we have developed, based on Reed-Solomon erasure correcting coding. The resulting parity storage overhead is about the lowest possible. The parity encoding and decoding are faster than for any other candidate coding we are aware of. We present our scheme and its performance analysis, including experiments with a prototype implementation on Wintel PCs. The capabilities of LH*RS offer new perspectives to data-intensive applications, including the emerging ones of grids and of P2P computing.
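The parity idea can be illustrated in its simplest k = 1 form, where a single XOR parity bucket lets a record group survive the loss of any one bucket. The paper's scheme uses a Reed-Solomon-based parity calculus to tolerate k losses; the sketch below is not that scheme, and all names are illustrative.

```python
def xor_parity(buckets):
    """Byte-wise XOR of equal-length data buckets (k = 1 parity)."""
    parity = bytearray(len(buckets[0]))
    for b in buckets:
        for i, byte in enumerate(b):
            parity[i] ^= byte
    return bytes(parity)

def recover(surviving, parity):
    """Rebuild the single lost bucket: XOR of the survivors and the parity."""
    return xor_parity(surviving + [parity])

group = [b"alpha", b"bravo", b"charl"]   # one record group across 3 buckets
p = xor_parity(group)                    # stored on a parity server
rebuilt = recover([group[0], group[2]], p)   # bucket 1 was lost
```

Generalizing to k > 1 simultaneous losses is exactly what requires an erasure-correcting code such as Reed-Solomon instead of plain XOR.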

Journal ArticleDOI
01 Nov 2005
TL;DR: The model allows the synthesis, evaluation and update of available information in order to help human regulators in their monitoring task; a multi-agent representation of an incident is also proposed, allowing the integration of disturbance processing within the activity of a network system.
Abstract: This paper presents the modeling of a disturbance on a public transportation line. The proposed model allows the synthesis, evaluation and update of available information in order to help human regulators in their monitoring task. It begins with a formal modeling of the disturbance concept. This modeling makes it possible to capitalize the knowledge available within a monitoring station and to follow up the evolution of the disturbances in real time. The paper goes on to propose a multi-agent representation of an incident allowing the integration of the disturbance processing within the activity of a network system.

Journal ArticleDOI
TL;DR: In this paper, the authors investigate the relationship between strategic alliances and relational risk in French biotechnology firms; strategic alliances are often described as risky, dangerous, and unstable.
Abstract: Purpose – Strategic alliances are often described as risky, dangerous, and unstable. When firms adopt these strategies, they are confronted with a relational risk. Nevertheless, little empirical work has been done on relational risk in alliances. For this reason, this research is built around two principal questions: what is relational risk? And how is this risk to be managed? Design/methodology/approach – From a methodological point of view, neither one paradigm nor the other concerning previous research was favoured. The process of the empirical research is based on an inductive non-demonstrative step. It was carried out in two phases. Firstly, exploratory research was aimed at complementing previous research and formulating hypotheses. These hypotheses were tested with survey data on 87 partnerships of French biotechnology firms. Findings – The results demonstrate the multidimensional character of relational risk and the duality of relational control. Relational control includes autonomous ...

Journal ArticleDOI
TL;DR: This paper proves the existence of natural Poly-APX- and Poly-DAPX-complete problems under the well-known PTAS-reduction and under the DPTAS-reduction, introduces approximation-preserving reductions, called FT and DFT, respectively, and proves that, under these new reductions, natural problems are PTAS- or DPTAS-complete.

Journal ArticleDOI
TL;DR: The C–K design theory is extended, using ideas and principles from situated cognition, and the new version provides a theoretical background for building personal design assistants—creative and adaptive design tools.
Abstract: The C–K theory is a recent theory of reasoning in design. Despite many practical applications, the theory has not yet been operationalized in the form of a computational design tool. In this paper, we argue that, in order to build such tools, a third space—an environment space E—must be introduced to the theory. Therefore, we extend the C–K design theory, using ideas and principles from situated cognition. As we discuss, the new version provides a theoretical background for building personal design assistants—creative and adaptive design tools. Awarded Best Young Design Researcher paper at the Design Society's Design 2004 conference in Dubrovnik, as judged by a panel that included Professors Marjanovic, Birkhofer and Andreasen.

Journal ArticleDOI
TL;DR: It is shown that, within a general framework for conjoint measurement that allows for intransitive preferences, the binary relations that can be obtained using comparison of coalitions of attributes are mainly characterized by the very rough differentiation of preference differences that they induce on each attribute.

Journal ArticleDOI
TL;DR: In this paper, the existence of the mobility coefficient for a tagged particle in a simple symmetric exclusion process with adsorption/desorption of particles, in the presence of an external force field interacting with the particle, was rigorously established using a perturbative argument.
Abstract: In this paper we rigorously establish the existence of the mobility coefficient for a tagged particle in a simple symmetric exclusion process with adsorption/desorption of particles, in the presence of an external force field interacting with the particle. The proof is obtained using a perturbative argument. In addition, we show that, for a constant external field, the mobility of a particle equals the self-diffusion coefficient (the so-called Einstein relation). The method can be applied to any system where the environment has a Markovian evolution with a fast convergence to equilibrium (spectral gap property). In this context we find a necessary relation between forward and backward velocity for the validity of the Einstein relation. This relation is always satisfied by reversible systems. We provide an example of a non-reversible system where the Einstein relation is valid.

Journal ArticleDOI
TL;DR: The complexity of a multilateral negotiation framework is studied, in which autonomous agents agree on a sequence of deals to exchange sets of discrete resources, both to further their own goals and to achieve a socially optimal distribution of resources; several distinct aspects of complexity are distinguished.
Abstract: We study the complexity of a multilateral negotiation framework, where autonomous agents agree on a sequence of deals to exchange sets of discrete resources in order to both further their own goals and to achieve a distribution of resources that is socially optimal. When analysing such a framework, we can distinguish different aspects of complexity: How many deals are required to reach an optimal allocation of resources? How many communicative exchanges are required to agree on one such deal? How complex a communication language do we require? And finally, how complex is the reasoning task faced by each agent?
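The first of these questions, how many deals are needed, can be made concrete with a toy protocol: agents pass single resources between them whenever the deal raises total (utilitarian) welfare, until no such deal remains. This is an illustrative sketch, not the paper's framework; the additive utilities and all names are assumptions.

```python
def improve_once(allocation, utility):
    """Try every one-resource deal; perform the first that raises welfare."""
    for giver, items in allocation.items():
        for item in list(items):
            for taker in allocation:
                if taker == giver:
                    continue
                if utility[taker][item] > utility[giver][item]:
                    items.remove(item)          # the deal: item moves over
                    allocation[taker].add(item)
                    return True
    return False

def negotiate(allocation, utility):
    """Repeat welfare-improving one-resource deals until none is left;
    return how many deals were needed."""
    deals = 0
    while improve_once(allocation, utility):
        deals += 1
    return deals

utility = {"a": {"x": 3, "y": 1}, "b": {"x": 1, "y": 5}}
allocation = {"a": {"y"}, "b": {"x"}}
n_deals = negotiate(allocation, utility)
```

Counting the iterations of this loop is the simplest instance of the paper's first complexity question; the other questions concern the communication needed to agree on each such deal.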

Journal ArticleDOI
01 Aug 2005
TL;DR: Since the mid-1970s, the New Economic Sociology has been led to consider closely the functioning of markets, a domain until then almost exclusively reserved to the economic discipline.
Abstract: Since the mid-seventies, the "New Economic Sociology" has been led to look closely at the functioning of markets, a domain until then almost exclusively reserved to the economic discipline since, despite the importance of this social fact (Orlean 2005), sociologists did not feel they had sufficient legitimacy to intervene in this area. Alongside the dynamic specific to the United States (Guillen 2002; Swedberg 2003), there is also in Europe a marked interest in the sociological approach...

Book ChapterDOI
12 Oct 2005
TL;DR: This paper considers a weighted version of the coloring problem, which consists in finding a partition of the vertex set V of G into stable sets $(S_1,\dots,S_k)$ and minimizing $\sum_{i=1}^{k} w(S_i)$, where the weight of a stable set S is defined as max{w(v) : v ∈ S}.
Abstract: Given a vertex-weighted graph G = (V,E;w), w(v) ≥ 0 for any v ∈ V, we consider a weighted version of the coloring problem which consists in finding a partition ${\mathcal S}=(S_{1},\dots,S_{k})$ of the vertex set V of G into stable sets and minimizing $\sum_{i=1}^{k} w(S_i)$, where the weight of a stable set S is defined as max{w(v) : v ∈ S}. In this paper, we continue the investigation of the complexity and the approximability of this problem, mainly by answering one of the questions raised by D. J. Guan and X. Zhu ("A Coloring Problem for Weighted Graphs", Inf. Process. Lett. 61(2):77-81, 1997).
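The objective above is easy to state in code: a short sketch (illustrative names, not from the paper) that checks each class is a stable set against the edge list and evaluates the weighted coloring objective, the sum over classes of their heaviest vertex.

```python
def is_stable(s, edges):
    """A vertex set is stable if no edge joins two of its vertices."""
    return not any(u in s and v in s for u, v in edges)

def partition_weight(partition, w, edges):
    """Weighted coloring objective sum_i max{w(v) : v in S_i};
    rejects partitions whose classes are not stable sets."""
    for s in partition:
        if not is_stable(s, edges):
            raise ValueError("class is not a stable set")
    return sum(max(w[v] for v in s) for s in partition)

# path a - b - c with weights 3, 2, 1
edges = [("a", "b"), ("b", "c")]
w = {"a": 3, "b": 2, "c": 1}
cost = partition_weight([{"a", "c"}, {"b"}], w, edges)
```

Note the contrast with ordinary coloring: here {a, c} and {b} costs 3 + 2 = 5, less than the three singleton classes at 3 + 2 + 1 = 6, so minimizing the number of classes and minimizing the weight are different problems.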

Journal ArticleDOI
TL;DR: This paper deals with both the complexity and the approximability of the labeled perfect matching problem in bipartite graphs.