
Showing papers by "Paris Dauphine University" published in 1996


Journal ArticleDOI
TL;DR: In this article, the authors trace the emergence of the European School of Multi-criteria Decision Analysis (MCDA) and provide a general review of the current major research topics developed within this framework.
Abstract: Multiple-criteria decision analysis has evolved considerably since its birth during the 1960s. As part of this evolution, several schools of thought have developed emphasizing different techniques and, more generally, different attitudes as to the way of supporting or aiding decision making. One of these schools is now commonly referred to as the ‘European School’, its members being part of a European Working Group entitled ‘Multicriteria Aid for Decisions’. In the first part of this paper (Section 1) we follow a historical perspective in order to trace the emergence of the European School. Its distinctive features and main ideas are then outlined in Section 2. Finally we provide a general review of the current major research topics developed within this framework (Section 3).

274 citations


Journal ArticleDOI
TL;DR: In this paper, the authors characterize the so-called Black and Scholes implied volatility as a function of two arguments: the ratio of the strike to the underlying asset price and the instantaneous value of the volatility.
Abstract: In the stochastic volatility framework of Hull and White (1987), we characterize the so-called Black and Scholes implied volatility as a function of two arguments: the ratio of the strike to the underlying asset price and the instantaneous value of the volatility. By studying the variation in the first argument, we show that the usual hedging methods, through the Black and Scholes model, lead to an underhedged (resp. overhedged) position for in-the-money (resp. out-of-the-money) options, and a perfect partial hedging position for at-the-money options. These results are shown to be closely related to the smile effect, which is proved to be a natural consequence of the stochastic volatility feature. The deterministic dependence of the implied volatility on the underlying volatility process suggests the use of implied volatility data for the estimation of the parameters of interest. A statistical procedure of filtering (of the latent volatility process) and estimation (of its parameters) is shown to be strongly consistent and asymptotically normal.
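Since the abstract reasons about the Black and Scholes implied volatility as a function of moneyness, a minimal numerical sketch may help fix ideas. The code below is a standard textbook inversion of the Black-Scholes call formula by bisection, with made-up parameter values; it is our illustration, not the paper's filtering or estimation procedure for the Hull and White model.

```python
# Minimal sketch: Black-Scholes implied volatility recovered from a price.
# Illustrative only; standard textbook computation, not the paper's method.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Invert the Black-Scholes formula by bisection (price is increasing in sigma)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Usage: recover the volatility that generated a price.
p = bs_call(100.0, 110.0, 0.5, 0.03, 0.25)
print(implied_vol(p, 100.0, 110.0, 0.5, 0.03))  # approximately 0.25
```

Computing the implied volatility across strikes for prices generated under a stochastic volatility model, instead of a constant one, reproduces the smile shape discussed in the abstract.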

262 citations


Journal ArticleDOI
TL;DR: In this paper, the authors define a theoretical framework to analyze the notion of importance of criteria under very general conditions and show that the importance of the criteria is taken into account in very different ways in various aggregation procedures.
Abstract: Multiple-criteria decision aid almost always requires the use of weights, importance coefficients or even a hierarchy of criteria, veto thresholds, etc. These are importance parameters that are used to differentiate the role devoted to each criterion in the construction of comprehensive preferences. Many researchers have studied the problem of how to assign values to such parameters, but few of them have tried to analyse in detail what underlies the notion of importance of criteria and to give a clear formal definition of it. In this paper our purpose is to define a theoretical framework so as to analyse the notion of the importance of criteria under very general conditions. Within this framework it clearly appears that the importance of criteria is taken into account in very different ways in various aggregation procedures. This framework also allows us to shed new light on fundamental questions such as: Under what conditions is it possible to state that one criterion is more important than another? Are importance parameters of the various aggregation procedures dependent on or independent of the encoding of criteria? What are the links between the two concepts of the importance of criteria and the compensatoriness of preferences? This theoretical framework seems to us sufficiently general to ground further research in order to define theoretically valid elicitation methods for importance parameters.
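One of the questions listed in the abstract, whether importance parameters depend on the encoding of criteria, can be illustrated with the simplest aggregation procedure, the weighted sum: rescaling one criterion changes the ranking unless the weights are adjusted accordingly. The sketch below is a toy illustration of ours, with hypothetical numbers, not an example taken from the paper.

```python
# Toy illustration of encoding-dependence: with a weighted sum, re-encoding
# one criterion on a different scale changes the aggregate ranking unless
# the weights are rescaled as well.
def weighted_sum(scores, weights):
    return sum(w * s for w, s in zip(weights, scores))

weights = [0.5, 0.5]
a = [10.0, 0.2]   # alternative a: (criterion 1, criterion 2)
b = [8.0, 0.9]

print(weighted_sum(a, weights) > weighted_sum(b, weights))  # True: a preferred

# Re-encode criterion 1 on a 0-1 scale (divide by 10), same weights:
a2, b2 = [1.0, 0.2], [0.8, 0.9]
print(weighted_sum(a2, weights) > weighted_sum(b2, weights))  # False: b preferred
```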

158 citations


Journal ArticleDOI
TL;DR: In this paper, the existence of soliton-like solutions of the Maxwell-Dirac system in (3+1)-Minkowski space-time has been proved by a variational method, as critical points of an energy functional.
Abstract: The Maxwell-Dirac system describes the interaction of an electron with its own electromagnetic field. We prove the existence of soliton-like solutions of Maxwell-Dirac in (3+1)-Minkowski space-time. The solutions obtained are regular, stationary in time, and localized in space. They are found by a variational method, as critical points of an energy functional. This functional is strongly indefinite and presents a lack of compactness. We also find soliton-like solutions for the Klein-Gordon-Dirac system, arising in the Yukawa model.

124 citations


Journal ArticleDOI
TL;DR: In this article, a coupled Navier-Stokes/Boltzmann solver was proposed for the calculation of hypersonic rarefied flows around manoeuvring vehicles.

88 citations


Journal ArticleDOI
TL;DR: In this article, explicit finite difference schemes are given for a collection of parabolic equations which may have all of the following complex features: degeneracy, quasilinearity, full nonlinearity, and singularities.
Abstract: Explicit finite difference schemes are given for a collection of parabolic equations which may have all of the following complex features: degeneracy, quasilinearity, full nonlinearity, and singularities. In particular, the equation of “motion by mean curvature” is included. The schemes are monotone and consistent, so that convergence is guaranteed by the general theory of approximation of viscosity solutions of fully nonlinear problems. In addition, an intriguing new type of nonlocal problem is analyzed which is related to the schemes, and another very different sort of approximation is presented as well.
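As an illustration of the kind of equation covered, the level-set formulation of motion by mean curvature can be advanced with an explicit update of the following form. This sketch uses plain central differences with a regularised denominator; it is not the monotone scheme constructed in the paper, and the grid size and time step are arbitrary choices of ours.

```python
# Explicit time-stepping sketch for level-set motion by mean curvature:
#   u_t = (u_xx u_y^2 - 2 u_x u_y u_xy + u_yy u_x^2) / (u_x^2 + u_y^2)
# with a small epsilon regularising the denominator. Illustrative only.
import numpy as np

def mcm_step(u, dt, h, eps=1e-8):
    ux  = (np.roll(u, -1, 1) - np.roll(u, 1, 1)) / (2 * h)
    uy  = (np.roll(u, -1, 0) - np.roll(u, 1, 0)) / (2 * h)
    uxx = (np.roll(u, -1, 1) - 2 * u + np.roll(u, 1, 1)) / h**2
    uyy = (np.roll(u, -1, 0) - 2 * u + np.roll(u, 1, 0)) / h**2
    uxy = (np.roll(np.roll(u, -1, 1), -1, 0) - np.roll(np.roll(u, -1, 1), 1, 0)
           - np.roll(np.roll(u, 1, 1), -1, 0) + np.roll(np.roll(u, 1, 1), 1, 0)) / (4 * h**2)
    ut = (uxx * uy**2 - 2 * ux * uy * uxy + uyy * ux**2) / (ux**2 + uy**2 + eps)
    return u + dt * ut

# Usage: a circle shrinks under curvature flow (level set {u = 0}).
n, h = 128, 1.0 / 128
x = np.linspace(0, 1, n)
X, Y = np.meshgrid(x, x)
u = np.sqrt((X - 0.5)**2 + (Y - 0.5)**2) - 0.3   # signed distance to a circle
for _ in range(100):
    u = mcm_step(u, dt=0.2 * h**2, h=h)
```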

85 citations


Journal ArticleDOI
TL;DR: In this paper, a polynomial approximation theory linked to combinatorial optimization is defined, and a notion of equivalence among optimization problems is introduced which covers, for instance, translations or affine transformations of the objective function, as well as certain equivalences between maximization and minimization problems.
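The sensitivity of approximation ratios to affine transformations of the objective, which such an equivalence notion is designed to handle, can be seen on a one-line numerical example (ours, not the paper's): a solution of value 50 against an optimum of 100 looks better once the objective is shifted by a constant.

```latex
\[
\frac{v}{\mathrm{opt}} = \frac{50}{100} = \frac12,
\qquad
\frac{v + c}{\mathrm{opt} + c} = \frac{150}{200} = \frac34 \quad (c = 100),
\]
```

so an additive shift of the objective changes the apparent guarantee even though the algorithm and the instances are unchanged.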

70 citations


01 Jan 1996
TL;DR: An approach to the interleaving of execution and planning which is based on the RPN semantics is provided and it is shown how this approach can be used to coordinate agents' plans in a shared and dynamic environment.
Abstract: Distributed planning is fundamental to the generation of cooperative activities in Multi-Agent Systems. It requires both an adequate plan representation and efficient interacting methods allowing agents to coordinate their plans. This paper proposes a recursive model for the representation and the handling of plans by means of Recursive Petri Nets (RPN), which support the specification of concurrent activities, reasoning about simultaneous actions and continuous processes, a theory of verification and mechanisms of transformation (e.g. abstraction, refinement, merging). The main features of the RPN formalism are domain independence, broad coverage of interacting situations and operational coordination. This paper also provides an approach to the interleaving of execution and planning which is based on the RPN semantics and gives some significant methods allowing plan management in distributed planning. It goes on to show how this approach can be used to coordinate agents' plans in a shared and dynamic environment.

70 citations


Journal ArticleDOI
TL;DR: A cost-benefit study of influenza vaccination for the employed adult population showed that vaccination is a cost-saving strategy, although this was also contingent upon the problems associated with measuring indirect benefits as well as the effectiveness rate of vaccination in real conditions.
Abstract: The cost-of-illness studies of influenza performed in France for the years 1985 and 1989 have shown that the major economic consideration is the respective sizes of indirect and direct costs. Depending on the point of view (from the perspective of National Health Insurance or the societal perspective) and the method used for measuring indirect costs, it was estimated that they could be between 1.5 and 9 times higher than direct costs. A cost-benefit study of influenza vaccination for the employed adult population showed that vaccination is a cost-saving strategy, although this was also contingent upon the problems associated with measuring indirect benefits as well as the effectiveness rate of vaccination in real conditions.

54 citations


Journal ArticleDOI
TL;DR: In this article, the authors use the method of characteristics to extend the Jacobi conjugate points theory to the Bolza problem arising in nonlinear optimal control, which yields necessary and sufficient optimality conditions for weak and strong local minima stated in terms of the existence of a solution to a corresponding matrix Riccati differential equation.
Abstract: In this paper the authors use the method of characteristics to extend the Jacobi conjugate points theory to the Bolza problem arising in nonlinear optimal control. This yields necessary and sufficient optimality conditions for weak and strong local minima, stated in terms of the existence of a solution to a corresponding matrix Riccati differential equation. The same approach also makes it possible to investigate the smoothness of the value function.
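For reference, in the standard linear-quadratic (accessory problem) setting the matrix Riccati differential equation takes the form below, where A, B denote the matrices of the linearised dynamics and Q, R those of the second-order expansion of the cost. This is the textbook form, stated here only to fix notation, and not necessarily the exact equation derived in the paper.

```latex
\[
\dot{P}(t) + P(t)A(t) + A(t)^{\mathsf T}P(t)
  - P(t)B(t)R(t)^{-1}B(t)^{\mathsf T}P(t) + Q(t) = 0 .
\]
```

Existence of a solution P on the whole time interval, that is, absence of a finite-time blow-up, corresponds to the absence of conjugate points.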

52 citations


Book ChapterDOI
01 Jan 1996
TL;DR: A mathematical and computational model is deduced to detect the “atoms” of the image: level lines joining T- or X-junctions, and an adequate modification of morphological filtering algorithms is proposed so that they smooth the “atoms” without altering the junctions.
Abstract: Based on the phenomenological description of Gaetano Kanizsa, we discuss the physical generation process of images as a combination of basic operations: occlusions, transparencies and contrast changes. These operations generate the essential singularities, which we call junctions. We deduce a mathematical and computational model to detect the “atoms” of the image: level lines joining T- or X-junctions. Then we propose the adequate modification of morphological filtering algorithms so that they smooth the “atoms” without altering the junctions. Finally, we give some experiments on real and synthetic images.

Proceedings ArticleDOI
19 Jun 1996
TL;DR: Several schemes for mirroring LH* files are presented, and the authors incorporate this technique into the LH* algorithms for scalable distributed linear hash files.
Abstract: Mirroring is a popular technique for enhancing file availability. The authors incorporate this technique into the LH* algorithms for scalable distributed linear hash files. Several schemes for mirroring LH* files are presented in this paper. The schemes increase the availability of LH* files in the presence of node failures. Every record remains accessible in the presence of a single node failure, and usually in the presence of multiple-node failures. The price is, as usual, twice as much storage for data, and an increase in the number of messages. The different schemes are characterized by different trade-offs, and they accommodate diverse application requirements. The additional messaging cost per insert is about the same for all the schemes, and is roughly only one message. The cost of a bucket recovery may in contrast vary greatly, from one message for one type of scheme, to a few for another, and many for yet another.
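To make the addressing side concrete, the classical linear-hashing rule that LH* builds on can be sketched as follows, together with a naive independent placement for a mirror copy. This is our illustration of the general idea only; it does not reproduce any of the paper's specific mirroring schemes.

```python
# Sketch of linear-hashing (LH) addressing with a naive mirror copy.
# Classical LH rule: apply the level-i hash; if the result falls below the
# split pointer, that bucket has already been split, so apply the
# level-(i+1) hash instead. The "mirror" is simply the same record placed
# under an independent hash on a second set of buckets (illustration only).
from zlib import crc32

def h(key, extra=b""):
    return crc32(key.encode() + extra)

def lh_address(key, level, split_pointer):
    a = h(key) % (2 ** level)
    if a < split_pointer:
        a = h(key) % (2 ** (level + 1))
    return a

def mirror_address(key, n_mirror_buckets):
    return h(key, b"|mirror") % n_mirror_buckets

key = "record-42"
primary = lh_address(key, level=3, split_pointer=2)
backup  = mirror_address(key, n_mirror_buckets=16)
print(primary, backup)
```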

Journal ArticleDOI
TL;DR: In this article, the authors characterized the epigraph of the value function of a discounted infinite-horizon optimal control problem as the viability kernel of an auxiliary differential inclusion; the algorithm applied to this problem then provides the value functions of the discretized optimal control problems as the supremum of a nondecreasing sequence of iteratively defined functions.
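For concreteness, one standard way of stating such a characterisation (our formulation, assuming a nonnegative running cost ℓ and discount rate λ > 0; the paper's exact auxiliary inclusion may differ) is the following: for V(x₀) defined as the infimum over admissible controls of the discounted cost ∫₀^∞ e^(−λt) ℓ(x(t), u(t)) dt with dynamics x′ = f(x, u), the epigraph of V is the viability kernel of the constraint set {(x, y) : y ≥ 0} for the extended dynamics

```latex
\[
x'(t) = f\big(x(t), u(t)\big), \qquad
y'(t) = \lambda\, y(t) - \ell\big(x(t), u(t)\big),
\]
```

that is, (x₀, y₀) belongs to the epigraph of V exactly when some trajectory of the extended system starting from (x₀, y₀) keeps y(t) ≥ 0 for all t ≥ 0.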

Journal ArticleDOI
TL;DR: In this paper, the authors describe an empirical assessment of several hypotheses associated with organization design in the context of new and flexible technologies, which illustrate how workplaces are being organized within a context of flexible, new technology, and the movement toward an emerging new paradigm of work.
Abstract: This paper describes an empirical assessment of several hypotheses associated with organization design in the context of new and flexible technologies. Three distinct technologies were examined in four different geographical locations in Sweden, France, and Canada. The hypotheses illustrate how workplaces are being organized within the context of flexible, new technology, and the movement toward an emerging new paradigm of work. Flexible, new technologies and this paradigm are entering organizations at the same time. The former are changing the ground on which assumptions underlying the emerging paradigm of organization have been built. Several of the hypotheses examined have been revised as a consequence. The data from twelve companies are plotted on a matrix of organization design principles against organization design implementation to illustrate changing organization design patterns as well as geographic differences between the companies.

Journal ArticleDOI
TL;DR: In this paper, the inner product of the eigenfunctions of a Schrödinger-type operator with a coherent state was studied, and the leading coefficient in the expansion was shown to depend on whether the classical trajectory through (x, ξ) is periodic or not.
Abstract: In this paper we concern ourselves with the small-ℏ asymptotics of the inner products of the eigenfunctions of a Schrödinger-type operator with a coherent state. More precisely, let ψ_j and E_j denote the eigenfunctions and eigenvalues of a Schrödinger-type operator H_ℏ with discrete spectrum. Let Φ_(x,ξ) be a coherent state centered at the point (x, ξ) in phase space. We estimate, as ℏ → 0, the averages of the squares of the inner products ⟨Φ_(x,ξ), ψ_j⟩ over an energy interval of size ℏ around a fixed energy E. This follows from asymptotic expansions of a certain form, valid for test functions φ and Schwartz amplitudes a of the coherent state. We compute the leading coefficient in the expansion, which depends on whether the classical trajectory through (x, ξ) is periodic or not. In the periodic case the iterates of the trajectory contribute to the leading coefficient. We also discuss the case of the Laplacian on a compact Riemannian manifold.
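For reference, a standard choice of coherent state centered at a phase-space point (x₀, ξ₀), with semiclassical parameter ℏ, is the Gaussian below; this is the usual textbook normalisation, given only to fix ideas, while the paper works with more general Schwartz amplitudes a.

```latex
\[
\Phi^{\hbar}_{(x_0,\xi_0)}(x)
  = (\pi\hbar)^{-n/4}\,
    \exp\!\Big(\tfrac{i}{\hbar}\,\xi_0\cdot(x-x_0)\Big)\,
    \exp\!\Big(-\tfrac{|x-x_0|^2}{2\hbar}\Big).
\]
```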

Journal ArticleDOI
TL;DR: Lower semicontinuity, as discussed by the authors, was introduced by Tonelli, who was the first to prove that the integrals arising in the classical calculus of variations are lower semicontinuous.

Journal ArticleDOI
TL;DR: In this paper, the joint Lipschitz continuity and semiconcavity of the value function are studied in order to investigate the differentiability of the value function along optimal trajectories.
Abstract: In this paper we continue to study properties of the value function and of optimal solutions of a semilinear Mayer problem in infinite dimensions. Applications concern systems governed by a state equation of parabolic type. In particular, the issues of the joint Lipschitz continuity and semiconcavity of the value function are treated in order to investigate the differentiability of the value function along optimal trajectories.
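For readers unfamiliar with the term, the standard quadratic-modulus notion of semiconcavity used in this kind of result is recalled below; this is the usual textbook definition, not a statement quoted from the paper.

```latex
\[
u(x+h) + u(x-h) - 2\,u(x) \;\le\; C\,|h|^2
\]
```

for all x and h such that the three points lie in the domain; equivalently, the map x ↦ u(x) − (C/2)|x|² is concave.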

Book ChapterDOI
24 Jun 1996
TL;DR: This work derives necessary conditions on the modeled systems that allow for the two methods to be combined, and develops a model based upon synchronization of “global” tokens moving across submodels, which covers a large range of real life systems.
Abstract: Stochastic Well Formed Nets (SWNs) are a powerful Petri net model which allows the computation of performance indices with an aggregation method. Decomposition methods initiated by B. Plateau are another way to reduce the complexity of such a computation. We have shown in a previous work how to combine these two approaches for systems with synchronous composition. Despite similarities between the asynchronous and synchronous cases, it turns out that the former presents specificities that need theoretical foundations. We undertake this task in the present paper. We derive necessary conditions on the modeled systems that allow for the two methods to be combined. For parallel systems satisfying these necessary conditions we develop a model with the corresponding algorithm. This model, based upon synchronization of “global” tokens moving across submodels, covers a large range of real-life systems. An example shows the intuitive ideas behind these developments.


Book ChapterDOI
01 Feb 1996
TL;DR: This paper presents an adequate framework for the representation and handling of plans in MAS and shows how an approach based on a plan representation by means of a partial order model enables the definition of a coordination algorithm for the possible enrichment of plans.
Abstract: One of the major interests of Multi-Agent Systems (MAS), which are able to handle distributed planning, is coordination. This coordination requires both an adequate plan representation and efficient interacting methods between agents. Interactions are based on information exchange (e.g. data, partial or global plan) and allow agents to update their own plans by including the exchanged information. Coordination generally produces two effects: it cancels negative interactions (e.g. resource sharing) and it takes advantage of helpful ones (e.g. handling redundant actions). A coordination model should satisfy the following requirements: domain independence, broad covering of interacting situations, operational coordination semantics and natural expression for the designer. This paper presents an adequate framework for the representation and handling of plans in MAS. It then shows how an approach based on a plan representation by means of a partial order model enables the definition of a coordination algorithm for the possible enrichment of plans.

Journal ArticleDOI
TL;DR: In this article, the authors present a result concerning the existence of stationary solutions for the classical Maxwell-Dirac equations in the Lorentz gauge, which is obtained by using variational arguments.
Abstract: In this Letter we present a result concerning the existence of stationary solutions for the classical Maxwell-Dirac equations in the Lorentz gauge. We believe that such a result can be of interest for a field quantization approach in QED. This result is obtained by using variational arguments. By a similar method, we also find an infinity of distinct solutions for the Klein-Gordon-Dirac system, arising in the so-called Yukawa model.

Book ChapterDOI
03 Jul 1996
TL;DR: This paper analyses the extension of integrity constraint mechanisms in order to maintain consistency in multiversion databases, and the database versions model is used as it offers a sound basis for the definition of consistency.
Abstract: This paper analyses the extension of integrity constraint mechanisms in order to maintain consistency in multiversion databases. Unlike monoversion databases, a multiversion database represents several states of the modeled universe. Thus, both the notion of consistency and the means to maintain it have to be extended. To this aim, we consider new integrity constraints induced by versioning. Constraints are characterized according to several criteria, and a general framework for optimizing their checking in the context of ACID transactions is given. The database versions model [CJ90] is used as it offers a sound basis for the definition of consistency.

Proceedings ArticleDOI
25 Aug 1996
TL;DR: This paper studies the problem of face identification for the particular application of an automatic cash machine withdrawal: the problem is to decide if a person identifying himself by a secret code is the same person registered in the database.
Abstract: This paper studies the problem of face identification for the particular application of automatic cash machine withdrawal: the problem is to decide whether a person identifying himself by a secret code is the same person registered in the database. The identification process consists of three main stages. The localization of salient features is obtained by using morphological operators and spatio-temporal information. The locations of these features are used to achieve a normalization of the face image with regard to the corresponding face in the database. Facial features, such as the eyes, mouth and nose, are extracted by an active contour model which is able to incorporate information about the global shape of each object. Finally, the identification is achieved by face warping including a deformation measure.

Journal Article
TL;DR: An economic advantage, together with 6 deaths avoided corresponding to 131 life-years saved, demonstrates the favorable cost-effectiveness ratio of this treatment strategy.
Abstract: A double-blind controlled trial conducted by Lewis et al. showed that treatment with captopril, in 409 patients with insulin-dependent diabetes, proteinuria above 500 mg/day and serum creatinine below 221 μmol/l, reduced the risk of combined events (mortality, dialysis and renal transplantation) by 50% (p = 0.006) and the risk of a doubling of serum creatinine by 48% (p = 0.007). We evaluated the cost-effectiveness of this treatment on the basis of the medical results of that study and economic data from the French health care system. The additional cost of captopril is entirely offset by a reduction in costs, linked to the dialysis and transplantation treatments avoided, to the associated antihypertensive treatments and to hospitalisations. The savings in health expenditure in the captopril group amount to 6 million francs in this study. Spending Fr. 100 on the treatment of diabetic nephropathy with captopril saves Fr. 575. This economic gain, together with 6 deaths avoided corresponding to 131 life-years saved, demonstrates the favorable cost-effectiveness ratio of this treatment strategy.

Journal ArticleDOI
TL;DR: While it is too early at this stage to determine whether these new approaches will be beneficial in the long term, some changes in drug prescribing have already been observed, and it is clear that the new policy has also encouraged more healthy relationships between policymakers, the medical profession, and the pharmaceutical industry in France.
Abstract: The pharmaceutical market in France is characterised by low prices and high sales volumes. Despite these advantageous market conditions, the French pharmaceutical industry has in general been an underperformer in the global context. Acknowledgement of the contributory role made by state regulation of drug expenditures in creating this situation has resulted in a number of attempts to correct problems within the market. At best, these have achieved only temporary improvements. Since 1993, however, a new drug policy, which emphasises voluntary moderation by physicians of their own prescribing activities rather than the use of budgetary means to cut expenditures, has been in operation in France. This 'medicalised strategy' involves 2 main instruments, viz., a list of guidelines for clinical practice (References Medicales Opposables) and a set of 'industrial conventions' (agreed between each drug company and the government) for determining drug prices. While it is too early at this stage to determine whether these new approaches will be beneficial in the long term, some changes in drug prescribing have already been observed, and it is clear that the new policy has also encouraged more healthy relationships between policymakers, the medical profession, and the pharmaceutical industry in France.



Journal ArticleDOI
04 Apr 1996
TL;DR: This work proposes to support exceptions to the behavioral consistency rules without sacrificing type safety by detecting unsafe statements in a method code at compile-time and checking them at run-time.
Abstract: Object-oriented databases enforce behavioral schema consistency rules to guarantee type safety, i.e., that no run-time type error can occur. When the schema must evolve, some schema updates may violate these rules. In order to maintain behavioral schema consistency, traditional solutions require significant changes to the types, the type hierarchy and the code of existing methods. Such operations are very expensive in a database context. To ease schema evolution, we propose to support exceptions to the behavioral consistency rules without sacrificing type safety. The basic idea is to detect unsafe statements in a method code at compile-time and check them at run-time. The run-time check is performed by a specific clause that is automatically inserted around unsafe statements. This check clause warns the programmer of the safety problem and lets him provide exception-handling code. Schema updates can therefore be performed with only minor changes to the code of methods.
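The mechanism of detecting unsafe statements at compile time and wrapping them in an automatically inserted run-time check can be illustrated, outside of any particular object-oriented database, with a plain Python analogue. The class and function names below are hypothetical and the check is written by hand here; in the system described, the check clause is inserted automatically around the flagged statement.

```python
# Illustrative analogue of the "check clause": a statement that became
# type-unsafe after a schema update is guarded at run time, and the
# programmer supplies exception-handling code for the unsafe case.
class Person:
    def __init__(self, name):
        self.name = name

class Employee(Person):                  # after a hypothetical schema update,
    def __init__(self, name, salary):    # only Employee still carries 'salary'
        super().__init__(name)
        self.salary = salary

def total_payroll(people):
    total = 0.0
    for p in people:
        # 'p.salary' is the statement flagged as unsafe at compile time;
        # the inserted check clause tests the actual type at run time.
        if isinstance(p, Employee):
            total += p.salary
        else:
            # programmer-provided exception-handling code
            print(f"warning: {p.name} has no salary attribute, skipped")
    return total

print(total_payroll([Employee("a", 100.0), Person("b")]))  # prints 100.0
```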