Author

Léopold Simar

Bio: Léopold Simar is an academic researcher from Université catholique de Louvain. The author has contributed to research in topics including estimators and nonparametric statistics. The author has an h-index of 68 and has co-authored 249 publications receiving 23,115 citations. Previous affiliations of Léopold Simar include the University of Toulouse and Katholieke Universiteit Leuven.


Papers
Journal ArticleDOI
TL;DR: In this paper, a coherent data-generating process (DGP) is described for two-stage procedures in which nonparametric estimates of productive efficiency are regressed on environmental variables to account for exogenous factors that might affect firms' performance.
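
A minimal sketch of the two-stage structure referred to in the TL;DR, on simulated data: stage one computes nonparametric efficiency scores (here a simple FDH estimator, for brevity), and stage two regresses them on an environmental variable. The plain least-squares second stage is only illustrative; the paper's point is that such conventional second stages rest on an ill-defined DGP, and it proposes truncated-regression and bootstrap procedures instead. All names and data below are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated data: n firms, two inputs, one output, one environmental variable (all hypothetical).
    n = 100
    x = rng.uniform(1.0, 10.0, size=(n, 2))                                  # inputs
    z = rng.normal(size=n)                                                   # environmental variable
    y = (x[:, 0] * x[:, 1]) ** 0.4 * np.exp(-0.2 * rng.exponential(size=n))  # output with inefficiency

    def fdh_input_efficiency(x, y):
        """Input-oriented FDH efficiency: the smallest factor by which a firm's inputs can be
        scaled down so that some observed firm still dominates it (at least as much output,
        no more of any input)."""
        scores = np.ones(len(y))
        for i in range(len(y)):
            dominating = y >= y[i]                           # firms producing at least y_i
            ratios = np.max(x[dominating] / x[i], axis=1)    # radial factor needed to match firm j
            scores[i] = ratios.min()                         # 1 means firm i is on the FDH frontier
        return scores

    # Stage one: nonparametric efficiency estimates.
    theta = fdh_input_efficiency(x, y)

    # Stage two (illustration only): regress the estimates on the environmental variable.
    Z = np.column_stack([np.ones(n), z])
    beta, *_ = np.linalg.lstsq(Z, theta, rcond=None)
    print("second-stage coefficients (intercept, z):", beta)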

2,915 citations

Journal ArticleDOI
TL;DR: In this paper, the authors provide a general methodology for bootstrapping in nonparametric frontier models; adapted methods are illustrated by analyzing the bootstrap sampling variations of input efficiency measures of electricity plants.
Abstract: Efficiency scores of production units are generally measured relative to an estimated production frontier. Nonparametric estimators (DEA, FDH, ...) are based on a finite sample of observed production units. The bootstrap is one easy way to analyze the sensitivity of efficiency scores to the sampling variations of the estimated frontier. The key step in validating the bootstrap is to define a reasonable data-generating process in this complex framework and to propose a reasonable estimator of it. This paper provides a general methodology for bootstrapping in nonparametric frontier models. Some adapted methods are illustrated by analyzing the bootstrap sampling variations of input efficiency measures of electricity plants.
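
To make the mechanics concrete, here is a naive resampling sketch on simulated single-input, single-output data: each bootstrap draw resamples the production units, re-estimates the (FDH) frontier, and re-evaluates the original units against it. This only illustrates the bookkeeping; the paper's contribution is precisely that a naive resample is not a reasonable estimate of the DGP and that a smoothed bootstrap is needed for valid inference. Data and function names are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical single-input, single-output sample of production units.
    n = 50
    x = rng.uniform(1.0, 10.0, size=n)                       # input
    y = np.sqrt(x) * np.exp(-rng.exponential(0.3, size=n))   # output, below the frontier sqrt(x)

    def fdh_eff(x_eval, y_eval, x_ref, y_ref):
        """Input-oriented FDH efficiency of each evaluated unit relative to a reference sample."""
        theta = np.full(len(x_eval), np.nan)
        for i in range(len(x_eval)):
            dom = y_ref >= y_eval[i]                         # reference units producing at least as much
            if dom.any():
                theta[i] = np.min(x_ref[dom]) / x_eval[i]
        return theta

    theta_hat = fdh_eff(x, y, x, y)                          # original estimates (all <= 1)

    B = 500
    boot = np.empty((B, n))
    for b in range(B):
        idx = rng.integers(0, n, size=n)                     # naive resample of the production units
        boot[b] = fdh_eff(x, y, x[idx], y[idx])              # re-evaluate against the resampled frontier

    # Pointwise spread of the bootstrap scores (units left undominated in a resample are skipped).
    lo, hi = np.nanpercentile(boot, [2.5, 97.5], axis=0)
    print("unit 0: estimate %.3f, naive bootstrap spread [%.3f, %.3f]" % (theta_hat[0], lo[0], hi[0]))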

2,024 citations

Journal ArticleDOI
TL;DR: In this article, the authors define a statistical model allowing determination of the statistical properties of the nonparametric estimators in the multi-output and multi-input case, and provide the asymptotic sampling distribution of the FDH estimator in a multivariate setting and of the DEA estimator for the bivariate case.
Abstract: Efficiency scores of firms are measured by their distance to an estimated production frontier. The economic literature proposes several nonparametric frontier estimators based on the idea of enveloping the data (FDH and DEA-type estimators). Many have claimed that FDH and DEA techniques are non-statistical, as opposed to econometric approaches where particular parametric expressions are posited to model the frontier. We can now define a statistical model allowing determination of the statistical properties of the nonparametric estimators in the multi-output and multi-input case. New results provide the asymptotic sampling distribution of the FDH estimator in a multivariate setting and of the DEA estimator in the bivariate case. Sampling distributions may also be approximated by bootstrap distributions in very general situations. Consequently, statistical inference based on DEA/FDH-type estimators is now possible. These techniques allow correction for the bias of the efficiency estimators and estimation of confidence intervals for the efficiency measures. This paper summarizes the results which are now available, and provides a brief guide to the existing literature. Emphasizing the role of hypotheses and inference, we show how the results can be used or adapted for practical purposes.
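
As a concrete reference point for the estimators discussed above, the following sketch computes input-oriented, variable-returns-to-scale DEA scores by linear programming on simulated data; this is a textbook formulation, not code from the paper. Scores equal to one sit on the estimated frontier.

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical data: n firms, p inputs, q outputs.
    rng = np.random.default_rng(3)
    n, p, q = 30, 2, 1
    X = rng.uniform(1.0, 10.0, size=(n, p))
    Y = X.prod(axis=1, keepdims=True) ** 0.4 * rng.uniform(0.5, 1.0, size=(n, q))

    def dea_input_efficiency(i, X, Y):
        """Input-oriented, variable-returns-to-scale DEA score of firm i (Farrell measure)."""
        n = X.shape[0]
        c = np.r_[1.0, np.zeros(n)]                           # minimise theta over (theta, lambda_1..lambda_n)
        A_in = np.hstack([-X[i].reshape(-1, 1), X.T])         # sum_j lambda_j X[j,k] <= theta X[i,k]
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])  # sum_j lambda_j Y[j,r] >= Y[i,r]
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(X.shape[1]), -Y[i]]
        A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)          # VRS: the lambdas sum to one
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.x[0]

    scores = np.array([dea_input_efficiency(i, X, Y) for i in range(n)])
    print("share of firms on the estimated DEA frontier:", np.mean(scores > 0.999))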

1,099 citations

Book
25 Aug 2008
TL;DR: This book covers descriptive techniques, a short excursion into matrix algebra, multivariate distributions, the theory of the multinormal, theories of estimation and hypothesis testing, and a range of multivariate techniques such as principal components, factor, cluster, and discriminant analysis.
Abstract: Contents: I. Descriptive Techniques: Comparison of Batches. II. Multivariate Random Variables: A Short Excursion into Matrix Algebra; Moving to Higher Dimensions; Multivariate Distributions; Theory of the Multinormal; Theory of Estimation; Hypothesis Testing. III. Multivariate Techniques: Decomposition of Data Matrices by Factors; Principal Components Analysis; Factor Analysis; Cluster Analysis; Discriminant Analysis; Correspondence Analysis; Canonical Correlation Analysis; Multidimensional Scaling; Conjoint Measurement Analysis; Applications in Finance; Computationally Intensive Techniques. Appendices: A: Symbols and Notations; B: Data. Bibliography. Index.

1,081 citations

Journal ArticleDOI
TL;DR: In this paper, a nonparametric estimator based on the concept of the expected minimum input function (or expected maximum output function) is proposed, which is related to the FDH estimator but does not envelop all the data.
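
A rough Monte Carlo sketch of the idea described in the TL;DR, under the assumption of a single input and output: for a given output level, draw m firms producing at least that much and record the minimum input; averaging over many draws gives an expected-minimum-input benchmark that, unlike FDH/DEA, does not envelop every observation. The exact estimator and its properties are developed in the paper; everything below is simulated and illustrative.

    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical single-input, single-output data.
    n = 200
    x = rng.uniform(1.0, 10.0, size=n)
    y = np.sqrt(x) * rng.uniform(0.4, 1.0, size=n)

    def expected_min_input(y0, x, y, m=25, draws=2000, rng=rng):
        """Monte Carlo estimate of the expected minimum input among m firms drawn at random
        from those producing at least y0 (an order-m style partial frontier)."""
        pool = x[y >= y0]
        if pool.size == 0:
            return np.nan
        samples = rng.choice(pool, size=(draws, m), replace=True)
        return samples.min(axis=1).mean()

    # Benchmark each firm at its own output level; the partial frontier leaves some firms outside.
    bench = np.array([expected_min_input(yi, x, y) for yi in y])
    print("share of firms using less input than the benchmark:", np.mean(x < bench))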

1,023 citations


Cited by
Proceedings ArticleDOI
22 Jan 2006
TL;DR: Some of the major results in random graphs and some of the more challenging open problems are reviewed, covering algorithmic and structural questions and touching on newer models, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations

Journal ArticleDOI
TL;DR: Principal component analysis (PCA), as discussed by the authors, is a multivariate technique that analyzes a data table in which observations are described by several inter-correlated quantitative dependent variables. Its goal is to extract the important information from the table, to represent it as a set of new orthogonal variables called principal components, and to display the pattern of similarity of the observations and of the variables as points in maps.
Abstract: Principal component analysis (PCA) is a multivariate technique that analyzes a data table in which observations are described by several inter-correlated quantitative dependent variables. Its goal is to extract the important information from the table, to represent it as a set of new orthogonal variables called principal components, and to display the pattern of similarity of the observations and of the variables as points in maps. The quality of the PCA model can be evaluated using cross-validation techniques such as the bootstrap and the jackknife. PCA can be generalized as correspondence analysis (CA) in order to handle qualitative variables and as multiple factor analysis (MFA) in order to handle heterogeneous sets of variables. Mathematically, PCA depends upon the eigen-decomposition of positive semi-definite matrices and upon the singular value decomposition (SVD) of rectangular matrices.
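
A short sketch of the mechanics described in the abstract, on simulated data: centre the data table and take its singular value decomposition; the left singular vectors scaled by the singular values give the component scores, and the squared singular values give each component's share of the variance.

    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical data table: 100 observations described by 5 inter-correlated variables.
    latent = rng.normal(size=(100, 2))
    X = latent @ rng.normal(size=(2, 5)) + 0.3 * rng.normal(size=(100, 5))

    # Centre the columns and take the singular value decomposition of the data matrix.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    scores = U * s                    # principal components: coordinates of the observations
    loadings = Vt.T                   # orthonormal directions: coordinates of the variables
    explained = s**2 / np.sum(s**2)   # share of total variance carried by each component
    print("variance explained:", np.round(explained, 3))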

6,398 citations

Posted Content
TL;DR: In this paper, the authors investigated conditions sufficient for identification of average treatment effects using instrumental variables and showed that the existence of valid instruments is not sufficient to identify any meaningful average treatment effect.
Abstract: We investigate conditions sufficient for identification of average treatment effects using instrumental variables. First we show that the existence of valid instruments is not sufficient to identify any meaningful average treatment effect. We then establish that the combination of an instrument and a condition on the relation between the instrument and the participation status is sufficient for identification of a local average treatment effect for those who can be induced to change their participation status by changing the value of the instrument. Finally we derive the probability limit of the standard IV estimator under these conditions. It is seen to be a weighted average of local average treatment effects.
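
A simulation sketch of the identification result (illustrative, not the paper's notation): with a binary instrument, no defiers, and heterogeneous treatment effects, the standard IV (Wald) estimator recovers the average effect for compliers rather than the population average effect. All quantities below are simulated.

    import numpy as np

    rng = np.random.default_rng(6)
    n = 200_000

    # Hypothetical potential-outcome simulation with a binary instrument z and no defiers.
    z = rng.integers(0, 2, size=n)                                     # instrument
    kind = rng.choice(["complier", "always", "never"], size=n, p=[0.4, 0.3, 0.3])
    d = np.where(kind == "always", 1, np.where(kind == "never", 0, z)) # participation status
    effect = np.where(kind == "complier", 2.0, 0.5)                    # heterogeneous treatment effects
    y = 1.0 + effect * d + rng.normal(size=n)                          # observed outcome

    # Standard IV (Wald) estimator: cov(y, z) / cov(d, z).
    iv = np.cov(y, z)[0, 1] / np.cov(d, z)[0, 1]
    late = effect[kind == "complier"].mean()
    # The population average effect is 0.4 * 2.0 + 0.6 * 0.5 = 1.1, yet IV lands near the complier effect.
    print("IV estimate: %.3f, complier average effect (LATE): %.3f" % (iv, late))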

3,154 citations

Journal ArticleDOI
TL;DR: The authors survey 130 studies that apply frontier efficiency analysis to financial institutions in 21 countries, find that the various efficiency methods do not necessarily yield consistent results, and suggest some ways that these methods might be improved to bring about findings that are more consistent, accurate, and useful.

2,983 citations

Journal ArticleDOI
TL;DR: In this article, a consistent nonparametric maximum likelihood estimator for the distribution of unobservables and a computational strategy for implementing it are developed; accounting for population variation in observed and unobserved variables is necessary unless individuals are assumed a priori to be homogeneous.
Abstract: Conventional analyses of single spell duration models control for unobservables using a random effect estimator with the distribution of unobservables selected by ad hoc criteria. Both theoretical and empirical examples indicate that estimates of structural parameters obtained from conventional procedures are very sensitive to the choice of mixing distribution. Conventional procedures overparameterize duration models. We develop a consistent nonparametric maximum likelihood estimator for the distribution of unobservables and a computational strategy for implementing it. For a sample of unemployed workers our estimator produces estimates in concordance with standard search theory while conventional estimators do not. Economic theories of search unemployment (Lippman and McCall [34]; Flinn and Heckman [14]), job turnover (Jovanovic [25]), mortality (Harris [17]), labor supply (Heckman and Willis [23]) and marital instability (Becker [3]) produce structural distributions for durations of occupancy of states. These theories generate qualitative predictions about the effects of changes in parameters on these structural distributions, and occasionally predict their functional forms. In order to test economic theories about durations and recover structural parameters, it is necessary to account for population variation in observed and unobserved variables unless it is assumed a priori that individuals are homogeneous. In every microeconomic study in which the hypothesis of heterogeneity is subject to test, it is not rejected. Temporally persistent unobserved components are an empirically important fact of life in microeconomic data (Heckman [19]). Since the appearance of papers by Silcock [39] and Blumen, Kogan, and McCarthy [5], social scientists have been aware that failure to adequately control for population heterogeneity can produce severe bias in structural estimates of duration models. Serious empirical analysts attempt to control for these unobservables.
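
A simplified stand-in for the approach sketched in the abstract, on simulated spell data: unobserved heterogeneity is represented by a discrete distribution with a fixed number of mass points, and an EM loop fits a mixture of exponential durations. The paper's nonparametric maximum likelihood estimator also chooses the number and location of the mass points and handles richer duration models; nothing below (no censoring, no covariates) comes from the paper itself.

    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical spell durations from two unobserved types with different exit rates.
    n = 5000
    fast = rng.random(n) < 0.6
    t = np.where(fast, rng.exponential(1 / 2.0, n), rng.exponential(1 / 0.5, n))

    # Discrete mixing distribution with K mass points, fitted by EM for a mixture of
    # exponential durations (a simplified stand-in; no censoring, no covariates).
    K = 2
    pi = np.full(K, 1.0 / K)        # probability mass at each support point
    lam = np.array([0.3, 3.0])      # exit rate at each support point (starting values)

    for _ in range(200):
        dens = pi * lam * np.exp(-np.outer(t, lam))                 # (n, K) component densities
        resp = dens / dens.sum(axis=1, keepdims=True)               # E-step: posterior type probabilities
        pi = resp.mean(axis=0)                                      # M-step: update masses ...
        lam = resp.sum(axis=0) / (resp * t[:, None]).sum(axis=0)    # ... and exit rates

    print("estimated mass points (rate, probability):",
          list(zip(np.round(lam, 2), np.round(pi, 2))))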

2,940 citations