Journal Article (DOI)

Some Models for Estimating Technical and Scale Inefficiencies in Data Envelopment Analysis

01 Sep 1984 - Management Science (INFORMS) - Vol. 30, Iss. 9, pp. 1078-1092
TL;DR: The CCR ratio form introduced by Charnes, Cooper and Rhodes as part of their Data Envelopment Analysis approach captures both technical and scale inefficiencies through the optimal value of the ratio form, obtained directly from the data without a priori specification of weights or an assumed functional form relating inputs and outputs; this paper separates the two kinds of inefficiency and identifies increasing, constant or decreasing returns to scale.
Abstract: In management contexts, mathematical programming is usually used to evaluate a collection of possible alternative courses of action en route to selecting one which is best. In this capacity, mathematical programming serves as a planning aid to management. Data Envelopment Analysis reverses this role and employs mathematical programming to obtain ex post facto evaluations of the relative efficiency of management accomplishments, however they may have been planned or executed. Mathematical programming is thereby extended for use as a tool for control and evaluation of past accomplishments as well as a tool to aid in planning future activities. The CCR ratio form introduced by Charnes, Cooper and Rhodes, as part of their Data Envelopment Analysis approach, comprehends both technical and scale inefficiencies via the optimal value of the ratio form, as obtained directly from the data without requiring a priori specification of weights and/or explicit delineation of assumed functional forms of relations between inputs and outputs. A separation into technical and scale efficiencies is accomplished by the methods developed in this paper without altering the latter conditions for use of DEA directly on observational data. Technical inefficiencies are identified with failures to achieve best possible output levels and/or usage of excessive amounts of inputs. Methods for identifying and correcting the magnitudes of these inefficiencies, as supplied in prior work, are illustrated. In the present paper, a new separate variable is introduced which makes it possible to determine whether operations were conducted in regions of increasing, constant or decreasing returns to scale in multiple input and multiple output situations. The results are discussed and related not only to classical single output economics but also to more modern versions of economics which are identified with "contestable market theories."
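
As a rough sketch of the models the abstract refers to (notation is illustrative rather than the paper's own), the input-oriented envelopment problem under constant returns to scale and its variable-returns counterpart can be written as

\[
\theta^{*}_{CRS}=\min_{\theta,\lambda}\Big\{\theta \;:\; \sum_{j=1}^{n}\lambda_j x_{ij}\le\theta x_{io},\;\; \sum_{j=1}^{n}\lambda_j y_{rj}\ge y_{ro},\;\; \lambda_j\ge 0\Big\},
\qquad
\theta^{*}_{VRS}:\ \text{the same problem with } \sum_{j=1}^{n}\lambda_j=1 \text{ added.}
\]

Scale efficiency is then the ratio \(\theta^{*}_{CRS}/\theta^{*}_{VRS}\le 1\), while \(\theta^{*}_{VRS}\) captures pure technical efficiency. The added convexity constraint introduces a free variable in the dual (multiplier) form; its sign at the optimum, or equivalently whether \(\sum_j \lambda_j^{*}\) in the constant-returns problem is below, equal to, or above one, signals increasing, constant or decreasing returns to scale (subject to the usual caveat about alternate optima).
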
Citations
Journal Article (DOI)
TL;DR: In this paper, a modified version of DEA based upon comparison of efficient DMUs relative to a reference technology spanned by all other units is developed, which provides a framework for ranking efficient units and facilitates comparison with rankings based on parametric methods.
Abstract: Data Envelopment Analysis (DEA) evaluates the relative efficiency of decision-making units (DMUs) but does not allow for a ranking of the efficient units themselves. A modified version of DEA, based upon comparison of efficient DMUs relative to a reference technology spanned by all other units, is developed. The procedure provides a framework for ranking efficient units and facilitates comparison with rankings based on parametric methods.
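
A minimal sketch of the ranking device (input-oriented, constant returns to scale; notation illustrative, not the authors' own): the unit under evaluation is excluded from the reference technology, so its score can exceed one and efficient units can be ordered:

\[
\theta^{super}_{o}=\min_{\theta,\lambda}\Big\{\theta \;:\; \sum_{j\ne o}\lambda_j x_{ij}\le\theta x_{io},\;\; \sum_{j\ne o}\lambda_j y_{rj}\ge y_{ro},\;\; \lambda_j\ge 0\Big\}.
\]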

3,320 citations

Journal Article (DOI)
TL;DR: This paper addresses the "super-efficiency" issue of Data Envelopment Analysis by using the slacks-based measure (SBM) of efficiency, which the author proposed in his previous paper [European Journal of Operational Research 130 (2001) 498].
Abstract: In most models of Data Envelopment Analysis (DEA), the best performers have the full efficient status denoted by unity (or 100%), and, from experience, we know that usually plural Decision Making Units (DMUs) have this “efficient status”. To discriminate between these efficient DMUs is an interesting subject. This paper addresses this “super-efficiency” issue by using the slacks-based measure (SBM) of efficiency, which the author proposed in his previous paper [European Journal of Operational Research 130 (2001) 498]. The method differs from the traditional one based on the radial measure, e.g. the Andersen and Petersen model, in that the former deals directly with slacks in inputs/outputs, while the latter does not take account of the existence of slacks. We demonstrate the rationality of our approach by comparing it with the radial measure of super-efficiency. The proposed method is particularly useful when the number of DMUs is small compared with the number of criteria employed for evaluation.
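
For orientation, the slacks-based measure that this super-efficiency model builds on can be sketched as follows (a standard constant-returns statement; the super-efficiency variant additionally excludes DMU \(o\) from the reference set):

\[
\rho^{*}_{o}=\min_{\lambda,s^{-},s^{+}}\ \frac{1-\tfrac{1}{m}\sum_{i=1}^{m} s_i^{-}/x_{io}}{1+\tfrac{1}{s}\sum_{r=1}^{s} s_r^{+}/y_{ro}}
\quad\text{s.t.}\quad x_{io}=\sum_{j}\lambda_j x_{ij}+s_i^{-},\;\; y_{ro}=\sum_{j}\lambda_j y_{rj}-s_r^{+},\;\; \lambda,\, s^{-},\, s^{+}\ge 0,
\]

so the score is driven directly by input and output slacks rather than by a single radial contraction factor.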

2,575 citations

Journal Article (DOI)
TL;DR: In this paper, the authors discuss the mathematical programming approach to frontier estimation known as Data Envelopment Analysis (DEA) and examine the effect of model orientation on the efficient frontier.
Abstract: This paper discusses the mathematical programming approach to frontier estimation known as Data Envelopment Analysis (DEA). We examine the effect of model orientation on the efficient frontier and the effect of convexity requirements on returns to scale. Transformations between models are provided. Methodological extensions and alternate models that have been proposed are reviewed and the advantages and limitations of a DEA approach are presented.
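
A brief illustration of the orientation point (standard notation, not taken from this paper): with input matrix \(X\) and output matrix \(Y\),

\[
\text{input-oriented: } \theta^{*}=\min\{\theta : X\lambda\le\theta x_o,\ Y\lambda\ge y_o,\ \lambda\ge 0\},
\qquad
\text{output-oriented: } \phi^{*}=\max\{\phi : X\lambda\le x_o,\ Y\lambda\ge\phi y_o,\ \lambda\ge 0\}.
\]

Under constant returns to scale \(\theta^{*}=1/\phi^{*}\), whereas adding the convexity constraint \(\sum_j\lambda_j=1\) (variable returns) generally makes the two orientations yield different scores and projections.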

1,873 citations

Journal Article (DOI)
TL;DR: The construction and analysis of Pareto-efficient frontier production functions by a new Data Envelopment Analysis method is presented in the context of new theoretical characterizations of the inherent structure and capabilities of such empirical production functions.
Abstract: The construction and analysis of Pareto-efficient frontier production functions by a new Data Envelopment Analysis method is presented in the context of new theoretical characterizations of the inherent structure and capabilities of such empirical production functions. Contrasts and connections with other developments, including solutions of some remaining problems, are made regarding aspects such as informatics, economies of scale, isotonicity and non-concavity, discretionary and non-discretionary inputs, piecewise linearity, partial derivatives and Cobb-Douglas properties of the functions. Non-Archimedean constructs are not required.

1,605 citations

References
Journal Article (DOI)
TL;DR: A nonlinear (nonconvex) programming model provides a new definition of efficiency for use in evaluating activities of not-for-profit entities participating in public programs and methods for objectively determining weights by reference to the observational data for the multiple outputs and multiple inputs that characterize such programs.
Abstract: A nonlinear (nonconvex) programming model provides a new definition of efficiency for use in evaluating activities of not-for-profit entities participating in public programs. A scalar measure of the efficiency of each participating unit is thereby provided, along with methods for objectively determining weights by reference to the observational data for the multiple outputs and multiple inputs that characterize such programs. Equivalences are established to ordinary linear programming models for effecting computations. The duals to these linear programming models provide a new way for estimating extremal relations from observational data. Connections between engineering and economic approaches to efficiency are delineated along with new interpretations and ways of using them in evaluating and controlling managerial behavior in public programs.
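
A sketch of the ratio form and its linearization (standard notation): for the unit under evaluation \(o\),

\[
\max_{u,v}\ \frac{\sum_r u_r y_{ro}}{\sum_i v_i x_{io}}
\quad\text{s.t.}\quad \frac{\sum_r u_r y_{rj}}{\sum_i v_i x_{ij}}\le 1\ \ (j=1,\dots,n),\qquad u_r,\, v_i\ge 0.
\]

Normalizing \(\sum_i v_i x_{io}=1\) (the Charnes-Cooper transformation) turns this into an ordinary linear program, \(\max \sum_r u_r y_{ro}\) subject to \(\sum_r u_r y_{rj}-\sum_i v_i x_{ij}\le 0\) for all \(j\), whose dual is the envelopment model used in computations.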

25,433 citations

Journal Article (DOI)
01 May 1957

14,922 citations


"Some Models for Estimating Technica..." refers background or methods in this paper

  • ...Once again, however, we emphasize that we are proceeding directly from observational data in our fine tuning of the CCR developments found in Charnes, Cooper and Rhodes (1978a). Mix and scale variations are likely to occur together in such observational data and so safeguards are required, as in our minimum extrapolation postulate, insofar as we cannot effect a separation of scale variations as in the formal theory of economics....

    [...]

  • ...The linear programming problems in (14) and (15) are employed to estimate the overall technical and scale efficiencies of a DMU. The linear programming formulations in (19A) and (20A) take into account the possibility that the average productivity at the most productive scale size may not be attainable for other scale sizes at which a particular DMU may be operating. These formulations estimate the pure technical efficiency of a DMU at the given scale of operation. The estimation of most productive scale size in DEA is discussed in Banker (1984). Figure 3 illustrates these concepts of technical and scale efficiencies....

    [...]

  • ...This could describe the CCR efficiency measure, except that each u_r and v_i in U and V is only required to be nonnegative rather than strictly positive. See (1). The positivity requirement would be obtained, however, if we followed the non-Archimedean formulation and development in Charnes, Cooper and Rhodes (1979), as we shall do below....

    [...]

  • ...REMARK. Färe and Lovell (1978) discuss the relationship between Farrell's efficiency measure and Shephard's distance measure....

    [...]

  • ...This results in a characterization of the DMUs associated with P2 and P3 as being "equally inefficient" relative to the DMU associated with P1. This characterization may be satisfactory in some cases. In other cases we may want to "fine tune" the developments in Charnes, Cooper and Rhodes (1978a) so that we can locate differences such as are portrayed in the P2 and P3 situations....

    [...]
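
To make the excerpt on overall, pure technical and scale efficiency concrete, here is a small illustrative sketch (my own, not the paper's code; the DMU data are made up). It solves the input-oriented envelopment LP with and without the convexity constraint using scipy and reports overall (constant-returns) efficiency, pure technical (variable-returns) efficiency, their ratio as scale efficiency, and a tentative returns-to-scale label based on the sum of the optimal lambdas in the constant-returns problem (reliable only when that optimum is unique).

```python
import numpy as np
from scipy.optimize import linprog

def efficiency(X, Y, o, vrs=False):
    """Input-oriented envelopment LP for DMU o.

    X: (m, n) inputs, Y: (s, n) outputs, one column per DMU.
    Returns (theta*, sum of optimal lambdas). vrs=True adds the
    convexity constraint sum(lambda) = 1 (variable returns to scale).
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                         # minimise theta
    A_ub = np.vstack([
        np.hstack([-X[:, [o]], X]),                     # sum_j lam_j x_ij <= theta x_io
        np.hstack([np.zeros((s, 1)), -Y]),              # sum_j lam_j y_rj >= y_ro
    ])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    A_eq, b_eq = None, None
    if vrs:
        A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)    # sum_j lam_j = 1
        b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun, res.x[1:].sum()

# Made-up data: 2 inputs, 1 output, 5 DMUs (columns).
X = np.array([[2.0, 4.0, 3.0, 5.0, 6.0],
              [3.0, 1.0, 4.0, 2.0, 5.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.5, 2.0]])

for o in range(X.shape[1]):
    crs, lam_sum = efficiency(X, Y, o)              # overall (technical + scale)
    vrs_score, _ = efficiency(X, Y, o, vrs=True)    # pure technical efficiency
    scale = crs / vrs_score                         # scale efficiency
    # Tentative RTS classification from the CRS solution (unique-optimum caveat).
    rts = "CRS" if abs(lam_sum - 1) < 1e-6 else ("IRS" if lam_sum < 1 else "DRS")
    print(f"DMU {o}: overall={crs:.3f} pure={vrs_score:.3f} "
          f"scale={scale:.3f} RTS~{rts}")
```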

Book
01 Jan 1970
TL;DR: In this book, a unified treatment of the cost and production functions that underlie the economic theory of production is presented, and the duality between cost function and production function is developed by introducing a cost correspondence, showing that these two functions are given in terms of each other by dual minimum problems.
Abstract: A sequel to his frequently cited Cost and Production Functions (1953), this book offers a unified, comprehensive treatment of these functions which underlie the economic theory of production. The approach is axiomatic for a definition of technology, by mappings of input vectors into subsets of output vectors that represent the unconstrained technical possibilities of production. To provide a completely general means of characterizing a technology, an alternative to the production function, called the Distance Function, is introduced. The duality between cost function and production function is developed by introducing a cost correspondence, showing that these two functions are given in terms of each other by dual minimum problems. The special class of production structures called Homothetic is given more general definition and extended to technologies with multiple outputs. Originally published in 1971. The Princeton Legacy Library uses the latest print-on-demand technology to again make available previously out-of-print books from the distinguished backlist of Princeton University Press. These paperback editions preserve the original texts of these important books while presenting them in durable paperback editions. The goal of the Princeton Legacy Library is to vastly increase access to the rich scholarly heritage found in the thousands of books published by Princeton University Press since its founding in 1905.
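
The distance function described here can be sketched in its input form (a standard statement, not a quotation from the book): for an output vector \(y\) with input requirement set \(L(y)\),

\[
D_{i}(y,x)=\sup\{\theta>0 :\ x/\theta\in L(y)\},
\]

so \(D_{i}(y,x)\ge 1\) exactly when \(x\) can produce \(y\), and the Farrell input efficiency measure referred to elsewhere on this page is its reciprocal \(1/D_{i}(y,x)\).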

3,222 citations

Book
01 Jan 1953
TL;DR: In this book, the author develops the duality between cost and production functions, including a heuristic principle of minimum costs, a geometric interpretation of the duality, and the Cobb-Douglas production function.
Abstract: Contents: 1. The Process Production Function; 2. Heuristic Principle of Minimum Costs; 3. The Producer's Minimum Cost Function; 4. Dual Determination of Production Function from Cost Function; 5. Geometric Interpretation of the Duality Between Cost and Production Function; 6. Constraints on the Factors of Production; 7. Homothetic Production Functions; 8. The Cobb-Douglas Production Function; 9. The Problem of Aggregation; 10. The Dynamics of Monopoly.

1,799 citations