
Showing papers on "Data envelopment analysis published in 1996"



Journal ArticleDOI
TL;DR: The purpose of this paper is to briefly trace the evolution of DEA from the initial publication by Charnes et al. (1978b) to the current state of the art (SOA).
Abstract: The purpose of this paper is to briefly trace the evolution of DEA from the initial publication by Charnes et al. (1978b) to the current state of the art (SOA). The state of development of DEA is characterized at four points in time to provide a perspective in both directions—past and future. An evolution map is provided which illustrates DEA growth during the 17-year period, the timing of the major events, and the interconnections and influences between topics. An extensive DEA bibliography is provided.

836 citations


Journal ArticleDOI
TL;DR: In this article, the authors define environmental performance indicators as analytical tools that allow one to compare various plants in a firm, or various firms in an industry, with each other and with respect to certain environmental characteristics.

589 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used data envelopment analysis (DEA) to compare the technical efficiency of 201 large banks from 1984 to 1990 and found that technical inefficiency averaged just over 5 percent, much lower than found in existing estimates.
Abstract: Significant difficulties in commercial banking in the late 1980s raise questions about bank performance and efficiency. With the use of data envelopment analysis (DEA), we consider the relative technical efficiency of 201 large banks from 1984 to 1990. Bank technical inefficiency averages just over 5 percent, much lower than found in existing estimates. Larger and more profitable banks have higher levels of technical efficiency. At the same time, however, larger banks are more likely to operate under decreasing returns to scale.

549 citations


Journal ArticleDOI
TL;DR: In this article, the authors analyzed the efficiency of local governments in Belgium using a broad variety of non-parametric and parametric reference technologies, including Free Disposal Hull (FDH), variable returns to scale Data Envelopment Analysis (DEA), and three parametric frontiers (one deterministic and two stochastic).

435 citations


Journal ArticleDOI
TL;DR: In this paper, the application of DEA (Data Envelopment Analysis) in conjunction with financial ratios to help bank regulators in Taiwan not only distinguish the efficient banks from the inefficient ones but also to gain insight into various financial dimensions that somehow link to the bank's financial operational decisions.
Abstract: While financial ratios are currently the method most often used to evaluate a bank's performance, there is no clear-cut rationale which would allow one to acquire a composite score on the overall financial soundness of a bank. This paper demonstrates the application of DEA (Data Envelopment Analysis) in conjunction with financial ratios to help bank regulators in Taiwan not only to distinguish the efficient banks from the inefficient ones but also to gain insight into various financial dimensions that somehow link to the bank's financial operational decisions.

370 citations
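Several of the entries above rest on the same underlying linear program. As a hedged sketch (not the exact specification used in any one of these papers), the input-oriented CCR envelopment model — minimize θ subject to Xλ ≤ θx₀, Yλ ≥ y₀, λ ≥ 0 — can be solved per DMU with `scipy.optimize.linprog`; the data below are made up for illustration:

```python
# Input-oriented CCR (constant returns to scale) DEA, envelopment form.
# For each DMU o:  min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0.
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y):
    """X: (m inputs, n DMUs), Y: (s outputs, n DMUs). Returns n efficiency scores."""
    m, n = X.shape
    s = Y.shape[0]
    out = []
    for o in range(n):
        # decision vector: [theta, lam_1 .. lam_n]
        c = np.r_[1.0, np.zeros(n)]                  # minimise theta
        A_in = np.hstack([-X[:, [o]], X])            # X lam - theta * x_o <= 0
        b_in = np.zeros(m)
        A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y lam <= -y_o, i.e. Y lam >= y_o
        b_out = -Y[:, o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        out.append(res.x[0])
    return np.array(out)

# One input, one output: the third DMU uses twice the input per unit of
# output of the best performers, so its CRS input efficiency is 0.5.
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[2.0, 4.0, 4.0]])
scores = ccr_input_efficiency(X, Y)
print(scores)   # approximately [1.0, 1.0, 0.5]
```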


Journal ArticleDOI
TL;DR: The nonparametric data envelopment analysis (DEA) model has become increasingly popular in the analysis of productive efficiency, and the number of empirical applications is now very large as discussed by the authors.
Abstract: The nonparametric data envelopment analysis (DEA) model has become increasingly popular in the analysis of productive efficiency, and the number of empirical applications is now very large. Recent theoretical and mathematical research has also contributed to a deeper understanding of the seemingly simple but inherently complex DEA model. Less effort has, however, been directed toward comparisons between DEA and other competing efficiency analysis models. This paper undertakes a comparison of the DEA, the deterministic parametric (DFA), and the stochastic frontier (SFA) models. Efficiency comparisons across models in the above categories are done based on 15 Colombian cement plants observed during 1968–1988.

337 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a method for ranking efficient units, not by their efficiency, but by importance as benchmarks for the inefficient units, in contrast to earlier suggestions in the literature which rank units high if they are specialized.
Abstract: In non-parametric methods many units are calculated as efficient. The article suggests a method for ranking efficient units, not by their efficiency, but by importance as benchmarks for the inefficient units, in contrast to earlier suggestions in the literature which rank units high if they are specialized. However, the total potentials for improvement frequently remain unrevealed by calculating radial efficiency measures of the Farrell type only. The article therefore first develops efficiency measures that explicitly extend the radial measures to include slacks. The new measures are applied to a typical multidimensional small-sample data set for Norwegian employment offices.

310 citations


Journal ArticleDOI
TL;DR: A substantial body of recent work has opened the way to explore the statistical properties of DEA estimators of production frontiers and related efficiency measures as mentioned in this paper, including returns to scale, input substitutability, and model specification.
Abstract: A substantial body of recent work has opened the way to exploring the statistical properties of DEA estimators of production frontiers and related efficiency measures. The purpose of this paper is to survey several possibilities that have been pursued, and to present them in a unified framework. These include the development of statistics to test hypotheses about the characteristics of the production frontier, such as returns to scale, input substitutability, and model specification, and also about variation in efficiencies relative to the production frontier.

305 citations


Journal ArticleDOI
TL;DR: Evidence is provided that the existing Data Envelopment Analysis model will overestimate the level of technical inefficiency and that the modified model developed in this paper does a better job controlling for exogenous factors.

298 citations


Journal ArticleDOI
TL;DR: In this article, an analytical approach based on rank statistics is presented to the issue of comparing programs within Data Envelopment Analysis (DEA) efficiency evaluation framework, which distinguishes between managerial and programmatic inefficiency and uses the Mann-Whitney rank statistic to evaluate the statistical significance of the differences observed between a treatment program and its control group program after adjusting for differences in managerial efficiency.
Abstract: This paper presents an analytical approach, based on rank statistics, to the issue of comparing programs within Data Envelopment Analysis (DEA) efficiency evaluation framework. The program evaluation procedure distinguishes between managerial and programmatic inefficiency and uses the Mann-Whitney rank statistic to evaluate the statistical significance of the differences observed between a treatment program and its control group program after adjusting for differences in managerial efficiency between the programs. A numerical example, based on the data used to evaluate the educational enhancement of the Program Follow Through, is used to illustrate the proposed statistical procedures.
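The rank-statistic step described above can be sketched with `scipy.stats.mannwhitneyu`. The scores below are hypothetical efficiency values, not the Program Follow Through data, and the managerial-efficiency adjustment is assumed to have already been applied:

```python
# Compare the (hypothetical, already-adjusted) efficiency-score distributions
# of a treatment program and its control group with the Mann-Whitney U test.
from scipy.stats import mannwhitneyu

treatment = [0.92, 0.88, 0.95, 0.81, 0.90, 0.97]   # made-up scores
control   = [0.78, 0.71, 0.84, 0.69, 0.75, 0.80]   # made-up scores

stat, pvalue = mannwhitneyu(treatment, control, alternative="two-sided")
print(stat, pvalue)   # a small p-value suggests the programs differ
```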

Journal ArticleDOI
TL;DR: Seven theorems are presented which expand understanding of the theoretical structure of the Charnes-Cooper-Rhodes (CCR) model of Data Envelopment Analysis, especially with respect to slacks and the underlying structure of facets and faces, and are a basis for new algorithms which will provide optimal primal and dual solutions.
Abstract: This paper presents seven theorems which expand understanding of the theoretical structure of the Charnes-Cooper-Rhodes (CCR) model of Data Envelopment Analysis, especially with respect to slacks and the underlying structure of facets and faces. These theorems also serve as a basis for new algorithms which will provide optimal primal and dual solutions that satisfy the strong complementary slackness conditions (SCSC) for many (if not most) non-radially efficient DMUs; an improved procedure for identifying the set E of extreme efficient DMUs; and may, for many DEA domains, also settle in a single pass the existence or non-existence of input or output slacks in each of their DMUs. This paper also introduces the concept of a positive goal vector G, which is applied to characterize the set of all possible maximal optimal slack vectors. Appendix C presents an example which illustrates the need for a new concept, face regular, which focuses on the role of convexity in the intersections of radial efficient facets with the efficient frontier FR. The same example also illustrates flaws in the popular “sum of the slacks” methodology.

Journal ArticleDOI
TL;DR: A stochastic approach is proposed, in which a probability distribution on efficiencies can be derived for each decision making unit, as a basis for comparison.
Abstract: We contrast the different approaches of Data Envelopment Analysis (DEA) and Multiple Criteria Decision Making (MCDM) to superficially similar problems. The concepts of efficiency and Pareto optimality in DEA and MCDM are compared, and a link is demonstrated between the ratio efficiency definition in DEA and a distance measure in input–output space based on linear value functions. The problem of weight sensitivity is discussed in terms of value measurement theory, highlighting the assumptions needed during model formulation in order to justify the use of value judgements to constrain weight flexibility in DEA. Finally, we propose a stochastic approach, in which a probability distribution on efficiencies can be derived for each decision making unit, as a basis for comparison.

Journal ArticleDOI
TL;DR: In this paper, sensitivity analysis of the CCR model in data envelopment analysis (DEA) is studied by means of modified versions of CCR models based on evaluation of a decision making unit (DMU) relative to a reference set grouped by all other DMUs.

Journal ArticleDOI
TL;DR: In this article, the use of data envelopment analysis (DEA) as a tool for measuring the performance of vendors on multiple criteria and for use in vendor negotiations is discussed.
Abstract: While it has long been recognized that vendor selection is multi‐objective in nature, little has been done to develop techniques for measuring vendors’ performance on multiple criteria. Demonstrates the use of data envelopment analysis (DEA) as a tool for measuring the performance of vendors on multiple criteria and for use in vendor negotiations. Describes the DEA model, develops a DEA formulation for measuring vendor efficiency and, finally, shows how a baby food manufacturer applied the DEA technique in a just‐in‐time environment. Shows how application of the DEA technique can provide savings in monetary and other measurable terms.

Journal ArticleDOI
TL;DR: A revision and a generalization of the results contained in the only paper so far published on the matter of translation invariance is undertaken by allowing inputs and outputs to take not only zero but negative values, broadening the field of application of the DEA methodology.
Abstract: In this paper, we undertake a revision and a generalization of the results contained in the only paper so far published on the matter of translation invariance by allowing inputs and outputs to take not only zero but negative values. This broadens the field of application of the DEA methodology.
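The property at issue can be checked numerically. A minimal sketch, assuming the classic translation-invariant case (the additive model with the VRS convexity constraint) and made-up data: shifting every output by a constant leaves the optimal slacks unchanged, because with Σλ = 1 the shift cancels on both sides of the output constraints.

```python
# Additive VRS model for DMU o:
#   max 1's_minus + 1's_plus
#   s.t. X lam + s_minus = x_o,  Y lam - s_plus = y_o,  sum(lam) = 1,  all vars >= 0.
import numpy as np
from scipy.optimize import linprog

def additive_slack_sum(X, Y, o):
    """Optimal total slack for DMU o in the additive VRS model."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[np.zeros(n), -np.ones(m + s)]          # maximise total slack
    A_eq = np.block([
        [X, np.eye(m), np.zeros((m, s))],            # X lam + s_minus = x_o
        [Y, np.zeros((s, m)), -np.eye(s)],           # Y lam - s_plus  = y_o
        [np.ones((1, n)), np.zeros((1, m + s))],     # sum(lam) = 1 (VRS)
    ])
    b_eq = np.r_[X[:, o], Y[:, o], 1.0]
    res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + m + s), method="highs")
    return -res.fun

X = np.array([[2.0, 3.0, 6.0]])                      # made-up single input
Y = np.array([[3.0, 5.0, 4.0]])                      # made-up single output
base = additive_slack_sum(X, Y, 2)
shifted = additive_slack_sum(X, Y + 10.0, 2)         # translate all outputs by +10
print(base, shifted)   # the two values coincide
```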

Journal ArticleDOI
TL;DR: Previously used models, such as those used to identify “allocative inefficiencies”, are extended by means of “assurance region” approaches which are less demanding in their information requirements and underlying assumptions.
Abstract: The extensions, new developments and new interpretations for DEA covered in this paper include: (1) new measures of efficiency, (2) new models and (3) new ways of implementing established models with new results and interpretations presented that include treatments of “congestion”, “returns-to-scale” and “mix” and “technical” inefficiencies and measures of efficiency that can be used to reflect all pertinent properties. Previously used models, such as those used to identify “allocative inefficiencies”, are extended by means of “assurance region” approaches which are less demanding in their information requirements and underlying assumptions. New opportunities for research are identified in each section of this chapter. Sources of further developments and possible sources for further help are also suggested with references supplied to other papers that appear in this volume and which are summarily described in this introductory chapter.

Journal ArticleDOI
TL;DR: In this paper, the use of data envelopment analysis for measuring vendor performance and efficiency is presented, and an algorithm by Inselberg is employed for determining points of vendor efficiency on multiple criteria.

Journal ArticleDOI
TL;DR: In this article, the uncertainty associated with technical efficiency estimates from stochastic frontier models is studied empirically and confidence intervals for estimates of technical efficiency levels under different sets of assumptions ranging from the very strong to the relatively weak.
Abstract: This paper is an empirical study of the uncertainty associated with technical efficiency estimates from stochastic frontier models. We show how to construct confidence intervals for estimates of technical efficiency levels under different sets of assumptions ranging from the very strong to the relatively weak. We demonstrate empirically how the degree of uncertainty associated with these estimates relates to the strength of the assumptions made and to various features of the data.
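The paper's own interval constructions are model-specific (they depend on the stochastic frontier assumptions). As a loose, generic illustration of attaching uncertainty to efficiency point estimates — not the authors' method — here is a percentile bootstrap on hypothetical scores:

```python
# Percentile bootstrap CI for the mean of hypothetical efficiency estimates:
# resample firms with replacement, take 2.5% / 97.5% quantiles of the means.
import numpy as np

rng = np.random.default_rng(0)
eff = np.array([0.81, 0.92, 0.75, 0.88, 0.95, 0.70, 0.85, 0.90, 0.78, 0.83])

boot_means = np.array([rng.choice(eff, size=eff.size, replace=True).mean()
                       for _ in range(5000)])
lo, hi = np.quantile(boot_means, [0.025, 0.975])
print(round(lo, 3), round(hi, 3))   # an interval around the sample mean
```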

Journal ArticleDOI
TL;DR: DEA (Data Envelopment Analysis) models and concepts are formulated here in terms of the "P-Models" of Chance Constrained Programming, which are modified to contact the "satisficing concepts" of H.A. Simon, adding as a third category to the efficiency/inefficiency dichotomies that have heretofore prevailed in DEA.
Abstract: DEA (Data Envelopment Analysis) models and concepts are formulated here in terms of the “P-Models” of Chance Constrained Programming, which are then modified to contact the “satisficing concepts” of H.A. Simon. Satisficing is thereby added as a third category to the efficiency/inefficiency dichotomies that have heretofore prevailed in DEA. Formulations include cases in which inputs and outputs are stochastic, as well as cases in which only the outputs are stochastic. Attention is also devoted to situations in which variations in inputs and outputs are related through a common random variable. Extensions include new developments in goal programming with deterministic equivalents for the corresponding satisficing models under chance constraints.

Journal ArticleDOI
Joe Zhu1
TL;DR: In this paper, a weighted non-radial CCR model with preference weights is proposed to consider the decision making unit's preference over the potential adjustments of various inputs and outputs when data envelopment analysis (DEA) is employed.
Abstract: It is important to consider the decision making unit (DMU)'s or decision maker's preference over the potential adjustments of various inputs and outputs when data envelopment analysis (DEA) is employed. On the basis of the so-called Russell measure, this paper develops some weighted non-radial CCR models by specifying a proper set of 'preference weights' that reflect the relative degree of desirability of the potential adjustments of current input or output levels. These input or output adjustments can be either less or greater than one; that is, the approach enables certain inputs actually to be increased, or certain outputs actually to be decreased. It is shown that the preference structure prescribes fixed weights (virtual multiplier bounds) or regions that invalidate some virtual multipliers and hence it generates preferred (efficient) input and output targets for each DMU. In addition to providing the preferred target, the approach gives a scalar efficiency score for each DMU to secure comparability. It is also shown how specific cases of our approach handle non-controllable factors in DEA and measure allocative and technical efficiency. Finally, the methodology is applied to the industrial performance of 14 open coastal cities and four special economic zones in China in 1991. As applied here, the DEA/preference structure model refines the original DEA model's result and eliminates apparently efficient DMUs.

Journal ArticleDOI
TL;DR: In this article, a framework for incorporating ordinal data factors into the standard ratio DEA model is presented for prioritization of RD one in which the ordinal factors are ranked and one where they are not.
Abstract: This paper presents a framework for incorporating ordinal data factors into the standard ratio DEA model. An application involving the prioritization of RD one in which the ordinal factors are ranked and one where they are not. Finally, the issue of selecting a lower bound ɛ on factors is addressed.

Journal ArticleDOI
TL;DR: In this paper, the performance of a sample of French urban transit companies is evaluated using a broad selection of nonparametric reference technologies for two specifications of the production process, and the results corroborate results reported elsewhere: the relevance of ownership, the use of risk-sharing incentives in contracting, the harmful impact of subsidies, etc.
Abstract: The performance of a sample of French urban transit companies is evaluated using a broad selection of nonparametric reference technologies for two specifications of the production process. In particular, the variable returns to scale Data Envelopment Analysis (DEA) models with either strong or weak disposability in both inputs and outputs, and the Free Disposal Hull (FDH) are applied. An extensive comparison of the resulting radial output efficiency measures yields the following major methodological conclusions. First, the location of the efficiency distributions differs substantially depending on the methodology and especially on the output specification considered. The latter differences vanish if the impact of outliers is eliminated. Second, convexity has a stronger influence on the efficient-inefficient dichotomy than allowing for congestion by means of a weakly disposable DEA model. For policy purposes, these efficiency distributions are explained using a Tobit model. The findings corroborate results reported elsewhere: the relevance of ownership, the use of risk-sharing incentives in contracting, the harmful impact of subsidies, etc. Furthermore, the network structure seems to account for some differences in performance. Finally, a novelty in the urban transit context is the indirect monitoring effect of the French earmarked transportation tax.

Journal ArticleDOI
TL;DR: The results demonstrate that, despite their differences, both methods offer a useful range of information regarding the assessment of performance.
Abstract: This paper is concerned with the comparison of two popular non-parametric methodologies—data envelopment analysis and artificial neural networks—as tools for assessing performance. Data envelopment analysis has been established since 1978 as a superior alternative to traditional parametric methodologies, such as regression analysis, for assessing performance. Neural networks have recently been proposed as a method for assessing performance. In this paper, we use a simulated production technology of two inputs and one output for testing the success of the two methods for assessing efficiency. The two methods are also compared on their practical use as performance measurement tools on a set of bank branches, having multiple input and output criteria. The results demonstrate that, despite their differences, both methods offer a useful range of information regarding the assessment of performance.

Journal ArticleDOI
TL;DR: This paper compares data envelopment analysis (DEA) and ratio analysis as alternative tools for assessing the performance of organisational units such as bank branches and schools and finds that provided the performance indicators capture all variables used in the DEA assessment the two methods agree reasonably closely on the performances of units as a whole.
Abstract: This paper compares data envelopment analysis (DEA) and ratio analysis as alternative tools for assessing the performance of organisational units such as bank branches and schools. Such units typically use one or more resources to secure one or more outputs, the inputs and/or outputs being possibly incommensurate. The assessment of District Health Authorities in England on the provision of perinatal care is used as a vehicle for comparing the two methods. The comparison focuses on how well the two methods agree on the performance of a unit relative to that of other units, and on the estimates of targets each method provides for improving the performance of units. It is found that provided the performance indicators capture all variables used in the DEA assessment the two methods agree reasonably closely on the performance of the units as a whole, though this depends on the way the performance indicators are combined into a summary figure of performance. The two methods can disagree substantially on the relative performance of individual units. Ratio analysis, unlike DEA, is not found to be suitable for setting targets so that units can become more efficient. This is mainly due to the fact that DEA takes simultaneous account of all resources and outputs in assessing performance while ratio analysis relates only one resource to one output at a time. However, the two methods can support each other if used jointly. Ratios do provide useful information on the performance of a unit on specific aspects and they can support the communication of DEA results to non-specialists when the two methods agree on performance.
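The one-ratio-at-a-time limitation noted above is easy to see numerically. A toy sketch with made-up data (not the perinatal-care figures): each single output/input ratio crowns a different unit, while a crude DEA-flavoured composite that lets each unit pick its most favourable normalised ratio rates both A and B as efficient:

```python
# Ratio analysis relates one output to one input at a time, so rankings
# conflict across ratios; a best-of-normalised-ratios score resolves this.
import numpy as np

units = ["A", "B", "C"]
inp   = np.array([10.0, 10.0, 10.0])   # single shared input (made-up)
out1  = np.array([8.0, 5.0, 4.0])      # first output (made-up)
out2  = np.array([3.0, 7.0, 4.0])      # second output (made-up)

r1 = out1 / inp                        # unit A best on this ratio
r2 = out2 / inp                        # unit B best on this ratio
print(units[int(np.argmax(r1))], units[int(np.argmax(r2))])   # A B

# Crude DEA-flavoured composite: each unit's better normalised ratio.
score = np.maximum(r1 / r1.max(), r2 / r2.max())
print(score)   # A and B reach 1.0; C does not
```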

Journal Article
J Magnussen1
TL;DR: The distribution of efficiency is found to be unaffected by changes in the specification of hospital output, and both the ranking of hospitals and the scale properties of the technology are found to depend on the choice of output specification.
Abstract: OBJECTIVE. To discuss the usefulness of efficiency measures as instruments of monitoring and resource allocation by analyzing their invariance to changes in the operationalization of hospital production. STUDY SETTING. Norwegian hospitals over the three-year period 1989-1991. STUDY DESIGN. Efficiency is measured using Data Envelopment Analysis (DEA). The distribution of efficiency and the ranking of hospitals is compared across models using various distribution-free tests. DATA COLLECTION. Input and output data are collected by the Norwegian Central Bureau of Statistics. PRINCIPAL FINDINGS. The distribution of efficiency is found to be unaffected by changes in the specification of hospital output. Both the ranking of hospitals and the scale properties of the technology, however, are found to depend on the choice of output specification. CONCLUSION. Extreme care should be taken before resource allocation is based on DEA-type efficiency measures alone. Both the identification of efficient and inefficient hospitals and the cardinal measure of inefficiency will depend on the specification of output. Since the scale properties of the technology also vary with the specification of output, the search for an optimal hospital size may be futile.

Journal ArticleDOI
TL;DR: In this paper, a new technique for assessing the sensitivity and stability of efficiency classifications in Data Envelopment Analysis (DEA) is presented, where an organization's input-outut vector serves as the center for a cell within which the classification remains unchanged under perturbations of the data.
Abstract: A new technique for assessing the sensitivity and stability of efficiency classifications in Data Envelopment Analysis (DEA) is presented. Here developed for the ratio (CCR) model, this technique extends easily to other DEA variants. An organization's input-output vector serves as the center for a cell within which the organization's classification remains unchanged under perturbations of the data. For the l1, l∞ and generalized l∞ norms, the radius of the maximal cell can be computed using linear programming formulations. This radius can be interpreted as a measure of the classification's stability, especially with respect to errors in the data.

Journal ArticleDOI
TL;DR: In this article, the authors examined the impact of fixed effects production functions vis-a-vis stochastic production frontiers on technical efficiency measures and found that the fixed effects technique was superior to the traditional frontier methodology.
Abstract: This article examines the impact of fixed effects production functions vis-a-vis stochastic production frontiers on technical efficiency measures. An unbalanced panel consisting of 96 Vermont dairy farmers for the 1971–1984 period was used in the analysis. The models examined incorporated both time-variant and time-invariant technical efficiency. The major source of variation in efficiency levels across models stemmed from the assumption made concerning the distribution of the one-sided term in the stochastic frontiers. In general, the fixed effects technique was found superior to the stochastic production frontier methodology. Despite the fact that the results of various statistical tests revealed the superiority of some specifications over others, the overall conclusion of the study is that the efficiency analysis was fairly consistent throughout all the models considered.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a set of indicators to determine whether or not the specification of the input and output space is supported by data in the sense that the variation in data is sufficient for estimation of a frontier of the same dimension as the input output space.
Abstract: Data Envelopment Analysis (DEA) employs mathematical programming to measure the relative efficiency of Decision Making Units (DMUs). This paper is concerned with development of indicators to determine whether or not the specification of the input and output space is supported by data in the sense that the variation in data is sufficient for estimation of a frontier of the same dimension as the input output space. Insufficient variation in data implies that some inputs/outputs can be substituted along the efficient frontier but only in fixed proportions. Data thus locally supports variation in a subspace of a lower dimension rather than in the input output space of full dimension. Each segment of the efficient frontier is in this sense subject to local collinearity. Insufficient variation in data provides a bound on admissible disaggregations in cases where substitution in fixed proportions is incompatible with a priori information concerning the production process. A data set incapable of estimating a fro...

Journal ArticleDOI
TL;DR: In this paper, the authors discuss alternative methods for determining returns to scale in DEA, based on Banker's concept of Most Productive Scale Size (MPSS), which is equivalent to the two-stage methods of Fare, Grosskopf and Lovell.