
Showing papers in "Journal of Productivity Analysis in 2007"


Journal ArticleDOI
TL;DR: In this article, a meta-regression analysis of 167 farm-level technical efficiency (TE) studies of developing and developed countries was undertaken; the econometric results suggest that stochastic frontier models generate lower mean TE (MTE) estimates than non-parametric deterministic models, and the primal approach is found to be the most common technological representation.
Abstract: A meta-regression analysis including 167 farm level technical efficiency (TE) studies of developing and developed countries was undertaken. The econometric results suggest that stochastic frontier models generate lower mean TE (MTE) estimates than non-parametric deterministic models, while parametric deterministic frontier models yield lower estimates than the stochastic approach. The primal approach is the most common technological representation. In addition, frontier models based on cross-sectional data produce lower estimates than those based on panel data whereas the relationship between functional form and MTE is inconclusive. On average, studies for animal production show a higher MTE than crop farming. The results also suggest that the studies for countries in Western Europe and Oceania present, on average, the highest levels of MTE among all regions after accounting for various methodological features. In contrast, studies for Eastern European countries exhibit the lowest estimate followed by those from Asian, African, Latin American, and North American countries. Additional analysis reveals that MTEs are positively and significantly related to the average income of the countries in the data set but this pattern is broken by the upper middle income group which displays the lowest MTE.
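
To make the meta-regression setup concrete, here is a minimal sketch in Python: each observation is one published TE study, and its reported mean TE is regressed on dummies for methodological features of the kind discussed above. The data, column names, and variable selection are invented for illustration and are not the authors' dataset.

# A minimal, hypothetical meta-regression sketch: each row is one farm-level
# TE study, and the reported mean TE (MTE) is regressed on dummies describing
# the study's methodology and sample. Data and column names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

studies = pd.DataFrame({
    "mte":            [0.71, 0.64, 0.82, 0.58, 0.77, 0.69, 0.74, 0.61],
    "stochastic":     [1, 1, 0, 0, 1, 0, 1, 0],   # stochastic frontier vs. deterministic
    "panel_data":     [1, 0, 1, 0, 0, 1, 1, 0],   # panel vs. cross-sectional data
    "animal_prod":    [0, 1, 1, 0, 0, 1, 0, 1],   # animal production vs. crop farming
    "western_europe": [0, 0, 1, 1, 0, 0, 1, 0],   # one of several regional dummies
})

# Each coefficient estimates how the corresponding methodological feature
# shifts the reported mean TE, holding the other features fixed.
fit = smf.ols("mte ~ stochastic + panel_data + animal_prod + western_europe",
              data=studies).fit()
print(fit.params)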

476 citations


Journal ArticleDOI
TL;DR: In this article, an alternative method is proposed that incorporates the materials balance concept into the production model in much the same way that price information is normally incorporated; this produces a new environmental efficiency measure that can be decomposed into technical and allocative components.
Abstract: The materials balance condition is a fundamental adding up condition, which essentially says that: “what goes in must come out”. In this paper we argue that a number of the recently developed methods of incorporating pollution measures into standard productive efficiency models may be inconsistent with this fundamental condition. We propose an alternative method that involves the incorporation of the materials balance concept into the production model in a similar manner to which price information is normally incorporated. This produces a new environmental efficiency measure that can be decomposed into technical and allocative components, in a similar manner to the conventional cost efficiency decomposition. The approach is illustrated with the case of phosphorus emission on Belgian pig-finishing farms, using data envelopment analysis (DEA) methods. Our results indicate that a substantial proportion of nutrient pollution on these farms can be abated in a cost reducing manner.
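
To make the analogy concrete, here is a hedged sketch in my own notation, not necessarily the authors' exact formulation: with a and b the per-unit nutrient contents of the inputs x and outputs y, the materials balance fixes the surplus (emission), and replacing input prices by nutrient coefficients in the usual cost-efficiency construction yields an environmental efficiency measure with technical and allocative parts.

% Sketch of a materials-balance-based environmental efficiency measure
% (notation illustrative, not necessarily the paper's exact formulation).
\[
  z \;=\; a'x - b'y \qquad\text{(materials balance: nutrient surplus)}
\]
\[
  EE(y,x) \;=\; \frac{\min\{\,a'\tilde{x} : \tilde{x}\in L(y)\,\}}{a'x}
  \;=\; \underbrace{TE(y,x)}_{\text{technical}}\times\underbrace{EAE(y,x)}_{\text{allocative}},
\]
where L(y) is the input requirement set, in direct analogy with the cost-efficiency
decomposition CE = TE x AE obtained when a is replaced by the input price vector w.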

305 citations


Journal ArticleDOI
TL;DR: The water and sewerage industry of England and Wales was privatized in 1989 and subjected to a new regime of environmental, water quality and RPI+K price cap regulation as mentioned in this paper.
Abstract: The water and sewerage industry of England and Wales was privatized in 1989 and subjected to a new regime of environmental, water quality and RPI+K price cap regulation. This paper estimates a quality-adjusted input distance function with stochastic frontier techniques in order to estimate productivity growth rates for the period 1985–2000. Productivity is decomposed so as to account for the impact of technical change, efficiency change, and scale change. Compared with earlier studies by Saal and Parker [(2000) Managerial Decision Econ 21(6):253–268, (2001) J Regul Econ 20(1):61–90], these estimates allow a more careful consideration of how and whether privatization and the new regulatory regime affected productivity growth in the industry. Strikingly, they suggest that while technical change improved after privatization, productivity growth did not improve, and this was attributable to efficiency losses as firms appear to have struggled to keep up with technical advances after privatization. Moreover, the results also suggest that the excessive scale of the WaSCs contributed negatively to productivity growth.

305 citations


Journal ArticleDOI
TL;DR: This approach leads to the closest targets by means of a single-stage procedure, which is easier to handle than those based on algorithms aimed at identifying all the facets of the efficient frontier.
Abstract: In this paper, we propose a general approach to find the closest targets for a given unit according to a previously specified criterion of similarity. The idea behind this approach is that closer targets determine less demanding levels of operation for the inputs and outputs of the inefficient units to perform efficiently. Similarity can be interpreted as closeness between the inputs and outputs of the assessed unit and the proposed targets, and this closeness can be measured by using either different distance functions or different efficiency measures. Depending on how closeness is measured, we develop several mathematical programming problems that can be easily solved and guarantee to reach the closest projection point on the Pareto-efficient frontier. Thus, our approach leads to the closest targets by means of a single-stage procedure, which is easier to handle than those based on algorithms aimed at identifying all the facets of the efficient frontier.

222 citations


Journal ArticleDOI
TL;DR: In this paper, the authors discuss DEA (Data Envelopment Analysis) and some of its future prospects, including extensions to different objectives such as satisfactory or full efficiency objectives.
Abstract: This paper covers some of the past accomplishments of DEA (Data Envelopment Analysis) and some of its future prospects. It starts with the “engineering-science” definitions of efficiency and uses the duality theory of linear programming to show how, in DEA, they can be related to the Pareto–Koopmans definitions used in “welfare economics” as well as in the economic theory of production. Some of the models that have now been developed for implementing these concepts are then described and properties of these models and the associated measures of efficiency are examined for weaknesses and strengths along with measures of distance that may be used to determine their optimal values. Relations between the models are also demonstrated en route to delineating paths for future developments. These include extensions to different objectives such as “satisfactory” versus “full” (or “strong”) efficiency. They also include extensions from “efficiency” to “effectiveness” evaluations of performances as well as extensions to evaluate social-economic performances of countries and other entities where “inputs” and “outputs” give way to other categories in which increases and decreases are located in the numerator or denominator of the ratio (=engineering-science) definition of efficiency in a manner analogous to the way output (in the numerator) and input (in the denominator) are usually positioned in the fractional programming form of DEA. Beginnings in each of these extensions are noted and the role of applications in bringing further possibilities to the fore is highlighted.
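
For readers unfamiliar with the "ratio" form mentioned at the end of the abstract, a textbook statement of the CCR fractional program and its linearization is sketched below; this is standard DEA notation, not anything specific to this paper.

% The CCR ratio (fractional programming) form of DEA for unit 0, and the
% equivalent multiplier LP obtained by the usual Charnes--Cooper transformation.
\[
  \max_{u,v}\ \frac{u'y_0}{v'x_0}
  \quad\text{s.t.}\quad \frac{u'y_j}{v'x_j}\le 1\ \ (j=1,\dots,n),\qquad u,v\ge 0,
\]
\[
  \equiv\quad \max_{u,v}\ u'y_0
  \quad\text{s.t.}\quad v'x_0 = 1,\quad u'y_j - v'x_j \le 0\ \ (j=1,\dots,n),\qquad u,v\ge 0 .
\]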

157 citations


Journal ArticleDOI
TL;DR: Although WinBUGS may not be that efficient for more complicated models, it does make Bayesian inference with stochastic frontier models easily accessible for applied researchers and its generic structure allows for a lot of flexibility in model specification.
Abstract: Markov chain Monte Carlo (MCMC) methods have become a ubiquitous tool in Bayesian analysis. This paper implements MCMC methods for Bayesian analysis of stochastic frontier models using the WinBUGS package, freely available software. General code for cross-sectional and panel data is presented and various ways of summarizing posterior inference are discussed. Several examples illustrate that analyses with models of genuine practical interest can be performed straightforwardly and model changes are easily implemented. Although WinBUGS may not be that efficient for more complicated models, it does make Bayesian inference with stochastic frontier models easily accessible for applied researchers and its generic structure allows for a lot of flexibility in model specification.
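
The kind of model being estimated is the standard composed-error stochastic frontier; a generic cross-sectional specification, with common distributional choices rather than necessarily the paper's exact ones, is sketched below.

% Generic cross-sectional stochastic frontier (a sketch, not the paper's code):
% composed error v - u, with common choices for the inefficiency distribution.
\[
  y_i \;=\; x_i'\beta + v_i - u_i,\qquad
  v_i \sim N(0,\sigma_v^2),\qquad
  u_i \ge 0 \ \ (\text{e.g. } u_i \sim \mathrm{Exp}(\lambda) \text{ or half-normal}),
\]
\[
  TE_i \;=\; \exp(-u_i),
\]
with priors placed on \(\beta\), \(\sigma_v^2\) and the inefficiency parameters, and posterior
summaries of \(TE_i\) taken from the MCMC draws.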

146 citations


Journal ArticleDOI
TL;DR: In this article, the authors adapt change-point detection results to improve the performance of DEA/FDH estimators in the presence of noise, and show that the procedure works well, and better than the standard DEA/FDH estimators, when the noise is of moderate size in terms of the signal-to-noise ratio.
Abstract: In frontier analysis, most nonparametric approaches (DEA, FDH) are based on envelopment ideas which assume that with probability one, all observed units belong to the attainable set. In these “deterministic” frontier models, statistical inference is now possible, by using bootstrap procedures. In the presence of noise, envelopment estimators could behave dramatically since they are very sensitive to extreme observations that might result only from noise. DEA/FDH techniques would provide estimators with an error of the order of the standard deviation of the noise. This paper adapts some recent results on detecting change points [Hall P, Simar L (2002) J Am Stat Assoc 97:523–534] to improve the performance of the classical DEA/FDH estimators in the presence of noise. We show by simulated examples that the procedure works well, and better than the standard DEA/FDH estimators, when the noise is of moderate size in terms of signal-to-noise ratio. It turns out that the procedure is also robust to outliers. The paper can be seen as a first attempt to formalize stochastic DEA/FDH estimators.

104 citations


Journal ArticleDOI
TL;DR: In this article, the authors examine Turkish banking efficiency in a pre- and post-liberalization environment using DEA and find that liberalization programs were followed by an observable decline in efficiency.
Abstract: This paper examines banking efficiency in a pre- and post-liberalization environment by drawing on the Turkish experience, using DEA. The paper also investigates the scale effect on efficiency. Our findings suggest that liberalization programs were followed by an observable decline in efficiency. Another finding of the study is that the Turkish banking system had a serious scale problem during the study period. The second part of our analysis relied on econometric methods and found that one major reason for such a system-wide efficiency decline has been the growing macroeconomic instability of the Turkish economy in general and the financial sector in particular.

101 citations


Journal ArticleDOI
TL;DR: In this article, the authors present an overview of the main approaches that can be used to improve the discrimination of DEA, including simple methods such as the aggregation of inputs or outputs, the use of longitudinal data, more advanced methods, such as weight restrictions, production trade-offs and unobserved units, and a relatively new method based on the selective proportionality between the inputs and outputs.
Abstract: In some contexts data envelopment analysis (DEA) gives poor discrimination on the performance of units. While this may reflect genuine uniformity of performance between units, it may also reflect lack of sufficient observations or other factors limiting discrimination on performance between units. In this paper, we present an overview of the main approaches that can be used to improve the discrimination of DEA. These include simple methods, such as the aggregation of inputs or outputs and the use of longitudinal data; more advanced methods, such as the use of weight restrictions, production trade-offs and unobserved units; and a relatively new method based on the use of selective proportionality between the inputs and outputs.
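
As a concrete illustration of one of these devices, here is a minimal multiplier-form DEA (input-oriented, constant returns to scale) in Python with a single assurance-region-style weight restriction added. The data, the function name ccr_efficiency, and the particular restriction are invented for illustration and are not taken from the paper; SciPy's linprog is assumed to be available.

# Minimal multiplier-form (CCR, input-oriented) DEA solved as a linear program,
# with one illustrative weight restriction of the kind discussed above.
import numpy as np
from scipy.optimize import linprog

# Inputs X (n x m) and outputs Y (n x s) for five hypothetical DMUs.
X = np.array([[2.0, 1.0], [3.0, 2.0], [4.0, 1.5], [5.0, 3.0], [3.5, 2.5]])
Y = np.array([[10.0, 4.0], [12.0, 6.0], [11.0, 5.0], [16.0, 7.0], [13.0, 5.5]])
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(k, weight_restriction=True):
    """CCR efficiency of DMU k: max u'y_k  s.t.  v'x_k = 1,  u'y_j - v'x_j <= 0."""
    # Decision variables: z = [u_1..u_s, v_1..v_m], all nonnegative.
    c = np.concatenate([-Y[k], np.zeros(m)])           # maximize u'y_k
    A_ub = np.hstack([Y, -X])                          # u'y_j - v'x_j <= 0 for all j
    b_ub = np.zeros(n)
    if weight_restriction:
        # Illustrative assurance-region constraint: u_1 >= 2*u_2, i.e. -u_1 + 2u_2 <= 0.
        row = np.zeros(s + m)
        row[0], row[1] = -1.0, 2.0
        A_ub = np.vstack([A_ub, row])
        b_ub = np.append(b_ub, 0.0)
    A_eq = np.concatenate([np.zeros(s), X[k]]).reshape(1, -1)   # v'x_k = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun                                    # optimal u'y_k = efficiency score

for k in range(n):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")

Tightening or relaxing the restriction changes how many units come out as fully efficient, which is exactly the discrimination effect the abstract refers to.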

98 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present two ways of obtaining numerical values of scale elasticity: an indirect approach using efficiency scores and dual variables for radial projections of inefficient points to the frontier, and a direct approach that is more general and powerful and evaluates scale elasticity numerically at any point on the DEA surface along intersections with planes.
Abstract: The qualitative characterisation of returns to scale in DEA has been a research issue over the last decade. However, quantitative information provides the ultimate information. This paper presents two ways of obtaining numerical values of scale elasticity: an indirect approach using efficiency scores and dual variables for radial projections of inefficient points to the frontier, and a direct approach that is more general and powerful and evaluates scale elasticity numerically at any point on the DEA surface along intersections with planes. The direct and indirect approaches are compared using real data and a very high correspondence is found.
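
For reference, the quantity being evaluated is the textbook scale elasticity at a frontier point; in the single-output case with frontier y = f(x), a standard definition (not the paper's DEA-specific formulas) is:

% Scale elasticity at a frontier point (single-output case): the proportional
% change in output induced by a proportional change in all inputs.
\[
  \varepsilon(x) \;=\; \left.\frac{d\ln f(\mu x)}{d\ln \mu}\right|_{\mu=1}
  \;=\; \sum_{i=1}^{m}\frac{\partial f(x)}{\partial x_i}\,\frac{x_i}{f(x)},
\]
with \(\varepsilon>1\), \(=1\), \(<1\) corresponding to increasing, constant and decreasing
returns to scale, respectively.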

83 citations


Journal ArticleDOI
TL;DR: In this article, the authors used both parametric and nonparametric procedures to identify the apparent source of cost inefficiency in banking and found that inefficiency appears stable over time because it is small relative to industry-wide cost changes occurring concurrently and because technology dispersion is imperfect.
Abstract: Parametric and nonparametric procedures are used to identify the apparent source of cost inefficiency in banking. Inefficiencies of 20–25% from earlier studies are reduced to 1–5% when, in addition to commonly specified cost function influences, variables reflecting banks’ external business environment and industry indicators of “productivity” are added. These productivity indicators explain most of the reduction in bank operating cost over 1992–2001, which was 5 times the reduction in the dispersion of inefficiency. Inefficiency appears stable over time because it is small relative to industry-wide cost changes occurring concurrently and because technology dispersion is imperfect.

Journal ArticleDOI
TL;DR: In this paper, a new mathematical model for efficiency analysis, called DEA-R, is proposed, which combines DEA methodology with an old idea, ratio analysis, treating all possible ratios "output/input" as outputs within the standard DEA model.
Abstract: We propose a new mathematical model for efficiency analysis, which combines DEA methodology with an old idea: Ratio Analysis. Our model, called DEA-R, treats all possible ratios "output/input" as outputs within the standard DEA model. Although DEA and DEA-R generate different summary measures for efficiency, the two measures are comparable. Our mathematical and empirical comparisons establish the validity of the DEA-R model in its own right. The key advantage of DEA-R over DEA is that it allows effective integration of the model with experts' opinions via flexible restrictive conditions on individual "output/input" pairs.

Journal ArticleDOI
TL;DR: In this article, the authors used data from the years 1997-2003 to evaluate the size efficiency, as distinct from scale efficiency, of Indian banks and found that many of the banks are indeed too large in various years.
Abstract: In this paper we use data from the years 1997–2003 to evaluate the size efficiency, as distinct from scale efficiency, of Indian banks. Following Maindiratta [Maindiratta A (1990) J Econ 46:39–56] we consider a bank to be “too large” if breaking it up into a number of smaller units would result in a larger output bundle than what could be produced from the same input by a single bank. When this is the case, the bank is not size efficient. Our analysis shows that many of the banks are, indeed, too large in various years. We also find that often a bank is operating in the region of diminishing returns to scale but is not a candidate for break up.

Journal ArticleDOI
TL;DR: In this paper, the authors developed a method for eco-efficiency analysis of consumer durables that is based on Data Envelopment Analysis (DEA), which measures efficiency in terms of absolute shadow prices that are optimized endogenously within the model to maximize efficiency.
Abstract: We develop a method for eco-efficiency analysis of consumer durables that is based on Data Envelopment Analysis (DEA). In contrast to previous product efficiency studies, we consider the measurement problem from the policy perspective. The innovation of the paper is to measure efficiency in terms of absolute shadow prices that are optimized endogenously within the model to maximize efficiency of the good. Thus, the efficiency measure has a direct economic interpretation as a monetary loss due to inefficiency, expressed in some currency unit. The advantages as well as technical differences between the proposed approach and the traditional production-side methods are discussed in detail. We illustrate the approach by an application to eco-efficiency evaluation of Sport Utility Vehicles.

Journal ArticleDOI
TL;DR: This model allows us to identify determinants of the efficiency orientation, thereby providing useful information that can help researchers to choose between the input-oriented and the output-oriented approaches.
Abstract: In this paper, we estimate parametric input and output distance functions and discuss how to estimate a mixture/latent class model (LCM) involving the output and input distance functions in the context of multi-input and multi-output production technology. The proposed technique is applied to a panel data on European Railways (1971–1994). This model allows us to identify determinants of the efficiency orientation, thereby providing useful information that can help researchers to choose between the input and the output-oriented approaches. In addition, we develop cross-indices that can be used to compute input (output) technical inefficiency from the estimates of output (input) distance function.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a way of calculating global Malmquist indices and global frontier shift indices which provides a better estimation of the true frontier shift and furthermore is easy to calculate.
Abstract: The Malmquist index is a measure of productivity changes, of which an important component is the frontier shift or technological change. Often technological change can be viewed as a global phenomenon, and therefore individual or local measures of technological changes are aggregated into an overall measure, traditionally using geometric means. In this paper we propose a way of calculating global Malmquist indices and global frontier shift indices which provides a better estimation of the true frontier shift and furthermore is easy to calculate. Using simulation studies we show how this method outperforms the traditional aggregation approach, especially for sparsely populated production possibility sets and for frontiers that also change shape over time. Furthermore, our global indices can be used for unbalanced panels without disregarding any information. Finally, we show how the global indices are meaningful for calculating differences between frontiers from different groups rather than different time periods as illustrated in a small case study of bank branches in different countries.
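
For reference, the object whose frontier-shift component is being aggregated is the standard adjacent-period Malmquist index; a textbook output-oriented statement (not the authors' global variant) is:

% Standard adjacent-period output-oriented Malmquist index, with D^t an output
% distance function; the paper replaces the period-specific frontiers by a
% single global frontier when aggregating.
\[
  M(x^t,y^t,x^{t+1},y^{t+1}) \;=\;
  \underbrace{\frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})}}_{\text{efficiency change}}
  \;\times\;
  \underbrace{\left[\frac{D^{t}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t+1},y^{t+1})}\cdot
                    \frac{D^{t}(x^{t},y^{t})}{D^{t+1}(x^{t},y^{t})}\right]^{1/2}}_{\text{frontier shift (technological change)}} .
\]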

Journal ArticleDOI
TL;DR: In this paper, a flexible time-varying stochastic frontier model is proposed and applied to Indonesian rice farms, and the changes in the efficiency rankings of farms over time demonstrate the model's flexibility.
Abstract: This paper proposes a flexible time-varying stochastic frontier model. Similarly to Lee and Schmidt [1993, In: Fried H, Lovell CAK, Schmidt S (eds) The measurement of productive efficiency: techniques and applications. Oxford University Press, Oxford], we assume that individual firms’ technical inefficiencies vary over time. However, the model, which we call the “multiple time-varying individual effects” model, is more general in that it allows multiple factors determining firm-specific time-varying technical inefficiencies. This allows the temporal pattern of inefficiency to vary over firms. The number of such factors can be consistently estimated. The model is applied to data on Indonesian rice farms, and the changes in the efficiency rankings of farms over time demonstrate the model’s flexibility.
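
A hedged sketch of the inefficiency structure, in my own notation rather than the authors': Lee and Schmidt let inefficiency follow a single time pattern scaled by a firm effect, and the generalization allows several such factors, so that

% Sketch of the "multiple time-varying individual effects" structure (notation
% mine): Lee--Schmidt corresponds to p = 1; with p > 1 the temporal pattern of
% inefficiency can differ across firms.
\[
  y_{it} = x_{it}'\beta + v_{it} - u_{it},
  \qquad
  u_{it} \;=\; \sum_{j=1}^{p}\theta_{jt}\,\delta_{ij},
\]
where the \(\theta_{jt}\) are common time-varying factors and the \(\delta_{ij}\) are firm-specific loadings.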

Journal ArticleDOI
TL;DR: In this paper, the authors consider the case when efficient operation of individual economic units does not necessarily imply efficiency for a group of economic units and develop new group-wise efficiency indexes that measure the extent to which the performance of a group economic units can be enhanced, even if all these units are individually efficient.
Abstract: In this work we consider the case when efficient operation of individual economic units does not necessarily imply efficiency for a group of these units. Merging theoretical findings of Li and Ng (Int Adv Econ Res, 1995, 1, 377.) and Fare and Zelenyuk (Eur J Oper Res, 2003, 146, 615), we develop new group-wise efficiency indexes that measure the extent to which the performance of a group of economic units can be enhanced, even if all these units are individually efficient. The existence of such potential improvement is attributed to non-optimal allocation of inputs across the individual economic units from the point of view of a group of these units.

Journal ArticleDOI
TL;DR: In this article, the authors analyzed trends in labour productivity and its underlying determinants in a panel of OECD countries from 1979 to 2002 using data envelopment analysis to estimate a Malmquist measure of multifactor productivity (MFP) change.
Abstract: This paper analyses trends in labour productivity and its underlying determinants in a panel of OECD countries from 1979 to 2002. Data Envelopment Analysis (DEA) is used to estimate a Malmquist measure of multifactor productivity (MFP) change. We decompose the growth in labour productivity into (i) net technological change, (ii) input-biased technical change (IBTC), (iii) efficiency change and (iv) capital accumulation. We analyse the effect of each of these factors in the transition towards the equilibrium growth paths of both labour productivity and per capita GDP for the OECD countries, controlling for the effects of different policies and institutions. The results indicate that on average gaps in productivity or income levels are narrowing, although there is no evidence to suggest that the entire OECD area comprises a single convergence “club”. Using kernel estimation methods we find that labour productivity and per capita GDP are settling toward a twin-peak (bimodal) distribution. Panel unit root tests over an extended (1960–2001) period provide general support for the convergence hypothesis. Analysis of the contributions of productivity growth within industries and sectoral composition changes shows that aggregate productivity change is predominantly driven by ‘net’ within-sector effects, with very little contribution emerging from sectoral shifts (the ‘in-between’ static or dynamic effects resulting from higher or above-average productivity industries gaining employment shares or low-productivity industries losing shares).
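
Schematically, decompositions of this kind take the following multiplicative form; this is a stylized statement in the spirit of the literature, not the paper's exact expressions.

% Stylized labour-productivity growth decomposition (not the paper's exact
% formulas): with y = Y/L,
\[
  \frac{y_{t+1}}{y_t}
  \;=\;
  \underbrace{\frac{TE_{t+1}}{TE_t}}_{\text{efficiency change}}
  \times
  \underbrace{TC_{t,t+1}}_{\text{technological change}}
  \times
  \underbrace{KACC_{t,t+1}}_{\text{capital deepening}},
\]
where the technological-change term can be further split into a neutral ("net") shift and an
input-biased component, giving the four-way decomposition listed in the abstract.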

Journal ArticleDOI
TL;DR: In this paper, the authors address the question of Data Envelopment Analysis (DEA) evaluation of efficiency when aggregate cost or revenue data must be used, and show that the DEA technical inefficiency measure using total revenues as the single output variable or total costs as a single input variable equals the aggregate technical and allocative inefficiency.
Abstract: In this paper, we address the question of Data Envelopment Analysis (DEA) evaluation of efficiency when aggregate cost or revenue data must be used. We show that the DEA technical inefficiency measure using total revenues as the single output variable or total costs as the single input variable equals the aggregate technical and allocative inefficiency. We employ this result to estimate allocative inefficiency and construct statistical tests of the null hypothesis of no allocative inefficiency analogous to those of the null hypothesis of no scale inefficiency. We illustrate our method using revenue and personnel data for the top U.S. public accounting firms over 1995–1998. Our empirical results indicate the existence of statistically significant allocative inefficiency in the public accounting industry.

Journal ArticleDOI
TL;DR: In this article, the authors analyzed the efficiency of the public street lighting service in Spanish towns by means of DEA methodology, pursuing two objectives: to estimate the technical efficiency achieved and to discover whether differences in efficiency can be explained by the type of management chosen, whether public or private.
Abstract: Efficiency evaluation is very important in the municipal realm because of its impact on the people’s standard of living. However, in most cases the value of public output is hardly significant, and therefore measurement is necessarily limited to estimating technical efficiency, which is calculated using physical inputs and outputs. A major part of municipal services can be provided through different types of public management. This plurality of options lends greater relevance to the evaluation. This paper analyzes efficiency in the public street lighting service in Spanish towns, by means of DEA methodology, pursuing two objectives: to estimate the technical efficiency achieved and to discover whether differences in efficiency can be explained by the type of management chosen, whether public or private. The results of the analysis allow us to confirm that there is a significant relationship between the variables defined as inputs into the process and efficiency. However, in some cases the relationship is not very significant for the variables considered as outputs. It was also detected that the factors defining the characteristics of the environment and the type of management, whether public or private, do not have a statistically significant impact on efficiency levels.

Journal ArticleDOI
TL;DR: In this article, the authors assess bank branch profitability and productivity in seven national branch networks owned and operated by a multi-national financial services corporation and find that countries in which branch performance is quite consistent amongst domestic branches are less productive and less profitable when compared to other countries that have more disparity in their efficiency scores.
Abstract: Increasingly globalized financial markets with considerable activity in the multinational sector have created the need to understand inter-country bank branch performance. This topic is relatively unstudied, primarily due to the immense difficulty encountered in gathering reliable data. Fortunately, we have been able to obtain data on a group of banks operating in one geographical market area, but in different countries. In this paper we critically assess bank branch profitability and productivity in seven national branch networks owned and operated by a multi-national financial services corporation. The corporate head office (owner) imposes its management philosophy equally on all of its subsidiaries, thus removing executive managerial and corporate disparity. Results suggest that countries in which branch performance is quite consistent amongst domestic branches are less productive and less profitable when compared to other countries that have more disparity in their efficiency scores. In addition, we discovered that, surprisingly, branches do not have to be productive in order to be profitable and this led us to somewhat of a major breakthrough in inter-country branch analysis. Significant managerial advice may be derived from these results vis-a-vis trans-national benchmarking and opportunity for performance improvements both at the branch level and nationally as well.

Journal ArticleDOI
Mike Smet
TL;DR: A waiting time indicator to proxy hospital standby capacity, derived from queuing theory, is incorporated into a multi-product translog cost function for Belgian general care hospitals; it improves on the conventionally used (inverse of the) occupancy rate.
Abstract: Since demand for hospital services is subject to substantial variability, the relationship between uncertain demand, excess capacity, hospital costs and performance should be investigated thoroughly. In this paper a waiting time indicator to proxy hospital standby capacity is incorporated into a multi-product translog cost function for Belgian general care hospitals. The indicator is derived from queuing theory and improves on the conventionally used (inverse of the) occupancy rate. The multi-product stochastic frontier specification allows calculation of cost elasticities and marginal cost of seven hospital departments, as well as the degree of economies of scale and scope and enables identification of differences in efficiency.

Journal ArticleDOI
TL;DR: In this article, the efficiency of water suppliers in rural areas of East and West Germany was investigated, and a non-radial measure of input specific allocative inefficiency was used to reduce the distributional dependency with respect to the inefficiency parameters.
Abstract: This contribution investigates the efficiency of water suppliers in rural areas of East and West Germany. A non-radial measure of input specific allocative inefficiency is used to reduce the distributional dependency with respect to the inefficiency parameters. It is based on the demand system of a flexible cost function for the variable inputs labour, energy and chemicals modelled by applying a modified symmetric generalized McFadden functional form. Concavity restrictions, as required by economic theory, are imposed. The analysis reveals that efforts towards increasing suppliers’ allocative efficiency should focus on the relatively inefficient usage of the input chemicals. The input specific allocative model specification was found to be superior to the overall allocative specification.

Journal ArticleDOI
TL;DR: In this paper, the effects of social barriers to communication on productivity and capital accumulation were investigated in an optimal growth model and it was shown that such barriers reduce both transitory and steady-state levels of total factor productivity, per capita consumption and reproducible capital.
Abstract: The paper contributes to the explanation of the large differences in cross-country productivity performance by modelling and testing the effects of social barriers to communication on productivity and capital accumulation. In an optimal growth model, social barriers to communication, which impede the formation of knowledge connections, are shown to reduce both transitory and steady-state levels of total factor productivity (TFP), per capita consumption and reproducible capital. Empirical testing yields a robust and theoretically consistent result: linguistic barriers to communication reduce productivity and capital accumulation. The findings provide an explanation for cross-country differences in TFP, and fresh insights into how productivity 'catch up' may be initiated.

Journal ArticleDOI
TL;DR: In this article, a method for mutual fund performance measurement and best-practice benchmarking is proposed that endogenously identifies a dominating benchmark portfolio for each evaluated mutual fund; the dominating benchmarks provide information about efficiency improvement potential as well as portfolio strategies for achieving it.
Abstract: We propose a method for mutual fund performance measurement and best-practice benchmarking, which endogenously identifies a dominating benchmark portfolio for each evaluated mutual fund. Dominating benchmarks provide information about efficiency improvement potential as well as portfolio strategies for achieving them. Portfolio diversification possibilities are accounted for by using Data Envelopment Analysis (DEA). Portfolio risk is accounted for in terms of the full return distribution by utilizing Stochastic Dominance (SD) criteria. The approach is illustrated by an application to US based environmentally responsible mutual funds.

Journal ArticleDOI
TL;DR: In this paper, the authors study the construction of confidence intervals for efficiency levels of individual firms in stochastic frontier models with panel data and propose a simple parametric alternative in which one acts as if the identity of the best firm is known.
Abstract: We study the construction of confidence intervals for efficiency levels of individual firms in stochastic frontier models with panel data. The focus is on bootstrapping and related methods. We start with a survey of various versions of the bootstrap. We also propose a simple parametric alternative in which one acts as if the identity of the best firm is known. Monte Carlo simulations indicate that the parametric method works better than the percentile bootstrap, but not as well as bootstrap methods that make bias corrections. All of these methods are valid only for large time-series sample size (T), and correspondingly none of the methods yields very accurate confidence intervals except when T is large enough that the identity of the best firm is clear. We also present empirical results for two well-known data sets.
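
The setting is the familiar fixed-effects panel frontier in which inefficiency is measured relative to the best firm in the sample; a sketch of a common formulation (not necessarily the authors' exact estimator) is:

% Common fixed-effects panel frontier setup (a sketch): inefficiency is measured
% relative to the best estimated firm effect, which is why accuracy hinges on a
% large T that makes the identity of the best firm clear.
\[
  y_{it} = \alpha_i + x_{it}'\beta + v_{it},
  \qquad
  \hat{u}_i = \max_{j}\hat{\alpha}_j - \hat{\alpha}_i,
  \qquad
  \widehat{TE}_i = \exp(-\hat{u}_i),
\]
and confidence intervals for \(TE_i\) are built by bootstrapping this procedure (percentile or
bias-corrected) or, in the parametric alternative, by acting as if the identity of the best firm
were known.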

Journal ArticleDOI
TL;DR: In this article, the weak axiom of profit maximization (WAPM) is applied to the problem of determining the profit maximizing input-output bundle of a firm without data on prices.
Abstract: Determining the profit maximizing input–output bundle of a firm requires data on prices. This paper shows how endogenously determined shadow prices can be used in place of actual prices to obtain the optimal input–output bundle where the firm’s shadow profit is maximized. This approach amounts to an application of the Weak Axiom of Profit Maximization (WAPM) formulated by Varian [(1984) The Non-parametric approach to production analysis. Econometrica 52:3 (May) 579–597] based on shadow prices rather than actual prices. At these shadow prices, the shadow profit of a firm is zero. The maximum shadow profit that could have been attained at some other input–output bundle is shown to be a measure of the inefficiency of the firm. Because the benchmark input–output bundle is always an observed bundle from the data, it can be determined without having to solve any elaborate programming problem.
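
For reference, WAPM requires each observed choice to earn at least as much profit at its own prices as any other observed bundle; stated with shadow prices in place of market prices, as the abstract describes (a sketch in my notation):

% The Weak Axiom of Profit Maximization (Varian 1984), written with shadow
% prices (p~_t, w~_t) replacing observed market prices.
\[
  \tilde{p}_t' y_t - \tilde{w}_t' x_t
  \;\ge\;
  \tilde{p}_t' y_s - \tilde{w}_t' x_s
  \qquad \text{for all observed bundles } (x_s, y_s).
\]
Since the firm's own shadow profit is zero at these prices, the maximum shadow profit attainable
at some other observed bundle measures how far the firm falls short, which is the inefficiency
measure described in the abstract.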

Journal ArticleDOI
TL;DR: This article showed that the Hotelling-Lau elasticity of substitution inherits all of the failings of the Allen-Uzawa elasticity identified by Blackorby and Russell.
Abstract: We show that the Hotelling–Lau elasticity of substitution, an extension of the Allen–Uzawa elasticity to allow for optimal output-quantity (or utility) responses to changes in factor prices, inherits all of the failings of the Allen–Uzawa elasticity identified by Blackorby and Russell [(1989) Am Econ Rev 79: 882–888]. An analogous extension of the Morishima elasticity of substitution to allow for output quantity changes preserves the salient properties of the original Hicksian notion of elasticity of substitution.
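
For reference, the constant-output elasticities being compared are defined as follows (standard definitions; the Hotelling–Lau and extended Morishima versions additionally let output, or utility, respond optimally to the factor-price change):

% Allen--Uzawa versus Morishima elasticities of substitution, with C(w,y) the
% cost function, C_i = dC/dw_i, s_j the cost share of input j, and
% e_ij = d ln x_i(w,y) / d ln w_j the constant-output cross-price elasticity.
\[
  \sigma^{A}_{ij} \;=\; \frac{C(w,y)\,C_{ij}(w,y)}{C_i(w,y)\,C_j(w,y)}
  \;=\; \frac{\varepsilon_{ij}}{s_j},
  \qquad
  M_{ij} \;=\; \varepsilon_{ij} - \varepsilon_{jj}.
\]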

Journal ArticleDOI
TL;DR: This work develops an approach that aims to capture the time lag between the outputs and the inputs in assigning the efficiency values to DMUs and proposes using weight restrictions in conjunction with the model.
Abstract: Data Envelopment Analysis (DEA) is a methodology that computes efficiency values for decision making units (DMU) in a given period by comparing the outputs with the inputs. In many applications, inputs and outputs of DMUs are monitored over time. There might be a time lag between the consumption of inputs and the production of outputs. We develop an approach that aims to capture the time lag between the outputs and the inputs in assigning the efficiency values to DMUs. We propose using weight restrictions in conjunction with the model. Our computational results on randomly generated problems demonstrate that the developed approach works well under a large variety of experimental conditions. We also apply our approach on a real data set to evaluate research institutions.