
Showing papers in "Journal of Productivity Analysis in 1999"


Journal ArticleDOI
TL;DR: In this article, the authors define a statistical model allowing determination of the statistical properties of the nonparametric estimators in the multi-output and multi-input case, and provide the asymptotic sampling distribution of the FDH estimator in a multivariate setting and of the DEA estimator for the bivariate case.
Abstract: Efficiency scores of firms are measured by their distance to an estimated production frontier. The economic literature proposes several nonparametric frontier estimators based on the idea of enveloping the data (FDH and DEA-type estimators). Many have claimed that FDH and DEA techniques are non-statistical, as opposed to econometric approaches where particular parametric expressions are posited to model the frontier. We can now define a statistical model allowing determination of the statistical properties of the nonparametric estimators in the multi-output and multi-input case. New results provide the asymptotic sampling distribution of the FDH estimator in a multivariate setting and of the DEA estimator in the bivariate case. Sampling distributions may also be approximated by bootstrap distributions in very general situations. Consequently, statistical inference based on DEA/FDH-type estimators is now possible. These techniques allow correction for the bias of the efficiency estimators and estimation of confidence intervals for the efficiency measures. This paper summarizes the results which are now available, and provides a brief guide to the existing literature. Emphasizing the role of hypotheses and inference, we show how the results can be used or adapted for practical purposes.
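To make the inference pipeline described here concrete, below is a minimal sketch (not the authors' code) of an input-oriented, variable-returns-to-scale DEA score computed by linear programming, followed by a naive bootstrap of that score. The data and all names are hypothetical, and the naive resampling is for illustration only: as the literature surveyed in this paper shows, a consistent bootstrap must smooth the resampling rather than simply redraw observations.

```python
# Illustrative sketch only (hypothetical data, not the authors' implementation).
import numpy as np
from scipy.optimize import linprog

def dea_input_vrs(X, Y, o):
    """Input-oriented VRS (BCC) efficiency score of DMU o.
    X: (n, m) inputs, Y: (n, s) outputs; rows are DMUs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta over [theta, lambda]
    A_ub, b_ub = [], []
    for i in range(m):                               # sum_k lam_k X[k,i] <= theta * X[o,i]
        A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
    for r in range(s):                               # sum_k lam_k Y[k,r] >= Y[o,r]
        A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
    A_eq = [np.r_[0.0, np.ones(n)]]                  # VRS: lambdas sum to one
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.array(A_eq), b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                                   # theta in (0, 1]

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(50, 2))
Y = (X ** 0.4).sum(axis=1, keepdims=True) * rng.uniform(0.5, 1.0, size=(50, 1))
theta_hat = dea_input_vrs(X, Y, o=0)

# Naive bootstrap of the score for DMU 0 -- illustration only: a consistent
# procedure smooths the draws (e.g. Simar-Wilson) instead of plain resampling.
boot = []
for _ in range(200):
    idx = rng.integers(0, 50, size=50)
    Xb = np.vstack([X[idx], X[:1]]); Yb = np.vstack([Y[idx], Y[:1]])
    boot.append(dea_input_vrs(Xb, Yb, o=50))         # evaluated unit appended last
bias = np.mean(boot) - theta_hat
print(theta_hat, theta_hat - bias)                   # score and bias-corrected score
```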

1,099 citations


Journal ArticleDOI
TL;DR: In this paper, the additive model of DEA is developed in association with a new measure of efficiency referred to as RAM (Range Adjusted Measure) and the need for separately treating input oriented and output oriented approaches to efficiency measurement is eliminated because additive models effect their evaluations by maximizing distance from the efficient frontier (in l1, or weighted l1, measure) and thereby simultaneously maximize outputs and minimize inputs.
Abstract: Generalized Efficiency Measures (GEMS) for use in DEA are developed and analyzed in a context of differing models where they might be employed. The additive model of DEA is accorded a central role and developed in association with a new measure of efficiency referred to as RAM (Range Adjusted Measure). The need for separately treating input oriented and output oriented approaches to efficiency measurement is eliminated because additive models effect their evaluations by maximizing distance from the efficient frontier (in l1, or weighted l1, measure) and thereby simultaneously maximize outputs and minimize inputs. Contacts with other models and approaches are maintained with theorems and accompanying proofs to ensure the validity of the thus identified relations. New criteria are supplied, both managerial and mathematical, for evaluating proposed measures. The concept of “approximating models” is used to further extend these possibilities. The focus of the paper is on the “physical” aspects of performance involved in “technical” and “mix” inefficiencies. However, an Appendix shows how “overall,” “allocative” and “technical” inefficiencies may be incorporated in additive models.
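As a concrete illustration of the additive model and its range-adjusted score as described above, the following sketch solves the additive LP for one DMU and converts the optimal slacks into a RAM value. The formulation follows the abstract's verbal description; the normalization details and data are my assumptions, not the authors' code.

```python
# Hedged sketch of the additive DEA model with a Range Adjusted Measure (RAM).
# Assumes nonzero input/output ranges across the sample.
import numpy as np
from scipy.optimize import linprog

def ram_score(X, Y, o):
    """RAM efficiency of DMU o: 1 - (1/(m+s)) * sum of range-adjusted slacks."""
    n, m = X.shape; s = Y.shape[1]
    Rx = X.max(0) - X.min(0)                 # input ranges
    Ry = Y.max(0) - Y.min(0)                 # output ranges
    # decision vector: [lambda (n), s_minus (m), s_plus (s)]; maximise slacks
    c = np.r_[np.zeros(n), -1.0 / ((m + s) * Rx), -1.0 / ((m + s) * Ry)]
    A_eq = np.zeros((m + s + 1, n + m + s)); b_eq = np.zeros(m + s + 1)
    for i in range(m):                       # X'lambda + s_minus = x_o
        A_eq[i, :n] = X[:, i]; A_eq[i, n + i] = 1.0; b_eq[i] = X[o, i]
    for r in range(s):                       # Y'lambda - s_plus = y_o
        A_eq[m + r, :n] = Y[:, r]; A_eq[m + r, n + m + r] = -1.0; b_eq[m + r] = Y[o, r]
    A_eq[-1, :n] = 1.0; b_eq[-1] = 1.0       # convexity (VRS)
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (n + m + s))
    return 1.0 + res.fun                     # res.fun = -(max slack sum); RAM = 1 - max

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, (20, 2)); Y = rng.uniform(1, 5, (20, 1))
print(ram_score(X, Y, o=0))                  # 1.0 means zero slacks: RAM-efficient
```

Because the objective maximizes input and output slacks jointly, no orientation choice is needed, which is the point the abstract makes.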

632 citations


Journal ArticleDOI
TL;DR: A nonparametric, linear programming, frontier procedure for obtaining a measure of managerial efficiency that controls for exogenous features of the operating environment is introduced.
Abstract: The ability of a production unit to transform inputs into outputs is influenced by its technical efficiency and external operating environment. This paper introduces a nonparametric, linear programming, frontier procedure for obtaining a measure of managerial efficiency that controls for exogenous features of the operating environment. The approach also provides statistical tests of the effects of external conditions on the efficient use of each individual input (for an input oriented model) or for each individual output (for an output oriented model). The procedure is illustrated for a sample of nursing homes.

441 citations


Journal ArticleDOI
TL;DR: In this paper, the authors measured the efficiency of international airlines using stochastic frontier production functions adjusted to account for environmental influences such as network conditions, geographical factors, etc., and found that Asian/Oceanic airlines are more efficient than European and North American airlines.
Abstract: The principal aim of this paper is to measure the efficiency of international airlines. We obtain measures of technical efficiency from stochastic frontier production functions which have been adjusted to account for environmental influences such as network conditions, geographical factors, etc. We observe that two alternative approaches to this problem have been proposed in the efficiency measurement literature. One assumes that the environmental factors influence the shape of the technology while the other assumes that they directly influence the degree of technical inefficiency. In this paper we compare the results obtained when using these two approaches. The two sets of results provide similar rankings of airlines but suggest differing degrees of technical inefficiency. Both sets of results also suggest that Asian/Oceanic airlines are technically more efficient than European and North American airlines but that the differences are essentially due to more favourable environmental conditions. Nevertheless, it is among Asian companies that the major improvements in managerial efficiency (technical efficiency with environmental factors netted out) took place over the sample period (1977–1990).
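The two specifications being compared can be written schematically as follows (notation mine; the paper's exact functional forms may differ):

```latex
% (1) environmental variables z shift the frontier (the technology) itself:
\ln y_{it} = f(x_{it};\beta) + z_{it}'\gamma + v_{it} - u_{it},
\qquad u_{it} \sim N^{+}(0,\sigma_u^2)

% (2) environmental variables z shift the inefficiency distribution:
\ln y_{it} = f(x_{it};\beta) + v_{it} - u_{it},
\qquad u_{it} \sim N^{+}(z_{it}'\delta,\sigma_u^2)
```

Under (1) the environment changes the attainable frontier, so the measured inefficiency is net of environmental conditions; under (2) the environment drives the inefficiency term directly. That difference is why the two approaches can rank airlines similarly yet imply different inefficiency levels.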

315 citations


Journal ArticleDOI
TL;DR: In this paper, an integrated microeconomic framework for IT productivity and efficiency assessment using developments in production economics was developed and applied to a dataset used in prior research with mixed results to obtain new evidence regarding IT contribution.
Abstract: We reexamine the “Information Technology (IT) productivity paradox” from the standpoints of theoretical basis, measurement issues and potential inefficiency in IT management. Two key objectives are: (i) to develop an integrated microeconomic framework for IT productivity and efficiency assessment using developments in production economics, and (ii) to apply the framework to a dataset used in prior research with mixed results to obtain new evidence regarding IT contribution. Using a stochastic frontier with a production economics framework involving the behavioral assumptions of profit maximization and cost minimization, we obtain a unified basis to assess both productivity and efficiency impacts of IT investments. The integrated framework is applied to a manufacturing database spanning 1978–1984. While previous productivity research with this dataset found mixed results regarding the contribution from IT capital, we show the negative marginal contribution of IT found in an important prior study is attributable primarily to the choices of the IT deflator and modeling technique. Further, by ignoring the potential inefficiency in IT investment and management, studies that have reported positive results may have significantly underestimated the true contribution of IT. This positive impact of IT is consistent across multiple model specifications, estimation techniques and capitalization methods. The stochastic production frontier analysis shows that while there were significant technical, allocative and scale inefficiencies, the inefficiencies declined as IT intensity increased. Given that the organizational units in our sample increased their IT intensity during the time period covered by the study, management was taking a step in the right direction by increasing the IT share of capital inputs. Our results add to a small body of MIS literature which reports significant positive returns from IT investments.

151 citations


Journal ArticleDOI
TL;DR: In this paper, a method is developed that determines the least-norm projection from an inefficient DMU to the efficient frontier in both the input and output space simultaneously, and the notion of the observable frontier and its subsequent projection is introduced.
Abstract: Data Envelopment Analysis (DEA) has been widely studied in the literature since its inception in 1978. The methodology behind the classical DEA, the oriented method, is to hold inputs (outputs) constant and to determine how much of an improvement in the output (input) dimensions is necessary in order to become efficient. This paper extends this methodology in two substantive ways. First, a method is developed that determines the least-norm projection from an inefficient DMU to the efficient frontier in both the input and output space simultaneously; second, the notion of the “observable” frontier and its subsequent projection is introduced. The observable frontier is the portion of the frontier that has been experienced by other DMUs (or convex combinations of such) and thus, the projection onto this portion of the frontier guarantees a recommendation that has already been demonstrated by an existing DMU or a convex combination of existing DMUs. A numerical example is used to illustrate the importance of these two methodological extensions.
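A minimal sketch of the least-norm idea, assuming the projection target is restricted to dominating convex combinations of observed DMUs (the "observable" portion discussed above). The solver, the dominance constraints, and the data are my assumptions, not the authors' method in detail.

```python
# Sketch: least-norm projection of DMU o onto convex combinations of observed
# DMUs that use no more input and produce no less output.
import numpy as np
from scipy.optimize import minimize

def least_norm_projection(X, Y, o):
    n = X.shape[0]
    x_o, y_o = X[o], Y[o]
    def objective(lam):      # squared distance in input and output space jointly
        return np.sum((X.T @ lam - x_o) ** 2) + np.sum((Y.T @ lam - y_o) ** 2)
    cons = [
        {"type": "eq",   "fun": lambda lam: lam.sum() - 1.0},   # convexity
        {"type": "ineq", "fun": lambda lam: x_o - X.T @ lam},   # no more input
        {"type": "ineq", "fun": lambda lam: Y.T @ lam - y_o},   # no less output
    ]
    lam0 = np.full(n, 1.0 / n)
    res = minimize(objective, lam0, constraints=cons, bounds=[(0, 1)] * n)
    return X.T @ res.x, Y.T @ res.x          # projected input and output bundles

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, (15, 2)); Y = rng.uniform(1, 5, (15, 2))
x_proj, y_proj = least_norm_projection(X, Y, o=3)
print(x_proj, y_proj)                        # both bundles move simultaneously
```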

131 citations


Journal ArticleDOI
TL;DR: In this paper, a new approach where potential improvements are used to guide the selection of reference plans is proposed and an associated translation invariant, strictly monotonic and continuous efficiency index is suggested.
Abstract: Efficiency evaluation of a Decision Making Unit (DMU) involves two issues: 1) selection of an appropriate reference plan against which to evaluate the DMU and 2) measurement of performance slack. In the literature, these issues are mixed in one and the same operation, but we argue that separating them has theoretical as well as practical advantages. We provide an axiomatic characterization of the implicit Farrell selection. This approach ignores important aspects of the technology by focusing on proportional variations in inputs (or outputs). We propose a new approach where potential improvements are used to guide the selection of reference plans. A characterization of this approach is provided and an associated translation invariant, strictly monotonic and continuous efficiency index is suggested.

130 citations


Journal ArticleDOI
TL;DR: In this article, the authors deal with estimation of primal panel data models of production risk, focusing on measurement of the risk properties of inputs and productivity growth, and find that the increase in mean output dominates the increase in output risk.
Abstract: This paper deals with estimation of primal panel data models of production risk, focusing on measurement of risk properties of inputs and productivity growth. Under production risk one should estimate technical change separately for the deterministic part and the risk part of the technology, since risk-averse producers will take into account both the mean and variance of output when they rank alternative technologies. For a panel of Norwegian salmon farms, fish feed and fish input are found to increase output risk, while labor has a risk-decreasing effect on output. In the analysis of technical change by the first-order stochastic dominance criterion, the increase in mean output dominates the increase in output risk.
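The separation of the deterministic part and the risk part can be written, in a Just-Pope style specification (my notation; the authors' exact model may differ), as:

```latex
% Mean and risk parts of the technology are separated:
y = f(x;\beta) + g(x;\gamma)\,\varepsilon,
\qquad E[\varepsilon]=0,\; V[\varepsilon]=1,
% so that V[y \mid x] = g(x;\gamma)^2, and input k is risk-increasing
% (here, feed and fish input) or risk-decreasing (here, labor) according as
\frac{\partial g(x;\gamma)}{\partial x_k} \gtrless 0 .
```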

129 citations


Journal ArticleDOI
TL;DR: In this article, a commensurable Hölder distance function is introduced, which is invariant with respect to a change in the units of measurement, and it is shown that the Debreu-Farrell measure is a special case of the Hölder distance function.
Abstract: In this paper we intend to establish relations between the way efficiency is measured in the literature on efficiency analysis and the notion of distance in topology. In particular we study the Hölder norms and their relationship to the shortage function (Luenberger (1995)) and the directional distance function (Chambers, Chung and Färe (1995–96)). Along this line, we provide mathematical programs to compute the Hölder distance function. However, this function has a perverse property that undermines its attractiveness: it fails the commensurability condition suggested by Russell (1988). Thus, we introduce a commensurable Hölder distance function that is invariant with respect to a change in the units of measurement. Among other things we obtain some continuity results and we prove that the well known Debreu-Farrell measure is a special case of the Hölder distance function.
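In symbols, the objects discussed are roughly the following (notation mine):

```latex
% A Hölder norm on R^d:
\|z\|_p = \Big(\sum_i |z_i|^p\Big)^{1/p}, \qquad 1 \le p \le \infty,
% and the Hölder distance function of an observed point (x,y) to the
% frontier \partial T of the technology set T:
D_p(x,y) = \inf\big\{\, \|(x,y)-(u,v)\|_p \;:\; (u,v)\in \partial T \,\big\}.
% The commensurable variant rescales each coordinate (e.g. by its observed
% level) before taking the norm, making D_p invariant to units of measurement.
```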

112 citations


Journal ArticleDOI
TL;DR: In this paper, an estimated translog production function is used to obtain output-and input-oriented measures of scale efficiency at an observed input bundle, and the estimated model can be used to determine the optimal quantity of labor input for an exogenously fixed quantity of capital.
Abstract: In parametric analysis based on a frontier production function, usually the scale elasticity rather than the scale efficiency level is reported. In this paper we show how one can use an estimated translog production function to obtain output- and input-oriented measures of scale efficiency at an observed input bundle. We also show how the estimated model can be used to determine the optimal quantity of labor input for an exogenously fixed quantity of capital.
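A small sketch of the computation implied here: given estimated translog coefficients, the scale elasticity at an input bundle is the sum of the output elasticities, and scale efficiency compares the observed bundle with the point where that sum equals one. The coefficient values below are placeholders, not the paper's estimates.

```python
# Scale elasticity of a translog production function at an input bundle.
# ln y = a0 + alpha' ln x + 0.5 * ln x' B ln x, with B symmetric.
import numpy as np

def scale_elasticity(alpha, B, x):
    """e(x) = sum_j dln y / dln x_j = sum_j (alpha_j + (B ln x)_j)."""
    lx = np.log(x)
    return float(np.sum(alpha + B @ lx))

alpha = np.array([0.4, 0.7])                   # hypothetical first-order terms
B = np.array([[-0.05, 0.02], [0.02, -0.04]])   # hypothetical second-order terms
e = scale_elasticity(alpha, B, x=np.array([10.0, 5.0]))
print(e)   # e > 1: increasing returns at x; the most productive scale has e = 1
```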

109 citations


Journal ArticleDOI
TL;DR: In this paper, it is shown that when a firm attempts to move towards the frontier it not only increases its efficiency but also reduces its production uncertainty, which leads to shorter confidence intervals.
Abstract: One of the main purposes of the frontier literature is to estimate inefficiency. Given this objective, it is unfortunate that the issue of estimating “firm-specific” inefficiency in a cross-sectional context has not received much attention. To estimate firm-specific (technical) inefficiency, the standard procedure is to use the mean of the inefficiency term conditional on the entire composed error, as suggested by Jondrow, Lovell, Materov and Schmidt (1982). This conditional mean could be viewed as the average loss of output (return). It is also quite natural to consider the conditional variance, which could provide a measure of production uncertainty or risk. Once we have the conditional mean and variance, we can report standard errors and construct confidence intervals for firm-level technical inefficiency. Moreover, we can also perform hypothesis tests. We postulate that when a firm attempts to move towards the frontier it not only increases its efficiency but also reduces its production uncertainty, which leads to shorter confidence intervals. Analytical expressions for production uncertainty under different distributional assumptions are provided, and it is shown that the technical inefficiency as defined by Jondrow et al. (1982) and the production uncertainty are monotonic functions of the entire composed error term. Notably, this monotonicity result is valid under different distributional assumptions on the inefficiency term. Furthermore, some alternative measures of production uncertainty are also proposed, and the concept of production uncertainty is generalized to panel data models. Finally, our theoretical results are illustrated with an empirical example.
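For the normal/half-normal case the conditional mean and variance have closed forms, because u given the composed error eps = v - u is a normal variable truncated below at zero. A sketch using the standard formulas (my implementation, not the authors' code):

```python
# Conditional moments of inefficiency under the normal/half-normal model:
# eps = v - u, v ~ N(0, sigma_v^2), u ~ |N(0, sigma_u^2)|.
import numpy as np
from scipy.stats import norm

def jlms_mean_var(eps, sigma_u, sigma_v):
    s2 = sigma_u**2 + sigma_v**2
    mu_s = -eps * sigma_u**2 / s2                 # pre-truncation mean of u | eps
    sd_s = sigma_u * sigma_v / np.sqrt(s2)        # pre-truncation sd of u | eps
    a = mu_s / sd_s
    h = norm.pdf(a) / norm.cdf(a)                 # hazard-type ratio
    mean = mu_s + sd_s * h                        # E[u | eps], Jondrow et al. (1982)
    var = sd_s**2 * (1.0 - a * h - h**2)          # Var[u | eps]: production uncertainty
    return mean, var

def jlms_interval(eps, sigma_u, sigma_v, level=0.95):
    """Equal-tailed interval for u | eps from truncated-normal quantiles."""
    s2 = sigma_u**2 + sigma_v**2
    mu_s = -eps * sigma_u**2 / s2
    sd_s = sigma_u * sigma_v / np.sqrt(s2)
    q = lambda p: mu_s + sd_s * norm.ppf(1 - (1 - p) * norm.cdf(mu_s / sd_s))
    return q((1 - level) / 2), q(1 - (1 - level) / 2)

print(jlms_mean_var(-0.2, 0.3, 0.2))      # more negative eps -> higher E[u | eps]
print(jlms_interval(-0.2, 0.3, 0.2))      # firm-level confidence interval for u
```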

Journal ArticleDOI
TL;DR: In this paper, the authors show that the bootstrap procedure suggested by Ferrier and Hirschberg (1997) gives inconsistent estimates and illustrate the statistical issues underlying nonparametric efficiency measurement and the problems with the Ferrier/Hirschberg approach.
Abstract: This paper demonstrates that the bootstrap procedure suggested by Ferrier and Hirschberg (1997) gives inconsistent estimates. A very simple example is given to illustrate the statistical issues underlying nonparametric efficiency measurement and the problems with the Ferrier/Hirschberg approach, and may serve as a primer on bootstrapping in nonparametric models of production processes.

Journal ArticleDOI
TL;DR: In this article, the conditions under which these indexes constructed at various levels of aggregation can be consistent with one another are examined, indicating that very strong restrictions on the technology and/or the efficiency index itself are required to enable consistent aggregation (or disaggregation).
Abstract: Measurement of technical efficiency is carried out at many levels of aggregation—at the individual branch, plant, division, or district level; at the company- or organization-wide level; at the industry or sectoral level; or at the economy-wide level. In this paper, we examine the conditions under which these indexes constructed at various levels of aggregation can be consistent with one another—that is, the extent to which efficiency indexes can be consistently aggregated. Unfortunately, our results are discouraging, indicating that very strong restrictions on the technology and/or the efficiency index itself are required to enable consistent aggregation (or disaggregation).

Journal ArticleDOI
Robert M. Thrall
TL;DR: In this article, the authors considered the FDH model with respect to its production possibility set, T_F, and proved that DMU_k is DEA efficient if it is not dominated by any p ∈ T_B.
Abstract: The central feature of the FDH model is the lack of convexity for its production possibility set, T_F. Starting with n observed (distinct) decision making units DMU_k, each defined by an input-output vector p_k = [y_k, −x_k], domination is defined by ordinary vector inequalities: DMU_k is said to dominate DMU_j if p_k ≥ p_j, p_k ≠ p_j. The FDH production possibility set T_F consists of the observed DMU_j together with all input-output vectors p = [y, −x] with y ≥ 0, x ≥ 0, y ≠ 0, x ≠ 0 which are dominated by at least one of the observed DMU_j. DMU_k is defined as “FDH efficient” if no DMU_j dominates it. In the BCC (or variable returns to scale) DEA model the production possibility set T_B consists of the observed DMU_k together with all input-output vectors dominated by any convex combination of them, and DMU_k is DEA efficient if it is not dominated by any p in T_B. In the DEA model, economic meaning is established by the introduction of (non-negative) multiplier (price) vectors w = [u, v]. If DMU_k is undominated (in T_B) then there exists a positive multiplier vector w for which (a) w^T p_k = u^T y_k − v^T x_k ≥ w^T p for every p ∈ T_B. In everyday language, the net return (or profit) for DMU_k relative to the given multiplier vector w is at least as great as that for any production possibility p. On the other hand, if DMU_k is FDH but not DEA efficient then it is proved that there exists no positive multiplier vector w for which (a) holds, i.e. for any positive w there exists at least one DMU_j for which w^T p_j > w^T p_k. Since, therefore, FDH efficiency does not guarantee price efficiency, what is its economic significance? Without economic significance, how can FDH be considered as being more than a mathematical system, however logically soundly it may be conceived?
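The domination test defining FDH efficiency can be written directly from the definition above; a small sketch (my implementation, hypothetical data):

```python
# DMU_k is FDH efficient iff no other observed DMU weakly dominates it
# (no more of any input, no less of any output, strictly better somewhere).
import numpy as np

def fdh_efficient(X, Y):
    """Boolean flag per DMU. X: (n, m) inputs, Y: (n, s) outputs."""
    n = X.shape[0]
    flags = np.ones(n, dtype=bool)
    for k in range(n):
        for j in range(n):
            if j == k:
                continue
            dominates = (X[j] <= X[k]).all() and (Y[j] >= Y[k]).all() and \
                        ((X[j] < X[k]).any() or (Y[j] > Y[k]).any())
            if dominates:            # p_j >= p_k, p_j != p_k in the notation above
                flags[k] = False
                break
    return flags

X = np.array([[2.0, 3.0], [1.5, 3.5], [2.0, 2.5]])
Y = np.array([[1.0], [1.0], [1.2]])
print(fdh_efficient(X, Y))           # DMU 0 is dominated by DMU 2
```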

Journal ArticleDOI
TL;DR: In this article, the sensitivity of the returns to scale (RTS) classifications in data envelopment analysis is studied by means of linear programming problems and the stability region for an observation preserving its current RTS classification is investigated by the optimal values to a set of particular DEA-type formulations.
Abstract: Sensitivity of the returns to scale (RTS) classifications in data envelopment analysis is studied by means of linear programming problems. The stability region for an observation preserving its current RTS classification (constant, increasing or decreasing returns to scale) can be easily investigated by the optimal values to a set of particular DEA-type formulations. Necessary and sufficient conditions are determined for preserving the RTS classifications when input or output data perturbations are non-proportional. It is shown that the sensitivity analysis method under proportional data perturbations can also be used to estimate the RTS classifications and discover the identical RTS regions yielded by the input-based and the output-based DEA methods. Thus, our approach provides information on both the RTS classifications and the stability of the classifications. This sensitivity analysis method can easily be applied via existing DEA codes.
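For context, here is one standard way to obtain the RTS classification whose stability the paper studies: a sketch of Banker's CRS-intensity criterion, not the paper's own sensitivity formulations. In practice alternate optima must be checked before reading off the classification.

```python
# Classify DMU o by the sum of optimal intensities in the input-oriented
# CRS envelopment problem: = 1 CRS, < 1 IRS, > 1 DRS (ignoring ties from
# alternate optima, which a full treatment must handle).
import numpy as np
from scipy.optimize import linprog

def rts_class(X, Y, o):
    n, m = X.shape; s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]              # minimise theta over [theta, lambda]
    A_ub, b_ub = [], []
    for i in range(m):                       # sum_k lam_k X[k,i] <= theta * X[o,i]
        A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
    for r in range(s):                       # sum_k lam_k Y[k,r] >= Y[o,r]
        A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n)
    lam = res.x[1:].sum()
    return "CRS" if np.isclose(lam, 1.0) else ("IRS" if lam < 1.0 else "DRS")
```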

Journal ArticleDOI
TL;DR: A technique for comparing the results of different assembly line balancing strategies by using Data Envelopment Analysis (DEA) shows that DEA is effective in suggesting which line balancing heuristics are most promising.
Abstract: This paper presents a technique for comparing the results of different assembly line balancing strategies by using Data Envelopment Analysis (DEA). Initially, several heuristics, which can be thought of as assembly line balancing strategies, were used to solve seven line-balancing problems. The resulting line balance solutions provided two pieces of information that were of particular interest: the number of workers needed and the amount of equipment needed. These two items were considered inputs for DEA. The different line balance solutions were then used as layouts for simulated production runs. From the simulation experiments, several output performance measures were obtained which were of particular interest and were used as outputs for DEA. The analysis shows that DEA is effective in suggesting which line balancing heuristics are most promising.

Journal ArticleDOI
TL;DR: In this article, a model of price-induced innovation is presented incorporating long-run prices as arguments in the production function serving the role of stimulating firms to seek innovations, and empirical application examines the production structure and technological progress of the U.S. food processing and distribution sector for the period 1948-1991.
Abstract: A model of price-induced innovation is presented in which long-run prices enter the production function as arguments, serving to stimulate firms to seek innovations. The empirical application examines the production structure and technological progress of the U.S. food processing and distribution sector for the period 1948–1991. The empirical model separates scarcity, innovation, and exogenous technical change responses in analyzing the Morishima elasticities for input combinations and multifactor input biases. The results suggest that significant structural changes have occurred in the food processing and distribution sector since 1980. Focusing on multifactor input bias, the results suggest no broad shifts in technical change patterns over the last forty years. However, the Morishima elasticity results suggest a more varied pattern of technical change between inputs. Price-induced technical progress makes a dominant contribution to input decisions compared with exogenous technical change.
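For reference, the Morishima elasticity used in such analyses is defined (notation mine) from the own- and cross-price elasticities of the conditional input demands:

```latex
% Morishima elasticity of substitution between inputs i and j,
% from a cost function C(w, y) with input demands x_i(w, y):
M_{ij} = \frac{\partial \ln (x_i / x_j)}{\partial \ln w_j}
       = \varepsilon_{ij} - \varepsilon_{jj},
\qquad
\varepsilon_{ij} = \frac{\partial \ln x_i}{\partial \ln w_j}.
```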

Journal ArticleDOI
TL;DR: The long-term impact of research, education, and various government support programs on U.S. agricultural productivity was analyzed using an error correction model as mentioned in this paper, which indicated that the proposed reduction in commodity program expenditures (e.g., 1996 Farm Bill) is unlikely to reduce agricultural productivity.
Abstract: The long-term impact of research, education, and various government support programs on U.S. agricultural productivity was analyzed using an error correction model. Results indicate that the proposed reduction in commodity program expenditures (e.g. 1996 Farm Bill) is unlikely to reduce agricultural productivity. Results suggest that shifting public funds from commodity programs to education and research would raise U.S. agricultural productivity. Our estimates of long-term rates of return to public research are lower than those from most previous studies, perhaps due to our improved model specification, but are high enough to justify continued public investments to raise productivity.

Journal ArticleDOI
TL;DR: In this article, rank statistics were used to evaluate efficiency performance trends using productive efficiency measures derived through various Data Envelopment Analysis (DEA) models, and the new procedures were applied to data reflecting the macroeconomic performance of 17 OECD nations in 1979-1988.
Abstract: This paper presents two applications of rank statistics to evaluate efficiency performance trends using productive efficiency measures derived through various Data Envelopment Analysis (DEA) models. The paper starts with a discussion of the difficulties in obtaining consistent ranks from DEA efficiency ratings. Next, a procedure is proposed to identify intertemporal performance trends using any one of several possible efficiency measures. Another procedure is then developed to test the stability over time of the rank positions of the analyzed units. For each statistical procedure, a small numerical example involving DEA efficiency measures is provided to illustrate the proposed technique. Finally, the new procedures are applied to data reflecting the macroeconomic performance of 17 OECD nations in 1979–1988. The outcomes of the application are discussed and contrasted with previous research in this area.

Journal ArticleDOI
TL;DR: In this article, a translog cost function is specified and estimated with controls for system (engineering) characteristics and environmental variables, and it is found that the reforms had substantial cost-reducing effects.
Abstract: As part of a general program of market-based reform, New Zealand transformed the electrical supply industry from state-owned to commercially-oriented power companies. This paper tests the hypothesis that such privatization is efficiency-improving. A translog cost function is specified and estimated with controls for system (engineering) characteristics and environmental variables, and it is found that the reforms had substantial cost-reducing effects. The reforms are found to have benefitted customers, with the real price of electricity falling 16.4 percent over the period.

Journal ArticleDOI
TL;DR: In this paper, the feasibility of bootstrapping DEA scores in the context of the earlier paper by Ferrier and Hirschberg (1997) is discussed, and a simple experiment is devised to demonstrate that in the one-input, one-output case the bootstrap of the non-modified DEA scores is not a failure of the bootstrap.
Abstract: This paper discusses the feasibility of bootstrapping DEA scores in the context of the earlier paper by Ferrier and Hirschberg (1997). A simple experiment is devised to demonstrate that in the one-input, one-output case the bootstrap of the non-modified DEA scores is not a failure of the bootstrap.

Journal ArticleDOI
TL;DR: In this article, a dynamic model of factor demands based on expected discounted cost minimization is presented. The authors establish a duality relationship between contemporary factor demands and the technology, and provide formulas for easily recovering marginal products, returns to scale, and technological change from estimated factor demands. Parametrization and implementation are illustrated in a detailed example.
Abstract: We present a dynamic model of factor demands based on expected discounted cost minimization. While making only very mild assumptions on expectations and technology, we are able to establish a duality relationship between contemporary factor demands and the technology, and we provide formulas for easily recovering marginal products, returns to scale, and technological change from estimated factor demands. Parametrization and implementation are illustrated in a detailed example.

Journal ArticleDOI
TL;DR: In this paper, the authors apply measurement techniques to an eleven-year panel of 20 U.S. interstate natural gas transmission companies and use their benchmarking measures to distinguish firms that perform well owing to a superior management of technology from those that perform well owing to the effective management of the regulatory mechanism.
Abstract: Rewarding regulated firms based on their relative performance requires benchmarks that reflect how performance is affected by regulation. This paper demonstrates how parametric and nonparametric efficiency measures can be employed to produce benchmarks that account for the effects of regulation. We apply measurement techniques to an eleven-year panel of 20 U.S. interstate natural gas transmission companies and use our benchmarking measures to distinguish firms that perform well owing to a superior management of technology from firms that perform well owing to the effective management of the regulatory mechanism.

Journal ArticleDOI
TL;DR: In this article, the use of finite mixtures of probability distributions to estimate cost functions is proposed, which allows for the simultaneous existence and unobservability of multiple technologies of production.
Abstract: This article proposes the use of finite mixtures of probability distributions to estimate cost functions. The mixture technique allows for the simultaneous existence and unobservability of multiple technologies of production. Technology switching by firms and conventional technical change can be studied directly. We illustrate the technique on a large sample of U.S. Savings and Loan companies, and find strong evidence of multiple technologies. We compare the mixture results to conventional stochastic cost frontier and thick frontier models, and highlight their differences.

Journal ArticleDOI
TL;DR: In this paper, the authors highlight the underlying assumptions that raise questions about the Fousekis approach, which is not conceptually appropriate for most applications and disallows evaluation of the implied adjustment process to long-run values, and overview alternative approaches to and rationales for computing these types of elasticity estimates.
Abstract: Morrison (1985), Morrison and Siegel (1997) and Morrison and Schwartz (1994) have suggested using an expression for “total” scale or cost economies to disentangle determinants of cost efficiency, including short run subequilibrium effects. Fousekis (1998) has noted that the derivation of such an expression is based on imputation of the long run, which implicitly suggests evaluation at steady state values. Measurement of elasticities imputing values not observed in the data, however, invariably requires some type of approximation. The Fousekis approach represents one view of the relevant approximation, which is not conceptually appropriate for most applications, and disallows evaluation of the implied adjustment process to long run values. This article highlights the underlying assumptions that raise questions about this approach, and overviews alternative approaches to and rationales for computing these types of elasticity estimates.

Journal ArticleDOI
TL;DR: In this paper, the incorporation of polyhedral cone constraints on the virtual multipliers in DEA is discussed, and probabilistic bounds based on a stochastic benchmark vector are demonstrated.
Abstract: The paper is concerned with the incorporation of polyhedral cone constraints on the virtual multipliers in DEA. The incorporation of probabilistic bounds on the virtual multipliers based upon a stochastic benchmark vector is demonstrated. The suggested approach involves a stochastic (chance constrained) programming model with multipliers constrained to the cone spanned by confidence intervals for the components of the stochastic benchmark vector at varying probability levels. Consider a polyhedral assurance region based upon bounded pairwise ratios between multipliers. It is shown that in general it is never possible to identify a “center-vector” defined as a vector in the interior of the cone with identical angles to all extreme rays spanning the cone. Smooth cones are suggested if an asymmetric variation in the set of feasible relative prices is to be avoided.

Journal ArticleDOI
TL;DR: In this article, the authors used the average derivative estimation of Stoker (1986) and the pseudo-likelihood estimation of Fan, Li, and Weersink (1996) to estimate a semiparametric stochastic frontier regression, y = g(x) + e, where the function g(·) is unknown and e is a composite error in a standard setting.
Abstract: This paper utilizes the average derivative estimation of Stoker (1986) and the pseudo-likelihood estimation of Fan, Li, and Weersink (1996) to estimate a semiparametric stochastic frontier regression, y = g(x) + e, where the function g(·) is unknown and e is a composite error in a standard setting. The proposed semiparametric method of estimation is applied to data on farmers' credit unions in Taiwan. Empirical results show that the banking services of the farmers' credit unions are subject to economies of scale but exhibit a high degree of cost inefficiency in operation.
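A hedged sketch of the two-step idea: estimate g(·) by kernel regression, then recover the scale of the one-sided error from the residuals. A simple method-of-moments step stands in here for the pseudo-likelihood of Fan, Li and Weersink (1996); the data, bandwidth, and all tuning choices are illustrative only.

```python
# Semiparametric frontier y = g(x) + e, e = v - u: kernel estimate of the
# conditional mean, then sigma_u from the (negative) residual skewness
# under a half-normal assumption for u.
import numpy as np

def nadaraya_watson(x_grid, x, y, h):
    """Gaussian-kernel regression estimate of E[y | x] at x_grid (1-D arrays)."""
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
x = rng.uniform(1, 10, 400)
u = np.abs(rng.normal(0, 0.3, 400))           # one-sided inefficiency
y = np.log(1 + x) + rng.normal(0, 0.2, 400) - u

resid = y - nadaraya_watson(x, x, y, h=0.8)
m3 = np.mean((resid - resid.mean()) ** 3)     # negative skew signals inefficiency
b = np.sqrt(2 / np.pi)                        # E|Z| for standard normal Z
sigma_u = (max(-m3, 0) / (b * (4 / np.pi - 1))) ** (1 / 3)
print(f"estimated sigma_u = {sigma_u:.3f}")   # true value 0.3
```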

Journal ArticleDOI
TL;DR: In this paper, a convex N-input-M-output production possibility frontier can be locally approximated by means of a flexible Nonseparable Nested Constant-Elasticity-of-Substitution/Constant-Elasticity-of-Transformation (NNCES-CET) restricted profit function.
Abstract: This note describes how a convex N-input-M-output production possibility frontier can be locally approximated by means of a flexible Nonseparable Nested Constant-Elasticity-of-Substitution/Constant-Elasticity-of-Transformation (NNCES-CET) restricted profit function. This technique yields a summary representation of technology sets that is globally regular and thus suitable for use in applications where regularity is crucial.

Journal ArticleDOI
TL;DR: In this article, the authors employ dynamic duality and comparative dynamics to assess the behavior of cost elasticity along an optimal path to the steady state, concluding that no a priori relationship exists between economic capacity utilization and elasticity of cost.
Abstract: The result by Morrison (1985) and Morrison and Schwartz (1994) that there is a one-to-one relationship between the rate of economic capacity utilization and the ratio of cost elasticities at the temporary and the full equilibrium has been instrumental in past studies on measuring economic capacity utilization and on adjusting indexes of productivity growth for temporary equilibrium. In this paper, dynamic duality and comparative dynamics are employed to assess the behavior of cost elasticity along an optimal path to the steady state. The analysis suggests that no a priori relationship exists between economic capacity utilization and the elasticity of cost. The absence of an a priori relationship between these two economic variables implies that theoretical and empirical results based on past notions about the dynamic behavior of cost elasticity may have to be reconsidered.