
Showing papers on "Productivity model" published in 2001


Journal ArticleDOI
TL;DR: In this article, the authors developed a generic measure of scale efficiency for a multiple-input multiple-output (MIMO) firm, using basic principles of modern production theory, and combined measures of technological change, technical efficiency change, and scale efficiency change into an encompassing measure of productivity change.
Abstract: The first objective of this paper is to develop a generic measure of scale efficiency for a multiple-input multiple-output firm, using basic principles of modern production theory. The second objective is to combine measures of technological change, technical efficiency change, and scale efficiency change into an encompassing (primal) measure of productivity change. This measure and its decomposition are compared to a number of recent proposals in order to shed light on what seems to have become a controversial issue. The paper proceeds by developing an encompassing dual measure of productivity change. This dual measure is then applied to panel data of a set of Dutch firms, continuing the empirical work of Balk (1998). It turns out that extending the Malmquist productivity index with factors measuring scale efficiency change and input mix change leads to appreciably different outcomes.

336 citations
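The paper's own scale-efficiency measure is developed within its production-theory framework; a common concrete rendering of the same idea treats scale efficiency as the ratio of constant-returns (CRS) to variable-returns (VRS) technical efficiency, both computable by input-oriented DEA. The formulation and the three hypothetical units below are illustrative assumptions, not the paper's data or model:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o, vrs=False):
    """Input-oriented DEA efficiency score of unit o (1.0 = on the frontier).
    X is (n, m) inputs, Y is (n, s) outputs; decision vars are [theta, lam_1..lam_n]."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([[1.0], np.zeros(n)])          # minimise theta
    # sum_j lam_j * x_ji <= theta * x_oi   for every input i
    A_ub = np.hstack([-X[o].reshape(m, 1), X.T])
    b_ub = np.zeros(m)
    # sum_j lam_j * y_jr >= y_or           for every output r
    A_ub = np.vstack([A_ub, np.hstack([np.zeros((s, 1)), -Y.T])])
    b_ub = np.concatenate([b_ub, -Y[o]])
    A_eq, b_eq = None, None
    if vrs:                                           # convexity constraint -> VRS
        A_eq = np.concatenate([[0.0], np.ones(n)]).reshape(1, -1)
        b_eq = [1.0]
    bounds = [(None, None)] + [(0.0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.fun

# Three hypothetical single-input, single-output units
X = np.array([[2.0], [5.0], [8.0]])
Y = np.array([[2.0], [4.0], [5.0]])
for o in range(3):
    crs = dea_efficiency(X, Y, o)
    vrs = dea_efficiency(X, Y, o, vrs=True)
    print(f"unit {o}: CRS={crs:.3f}  VRS={vrs:.3f}  scale={crs / vrs:.3f}")
```

In this toy data the second and third units are VRS-efficient but operate at the wrong scale, so their CRS score and hence their scale-efficiency ratio falls below one.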


Journal ArticleDOI
TL;DR: A closed-form analytical model is used and demonstrated that investments in certain efficiency-enhancing technologies may be expected to decrease the productivity of profit-maximizing firms and that the direction of firm productivity following such investments depends upon the relationship between the fixed costs of the firm and the size of the market.
Abstract: For over a decade, empirical studies in the information technology (IT) value literature have examined the impact of technology investments on various measures of performance. However, the results of these studies, especially those examining the contribution of IT to productivity, have been mixed. One reason for these mixed empirical findings may be that these studies have not effectively accounted for the impact of technology investments that increase production efficiency and improve product quality on firm productivity. In particular, it is commonly assumed that such investments should lead to gains in both profits and productivity. However, using a closed-form analytical model we challenge this underlying assumption and demonstrate that investments in certain efficiency-enhancing technologies may be expected to decrease the productivity of profit-maximizing firms. More specifically, we demonstrate that investments in technologies that reduce the firm's fixed overhead costs do not affect the firm's product quality and pricing decisions but do increase profits and improve productivity. In addition, we demonstrate that investments in technologies that reduce the variable costs of designing, developing, and manufacturing a product encourage the firm to improve product quality and to charge a higher price. Although this adjustment helps the firm to capture higher profits, we show that it will also increase total production costs and will, under a range of conditions, decrease firm productivity. Finally, we show that the direction of firm productivity following such investments depends upon the relationship between the fixed costs of the firm and the size of the market.

235 citations


Posted Content
TL;DR: The OECD has published a manual on productivity measurement, as discussed by the authors, which provides an accessible guide for those involved in constructing and interpreting productivity measures, in particular statistical offices, other relevant government agencies and productivity researchers.
Abstract: Productivity growth forms the basis for improvements in real incomes and welfare. Slow productivity growth limits the rate at which real incomes can improve, and also increases the likelihood of conflicting demands concerning the distribution of income. Measures of productivity growth and of productivity levels therefore constitute important economic indicators. Over a number of years, the Statistical Working Party of the OECD Industry Committee has dealt with different aspects of productivity measurement and analysis. The group noted that, despite a large body of literature, no recent systematic and accessible source of information exists to provide a guide to the different approaches, interpretations and statistical requirements of productivity measures at national or international level. At the OECD, the last product of this kind was published by the Productivity Measurement Advisory Service of the OECD in 1966. Consequently, the Working Party undertook a project to compile a manual on productivity measurement. The final draft of this manual was de-classified by the OECD Industry Committee in February 2001, and is available in electronic form as of April 2, 2001 on the OECD homepage (http://www.oecd.org/subject/growth/an_ec_gr.htm), to be followed by a paper publication later this year as well as a translation into French. The main objectives of the manual are to:
• Provide an accessible guide to productivity measurement for those involved in constructing and interpreting productivity measures, in particular statistical offices, other relevant government agencies and productivity researchers.
• Improve international harmonisation: although there is no strong prescriptive element in the manual, it contains indications about desirable properties of productivity measures. Hence, when countries have a choice in constructing new measures or developing a system of indicators, the manual may provide guidance.
• Identify desirable characteristics of productivity measures by reference to a coherent framework that links economic theory and index number theory. Desirable properties have to be assessed against the reality of data availability or the costs of producing statistics.

206 citations


Journal ArticleDOI
TL;DR: In this paper, the authors apply a stochastic frontier production model to Korean manufacturing industries to decompose the sources of total factor productivity (TFP) growth into technical progress, changes in technical efficiency, changes in allocative efficiency, and scale effects.
Abstract: This paper applies a stochastic frontier production model to Korean manufacturing industries, to decompose the sources of total factor productivity (TFP) growth into technical progress, changes in technical efficiency, changes in allocative efficiency, and scale effects. Empirical results based on data from 1980–1994 show that productivity growth was driven mainly by technical progress, that changes in technical efficiency had a significant positive effect, and that allocative efficiency had a negative effect. This study suggests that specific guidelines are required to promote productivity in each industry, and provides additional insight into understanding the recent debate on TFP growth in Korean manufacturing.

134 citations


Journal ArticleDOI
TL;DR: In this article, the authors identify a minimal set of relevant properties that a productivity index needs to satisfy to correctly assess performance development of a decision-making unit and point out inconsistencies with properties related to commensurability, monotonicity, and implications of maximizing behavior.

121 citations


Journal ArticleDOI
TL;DR: In this article, the directional output distance function is used to construct a Malmquist-Luenberger index of total factor productivity growth for manufacturing when both good and bad outputs are jointly produced.
Abstract: The directional output distance function (Chambers, Chung, & Fare, 1996) is used to construct a Malmquist-Luenberger index of total factor productivity growth for manufacturing when both good and bad outputs are jointly produced. The index is constructed using information on good and bad output quantities and input quantities, circumventing the problem of recovering shadow price information for the bad output needed for the Fisher or Tornqvist type of productivity indices. Accounting for toxic releases in manufacturing, productivity growth averages 1.4% annually during 1988–1994. The findings also suggest that the failure to account for toxic releases in manufacturing results in a significant understatement of total factor productivity growth.

114 citations


Journal ArticleDOI
TL;DR: In this article, the authors used the stochastic frontier approach to assess the growth potential of the manufacturing sector in Malaysia, a second-tier newly industrializing economy, by first estimating the production function using panel data comprising 28 manufacturing industries over the period 1981–1996.

72 citations


Journal ArticleDOI
TL;DR: In this article, the nonparametric Data Envelopment Analysis approach is used to compute Malmquist productivity indexes, which are decomposed into efficiency change and technical change; the latter is further decomposed into an output bias, an input bias and a magnitude component.
Abstract: This paper analyses productivity growth in 16 of Taiwan's manufacturing industries during the period 1978–1992. The non-parametric Data Envelopment Analysis approach is used to compute Malmquist productivity indexes. These are decomposed into efficiency change and technical change. The latter is further decomposed into an output bias, an input bias and a magnitude component. In addition, the direction of input bias is identified. Empirical results indicate that the sector's TFP increased at a rate of 2.89% per annum, which could be ascribed to a technical progress (2.56%) and an efficiency improvement (0.33%).

66 citations
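The decomposition used here multiplies an efficiency-change term by a technical-change term. A minimal numeric sketch of the standard output-oriented Malmquist identity M = EC × TC, using made-up distance-function values rather than the paper's Taiwanese data:

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Output-oriented Malmquist index between periods t and t+1.
    d_a_b = distance function of period-b data evaluated against the
    period-a frontier (all four values passed in here are hypothetical)."""
    ec = d_t1_t1 / d_t_t                                     # efficiency change
    tc = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))    # technical change
    return ec * tc, ec, tc

m, ec, tc = malmquist(0.80, 1.10, 0.70, 0.90)
print(m, ec, tc)   # m > 1 signals productivity growth
```

The index itself equals the geometric mean of the two cross-period distance ratios, so the product EC × TC recovers it exactly; the further output/input-bias split reported in the paper refines the TC term.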


ReportDOI
TL;DR: In this article, the authors examined the welfare-theoretic basis for measuring productivity growth and showed that the ideal welfare-theoretic measure is a chain index of productivity growth rates of different sectors which uses current output weights.
Abstract: The present study is a contribution to the theory of the measurement of productivity growth. First, it examines the welfare-theoretic basis for measuring productivity growth and shows that the ideal welfare-theoretic measure is a chain index of productivity growth rates of different sectors which uses current output weights. Second, it lays out a technique for decomposing productivity growth which separates aggregate productivity growth into three factors -- the pure productivity effect, the effect of changing shares, and the effect of different productivity levels. Finally, it shows how to apply the theoretically correct measure of productivity growth and indicates which of the three different components should be included in a welfare-oriented measure of productivity growth. The study concludes that none of the measures generally used to measure productivity growth is consistent with the theoretically correct measure.

64 citations


Journal ArticleDOI
TL;DR: In this article, the authors define hyperbolic performance measures on a graph representation of production technology, and introduce a direct formulation to calculate them making use of Data Envelopment Analysis techniques.
Abstract: Hyperbolic measures of efficiency and productivity change with respect to a graph representation of production technology allow researchers to consider output and input dimensions simultaneously in measuring producer performance. Hyperbolic efficiency measures have been proposed, but empirical implementation has not followed, either in efficiency analysis or in productivity analysis. The objectives of this paper are to define hyperbolic performance measures on a graph representation of production technology, to motivate their use by stating some of their advantages over their radial counterparts, and to introduce a direct formulation to calculate them making use of Data Envelopment Analysis techniques. The ideas are illustrated by calculating hyperbolic efficiency and Malmquist productivity indexes for a US agricultural panel data set.

62 citations


Journal ArticleDOI
TL;DR: In this article, the authors explore the possibility that information technology is generating output that is increasingly hard to measure in non-manufacturing industries, which contributes to the divergence in industry productivity growth rates.
Abstract: In recent years, U.S. productivity growth accelerated sharply in manufacturing, but has remained sluggish in the most computer-intensive service industries. This paper explores the possibility that information technology is generating output that is increasingly hard to measure in non-manufacturing industries, which contributes to the divergence in industry productivity growth rates. Our results suggest that measurement error in 13 computer-intensive, non-manufacturing industries increased between 0.74 and 1.57 percentage points per year in the 1990s, which understates annual aggregate productivity growth by 0.10 to 0.20 percentage points in the 1990s. This adds to an estimated 0.22 to 0.30 percentage point error from the increasing share of aggregate output in these hard-to-measure industries. Thus, increasing measurement problems may understate aggregate productivity growth by an additional 0.32 to 0.50 percentage points per year in the 1990s and play an important role in understanding recent productivity trends at the industry level.

Journal ArticleDOI
Derek Byerlee1, Rinku Murgai1
TL;DR: This paper examines the conceptual and practical issues in measuring total social factor productivity (TSFP) and shows that no one measure alone will be theoretically or empirically robust as an indicator of sustainability.


Patent
08 Aug 2001
TL;DR: In this article, a production monitoring system collects an array of data related to employee productivity, wages, supply usage, costs, desired profits, overhead, customer information, and other information pertinent to operating a manufacturing operation or service industry.
Abstract: A production monitoring system collects an array of data related to employee productivity, wages, supply usage, costs, desired profits, overhead, customer information, and other information pertinent to operating a manufacturing operation or service industry. The data is analyzed to derive a variety of productivity values such as average worker efficiency, production incentives, material costs, supply waste, and others. The system audits productivity data entered by workers, and sounds alarms when the data appears to be incorrect. Supply usage rates are calculated and additional supplies are automatically ordered. Estimated prices and delivery times are determined based on historical data and user-supplied safety margins and profit margins.

Journal ArticleDOI
TL;DR: In this paper, the authors explore some issues surrounding the use of farm-level efficiency and productivity estimates for benchmarking studies and find that farms change their relative rank in terms of efficiency across years.
Abstract: In this article we explore some issues surrounding the use of farm-level efficiency and productivity estimates for benchmarking studies. Using an eight-year balanced panel of Victorian wool producers we analyse annual variation between estimates of farm-level technical efficiency derived using Data Envelopment Analysis and Malmquist estimates of Total Factor Productivity. We find that farms change their relative rank in terms of efficiency across years. Also, unlike aggregate studies of Total Factor Productivity, we find at best erratic and modest growth, a worrying result for this industry. However, caution is needed when interpreting these results, and for that matter, benchmarking analysis as currently practised when using frontier estimation techniques like Data Envelopment Analysis.
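The rank-switching finding can be checked mechanically once efficiency scores are in hand. A small sketch with hypothetical scores for five farms in two years, using Spearman's rank correlation (a low rho means heavy re-ranking between years):

```python
import numpy as np

# Hypothetical farm-level efficiency scores in two consecutive years
year1 = np.array([0.95, 0.80, 0.70, 0.88, 0.60])
year2 = np.array([0.75, 0.90, 0.85, 0.70, 0.65])

def ranks(a):
    """Rank 1 = most efficient farm (scores assumed distinct, so no ties)."""
    r = np.empty(len(a))
    r[np.argsort(-a)] = np.arange(1, len(a) + 1)
    return r

r1, r2 = ranks(year1), ranks(year2)
d = r1 - r2
n = len(year1)
rho = 1 - 6 * np.sum(d**2) / (n * (n**2 - 1))   # Spearman's rho
print(r1, r2, rho)
```

With these invented scores rho comes out at 0.2, i.e. the year-1 ranking carries little information about year 2, which is the benchmarking caveat the article raises.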

Posted Content
TL;DR: In this article, the authors investigate how the evidence for Italy from simple growth accounting exercises is modified by more accurate measurements of inputs and find that a sizeable part of the observed growth of total factor productivity vanishes when these adjustments are applied.
Abstract: We investigate how the evidence for Italy from simple growth accounting exercises is modified by more accurate measurements of inputs. We describe the dynamics of total factor productivity in the last 20 years in Italy, and review theoretical and measurement issues that complicate the picture emerging from this simple exercise. We adjust the labour input for its composition and verify its impact on estimated multifactor productivity in the whole economy. We replicate the labour-quality adjustment for the industrial sector together with corrections for hours worked and capital utilisation. We find that a sizeable part of the observed growth of total factor productivity vanishes when these adjustments are applied. They are not sufficient, however, to overturn the evidence of a productivity slowdown in the Italian economy in the second half of the 1990s.

Journal ArticleDOI
TL;DR: In this paper, the authors compare the conceptual merits and empirical performance of alternative approaches that can be employed for this purpose: input distance functions, output distance functions, nonparametric methods, and index number approaches.

Proceedings ArticleDOI
08 Oct 2001
TL;DR: In this article, the authors use the Mahalanobis distance (MD) as the core of a manufacturing control system that identifies deviations from normality with respect to productivity, specifies the root cause of the abnormality, and decides how to prioritize the problem.
Abstract: Primary productivity in the semiconductor manufacturing industry, which includes cycle time, cost, and production, depends to a great extent on the capability of manufacturing administrators (MA), particularly with regards to the critical tradeoff between cycle time and tool utilization. The Mahalanobis distance (MD) has significance in pattern recognition, and the authors have found a method to make use of the MD as the core of a manufacturing control system. By using this system, one can easily distinguish deviations from normality in respect to productivity, specify the root cause of the abnormality, and decide how to prioritize the problem. As a result, one can efficiently concentrate limited resources on the root cause in the absence of a capable MA and restore productivity on a minimum timescale.
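The control-system idea, flagging a production state whose Mahalanobis distance from normal operation is large, can be sketched as follows. The baseline data, the two monitored variables and the example observations are all invented for illustration:

```python
import numpy as np

# Simulated "normal" operating records: columns = cycle time (days), tool utilisation
rng = np.random.default_rng(0)
normal = rng.normal(loc=[10.0, 0.80], scale=[1.0, 0.05], size=(200, 2))

mu = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def mahalanobis(x):
    """Distance of an observation from the centre of the normal-operation cloud,
    scaled by the covariance estimated from the baseline records."""
    d = np.asarray(x) - mu
    return float(np.sqrt(d @ cov_inv @ d))

print(mahalanobis([10.0, 0.80]))   # near the baseline centre -> small
print(mahalanobis([14.0, 0.55]))   # long cycle time, low utilisation -> large
```

Because the distance accounts for the correlation between the variables, a jointly unusual combination can be flagged even when each variable alone looks tolerable.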

01 Jan 2001
TL;DR: In this paper, the authors derive a productivity model for the Valmet 911 Snake, a tracked harvester specially developed for harvesting operations on steep terrain, whose trapezoidal tracked undercarriages provide better traction, reduced impact to the ground and greater climbing ability on slopes.
Abstract: Productivity models are important decision support tools for harvesting operations. They should be derived with the least possible effort but with the highest possible precision. Models based on production monitoring or time studies are time consuming in their derivation. No models are available for new machine developments although it is particularly important to estimate productivity under given conditions and make statements about parameters such as climbing limits on steep terrain. The Valmet 911 Snake is a tracked harvester that has been specially developed for harvesting operations on steep terrain. A standard four-wheeled Valmet harvester serves as a carrier platform whereby the four wheel sets are replaced with trapezoidal tracked undercarriages. This construction results in better traction, reduced impact to the ground and greater climbing possibilities on slopes. For the productivity model derivation a combined approach has been used. The model consists of three parts: tree processing model, locomotion model and delay model. The delay model has been chosen from literature, the tree processing and locomotion model have been derived from empirical studies. The productivity potential and climbing capabilities of the Valmet 911 Snake steep terrain harvester up to an inclination of 70% could be verified.

Posted Content
TL;DR: This article revisits the conundrum surrounding the measurement of the impact of public infrastructure spending in the light of the aggregation literature and shows why estimations of aggregate production functions (APFs) augmented with a proxy for public expenditures do not yield an estimate that can be interpreted as the productivity of public infrastructure.
Abstract: This paper revisits the conundrum surrounding the measurement of the impact of public infrastructure spending in the light of the aggregation literature (Fisher, 1969; 1993). It shows why estimations of aggregate production functions (APFs) augmented with a proxy for public expenditures do not yield an estimate that can be interpreted as the productivity of public infrastructure. The reason is that underlying all APFs is the accounting identity that relates value added to the wage bill plus overall profits. This identity can be rewritten as a mathematical form that resembles a production function. Therefore, the estimated coefficients cannot be taken to be the structural parameters of a production function. The paper also offers a brief discussion of the productivity puzzle.

Journal ArticleDOI
TL;DR: In this paper, the authors developed a nonlinear programming (NLP) model to jointly determine the optimum incentive and price discount levels for each rate class, aiming at maximizing net revenues.
Abstract: Financial incentives are used to improve productivity and quality in manufacturing and service facilities. This improvement in productivity would normally release part of the facility's productive capacity. Without stimulating additional demand to consume this released capacity, the facility would be unable to tap the full benefits of the improvements in productivity. Hence, yield management is introduced in an attempt to entice more demand and increase revenues. In this paper, we develop a NonLinear Programming (NLP) model to jointly determine the optimum financial incentives and price discount levels for each rate class. The model aims at maximizing net revenues. It includes nonlinear relationships representing the impact of incentives on productivity and quality improvements as well as the effect of price discounts on customer demand in each market segment. The generic nature of our NLP model makes it applicable to all multi-product manufacturing facilities covering sales, production and delivery. The model is applied to determine the optimum incentive and price discount levels for perishable products in a multi-product ready-mix concrete plant. It is demonstrated that the model is useful in maximizing net revenues through productivity improvements and an increase in customer demand.
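A toy version of such a joint incentive/discount decision can be written directly as a nonlinear program. The functional forms and every parameter value below are assumptions for illustration, not the relationships estimated in the paper: incentive spending i lowers unit cost with diminishing returns, and a price discount d stimulates demand linearly:

```python
import numpy as np
from scipy.optimize import minimize

p0, c0 = 20.0, 12.0      # list price, base unit cost (hypothetical)
q0, e = 1000.0, 2.0      # base demand, discount elasticity (hypothetical)
a, b = 0.30, 0.05        # max cost-reduction share, incentive response rate

def neg_profit(z):
    i, d = z                                              # incentive spend, discount
    unit_cost = c0 * (1 - a * (1 - np.exp(-b * i)))       # incentives cut unit cost
    q = q0 * (1 + e * d)                                  # discounts stimulate demand
    return -((p0 * (1 - d) - unit_cost) * q - i)          # minimise negative net revenue

res = minimize(neg_profit, x0=[10.0, 0.05], bounds=[(0, 500), (0, 0.30)])
i_opt, d_opt = res.x
print(f"incentive={i_opt:.1f}, discount={d_opt:.3f}, net revenue={-res.fun:.0f}")
```

At these made-up parameters the joint optimum spends substantially on incentives and offers only a small discount, and the resulting net revenue beats the do-nothing profit of (p0 − c0) × q0 = 8000, which is the coupling between productivity improvement and demand stimulation that the model exploits.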

Journal ArticleDOI
TL;DR: In this article, it was found that when linear programming formulations have capacity constraints that explicitly account for productivity losses, the resulting production plans are superior to those obtained when productivity losses are modelled solely as costs in the objective function.
Abstract: The aggregate planning literature fails to account correctly for productivity losses incurred by employee layoffs and hires. Moreover, the literature is mostly silent on the productivity losses associated with other capacity changes such as multiple shifts and overtime. In those cases where productivity losses are considered, traditional approaches impute associated costs but fail to account for lost capacity. It was found that when linear programming formulations have capacity constraints that explicitly account for productivity losses, the resulting production plans are superior to those obtained when productivity losses are modelled solely as costs in the objective function. Some avenues for future research are also proposed.
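The point that productivity losses belong inside the capacity constraint, not only in the objective, can be illustrated with a minimal one-period hiring LP. All numbers are hypothetical: each worker produces k units, but a new hire consumes l units of capacity in training, so a hire contributes only k − l units in its first period:

```python
from scipy.optimize import linprog

W0, k, l = 8, 10.0, 6.0        # current workforce, output per worker, training loss
demand = 120.0
hire_cost, unit_cost = 50.0, 1.0

# Decision variables: [h, p] = new hires, units produced
c = [hire_cost, unit_cost]                  # minimise hiring + production cost
A_ub = [[-(k - l), 1.0]]                    # p <= k*W0 + (k - l)*h
b_ub = [k * W0]
A_eq = [[0.0, 1.0]]                         # meet demand exactly
b_eq = [demand]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None), (0, None)])
h, p = res.x
print(f"hires={h:.1f}, production={p:.1f}")
```

The plan hires 10 workers; a formulation that priced the training loss in the objective but kept full capacity k per hire in the constraint would call for only 4 and leave the plan infeasible in practice, which is the paper's argument. (A real plan would restrict h to integers; the LP relaxation keeps the sketch linear.)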

Posted Content
TL;DR: In this paper, the authors introduce a new approach to measuring industrial productivity based on income-side data that are published by the Bureau of Economic Analysis (BEA), which is internally consistent in that both inputs and outputs are income side measures of value added.
Abstract: The present study is the second in a series of three papers devoted to issues in the measurement of productivity and productivity growth. The present paper makes three contributions. First, it introduces a new approach to measuring industrial productivity based on income-side data that are published by the Bureau of Economic Analysis (BEA). The data are internally consistent in that both inputs and outputs are income-side measures of value added, whereas the usual productivity measures combine expenditure-side output measures with income-side input measures. Second, because of interest in the 'new economy,' we have also constructed a set of new-economy accounts. For the purpose of this study, we define the new economy as machinery, electric equipment, telephone and telegraph, and software. Finally, because of concerns about poor deflation in the current output measures, this study constructs a new output concept called 'well-measured output,' which includes only those sectors for which output is relatively well measured. We present a brief summary of the behavior of the alternative measures.

Dissertation
01 Jan 2001
TL;DR: In this article, the authors used a demographic stationary state productivity model to evaluate two long-term breeding policy options in smallholder grade dairy cattle populations: the current policy of equal sharing of imported and locally progeny-tested AI semen for breeding both large- and small-scale herds, and an alternative using semen of bulls bred and progeny tested within the smallholder herd environment.
Abstract: A study was done based on records generated in the period 1980–1992 in 398 smallholder herds in 23 districts located in the high and medium potential areas at low, medium and high altitudes, to evaluate two long-term breeding policy options in the smallholder grade dairy cattle populations by use of a demographic stationary state productivity model, PRY. The first option was the current policy, where there was equal sharing of imported and locally progeny-tested AI semen for breeding both the large- and small-scale herds. The second option was the use of semen of bulls bred and progeny tested within the smallholder herd environment. The inputs to the productivity model were estimated by least squares analysis procedures. The grade dairy breed groups were Friesian, Ayrshire, Guernsey, Jersey, Nondescript, two F1 crossbreds of Friesian or Ayrshire (FL) and Jersey or Guernsey (FS) with zebu, their respective backcrosses RL and RS, and the Kilifi breed. The respective overall means and standard deviations for lactational milk yield, calving interval, age at first calving and selective culling rate of heifers and cows were 2430.71 and 47/./I kg; 13.88 and 1.64 months; 39.48 and 8.10 months; 9.20 and 5.0%; and 23.40 and 6.03%. The respective overall means and standard deviations for survival rates were 83.80 and 30.90; 83.20 and 21.31; 84.30 and 32.66; 88.02 and 28.38; 80.01 and 30.21; and 84.36 and 31.02% for the pre-weaning period of female and male calves, the post-weaning period of female and male young stock, and for cows and bulls. On the basis of the productivity model, the RL was found not to be sustainable under the smallholder environment. The overall productivity in Ksh/kg dry matter intake per animal-year was 1.322, 1.501, 1.896, 2.369, 1.967, 1.950, 2.257, 2.218 and 2.176 for Friesian, Ayrshire, Guernsey, Jersey, Nondescript, FL, FS, RS and Kilifi respectively.
These results confirmed the suitable breed group choices to be Jersey, FS, RS, Kilifi, FL, Nondescript and Guernsey, since they had outstanding lactational milk yield, fitness and production efficiency. Sensitivity analyses indicated the rank of traits by relative economic values, in descending order, to be lactational milk yield, age at first calving, selective culling rate, calving interval and post-weaning survival rate in females. The trade-offs of potential milk yield increment for fitness loss in the pure breeds were greater than the realised annual genetic increment of 6.29 kg milk yield under the current breeding programme. The results also showed that in the projected future, where small-scale farmers will have access to and heavy reliance on imported semen, resulting in an expected annual genetic gain of 38 kg of milk, Jersey will be favoured but the rest of the pure breeds will require the alternative breeding policy option. The study has established the existence of a genotype-environment interaction with respect to dairy breed groups kept on small-scale farms. Therefore, it was recommended that breed choice and breeding programme be modified to match the environment attainable in the majority of small-scale farms.
1.0 INTRODUCTION. 1.1 Historical Perspectives of Dairy Development in Kenya. Kenya is a country with a land area of 582,644 km² located on the Equator within East Africa. There is a wide range of agroclimatic zones, but much of the country is dry. More than 80% of the country receives less than 700 mm of rain per year. However, the area under irrigation is quite small. Hence only about 20% of Kenya can be regarded as land suitable for rainfed agriculture (Jaetzold and Schmidt, 1983). The human population is estimated at 28.6 million. About 90% of these are employed in agriculture, of which 10% are pastoralists.
Cattle are widely distributed in the country, with 50% located in the rangelands and the rest in settled agricultural lands (MALDM, 1996). The establishment of the colonial regime in Kenya in the early part of the 20th century was accompanied by great changes in the structures of land ownership and access to land use. A large-scale farm sector producing for local and export markets emerged and was separated from mainly subsistence African agriculture by the division of the country into "scheduled" areas, reserved exclusively for actual or potential use of European settlers, and "non-scheduled" or "reserve" areas for African use (Hopcraft et al., 1976). In all, about 20% of the arable land area was set aside for European use. At first, all agricultural policies and research were focused entirely on the settler sector, but there was a gradual shift of emphasis to the development of African agriculture, such that in the 1940s some Africans began introducing grade cattle on their farms. But it was not until the 1954 Swynnerton Plan that opposition to this introduction was abandoned (Hopcraft et al., 1976). Just before independence, plans were made for the settlement of Africans on sub-divided large-scale mixed farms. The settlement process was accomplished between 1961 and 1971. This process has resulted in a number of different land holdings: i) large plantations and ranches, many of which produce for both the domestic and export market; ii) large-scale mixed farms, most of which are owned by individual Africans or groups through societies or cooperatives; iii) small-scale individual farms in former reserve areas; iv) the extensive arid lands, communally owned by pastoralist and nomadic tribes. Today there is a large-scale/small-scale farm dualism, with a tendency to sub-divide more large-scale farms into smallholdings.
The large-scale farms average 600–700 hectares, in striking contrast to the small-scale farms, of which a majority are less than 1.5 hectares and very few are more than 5 hectares (MALDM, 1994). The dairy development thrust, which started within the large-scale farms in the 1920s in the scheduled areas, was aimed at the importation of grade cows and bulls from Europe, the crossbreeding of exotic dairy breeds with Zebu cattle, and disease control. A total of 100,000 grade dairy cattle existed by 1935, rising to about 600,000 by 1963. This cattle population was the main supplier of milk to urban areas and for export (Hopcraft et

Hans Rudolf Heinimann1
01 Jan 2001
TL;DR: In this article, the authors developed a productivity model for a whole family of cut-to-length harvesters based on a linear model with covariates and factors and found that stem volume explained about 63% of the total variability, while machine type contributed about 11%.
Abstract: In forest operations, productivity analyses have mainly been based on time studies. With the increasing application of computer technology, operation data could be captured automatically or at least semi-automatically. Under central European conditions, operation data are usually recorded on the cutting-unit level. The aim of this study was to develop a productivity model for a whole family of cut-to-length harvesters. More than 2200 data records were available, covering 12 different harvester types. The statistical analysis was based on a linear model with covariates and factors. Here, stem volume explained about 63% of the total variability, while machine type contributed about 11%. The two major findings from this study were that: (1) it is possible to quantify productivity differences among harvester makes, and (2) the influence of technological advances can be estimated. However, data quality was inconsistent because of differences both in recording productive-system time due to registration methods (manual, electronic, mechanical), as well as in stem volume calculations (e.g., harvester computer, volume calculation at the mill, volume designated according to grading rules). The next steps for improvement will be to standardize data capture and develop productivity databases on a regional or even an industry level.
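The abstract above describes an ANCOVA-type specification: a continuous covariate (stem volume) plus a categorical factor (machine type). A minimal sketch of such a fit, with entirely synthetic data and invented coefficients (not the paper's 2200-record data set), could look like this:

```python
import numpy as np

# Hypothetical illustration of a linear productivity model with a covariate
# (stem volume) and a factor (machine type). All data are synthetic.
rng = np.random.default_rng(0)
n = 300
stem_volume = rng.uniform(0.1, 2.0, n)        # m^3 per stem (covariate)
machine = rng.integers(0, 3, n)               # 3 harvester types (factor)
type_effect = np.array([0.0, 2.0, -1.5])      # assumed type offsets
productivity = 10 + 8 * stem_volume + type_effect[machine] + rng.normal(0, 1, n)

# Design matrix: intercept, covariate, dummy columns for machine types 1 and 2
# (type 0 is the reference level).
X = np.column_stack([
    np.ones(n),
    stem_volume,
    (machine == 1).astype(float),
    (machine == 2).astype(float),
])
beta, *_ = np.linalg.lstsq(X, productivity, rcond=None)
print(beta)  # roughly [10, 8, 2, -1.5]
```

With real harvester data, the share of variance attributed to the covariate versus the factor would come from the corresponding sums of squares, as in the paper's 63%/11% split.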

Proceedings ArticleDOI
01 Jan 2001
TL;DR: The paper argues that time scale decomposition for enhanced information sharing and better deterministic fluid model approximations capturing the essence of the underlying stochastic dynamics can indeed provide these productivity gains.
Abstract: Modern manufacturing productivity and competitiveness are undoubtedly time-based. As a result, the design of production facilities (product cells) and their operation (just-in-time, zero-in-process, lean, etc. (R. Suri, 1998; K. Suzaki, 1987)) aim at reducing lead times and inventories of either work in process or finished goods. The MRP practice is a serious impediment to further productivity gains. Indeed, significant gains are possible when the lead-time dynamics of individual supply chain links are accounted for in the overall manufacturing supply chain coordination through synergistic decentralized production planning. Better production planning algorithms that exploit information sharing are capable of taking the industry to the next big step in productivity gains. The paper argues that time scale decomposition for enhanced information sharing, together with deterministic fluid model approximations capturing the essence of the underlying stochastic dynamics, can indeed provide these productivity gains.
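A deterministic fluid model of the kind the abstract invokes replaces discrete, stochastic part flows with continuous rates. A minimal sketch for a hypothetical two-machine line (all rates and the horizon are invented, not taken from the paper) might be:

```python
# Hedged sketch of a deterministic fluid-model approximation of a
# two-machine production line: buffers hold "fluid" WIP, machines drain
# them at capped rates, and we integrate the dynamics with Euler steps.
demand_rate = 4.0      # parts/hour arriving at stage 1 (assumed)
mu1, mu2 = 5.0, 4.5    # processing capacities of machines 1 and 2 (assumed)
dt, horizon = 0.01, 50.0

b1 = b2 = 0.0          # fluid levels of the two buffers (WIP)
for _ in range(int(horizon / dt)):
    out1 = mu1 if b1 > 0 else min(mu1, demand_rate)   # machine 1 throughput
    out2 = mu2 if b2 > 0 else min(mu2, out1)          # machine 2 throughput
    b1 = max(0.0, b1 + (demand_rate - out1) * dt)
    b2 = max(0.0, b2 + (out1 - out2) * dt)

# With both capacities above the demand rate, the buffers drain to (near)
# zero: the fluid line is stable and lead time stays minimal.
print(b1, b2)
```

Raising `demand_rate` above `mu2` would make `b2` grow linearly, which is the fluid-model signature of an overloaded link in the chain.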

Posted Content
TL;DR: In this paper, the directional distance function (DDF) method is used to analyze the role that undesirable outputs of the economy, such as carbon dioxide and other greenhouse gases, play in the frontier production process.
Abstract: This paper explores a relatively new methodology, the directional distance function method, to analyze productivity growth. The method explicitly evaluates the role that undesirable outputs of the economy, such as carbon dioxide and other greenhouse gases, play in the frontier production process, which we specify as a piecewise-linear and convex boundary function. We decompose productivity growth into efficiency change (catching up) and technology change (innovation). We test the statistical significance of the estimates using recently developed bootstrap methods. We also explore implications for growth of total factor productivity in the OECD and Asian economies.

Journal ArticleDOI
TL;DR: This article used a translog cost function to produce econometric estimates of the separate influences of technical change versus scale efficiency in contributing to multifactor productivity growth within the US manufacturing sector.
Abstract: This study utilizes a translog cost function to produce econometric estimates of the separate influences of technical change versus scale efficiency in contributing to multifactor productivity growth within the US manufacturing sector. The analysis generates (two-digit) industry-specific parameters that capture the effects of output- versus time-related shifts in the cost function over the 1949–1991 period. Initial evidence concerning the relative importance of technical progress (versus 'scale') as a source of productivity gains within two-digit industries can thus be provided. The parametric estimates of total factor productivity growth are compared with existing Divisia measures to explore the shortcomings of the growth accounting technique. These long-run patterns hold implications for the productivity convergence hypothesis traced to knowledge spillovers between industries.
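In a translog cost framework like the one described, technical change is typically read off as the negative time derivative of log cost, while scale effects come from the output elasticity of cost (whose inverse measures returns to scale). A small illustration with invented coefficients (not the study's estimates):

```python
import math

# Invented translog coefficients for ln C(ln Y, t); illustrative only.
a0, aY, aYY, at_, att, aYt = 1.0, 0.8, 0.05, -0.02, 0.001, -0.004

def cost_elasticity(lnY, t):
    # d lnC / d lnY for the translog form; < 1 implies increasing returns
    return aY + aYY * lnY + aYt * t

def technical_change(lnY, t):
    # -d lnC / dt: the rate at which the cost function shifts down over time
    return -(at_ + att * t + aYt * lnY)

lnY, t = math.log(100), 10
print(technical_change(lnY, t), cost_elasticity(lnY, t))
```

The output- versus time-related shifts the abstract mentions correspond exactly to these two partial derivatives, estimated industry by industry.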

Journal ArticleDOI
TL;DR: In this paper, two approaches to measuring the change in productivity are considered: the total factor productivity (TFP) approach and the hybrid cost proxy model (HCPM), which does not possess the limitations of the TFP approach.
Abstract: Incentive regulation for some of the services provided by local exchange carriers in the U.S. telecommunications industry is based on price caps. Under price caps, a regulated firm's average real prices for the services it provides are required to fall by a specified percentage each year. This percentage is known as the X-factor. An important component of the X-factor is productivity change for local exchange carriers providing interstate access service. Two separate approaches to measuring the change in productivity are considered. The total factor productivity (TFP) approach, which is currently used in regulatory proceedings in the telecommunications industry, quantifies the change in output less the change in input and classifies it as the measure of productivity growth. There are a number of limitations with this approach. An alternative is proposed: the hybrid cost proxy model (HCPM), an engineering process model that does not possess the limitations of the total factor productivity approach.
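The TFP approach the abstract summarizes subtracts share-weighted input growth from share-weighted output growth. A hedged numeric sketch with invented quantities and shares (not actual carrier data):

```python
import math

# Toy two-output, two-input example of TFP growth as used in X-factor
# calculations: share-weighted output growth minus input growth.
outputs_t0, outputs_t1 = [100.0, 50.0], [106.0, 54.0]   # two services
inputs_t0, inputs_t1 = [80.0, 40.0], [82.0, 41.0]       # e.g. labor, capital
rev_shares = [0.7, 0.3]                                  # revenue weights
cost_shares = [0.6, 0.4]                                 # cost weights

def weighted_log_growth(q0, q1, w):
    # Share-weighted log growth rate (Tornqvist-style index)
    return sum(wi * math.log(b / a) for wi, a, b in zip(w, q0, q1))

output_growth = weighted_log_growth(outputs_t0, outputs_t1, rev_shares)
input_growth = weighted_log_growth(inputs_t0, inputs_t1, cost_shares)
tfp_growth = output_growth - input_growth
print(f"TFP growth: {tfp_growth:.4f}")
```

The HCPM alternative would instead derive efficient cost from an engineering process model, sidestepping the index-number limitations of this calculation.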

Journal ArticleDOI
TL;DR: In this paper, the authors propose to estimate jointly the cost function, the share equations, and measures of total factor productivity and scale economies, using full system estimation to account for all the restrictions implied by their endogeneity.