
Showing papers in "Journal of Business & Economic Statistics in 1986"


Journal ArticleDOI
TL;DR: In this article, the authors consider the problem of economic forecasting, the justification for the Bayesian approach, its implementation, and the performance of one small BVAR model over the past five years.
Abstract: The results obtained in five years of forecasting with Bayesian vector autoregressions (BVAR's) demonstrate that this inexpensive, reproducible statistical technique is as accurate, on average, as those used by the best known commercial forecasting services. This article considers the problem of economic forecasting, the justification for the Bayesian approach, its implementation, and the performance of one small BVAR model over the past five years.
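As a rough illustration of the approach, the sketch below (Python/NumPy) shrinks each equation of a small VAR toward a random walk with a Minnesota-style prior and produces a one-step forecast from the posterior-mean coefficients. It is not the article's BVAR system; the series, lag length, and the tightness value `lam` are invented for the example.

```python
"""Minimal sketch of a Bayesian VAR with a Minnesota-style shrinkage prior.

A toy illustration, assuming a Gaussian prior centred on a random walk for
each variable (own first lag = 1, everything else = 0) and a fixed prior
tightness `lam`.
"""
import numpy as np

rng = np.random.default_rng(0)

# --- toy data: two series generated from a persistent VAR(1) ---
T, k, p = 120, 2, 2                         # observations, variables, lags
A_true = np.array([[0.7, 0.1], [0.0, 0.6]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = y[t - 1] @ A_true.T + rng.normal(scale=0.5, size=k)

# --- stack lagged regressors: [1, y_{t-1}, ..., y_{t-p}] ---
X = np.column_stack([np.ones(T - p)] +
                    [y[p - l:T - l] for l in range(1, p + 1)])
Y = y[p:]

# --- Minnesota-style prior: mean = random walk, variance shrinks with lag ---
lam = 0.2                                   # overall tightness (hypothetical)
n_coef = X.shape[1]
prior_var = np.ones(n_coef) * 1e6           # loose prior on the intercept
for l in range(1, p + 1):
    prior_var[1 + (l - 1) * k: 1 + l * k] = (lam / l) ** 2

B_post = np.zeros((n_coef, k))
for j in range(k):                          # equation-by-equation posterior mean
    b0 = np.zeros(n_coef)
    b0[1 + j] = 1.0                         # own first lag centred on 1
    V0_inv = np.diag(1.0 / prior_var)
    sigma2 = np.var(np.diff(y[:, j]))       # crude residual-scale proxy
    B_post[:, j] = np.linalg.solve(X.T @ X / sigma2 + V0_inv,
                                   X.T @ Y[:, j] / sigma2 + V0_inv @ b0)

# --- one-step-ahead forecast from the posterior-mean coefficients ---
x_last = np.concatenate(([1.0], y[-1], y[-2]))
print("one-step forecast:", x_last @ B_post)
```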

1,115 citations


Journal ArticleDOI
TL;DR: The method of file concatenation with adjusted weights and multiple imputations is described and illustrated on an artificial example, showing the ability to display sensitivity of inference to untestable assumptions being made when creating the matched file.
Abstract: Statistically matched files are created in an attempt to solve the practical problem that exists when no single file has the full set of variables needed for drawing important inferences. Previous methods of file matching are reviewed, and the method of file concatenation with adjusted weights and multiple imputations is described and illustrated on an artificial example. A major benefit of this approach is the ability to display sensitivity of inference to untestable assumptions being made when creating the matched file.

436 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined the properties of generalized method of moments estimators of utility function parameters and found that with short lags, the estimates are nearly asymptotically optimal, but with longer lags the estimates concentrate around biased values and confidence intervals become misleading.
Abstract: The article examines the properties of generalized method of moments (GMM) estimators of utility function parameters. The research strategy is to apply the GMM procedure to generated data on asset returns from stochastic exchange economies; discrete methods and Markov chain models are used to approximate the solutions to the integral equations for the asset prices. The findings are as follows: (a) There is a variance/bias trade-off regarding the number of lags used to form instruments; with short lags, the estimates of utility function parameters are nearly asymptotically optimal, but with longer lags the estimates concentrate around biased values and confidence intervals become misleading. (b) The test of the overidentifying restrictions performs well in small samples; if anything, the test is biased toward acceptance of the null hypothesis.
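The toy example below sketches the mechanics being studied: a single coefficient is estimated by two-step GMM with instruments built from an increasing number of lags, and the overidentifying-restrictions (J) statistic is computed. It is a linear stand-in for the article's asset-pricing Euler equations, and the data and lag choices are made up.

```python
"""Toy linear GMM with lagged instruments and an overidentification (J) test.

A sketch only: a single coefficient `beta` is estimated from the moment
condition E[z_t * (y_t - beta * x_t)] = 0, with instruments built from
`n_lags` lags of x.
"""
import numpy as np

rng = np.random.default_rng(1)
T, beta_true = 400, 0.9
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * x[t - 1] + rng.normal()
y = beta_true * x + rng.normal(scale=0.5, size=T)

def gmm(n_lags):
    # instruments: constant and the first n_lags lags of x
    Z = np.column_stack([np.ones(T - n_lags)] +
                        [x[n_lags - l:T - l] for l in range(1, n_lags + 1)])
    yy, xx = y[n_lags:], x[n_lags:]
    W = np.eye(Z.shape[1])                       # first step: identity weight
    for _ in range(2):                           # two-step GMM
        gbar = lambda b: (Z * (yy - b * xx)[:, None]).mean(0)
        num = (Z * xx[:, None]).mean(0)          # derivative of the moments
        a = (Z * yy[:, None]).mean(0)
        # linear model -> closed-form minimiser of gbar(b)' W gbar(b)
        b_hat = (num @ W @ a) / (num @ W @ num)
        u = Z * (yy - b_hat * xx)[:, None]       # moment contributions
        W = np.linalg.inv(u.T @ u / len(yy))     # optimal weight (no HAC here)
    J = len(yy) * gbar(b_hat) @ W @ gbar(b_hat)  # overidentification statistic
    return b_hat, J, Z.shape[1] - 1              # J ~ chi2(#moments - 1)

for lags in (1, 4, 12):
    b, J, dof = gmm(lags)
    print(f"lags={lags:2d}  beta_hat={b:.3f}  J={J:.2f}  dof={dof}")
```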

355 citations


Journal ArticleDOI
TL;DR: In this paper, the authors test a variety of such methods in the context of combining forecasts of GNP from four major econometric models and find that a simple average, the normal model with an independence assumption, and the Bayesian model perform better than the other approaches that are studied here.
Abstract: A method for combining forecasts may or may not account for dependence and differing precision among forecasts. In this article we test a variety of such methods in the context of combining forecasts of GNP from four major econometric models. The methods include one in which forecasting errors are jointly normally distributed and several variants of this model as well as some simpler procedures and a Bayesian approach with a prior distribution based on exchangeability of forecasters. The results indicate that a simple average, the normal model with an independence assumption, and the Bayesian model perform better than the other approaches that are studied here.
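A minimal sketch of two of the combination rules discussed, assuming made-up forecast errors for four hypothetical models: an equal-weight average and inverse-variance weights (the normal model with an independence assumption). The covariance-based and Bayesian variants would replace the diagonal weighting step.

```python
"""Two simple forecast-combination rules: an equal-weight average and weights
proportional to inverse error variance. All numbers below are invented."""
import numpy as np

# past forecast errors of four hypothetical models, in percentage points
errors = np.array([
    [0.6, -0.4, 1.1, 0.2],
    [-0.3, 0.5, 0.9, -0.1],
    [0.8, -0.2, 1.4, 0.4],
    [0.1, 0.3, 0.7, -0.3],
    [-0.5, 0.2, 1.0, 0.1],
])
new_forecasts = np.array([3.1, 2.8, 3.9, 3.0])    # current-period GNP growth forecasts

simple_average = new_forecasts.mean()

var = errors.var(axis=0, ddof=1)                  # each model's error variance
w = (1.0 / var) / (1.0 / var).sum()               # inverse-variance weights
precision_weighted = w @ new_forecasts

print("simple average      :", round(simple_average, 3))
print("inverse-variance mix:", round(precision_weighted, 3))
print("weights             :", np.round(w, 3))
```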

343 citations


Journal ArticleDOI
TL;DR: In this paper, maximum likelihood techniques for estimating consumer demand functions when budget constraints are piecewise linear are exposited and surveyed, and it is shown that the functions are themselves nonlinear as a result.
Abstract: In this article maximum likelihood techniques for estimating consumer demand functions when budget constraints are piecewise linear are exposited and surveyed. Consumer demand functions are formally derived under such constraints, and it is shown that the functions are themselves nonlinear as a result. The econometric problems in estimating such functions are exposited, and the importance of the stochastic specification is stressed, in particular the specification of both unobserved heterogeneity of preferences and measurement error. Econometric issues in estimation and testing are discussed, and the results of the studies that have been conducted to date are surveyed.

254 citations


Journal ArticleDOI
TL;DR: This article compares the accuracy of these VAR forecasts with that of several prominent forecasters and provides an interpretation of this evidence and speculations on the potential of these two approaches to economic modeling and forecasting.
Abstract: (1986). Forecasting Accuracy of Alternative Techniques: A Comparison of U.S. Macroeconomic Forecasts. Journal of Business & Economic Statistics: Vol. 4, No. 1, pp. 5-15.

152 citations


Journal ArticleDOI
TL;DR: In this paper, two extensions to the ARMA model, bilinearity and ARCH errors, are compared and their combination is considered, along with various least squares and maximum likelihood estimates of the parameters and tests of the estimated models based on these.
Abstract: Two extensions to the ARMA model, bilinearity and ARCH errors, are compared, and their combination is considered. Starting with the ARMA model, tests for each extension are discussed, along with various least squares and maximum likelihood estimates of the parameters and tests of the estimated models based on these. The effects each may have on the identification, estimation, and testing of the other are given, and it is seen that to distinguish between the two properly, it is necessary to combine them into a bilinear model with ARCH errors. Some consequences of the misspecification caused by considering only the ARMA model are noted, and the methods are applied to two real time series.
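The simulation sketch below generates data from one simple bilinear model with ARCH(1) errors (an assumed specification, not the article's) and shows the kind of misspecification signature left behind when only an AR model is fitted: the squared residuals remain autocorrelated.

```python
"""Simulate a simple bilinear model with ARCH(1) errors:

    y_t = a*y_{t-1} + b*y_{t-1}*e_{t-1} + e_t,   e_t ~ N(0, h_t),
    h_t = w + alpha*e_{t-1}^2.

Parameter values are arbitrary; an ARMA-only fit leaves the bilinear and
ARCH structure in the residuals.
"""
import numpy as np

rng = np.random.default_rng(2)
T = 1000
a, b = 0.5, 0.3          # AR and bilinear coefficients
w, alpha = 0.5, 0.4      # ARCH(1) parameters

y = np.zeros(T)
e = np.zeros(T)
for t in range(1, T):
    h = w + alpha * e[t - 1] ** 2            # conditional variance
    e[t] = rng.normal(scale=np.sqrt(h))
    y[t] = a * y[t - 1] + b * y[t - 1] * e[t - 1] + e[t]

# residuals from a plain AR(1) fit ignore both extensions ...
phi = np.polyfit(y[:-1], y[1:], 1)[0]
resid = y[1:] - phi * y[:-1]

# ... so squared residuals stay autocorrelated (an ARCH-type signature)
r = resid ** 2
acf1 = np.corrcoef(r[:-1], r[1:])[0, 1]
print(f"AR(1) coefficient: {phi:.3f}, lag-1 ACF of squared residuals: {acf1:.3f}")
```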

95 citations


ReportDOI
TL;DR: In this paper, a hedonic model of automobile prices that takes gasoline costs into account is developed and used to examine whether gasoline price increases (especially those related to the 1973 and 1979 oil shocks) changed consumer evaluations of the relative qualities of used cars in the U.S. during 1970-1981.
Abstract: A hedonic model of automobile prices that takes gasoline costs into account is developed and used to examine whether gasoline price increases (especially those related to the 1973 and 1979 oil shocks) changed consumer evaluations of the relative qualities of used cars in the U.S. during 1970–1981. We test the null hypothesis that the characteristics' coefficients remained constant over time. It is rejected if gasoline costs are excluded from the model but not if they are included. Alternative approaches are developed to show that the gasoline price increases alone can explain much of the observed changes in the coefficients.
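A stylized version of the coefficient-constancy test is sketched below: simulated car prices depend on an expected fuel-cost variable, and a Chow-type F test rejects constancy across the oil-shock split when fuel cost is omitted but not when it is included. All data, attributes, and parameter values are invented.

```python
"""Chow-type test of coefficient constancy in a simulated hedonic price
regression, with and without an expected fuel-cost regressor."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 200
weight = rng.normal(1.5, 0.3, n)            # tonnes (hypothetical attribute)
mpg = rng.normal(20, 4, n)
post_shock = (np.arange(n) >= n // 2)       # second half = after the oil shock
gas_price = np.where(post_shock, 2.0, 1.0)  # $/gallon, stylised
fuel_cost = gas_price * 10000 / mpg / 1000  # fuel outlay for 10,000 miles, $000

log_price = 1.0 + 0.8 * weight - 0.15 * fuel_cost + rng.normal(0, 0.1, n)

def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return ((y - X @ beta) ** 2).sum(), X.shape[1]

def chow(X, y, split):
    rss_pooled, k = rss(X, y)
    rss_a, _ = rss(X[~split], y[~split])
    rss_b, _ = rss(X[split], y[split])
    f = ((rss_pooled - rss_a - rss_b) / k) / ((rss_a + rss_b) / (len(y) - 2 * k))
    return f, 1 - stats.f.cdf(f, k, len(y) - 2 * k)

X_no_fuel = np.column_stack([np.ones(n), weight, mpg])
X_fuel = np.column_stack([np.ones(n), weight, fuel_cost])

print("without fuel cost: F=%.2f p=%.4f" % chow(X_no_fuel, log_price, post_shock))
print("with fuel cost   : F=%.2f p=%.4f" % chow(X_fuel, log_price, post_shock))
```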

90 citations


Journal ArticleDOI
TL;DR: A recently developed statistical model, called Bayesian vector autoregression (BVAR), has proven to be a useful tool for economic forecasting as mentioned in this paper; such a model forecasts a strong resurgence of growth in the second half of 1985 and in 1986.
Abstract: A recently developed statistical model, called Bayesian vector autoregression, has proven to be a useful tool for economic forecasting. Such a model today forecasts a strong resurgence of growth in the second half of 1985 and in 1986.

90 citations


Journal ArticleDOI
TL;DR: In this article, a state-space representation derived from an underlying vector autoregressive process of the expected real interest rate and the expected inflation rate on lagged expectations and lagged values of the observed Treasury bill rate and actual inflation rate is used to estimate the role of inflationary expectations in stock price movements.
Abstract: Hamilton developed a technique for estimating financial market expectations of inflation based on the observed time-series properties of interest rates and inflation. The technique is based on a state-space representation derived from an underlying vector autoregressive process of the expected real interest rate and the expected inflation rate on lagged expectations and lagged values of the observed Treasury bill rate and the actual inflation rate. This article extends this work in two ways. First, we use monthly data, since the quarterly data used by Hamilton may obscure many interesting movements, especially for determining the role of inflationary expectations in stock price movements, and this is one of our primary interests. Second, we employ an alternative method developed by Burmeister and Wall for estimating the parameters of the model, and this method leads to a different identification proof. Both approaches share the use of the Kalman filter to estimate the unobserved variables, in this case, e...
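The sketch below illustrates only the shared Kalman-filter machinery on a minimal local-level model: an unobserved state (standing in for an expected rate) is extracted from a noisy observable. The Hamilton and Burmeister and Wall state vectors and measurement equations are considerably richer; the noise variances here are arbitrary.

```python
"""Minimal Kalman filter for a local-level model: extract an unobserved
random-walk state from noisy observations."""
import numpy as np

rng = np.random.default_rng(4)
T = 200
q, r = 0.05, 0.5                     # state and measurement noise variances

true_state = np.cumsum(rng.normal(scale=np.sqrt(q), size=T))   # random walk
obs = true_state + rng.normal(scale=np.sqrt(r), size=T)        # noisy reading

state_est = np.zeros(T)
a, p = 0.0, 1.0                      # prior mean and variance of the state
for t in range(T):
    # prediction step (random-walk transition)
    p = p + q
    # update step
    k_gain = p / (p + r)
    a = a + k_gain * (obs[t] - a)
    p = (1 - k_gain) * p
    state_est[t] = a

rmse_raw = np.sqrt(np.mean((obs - true_state) ** 2))
rmse_kf = np.sqrt(np.mean((state_est - true_state) ** 2))
print(f"RMSE using raw observations: {rmse_raw:.3f}, after filtering: {rmse_kf:.3f}")
```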

86 citations



Journal ArticleDOI
TL;DR: In this article, the authors present a technique called sampling the future for including this feature in both the estimation and forecasting stages of the Box-Jenkins methodology for univariate time series models.
Abstract: The Box–Jenkins methodology for modeling and forecasting from univariate time series models has long been considered a standard to which other forecasting techniques have been compared. To a Bayesian statistician, however, the method lacks an important facet—a provision for modeling uncertainty about parameter estimates. We present a technique called sampling the future for including this feature in both the estimation and forecasting stages. Although it is relatively easy to use Bayesian methods to estimate the parameters in an autoregressive integrated moving average (ARIMA) model, there are severe difficulties in producing forecasts from such a model. The multiperiod predictive density does not have a convenient closed form, so approximations are needed. In this article, exact Bayesian forecasting is approximated by simulating the joint predictive distribution. First, parameter sets are randomly generated from the joint posterior distribution. These are then used to simulate future paths of the time se...
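A compact sketch of the sampling-the-future idea for an AR(1) model, assuming a normal/inverse-chi-squared approximation to the posterior rather than the article's exact Bayesian treatment of ARIMA models: each simulated path is generated under a fresh parameter draw, so the predictive intervals reflect parameter uncertainty.

```python
"""Sampling the future for an AR(1) model: draw parameter sets from an
approximate posterior, then simulate future paths conditional on each draw."""
import numpy as np

rng = np.random.default_rng(5)

# observed series from an AR(1) process
T, phi_true, sigma_true = 150, 0.7, 1.0
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.normal(scale=sigma_true)

# least-squares fit gives the centre and spread of an approximate posterior
X, z = y[:-1], y[1:]
phi_hat = (X @ z) / (X @ X)
resid = z - phi_hat * X
s2 = resid @ resid / (len(z) - 1)
phi_se = np.sqrt(s2 / (X @ X))

# sample the future: parameter draw -> simulated path, repeated many times
n_draws, horizon = 2000, 8
paths = np.zeros((n_draws, horizon))
for i in range(n_draws):
    phi_i = rng.normal(phi_hat, phi_se)                            # slope draw
    sigma_i = np.sqrt(s2 * (len(z) - 1) / rng.chisquare(len(z) - 1))  # scale draw
    level = y[-1]
    for h in range(horizon):
        level = phi_i * level + rng.normal(scale=sigma_i)
        paths[i, h] = level

lower, upper = np.percentile(paths, [5, 95], axis=0)
print("90% predictive interval at each horizon:")
for h in range(horizon):
    print(f"  h={h + 1}: [{lower[h]: .2f}, {upper[h]: .2f}]")
```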

Journal ArticleDOI
TL;DR: In this article, the authors show that if the biases contain a multiplicative aspect, both estimators of change are then biased, and also present some empirical results that cast doubt on the validity of a purely additive model.
Abstract: Previous analysis of rotation group bias in the Current Population Survey has concluded that if the biases are additive, the ratio and composite estimators of month-to-month change in unemployment are unbiased. This article shows that if the biases contain a multiplicative aspect, both estimators of change are then biased. The article also presents some empirical results that cast doubt on the validity of a purely additive model.

Journal ArticleDOI
TL;DR: In this article, the authors identify and assess the contribution of errors in preliminary data to the forecast error and to forecast error variance of linear dynamic simultaneous equation models of the Italian economy.
Abstract: Econometric models, especially when designed for forecasting purposes, tend to use updated economic series, the last figure(s) of which embed first-published or preliminary data errors. This article identifies and assesses the contribution of errors in preliminary data to the forecast error and to the forecast error variance of linear dynamic simultaneous equation models. The effect of preliminary data errors is shown to be pervasive, although not necessarily weighty. The suggested decomposition of the forecast error is applied to a small macroeconometric model of the Italian economy.

Journal ArticleDOI
TL;DR: In this paper, the joint asymptotic distribution of the upper and lower bounds for the Gini index derived by Gastwirth for grouped data is obtained, from which a conservative distribution-free confidence interval is presented.
Abstract: The joint asymptotic distribution of the upper and lower bounds for the Gini index derived by Gastwirth for grouped data is obtained. From it a conservative asymptotically distribution-free confidence interval for the population Gini index is presented. The methods also yield similar results for other indices of inequality (e.g., Theil's and Atkinson's).
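The sketch below computes Gastwirth-style lower and upper Gini bounds from grouped data: the lower bound places everyone in a bracket at the bracket mean, the upper bound spreads each bracket's population between its endpoints while preserving the mean. The brackets are invented, and the article's asymptotic confidence interval built on the joint distribution of the bounds is not reproduced.

```python
"""Lower and upper bounds on the Gini index from grouped (bracketed) data."""
import numpy as np

def gini_discrete(values, weights):
    """Gini index of a discrete distribution (values with probability weights)."""
    order = np.argsort(values)
    v, w = np.asarray(values, float)[order], np.asarray(weights, float)[order]
    w = w / w.sum()
    P = np.concatenate([[0.0], np.cumsum(w)])          # cumulative population share
    L = np.concatenate([[0.0], np.cumsum(w * v)])
    L = L / L[-1]                                       # cumulative income share
    return 1.0 - np.sum((P[1:] - P[:-1]) * (L[1:] + L[:-1]))

# hypothetical brackets: (lower limit, upper limit, population share, mean income)
brackets = [
    (0,      10_000,  0.25,  6_000),
    (10_000, 25_000,  0.35, 17_000),
    (25_000, 50_000,  0.30, 34_000),
    (50_000, 100_000, 0.10, 65_000),
]

# lower bound: no inequality within brackets (everyone at the bracket mean)
lo = gini_discrete([m for *_, m in brackets], [p for _, _, p, _ in brackets])

# upper bound: each bracket split between its endpoints, preserving the mean
vals, wts = [], []
for a, b, p, m in brackets:
    f = (b - m) / (b - a)      # share of the bracket placed at the lower limit
    vals += [a, b]
    wts += [p * f, p * (1 - f)]
up = gini_discrete(vals, wts)

print(f"Gini bounds from grouped data: [{lo:.3f}, {up:.3f}]")
```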

Journal ArticleDOI
TL;DR: In this paper, a bootstrap method is proposed for the choice of the ridge parameter, which provides independent measures of prediction errors based on multiple predictions along with an estimate of the standard error of prediction.
Abstract: Several existing methods for the choice of the ridge parameter are reviewed, and a bootstrap method is proposed. The bootstrap provides independent measures of prediction errors based on multiple predictions along with an estimate of the standard error of prediction. The bootstrap and selected competitors are compared through Monte Carlo simulations for various degrees of design matrix collinearity and varying levels of signal-to-noise ratio. The procedure is also illustrated by application to two published data sets. In one case, the bootstrap choice of the ridge parameter leads to a smaller mean squared error of prediction than the ridge trace method. In the second case, an optimal choice of no perturbation is confirmed. Benefits of the bootstrap choice include its less subjective nature, ease of implementation, and robustness.
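A minimal version of the bootstrap selection rule is sketched below: for each candidate ridge parameter, the model is refit on bootstrap resamples and scored by squared prediction error on the observations left out of each resample. The collinear design, candidate grid, and scoring details are simplifications for illustration.

```python
"""Bootstrap choice of the ridge parameter via out-of-resample prediction error."""
import numpy as np

rng = np.random.default_rng(6)

# collinear design with a modest signal-to-noise ratio
n, p = 60, 5
base = rng.normal(size=(n, 1))
X = base + 0.1 * rng.normal(size=(n, p))          # highly correlated columns
beta = np.array([1.0, 0.5, -0.5, 0.0, 0.25])
y = X @ beta + rng.normal(scale=1.0, size=n)

Xc, yc = X - X.mean(0), y - y.mean()              # centre (no penalty on intercept)

def ridge(Xt, yt, k):
    return np.linalg.solve(Xt.T @ Xt + k * np.eye(Xt.shape[1]), Xt.T @ yt)

grid = [0.0, 0.01, 0.1, 0.5, 1.0, 5.0]
B = 200
scores = {k: [] for k in grid}
for _ in range(B):
    idx = rng.integers(0, n, n)                   # bootstrap resample
    oob = np.setdiff1d(np.arange(n), idx)         # left-out observations
    if oob.size == 0:
        continue
    for k in grid:
        b = ridge(Xc[idx], yc[idx], k)
        pred_err = yc[oob] - Xc[oob] @ b
        scores[k].append(np.mean(pred_err ** 2))

for k in grid:
    m = np.mean(scores[k])
    se = np.std(scores[k]) / np.sqrt(len(scores[k]))
    print(f"k={k:<5} bootstrap MSE of prediction = {m:.3f} (se {se:.3f})")
```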

Journal ArticleDOI
TL;DR: This paper used sectoral data covering the entire production side of the U.S. economy and employed substitution elasticities calculated from these models as dependent variables in the statistical search for systematic relationships between features of the econometric models and perceptions of the production technologies generated by those models.
Abstract: Earlier attempts at reconciling disparate substitution elasticity estimates examined differences in separability hypotheses, data bases, and estimation techniques, as well as methods employed to construct capital service prices. Although these studies showed that differences in elasticity estimates between two or three studies may be attributable to the aforementioned features of the econometric models, they have been unable to demonstrate this link statistically and establish the existence of systematic relationships between features of the econometric models and the perception of production technologies generated by those models. Using sectoral data covering the entire production side of the U.S. economy, we estimate 34 production models for alternative definitions of the capital service price. We employ substitution elasticities calculated from these models as dependent variables in the statistical search for systematic relationships between features of the econometric models and perceptions of the sec...

Journal ArticleDOI
TL;DR: In this article, it is argued that the normality assumption of the error terms is more appropriate in the linear logit model than in a share equation model with additive disturbances, and their implications for model estimation are discussed in that context.
Abstract: The implications of including autoregressive disturbances in linear logit models of demand systems are explored. It is argued that the normality assumption of the error terms is more appropriate in the linear logit model than in a share equation model with additive disturbances (commonly found in the literature). Autoregressive disturbances and their implications for model estimation are discussed in that context. Both theoretical arguments and empirical evidence are presented in favor of the logit specification given the presence of serial correlation.

Journal ArticleDOI
TL;DR: In this article, a class of estimators for linear structural models that are robust to heavytailed disturbance distributions, gross errors in either the endogenous or exogenous variables, and certain other model failures is presented.
Abstract: This article presents a class of estimators for linear structural models that are robust to heavy-tailed disturbance distributions, gross errors in either the endogenous or exogenous variables, and certain other model failures. The class of estimators modifies ordinary two-stage least squares by replacing each least squares regression by a bounded-influence regression. Conditions under which the estimators are qualitatively robust, consistent, and asymptotically normal are established, and an empirical example is presented.
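The sketch below conveys the flavor of the estimator by replacing each stage of ordinary two-stage least squares with a Huber M-regression. The article's bounded-influence weights also control leverage, which this simplified version omits; the structural model and the contamination are simulated.

```python
"""Robust two-stage least squares sketch: each OLS stage replaced by a Huber
M-regression fitted by iteratively reweighted least squares."""
import numpy as np

rng = np.random.default_rng(7)

def huber_regression(X, y, c=1.345, n_iter=50):
    """Huber M-estimate via iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12          # robust scale
        w = np.clip(c * s / np.maximum(np.abs(r), 1e-12), None, 1.0)
        beta = np.linalg.lstsq(X * np.sqrt(w)[:, None],
                               y * np.sqrt(w), rcond=None)[0]
    return beta

# simulated structural model: y = b0 + b1*x + u, x endogenous, z an instrument
n = 300
z = rng.normal(size=n)
u = rng.normal(size=n)
x = 0.8 * z + 0.5 * u + rng.normal(size=n)
y = 1.0 + 2.0 * x + u
y[:5] += 15.0                                               # gross errors in y

Z = np.column_stack([np.ones(n), z])

def two_stage(fit):
    x_hat = Z @ fit(Z, x)                                   # first stage
    X_hat = np.column_stack([np.ones(n), x_hat])
    return fit(X_hat, y)                                    # second stage

ols = lambda X_, y_: np.linalg.lstsq(X_, y_, rcond=None)[0]
print("plain 2SLS  :", np.round(two_stage(ols), 3))
print("robust 2SLS :", np.round(two_stage(huber_regression), 3))
```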

Journal ArticleDOI
TL;DR: In this article, the authors proposed a partial adjustment model in which the speed of adjustment is a linear function of policy and economic variables rather than fixed, and applied the model to analyze the behavior of lumber and pulpwood production.
Abstract: This article proposes a partial adjustment model in which the speed of adjustment is a linear function of policy and economic variables rather than fixed. The model is applied to analyze the behavior of lumber and pulpwood production. Since the model is overparameterized and intrinsically nonlinear, estimation is carried out by nonlinear least squares, using quarterly data covering the period from 1961 through 1983. The adjustment speeds of both lumber and pulpwood production are found to vary inversely with the discrepancy between the actual and the expected interest rate. The adjustment speed of lumber production is also found to shift positively with the difference between the actual and the expected government expenditure. In addition, the results show that the adjustment speed of lumber production suffers from a long-term declining trend, whereas the adjustment speed of pulpwood production exhibits a long-term upward sweep, and that desired outputs of lumber and pulpwood respond negati...
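A simulated version of the setup is sketched below: the adjustment speed is a linear function of a conditioning variable, the desired level is a linear function of a demand shifter, and the parameters are recovered by nonlinear least squares (here via scipy). Variable names and values are hypothetical, not the lumber and pulpwood data.

```python
"""Partial adjustment model with a variable adjustment speed, fitted by
nonlinear least squares on simulated data."""
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(8)
T = 200
z = rng.normal(size=T)                 # e.g. an interest-rate surprise (hypothetical)
w = rng.normal(size=T)                 # demand shifter driving desired output

# true structure: desired output q* = a0 + a1*w, speed lam = g0 + g1*z
a0, a1, g0, g1 = 2.0, 1.0, 0.5, -0.2
q = np.zeros(T)
for t in range(1, T):
    lam = np.clip(g0 + g1 * z[t], 0.05, 0.95)
    q_star = a0 + a1 * w[t]
    q[t] = q[t - 1] + lam * (q_star - q[t - 1]) + rng.normal(scale=0.1)

def residuals(theta):
    a0_, a1_, g0_, g1_ = theta
    lam = g0_ + g1_ * z[1:]
    q_star = a0_ + a1_ * w[1:]
    fitted = q[:-1] + lam * (q_star - q[:-1])
    return q[1:] - fitted

fit = least_squares(residuals, x0=np.array([1.0, 1.0, 0.3, 0.0]))
print("estimates (a0, a1, g0, g1):", np.round(fit.x, 3))
```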

Journal ArticleDOI
TL;DR: A wide variety of time series techniques are now used for generating forecasts of economic variables, with each technique attempting to summarize and exploit whatever regularities exist in a given data set.
Abstract: A wide variety of time series techniques are now used for generating forecasts of economic variables, with each technique attempting to summarize and exploit whatever regularities exist in a given data set. It appears that many researchers arbitrarily choose one of these techniques. The purpose of this article is to provide an example for which the choice of time series technique appears important; merely choosing arbitrarily among available techniques may lead to suboptimal results.

Journal ArticleDOI
TL;DR: In this paper, a model-based approach is used to derive quarterly figures on several variables for the aggregate labor market in the Netherlands that are only observed annually. And attention is given to the properties of estimation procedures based on proxy variables.
Abstract: We use a model-based approach to derive quarterly figures on several variables for the aggregate labor market in the Netherlands that are only observed annually. These approximations are conditional expectations derived from univariate and bivariate quarterly time series models for the series under consideration. They are subsequently used as proxies to estimate and analyze the structural labor market equations. Attention is given to the properties of estimation procedures based on proxy variables.

Journal ArticleDOI
TL;DR: In this article, the authors extend the regression model to a multivariate model that captures the correlation among the variables and allows the errors in the model to be correlated over time, which is called cross-lagged panel studies.
Abstract: Cross-lagged panel studies are studies in which two or more variables are measured for a large number of subjects at each of several points in time. The variables divide naturally into two sets, and the purpose of the analysis is to estimate and test the cross-effects between the two sets. One approach to this analysis is to treat the cross-effects as parameters in regression equations. This study contributes to this approach by extending the regression model to a multivariate model that captures the correlation among the variables and allows the errors in the model to be correlated over time.

Journal ArticleDOI
TL;DR: In this article, a variety of estimation procedures are used to assess the importance of attrition bias in labor supply response to the Seattle and Denver Income Maintenance Experiments (SIME/DIME).
Abstract: Sample attrition is a potentially serious problem for analysis of panel data, particularly experimental panel data. In this article, a variety of estimation procedures are used to assess the importance of attrition bias in labor supply response to the Seattle and Denver Income Maintenance Experiments (SIME/DIME). Data from Social Security Administration earnings records and the SIME/DIME public use file are used to test various hypotheses concerning attrition bias. The study differs from previous research in that data on both attriters and nonattriters are used to estimate the experimental labor supply response. Although not conclusive, the analysis suggests that attrition bias is probably not a serious enough problem in the SIME/DIME data to warrant extensive correction procedures. The methodology used in this study could be applied to other panel data sets.

Journal ArticleDOI
TL;DR: In this paper, the authors distinguish between currency and bank payments on one side and several types of transactions and the transfer of idle money on the other, and make an attempt to measure these variables, with varying success.
Abstract: In the identity of exchange I distinguish between currency and bank payments on one side and several types of transactions and the transfer of idle money on the other. An attempt is made to measure these variables, with varying success. On the payments side I argue that currency velocity is constant (and low) and that the vast rise of bank money velocity is largely due to increased short-term investment of idle funds. The results suggest an upward shift in the level of transactions in 1968–1972, which I attribute to changes in the international role of the dollar.

Journal ArticleDOI
TL;DR: A spectral decomposition method is described for obtaining an upper bound on the amount of measurement error in a time series; the bounds also provide insight into the stochastic specification of the errors.
Abstract: A spectral decomposition method is described for obtaining an upper bound on the amount of measurement error in a time series. The method is applied to generated data and to M1b, real GNP, and the CPI. The bounds provide insight into both the amount of measurement error in these series and the stochastic specification of the errors.
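One simple spectral bound of this general kind (not necessarily the article's exact decomposition) is sketched below: if the measurement error is white noise, its spectrum is flat, so its variance cannot exceed the lowest point of the observed series' smoothed periodogram. The series is simulated.

```python
"""A crude spectral upper bound on white measurement-error variance."""
import numpy as np

rng = np.random.default_rng(11)
T = 2048
signal = np.zeros(T)
for t in range(1, T):                       # persistent "true" series
    signal[t] = 0.95 * signal[t - 1] + rng.normal(scale=0.5)
noise_var_true = 0.4
observed = signal + rng.normal(scale=np.sqrt(noise_var_true), size=T)

x = observed - observed.mean()
per = np.abs(np.fft.rfft(x)) ** 2 / T       # periodogram: level = sigma^2 for white noise

# smooth the periodogram so its minimum is not dominated by sampling noise
width = 31
kernel = np.ones(width) / width
smoothed = np.convolve(per[1:], kernel, mode="valid")   # drop the zero frequency

upper_bound = smoothed.min()
print(f"true measurement-error variance: {noise_var_true}")
print(f"spectral upper bound estimate  : {upper_bound:.3f}")
```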

Journal ArticleDOI
TL;DR: In this paper, the authors examine the advantages of estimating a third-order rather than a second-order translog utility function in a theoretical and empirical context and demonstrate that the rigor of tests for appropriate functional form is increased by increasing the order of approximation.
Abstract: This article examines the advantages of estimating a third-order rather than a second-order translog utility function in a theoretical and empirical context. It is demonstrated that the rigor of tests for appropriate functional form is increased by increasing the order of approximation. An empirical example demonstrates that a second-order approximation can lead to inconsistent parameter estimates, whereas the third-order translog allows for better modeling of the preference structure and more consistent estimates for policy decisions.

Journal ArticleDOI
TL;DR: This paper used Box-Jenkins univariate time series analysis for four defined variables (real interest rate, money multiplier, real GNP, and money velocity) and compared the forecasting accuracy of the two methods.
Abstract: Many important variables in business and economics are neither measured nor measurable but are simply defined in terms of other measured variables. For instance, the real interest rate is defined as the difference between the nominal interest rate and the inflation rate. There are two ways to forecast a defined variable: one can directly forecast the variable itself, or one can derive the forecast of the defined variable indirectly from the forecasts of the constituent variables. Using Box-Jenkins univariate time series analysis for four defined variables—real interest rate, money multiplier, real GNP, and money velocity—the forecasting accuracy of the two methods is compared. The results show that indirect forecasts tend to outperform direct methods for these defined variables.
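The sketch below mimics the comparison with simple AR(1) fits instead of full Box-Jenkins models: the real interest rate is forecast directly and, indirectly, as the difference between forecasts of the nominal rate and inflation. All series are simulated, so the numbers only illustrate the mechanics.

```python
"""Direct versus indirect forecasting of a defined variable (real interest
rate = nominal rate minus inflation), using simple AR(1) forecasts."""
import numpy as np

rng = np.random.default_rng(9)

def simulate_ar1(phi, mu, sigma, T):
    x = np.full(T, mu)
    for t in range(1, T):
        x[t] = mu + phi * (x[t - 1] - mu) + rng.normal(scale=sigma)
    return x

def ar1_forecast(x):
    """Fit an AR(1) on all but the last point and forecast that point."""
    hist = x[:-1]
    mu = hist.mean()
    d = hist - mu
    phi = (d[:-1] @ d[1:]) / (d[:-1] @ d[:-1])
    return mu + phi * (hist[-1] - mu)

T, n_rep = 120, 500
err_direct, err_indirect = [], []
for _ in range(n_rep):
    nominal = simulate_ar1(0.9, 6.0, 0.3, T)
    inflation = simulate_ar1(0.8, 4.0, 0.5, T)
    real = nominal - inflation
    target = real[-1]
    err_direct.append(ar1_forecast(real) - target)
    err_indirect.append(ar1_forecast(nominal) - ar1_forecast(inflation) - target)

print("RMSE, direct forecast of the real rate :",
      np.sqrt(np.mean(np.square(err_direct))).round(3))
print("RMSE, nominal minus inflation forecasts:",
      np.sqrt(np.mean(np.square(err_indirect))).round(3))
```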


Journal ArticleDOI
TL;DR: In this paper, it is shown that exponential smoothing of seasonal differences provides a simple means of damping short oscillations, including seasonal ones, which can be used to filter economic time series if variations within a certain frequency interval are relevant for the problem at hand.
Abstract: Filtering economic time series can be justified if variations within a certain frequency interval are relevant for the problem at hand. It is shown here that exponential smoothing of seasonal differences provides a simple means of damping short oscillations, including seasonal ones.
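A small sketch of the filter: take seasonal differences, then exponentially smooth them. On a simulated monthly series the seasonal component cancels and the smoothing damps the remaining short oscillations; the smoothing constant and seasonal period are illustrative choices.

```python
"""Exponential smoothing of seasonal differences as a simple low-pass filter."""
import numpy as np

rng = np.random.default_rng(10)
T, s, alpha = 144, 12, 0.3              # monthly data, smoothing constant

t = np.arange(T)
trend = 0.05 * t
seasonal = 2.0 * np.sin(2 * np.pi * t / s)
y = trend + seasonal + rng.normal(scale=0.3, size=T)

# seasonal differences remove the stable seasonal pattern ...
d = y[s:] - y[:-s]

# ... and exponential smoothing damps what is left of the short oscillations
smooth = np.zeros_like(d)
smooth[0] = d[0]
for i in range(1, len(d)):
    smooth[i] = alpha * d[i] + (1 - alpha) * smooth[i - 1]

print("variance of seasonal differences :", d.var().round(3))
print("variance after exponential smooth:", smooth.var().round(3))
print("implied trend slope (mean / s)   :", (smooth.mean() / s).round(3))
```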