Showing papers on "STAR model" published in 1978


Journal Article
TL;DR: In this article, the Lagrange multiplier approach is adopted and it is shown that the test against the nth order autoregressive error model is exactly the same as the test against the nth order moving average alternative.
Abstract: Since dynamic regression equations are often obtained from rational distributed lag models and include several lagged values of the dependent variable as regressors, high order serial correlation in the disturbances is frequently a more plausible alternative to the assumption of serial independence than the usual first order autoregressive error model. The purpose of this paper is to examine the problem of testing against general autoregressive and moving average error processes. The Lagrange multiplier approach is adopted and it is shown that the test against the nth order autoregressive error model is exactly the same as the test against the nth order moving average alternative. Some comments are made on the treatment of serial correlation.

1,304 citations
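The equivalence above can be illustrated numerically. Below is a minimal numpy sketch of the LM statistic described in the abstract (the form later known as the Breusch-Godfrey test), computed as T·R² from an auxiliary regression of the OLS residuals on the original regressors plus n lagged residuals; data and names are illustrative, not from the paper.

```python
import numpy as np

def lm_serial_correlation(y, X, n_lags):
    """LM statistic T * R^2 from the auxiliary regression of the OLS
    residuals on X plus n_lags lagged residuals.  Under the null of
    serial independence it is asymptotically chi-squared(n_lags), and
    the same statistic serves against AR(n) and MA(n) alternatives."""
    T = len(y)
    u = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]    # OLS residuals
    U = np.zeros((T, n_lags))                           # lagged residuals,
    for j in range(1, n_lags + 1):                      # zero-padded
        U[j:, j - 1] = u[:-j]
    Z = np.hstack([X, U])
    e = u - Z @ np.linalg.lstsq(Z, u, rcond=None)[0]    # auxiliary residuals
    r2 = 1.0 - (e @ e) / (u @ u)    # u has mean ~0 when X has an intercept
    return T * r2
```

With serially independent errors the statistic behaves like a chi-squared draw with n_lags degrees of freedom; large values reject serial independence against either alternative.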


01 Jan 1978
TL;DR: In this article, a first-order autoregressive (AR(1)) model is considered whose coefficient is a random variable and may vary over realizations, and it is shown that the moments of the coefficient can be identified in terms of the autocovariances.
Abstract: A first-order autoregressive (AR(1)) model is considered, involving a coefficient that is a random variable, and may vary over realizations. In the usual AR(1) model the coefficient has a degenerate distribution, and is thus constant over realizations. We show how moments of the coefficient can be identified in terms of the autocovariances. Using mixed cross-section and time series data, we show how the moments can be estimated, and establish the strong consistency and asymptotic normality of the estimators. We suggest several parametric forms for the distribution of the coefficient, and show how unknown parameters may be determined. The results are applied to real data.

313 citations
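The setting can be mimicked in a few lines. This is a simulation sketch, not the paper's moment estimator: a coefficient is drawn once per realization, each series is fit by least squares, and the cross-realization average of the fits recovers the coefficient's mean.

```python
import numpy as np

rng = np.random.default_rng(0)
n_real, T = 500, 400

# one coefficient per realization, constant within each series --
# the random-coefficient AR(1) setup (distribution is illustrative)
phis = rng.uniform(0.2, 0.6, n_real)
phi_hats = np.empty(n_real)
for i, phi in enumerate(phis):
    x = np.zeros(T)
    shocks = rng.standard_normal(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + shocks[t]
    # per-realization least-squares estimate of the AR(1) coefficient
    phi_hats[i] = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])

print(round(phi_hats.mean(), 3), round(phis.mean(), 3))
```

The spread of the per-series estimates mixes the coefficient's true variance with estimation noise, which is why the paper works with moments identified from autocovariances rather than raw per-series fits.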


Book Chapter
TL;DR: By asking the log likelihood of a model to be an unbiased estimate of the expectedlog likelihood of the model, a reasonable definition of the likelihood is obtained and this allows us to develop a systematic approach to parametric time series modelling.
Abstract: The conventional approach to parametric model fitting of time series is realized through the comparison of various competing models by some ad hoc criterion. Since each of the models is usually specified by the parameters determined by the information from the data, the extension of the classical concept of likelihood to this situation is not obvious. By asking the log likelihood of a model to be an unbiased estimate of the expected log likelihood of the model, a reasonable definition of the likelihood is obtained and this allows us to develop a systematic approach to parametric time series modelling. Practical utility of this approach is demonstrated by numerical examples.

258 citations
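The bias-corrected log likelihood described here is the rationale behind AIC = -2 log L + 2k. A minimal sketch of order selection by minimum AIC among AR fits, using conditional least squares (names and data are illustrative):

```python
import numpy as np

def ar_aic(x, p):
    """AIC of a conditional least-squares AR(p) fit:
    -2 log L + 2k, with k = p + 1 (p coefficients plus the variance)."""
    n = len(x) - p
    y = x[p:]
    if p == 0:
        resid = y
    else:
        X = np.column_stack([x[p - k : len(x) - k] for k in range(1, p + 1)])
        resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    sigma2 = np.mean(resid ** 2)
    loglik = -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)
    return -2.0 * loglik + 2.0 * (p + 1)

# simulate an AR(2) and pick the order with the smallest AIC
rng = np.random.default_rng(1)
x = np.zeros(3000)
e = rng.standard_normal(3000)
for t in range(2, 3000):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]

aics = [ar_aic(x, p) for p in range(7)]
best = int(np.argmin(aics))
```

The 2k term is exactly the bias correction the chapter motivates: the maximized log likelihood overstates the expected log likelihood by roughly the number of fitted parameters.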


Journal Article
TL;DR: In this paper, a minimum AIC procedure for fitting a locally stationary autoregressive model is proposed and the least squares computation for the procedure is realized by using the Householder transformation.
Abstract: A minimum AIC procedure for the fitting of a locally stationary autoregressive model is proposed. The least squares computation for the procedure is realized by using the Householder transformation which makes the procedure computationally more flexible and efficient than the one originally proposed by Ozaki and Tong.

172 citations
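On the computational point: Householder-based least squares is what a QR factorization provides. A small sketch (not Ozaki and Tong's locally stationary procedure itself) fitting one AR segment via numpy's Householder QR:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()

# design matrix for an AR(1) least-squares fit on one segment
y, X = x[1:], x[:-1, None]

# np.linalg.qr factorizes via Householder reflections (LAPACK dgeqrf);
# solving R @ a = Q.T @ y gives the least-squares AR coefficient
Q, R = np.linalg.qr(X)
coef = np.linalg.solve(R, Q.T @ y)
```

The flexibility mentioned in the abstract comes from updating such factorizations segment by segment instead of refitting each candidate segmentation from scratch.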


Journal Article
TL;DR: In this article, the authors discuss stochastic models for vector processes, in particular the class of multivariate autoregressive moving average models, and present an iterative model building procedure, consisting of model specification and diagnostic checking.

34 citations


Journal Article
TL;DR: The use of least squares estimates and their residual energies for obtaining autoregressive estimates satisfying a stability property is investigated, and an algorithm is presented for efficient calculation of the estimates.
Abstract: The use of least squares estimates and their residual energies for obtaining autoregressive estimates satisfying a stability property is investigated. The partial correlation coefficients are used to provide an appropriate parametrization for this purpose. An algorithm is presented for efficient calculation of the estimates. Recursive versions of the estimate and maximum entropy properties are briefly discussed.

30 citations
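The partial-correlation parametrization guarantees stability: any sequence of reflection coefficients with |k| < 1 maps, via the step-up (Levinson) recursion, to a stable AR model. A sketch of that map with a root check (the coefficient values are illustrative):

```python
import numpy as np

def reflection_to_ar(ks):
    """Step-up recursion: a_m(j) = a_{m-1}(j) - k_m * a_{m-1}(m-j),
    a_m(m) = k_m.  If every |k| < 1, the resulting AR model is stable."""
    a = np.zeros(0)
    for k in ks:
        a = np.concatenate([a - k * a[::-1], [k]])
    return a

a = reflection_to_ar([0.9, -0.8, 0.5])
# stability: all roots of z^3 - a1 z^2 - a2 z - a3 lie inside the unit circle
roots = np.roots(np.concatenate([[1.0], -a]))
```

Optimizing over unconstrained reflection coefficients (each squashed into (-1, 1)) is therefore a convenient way to enforce the stability property during estimation.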


Journal Article
TL;DR: The max χ² technique for estimating the order of autoregressive processes (McClave (1976)) is extended to moving average models by using the inverse autocorrelation function and the subset autoregression algorithm.
Abstract: The max χ² technique for estimating the order of autoregressive processes (McClave (1976)) is extended to moving average models. The autoregressive-moving average duality is exploited by using the inverse autocorrelation function and the subset autoregression algorithm. The technique is demonstrated via simulations, and is applied to Box and Jenkins (1970) Series A.

11 citations


Journal Article
TL;DR: One-sample linear rank tests are considered in this paper for the case where the observations are not independent but come from an autoregressive process; applied to suitable transformations of the observations, the tests have asymptotically the same properties as under independence, both under the hypothesis and under contiguous location alternatives.
Abstract: One-sample linear rank tests are considered for the case where the observations are not independent but come from an autoregressive process. It is proposed to apply the tests under these circumstances to certain transformations of the observations, rather than to the observations themselves. Then the tests have asymptotically the same properties as under independence, both under the hypothesis and under contiguous location alternatives. In particular, they are asymptotically distribution-free.

10 citations


Journal Article
TL;DR: In this paper, a recursive algorithm is presented for computing the resulting estimates for increasing model orders; for high order model fitting it is more economical than standard solutions using Gaussian elimination, and the Levinson-Durbin recursions for the Yule-Walker estimates can be regarded as a special case of the algorithm.
Abstract: Estimates for autoregressive models are obtained by approximating the maximum likelihood estimates in two ways. A recursive algorithm for computing the resulting estimates for increasing model orders is presented. To calculate a pth order estimate, O(p²) arithmetic operations are required; hence for high order model fitting, the method is more economical than standard solutions using Gaussian elimination, for example. The Levinson-Durbin recursions for the Yule-Walker estimates can be regarded as a special case of the algorithm presented here.

10 citations
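The special case mentioned at the end can be written down directly. A compact sketch of the Levinson-Durbin recursion for the Yule-Walker equations, verified against the exact autocovariances of an AR(1):

```python
import numpy as np

def levinson_durbin(gamma, p):
    """Solve the order-p Yule-Walker equations recursively.
    gamma[0..p] are autocovariances; returns the AR coefficients and
    the order-p innovation variance.  Each step costs O(m) work, so
    the whole recursion is O(p^2)."""
    a = np.zeros(0)
    sigma2 = gamma[0]
    for m in range(1, p + 1):
        k = (gamma[m] - a @ gamma[m - 1:0:-1]) / sigma2   # reflection coeff.
        a = np.concatenate([a - k * a[::-1], [k]])        # step-up update
        sigma2 *= 1.0 - k * k                             # variance shrinks
    return a, sigma2

# exact autocovariances of an AR(1) with phi = 0.5, unit innovations:
# gamma_k = phi^k / (1 - phi^2)
gamma = np.array([4.0 / 3.0, 2.0 / 3.0, 1.0 / 3.0])
a, s2 = levinson_durbin(gamma, 2)
```

Fitting order 2 to an exact AR(1) correctly returns a zero second coefficient and unit innovation variance, and a direct Yule-Walker solve by Gaussian elimination would cost O(p³) for the same answer.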


Journal Article
TL;DR: It is shown that the residual variance is represented by the determinants of the autocorrelation matrix which is derived from applying the amount of information measure to the autoregressive process.
Abstract: The extensive investigations of autoregressive representation in the analysis of a stationary time series have been attracting the attention of many research workers, since the representation is useful in the identification and prediction of the systems and in the spectrum estimation of the time series. The problem in fitting an autoregressive model to the observed data lies in the determination of the order of the model. Several methods to estimate the order of the autoregressive model are known, such as the maximum likelihood method or the extended likelihood one, which is called the Final Prediction Error (FPE) method. These methods are based on the residual variance, which is composed of the difference between the linear prediction estimate and the observed data. In this paper it is shown that the residual variance is represented by the determinants of the autocorrelation matrix, which is derived from applying the amount of information measure to the autoregressive process. This representation is a useful one...

4 citations
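The determinant representation is easy to check numerically: the order-p residual (innovation) variance equals det Γ(p+1) / det Γ(p), where Γ(k) is the k×k Toeplitz autocovariance matrix. A sketch using the exact autocovariances of an AR(1) with coefficient 0.5 and unit innovation variance:

```python
import numpy as np

def toeplitz_cov(gamma, n):
    """n x n Toeplitz matrix with gamma[|i - j|] in position (i, j)."""
    idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return np.asarray(gamma)[idx]

# exact autocovariances of an AR(1), phi = 0.5, unit innovations:
# gamma_k = phi^k / (1 - phi^2)
gamma = 0.5 ** np.arange(3) / 0.75

# order-2 residual variance as a ratio of Toeplitz determinants;
# for a true AR(1) it should equal the innovation variance, 1.0
sigma2_p2 = (np.linalg.det(toeplitz_cov(gamma, 3))
             / np.linalg.det(toeplitz_cov(gamma, 2)))
```
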


Journal Article
TL;DR: The first order autoregressive model is proposed in this paper as a robust model for estimating and testing for means in single subject experiments; it has the advantage of mathematical simplicity and provides good approximations to a number of other models of the type typically encountered in behavioral research.
Abstract: The first order autoregressive model is proposed as a robust model for estimating and testing for means in single subject experiments. It has the advantage of mathematical simplicity, and it provides good approximations to a number of other models of the type typically encountered in behavioral research. Practical considerations on the use of the model are considered including: tests of hypotheses and confidence intervals, sample size requirements, normal approximations, and advantages of the model over the independent error term model. Inferences for means and differences of means are considered.
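For inference on a mean under this model, the practical adjustment is that the variance of the sample mean inflates by roughly (1 + ρ)/(1 - ρ). A hedged sketch of that large-sample correction (an approximation in this spirit, not the paper's exact procedure):

```python
import numpy as np

def mean_se_ar1(x):
    """Approximate standard error of the sample mean under an AR(1)
    error model: Var(xbar) ~ (gamma0 / n) * (1 + rho) / (1 - rho)."""
    n = len(x)
    xc = x - x.mean()
    gamma0 = xc @ xc / n
    rho = (xc[1:] @ xc[:-1]) / (xc @ xc)    # lag-1 autocorrelation
    return np.sqrt(gamma0 / n * (1.0 + rho) / (1.0 - rho))

# positively autocorrelated data: the i.i.d. formula understates the SE
rng = np.random.default_rng(3)
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()

naive = x.std() / np.sqrt(n)    # independent-error formula, too small here
adjusted = mean_se_ar1(x)
```

Ignoring the correction is exactly the pitfall of the independent error term model the abstract contrasts against: confidence intervals for the mean come out too narrow when ρ > 0.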

Journal Article
TL;DR: In this paper, the asymptotic joint distribution of the estimated autoregressive coefficients of a model whose order is not necessarily known is derived, which is an extension of the classical result due to Mann and Wald (1943) in which the order is assumed to be known a priori.
Abstract: The asymptotic joint distribution of the estimated autoregressive coefficients of an autoregressive model, whose order is not necessarily known, is derived. Our result is an extension of the classical result due to Mann and Wald (1943), in which the order is assumed to be known a priori.

01 Mar 1978
TL;DR: In this article, a density function f(·), with f(·) and 1/f(·) both Lebesgue-integrable, is shown to have a representation as an autoregressive spectral density.
Abstract: A density function f(·), with f(·) and 1/f(·) both Lebesgue-integrable, has a representation as an autoregressive spectral density. This representation is used to obtain new density autoregressive estimators of different orders p based on the empirical characteristic function of a sample of size n. The consistency of these new estimators is shown under varying conditions on the smoothness of f(·).
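The representation in question is the AR spectral density f(ω) = σ² / (2π |1 - Σ a_k e^(-iωk)|²). A sketch evaluating it and checking that it integrates to γ₀ for an AR(1); the empirical-characteristic-function estimator itself is not reproduced here.

```python
import numpy as np

def ar_spectral_density(a, sigma2, w):
    """f(w) = sigma2 / (2 pi |1 - sum_k a_k e^{-i w k}|^2)."""
    k = np.arange(1, len(a) + 1)
    transfer = 1.0 - np.exp(-1j * np.outer(w, k)) @ np.asarray(a)
    return sigma2 / (2.0 * np.pi * np.abs(transfer) ** 2)

# AR(1) with phi = 0.5 and unit innovation variance:
# integrating f over [-pi, pi) must give gamma0 = 1/(1 - phi^2) = 4/3
w = np.linspace(-np.pi, np.pi, 20000, endpoint=False)
f = ar_spectral_density([0.5], 1.0, w)
gamma0 = f.sum() * (w[1] - w[0])    # Riemann sum over one full period
```
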

Proceedings Article
M. Kaveh
01 Jan 1978
TL;DR: A method, closely related to Akaike's Information Criterion, is introduced that more nearly matches practical methods of estimating the parameters of an autoregressive (AR) model of a stationary time series.
Abstract: A method, closely related to Akaike's Information Criterion (AIC), is introduced that more nearly matches practical methods of estimating the parameters of an autoregressive (AR) model of a stationary time series. The method is computationally similar to AIC, and in preliminary experiments has shown considerable success in identifying AR model orders.