
Showing papers on "STAR model" published in 1977


Journal ArticleDOI
TL;DR: In this paper, the authors consider the interrelationships between the various representations of the system, and develop joint estimation and model selection procedures for the multiple time series model which arises as a multivariate representation of the individual autoregressive moving average models.
Abstract: Univariate autoregressive moving average models for the endogenous variables of a dynamic simultaneous equations system can be interpreted as a form of solution of that system. This paper considers the interrelationships between the various representations of the system, and develops joint estimation and model selection procedures for the multiple time series model which arises as a multivariate representation of the individual autoregressive moving average models. A test of the restriction of common autoregressive parameters is incorporated. Two empirical examples are presented, the first concerned with a model of the hog cycle and the second with a model of the United States economy previously considered by Zellner and Palm.

161 citations


Journal ArticleDOI
TL;DR: In this paper, an extension of Page's method is presented which tests for changes in the parameter values of autoregressive integrated moving average (ARIMA) models; the distributional properties of the statistics are approximated under the assumption that the series follows an integrated autoregressive moving average model.
Abstract: Procedures are proposed for monitoring forecast errors in order to detect changes in a time-series model. These procedures are based on likelihood ratio statistics which consist of cumulative sums. An extension of Page's method is presented which tests for changes in the parameter values of autoregressive integrated moving average (ARIMA) models. The distributional properties of the statistics are approximated under the assumption that the series follows an integrated autoregressive moving average model. This approximation is based on the limiting Wiener process. An example is also given.

94 citations
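As a rough illustration of the monitoring idea summarized above, the sketch below applies a Page-type two-sided CUSUM to standardized one-step forecast errors. It is a minimal, generic example: the function name, threshold parameters, and simulated data are my own assumptions, and it does not reproduce the paper's exact likelihood-ratio statistics for ARIMA models.

```python
# Minimal sketch of a Page-type CUSUM on one-step forecast errors (assumed
# names and parameters; not the paper's exact likelihood-ratio statistic).
import numpy as np

def cusum_alarm(errors, sigma, k=0.5, h=5.0):
    """Two-sided CUSUM on standardized forecast errors.

    errors : sequence of one-step-ahead forecast errors
    sigma  : assumed error standard deviation under "no change"
    k      : reference value (drift allowance) in sigma units
    h      : decision threshold in sigma units
    Returns the index of the first alarm, or None if no alarm is raised.
    """
    s_pos = s_neg = 0.0
    for t, e in enumerate(errors):
        z = e / sigma
        s_pos = max(0.0, s_pos + z - k)   # accumulates upward shifts
        s_neg = max(0.0, s_neg - z - k)   # accumulates downward shifts
        if s_pos > h or s_neg > h:
            return t
    return None

# Example: the errors drift upward after observation 100.
rng = np.random.default_rng(0)
errs = np.concatenate([rng.normal(0, 1, 100), rng.normal(1.5, 1, 50)])
print(cusum_alarm(errs, sigma=1.0))
```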


Journal ArticleDOI
TL;DR: In this paper, the least squares estimators β̂_j(N), j = 1, ..., p, computed from N data points, of the autoregressive constants for a stationary autoregressive model are considered.
Abstract: The least squares estimators β̂_j(N), j = 1, ..., p, from N data points, of the autoregressive constants for a stationary autoregressive model are considered.

93 citations
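For a concrete picture of the estimators discussed above, here is a minimal sketch of least squares estimation of AR(p) coefficients by regressing each observation on its p predecessors. The function name and the simulated AR(2) example are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: least squares estimation of AR(p) coefficients from N points,
# assuming a zero-mean stationary series (names are my own).
import numpy as np

def ar_least_squares(x, p):
    """Regress x[t] on x[t-1], ..., x[t-p] and return the coefficient vector."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    X = np.column_stack([x[p - j - 1:N - j - 1] for j in range(p)])
    y = x[p:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Example: recover the coefficients of a simulated AR(2) process.
rng = np.random.default_rng(1)
x = np.zeros(2000)
for t in range(2, 2000):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()
print(ar_least_squares(x, 2))   # approximately [0.6, -0.3]
```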


Dissertation
01 Jan 1977

31 citations



Book ChapterDOI
01 Jan 1977
TL;DR: In this paper, three power spectral density estimation problems are considered: estimation of the power spectral density of a single scalar time series (or, equivalently, a line array), estimation of power spectral density matrices for vector-valued time series, and estimation of frequency-wavenumber spectra for random fields.
Abstract: Three power spectral density estimation problems are considered in this presentation. These are the estimation of power spectral density for a single scalar time series (or equivalently a line array), estimation of power spectral density matrices for vector valued time series, and the estimation of wavenumber or frequency-wavenumber spectra for random fields. The paper is primarily a tutorial coverage of autoregressive (also called maximum entropy) and maximum likelihood methods of spectral analysis. In addition to standard material on these subjects a recent generalization of the Burg algorithm from scalar to vector time series is included.

13 citations
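The sketch below illustrates the autoregressive (maximum entropy) spectral estimate discussed in the tutorial: fit AR coefficients, here by Yule-Walker for simplicity, and evaluate the implied model spectrum. The function names and the example signal are my own assumptions; the Burg algorithm and its vector generalization covered in the paper are not reproduced.

```python
# Minimal sketch of an autoregressive (maximum entropy) spectral estimate:
# Yule-Walker fit of AR coefficients, then evaluation of the AR model spectrum.
import numpy as np

def yule_walker(x, p):
    """Return AR(p) coefficients a[1..p] and the innovation variance."""
    x = np.asarray(x, float) - np.mean(x)
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:p + 1])
    sigma2 = r[0] - np.dot(a, r[1:p + 1])
    return a, sigma2

def ar_spectrum(a, sigma2, freqs):
    """AR spectrum S(f) = sigma^2 / |1 - sum_k a_k exp(-2*pi*i*f*k)|^2."""
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1.0 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
    return sigma2 / denom

# Example: a noisy sinusoid at frequency 0.2 cycles per sample.
rng = np.random.default_rng(2)
t = np.arange(1024)
x = np.sin(2 * np.pi * 0.2 * t) + rng.normal(size=t.size)
a, s2 = yule_walker(x, p=8)
f = np.linspace(0.0, 0.5, 256)
psd = ar_spectrum(a, s2, f)
print(f[np.argmax(psd)])   # peak should lie near 0.2
```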


Journal ArticleDOI
TL;DR: An approach based on an information measure which characterizes the autoregressive process is applied to the estimation of the order of the process, and a mixing operator, obtained by adding an independent random variable to a sub-sequence of the original data sequence, is introduced.
Abstract: Although there are several methods for estimating the order of an autoregressive process, it is desirable to estimate it with high sensitivity. In this paper, an approach based on an information measure which characterizes the autoregressive process is taken to the estimation of the order of the process. A mixing operator, which is formed by adding an independent random variable to a sub-sequence of the original data sequence, is introduced. By applying the operator to the conditional entropies, we develop a statistic which estimates the order of the autoregressive process. Results of computer simulation are presented to verify this algorithm and compare it with other methods.

9 citations
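The paper's entropy-based order statistic is not spelled out in the abstract, so the sketch below instead shows one of the standard "other methods" such an estimator is typically compared against: least squares AR fits scored by AIC. All names and details here are illustrative assumptions, not the paper's procedure.

```python
# Minimal sketch of AR order selection by AIC (a standard comparison method,
# not the paper's entropy-based statistic).
import numpy as np

def ar_aic(x, max_order):
    """Fit AR(k) by least squares for k = 1..max_order and return AIC values."""
    x = np.asarray(x, float) - np.mean(x)
    N = len(x)
    aics = {}
    for k in range(1, max_order + 1):
        X = np.column_stack([x[k - j - 1:N - j - 1] for j in range(k)])
        y = x[k:]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        sigma2 = np.mean(resid ** 2)
        aics[k] = len(y) * np.log(sigma2) + 2 * k   # smaller is better
    return aics

# Example: for a simulated AR(2) series the minimizing order is typically 2.
rng = np.random.default_rng(7)
x = np.zeros(1000)
for t in range(2, 1000):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()
scores = ar_aic(x, 6)
print(min(scores, key=scores.get))
```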


Journal ArticleDOI
TL;DR: In this paper, a stationary Gaussian process X(t) is considered which is expressed as an autoregressive process of infinite order, and an estimate for the spectral density is obtained.
Abstract: A stationary Gaussian process X(t) is considered which is expressed as an autoregressive process of infinite order. An autoregressive model of finite order K is fitted to this process and an estimate of the spectral density is obtained. The consistency and asymptotic normality of this estimate are shown under some conditions. Under conditions stronger than Berk's, the estimate is also shown to have an asymptotic efficiency property in a certain sense.

9 citations


01 Feb 1977
TL;DR: Stochastic models for vector processes, in particular the class of multivariate autoregressive moving average models, are discussed and it is pointed out that moment estimators can be inefficient when moving average parameters are present and an approximate maximum likelihood estimation procedure is suggested.
Abstract: In this paper we discuss stochastic models for vector processes, in particular the class of multivariate autoregressive moving average models. Special cases of this class have been discussed in the literature on multisite streamflow generation and it is shown how these can be brought into a general framework. An iterative model building procedure, consisting of model specification -- estimation -- diagnostic checking is stressed. Results on model specification are given and it is shown how partial autocovariance matrices can be used to check whether multivariate autoregressive models provide adequate representation for (standardized) streamflow sequences. Furthermore, estimation of parameters in multivariate autoregressive moving average models is discussed and it is pointed out that moment estimators can be inefficient when moving average parameters are present. An approximate maximum likelihood estimation procedure is suggested. In the concluding section, we summarize important practical implications for hydrologists.

4 citations
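As a small concrete counterpart to the multivariate models discussed above, the sketch below fits a first-order vector autoregression by multivariate least squares to centered series. It is a simplified illustration with my own naming; the paper's full specification, estimation, and diagnostic-checking procedure for models with moving average terms is considerably more involved.

```python
# Minimal sketch of fitting a first-order vector autoregression (VAR(1))
# by multivariate least squares; an illustration, not the paper's procedure.
import numpy as np

def var1_fit(X):
    """Given an (N, m) array of observations, estimate A in x_t = A x_{t-1} + e_t."""
    X = np.asarray(X, float)
    X = X - X.mean(axis=0)            # work with centered (standardized) series
    Y, Z = X[1:], X[:-1]
    A = np.linalg.lstsq(Z, Y, rcond=None)[0].T
    return A

# Example: two cross-correlated series generated from a known matrix.
rng = np.random.default_rng(3)
A_true = np.array([[0.5, 0.2], [0.1, 0.4]])
x = np.zeros((500, 2))
for t in range(1, 500):
    x[t] = A_true @ x[t - 1] + rng.normal(size=2)
print(var1_fit(x).round(2))   # close to A_true
```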


Book ChapterDOI
01 Jan 1977
TL;DR: In this article, the change detection problem in an observed time series is treated without the restrictive independence assumptions commonly imposed, because the authors estimate a change in the parameters of a stochastic difference equation, namely an autoregression model.
Abstract: This paper deals with the change detection problem in an observed time series. We come across such problems in the recognition of a signal in the noise and technical or medical diagnosis. There are many techniques for detecting the change point, most of which assume [1], [2] the given observation to be statistically independent random variables. Another widely used assumption is that the process observed before a change point should be independent of the process after the change. For example, if these processes are normal with known autocovariance functions, the problem has been solved in [3]. In this paper we shall do without the restrictive assumptions mentioned above because we deal with the estimation of a change in parameters of a stochastic difference equation, namely, an autoregression model.

3 citations
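To make the setting concrete, the sketch below contrasts least squares AR(1) estimates before and after every candidate split point and picks the point of largest contrast. This is only a hedged illustration of detecting a parameter change in an autoregression; it is not the estimator developed in the paper, and the names and simulated example are assumptions.

```python
# Minimal sketch: locate a change in the AR(1) coefficient by comparing
# least squares estimates on either side of each candidate split point.
import numpy as np

def ar1_coef(x):
    """Least squares AR(1) coefficient of a zero-mean series."""
    return np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

def split_scores(x, min_len=30):
    """Return |phi_before - phi_after| for every admissible split point."""
    return {t: abs(ar1_coef(x[:t]) - ar1_coef(x[t:]))
            for t in range(min_len, len(x) - min_len)}

# Example: the AR(1) coefficient changes from 0.8 to -0.5 at t = 200.
rng = np.random.default_rng(4)
x = np.zeros(400)
for t in range(1, 400):
    phi = 0.8 if t < 200 else -0.5
    x[t] = phi * x[t - 1] + rng.normal()
scores = split_scores(x)
print(max(scores, key=scores.get))   # split point with the largest contrast
```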


Proceedings ArticleDOI
01 May 1977
TL;DR: The set of PARCOR parameters of an autoregressive process is found sequentially and iteratively, and it is proven that the equations are decoupled to first order for small changes in the parameter values.
Abstract: The set of PARCOR parameters of an autoregressive process is found sequentially and iteratively. It is proven that the equations are decoupled to first order for small changes in the parameter values. Sequential calculation of the parameters is therefore possible. Using the sign decorrelator, the number of calculations is drastically reduced with little degradation in performance. Simulation results are given for a specific example. The simulation shows that the system converges and that the prediction residual is small and comparable to the excitation of the autoregressive process.
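For reference, PARCOR (partial correlation, or reflection) coefficients can be computed off-line by the Levinson-Durbin recursion from sample autocovariances, as sketched below. This is background illustration only; the paper's sequential, sign-decorrelator-based algorithm is not reproduced, and the function name and example are my own assumptions.

```python
# Minimal sketch: PARCOR (reflection) coefficients via the Levinson-Durbin
# recursion on sample autocovariances (not the paper's sequential algorithm).
import numpy as np

def parcor(x, p):
    """Return the first p PARCOR coefficients of the series x."""
    x = np.asarray(x, float) - np.mean(x)
    # biased sample autocovariances r[0..p]
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(p + 1)])
    a = np.zeros(p)      # AR coefficients of the current order
    e = r[0]             # current prediction error variance
    ks = []
    for m in range(1, p + 1):
        prev = a[:m - 1].copy()
        k = (r[m] - np.dot(prev, r[m - 1:0:-1])) / e
        ks.append(k)
        a[:m - 1] = prev - k * prev[::-1]
        a[m - 1] = k
        e *= (1.0 - k * k)
    return np.array(ks)

# Example: for an AR(1) series the first PARCOR is near the AR coefficient
# and the higher-order PARCORs are near zero.
rng = np.random.default_rng(5)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.7 * x[t - 1] + rng.normal()
print(parcor(x, 4).round(2))
```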

Journal ArticleDOI
TL;DR: In this paper, it is shown that for a Gaussian process to be stationary, it is necessary and sufficient that the infinite order autocovariance matrix should be positive definite.
Abstract: It is well known that, for a Gaussian process to be stationary, it is necessary and sufficient that the infinite order autocovariance matrix be positive definite. This fact can be used to obtain the stationarity conditions for a general autoregressive process and, hence, the stationarity and invertibility conditions for any mixed autoregressive moving average process. An interesting connection with a recently reported recursive approach is also noted.
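The standard way to state the stationarity condition for an AR(p) model is in terms of the roots of its characteristic polynomial; the sketch below checks that condition numerically. Note this is the conventional root-based check, not the positive-definiteness argument of the paper, and the function name is an assumption.

```python
# Minimal sketch of the standard root-based stationarity check for an AR(p)
# model x_t = phi_1 x_{t-1} + ... + phi_p x_{t-p} + e_t.
import numpy as np

def is_stationary(phi):
    """True if all roots of 1 - phi_1 z - ... - phi_p z^p lie outside the unit circle."""
    poly = np.concatenate(([1.0], -np.asarray(phi, float)))  # ascending powers of z
    roots = np.roots(poly[::-1])                             # np.roots wants descending
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary([0.5, 0.3]))   # True
print(is_stationary([1.2]))        # False
```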

Book ChapterDOI
01 Jan 1977
TL;DR: In this paper, it is proved that the best linear extrapolation of X_{m-k} based on X_{m+1}, ..., X_N does not depend on X_{m+n+1}, ..., X_N, and some formulas for the covariance function of the non-stationary autoregressive series are given.
Abstract: Let X_1, ..., X_N be a non-stationary p-dimensional autoregressive series of order n, where n < N. Suppose that m and k are integers such that 1 ≤ m ≤ N - n and 0 ≤ k ≤ m - 1. It is proved that the best linear extrapolation of X_{m-k} based on X_{m+1}, ..., X_N does not depend on X_{m+n+1}, ..., X_N. Some formulas for the covariance function of the non-stationary autoregressive series are given. Several special stationary cases are discussed as consequences of the presented general theorem.

Journal ArticleDOI
TL;DR: A method of on-line recursive system identification for multidimensional systems is developed and applied to estimating the order and the coefficients of a distributed-lag water quality model for a river.
Abstract: Various methods for statistical system identification have been proposed. Among them, Akaike's method based on an autoregressive (AR) model is especially useful and practical because of its clear-cut theory and simple computational algorithm. Based on Akaike's one-shot identification method, we develop a method of on-line recursive system identification for multidimensional systems. Since this algorithm includes a mean deletion procedure, stationary time series can be handled directly. Using this method, a method of identifying the order and the coefficients of a one-dimensional distributed-lag model is also shown, in which it is not necessary to estimate other parameters. This method is applied to the estimation of the order and the coefficients of a distributed-lag water quality model for a river.
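As a generic illustration of on-line identification with mean deletion, the sketch below updates AR(p) coefficient estimates by recursive least squares with a running mean. It is a hedged, simplified example; it is not the multidimensional algorithm developed in the paper, and all names and parameters are my own assumptions.

```python
# Minimal sketch of on-line (recursive least squares) estimation of AR(p)
# coefficients with running mean deletion; not the paper's exact algorithm.
import numpy as np

def rls_ar(x, p, lam=1.0):
    """Recursively update AR(p) coefficient estimates as each sample arrives."""
    x = np.asarray(x, float)
    theta = np.zeros(p)          # current coefficient estimates
    P = 1e3 * np.eye(p)          # inverse-information matrix
    mean, n = 0.0, 0
    for t, xt in enumerate(x):
        n += 1
        mean += (xt - mean) / n              # running mean (mean deletion)
        if t < p:
            continue
        phi = x[t - p:t][::-1] - mean        # p most recent centered values
        y = xt - mean
        K = P @ phi / (lam + phi @ P @ phi)  # gain vector
        theta = theta + K * (y - phi @ theta)
        P = (P - np.outer(K, phi @ P)) / lam
    return theta

# Example: estimates converge toward the true AR(2) coefficients [0.6, -0.3].
rng = np.random.default_rng(6)
x = np.zeros(3000)
for t in range(2, 3000):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()
print(rls_ar(x, 2).round(2))
```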