
Showing papers on "STAR model" published in 1974



Journal ArticleDOI
TL;DR: In this article, the authors proposed a method to estimate the q autoregressive parameters, the residual variance of the autoregressive scheme, and the variance of the white noise in a time series observed as signal plus noise; the method is based on analogies with regression theory and, in the case of a normal series, yields strongly consistent, efficient estimators.
Abstract: If $x(\bullet)$ is a time series which may be written as $x(t) = s(t) + n(t)$ where $t$ is an integer, $s(\bullet)$ an autoregressive signal of order $q$ and $n(\bullet)$ white noise, then the model has $q + 2$ parameters. These are (i) the $q$ autoregressive parameters (ii) the residual variance of the autoregressive scheme and (iii) the variance of the white noise. A method is proposed to estimate the $q + 2$ parameters. This method is based on analogies with regression theory and in the case of a normal series yields strongly consistent efficient estimators.
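As a rough illustration of this signal-plus-noise model (a minimal sketch, not the authors' regression-based procedure), the following Python snippet simulates an AR(1) signal observed in white noise and recovers the q + 2 = 3 parameters from simple moment relations among the sample autocovariances; all numerical values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper): AR(1) signal plus white noise.
phi, sigma_e, sigma_n, T = 0.8, 1.0, 0.5, 200_000

# Simulate s(t) = phi * s(t-1) + e(t) and the observed series x(t) = s(t) + n(t).
e = rng.normal(0.0, sigma_e, T)
s = np.zeros(T)
for t in range(1, T):
    s[t] = phi * s[t - 1] + e[t]
x = s + rng.normal(0.0, sigma_n, T)

def acov(z, k):
    """Sample autocovariance of z at lag k."""
    z = z - z.mean()
    return np.dot(z[:len(z) - k], z[k:]) / len(z)

# For an AR(1) signal observed in white noise,
#   gamma_x(k) = gamma_s(k) for k >= 1 and gamma_x(0) = gamma_s(0) + var(n),
# so moment (Yule-Walker-type) estimators follow directly.
g0, g1, g2 = acov(x, 0), acov(x, 1), acov(x, 2)
phi_hat = g2 / g1                              # autoregressive parameter
gs0_hat = g1 / phi_hat                         # signal variance gamma_s(0)
var_n_hat = g0 - gs0_hat                       # variance of the white noise
var_e_hat = gs0_hat * (1 - phi_hat ** 2)       # residual variance of the AR scheme

print(phi_hat, var_e_hat, var_n_hat)
```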

129 citations


Journal ArticleDOI
TL;DR: In this paper, the determinant and inverse of the covariance matrix of a set of n consecutive observations on a mixed autoregressive moving average process are obtained, with explicit formulae for the inverse given for the general autoregressive process of order p (n ≥ p) and for the first-order mixed autoregressive moving average process.
Abstract: Expressions are obtained for the determinant and inverse of the covariance matrix of a set of n consecutive observations on a mixed autoregressive moving average process. Explicit formulae for the inverse of this matrix are given for the general autoregressive process of order p (n ≥ p), and for the first-order mixed autoregressive moving average process.
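For the first-order autoregressive special case, both the determinant and the (tridiagonal) inverse are classical closed forms, and the sketch below simply checks them numerically against brute-force linear algebra; phi, the innovation variance, and n are arbitrary illustrative values, and the general ARMA expressions of the paper are not reproduced here.

```python
import numpy as np

# AR(1) check: the covariance matrix of n consecutive observations is
#   Gamma[i, j] = sigma2 * phi**|i - j| / (1 - phi**2)
# (sigma2 = innovation variance), its determinant is sigma2**n / (1 - phi**2),
# and its inverse is tridiagonal.
phi, sigma2, n = 0.6, 2.0, 6

idx = np.arange(n)
Gamma = sigma2 * phi ** np.abs(idx[:, None] - idx[None, :]) / (1 - phi ** 2)

# Closed-form determinant and tridiagonal inverse for the AR(1) case.
det_closed = sigma2 ** n / (1 - phi ** 2)
inv_closed = np.zeros((n, n))
np.fill_diagonal(inv_closed, 1 + phi ** 2)
inv_closed[0, 0] = inv_closed[-1, -1] = 1.0
inv_closed += np.diag(-phi * np.ones(n - 1), 1) + np.diag(-phi * np.ones(n - 1), -1)
inv_closed /= sigma2

print(np.isclose(np.linalg.det(Gamma), det_closed))   # True
print(np.allclose(np.linalg.inv(Gamma), inv_closed))  # True
```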

96 citations


Proceedings ArticleDOI
01 Nov 1974
TL;DR: Akaike's information criterion (AIC) is used in this article to identify the order of the autoregression, which makes the procedure objective for model identification when compared with more subjective procedures such as the examination of partial F-statistics.
Abstract: In recent years there has been increasing interest in autoregressive spectrum estimation. This procedure fits a finite autoregression to the time series data, and calculates the spectrum from the estimated autoregression coefficients and the one step prediction error variance. For multivariate time series, the estimated autoregressive matrices and one step prediction covariance matrix produce estimates of the spectra, coherences, phases, and group delays. The use of Akaike's information criterion (AIC) for identification of the order of the autoregression to be used makes the procedure objective. Experience gained from analyzing large amounts of data from the biological and physical sciences has indicated that AIC works very well for model identification when compared to more subjective procedures such as the examination of partial F-statistics. This experience has also indicated that using both autoregressive spectrum estimation and classical spectrum estimation and superimposing the plots gives a much stronger feeling for the shape of the true spectrum being estimated. The results of some of these analyses are presented.
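A minimal univariate sketch of this procedure is given below: Yule-Walker fits over a range of candidate orders, AIC used to select the order, and the spectrum computed from the estimated coefficients and one step prediction error variance. The AR(2) data-generating parameters and the AIC form n·log(σ̂²) + 2p are standard choices assumed for illustration, not taken from the paper.

```python
import numpy as np

def yule_walker(x, p):
    """Yule-Walker AR(p) fit: returns coefficients and prediction error variance."""
    x = x - x.mean()
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:])
    return a, r[0] - a @ r[1:]

def ar_spectrum(a, sigma2, freqs):
    """AR spectral density S(f) = sigma2 / |1 - sum_k a_k exp(-2*pi*i*f*k)|^2."""
    k = np.arange(1, len(a) + 1)
    return sigma2 / np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2

# Simulated AR(2) series (coefficients chosen for illustration only).
rng = np.random.default_rng(1)
T = 2000
x = np.zeros(T)
for t in range(2, T):
    x[t] = 1.2 * x[t - 1] - 0.6 * x[t - 2] + rng.normal()

# Order identification by AIC: n * log(sigma2_hat) + 2 * p.
orders = range(1, 16)
aic = [T * np.log(yule_walker(x, p)[1]) + 2 * p for p in orders]
p_best = list(orders)[int(np.argmin(aic))]

a, s2 = yule_walker(x, p_best)
freqs = np.linspace(0.01, 0.5, 200)
spectrum = ar_spectrum(a, s2, freqs)
print(p_best, spectrum.max())
```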

94 citations




Journal ArticleDOI
TL;DR: In this paper, large samples are synthesized through autoregressive, fast fractional noise, and broken line models, based on a practically significant pentad time unit, for four stations in the British Wye catchment.
Abstract: Large samples are synthesized through autoregressive, fast fractional noise, and broken line models, based on a practically significant pentad time unit, for four stations in the British Wye catchment. Hydrologic characteristics such as the numbers, durations, and magnitudes of droughts and floods are assessed by means of the crossing properties at critical levels in the time series. If a Gaussian distribution is applied to the stochastic component, crossing properties other than surplus run lengths are unrealistic. Compatibility with historical properties is achieved by the incorporation of approximately twice the coefficient of skewness estimated from the original data through a transformation to a gamma function. Also there is agreement in serial correlograms and coefficients of skewness. In this way neither negative numbers nor outliers are generated through autoregressive models, but run lengths at extreme levels in data from all models need adjustment, and a closer fitting distribution function is desirable.
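The crossing-property assessment can be illustrated with a small sketch: generate a synthetic first-order autoregressive series and count the runs below a critical level, together with their durations and cumulative deficits (a stand-in for drought numbers, durations, and magnitudes). The model, level, and parameters are invented for illustration and do not reproduce the fast fractional noise or broken line models, or the gamma transformation, used in the paper.

```python
import numpy as np

# Illustrative crossing analysis on a synthetic AR(1) series.
rng = np.random.default_rng(2)
T, phi = 5000, 0.7
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal()

level = np.quantile(x, 0.2)            # critical level, here the 20% quantile
below = x < level

# Split the indicator series into consecutive runs below the critical level.
changes = np.flatnonzero(np.diff(below.astype(int)) != 0) + 1
segments = np.split(np.arange(T), changes)
runs = [seg for seg in segments if below[seg[0]]]

durations = np.array([len(seg) for seg in runs])               # run lengths
deficits = np.array([(level - x[seg]).sum() for seg in runs])  # run magnitudes

print(len(runs), durations.mean(), deficits.mean())
```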

16 citations


Journal ArticleDOI
TL;DR: In this article, several estimators of the constant mean about which a stationary first-order autoregressive process varies are studied and compared by means of Monte Carlo techniques.
Abstract: Several estimators of the constant mean about which a stationary first-order autoregressive process varies are studied by utilizing Monte Carlo techniques.
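A Monte Carlo comparison of this kind can be sketched as follows: the sample mean is compared with the generalized least squares (best linear unbiased) estimator of the mean of a stationary AR(1) process with known autoregressive parameter. The specific estimators and parameter values here are assumptions for illustration, not the set studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, phi, n, reps = 5.0, 0.8, 100, 2000

# Covariance matrix of n observations of a zero-mean AR(1) with unit innovations,
# and the corresponding GLS weights Sigma^{-1} 1 / (1' Sigma^{-1} 1).
idx = np.arange(n)
Sigma = phi ** np.abs(idx[:, None] - idx[None, :]) / (1 - phi ** 2)
w = np.linalg.solve(Sigma, np.ones(n))
w /= w.sum()

err_mean, err_gls = [], []
for _ in range(reps):
    x = np.zeros(n)
    x[0] = rng.normal(0.0, 1.0 / np.sqrt(1 - phi ** 2))   # stationary start
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    x += mu
    err_mean.append(x.mean() - mu)
    err_gls.append(w @ x - mu)

# Mean squared errors of the two estimators of mu.
print(np.mean(np.square(err_mean)), np.mean(np.square(err_gls)))
```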

4 citations