
Showing papers on "STAR model" published in 1979


Journal ArticleDOI
TL;DR: In this article, a Bayesian extension of the minimum AIC procedure is proposed for the fitting of autoregressive models and the practical utility of the procedure is demonstrated by numerical examples.
Abstract: The proposal of simultaneous use of modified AIC statistics by Bhansali & Downham for the fitting of autoregressive models is reviewed, and a Bayesian extension of the minimum AIC procedure is proposed. The practical utility of the procedure is demonstrated by numerical examples.

579 citations
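
As a concrete illustration of the minimum AIC step that the Bayesian extension builds on, here is a minimal numpy sketch; the function names, the toy AR(2) data, and the exp(-AIC/2) model weights are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def fit_ar_ls(x, p):
    """Least-squares fit of an AR(p) model; returns the residual variance."""
    if p == 0:
        return np.var(x - x.mean())
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.mean((y - X @ coef) ** 2)

def ar_aic(x, max_p=10):
    """AIC(p) = n*log(sigma_p^2) + 2p for p = 0..max_p."""
    n = len(x)
    return np.array([n * np.log(fit_ar_ls(x, p)) + 2 * p for p in range(max_p + 1)])

# toy data from an AR(2) model
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(2, 500):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

aic = ar_aic(x, max_p=8)
print("min-AIC order:", int(np.argmin(aic)))
# quasi-Bayesian weights over orders, proportional to exp(-AIC/2)
w = np.exp(-0.5 * (aic - aic.min()))
print("order weights:", np.round(w / w.sum(), 3))
```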


Journal ArticleDOI
TL;DR: A class of generalized M-estimates is proposed which has attractive mean-squared-error robustness properties towards both IO and AO type deviations from the Gaussian model.
Abstract: Outliers in time series can adversely affect both the least squares estimates and ordinary M-estimates of autoregressive parameters. Attention is focused here on obtaining robust estimates of the parameter of a first-order autoregressive time series x_k. The observations are y_k = x_k + v_k, and two models are considered: Model IO, with v_k ≡ 0 and x_k possibly non-Gaussian, and Model AO, with v_k nonzero, and possibly quite large, a small fraction of the time, and x_k Gaussian. A class of generalized M-estimates is proposed which has attractive mean-squared-error robustness properties towards both IO and AO type deviations from the Gaussian model.

249 citations
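
A rough sketch of a Mallows-type GM (generalized M-) estimate of the AR(1) coefficient under additive outliers; the Huber weights, tuning constant, and iteration scheme are illustrative assumptions rather than the specific estimator class proposed in the paper.

```python
import numpy as np

def huber_weight(u, c=1.345):
    """Huber weight function: 1 inside [-c, c], c/|u| outside."""
    au = np.abs(u)
    return np.where(au <= c, 1.0, c / au)

def gm_estimate_ar1(y, n_iter=20):
    """Mallows-type GM estimate of the AR(1) coefficient (illustrative only)."""
    x, z = y[:-1], y[1:]
    phi = np.corrcoef(x, z)[0, 1]              # LS-like starting value
    for _ in range(n_iter):
        resid = z - phi * x
        s = np.median(np.abs(resid)) / 0.6745  # robust residual scale
        sx = np.median(np.abs(x)) / 0.6745     # robust scale of the lagged regressor
        w = huber_weight(resid / s) * huber_weight(x / sx)
        phi = np.sum(w * x * z) / np.sum(w * x * x)
    return phi

# AR(1) with occasional additive outliers (the AO situation)
rng = np.random.default_rng(1)
x = np.zeros(1000)
for t in range(1, 1000):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()
y = x + np.where(rng.random(1000) < 0.05, 10.0 * rng.standard_normal(1000), 0.0)
print(gm_estimate_ar1(y))
```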


Journal ArticleDOI
TL;DR: In this paper, the authors develop procedures to estimate parameters in multivariate autoregressive moving average (ARMA) models under Gaussian errors; exact maximum likelihood is used for pure moving average models and approximate procedures for stationary mixed ARMA models, and properties of the estimates and an example are given.
Abstract: Procedures to estimate parameters in multivariate autoregressive moving average (ARMA) models are developed. Gaussian errors are assumed. Exact maximum likelihood estimation procedures are developed for pure moving average models. Approximate procedures are obtained to estimate stationary mixed ARMA models. Properties of the estimates and an example are given.

207 citations


Journal ArticleDOI
Steven Kay
TL;DR: In this paper, it was shown that the effect of white noise on the autoregressive spectral estimate is to produce a smoothed spectrum, which is a result of the introduction of spectral zeros due to the noise.
Abstract: The autoregressive power spectral density estimator possesses excellent resolution properties. However, it has been shown that for the case of a sinusoidal autoregressive process the addition of noise to the time series results in a decrease in spectral resolution. It is proven that, in general, the effect of white noise on the autoregressive spectral estimate is to produce a smoothed spectrum. This smoothing is a result of the introduction of spectral zeros due to the noise. Finally, the use of a large-order autoregressive model to combat the effects of noise is discussed.

171 citations
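
The smoothing effect can be seen numerically with a plain Yule-Walker AR spectral estimate; in the sketch below the model order, frequency grid, and noise levels are arbitrary illustrative choices, comparing a near-noiseless sinusoid with the same sinusoid in strong white noise.

```python
import numpy as np

def ar_spectrum(x, p, freqs):
    """Yule-Walker AR(p) fit and the implied power spectral density on `freqs` (cycles/sample)."""
    x = x - x.mean()
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:])                  # AR coefficients
    s2 = r[0] - a @ r[1:]                          # innovation variance
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, p + 1)))
    return s2 / np.abs(1 - z @ a) ** 2

rng = np.random.default_rng(2)
t = np.arange(2048)
clean = np.cos(2 * np.pi * 0.2 * t) + 0.05 * rng.standard_normal(t.size)
noisy = clean + 1.0 * rng.standard_normal(t.size)  # added white noise
f = np.linspace(0, 0.5, 512)
# the spectral peak near f = 0.2 comes out broader (smoother) for the noisy series
print(ar_spectrum(clean, 8, f).max(), ar_spectrum(noisy, 8, f).max())
```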


Journal ArticleDOI
TL;DR: In this paper, the asymptotic prediction mean squared error is derived for the regression model with autoregressive errors; the results cover the cases where future values of the exogenous variables are known, where some exogenous variables are predicted from autoregressive processes, and where lagged endogenous variables are included in the model.
Abstract: The asymptotic prediction mean squared error is derived for the regression model with autoregressive errors. The results cover the cases where future values of the exogenous variables are known, where some exogenous variables are predicted from autoregressive processes, and where lagged endogenous variables are included in the model. A result is also obtained for the predictor that ignores autocorrelation of the errors and is based on ordinary least squares (OLS) estimates of the regression parameters.

53 citations


Journal ArticleDOI
TL;DR: This paper derives the asymptotic mean squared error of multistep prediction for the general vector autoregressive process; for one-step-ahead prediction the result is a surprisingly simple generalization of the result for the scalar autoregressive process.
Abstract: This paper derives the asymptotic mean squared error of multistep prediction for the general vector autoregressive process. For one-step-ahead prediction the result is a surprisingly simple generalization of the result for the scalar autoregressive process. Results for multistep prediction are also derived for the regression model with autoregressive errors, where the set of exogenous variables follows a vector autoregressive process.

51 citations


Book
01 Jun 1979
TL;DR: A history of the idea of unobserved components in the analysis of economic time series opens this book, in which the authors present an approach to the formulation and analysis of unobserved-components models.
Abstract: A History of the Idea of Unobserved Components in the Analysis of Economic Time Series. Introduction to the Theory of Stationary Time Series. The Spectral Representation and Its Estimation. Formulation and Analysis of Unobserved-Components Models. Elements of the Theory of Prediction and Extraction. Formulation of Unobserved-Components Models and Canonical Forms. Estimation of Unobserved-Components and Canonical Models. Appraisal of Seasonal Adjustment Techniques. On the Comparative Structure of Serial Dependence in Some U.S. Price Series. Formulation and Estimation of Mixed Moving-Average Autoregressive Models for Single Time Series: Examples. Formulation and Estimation of Multivariate Mixed Moving-Average Autoregressive Time-Series Models. Formulation and Estimation of Unobserved-Components Models: Examples. Application to the Formulation of Distributed-Lag Models. A Time-Series Model of the U.S. Cattle Industry. Appendices: The Work of Buys Ballot. Some Requisite Theory of Functions of a Complex Variable. Fourier Series and Analysis. Whittle's Theorem. Inversion of Tridiagonal Matrices and a Method for Inverting Toeplitz Matrices. Spectral Densities, Actual and Theoretical, Eight Series. Derivation of a Distributed-Lag Relation between Sales and Production: A Simple Example. References. Author Index. Subject Index.

44 citations



Journal ArticleDOI
TL;DR: A number of segmentation techniques are developed to parameterize and to quantify the spontaneous EEG, and the hypothesis that clusters, containing similar intervals, represent the states that can be recognized within an EEG is proposed.
Abstract: As part of a study of the inter- and intraindividual variability of the EEG, a number of segmentation techniques are developed to parameterize and to quantify the spontaneous EEG (4,6). These techniques are applied to split the EEG into more or less stationary intervals. After parameterization, these intervals can be classified. The hypothesis is proposed that clusters containing similar intervals represent the states that can be recognized within an EEG.

19 citations


Journal ArticleDOI
TL;DR: It is shown that the Kullback information is equivalent to a spectral matching error measure and to the likelihood ratio of residuals of the autoregressive model, and that it is practically useful for segmenting non-stationary time series.
Abstract: This paper discusses statistics for measuring the difference between two spectral densities, in order to detect changes of amplitude and frequency in non-stationary time series. First, the Kullback information is developed as a measure for segmenting non-stationary time series. It is shown that the Kullback information is equivalent to a spectral matching error measure and to the likelihood ratio of residuals of the autoregressive model, and that it is practically useful for segmentation. Next, other measures such as the Kullback divergence and the Bhattacharyya distance are investigated for detecting changes of amplitude and frequency in non-stationary time series.

18 citations
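
A rough numpy sketch of the residual-likelihood-ratio reading of such a statistic: fit an AR model to a reference segment and measure how much worse it whitens a test segment than that segment's own model. The model order, segment lengths, and the AR(1) test signals are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def ar_fit(x, p):
    """Least-squares AR(p) fit: coefficients and residual variance."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a, np.mean((y - X @ a) ** 2)

def ar_resid_var(x, a):
    """Residual variance of series x filtered by AR coefficients a."""
    p = len(a)
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    return np.mean((x[p:] - X @ a) ** 2)

def segment_distance(ref, test, p=6):
    """Log-likelihood-ratio style distance: how badly the reference AR model whitens the test segment."""
    a_ref, _ = ar_fit(ref, p)
    _, s_test = ar_fit(test, p)
    return np.log(ar_resid_var(test, a_ref) / s_test)

rng = np.random.default_rng(3)
def ar1(phi, n):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

print(segment_distance(ar1(0.5, 512), ar1(0.5, 512)))   # small: same dynamics
print(segment_distance(ar1(0.5, 512), ar1(-0.8, 512)))  # large: spectral change
```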


Journal ArticleDOI
TL;DR: In this article, a simple method for approximating the variance of meteorological time averages is presented, where the characteristic time between independent estimates and the ratio of variance of time-averaged data to that of unaveraged observations for a first-order autoregressive process are shown.
Abstract: A simple method for approximating the variance of meteorological time averages is presented. Graphs of the characteristic time between independent estimates and the ratio of the variance of time-averaged data to that of unaveraged data for a first-order autoregressive process are shown.
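
For a first-order autoregressive process these quantities have simple closed forms. The sketch below computes the exact variance ratio for the sample mean and the usual characteristic-time approximation T0 = (1 + phi)/(1 - phi); the parameter values are arbitrary, and this is the standard AR(1) expression rather than a reproduction of the paper's graphs.

```python
import numpy as np

def var_ratio_ar1(phi, n):
    """Var(mean of n consecutive AR(1) samples) / Var(single sample), exact."""
    k = np.arange(1, n)
    return (1.0 + 2.0 * np.sum((1.0 - k / n) * phi ** k)) / n

def char_time_ar1(phi):
    """Characteristic time between effectively independent samples: T0 = (1 + phi) / (1 - phi)."""
    return (1.0 + phi) / (1.0 - phi)

phi, n = 0.6, 120
print(var_ratio_ar1(phi, n))   # exact variance ratio
print(char_time_ar1(phi) / n)  # large-n approximation: T0 / n
```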

01 Mar 1979
TL;DR: In this article, an autoregressive spectral method for time domain model identification of non-stationary time series is proposed; the fitted models can be interpreted in terms of trend and seasonal components and provide forecasts and spectral estimators.
Abstract: A strategy for building models for an observed time series is presented in this paper. We seek to fit time domain models which can be interpreted in terms of trend and seasonal components, provide forecasts, and provide spectral estimators. Our time series modeling strategy attempts to achieve distribution function. The approach described could be called the autoregressive spectral method for time domain model identification of non-stationary time series (abbreviated AR-SPECTRAL-TIME-ID).

28 Feb 1979
TL;DR: In this paper, the stochastic expansion of the maximum likelihood estimator (MLE) of θ is obtained, and the MLE is shown to be second-order asymptotically efficient.
Abstract: Let {X_t} be defined by X_t = θX_{t-1} + U_t (t = 1, 2, ...), where {U_t} is a sequence of independent identically distributed random variables with mean 0 and variance 1, and X_0 is a random variable with mean 0 and variance σ² which is independent of U_t for each t. We assume that |θ| < 1. The stochastic expansion of the maximum likelihood estimator (MLE) of θ is obtained, and it is shown that the MLE is second-order asymptotically efficient, and that the least squares estimator is second-order asymptotically efficient when U_t (t = 1, 2, ...) and X_0 are normally distributed and σ² = 1/(1 − θ²) in the above process. It is noted that the initial condition on X_0 does not affect the second-order asymptotic efficiency of the MLE.
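
A minimal numerical sketch of this setting, assuming the Gaussian case with unit innovation variance and stationary start: it maximizes the exact AR(1) log-likelihood over a grid and compares the MLE with the least squares estimate. The grid resolution, sample size, and the true θ = 0.6 are arbitrary illustrative choices.

```python
import numpy as np

def ar1_exact_loglik(theta, x):
    """Exact Gaussian log-likelihood of AR(1) with unit innovation variance and
    stationary start X0 ~ N(0, 1/(1-theta^2)); additive constants not depending
    on theta are omitted."""
    resid = x[1:] - theta * x[:-1]
    return (0.5 * np.log(1.0 - theta ** 2)
            - 0.5 * (1.0 - theta ** 2) * x[0] ** 2
            - 0.5 * np.sum(resid ** 2))

rng = np.random.default_rng(4)
theta_true, n = 0.6, 200
x = np.zeros(n + 1)
x[0] = rng.standard_normal() / np.sqrt(1.0 - theta_true ** 2)
for t in range(1, n + 1):
    x[t] = theta_true * x[t - 1] + rng.standard_normal()

grid = np.linspace(-0.99, 0.99, 1999)
mle = grid[np.argmax([ar1_exact_loglik(g, x) for g in grid])]
ls = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)   # least squares estimate
print(mle, ls)
```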


Journal ArticleDOI
01 Jan 1979
TL;DR: The asymptotic distribution of λ* is determined, and the classification of a time series into one of two classes described by ARMA models is considered.
Abstract: We consider autoregressive models with moving average residuals (ARMA) and, as special cases, the autoregressive (AR) and the moving average (MA) models. After a review of properties of estimators of the parameters in these models, we discuss an approximate likelihood ratio λ* which was investigated by T.W. Anderson. We determine the asymptotic distribution of λ*, and with these results we obtain tests for the orders of the autoregressive and the moving average parts, respectively. In the last section we consider the classification of a time series into one of two classes described by ARMA models.

Journal ArticleDOI
TL;DR: The authors compare a simple Kaldor-type nonlinear model and comparable linear autoregressive schemes as models of sharp movements often observed in macroeconomic time series that exhibit persistent fluctuations.

Book
01 Jan 1979
TL;DR: The Newton-Raphson and scoring methods are applied to the maximum likelihood equations derived from modified likelihood functions under the Gaussian assumption, to obtain asymptotically efficient estimates of the autoregressive matrix coefficients and moving average covariance matrices of the VARMA models.
Abstract: The purpose of this paper is to derive asymptotically efficient estimates of the autoregressive matrix coefficients and moving average covariance matrices of the vector autoregressive moving average (VARMA) models in both the time and frequency domains. To do this we apply the Newton-Raphson and scoring methods to the maximum likelihood equations derived from modified likelihood functions under the Gaussian assumption.

Journal ArticleDOI
01 Dec 1979-Metrika
TL;DR: In this article, it is shown that, given two processes of order r, one autoregressive and the other moving average but both with the same parameters, the generalized variance of all orders k ≥ 2r for the autoregressive process is exactly equal to the infinite-order generalized variance for the moving average process.
Abstract: In this paper we give a simple proof of the result that, for any integer r, given two processes of order r, one autoregressive and the other moving average but both with the same parameters, the generalized variance of all orders k ≥ 2r for the autoregressive process is exactly equal to the infinite-order generalized variance for the moving average process.


Journal ArticleDOI
TL;DR: In this paper, an online procedure for estimating the parameters of linear discrete time systems when input and output are subjected to measurement noise of unknown statistics is described, which is derived through stochastic approximation.
Abstract: This paper describes an on-line procedure for estimating the parameters of linear discrete-time systems when input and output are subject to measurement noise of unknown statistics. The algorithm is derived through stochastic approximation. To ensure unbiased parameter estimates, the correlated part of the residuals is first estimated by modelling the residuals as an autoregressive series, and then subtracted from the estimated residuals. The algorithm estimates the system parameters and noise parameters simultaneously. Three gain expressions are derived for the estimation algorithm: (a) scalar gain, (b) diagonal matrix gain, and (c) square matrix gain.

Journal ArticleDOI
TL;DR: In this paper, an alternative parameterization for second order a.r. models, which assists understanding and interpretation, is discussed; the models are most plausible if certain restrictions are placed on the autoregressive parameters, which in turn lead to restrictions on the autocorrelation coefficients.
Abstract: In time series analysis, autoregressive (a.r.) models are often fitted successfully to data, and such models are usually among the first to be taught. It is therefore important to understand why such models are useful in practice. An alternative parameterization for second order a.r. models, which assists understanding and interpretation, is discussed. The models are most plausible if certain restrictions are placed on the autoregressive parameters, which in turn lead to restrictions on the autocorrelation coefficients. Fitting the reparameterized model is straightforward, and the reparameterization may be extended, less usefully, to higher order a.r. models. The second‐order model is fitted to some meteorological data.
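
The paper's specific reparameterization is not reproduced here, but one widely used alternative description of an AR(2) with complex characteristic roots is in terms of their modulus (damping) and the implied quasi-period; the sketch below assumes that parameterization and the coefficient convention x_t = a1*x_{t-1} + a2*x_{t-2} + e_t.

```python
import numpy as np

def ar2_damping_period(a1, a2):
    """Map AR(2) coefficients (x_t = a1*x_{t-1} + a2*x_{t-2} + e_t) to the modulus and
    quasi-period of the complex characteristic roots (requires a1^2 + 4*a2 < 0)."""
    if a1 ** 2 + 4 * a2 >= 0:
        raise ValueError("characteristic roots are real; no quasi-cycle")
    rho = np.sqrt(-a2)                 # damping: |root| < 1 for stationarity
    theta = np.arccos(a1 / (2 * rho))  # angular frequency of the quasi-cycle
    return rho, 2 * np.pi / theta      # (damping, quasi-period in time steps)

print(ar2_damping_period(1.0, -0.5))   # rho ~ 0.71, quasi-period of 8 steps
```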


Proceedings ArticleDOI
01 Apr 1979
TL;DR: The problem of estimating the spectrum of a complex stochastic process observed in additive white noise is reduced to one of parameter identification by assuming an autoregressive model for the process.
Abstract: The problem of estimating the spectrum of a complex stochastic process observed in additive white noise is considered. The problem is reduced to one of parameter identification by assuming an autoregressive model for the process. An algorithm for estimating these parameters is derived. Examples are presented to illustrate the use of the algorithm.

Proceedings ArticleDOI
03 Dec 1979
TL;DR: A procedure for improving the estimate of the mean (in the mean-square sense), compared to the sample mean x̄, for a weakly-stationary autoregressive time-series is presented, and the loss of efficiency caused by estimating the order and coefficients is shown to be small.
Abstract: A procedure for improving the estimate of the mean (in the mean-square sense), compared to the sample mean x̄, for a weakly-stationary autoregressive time-series is presented. The improvement provided by the procedure is especially evident when only a relatively short sample time-series is available and the power spectrum of the underlying process is not flat (white). A detailed empirical evaluation of the new estimation procedure, based on a simulation analysis, is presented. The new procedure is based on an estimator that would be BLUE (best linear unbiased estimator) if the order and coefficients of the underlying autoregressive process were known. The loss of efficiency caused by estimating the order and coefficients is shown to be small.
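
A minimal sketch of the underlying idea, assuming (for illustration only) an AR(1) error structure rather than the paper's estimated order and coefficients: the GLS/BLUE-style weights are proportional to Σ⁻¹1, where Σ is the covariance matrix implied by the fitted model.

```python
import numpy as np

def gls_mean_ar1(y):
    """GLS/BLUE-style mean estimate assuming an AR(1) structure whose coefficient
    is itself estimated from the data (illustrative sketch)."""
    n = len(y)
    d = y - y.mean()
    phi = np.sum(d[1:] * d[:-1]) / np.sum(d[:-1] ** 2)   # rough AR(1) coefficient
    # covariance matrix (up to scale) of a stationary AR(1): Sigma[i, j] = phi^|i-j|
    Sigma = phi ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    w = np.linalg.solve(Sigma, np.ones(n))               # weights proportional to Sigma^{-1} 1
    return w @ y / w.sum()

rng = np.random.default_rng(5)
x = np.zeros(300)
for t in range(1, 300):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()
y = 2.0 + x
print(gls_mean_ar1(y), y.mean())   # GLS-weighted mean vs the plain sample mean
```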

Journal ArticleDOI
TL;DR: In this article, a simple relationship between the partial autocorrelations of a process realisation which requires first differencing, and those for that same sequence of differences is shown.
Abstract: An observation, from practical experience with analysing univariate time series, suggests a simple relationship between the partial autocorrelations of a process realisation which requires first differencing, and those for that same sequence of differences. The asymptotic result is proved for a general once integrated autoregressive process, but an extension to twice integrated processes is shown not to be relevant for finite samples. The results are illustrated with examples from the literature.
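
A small sketch that simply computes the two sets of sample partial autocorrelations being related; the once-integrated AR(1) data and the use of statsmodels' pacf are illustrative choices, and the paper's exact asymptotic relationship is not asserted here.

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

# simulate a once-integrated AR(1): the differences follow d_t = 0.5*d_{t-1} + e_t
rng = np.random.default_rng(6)
d = np.zeros(500)
for t in range(1, 500):
    d[t] = 0.5 * d[t - 1] + rng.standard_normal()
y = np.cumsum(d)                       # the level series, which requires first differencing

print(np.round(pacf(y, nlags=5), 2))   # partial autocorrelations of the level series
print(np.round(pacf(d, nlags=5), 2))   # partial autocorrelations of the differences
```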