
Showing papers on "STAR model" published in 1983


Journal ArticleDOI
TL;DR: In this paper, the authors established several almost sure asymptotic properties of general autoregressive processes and obtained a proof of the strong consistency of the least-squares estimates of the parameters of the process without any assumption on the roots of the characteristic polynomial.

168 citations




Journal ArticleDOI
TL;DR: In this article, an expository account of multivariate autoregressive moving average models and an extended sample cross-correlation approach for practical model identification are presented; model selection, however, is not considered.
Abstract: This article provides an expository account of the multivariate autoregressive moving average models and proposes an extended sample cross-correlation approach for practical model identification. An iterative model building procedure for applying these models to real data is discussed and demonstrated by analyzing the 5-series U.S. Hog Data.

124 citations


Journal ArticleDOI
TL;DR: In this paper, the authors discuss methods for modelling multivariate autoregressive time series in terms of a smaller number of index series which are chosen to provide as complete a summary as possible of the past information contained in the original series necessary for prediction purposes.
Abstract: SUMMARY We discuss methods for modelling multivariate autoregressive time series in terms of a smaller number of index series which are chosen to provide as complete a summary as possible of the past information contained in the original series necessary for prediction purposes. The maximum likelihood method of estimation and asymptotic properties of estimators of the coefficients which determine the index variables, as well as the corresponding autoregressive coefficients, are discussed. A numerical example is presented to illustrate the use of the autoregressive index models.

91 citations


Journal ArticleDOI
TL;DR: In this article, an autoregressive parameter estimator for short data records and/or sharply peaked spectra is presented. The technique is a closer approximation to the true maximum likelihood estimator than that obtained using linear prediction techniques, and it operates in a recursive model order fashion, which allows one to successively fit higher order models to the data.
Abstract: A new method of autoregressive parameter estimation is presented. The technique is a closer approximation to the true maximum likelihood estimator than that obtained using linear prediction techniques. The advantage of the new algorithm is mainly for short data records and/or sharply peaked spectra. Simulation results indicate that the parameter bias as well as the variance is reduced over the Yule-Walker and the forward-backward approaches of linear prediction. Also, spectral estimates exhibit more resolution and fewer spurious peaks. A stable all-pole filter estimate is guaranteed. The algorithm operates in a recursive model order fashion, which allows one to successively fit higher order models to the data.

71 citations
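The Yule-Walker approach that the paper uses as its linear-prediction baseline can be sketched in a few lines. This is an illustrative Python implementation under common conventions (zero-mean data, biased sample autocovariances); the function name and the simulated AR(2) example are mine, not from the paper.

```python
import numpy as np

def yule_walker(x, p):
    """Estimate AR(p) coefficients from data x via the Yule-Walker equations.

    Solves R a = r, where R is the Toeplitz matrix of sample autocovariances
    at lags 0..p-1 and r collects lags 1..p.
    Returns (a, sigma2): coefficients and innovation variance estimate.
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # biased sample autocovariances at lags 0..p
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:])
    sigma2 = r[0] - np.dot(a, r[1:])
    return a, sigma2

# simulate an AR(2) series and recover its coefficients
rng = np.random.default_rng(0)
true_a = np.array([0.6, -0.3])
x = np.zeros(5000)
for t in range(2, len(x)):
    x[t] = true_a[0] * x[t - 1] + true_a[1] * x[t - 2] + rng.standard_normal()
a_hat, s2 = yule_walker(x, 2)
```

With 5000 observations the estimates land close to (0.6, -0.3) and the innovation variance near 1, which is the behaviour the paper's bias comparisons start from.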


Book ChapterDOI
TL;DR: The use of autoregressive spectral densities as exact models and as approximating models for true spectral densities is often questioned by skeptical statisticians on the ground that their use in general is ad hoc and without theoretical justification.
Abstract: The problem of spectral analysis of time series is clearly of great interest to the many applied scientists who use spectral analysis in their scientific research. It should be of great interest to statisticians, because it embodies the prototypes of two of the great problems of modern statistics: functional inference and modeling. The estimation of a function often has similar features to model identification, because a function can be parameterized exactly by a countable infinity of parameters. It should be noted that the inverse-correlation function is positive definite. However, the cepstral-correlation function is not. These new types of correlation functions are introduced, because they may provide more parsimonious parameterizations in the sense that they decay to 0 faster than does the correlation function. The use of autoregressive spectral densities as exact models and as approximating models for true spectral densities is often questioned by skeptical statisticians on the ground that their use in general is ad hoc and without theoretical justification.

51 citations


Journal ArticleDOI
TL;DR: In this article, two asymptotic expansions for the distribution of an estimator of the parameter in a first-order autoregressive process are derived, according to two situations.
Abstract: In this paper, two asymptotic expansions for the distribution of an estimator of the parameter in a first-order autoregressive process are derived, according to two situations. Some well known estimators are special cases of the estimator discussed here. The series expansions are carried to terms of order T⁻¹.

28 citations
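For concreteness, in the simplest zero-intercept case the estimator in question is the least-squares ratio ρ̂ = Σ xₜxₜ₋₁ / Σ xₜ₋₁². A minimal simulation sketch (the parameter value, sample size, and seed are illustrative, not from the paper):

```python
import numpy as np

# Least-squares estimator of the AR(1) parameter: for x_t = rho * x_{t-1} + e_t,
# rho_hat = sum_t x_t x_{t-1} / sum_t x_{t-1}^2.
rng = np.random.default_rng(1)
rho, T = 0.5, 10_000
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.standard_normal()
rho_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
```

The Edgeworth expansions studied in the paper refine the usual normal approximation to the finite-sample distribution of exactly this kind of statistic.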


Posted Content
TL;DR: In this article, a sufficient condition is given such that first-order autoregressive processes are strong mixing, which is specified in terms of the univariate distribution of the independent identically distributed innovation random variables.
Abstract: A sufficient condition is given such that first-order autoregressive processes are strong mixing. The condition is specified in terms of the univariate distribution of the independent identically distributed innovation random variables. Normal, exponential, uniform, Cauchy, and many other continuous innovation random variables are shown to satisfy the condition. In addition, an example of a first-order autoregressive process which is not strong mixing is given. This process has Bernoulli (p) innovation random variables and any autoregressive parameter in (0,1/2).

21 citations
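The counterexample is easy to simulate. A sketch, with θ and p chosen for illustration (the paper only requires Bernoulli(p) innovations and an autoregressive parameter in (0, 1/2)):

```python
import numpy as np

# X_t = theta * X_{t-1} + eps_t with i.i.d. Bernoulli(p) innovations.
# The paper shows such a process with theta in (0, 1/2) is NOT strong mixing,
# although its sample paths are perfectly well behaved: starting from 0,
# 0 <= X_t < 1/(1 - theta) for all t.
rng = np.random.default_rng(2)
theta, p, T = 0.4, 0.5, 2000   # theta and p chosen for illustration
x = np.zeros(T)
for t in range(1, T):
    x[t] = theta * x[t - 1] + rng.binomial(1, p)
```

Nothing in the path reveals the failure of strong mixing; the obstruction lies in how finely the discrete innovations encode the remote past, not in any visible pathology.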



Posted Content
TL;DR: In this paper, a nonstationary generalization of the classical Yule-Walker equations, relating the time-varying autocorrelations of an autoregressive process to the coefficients of the possible models for this process, is given.
Abstract: A nonstationary generalization of the classical Yule-Walker equations, relating the (time-varying) autocorrelations of an autoregressive process to the coefficients of the possible models for this process, is given. The corresponding theoretical model-building (or spectral factorization) problem, i.e. that of expressing the above mentioned models in terms of the autocorrelations, is solved. This paper, as well as several others, is part of a work whose purpose is a systematic study of time-varying ARMA models.

19 citations


Journal ArticleDOI
TL;DR: In this paper, the authors derived analytically the general formulae of the Edgeworth coefficients for the LS estimator in the AR(1) model with exogenous variables, considered by Tse [13].
Abstract: IN RECENT YEARS Edgeworth expansions for some finite sample distributions associated with time series models have been studied by several authors. The pioneering works in this field are Phillips [6 and 7] which give the explicit formulae of the expansions for the distributions of the least squares (LS) estimator and t ratio in the first order autoregressive (abbreviated AR(1)) model. Sargan and Tse [10] and Tse [13] deal with the Edgeworth expansion for the LS estimator in the AR(1) model with exogenous variables based on Sargan [9], which gives the formulae of the Edgeworth coefficients for a certain error function of the econometric estimators. These formulae seem quite general but involve extremely tedious calculations, and it is often almost impossible to work out the coefficients. In fact, Sargan and Tse [10] and Tse [13] do not derive their formulae analytically but calculate them partly analytically and partly numerically. In the present paper we derive analytically the general formulae of the Edgeworth coefficients for the LS estimator in the AR(1) model with exogenous variables, considered by Tse [13]. We note here the differences and relative contributions in both Tse's [13] and our work. Firstly, in [13] the LS estimator â is expressed in terms of eight random variables, i.e. two quadratic and six linear functions of normal variates. His setting seems very general, but may not be feasible analytically. In this paper, an error function for â is defined by only two random variables q1 and q2 as below, both of which are quadratic forms of normal variates. This simplicity in the definition makes the calculations involved in the Edgeworth coefficients feasible, but limits us to the ratios of quadratic forms. Secondly, we can draw some interesting interpretations from the resulting formulae, though they are very complicated. Needless to say, we could not draw any interpretations analytically, if the formulae were unknown.
Further, our result can be reduced to a simple form when the exogenous part does not exist or is merely a constant. Lastly, Tse [13] does not make explicit assumptions about the exogenous variables. But without such assumptions, the Edgeworth expansion is not valid; we impose some weak restrictions which

Journal ArticleDOI
TL;DR: A method for effecting an autoregressive moving average (ARMA) model estimate is presented; it has an elegant algebraic structure, and its modeling performance in spectral estimation applications has been empirically found to typically exceed that of such contemporary techniques as the periodogram, the Burg method, and the Box-Jenkins method.
Abstract: The ability to generate rational models of time series plays an important role in such applications as adaptive filtering, spectral estimation, digital control, array processing, and forecasting. A method for effecting an autoregressive moving average (ARMA) model estimate is presented which possesses a number of admirable properties: 1) it has an elegant algebraic structure, 2) its modeling performance in spectral estimation applications has been empirically found to typically exceed that of such contemporary techniques as the periodogram, the Burg method, and the Box-Jenkins method on a variety of problems, 3) it is implementable by computationally efficient algorithms, and 4) it is based on pseudomaximum likelihood concepts. Taken in combination, these properties mark this method as being an effective tool in challenging applications requiring high modeling performance in a real time setting.


Journal ArticleDOI
TL;DR: In this article, the authors presented the limit distribution for the score vector of a growth curve model assuming both stationary and explosive autoregressive (A.R.) errors, where the autocorrelation parameters are treated as nuisance parameters.
Abstract: This paper presents the limit distribution (as the number of time points increases) for the score vector of a growth curve model assuming both stationary and explosive autoregressive (A.R.) errors. Limit distributions of the score statistic and the likelihood-ratio statistic for testing composite hypotheses about the regression parameters of several growth curves, when the autocorrelation parameters are treated as nuisance parameters, are presented.

Journal ArticleDOI
TL;DR: In this article, the problem of estimating panel autoregressive time series is considered and an empirical Bayes procedure is suggested to estimate the parameters using information from all realizations from an unknown distribution.
Abstract: . The problem of estimating panel autoregressive time series is considered. The autoregressive parameters vary over independent realizations from an unknown distribution. An empirical Bayes procedure is suggested to estimate the parameters using information from all realizations.

Book ChapterDOI
01 Jan 1983
TL;DR: In this article, the authors discuss the properties of estimates of the mean square error of prediction in autoregressive models, and discuss the bias, to terms of order n⁻¹, of several estimates for an optimal predictor of finite memory.
Abstract: This chapter discusses the properties of estimates of the mean square error of prediction in autoregressive models. A selection criterion is used to choose a suitable finite order approximation to the infinite order autoregression. There are several criteria available: the final prediction error method of Akaike; AIC, an information criterion; and the criterion autoregressive transfer function method of Parzen. All of these methods require an estimate of the mean square error of one-step-ahead prediction when an optimal predictor of finite memory is used. The chapter also discusses the bias, to terms of order n⁻¹, of several estimates of the mean square error of one-step-ahead prediction for an optimal predictor of finite memory. In the estimates, the autoregressive coefficients forming the optimal predictor of finite memory are estimated by regression methods. The chapter describes estimates of the autoregressive coefficients, constructed from both biased and unbiased estimates of the population covariances.
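The quantity all of these criteria consume, an estimate of the one-step prediction error variance at each candidate order, can be illustrated with a regression-based fit. A Python sketch using AIC and one common form of Akaike's FPE; the simulated AR(2) data and the helper name are mine, not from the chapter:

```python
import numpy as np

# Simulate an AR(2) series (illustrative data).
rng = np.random.default_rng(3)
n = 2000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.75 * x[t - 1] - 0.5 * x[t - 2] + rng.standard_normal()

def resid_var(x, p):
    """Estimated one-step prediction error variance of an OLS-fitted AR(p)."""
    if p == 0:
        return float(np.var(x))
    # column j holds lag j+1 of the series
    X = np.column_stack([x[p - j - 1:len(x) - j - 1] for j in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.mean((y - X @ a) ** 2))

pmax = 10
k = np.arange(pmax + 1)
s2 = np.array([resid_var(x, p) for p in k])
aic = n * np.log(s2) + 2 * k          # AIC, up to an additive constant
fpe = s2 * (n + k + 1) / (n - k - 1)  # Akaike's FPE, one common form
best_aic = int(np.argmin(aic))
best_fpe = int(np.argmin(fpe))
```

The residual variances decrease with order by construction; the chapter's point is that these raw decreases are biased estimates of the true prediction mean square error, which is why the criteria penalize the order term.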

Book ChapterDOI
01 Jan 1983
TL;DR: This paper discusses how models for a univariate or multivariate time series Y(t) can be formulated as hypotheses about the information divergence between alternative models for the conditional probability density of Y (t) given various bases involving past, current, and future values of Y(.) and related time series x(.).
Abstract: Statisticians, economists, and system engineers are becoming aware that to identify models for time series and dynamic systems, information theoretic ideas can play a valuable (and unifying) role. This paper discusses how models for a univariate or multivariate time series Y(t) can be formulated as hypotheses about the information divergence between alternative models for the conditional probability density of Y(t) given various bases involving past, current, and future values of Y(.) and related time series x(.). To determine sets of variables that are sufficient to forecast Y(t), and thus to determine a model for Y(t), an approach is presented which estimates and compares various information increments. These information numbers play a central role in studies of causality and feedback. Approximating autoregressive schemes are used to form estimators of the many information numbers that one might compare to identify models for a time series.

Journal ArticleDOI
TL;DR: An efficient algorithm is presented for computing the covariance sequence of a multichannel autoregressive process represented by a set of reflection coefficients that is shown to be the impulse response of a certain lattice filter related to the optimal predictor.
Abstract: An efficient algorithm is presented for computing the covariance sequence of a multichannel autoregressive process represented by a set of reflection coefficients. The covariance sequence is shown to be the impulse response of a certain lattice filter related to the optimal predictor.
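In the scalar case the same computation can be written directly from the Levinson recursion: each reflection coefficient delivers the next covariance lag, and past the model order the AR recursion itself extends the sequence. A single-channel sketch (the paper's multichannel algorithm replaces these scalars by matrices; the function name is mine):

```python
import numpy as np

def refl_to_covariance(kappas, r0, nlags):
    """Covariance sequence r[0..nlags] of a scalar AR process specified by its
    reflection coefficients (single-channel analogue of the multichannel case).

    Runs the Levinson recursion "forward": at order m the new covariance is
    r[m] = kappa_m * sigma2_{m-1} + sum_j a_j r[m-j]; past the model order,
    the AR recursion extends the sequence.  Convention: x_t = sum_j a_j x_{t-j} + e_t.
    """
    p = len(kappas)
    r = np.zeros(nlags + 1)
    r[0] = r0
    a = np.zeros(p + 1)
    sigma2 = r0
    for m in range(1, p + 1):
        r[m] = kappas[m - 1] * sigma2 + np.dot(a[1:m], r[1:m][::-1])
        a[1:m] = a[1:m] - kappas[m - 1] * a[1:m][::-1]
        a[m] = kappas[m - 1]
        sigma2 *= 1 - kappas[m - 1] ** 2
    for m in range(p + 1, nlags + 1):
        r[m] = np.dot(a[1:], r[m - p:m][::-1])
    return r
```

As a sanity check, a single reflection coefficient of 0.5 with r(0) = 1 reproduces the geometric covariance sequence of an AR(1) process with coefficient 0.5.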

Journal ArticleDOI
01 Jan 1983
TL;DR: In this paper, some adaptive procedures for estimating the unknown parameters of autoregressive and moving average processes are considered. In the case of AR(p) and MA(1) processes, sequences of estimators converging with probability one and in mean square are given.
Abstract: In the paper we consider some adaptive procedures for estimating the unknown parameters of autoregressive and moving average processes. In the case of AR(p) and MA(1) processes, sequences of estimators converging with probability one and in mean square are given.

Journal ArticleDOI
TL;DR: This paper provides explicit estimates of the eigenvalues of the covariance matrix of an autoregressive process of order one; explicit error bounds are established in closed form.

Proceedings ArticleDOI
01 Apr 1983
TL;DR: A parametric technique using noncausal spatial autoregressive models for spectral estimation is given and the usefulness of the method is illustrated by resolving two closely spaced sinusoids on the plane.
Abstract: Two-dimensional spectral estimation from raw data is of interest in signal and image processing. In this paper, a parametric technique using noncausal spatial autoregressive models for spectral estimation is given. The spatial autoregressive models characterize the statistical dependency of the observation at location s on its neighbors in all directions. Once an appropriate model is fitted, the spectrum is a function of the model parameters. By assuming specific boundary conditions, maximum likelihood estimates of model parameters are obtained. The usefulness of the method developed here is illustrated by resolving two closely spaced sinusoids on the plane.
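Once the parameters are fitted, the spectrum is evaluated directly from them. A one-dimensional causal analogue of that step (the paper's models are two-dimensional and noncausal; the function and the example values are illustrative):

```python
import numpy as np

def ar_spectrum(a, sigma2, freqs):
    """Power spectrum of a causal 1-D AR model x_t = sum_k a_k x_{t-k} + e_t:
    S(f) = sigma2 / |1 - sum_k a_k exp(-2*pi*1j*f*k)|**2."""
    a = np.asarray(a, dtype=float)
    k = np.arange(1, len(a) + 1)
    denom = 1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a
    return sigma2 / np.abs(denom) ** 2

# a single pole near z = 1 concentrates power at low frequencies
freqs = np.linspace(0.0, 0.5, 501)
S = ar_spectrum([0.9], 1.0, freqs)
```

Sharp spectral peaks, such as the closely spaced sinusoids resolved in the paper, correspond to model poles close to the unit circle, where the denominator nearly vanishes.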

Journal ArticleDOI
01 Dec 1983 - Metrika
TL;DR: In this article, the exact likelihood function for the space-time autoregressive moving average model is derived.
Abstract: Recently the space time autoregressive moving average models have become popular and this paper derives the exact likelihood function for this model.

Journal ArticleDOI
TL;DR: In this paper, the first and second order stationarity conditions for an autoregressive model with random coefficients are obtained and the asymptotic mean squared error of an h-step ahead forecast is also considered.
Abstract: The first and second order stationarity conditions for an autoregressive model with random coefficients are obtained. In addition, for such a type of model, the asymptotic mean squared error of an h-step ahead forecast is also considered.

Journal ArticleDOI
01 Oct 1983
TL;DR: In this article, an estimation method for the linear credibility estimator for autoregressive models is presented; the emphasis lies more on practicability than on optimality of the estimation procedure.
Abstract: Recently the author (Kremer (1982a)) and Sundt (1982) developed recursion formulas for the linear credibility estimator in general evolutionary models. For practical application one needs handy estimation methods for the parameters of the models. In the sequel such an estimation method is presented for the subclass of autoregressive models. Emphasis lies more on practicability than on optimality of the estimation procedure.

Journal ArticleDOI
TL;DR: In this paper, a linear autoregressive model was used to forecast the monthly rainfall in a stationary time series and the order of this model was chosen by means of a t-test or F-test.
Abstract: Methods of fitting a linear autoregressive model to a stationary time series are summarized. Parameters of the linear autoregressive model were estimated by the Durbin stepwise procedure and the order of this model was chosen by means of a t-test or F-test. An illustrative example used to forecast the monthly rainfall is also presented.
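A minimal version of the fit-then-forecast workflow can be sketched as follows (ordinary least squares stands in for the Durbin stepwise procedure the paper uses; the function names and the toy data are mine):

```python
import numpy as np

def fit_ar_ols(x, p):
    """Fit AR(p) by least-squares regression of x_t on its p lagged values."""
    X = np.column_stack([x[p - j - 1:len(x) - j - 1] for j in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a  # a[j] multiplies lag j+1

def forecast(x, a, h):
    """Iterate the fitted AR recursion h steps ahead from the end of x."""
    hist = list(x[-len(a):])
    out = []
    for _ in range(h):
        nxt = sum(a[j] * hist[-j - 1] for j in range(len(a)))
        hist.append(nxt)
        out.append(nxt)
    return np.array(out)

# toy check: a noiseless geometric series is exactly AR(1) with coefficient 0.5
x = 0.5 ** np.arange(10)
a = fit_ar_ols(x, 1)
f = forecast(x, a, 2)
```

On real rainfall data one would difference or deseasonalize first so that the stationarity assumption of the model holds.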

Book ChapterDOI
Paul Doukhan
01 Jan 1983
TL;DR: In this article, the authors show that some theoretical results concerning autoregressive processes are indeed applicable and choose discretization parameters for the computation of nonparametric kernel estimators for these processes.
Abstract: The main object of this work is to show that some theoretical results concerning autoregressive processes [4] are indeed applicable: first, we choose discretization parameters for the computation of nonparametric kernel estimators for these processes; then, we investigate some "bad" cases and some "good" cases; it seems that effective computations generally give better results than those obtained in theory. Finally, we study the relation between the deterministic case of iterations and the non-deterministic case of autoregressive processes. In addition, we describe the behaviour of the invariant measures associated with the relevant process when there is little white noise.

Journal ArticleDOI
TL;DR: This method is an expansion of Levinson's algorithm that reduces the problem of estimating autoregressive and moving-average parameters from the criterion to that of solving a 2 × 2 block Toeplitz matrix.
Abstract: Levinson's algorithm is one of the fast algorithms for estimating the parameters of an autoregressive model. Since this algorithm is recursive with respect to the order of a model, it is an effective method to obtain models of various orders when the true order is unknown. This paper proposes a fast algorithm to estimate the parameters of an autoregressive and a moving-average model. The parameter estimation can be accomplished using a criterion based on the least squares method. The problem of estimating autoregressive and moving-average parameters from the criterion is reduced to that of solving a 2 × 2 block Toeplitz matrix. This method is an algorithm to solve this block Toeplitz matrix rapidly and it does not require the calculation of the inverse matrix. Also, it is recursive with respect to the order; the orders of the autoregressive model and the moving-average model can be changed independently of each other. This method is an expansion of Levinson's algorithm and this paper shows the relationship between this method and Levinson's algorithm. Furthermore, properties of this method are discussed and, as an example of its application, an experiment in model reduction is performed.
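The order-recursive structure the paper builds on is the classical Levinson-Durbin recursion, which yields the models of every order up to p as by-products of a single pass. A scalar sketch (the paper's 2 × 2 block-Toeplitz extension is not reproduced here):

```python
import numpy as np

def levinson_durbin(r, p):
    """Levinson's algorithm: solve the Yule-Walker (Toeplitz) equations for an
    AR model, recursively in the order.  Convention: x_t = sum_j a_j x_{t-j} + e_t.

    Given autocovariances r[0..p], returns the coefficient vectors of orders
    1..p and the prediction-error variances of orders 0..p, which is what
    makes comparing models of different orders cheap.
    """
    a = np.zeros(p + 1)
    coeffs, variances = [], [r[0]]
    sigma2 = r[0]
    for m in range(1, p + 1):
        # reflection coefficient for the step from order m-1 to order m
        kappa = (r[m] - np.dot(a[1:m], r[1:m][::-1])) / sigma2
        a[1:m] = a[1:m] - kappa * a[1:m][::-1]
        a[m] = kappa
        sigma2 *= 1 - kappa ** 2
        coeffs.append(a[1:m + 1].copy())
        variances.append(sigma2)
    return coeffs, variances
```

For a geometric covariance sequence (an exact AR(1)), the order-2 step produces a zero reflection coefficient and leaves the prediction-error variance unchanged, illustrating how redundant orders announce themselves in the recursion.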


Journal ArticleDOI
Y. Inouye
TL;DR: In this article, the autocorrelation method for fitting multichannel time series is extended from the nondegenerate case to the general case, including the degenerate case.
Abstract: This note extends the autocorrelation method for fitting multichannel time series from the nondegenerate case to the general case. It shows that the AR model obtained by solving the normal equations is always stable even in the degenerate case. The multichannel Levinson algorithm for solving the normal equations is extended from the nondegenerate case to the general case.