
Showing papers on "STAR model" published in 1975


Journal ArticleDOI
Keith Ord
TL;DR: In this paper, a simplified computational scheme is given and extended to mixed regressive-autoregressive models for spatial interaction, and the ML estimator is compared with several alternatives.
Abstract: Autoregressive models for spatial interaction have been proposed by several authors (Whittle [15] and Mead [11], for example). In the past, computational difficulties with the ML approach have led to the use of alternative estimators. In this article, a simplified computational scheme is given and extended to mixed regressive-autoregressive models. The ML estimator is compared with several alternatives.

1,308 citations
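A minimal sketch of maximum-likelihood estimation for a mixed regressive-autoregressive (spatial lag) model, assuming the eigenvalue simplification usually associated with this line of work, in which log|I - rho*W| is evaluated as the sum of log(1 - rho*lambda_i) over the eigenvalues of the weight matrix W. The weight matrix, the simulated data, and the grid search are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Mixed regressive-autoregressive model: y = rho*W y + X beta + eps, eps ~ N(0, sigma^2 I).
# The Jacobian term log|I - rho*W| is computed from the eigenvalues of W.

rng = np.random.default_rng(0)
n = 50

# Row-standardised "two nearest neighbours on a ring" weight matrix (made up for illustration)
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true, rho_true = np.array([1.0, 2.0]), 0.4
y = np.linalg.solve(np.eye(n) - rho_true * W, X @ beta_true + rng.normal(size=n))

lam = np.linalg.eigvals(W).real             # eigenvalues of the weight matrix
XtX_inv_Xt = np.linalg.solve(X.T @ X, X.T)  # reused inside the profile likelihood

def concentrated_loglik(rho):
    """Profile log-likelihood in rho, with beta and sigma^2 concentrated out."""
    Ay = y - rho * (W @ y)                  # (I - rho*W) y
    e = Ay - X @ (XtX_inv_Xt @ Ay)          # residuals at beta_hat(rho)
    return np.sum(np.log(1.0 - rho * lam)) - 0.5 * n * np.log(e @ e / n)

grid = np.linspace(-0.99, 0.99, 1999)       # crude grid search over rho
rho_hat = grid[np.argmax([concentrated_loglik(r) for r in grid])]
beta_hat = XtX_inv_Xt @ (y - rho_hat * (W @ y))
print(rho_hat, beta_hat)
```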


01 Jul 1975
TL;DR: An approach to the solution of these problems of time series analysis through a criterion called CAT (an abbreviation for criterion autoregressive transfer function) is described, enabling the approach to empirical time series analysis introduced in Parzen (1969) to be carried out.
Abstract: Three aims of the analysis of a finite sample Y(t), t = 1, 2, ..., T of a univariate or multivariate time series can be distinguished: (1) spectral analysis, (2) model identification, and (3) prediction. In this paper we consider the case in which the time series, modelled by a joint autoregressive scheme, is a multiple time series which is stationary, normal, and zero mean. We describe an approach to the solution of these problems of time series analysis through a criterion called CAT (an abbreviation for criterion autoregressive transfer function). CAT enables one to choose the order of an approximating autoregressive scheme which is 'optimal' in the sense that its transfer function is a minimum overall mean square error estimator (called ARTFACT) of the infinite autoregressive transfer function (ARTF) of the filter which transforms the time series to its innovations (white noise). Algorithms for choosing the order of an ARTFACT (autoregressive transfer function approximation converging to the truth) enable one to carry out the approach to empirical multiple time series analysis introduced in Parzen (1969), in particular autoregressive spectral estimation of the spectral density matrix of a stationary multiple time series. Such estimators for univariate time series have been very successfully applied in geophysics (see Ulrych and Bishop (1975)), where they are called 'maximum entropy spectral estimators.' This paper provides a basis for an extension of these procedures to multiple time series.

111 citations
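A hedged sketch of the workflow the abstract describes: choose the order of an approximating autoregressive scheme by an order-selection criterion, then use the fitted coefficients as an autoregressive ("maximum entropy") spectral estimator. Parzen's CAT formula is not reproduced here; AIC is substituted as the criterion, and the simulated AR(2) series is made up for illustration.

```python
import numpy as np

def levinson_durbin(r, pmax):
    """Fit AR(1..pmax) from autocovariances r[0..pmax]; return, for every order,
    the AR coefficients and the innovation (one-step prediction error) variance."""
    a, sigma2 = {0: np.array([])}, {0: r[0]}
    for p in range(1, pmax + 1):
        prev = a[p - 1]
        k = (r[p] - prev @ r[1:p][::-1]) / sigma2[p - 1]   # reflection coefficient
        a[p] = np.concatenate([prev - k * prev[::-1], [k]])
        sigma2[p] = sigma2[p - 1] * (1.0 - k ** 2)
    return a, sigma2

rng = np.random.default_rng(1)
N = 400
x = np.zeros(N)
for t in range(2, N):                       # simulate an AR(2) series for illustration
    x[t] = 1.2 * x[t - 1] - 0.5 * x[t - 2] + rng.normal()
x -= x.mean()

pmax = 10
r = np.array([x[: N - k] @ x[k:] / N for k in range(pmax + 1)])   # sample autocovariances
coeffs, sig2 = levinson_durbin(r, pmax)

# Order selection: AIC used as a stand-in for Parzen's CAT criterion
aic = {p: N * np.log(sig2[p]) + 2 * p for p in range(1, pmax + 1)}
p_hat = min(aic, key=aic.get)

# Autoregressive spectral density estimate at frequencies f (cycles per sample)
f = np.linspace(0, 0.5, 256)
a = coeffs[p_hat]
transfer = 1 - sum(a[j] * np.exp(-2j * np.pi * f * (j + 1)) for j in range(p_hat))
spec = sig2[p_hat] / np.abs(transfer) ** 2
print(p_hat, spec[:5])
```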


Journal ArticleDOI
TL;DR: A procedure for sequentially estimating the parameters and orders of mixed autoregressive moving-average signal models from time-series data is presented.
Abstract: A procedure for sequentially estimating the parameters and orders of mixed autoregressive moving-average signal models from time-series data is presented. Identification is performed by first identifying a purely autoregressive signal model. The parameters and orders of the mixed autoregressive moving-average process are then given from the solution of simple algebraic equations involving the purely autoregressive model parameters.

90 citations
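A hedged sketch of the two-stage idea summarised above: first identify a purely autoregressive model, then recover the mixed autoregressive moving-average parameters from it. The paper's specific algebraic equations (and its order-determination step) are not reproduced; instead the long-AR residuals are treated as innovation estimates and the ARMA(1,1) parameters are obtained by a simple regression on them, which is an assumption of this sketch rather than the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1000
phi, theta = 0.7, 0.4                      # true ARMA(1,1) parameters (made up)
e = rng.normal(size=N)
x = np.zeros(N)
for t in range(1, N):
    x[t] = phi * x[t - 1] + e[t] + theta * e[t - 1]

def ols_ar(x, p):
    """Least-squares fit of an AR(p) model; returns coefficients and residuals."""
    Y = x[p:]
    Z = np.column_stack([x[p - j: len(x) - j] for j in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return a, Y - Z @ a

# Stage 1: long autoregression approximating the ARMA process
p_long = 20
_, ehat = ols_ar(x, p_long)

# Stage 2: regress x[t] on x[t-1] and the estimated innovation ehat[t-1]
y = x[p_long + 1:]
X = np.column_stack([x[p_long:-1], ehat[:-1]])
(phi_hat, theta_hat), *_ = np.linalg.lstsq(X, y, rcond=None)
print(phi_hat, theta_hat)
```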


Journal ArticleDOI
H. Tong
TL;DR: This correspondence exploits a well-known fact concerning autoregressive (AR) signals plus white noise, and uses Akaike's information criterion to develop an efficient procedure for determining the order of the AR signal from noisy data.
Abstract: Davisson [13], [14] has considered the problem of determining the "order" of the signal from noisy data. Although interesting theoretically, his result is difficult to use in practice. In this correspondence, we exploit a well-known fact concerning autoregressive (AR) signals plus white noise and, using Akaike's information criterion [15], [17], we develop an efficient procedure for determining the order of the AR signal from noisy data. The procedure is illustrated numerically using both artificially generated and real data. The connection between the preceding problem and the classical statistical problem of factor analysis is discussed.

81 citations
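The well-known fact alluded to above is presumably that an autoregressive signal observed in additive white noise is no longer AR but ARMA(p, p): the noise inflates the lag-zero autocovariance while leaving higher lags untouched. A minimal numerical check for p = 1 (the simulated series and noise level are made up); this is not Tong's AIC-based procedure itself.

```python
import numpy as np

rng = np.random.default_rng(8)
phi, N = 0.8, 20000
x = np.zeros(N)
for t in range(1, N):
    x[t] = phi * x[t - 1] + rng.normal()    # AR(1) signal
y = x + 0.8 * rng.normal(size=N)            # observed signal plus white noise

def acov(z, k):
    """Biased sample autocovariance at lag k."""
    z = z - z.mean()
    return z[: len(z) - k] @ z[k:] / len(z)

# Naive Yule-Walker from lag 0 and 1 is biased toward zero by the noise;
# using lags >= 1 only (which the white noise does not touch) recovers phi.
print("naive  phi_hat =", acov(y, 1) / acov(y, 0))
print("lag>=2 phi_hat =", acov(y, 2) / acov(y, 1))
```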


Journal ArticleDOI
D. Tjøstheim
TL;DR: In this article, it was shown that seismic P-wave signals can be represented by parametric models of autoregressive type, which are models of the form X(t) - a1X(t-1) - ... - apX(t-p) = Z(t), where X(t) is the digitized short-period data time series defined by the P-wave signal and Z(t) is a white noise series.
Abstract: It is shown that seismic P-wave signals can be represented by parametric models of autoregressive type. These are models having the form X(t) - a1X(t-1) - ... - apX(t-p) = Z(t), where X(t) is the digitized short-period data time series defined by the P-wave signal, and Z(t) is a white noise series. The autoregressive analysis is undertaken for 40 underground nuclear explosions and 45 earthquakes from Eurasia. For each event a separate analysis of the noise preceding the event as well as of the P-wave coda has been included. It is found that in most cases a reasonable statistical fit is obtained using a low order autoregressive model. The autoregressive parameters characterize the power spectrum (equivalently, the autocorrelation function) of the P-wave signal and form a convenient basis for studying the possibilities of short-period discrimination between nuclear explosions and earthquakes. A preliminary discussion of these possibilities is included.

51 citations
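A minimal sketch of the modelling step described above: fitting a low-order autoregressive model X(t) - a1X(t-1) - ... - apX(t-p) = Z(t) to a digitized signal segment, so that the handful of fitted coefficients (which determine the AR power spectrum) can serve as discrimination features. The synthetic coda-like trace and sampling rate are assumptions; real short-period data would replace them.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, N = 0.05, 600                           # hypothetical 20 samples/s, 30 s window
t = np.arange(N) * dt
trace = np.exp(-0.3 * t) * np.sin(2 * np.pi * 1.5 * t) + 0.1 * rng.normal(size=N)

def fit_ar(x, p):
    """Least-squares AR(p) fit; returns coefficients a1..ap and residual variance."""
    Y = x[p:]
    Z = np.column_stack([x[p - j: len(x) - j] for j in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    resid = Y - Z @ a
    return a, resid.var()

for p in (2, 3, 4):                         # low orders, as found adequate in the paper
    a, s2 = fit_ar(trace - trace.mean(), p)
    print(p, np.round(a, 3), round(s2, 4))

# The fitted AR(p) spectrum is sigma^2*dt / |1 - sum_j a_j exp(-2*pi*i*f*j*dt)|^2,
# so the coefficient vector compactly summarises the P-wave spectral shape.
```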


01 Jan 1975
TL;DR: In this paper, an estimator of the parameters of a nonlinear time series regression when the statistical behavior of the disturbances can be reasonably approximated by an autoregressive model is presented.
Abstract: The article sets forth an estimator of the parameters of a nonlinear time series regression when the statistical behavior of the disturbances can be reasonably approximated by an autoregressive model. The sampling distribution of the estimator and relevant statistics is investigated both theoretically and using Monte Carlo simulations.

49 citations
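A hedged sketch of one common way to estimate a nonlinear regression whose disturbances follow an AR(1) scheme: iterate between a quasi-differenced nonlinear least-squares fit and re-estimation of the autoregressive coefficient from the residuals (a Cochrane-Orcutt-style iteration). The exponential mean function, the data, and the iteration itself are illustrative assumptions, not the article's estimator.

```python
import numpy as np
from scipy.optimize import least_squares

# Model: y_t = b0 * exp(b1 * x_t) + u_t, with AR(1) disturbances u_t = rho*u_{t-1} + e_t.
rng = np.random.default_rng(4)
T = 200
x = np.linspace(0, 2, T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.6 * u[t - 1] + 0.3 * rng.normal()
y = 1.5 * np.exp(0.8 * x) + u

def f(b, x):
    return b[0] * np.exp(b[1] * x)

rho, b = 0.0, np.array([1.0, 0.5])
for _ in range(10):
    def resid(b):
        r = y - f(b, x)
        return r[1:] - rho * r[:-1]          # quasi-differenced residuals at current rho
    b = least_squares(resid, b).x            # nonlinear LS step for the regression parameters
    r = y - f(b, x)
    rho = (r[:-1] @ r[1:]) / (r[:-1] @ r[:-1])   # AR(1) coefficient of the residuals

print(b, rho)
```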


Journal ArticleDOI
TL;DR: An algorithm for the sequential identification of the parameters of a stationary process described by an autoregressive (AR) model is presented in this paper, where a set of parameters obtained by a transformation of the AR model is introduced which leads to certain computational advantages.
Abstract: An algorithm for the sequential identification of the parameters of a stationary process described by an autoregressive (AR) model is presented. A set of parameters obtained by a transformation of the AR model is introduced which leads to certain computational advantages. An example is presented to illustrate the use of the algorithm.

32 citations
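A sketch of sequential identification of the coefficients of a stationary AR process. The transformed parameter set that gives the paper its computational advantages is not reproduced; a standard recursive least-squares update of the AR(p) coefficients is used instead to illustrate the sequential idea.

```python
import numpy as np

rng = np.random.default_rng(5)
p, N = 2, 2000
a_true = np.array([1.1, -0.4])              # stationary AR(2) used for illustration
x = np.zeros(N)
for t in range(p, N):
    x[t] = a_true @ x[t - p: t][::-1] + rng.normal()

theta = np.zeros(p)                         # running estimate of (a1, ..., ap)
P = 1e3 * np.eye(p)                         # large initial covariance = diffuse prior
for t in range(p, N):
    z = x[t - p: t][::-1]                   # regressor vector (x[t-1], ..., x[t-p])
    k = P @ z / (1.0 + z @ P @ z)           # gain
    theta = theta + k * (x[t] - z @ theta)  # update the estimate with the new observation
    P = P - np.outer(k, z @ P)              # rank-one covariance downdate

print(theta)                                # should be close to a_true
```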




Journal ArticleDOI
TL;DR: In this paper, it is shown that either type of nonstationarity results in a sample autocorrelation function which in general fails to damp, and that consequently if m > d the usual identification procedure may result in over-differencing the series.
Abstract: Many time series in economics and other areas can be effectively represented as the sum of a polynomial trend and an autoregressive integrated moving average process, in some cases after allowing for a systematic model. Such series are nonstationary if either the degree m of the trend or the number d of autoregressive roots lying on the unit circle is greater than zero. Sufficient differencing of the series can eliminate either type of nonstationarity; however, if m > d the minimally differenced stationary series is noninvertible. It is shown that either type of nonstationarity results in a sample autocorrelation function which in general fails to damp, and that consequently if m > d the usual identification procedure may result in "over-differencing" the series. A method is presented for the identification of such series, and some simulated series are analyzed. Analogous problems for seasonal series are considered briefly.

16 citations
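A small numerical illustration of the identification problem described above, using a made-up series that is the sum of a linear trend (m = 1) and a stationary AR(1) component (d = 0): the sample autocorrelation function of the undifferenced series fails to damp, suggesting differencing, yet because m > d the differenced series, while stationary, is noninvertible.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 300
u = np.zeros(N)
for t in range(1, N):                       # stationary AR(1) component (d = 0)
    u[t] = 0.5 * u[t - 1] + rng.normal()
z = 0.5 * np.arange(N) + u                  # linear trend (m = 1) plus AR(1)

def sample_acf(x, nlags):
    """Sample autocorrelations at lags 1..nlags."""
    x = x - x.mean()
    c0 = x @ x / len(x)
    return np.array([x[:-k] @ x[k:] / (len(x) * c0) for k in range(1, nlags + 1)])

# The ACF of the levels fails to damp, which would normally suggest differencing ...
print("levels :", np.round(sample_acf(z, 10), 2))
# ... but the once-differenced series, although stationary, is noninvertible here,
# because differencing removed the trend and also applied (1 - B) to the
# already-stationary AR(1) part (m > d).
print("diff   :", np.round(sample_acf(np.diff(z), 10), 2))
```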



Journal ArticleDOI
TL;DR: In this paper, an asymptotically valid test for first-order autoregressive errors is derived in a simultaneous system of equations context, and allows lagged endogenous variables to be present in the model.
Abstract: This paper derives an asymptotically valid test for first-order autoregressive errors. The test is derived in a simultaneous system of equations context, and allows lagged endogenous variables to be present in the model.

Journal ArticleDOI
TL;DR: The estimation of ARMA and Kalman-Bucy filter models of stationary time series in cases of complete parameter and covariance ignorance can be obtained by first identifying a corresponding pure AR (autoregressive) model of finite order.
Abstract: The estimation of ARMA (autoregressive moving-average) models and of Kalman-Bucy filter models of stationary time series in cases of complete parameter and covariance ignorance can be obtained by first identifying a corresponding pure AR (autoregressive) model of finite order. Since the correct AR model is, in general, of infinite order, a bias is introduced into an otherwise consistent estimation procedure. Upper bounds on this bias are estimated here in terms of a bound on the error in the corresponding innovations process. This bound is shown to converge to zero as the order of the AR model increases, or for the correct order if that order is finite. The upper bound thus also yields a criterion for evaluating the finite order used to approximate infinite AR models.
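A simple numerical illustration of the point above (not the paper's bound itself): an invertible MA(1) process has an infinite-order AR representation, and the innovation variance of the best finite-order AR predictor, computed here from the exact autocovariances via the Levinson-Durbin recursion, converges to the true innovation variance as the order grows.

```python
import numpy as np

# MA(1): x_t = e_t + theta*e_{t-1}, Var(e_t) = 1, so the true innovation variance is 1.
theta = 0.8
pmax = 20
gamma = np.zeros(pmax + 1)
gamma[0], gamma[1] = 1 + theta ** 2, theta  # exact autocovariances; zero beyond lag 1

a, sigma2 = np.array([]), gamma[0]          # Levinson-Durbin recursion
for p in range(1, pmax + 1):
    k = (gamma[p] - a @ gamma[1:p][::-1]) / sigma2
    a = np.concatenate([a - k * a[::-1], [k]])
    sigma2 *= 1.0 - k ** 2
    if p % 4 == 0:
        # the excess over 1.0 shrinks with the order of the approximating AR model
        print(f"AR order {p:2d}: innovation variance = {sigma2:.6f}")
```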

Journal ArticleDOI
TL;DR: A sequential algorithm is presented for the identification of discrete-time linear systems, based on decomposition of the autoregressive model using the multilevel hierarchical decomposition procedure and the stochastic approximation algorithm.

Journal ArticleDOI
TL;DR: In this paper, three different methods are compared by their ability to predict two periods ahead in simple autoregressive models with one lag; in spite of intuitive objections, the usual least squares method performs relatively well.
Abstract: Three different methods are compared by their ability to predict two periods ahead in simple autoregressive models with one lag. In this study both artificial and historical time series are used. In spite of intuitive objections, the usual least squares method performs relatively well. Moreover, attention is paid to the estimation results, as they provide some links with other studies of the autoregressive model.
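A hedged sketch of the comparison in the abstract. The three methods studied are not spelled out there, so this sketch contrasts just two natural candidates for two-period-ahead prediction in an AR(1): the usual least-squares "plug-in" forecast a_hat^2 * y_T and a direct least-squares regression of y_{t+2} on y_t; the simulation design is an assumption.

```python
import numpy as np

rng = np.random.default_rng(7)
a_true, N, reps = 0.7, 60, 2000
err_plugin, err_direct = [], []

for _ in range(reps):
    y = np.zeros(N + 2)
    for t in range(1, N + 2):
        y[t] = a_true * y[t - 1] + rng.normal()
    train, future = y[:N], y[N + 1]          # forecast target is two steps past y[N-1]

    a_hat = (train[:-1] @ train[1:]) / (train[:-1] @ train[:-1])   # usual LS estimate
    b_hat = (train[:-2] @ train[2:]) / (train[:-2] @ train[:-2])   # direct 2-step regression

    err_plugin.append((future - a_hat ** 2 * train[-1]) ** 2)
    err_direct.append((future - b_hat * train[-1]) ** 2)

print("plug-in MSE:", np.mean(err_plugin))
print("direct  MSE:", np.mean(err_direct))
```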

01 Sep 1975
TL;DR: A rich class of models, Autoregressive Integrated Moving Average (ARIMA) models, proposed by Box and Jenkins (1970), has a variety of industrial, economic and environmental applications.
Abstract: Development of modelling for discrete time series has resulted in many applications in different areas. In particular, a rich class of models, Autoregressive Integrated Moving Average (ARIMA) models, proposed by Box and Jenkins (1970), has a variety of industrial, economic and environmental applications.