
Showing papers on "STAR model" published in 1980


Journal ArticleDOI
TL;DR: In this paper, it is shown that there is an innovation process such that the sequence of random variables generated by the linear, additive first-order autoregressive scheme Xn = pXn−1 + ∊n is marginally distributed as gamma (λ, k) variables if 0 ≦ p ≦ 1.
Abstract: It is shown that there is an innovation process {∊n} such that the sequence of random variables {Xn} generated by the linear, additive first-order autoregressive scheme Xn = pXn−1 + ∊n is marginally distributed as gamma (λ, k) variables if 0 ≦ p ≦ 1. This first-order autoregressive gamma sequence is useful for modelling a wide range of observed phenomena. Properties of sums of random variables from this process are studied, as well as Laplace-Stieltjes transforms of adjacent variables and joint moments of variables with different separations. The process is not time-reversible and has a zero-defect which makes parameter estimation straightforward. Other positive-valued variables generated by the first-order autoregressive scheme are studied, as well as extensions of the scheme for generating sequences with given marginal distributions and negative serial correlations.

328 citations
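The additive scheme above can be simulated directly in the exponential special case (k = 1), where the innovation is known to take a simple form: ∊n is zero with probability p and Exp(λ) otherwise. A minimal sketch, assuming that special case; the helper name `ear1` and all parameter values are illustrative, not from the paper:

```python
import numpy as np

def ear1(n, p, lam, seed=0):
    """Simulate X_n = p*X_{n-1} + eps_n with exponential (k = 1) marginals.
    The innovation is 0 with probability p, Exp(lam) with probability 1 - p."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = rng.exponential(1.0 / lam)   # start at the stationary marginal
    for i in range(1, n):
        eps = 0.0 if rng.random() < p else rng.exponential(1.0 / lam)
        x[i] = p * x[i - 1] + eps
    return x

x = ear1(200_000, p=0.5, lam=2.0)
print(x.mean())                           # should settle near 1/lam
print(np.corrcoef(x[:-1], x[1:])[0, 1])   # lag-1 autocorrelation near p
```

With p = 0.5 and λ = 2 the sample mean settles near 1/λ = 0.5 and the lag-1 autocorrelation near p, matching the linear AR(1) structure.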


Journal ArticleDOI
TL;DR: A sequential type of recursive algorithm for identifying state‐dependent models is described, and it is shown how such models may be used for forecasting and for indicating specific types of non‐linear behaviour.
Abstract: . We construct a general class of non-linear models, called ‘state-dependent models’, which have a very flexible non-linear structure and which contain, as special cases, bilinear, threshold autoregressive, and exponential autoregressive models. We describe a sequential type of recursive algorithm for identifying state-dependent models, and show how such models may be used for forecasting and for indicating specific types of non-linear behaviour.

289 citations


Journal ArticleDOI
R. Kashyap1
TL;DR: The inconsistency of the Akaike information criterion (AIC) rule and its variants for estimating the unknown order of the autoregressive model obeyed by a time series is demonstrated.
Abstract: We demonstrate the inconsistency of the Akaike information criterion (AIC) rule and its variants for estimating the unknown order of the autoregressive model obeyed by a time series. We also consider the case of time series which may not obey AR models.

186 citations
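For context, the AIC rule whose inconsistency the paper demonstrates picks the order minimizing N·log(σ̂²_k) + 2k over candidate orders k. A minimal numpy sketch under that definition; the helpers `fit_ar_ls` and `aic_order` are illustrative, and least-squares fitting stands in for exact maximum likelihood:

```python
import numpy as np

def fit_ar_ls(x, k):
    """Least-squares fit of an AR(k) model; returns the residual variance."""
    n = len(x)
    X = np.column_stack([x[k - j - 1 : n - j - 1] for j in range(k)])
    y = x[k:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ coef)

def aic_order(x, max_order):
    """Order minimizing AIC = N*log(sigma2_k) + 2k."""
    n = len(x)
    scores = [n * np.log(fit_ar_ls(x, k)) + 2 * k for k in range(1, max_order + 1)]
    return 1 + int(np.argmin(scores))

# Simulate an AR(2) series and apply the rule.
rng = np.random.default_rng(1)
n = 5000
e = rng.standard_normal(n)
x = np.empty(n)
x[:2] = e[:2]
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + e[t]
print(aic_order(x, 8))
```

The constant penalty 2 per parameter is exactly what leaves a nonvanishing probability of overestimating the order, no matter how long the series.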


Journal ArticleDOI
TL;DR: An introduction to the identification, estimation, and diagnostic checking of discrete linear transfer functions to model the interrelationships between input and output time series.
Abstract: Time series analysis methods have been applied to a large number of practical problems, including modeling and forecasting economic time series and process and quality control. One aspect of time series analysis is the use of discrete linear transfer functions to model the interrelationships between input and output time series. This paper is an introduction to the identification, estimation, and diagnostic checking of these models. Some aspects of forecasting with transfer function models are also discussed. A survey of intervention analysis models in which the input series is an indicator variable corresponding to an isolated event thought to influence the output is also given. Familiarity with univariate autoregressive integrated moving average modeling is assumed. Extensions to more general multiple time series analysis methods are also briefly discussed.

56 citations


Journal ArticleDOI
TL;DR: In this article, the error made in predicting a first-order autoregressive process with unknown parameters is investigated and it is shown that the least squares predictor is unbiased for symmetric error distributions.

38 citations


Journal ArticleDOI
TL;DR: The random coefficient model applied to panel data in a time-series context is discussed and an analysis of a first-order autoregressive model is presented and illustrated by an example using real data.

37 citations


Journal ArticleDOI
TL;DR: In this paper, an iterative, asymptotically efficient method of estimation is presented for nonlinear regression functions whose errors arise from an autoregressive series, taking account of the correlation in the estimation procedure.
Abstract: Asymptotically efficient estimators are derived for nonlinear regression parameters when the errors have arisen from an autoregressive series with unknown parameters. An example is given for which this model is applicable, and parameters are estimated. When observations are collected sequentially over time, correlation is almost inevitable. In these situations, correlation between the residuals need not imply lack of fit of the regression function. When errors are correlated, estimation of the regression parameters by minimizing the sum of squares of the residuals is inefficient, and if the correlation is positive the estimates of their variances based on the assumed independence will be too small. It is therefore desirable to take account of the correlation in the estimation procedure. If n observations are correlated, generally there are ½n(n − 1) correlation parameters. This is far too large a number to estimate from a single series of observations. Even with replicated series, either n would need to be small or there would have to be extensive data. Provided that the data have been collected at regular fixed intervals of time, a reduction in parameters is achieved by assuming that the correlation between errors depends only on the time separation between them. Representation of the errors by an autoregressive series provides a means of reducing the number of parameters still further while remaining sufficiently flexible to model many series. Durbin (1960) proposed a two-step, asymptotically efficient method of estimation for the situation when the regression function is linear in its parameters and the error series arises from an autoregressive series. Many regression functions arising in science are nonlinear in their parameters, and this paper presents an iterative, asymptotically efficient method of estimation for such functions.

31 citations
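The abstract's point that correlation should enter the estimation can be illustrated, in the linear special case, by iterated quasi-differencing (the classical Cochrane-Orcutt scheme, related to Durbin's two-step method cited above). This is a hedged sketch of that idea, not the paper's nonlinear iterative estimator; all simulation settings are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
t = np.linspace(0, 10, n)

# Regression y = 1.5 + 0.7*t with AR(1) errors, coefficient 0.8.
e = np.empty(n)
e[0] = rng.standard_normal()
for i in range(1, n):
    e[i] = 0.8 * e[i - 1] + rng.standard_normal()
y = 1.5 + 0.7 * t + e

X = np.column_stack([np.ones(n), t])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # step 1: ordinary LS
for _ in range(5):                               # iterate to convergence
    r = y - X @ beta
    rho = (r[:-1] @ r[1:]) / (r[:-1] @ r[:-1])   # step 2: AR(1) coef of residuals
    ys = y[1:] - rho * y[:-1]                    # step 3: quasi-difference
    Xs = X[1:] - rho * X[:-1]
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print(beta, rho)
```

Quasi-differencing whitens the errors, so the final least-squares step is (asymptotically) as efficient as generalized least squares with known ρ.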


Journal ArticleDOI
TL;DR: A general form for the rate-distortion function is presented for the nonstationary Gaussian autoregressive (AR) process and is shown to differ from the well-known form for the asymptotically stationary process in a term corresponding to the log rate of the variance growth if the process has exponentially growing variance.
Abstract: A general form for the rate-distortion function is presented for the nonstationary Gaussian autoregressive (AR) process and is shown to differ from the well-known form for the asymptotically stationary process in a term corresponding to the log rate of the variance growth if the process has exponentially growing variance.

28 citations



Journal ArticleDOI
TL;DR: In this article, a method of estimating the parameters of an autoregressive model with real and equal roots in its characteristic equation is developed, which uses the serial autocorrelation function in the estimation process.

20 citations


Journal ArticleDOI
TL;DR: In this article, the maximum likelihood estimator of the parameters of a zero-mean normal stationary first-order autoregressive process is investigated, and it is shown that the likelihood function is uniquely maximized at a point in the interior of the parameter space.
Abstract: The maximum likelihood estimator of the parameters of a zero-mean normal stationary first-order autoregressive process is investigated. It is shown that the likelihood function is uniquely maximized at a point in the interior of the parameter space. A closed-form expression is obtained for the estimator.
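The closed-form expression itself is not reproduced here, but the exact stationary likelihood is easy to write down and maximize numerically once σ² is profiled out. A grid-search sketch; the function name and simulation settings are illustrative assumptions:

```python
import numpy as np

def ar1_profile_loglik(x, rho):
    """Exact stationary zero-mean Gaussian AR(1) log-likelihood with sigma^2
    profiled out (additive constants dropped)."""
    n = len(x)
    s = (1 - rho**2) * x[0]**2 + np.sum((x[1:] - rho * x[:-1])**2)
    return 0.5 * np.log(1 - rho**2) - 0.5 * n * np.log(s / n)

# Simulate a stationary AR(1) with coefficient 0.6 and locate the MLE on a grid.
rng = np.random.default_rng(3)
n = 4000
x = np.empty(n)
x[0] = rng.standard_normal() / np.sqrt(1 - 0.6**2)   # stationary start
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()

grid = np.linspace(-0.999, 0.999, 4001)
ll = [ar1_profile_loglik(x, r) for r in grid]
rho_hat = grid[int(np.argmax(ll))]
print(rho_hat)
```

The ½·log(1 − ρ²) term comes from the stationary distribution of the first observation; dropping it gives the conditional (least-squares) likelihood instead.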

Journal ArticleDOI
TL;DR: In this article, for finite order normal autoregressive models, sufficient conditions for the existence of maximum likelihood estimates are given, and some cases not satisfying the conditions are studied.
Abstract: In finite order normal moving average models the maximum likelihood estimates always exist. For finite order normal autoregressive models, sufficient conditions for the existence of maximum likelihood estimates are given. Some cases not satisfying the conditions are studied.

Journal ArticleDOI
TL;DR: In this article, the asymptotic variances of a spectral estimator for autoregressive moving average (ARMA) processes were derived independently by Kinkel, Tokumaru, and Kaveh.
Abstract: We derive the asymptotic variances of a spectral estimator for autoregressive moving average (ARMA) processes, proposed independently by Kinkel, Tokumaru, and Kaveh. The property of the variances is rather different from that of the well-known autoregressive spectral estimator.

Journal ArticleDOI
TL;DR: In this article, a proof is presented for establishing the convergence of least-squares (LS) identification algorithms when applied to autoregressive (AR) time-series models where some or all poles may be unstable, i.e., outside the unit circle in the complex z-plane.
Abstract: A proof is presented for establishing the convergence of least-squares (LS) identification algorithms when applied to autoregressive (AR) time-series models where some or all poles may be unstable, i.e., outside the unit circle in the complex z-plane. The only assumption on the time-series model is that its residual or driving sequence is a zero-mean uncorrelated (white noise) sequence with finite second moment which is second-moment-ergodic (SME). In cases where the SME condition cannot be established, the resulting identified parameters will relate to a model driven by an SME process which is the LS approximation to the actual process whose identification was sought.
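A quick numerical illustration of the convergence claim: ordinary least squares recovers an AR(1) pole even when it lies outside the unit circle. A sketch only; the pole value and sample size below are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
a = 1.2                      # pole outside the unit circle: the series explodes
x = np.empty(n)
x[0] = rng.standard_normal()
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.standard_normal()

# Least-squares AR(1) estimate from the (explosive) realization.
a_hat = (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])
print(a_hat)
```

The estimate is in fact extremely accurate here: for an explosive pole the regressor energy grows geometrically, so the LS error shrinks much faster than in the stable case.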

Journal ArticleDOI
TL;DR: Akaike's method for model identification has been used to identify Markov chain models for simple transformations of daily precipitation at three locations in southern Norway, and wind force and wave height at one location in the Norwegian Sea.
Abstract: Akaike's method for model identification has been used to identify Markov chain models for simple transformations of daily precipitation at three locations in southern Norway, and wind force and wave height at one location in the Norwegian Sea. Attempts at identification of the horizontal wind vector as an autoregressive process have also been made. The estimated order of a model appears to increase with the sample size. It may also have significant uncertainty. The analytical complexity of identified models may appear to be unnecessarily large for some purposes.



Proceedings ArticleDOI
01 Dec 1980
TL;DR: A family of consistent schemes for estimating the unknown order of the autoregressive model obeyed by a finite time series of length N, given only that it obeys a finite order AR model, is derived.
Abstract: We consider the estimation of the unknown order of the autoregressive (AR) model obeyed by a finite time series of length N given only that it obeys a finite order AR model. We derive a family of consistent schemes for estimating the unknown order. We give explicit upper bounds for the probability of error of the decision rules.
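The paper's particular schemes are not reproduced here, but a well-known example of a consistent family is the BIC-type rule, whose per-parameter penalty grows like log N rather than AIC's constant 2. A sketch under that assumption; the helpers `resid_var` and `bic_order` are illustrative:

```python
import numpy as np

def resid_var(x, k):
    """Residual variance of a least-squares AR(k) fit."""
    n = len(x)
    X = np.column_stack([x[k - j - 1 : n - j - 1] for j in range(k)])
    coef, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return np.var(x[k:] - X @ coef)

def bic_order(x, max_order):
    """Order minimizing N*log(sigma2_k) + k*log(N). The log-N penalty is what
    makes rules of this type consistent as N grows."""
    n = len(x)
    scores = [n * np.log(resid_var(x, k)) + k * np.log(n)
              for k in range(1, max_order + 1)]
    return 1 + int(np.argmin(scores))

# Simulate an AR(2) series and apply the rule.
rng = np.random.default_rng(5)
n = 5000
e = rng.standard_normal(n)
x = np.empty(n)
x[:2] = e[:2]
for t in range(2, n):
    x[t] = 0.4 * x[t - 1] + 0.3 * x[t - 2] + e[t]
print(bic_order(x, 10))
```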

Journal ArticleDOI
TL;DR: In this paper, continuous autoregressive models were used to describe the behavior of traffic indices from discretely sampled data, and second-order differential equation models were constructed to represent dynamic traffic fluctuations as the response of a linear system to a stochastic forcing function.
Abstract: This article discusses the use of continuous autoregressive models to describe the behavior of traffic indices. From discretely sampled data, second-order differential equation models are constructed to represent dynamic traffic fluctuations as the response of a linear system to a stochastic forcing function. The results are compared to the more common M/G/∞ queueing model approach, and the analysis is demonstrated on time series of aircraft concentration in thirty-one en route air traffic control sectors.

Journal ArticleDOI
TL;DR: In this note, the one-step prediction of autoregressive (AR) lognormal processes is considered and minimum-risk solutions of this problem are briefly discussed and then numerically compared using simulated data.
Abstract: In this note, the one-step prediction of autoregressive (AR) lognormal processes is considered. Minimum-risk solutions of this problem are briefly discussed and then numerically compared using simulated data.
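One concrete version of the comparison: if Yt = exp(Xt) with Xt a Gaussian AR(1), the naive predictor exp(ρxt) is biased low, while the conditional-mean (minimum mean-square-error) predictor adds σ²/2 in the exponent. A simulation sketch; all settings are illustrative assumptions and this is not necessarily the paper's risk comparison:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
rho, sigma = 0.7, 0.5

# Gaussian AR(1) driving process, started at its stationary distribution.
x = np.empty(n)
x[0] = rng.normal(0, sigma / np.sqrt(1 - rho**2))
for t in range(1, n):
    x[t] = rho * x[t - 1] + rng.normal(0, sigma)
y = np.exp(x)                                   # lognormal AR process

naive = np.exp(rho * x[:-1])                    # exponentiated AR point forecast
mmse = np.exp(rho * x[:-1] + 0.5 * sigma**2)    # conditional-mean predictor
err_naive = np.mean((y[1:] - naive) ** 2)
err_mmse = np.mean((y[1:] - mmse) ** 2)
print(err_naive, err_mmse)
```

The σ²/2 correction follows from the lognormal mean formula E[exp(Z)] = exp(μ + σ²/2), so the corrected predictor has the smaller mean squared error.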

01 Jul 1980
TL;DR: In this article, a general framework for analyzing estimates in nonlinear time series models is developed. Ergodic strictly stationary series are treated, and general conditions for strong consistency and asymptotic normality are derived both for conditional least squares and maximum likelihood type estimates.
Abstract: A general framework for analyzing estimates in nonlinear time series models is developed. Ergodic strictly stationary series are treated. General conditions for strong consistency and asymptotic normality are derived both for conditional least squares and maximum likelihood type estimates. Examples are taken from exponential autoregressive, random coefficient autoregressive, and bilinear time series models. Some nonstationary models and examples are treated in a sequel to this paper.

Journal ArticleDOI
J. Duvernoy1
TL;DR: Variations of the slope of handwriting in several hundreds of lines are predicted by using autoregressive models to define descriptors, such as the transition matrix, that support the construction of a further state-space model for handwriting.

Book ChapterDOI
01 Jan 1980
TL;DR: In this article, the authors discuss problems of prediction, filtering, and parameter estimation for simple discrete-time linear stochastic processes, such as autoregressive, moving average, autoregressive with moving average residuals, and autoregressive with superimposed error.
Abstract: This chapter discusses problems of prediction, that is, estimation of future random variables; filtering, that is, estimation of random variables in the presence of noise or super-imposed error; and parameter estimation for some simple discrete-time linear stochastic processes. By Wold's Decomposition Theorem, any regular stationary process without a singular component can be written as a linear process. The chapter describes only simple linear models such as autoregressive, moving average, autoregressive with moving average residuals, and autoregressive with superimposed error, that is, noise.

25 Sep 1980
TL;DR: General results in the theory of time series in m dimensions are obtained, thus providing a broad view applicable to the various models, and the interrelationships among the various types of moving average models are stressed.
Abstract: General results in the theory of time series in m dimensions are obtained, thus providing a broad view applicable to the various models. The interrelationships among the various types of moving average (MA) models, autoregressive (AR) models, and the general autoregressive moving average (ARMA) models are stressed.


01 Jan 1980
TL;DR: A family of consistent schemes for estimating the unknown order of the autoregressive (AR) model is derived, and explicit upper bounds for the probability of error of the decision rules are given.
Abstract: We consider the estimation of the unknown order of the autoregressive (AR) model obeyed by a finite time series of length N given only that it obeys a finite order AR model. We derive a family of consistent schemes for estimating the unknown order. We give explicit upper bounds for the probability of error of the decision rules.