
Showing papers on "Moving-average model published in 1993"


Journal ArticleDOI
TL;DR: In this paper, a test procedure for detecting overdifferencing or a moving average unit root in Gaussian autoregressive integrated moving average (ARIMA) models is proposed.
Abstract: Test procedures for detecting overdifferencing or a moving average unit root in Gaussian autoregressive integrated moving average (ARIMA) models are proposed. The tests can be used when an autoregressive unit root is a serious alternative but the hypothesis of primary interest implies stationarity of the observed time series. This is the case, for example, if one wishes to test the null hypothesis that a multivariate time series is cointegrated with a given theoretical cointegration vector. A priori knowledge of the mean value of the observations turns out to be crucial for the derivation of our tests. In the special case where the differenced series follows a first-order moving average process, the proposed tests are exact and can be motivated by local optimality arguments. Specifically, when the mean value of the series is a priori known, we can obtain a locally best invariant (LBI) test that is identical to a one-sided version of the Lagrange multiplier test. But when the mean value is a prior...

116 citations


Journal ArticleDOI
TL;DR: It is shown that, under some constraints, the impulse response of the system can be expressed as a linear combination of cumulant slices, and this result is used to obtain a well-conditioned linear method for estimating the MA parameters of a non-Gaussian process.
Abstract: A linear approach to identifying the parameters of a moving-average (MA) model from the statistics of the output is presented. First, it is shown that, under some constraints, the impulse response of the system can be expressed as a linear combination of cumulant slices. Then, this result is used to obtain a well-conditioned linear method for estimating the MA parameters of a non-Gaussian process. The linear combination of slices used to compute the MA parameters can be constructed from different sets of cumulants of different orders, providing a general framework in which all the statistics can be combined. It is not necessary to use second-order statistics (the autocorrelation slice), and therefore the proposed algorithm still provides consistent estimates in the presence of colored Gaussian noise. Another advantage of the method is that while most linear methods give totally erroneous estimates if the order is overestimated, the proposed approach does not require a previous estimation of the filter order. The simulation results confirm the good numerical conditioning of the algorithm and its improved performance in comparison with existing methods.

76 citations
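The cumulant-slice idea above can be illustrated numerically. For an MA(q) model driven by skewed i.i.d. noise, the third-order cumulant slice satisfies c3(q, k) = gamma3 * b(q) * b(k), so the impulse response follows from a ratio of slice values. The sketch below covers only that special case, not the paper's general linear-combination framework; the MA(2) coefficients, innovation distribution, and sample size are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
# skewed, zero-mean innovations (exponential shifted to mean zero)
e = rng.exponential(1.0, N) - 1.0

b = np.array([1.0, 0.9, 0.385])      # hypothetical MA(2) coefficients, b[0] = 1
x = np.convolve(e, b, mode="valid")  # x(t) = e(t) + 0.9 e(t-1) + 0.385 e(t-2)
x = x - x.mean()

def c3(x, i, j):
    # sample third-order cumulant E[x(t) x(t+i) x(t+j)] for a zero-mean series
    n = len(x) - max(i, j)
    return float(np.mean(x[:n] * x[i:i + n] * x[j:j + n]))

# for MA(q) with skewed noise, c3(q, k) = gamma3 * b[q] * b[k],
# so b[k] is recovered as a ratio of cumulant-slice values
q = 2
b_hat = np.array([c3(x, q, k) / c3(x, q, 0) for k in range(q + 1)])
```

Because only third-order statistics enter, additive Gaussian noise of any color would leave such estimates consistent, which is the property the abstract highlights.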


Journal ArticleDOI
TL;DR: In this paper, the authors use neural nets to construct nonlinear models that forecast the AL index given solar wind and interplanetary magnetic field (IMF) data, following a state space reconstruction approach, which is a nonlinear generalization of autoregressive moving-average models, and a nonlinear filter approach.
Abstract: We use neural nets to construct nonlinear models to forecast the AL index given solar wind and interplanetary magnetic field (IMF) data. We follow two approaches: (1) the state space reconstruction approach, which is a nonlinear generalization of autoregressive moving-average (ARMA) models, and (2) the nonlinear filter approach, which reduces to a moving average (MA) model in the linear limit. The database used here is that of Bargatze et al. (1985).

71 citations


Journal ArticleDOI
TL;DR: The asymptotic properties of the discrete Fourier transform of the estimated periodic autocovariance and autocorrelation functions are presented to obtain a parsimonious model representing a periodically stationary time series.
Abstract: This paper is concerned with the derivation of asymptotic distributions for the sample autocovariance and sample autocorrelation functions of periodic autoregressive moving-average processes, which are useful in modelling periodically stationary time series. In an effort to obtain a parsimonious model representing a periodically stationary time series, the asymptotic properties of the discrete Fourier transform of the estimated periodic autocovariance and autocorrelation functions are presented. Application of the asymptotic results to some specific models indicates their usefulness for model identification analysis.

58 citations
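To make the construction concrete, the following sketch estimates the lag-0 periodic autocovariance of a periodically stationary series and takes its discrete Fourier transform over the seasons; few large Fourier coefficients suggest a parsimonious description. The period and seasonal variances here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4                                    # period, e.g. quarterly data (hypothetical)
n_years = 5000
sigma = np.array([1.0, 2.0, 0.5, 1.5])   # hypothetical seasonal standard deviations
# periodically stationary white noise: variance depends on the season
x = rng.standard_normal((n_years, d)) * sigma

# estimated periodic autocovariance at lag 0: one variance per season s
gamma0 = (x ** 2).mean(axis=0)

# DFT over the seasons; the k = 0 coefficient is the season-averaged variance,
# higher-order coefficients capture the periodic structure
B = np.fft.fft(gamma0) / d
```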


Journal ArticleDOI
01 Jul 1993
TL;DR: In this article, a common foundation for spatial statistics and geostatistics is proposed, which is based on the conditional autoregressive (CAR) model of spatial statistics for estimating missing geo-referenced data.
Abstract: This paper seeks to continue the building of a common foundation for spatial statistics and geostatistics. Equations from the conditional autoregressive (CAR) model of spatial statistics for estimating missing geo-referenced data have been found to be exactly those best linear unbiased estimates obtained with the exponential semi-variogram model of kriging, but in terms of the inverse covariance matrix rather than the covariance matrix itself. Further articulation of such relations, between the moving average (MA) and simultaneous autoregressive (SAR) or autoregressive response (AR) models of spatial statistics, and, respectively, the linear and Gaussian semi-variogram models of kriging, is outlined. The exploratory graphical and numerical work summarized in this paper indicates the following: (a) there is evidence to pair the moving average and linear models; (b) the simultaneous autoregressive and autoregressive response models pair with a modified Bessel function of the second kind and order one rather than the Gaussian semi-variogram model; (c) both specification error and measurement error can give rise to the nugget effect discussed in geostatistics; (d) restricting estimation to a geographic subregion introduces edge effects that increasingly bias semi-variogram model parameter estimates as the degree of spatial autocorrelation increases toward its upper limit; and (e) the theoretical spectral density function for a simultaneous autoregressive model is a direct extension of that for the conditional autoregressive model.

45 citations
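As a small illustration of the kriging side of this pairing, here is the exponential semi-variogram model mentioned above (the CAR counterpart), with arbitrary sill, range, and nugget values; the nugget appears as a jump away from gamma(0) = 0:

```python
import numpy as np

def exp_semivariogram(h, sill=1.0, range_=10.0, nugget=0.2):
    """Exponential semi-variogram: gamma(h) = nugget + (sill - nugget) * (1 - exp(-h / range_)),
    with gamma(0) = 0 by definition (the nugget is a discontinuity at the origin)."""
    h = np.asarray(h, dtype=float)
    gamma = nugget + (sill - nugget) * (1.0 - np.exp(-h / range_))
    return np.where(h == 0.0, 0.0, gamma)

lags = np.array([0.0, 1.0, 10.0, 100.0])   # example lag distances
gamma = exp_semivariogram(lags)             # rises from the nugget toward the sill
```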


01 Jan 1993
TL;DR: In this article, a chaotic moving average model that represents the time series as a convolution of a linear filter and an uncorrelated chaotic process was used to distinguish between chaos and randomness.
Abstract: Thirteen tree-ring series from the Salt and Verde River basins in Arizona are studied to distinguish between chaos and randomness. A chaotic moving average model that represents the time series as a convolution of a linear filter and an uncorrelated chaotic process is used. The results are compared to those given by the Grassberger-Procaccia algorithm (GPA) which can also identify the presence of low-dimensional chaotic dynamics.

14 citations
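The Grassberger-Procaccia algorithm estimates a correlation dimension from the scaling of the correlation sum C(r) with r. A minimal sketch, using points on a line (correlation dimension 1) rather than tree-ring data:

```python
import numpy as np

def correlation_sum(X, r):
    """Grassberger-Procaccia correlation sum C(r): fraction of point pairs
    whose distance is below r."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    iu = np.triu_indices(len(X), k=1)      # distinct pairs only
    return float(np.mean(D[iu] < r))

# points spread evenly along a line in the plane: correlation dimension ~ 1
t = np.linspace(0.0, 1.0, 400)
X = np.column_stack([t, t])

# the slope of log C(r) against log r estimates the correlation dimension
r1, r2 = 0.05, 0.1
slope = np.log(correlation_sum(X, r2) / correlation_sum(X, r1)) / np.log(r2 / r1)
```

For a genuinely low-dimensional chaotic series, the same slope computed on delay-embedded data would saturate as the embedding dimension grows, whereas for random data it keeps increasing.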


Book ChapterDOI
01 Jun 1993
TL;DR: In this article, the authors provide a quick introduction to the subject of spectral analysis, focusing on the notion of a time series, a set of observations made sequentially in time.
Abstract: Introduction This chapter provides a quick introduction to the subject of spectral analysis. Except for some later references to the exercises of Section 1.6, this material is independent of the rest of the book and can be skipped without loss of continuity. Our intent is to use some simple examples to motivate the key ideas. Since our purpose is to view the forest before we get lost in the trees, the particular analysis techniques we use here have been chosen for their simplicity rather than their appropriateness. Some Aspects of Time Series Analysis Spectral analysis is part of time series analysis, so the natural place to start our discussion is with the notion of a time series. The quip (attributed to R. A. Fisher) that a time series is ‘one damned thing after another’ is not far from the truth: loosely speaking, a time series is a set of observations made sequentially in time. Examples abound in the real world, and Figures 2 and 3 show plots of small portions of four actual time series: the speed of the wind in a certain direction at a certain location, measured every 0.025 second; the monthly average measurements related to the flow of water in the Willamette River at Salem, Oregon; the daily record of a quantity (to be precise, the change in average daily frequency) that tells how well an atomic clock keeps time on a day to day basis (a constant value of 0 would indicate that the clock agreed perfectly with a time scale maintained by the U. S. Naval Observatory); and […]

9 citations
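A first spectral-analysis tool in the spirit of this introduction is the periodogram; the sketch below locates the frequency of a sinusoid buried in noise (frequency, sample size, and noise level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, f0 = 1024, 0.125                      # sample count and true frequency (cycles/sample)
t = np.arange(n)
x = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.standard_normal(n)

# periodogram: squared magnitude of the DFT, scaled by 1/n
freqs = np.fft.rfftfreq(n)
pgram = np.abs(np.fft.rfft(x)) ** 2 / n
f_peak = freqs[np.argmax(pgram)]         # frequency of the dominant peak
```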


Journal ArticleDOI
TL;DR: In this article, a sequence of repeated survey estimates of a population mean is approached as a time series subject to measurement error and the error is modeled as a moving average process, and an asymptotically efficient estimation method for the parameters of Xt based upon the properties of the least squares estimators of the model is proposed.
Abstract: A sequence of repeated survey estimates of a population mean Xt is approached as a time series subject to measurement error ut . In rotation sampling designs the error ut can be modeled as a moving average process. For an autoregressive process Xt we propose an asymptotically efficient estimation method for the parameters of Xt based upon the properties of the least squares estimators of the model Yt = Xt + ut . A Monte Carlo study examining the proposed estimation methods was conducted.

7 citations
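The attenuation effect that motivates this estimation problem is easy to reproduce: when Yt = Xt + ut with an AR(1) signal Xt and MA(1) measurement error ut, the lag-1 autocorrelation computed from Yt understates that of Xt. A simulation sketch with invented parameter values:

```python
import numpy as np

rng = np.random.default_rng(3)
n, phi = 100_000, 0.8
# latent AR(1) population-mean series X_t
X = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    X[t] = phi * X[t - 1] + eps[t]

# survey measurement error as an MA(1) process (as in rotation designs)
v = rng.standard_normal(n)
u = v + 0.5 * np.roll(v, 1)
Y = X + u                        # observed survey estimates

def acf1(z):
    z = z - z.mean()
    return float(np.dot(z[:-1], z[1:]) / np.dot(z, z))
# measurement error attenuates the observed lag-1 autocorrelation: acf1(Y) < acf1(X)
```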


Book ChapterDOI
Joseph L. Hellerstein1
10 May 1993
TL;DR: The autoregressive, moving average (ARMA) model is discussed in detail, with an emphasis on identifying time series models from measurement data using the autocorrelation and partial autocorrelation functions.
Abstract: The need to model dynamic behavior in information systems arises in many contexts, such as characterizing the locality of file access patterns, evaluating the dynamic behavior of scheduling algorithms, and identifying performance problems by their time-serial behavior. This paper provides an introduction to time series analysis (a statistical technique) and applies it to analyzing the performance of information systems. The autoregressive, moving average (ARMA) model is discussed in detail, with an emphasis on identifying time series models from measurement data using the autocorrelation and partial autocorrelation functions. The paper concludes with a case study in which time series analysis is used to diagnose a performance problem in a large computer system.

5 citations
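The identification recipe described here rests on the fact that the ACF of an MA(q) process cuts off after lag q (while an AR process's PACF cuts off instead). A minimal check on simulated MA(1) data, with an arbitrary coefficient:

```python
import numpy as np

rng = np.random.default_rng(4)
n, theta = 50_000, 0.6
e = rng.standard_normal(n + 1)
x = e[1:] + theta * e[:-1]        # MA(1): theoretical ACF is 0 beyond lag 1

def acf(x, nlags):
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) / c0
                     for k in range(nlags + 1)])

r = acf(x, 5)
# r[1] should be near theta / (1 + theta**2) = 0.441; r[2], r[3], ... near zero
```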


Journal ArticleDOI
TL;DR: In this paper, the least absolute deviation (LAD) estimation of Box-Jenkins models for single time series is considered and the LAD method is applied to a multiplicative seasonal moving average model for monthly rice sales data and to US airline passenger data.

5 citations
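LAD estimation minimizes the sum of absolute one-step innovations instead of their squares. For an invertible MA(1) the innovations can be reconstructed recursively, so a crude grid search already works; this sketch is a toy stand-in for the paper's method, with invented data:

```python
import numpy as np

rng = np.random.default_rng(7)
n, theta = 5_000, 0.5
eps = rng.standard_normal(n + 1)
x = eps[1:] + theta * eps[:-1]            # simulated MA(1) observations

def lad_loss(th, x):
    """Sum of absolute one-step innovations for an invertible MA(1)."""
    e, total = 0.0, 0.0
    for xt in x:
        e = xt - th * e                   # invert the MA filter recursively
        total += abs(e)
    return total

grid = np.linspace(-0.9, 0.9, 181)        # crude grid over the invertible region
theta_hat = grid[np.argmin([lad_loss(th, x) for th in grid])]
```

In practice the LAD criterion would be minimized with a proper optimizer, and its appeal (as in the paper) is robustness to heavy-tailed innovations.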


Journal ArticleDOI
TL;DR: In this paper, the likelihood function of the zero-mean Gaussian MA(1) model can be expressed in terms of the variance of the process and the first-order autocorrelation or, alternatively, the variance of the unobservable independent normal random variables and the moving average coefficient.


Journal ArticleDOI
TL;DR: Bonferroni-type inequalities are used to approximate probabilities of the joint distribution of residual autocorrelation coefficients from an autoregressive moving-average time series model and can be used to find critical values of a test of whether the largest residual autocorrelation is significantly different from zero.
Abstract: Bonferroni-type inequalities are used to approximate probabilities of the joint distribution of residual autocorrelation coefficients from an autoregressive moving-average time series model. The approximations are useful for testing the goodness of fit of the model: they can be used to find critical values of a test of whether the largest residual autocorrelation is significantly different from zero. The approximation based on the first-order Bonferroni inequality is simple to use and adequate in practice.
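The first-order Bonferroni approximation is easy to apply: each of the m residual autocorrelations is roughly N(0, 1/n) under a correctly specified model (the variance is in fact somewhat smaller at low lags), so the critical value for the largest one is approximately z(1 - alpha/2m) / sqrt(n). A sketch using only the standard library:

```python
from statistics import NormalDist

def bonferroni_critical_value(n, m, alpha=0.05):
    """First-order Bonferroni critical value for the largest of m residual
    autocorrelations, each treated as approximately N(0, 1/n)."""
    z = NormalDist().inv_cdf(1 - alpha / (2 * m))
    return z / n ** 0.5

# e.g. series of length 200, screening the first 20 residual autocorrelations
c = bonferroni_critical_value(n=200, m=20)
```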

Journal ArticleDOI
TL;DR: In this paper, the evaluation of certain quadratic forms and traces associated with the first-order moving average model is dealt with, the problem having arisen while considering the maximum likelihood estimation under normality of the parameters of this model.
Abstract: This paper deals with the evaluation of certain quadratic forms and traces associated with the first-order moving average model. The problem arose while considering the maximum likelihood estimation under normality of the parameters of this model. The quadratic forms are y'R^(-j)y, where y is a vector of observations generated by the model and R is the correlation matrix of the model; the traces are tr R^(-j). Here j can be any natural number, but emphasis is placed on small values, j = 1, 2, 3. Procedures in the time and frequency domains are studied, and the amount of computation needed in each case is considered and compared, from which a preferred approach emerges. The computations are compared with several alternative procedures suggested in the literature.
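For the MA(1) model the correlation matrix R is tridiagonal, with off-diagonal entries rho1 = theta / (1 + theta^2), so the quadratic forms and traces in question can be evaluated directly for small series (the paper's concern is doing this efficiently for large n). A brute-force sketch with arbitrary theta and n:

```python
import numpy as np

theta, n = 0.5, 6
rho1 = theta / (1 + theta ** 2)   # lag-1 autocorrelation of MA(1); higher lags are zero
# tridiagonal correlation matrix R of the MA(1) model
R = np.eye(n) + rho1 * (np.eye(n, k=1) + np.eye(n, k=-1))

Rinv = np.linalg.inv(R)
y = np.arange(1.0, n + 1)         # illustrative observation vector
quad = float(y @ Rinv @ y)        # quadratic form y' R^(-1) y
tr1 = float(np.trace(Rinv))       # trace tr R^(-1)
```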

Journal ArticleDOI
TL;DR: In this article, the limiting distribution of the least squares estimate of the derived process of a noninvertible and nearly noninvertible moving average model with infinite variance innovations is established as a functional of a Lévy process.
Abstract: The limiting distribution of the least squares estimate of the derived process of a noninvertible and nearly noninvertible moving average model with infinite variance innovations is established as a functional of a Lévy process. The form of the limiting law depends on the initial value of the innovation and the stable index α. This result enables one to perform asymptotic testing for the presence of a unit root in a noninvertible moving average model through the constructed derived process under the null hypothesis. It provides not only a parallel analog of its autoregressive counterparts, but also a useful alternative for determining "over-differencing" in time series that exhibit heavy-tailed phenomena.

Journal ArticleDOI
TL;DR: In this paper, the minimax spectral characteristics of the optimal estimate of the transformation Aξ and the least favourable spectral densities for various classes of densities are found.
Abstract: The problem of estimating the transformation Aξ of a stationary stochastic process ξ(t) with values in a Hilbert space, from observations of ξ(t) for t ≤ 0, is considered. The minimax spectral characteristics of the optimal estimate of the transformation Aξ and the least favourable spectral densities for the various classes of densities are found. Denote by X a separable Hilbert space with inner product (x, y) and an orthonormal basis {e_k : k = 1, 2, ...}. A stochastic process ξ(t) with values in X is stationary if its components ξ_k(t) = (ξ(t), e_k) are mean-square continuous and satisfy the conditions from [1, 2]: E ξ_k(t) = 0 and E ||ξ(t)||² = Σ_k E |ξ_k(t)|² < ∞.

Journal Article
TL;DR: In this paper, the authors applied time series techniques to Ghara-Aghaj flow records in order to generate forecast values of the mean monthly river flows; spectral analyses of the data and exhaustive statistical tests are also provided.
Abstract: Time series techniques are applied to Ghara-Aghaj flow records in order to generate forecast values of the mean monthly river flows. The study of the data and its correlogram shows the effect of seasonality and provides no evidence of trend. The autoregressive models of order one and two (AR1, AR2), the moving average model of order one, and the ARMA(1,1) model are fitted to the stationary series, where the AR2 model yields results which are statistically compatible with the past records. For a better understanding of the behavior, spectral analyses of the data and exhaustive statistical tests are also provided. Synthetic data generated by the AR2 model together with historical data are then used to obtain reservoir storage for varying design periods and draft rates. For this, Stall's method, which is a more sophisticated form of the mass curve, is employed. Results are given as two design curves.
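Fitting an AR(2) as done here can be sketched via the Yule-Walker equations, which relate the coefficients to the first two autocovariances. The simulation below uses invented coefficients rather than the Ghara-Aghaj records:

```python
import numpy as np

rng = np.random.default_rng(5)
n, phi1, phi2 = 100_000, 0.6, 0.25        # hypothetical stationary AR(2)
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(2, n):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + e[t]

def autocov(x, k):
    x = x - x.mean()
    return float(np.dot(x[:len(x) - k], x[k:]) / len(x))

# Yule-Walker equations for AR(2): [[g0, g1], [g1, g0]] @ (phi1, phi2) = (g1, g2)
g0, g1, g2 = autocov(x, 0), autocov(x, 1), autocov(x, 2)
phi_hat = np.linalg.solve(np.array([[g0, g1], [g1, g0]]), np.array([g1, g2]))
```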

01 Jan 1993
TL;DR: The robust estimation of signal autoregressive moving average (ARMA) model parameters in noise, by solving a normal equation with singular value decomposition (SVD) and spectral factorization in the frequency domain.
Abstract: This paper is concerned with the robust estimation of signal auto regressive and moving average (ARMA) model parameters in noise. The original noisy measurement with high sampling rate is prefiltered by using a moving average process. The fast modes of signal are obtained by using the leading section of the data and the closely spaced slow modes are calculated from the decimated trailing section of the data. The MA coefficients are extracted from the estimated AR coefficients by solving a normal equation with singular value decomposition (SVD) and spectral factorization in frequency domain.

Proceedings ArticleDOI
27 Apr 1993
TL;DR: The authors consider the robust estimation of signal autoregressive and moving average (ARMA) model parameters in noise by solving a normal equation with singular value decomposition (SVD) and spectral factorization in the frequency domain.
Abstract: The authors consider the robust estimation of signal autoregressive and moving average (ARMA) model parameters in noise. The original noisy measurement with high sampling rate is prefiltered by using a moving average process. The fast modes of the signal are obtained by using the leading section of the data and the closely spaced slow modes are calculated from the decimated trailing section of the data. The MA coefficients are extracted from the estimated AR coefficients by solving a normal equation with singular value decomposition (SVD) and spectral factorization in the frequency domain.
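One piece of this pipeline, solving an overdetermined linear-prediction (normal-equation) system by SVD-based least squares, can be sketched as follows; the MA(1) signal, noise level, and AR order are invented, and the subsequent MA extraction by spectral factorization is omitted:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20_000
e = rng.standard_normal(n)
x = np.convolve(e, [1.0, 0.7], mode="valid")   # hypothetical MA(1) signal
x = x + 0.1 * rng.standard_normal(len(x))      # additive measurement noise

p = 8   # long AR approximation order for the MA signal
# covariance-method prediction system: predict x[t] from x[t-1], ..., x[t-p]
A = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
b = x[p:]
a, *_ = np.linalg.lstsq(A, b, rcond=None)      # SVD-based least-squares solve
# a should approximate the AR(inf) expansion of the MA(1): 0.7, -0.49, 0.343, ...
```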