
Showing papers on "Moving-average model published in 2000"


01 Jan 2000
TL;DR: In this article, the autoregressive estimation for periodically correlated processes, using the parameterization given by the partial autocorrelation function, is considered, and a comparison with other methods is made.
Abstract: We consider the autoregressive estimation for periodically correlated processes, using the parameterization given by the partial autocorrelation function. We propose an estimation of these parameters by extending the sample partial autocorrelation method to this situation. The comparison with other methods is made. Relationships with the stationary multivariate case are discussed.

265 citations
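The periodic-correlation setting above can be illustrated with a toy periodic AR(1), where the autoregressive coefficient alternates with the season. This is a minimal sketch under assumed coefficient values, using plain per-season least squares rather than the paper's partial-autocorrelation parameterization:

```python
import numpy as np

# Toy PAR(1) with period P = 2: X_t = phi[t % 2] * X_{t-1} + eps_t.
# The coefficients phi = [0.8, -0.3] are illustrative assumptions.
rng = np.random.default_rng(0)
phi = [0.8, -0.3]
n = 20000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t % 2] * x[t - 1] + rng.standard_normal()

# Estimate one least-squares AR coefficient per season.
est = []
for s in (0, 1):
    idx = np.arange(1, n)[np.arange(1, n) % 2 == s]
    est.append(np.sum(x[idx] * x[idx - 1]) / np.sum(x[idx - 1] ** 2))
print(est)  # close to [0.8, -0.3]
```

Each season contributes its own regression of X_t on X_{t-1}, which is the simplest instance of treating a periodically correlated process as seasonally varying autoregressions.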


Book
04 Dec 2000
TL;DR: In this paper, the authors present an overview of the major topics in univariate and multivariate time series analysis, including ARMA modeling, prediction and model selection, outliers and missing data, seasonal adjustment, nonlinear and Bayesian methods, and vector ARMA models.
Abstract: Introduction (D. Peña & G. Tiao). BASIC CONCEPTS IN UNIVARIATE TIME SERIES. Univariate Time Series: Autocorrelation, Linear Prediction, Spectrum, State Space Model (G. Wilson). Univariate Autoregressive Moving Average Models (G. Tiao). Model Fitting and Checking, and the Kalman Filter (G. Wilson). Prediction and Model Selection (D. Peña). Outliers, Influential Observations and Missing Data (D. Peña). Automatic Modeling Methods for Univariate Series (V. Gomez & A. Maravall). Seasonal Adjustment and Signal Extraction in Economic Time Series (V. Gomez & A. Maravall). ADVANCED TOPICS IN UNIVARIATE TIME SERIES. Heteroscedastic Models (R. Tsay). Nonlinear Time Series Models (R. Tsay). Bayesian Time Series Analysis (R. Tsay). Nonparametric Time Series Analysis: Nonparametric Regression, Locally Weighted Regression, Autoregression and Quantile Regression (S. Heiler). Neural Networks (K. Hornik & F. Leisch). MULTIVARIATE TIME SERIES. Vector ARMA Models (G. Tiao). Cointegration in the VAR Model (S. Johansen). Multivariate Linear Systems (M. Deistler). References. Index.

254 citations


01 Jan 2000
TL;DR: In this article, the authors assume that the data come from a stationary time series (Xt) and that the finite-dimensional distributions of this sequence are invariant under shifts of time.
Abstract: When studying a real-life time series, it is frequently reasonable to assume, possibly after a suitable transformation, that the data come from a stationary time series (Xt). This means that the finite-dimensional distributions of this sequence are invariant under shifts of time. Various stationary time series models have been studied in detail in the literature. A standard assumption is that the time series is Gaussian or, more generally, that it has a probability distribution with light tails, in the sense that P(lXtl > x) decays to zero at least exponentially.

61 citations
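The two properties invoked above, shift-invariance and light tails, can be checked empirically on a simulated Gaussian AR(1). This is a minimal sketch with assumed parameters, comparing sample moments over disjoint time windows and counting extreme exceedances:

```python
import numpy as np

# Stationary Gaussian AR(1): X_t = 0.5 * X_{t-1} + eps_t (illustrative phi).
rng = np.random.default_rng(1)
phi, n = 0.5, 200000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

# Shift-invariance: sample variances over two disjoint windows should agree
# (both near the stationary variance 1 / (1 - phi^2) = 1.333...).
w1, w2 = x[1000:100000], x[100000:199000]
print(np.var(w1), np.var(w2))

# Light tails: for a Gaussian X_t, P(|X_t| > x) decays like exp(-x^2 / 2s^2),
# so exceedances of 4 standard deviations are very rare.
print(np.mean(np.abs(x) > 4 * x.std()))
```

A heavy-tailed alternative (e.g. Student-t innovations with few degrees of freedom) would show a markedly larger exceedance frequency, which is exactly the distinction the light-tail assumption rules out.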


Journal ArticleDOI
TL;DR: In this paper, the authors study the long-time behaviour of the partial autocorrelation function of a real, zero-mean, purely nondeterministic weakly stationary process.
Abstract: The purpose of this paper is to study the long-time behaviour of the partial autocorrelation function of a stationary process. Let {Xn} = {Xn : n ∈ Z} be a real, zero-mean, weakly stationary process, defined on a probability space (Ω,F , P ), which we shall simply call a stationary process . Throughout this paper, we assume that {Xn} is purely nondeterministic (see §2). The autocovariance function γ(·) of {Xn} is defined by γ(n) = E[Xk Xk+n], n ∈ Z.

43 citations
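The partial autocorrelation function studied above can be computed from the autocovariances via the standard Durbin-Levinson recursion. As a hedged sketch (not the paper's asymptotic analysis), the PACF of an AR(1) with coefficient phi is phi at lag 1 and zero at all higher lags:

```python
import numpy as np

def pacf_from_acov(r, nlags):
    """PACF alpha(1..nlags) from autocovariances r[0..nlags] via Durbin-Levinson."""
    phi_prev = np.array([r[1] / r[0]])      # phi_{1,1}
    v = r[0] * (1 - phi_prev[0] ** 2)       # one-step prediction error variance
    alpha = [phi_prev[0]]
    for k in range(2, nlags + 1):
        # phi_{k,k} = (r[k] - sum_j phi_{k-1,j} r[k-j]) / v_{k-1}
        a = (r[k] - phi_prev @ r[k - 1:0:-1]) / v
        phi_prev = np.concatenate([phi_prev - a * phi_prev[::-1], [a]])
        v *= (1 - a ** 2)
        alpha.append(a)
    return np.array(alpha)

phi = 0.6
r = phi ** np.arange(6)          # AR(1) autocovariances, unit scale (assumed example)
print(pacf_from_acov(r, 5))      # [0.6, 0, 0, 0, 0] up to rounding
```

The abrupt cut-off at the AR order is the textbook contrast to the slow decay of the autocovariances themselves; the paper's question is what happens to alpha(k) for large k in general purely nondeterministic processes.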


Patent
Qingli Liu1, Aleksandar Purkovic1
04 Apr 2000
TL;DR: In this article, a multichannel Levinson algorithm for auto-regressive moving average (ARMA) modeling of the channel impulse response is used to determine the TEQ order and TEQ coefficients.
Abstract: A system, device, and method for time-domain equalizer (TEQ) training determines the TEQ order and TEQ coefficients by applying the multichannel Levinson algorithm for auto-regressive moving average (ARMA) modeling of the channel impulse response. Specifically, the TEQ is trained based upon a received training signal. The received training signal and knowledge of the transmitted training signal are used to derive an autocorrelation matrix that is used in formulating the multichannel ARMA model. The parameters of the multichannel ARMA model are estimated via a recursive procedure using the multichannel Levinson algorithm. Starting from a sufficiently high-order model with a fixed pole-zero difference, the TEQ coefficients corresponding to a low-order model are derived from those of a high-order model.

25 citations


Posted Content
TL;DR: In this paper, the autocorrelation structure of the Exponential GARCH(p,q) process of Nelson (1991) is considered, and it is seen that the EGARCH(p,q) model has a richer autocorrelation structure than the standard GARCH(p,q) one.
Abstract: In this paper the autocorrelation structure of the Exponential GARCH(p,q) process of Nelson (1991) is considered. Conditions for the existence of any arbitrary unconditional moment are given. Furthermore, the expressions for the kurtosis and the autocorrelations of squared observations are derived. The properties of the autocorrelation structure are discussed and compared to those of the standard GARCH(p,q) process. In particular, it is seen that, the EGARCH(p,q) model has a richer autocorrelation structure than the standard GARCH(p,q) one. The statistical theory is further illustrated by a few special cases such as the symmetric and the asymmetric EGARCH(2,2) models under the assumption of normal errors or non-normal errors. The autocorrelations computed from an estimated EGARCH(2,1) model of Nelson (1991) are highlighted.

15 citations
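The autocorrelation of squared observations discussed above can be seen directly by simulation. This is an illustrative sketch of Nelson's EGARCH(1,1), a special case of the EGARCH(p,q) family studied in the paper, with assumed parameter values:

```python
import numpy as np

# EGARCH(1,1) with normal errors:
# log s2_t = omega + beta * log s2_{t-1} + alpha * (|z_{t-1}| - E|z|) + gamma * z_{t-1}
# where E|z| = sqrt(2/pi) for standard normal z. Parameters are illustrative.
rng = np.random.default_rng(2)
omega, beta, alpha, gamma = -0.1, 0.95, 0.2, -0.1
n = 50000
z = rng.standard_normal(n)
logs2 = np.zeros(n)
for t in range(1, n):
    logs2[t] = (omega + beta * logs2[t - 1]
                + alpha * (abs(z[t - 1]) - np.sqrt(2 / np.pi))
                + gamma * z[t - 1])
x = np.exp(0.5 * logs2) * z      # observations x_t = sigma_t * z_t

# Lag-1 sample autocorrelation of squared observations.
x2 = x ** 2 - np.mean(x ** 2)
acf1 = np.mean(x2[1:] * x2[:-1]) / np.mean(x2 ** 2)
print(acf1)  # positive: volatility clustering in squared returns
```

The negative gamma injects the leverage-style asymmetry (negative shocks raise volatility more) that gives EGARCH its richer autocorrelation structure relative to standard GARCH.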


Journal ArticleDOI
TL;DR: This work considers the autoregressive estimation for periodically correlated processes, using the parametrization given by the partial autocorrelation function, and proposes an estimation of these parameters by extending the sample partial autocorrelation method to this situation.
Abstract: We consider the autoregressive estimation for periodically correlated processes, using the parametrization given by the partial autocorrelation function. We propose an estimation of these parameters by extending the sample partial autocorrelation method to this situation. A comparison with other methods is made. Relationships with the stationary multivariate case are discussed.

14 citations


Journal ArticleDOI
Xavier de Luna1
TL;DR: In this article, conditional prediction intervals for autoregression forecasts are proposed whose simple implementation should enable wide use; a simulation study illustrates the improvement over classical intervals in terms of empirical coverage.
Abstract: The variability of parameter estimates is commonly neglected when constructing prediction intervals based on a parametric model for a time series. This practice is due to the complexity of conditioning the inference on information such as observed values of the underlying stochastic process. In this paper, conditional prediction intervals when using autoregression forecasts are proposed whose simple implementation will hopefully enable wide use. A simulation study illustrates the improvement over classical intervals in terms of empirical coverage.

12 citations
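For contrast with the paper's proposal, here is the classical one-step prediction interval from a fitted AR(1), the kind of interval that neglects the variability of the parameter estimate. A minimal sketch with assumed parameters:

```python
import numpy as np

# Simulate an AR(1) with phi = 0.7 (illustrative), fit by least squares,
# and build the classical 95% one-step interval that treats phi_hat as known.
rng = np.random.default_rng(3)
phi_true, n = 0.7, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.standard_normal()

phi_hat = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)
resid = x[1:] - phi_hat * x[:-1]
sigma_hat = resid.std(ddof=1)

forecast = phi_hat * x[-1]
lo, hi = forecast - 1.96 * sigma_hat, forecast + 1.96 * sigma_hat
print((lo, hi))  # classical interval; ignores uncertainty in phi_hat
```

Because phi_hat itself is random, intervals like this tend to undercover slightly in small samples; conditioning on the observed trajectory, as the paper proposes, is one way to repair the empirical coverage.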


01 Jan 2000
TL;DR: In this paper, the authors fit a threshold autoregressive (TAR) model to time series data of monthly coconut oil prices at the Cochin market; the results favour the TAR process over a simple autoregressive model.
Abstract: In this paper we try to fit a threshold autoregressive (TAR) model to time series data of monthly coconut oil prices at the Cochin market. The procedure proposed by Tsay [7] for fitting the TAR model is briefly presented. The fitted model is compared with a simple autoregressive (AR) model. The results are in favour of the TAR process. Thus the monthly coconut oil prices exhibit a type of non-linearity which can be accounted for by a threshold model.

7 citations
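The kind of regime-switching non-linearity described above can be sketched with a two-regime TAR(1). This is an illustrative simulation with an assumed threshold and coefficients, not the Cochin coconut-oil fit; each regime is estimated by least squares on the observations whose lagged value falls in that regime:

```python
import numpy as np

# Two-regime TAR(1) with threshold r = 0:
#   x_t = 0.8 * x_{t-1} + e_t  if x_{t-1} <= r
#   x_t = -0.4 * x_{t-1} + e_t otherwise        (coefficients are assumptions)
rng = np.random.default_rng(4)
n, r = 20000, 0.0
phi_lo, phi_hi = 0.8, -0.4
x = np.zeros(n)
for t in range(1, n):
    coef = phi_lo if x[t - 1] <= r else phi_hi
    x[t] = coef * x[t - 1] + rng.standard_normal()

# Per-regime least-squares AR(1) estimates (threshold taken as known).
est = {}
for name, mask in [("low", x[:-1] <= r), ("high", x[:-1] > r)]:
    num = np.sum(x[1:][mask] * x[:-1][mask])
    den = np.sum(x[:-1][mask] ** 2)
    est[name] = num / den
print(est)  # close to 0.8 and -0.4
```

In practice the threshold and delay are unknown; Tsay's procedure, cited in the abstract, tests for threshold non-linearity and locates the threshold before this regime-wise fitting step.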


Proceedings ArticleDOI
29 Oct 2000
TL;DR: In this article, a different approach is presented, with simulation results, in which the initial white Gaussian process is replaced by a scaled degraded image, avoiding the optimization problems.
Abstract: Almost all parameter estimation schemes for image restoration to date attempt to model the true image as an autoregressive model and the point spread function as a moving average model, and assume the symmetry of the point spread function in order to reduce the computational complexity. The autoregressive process builds the true image by passing a Gaussian white noise process through a filter, and may result in unstable systems; the optimization of parameters could be trapped in local minima. In this article a different approach is presented, with simulation results, where the initial white Gaussian process is replaced by a scaled degraded image, avoiding these optimization problems.

4 citations


Proceedings ArticleDOI
12 Dec 2000
TL;DR: Time series analysis is reformulated to allow processing of segmented data and the finite sample theory required for order selection of AR models has been generalized to segments of data.
Abstract: Time series analysis is reformulated to allow processing of segmented data. This involves the reformulation of parameter estimation and order selection. Parameter estimation for autoregressive (AR) models is done by fitting a single model to all segments simultaneously. Parameter estimation for moving average (MA) and the combined ARMA models can be derived entirely from long autoregressive models. The finite sample theory required for order selection of AR models has been generalized to segments of data. The resulting algorithm can also deal effectively with segments of unequal length.
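The segmented estimation described above can be sketched for the AR case: a single AR(1) is fitted to several segments at once by pooling within-segment lag pairs only, never pairing observations across a segment break. A minimal illustration with assumed segment lengths and coefficient:

```python
import numpy as np

# Three independent segments of unequal length from the same AR(1) process
# with phi = 0.6 (illustrative value).
rng = np.random.default_rng(5)
phi_true = 0.6
segments = []
for length in (300, 150, 500):
    s = np.zeros(length)
    for t in range(1, length):
        s[t] = phi_true * s[t - 1] + rng.standard_normal()
    segments.append(s)

# Pooled least squares: sum numerators and denominators within each segment,
# so no (x_t, x_{t-1}) pair straddles a segment boundary.
num = sum(np.sum(s[1:] * s[:-1]) for s in segments)
den = sum(np.sum(s[:-1] ** 2) for s in segments)
phi_hat = num / den
print(phi_hat)  # close to 0.6
```

The same pooling idea extends to higher AR orders, and, as the abstract notes, MA and ARMA estimates can then be derived from a long pooled AR fit.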

Journal Article
TL;DR: This paper establishes a combined regression-time-series model for Chinese inflation and compares its forecasts with those of a pure regression model, showing that the residual term in the regression model is the noise generated by omitted variables that influence the dependent variable.

Book ChapterDOI
01 Jan 2000
TL;DR: This work uses graphical models to explore and compare the structure of time series models, focusing on interpolation in, e.g., seasonal models.
Abstract: There are various approaches to modelling time series data. In the time domain, ARMA models and state space models are frequently used, while phase space models have been applied recently, too. Each approach has its own strengths and weaknesses w.r.t. parameter estimation, prediction and coping with missing data. We use graphical models to explore and compare the structure of time series models, and focus on interpolation in, e.g., seasonal models.