
Showing papers on "Moving-average model published in 2001"


Journal ArticleDOI
TL;DR: Applied to two real datasets, the MAR-ARCH models appear to capture features of the data better than competing models.
Abstract: We propose a mixture autoregressive conditional heteroscedastic (MAR-ARCH) model for modeling nonlinear time series. The models consist of a mixture of K autoregressive components with autoregressive conditional heteroscedasticity; that is, the conditional mean of the process variable follows a mixture AR (MAR) process, whereas the conditional variance of the process variable follows a mixture ARCH process. In addition to the advantage of better description of the conditional distributions from the MAR model, the MAR-ARCH model allows a more flexible squared autocorrelation structure. The stationarity conditions, autocorrelation function, and squared autocorrelation function are derived. Construction of multiple-step predictive distributions is discussed. The estimation can be easily done through a simple EM algorithm, and the model selection problem is addressed. The shape-changing feature of the conditional distributions makes these models capable of modeling time series with multimodal conditional distributions.
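As a concrete illustration of the mixture idea, the sketch below simulates a two-component mixture autoregression in which each component has its own AR(1) mean and ARCH(1) variance. The mixing weights, coefficients and the residual recursion are illustrative assumptions, not values or details taken from the paper.

```python
# A minimal simulation sketch of a two-component MAR-ARCH-type process.
# Parameter values and the exact residual recursion are illustrative
# assumptions, not those of the paper.
import numpy as np

rng = np.random.default_rng(0)

alpha = np.array([0.6, 0.4])          # mixing proportions
phi0 = np.array([0.0, 1.0])           # component intercepts
phi1 = np.array([0.5, -0.4])          # component AR(1) coefficients
a0 = np.array([0.5, 1.0])             # ARCH constants
a1 = np.array([0.3, 0.2])             # ARCH(1) coefficients

n = 2000
y = np.zeros(n)
e = np.zeros((2, n))                  # component-specific residuals

for t in range(1, n):
    # component-specific ARCH(1) conditional variances
    h = a0 + a1 * e[:, t - 1] ** 2
    # draw the mixture component, then the observation
    k = rng.choice(2, p=alpha)
    mean_k = phi0[k] + phi1[k] * y[t - 1]
    y[t] = mean_k + np.sqrt(h[k]) * rng.standard_normal()
    # update residuals of both components given the realised y[t]
    e[:, t] = y[t] - (phi0 + phi1 * y[t - 1])

print("sample mean %.3f, sample variance %.3f" % (y.mean(), y.var()))
```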

168 citations


Journal ArticleDOI
TL;DR: This paper develops an alternative model for stationary, mean-reverting count data, the Poisson autoregressive model of order p, or PAR(p) model, evaluates its properties, and presents both Monte Carlo evidence and applications to illustrate them.
Abstract: Time series of event counts are common in political science and other social science applications. Presently, there are few satisfactory methods for identifying the dynamics in such data and accounting for the dynamic processes in event counts regression. We address this issue by building on earlier work for persistent event counts in the Poisson exponentially weighted moving-average model (PEWMA) of Brandt et al. (American Journal of Political Science 44(4):823–843, 2000). We develop an alternative model for stationary, mean-reverting data, the Poisson autoregressive model of order p, or PAR(p) model. Issues of identification and model selection are also considered. We then evaluate the properties of this model and present both Monte Carlo evidence and applications to illustrate them.
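The following sketch simulates a mean-reverting Poisson autoregression of order one, in the spirit of the PAR(p) idea: the conditional mean is a convex combination of the lagged count and a long-run mean. The parameter values are assumed for illustration, and the exact PAR(p) specification in the paper may differ in details.

```python
# A minimal sketch of a mean-reverting Poisson autoregression of order 1,
# in the spirit of a PAR(p) model; rho and lam are illustrative values.
import numpy as np

rng = np.random.default_rng(1)

rho, lam = 0.5, 4.0     # persistence and long-run mean (assumed values)
n = 1000
y = np.zeros(n, dtype=int)
y[0] = rng.poisson(lam)

for t in range(1, n):
    m_t = rho * y[t - 1] + (1.0 - rho) * lam   # conditional mean
    y[t] = rng.poisson(m_t)

print("long-run mean target:", lam, "sample mean:", y.mean().round(2))
print("lag-1 autocorrelation:", np.corrcoef(y[:-1], y[1:])[0, 1].round(2))
```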

147 citations


Journal ArticleDOI
TL;DR: In this paper, an autoregressive moving average model is proposed to generate uncorrelated (white noise) time series, but these series are not independent in the non-Gaussian case, and an approximation to the likelihood of the model in the case of Laplacian (two-sided exponential) noise yields a modified absolute deviations criterion.
Abstract: An autoregressive moving average model in which all of the roots of the autoregressive polynomial are reciprocals of roots of the moving average polynomial and vice versa is called an all-pass time series model. All-pass models generate uncorrelated (white noise) time series, but these series are not independent in the non-Gaussian case. An approximation to the likelihood of the model in the case of Laplacian (two-sided exponential) noise yields a modified absolute deviations criterion, which can be used even if the underlying noise is not Laplacian. Asymptotic normality for least absolute deviation estimators of the model parameters is established under general conditions. Behavior of the estimators in finite samples is studied via simulation. The methodology is applied to exchange rate returns to show that linear all-pass models can mimic “nonlinear” behavior, and is applied to stock market volume data to illustrate a two-step procedure for fitting noncausal autoregressions.
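The whiteness-without-independence property is easy to see in simulation. The sketch below generates a causal all-pass ARMA(1,1) with Laplace innovations, the MA root being the reciprocal of the AR root: the series has essentially zero autocorrelation while its squares remain autocorrelated. The value phi = 0.7 and the sample size are illustrative choices, not taken from the paper.

```python
# A minimal sketch of a causal all-pass ARMA(1,1) driven by Laplace noise:
# the MA root (phi) is the reciprocal of the AR root (1/phi), so the series
# is white noise yet not independent.  phi = 0.7 is an illustrative value.
import numpy as np

rng = np.random.default_rng(2)
phi, n = 0.7, 20000

e = rng.laplace(scale=1.0, size=n)          # two-sided exponential noise
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t] - e[t - 1] / phi

def acf(z, lag):
    z = z - z.mean()
    return np.dot(z[:-lag], z[lag:]) / np.dot(z, z)

# Lag-1 autocorrelation of x is near zero (white noise), but the squared
# series remains autocorrelated, revealing the nonlinear dependence.
print("acf of x   at lag 1: %+.3f" % acf(x, 1))
print("acf of x^2 at lag 1: %+.3f" % acf(x ** 2, 1))
```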

107 citations


Journal ArticleDOI
TL;DR: In this article, the authors generalize the mixture autoregressive, MAR, model to the logistic mixture auto-regressive with exogenous variables, LMARX, model for the modelling of nonlinear time series.
Abstract: We generalise the mixture autoregressive, MAR, model to the logistic mixture autoregressive with exogenous variables, LMARX, model for the modelling of nonlinear time series. The models consist of a mixture of two Gaussian transfer function models with the mixing proportions changing over time. The model can also be considered as a generalisation of the self-exciting threshold autoregressive, SETAR, model and the open-loop threshold autoregressive, TARSO, model. The advantages of the LMARX model over other nonlinear time series models include a wider range of shape-changing predictive distributions, the ability to handle cycles and conditional heteroscedasticity in the time series and better point prediction. Estimation is easily done via a simple EM algorithm and the model selection problem is addressed. The models are applied to two real datasets and compared with other competing models.
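A minimal simulation in the spirit of a two-component logistic mixture autoregression with one exogenous variable is sketched below: the mixing proportion is a logistic function of x_t, so the shape and spread of the conditional distribution change over time. All parameter values and the sinusoidal exogenous input are illustrative assumptions, not the specification estimated in the paper.

```python
# A minimal simulation sketch of a two-component logistic mixture
# autoregression with one exogenous variable; all values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n = 1500

x = np.sin(2 * np.pi * np.arange(n) / 50)      # assumed exogenous input
beta0, beta1 = 0.0, 3.0                        # logistic mixing parameters
phi = np.array([[0.2, 0.6], [1.0, -0.5]])      # (intercept, AR(1)) per component
sigma = np.array([0.5, 1.5])                   # component noise std devs

y = np.zeros(n)
for t in range(1, n):
    p1 = 1.0 / (1.0 + np.exp(-(beta0 + beta1 * x[t])))   # P(component 1)
    k = 0 if rng.random() < p1 else 1
    y[t] = phi[k, 0] + phi[k, 1] * y[t - 1] + sigma[k] * rng.standard_normal()

print("sample variance when x>0: %.2f, when x<0: %.2f"
      % (y[x > 0].var(), y[x < 0].var()))
```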

103 citations


Patent
Dongping Fang1, Ruey S. Tsay1
08 Nov 2001
TL;DR: In this paper, a method and computer system is provided for automatically constructing a time series model for a time series; the model can be either a univariate or multivariate ARIMA model, depending upon whether predictors, interventions or events are input to the system in addition to the time series.
Abstract: A method and computer system is provided for automatically constructing a time series model for the time series (figure 2). The model can be either a univariate or multivariate ARIMA model, depending upon whether predictors, interventions or events are input to the system in addition to the time series. The method for constructing the univariate ARIMA model comprises the steps of imputing missing values of the corresponding time series, finding the proper transformation for positive time series, determining differencing orders, determining non-seasonal AR and MA orders by pattern detection, building an initial model, and iteratively estimating and modifying the model. The method for constructing the multivariate model comprises the steps of finding a univariate ARIMA model for the time series; applying the transformation found in the univariate model to all positive time series, including the series to be forecast and predictors; applying the differencing orders found in the univariate model to all time series, including the series to be forecast, predictors, interventions and events; deleting selected predictors and further differencing other predictors; building an initial model whose disturbance series follows an ARIMA model with the AR and MA orders found in the univariate model; and iteratively estimating and modifying the model.
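For orientation only, the sketch below shows a crude automatic order selection for a univariate ARIMA model by an AIC grid search. It is not the patented procedure summarized above, which relies on pattern detection, transformation search and iterative model modification; it also assumes the statsmodels package is available.

```python
# A rough sketch of automatic ARIMA order selection by AIC grid search.
# This is NOT the patented procedure described above; it only illustrates
# the general idea of choosing (p, d, q) from data.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
# illustrative series: a random walk plus ARMA noise
y = np.cumsum(rng.standard_normal(300)) + rng.standard_normal(300)

best = None
for d in range(0, 2):
    for p in range(0, 3):
        for q in range(0, 3):
            try:
                fit = ARIMA(y, order=(p, d, q)).fit()
            except Exception:
                continue
            if best is None or fit.aic < best[0]:
                best = (fit.aic, (p, d, q))

print("selected (p, d, q):", best[1], "AIC: %.1f" % best[0])
```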

92 citations


Journal ArticleDOI
TL;DR: This article presents new characterizations of the integer-valued moving average model, giving moments and probability generating functions for four model variants, and discusses Yule-Walker and conditional least squares estimation.
Abstract: The paper presents new characterizations of the integer-valued moving average model. For four model variants, we give moments and probability generating functions. Yule-Walker and conditional least ...
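A minimal sketch of one such construction, an INMA(1) process built from binomial thinning, is given below; it checks the implied mean and lag-one autocorrelation against sample values. The Poisson arrival assumption and parameter values are illustrative, and the paper treats several variants of this model.

```python
# A minimal sketch of an INMA(1) process built from binomial thinning:
# y_t = beta o e_{t-1} + e_t with e_t ~ Poisson(lam).  Values illustrative.
import numpy as np

rng = np.random.default_rng(5)
beta, lam, n = 0.4, 3.0, 50000

e = rng.poisson(lam, size=n)
thinned = rng.binomial(e[:-1], beta)      # beta o e_{t-1}
y = thinned + e[1:]

mean_theory = lam * (1 + beta)
acf1_theory = beta / (1 + beta)
acf1_sample = np.corrcoef(y[:-1], y[1:])[0, 1]

print("mean: theory %.2f, sample %.2f" % (mean_theory, y.mean()))
print("lag-1 acf: theory %.2f, sample %.2f" % (acf1_theory, acf1_sample))
```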

70 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show consistency, in the mean integrated quadratic sense, of an estimator of the autocorrelation operator in the autoregressive Hilbertian model of order one.

41 citations


Journal ArticleDOI
TL;DR: The new model, a vector second-order autoregressive, first-order moving average (VARMA(2,1)) process, fits the data better, and produces more realistic simulated series, than existing models.

27 citations


Proceedings ArticleDOI
22 Apr 2001
TL;DR: This paper considers a general class of discrete time systems with batch arrivals and departures, and provides an explicit analytical expression, as an infinite sum, for the system's stationary probability distribution, avoiding classical root-finding methods, matrix-analytic methodologies and spectral decomposition approaches.
Abstract: This paper considers a general class of discrete time systems with batch arrivals and departures. Such models appear frequently in the teletraffic analysis of computer and communications networks. Our arrival models are assumed to be quite general. They could be independent and identically distributed (i.i.d.) in successive slots, be periodic, be Markovian, or be described by a moving average time-series model, etc. Our solution framework is novel and unifying. It uses a combination of multi-dimensional generating functions and combinatorial analysis using extensions of classical ballot theorems. In general, we provide an explicit analytical expression, as an infinite sum, for the system's stationary probability distribution, avoiding classical root-finding methods, matrix-analytic methodologies and spectral decomposition approaches. We provide a number of analytical and numerical examples, including a simple multi-server model with i.i.d. arrivals, an ATM multiplexer fed by a (random) number of periodic sources, and a new example considering the discrete moving average model for the arrival process, where a simple closed-form expression for the stationary distribution of the system queue lengths is provided.
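As a purely empirical counterpart to the analytical results described above, the sketch below simulates a discrete-time queue with i.i.d. Poisson batch arrivals and a fixed per-slot service capacity, and estimates the stationary queue-length distribution by Monte Carlo. The arrival law and capacity are assumed for illustration; the paper's contribution is the explicit infinite-sum expression, not simulation.

```python
# A minimal simulation sketch of a discrete-time queue with i.i.d. batch
# arrivals and a fixed number of departures per slot, used only to estimate
# the stationary queue-length distribution empirically.
import numpy as np

rng = np.random.default_rng(6)
n_slots, servers = 200_000, 2
arrival_mean = 1.6                       # must be < servers for stability

q = 0
counts = np.zeros(50, dtype=int)
for _ in range(n_slots):
    q = max(q + rng.poisson(arrival_mean) - servers, 0)
    if q < counts.size:
        counts[q] += 1

dist = counts / counts.sum()
print("P(queue empty) ~= %.3f" % dist[0])
print("mean queue length ~= %.2f" % (dist @ np.arange(counts.size)))
```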

23 citations


Journal ArticleDOI
TL;DR: A key word list covering the main topics of time series analysis, including ARCH, autoregressive and moving average models, nonlinearity, long memory, model selection, robustness, spectral analysis, state space methods and threshold modelling.
Abstract: Some key words: Additivity; ARCH; Autocorrelation; Autoregressive model; Autoregressive moving average model; Bandwidth; Bootstrap; Categorical data; Chaos; Conditional mean; Conditional variance; Correlation integral; Durbin-Watson test; Fractional differencing; Goodness of fit; Influence function; Invertibility; Kalman filtering; Levinson-Durbin algorithm; Ljung-Box test; Long memory; M-estimation; Model selection; Moving average model; Nonlinearity; Nonstationarity; Outlier; Panel; Periodogram; Regression; Robustness; Periodicity; Sampling rate; Spectral analysis; State space; Threshold modelling; Time reversibility; Variate difference method.

16 citations


Journal ArticleDOI
TL;DR: Results show that in some cases with noise contamination and incorrect model order assumptions, the GMDH performs better than either the FOS or the least-squares methods in providing only the parameters that are associated with the true model terms.
Abstract: A new algorithm for autoregressive moving average (ARMA) parameter estimation is introduced. The algorithm is based on the group method of data handling (GMDH) first introduced by the Russian cyberneticist, A. G. Ivakhnenko, for solving high-order regression polynomials. The GMDH is heuristic in nature and self-organizes into a model of optimal complexity without any a priori knowledge about the system's inner workings. We modified the GMDH algorithm to solve for ARMA model parameters. Computer simulations have been performed to examine the efficacy of the GMDH, and comparison of the GMDH is made to one of the most accurate and one of the most widely used algorithms, the fast orthogonal search (FOS) and the least-squares methods, respectively. The results show that in some cases with noise contamination and incorrect model order assumptions, the GMDH performs better than either the FOS or the least-squares methods in providing only the parameters that are associated with the true model terms.
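For context, the sketch below implements the kind of least-squares baseline used in such comparisons: a two-stage, Hannan-Rissanen-style regression for ARMA(1,1) parameters, in which a long AR fit first supplies residual estimates. It is not the GMDH algorithm, and the simulated data, orders and settings are illustrative assumptions.

```python
# A sketch of a least-squares baseline for ARMA(1,1) parameter estimation
# (a Hannan-Rissanen-style two-stage regression); not the GMDH algorithm.
import numpy as np

rng = np.random.default_rng(7)
n, phi_true, theta_true = 3000, 0.6, 0.3

# simulate an ARMA(1,1) series
e = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + e[t] + theta_true * e[t - 1]

# stage 1: long AR fit to recover residual estimates
p_long = 15
X = np.column_stack([y[p_long - k - 1:n - k - 1] for k in range(p_long)])
a = np.linalg.lstsq(X, y[p_long:], rcond=None)[0]
e_hat = np.zeros(n)
e_hat[p_long:] = y[p_long:] - X @ a

# stage 2: regress y_t on y_{t-1} and the estimated residual e_hat_{t-1}
Z = np.column_stack([y[p_long:-1], e_hat[p_long:-1]])
phi_hat, theta_hat = np.linalg.lstsq(Z, y[p_long + 1:], rcond=None)[0]
print("phi   true %.2f  estimate %.2f" % (phi_true, phi_hat))
print("theta true %.2f  estimate %.2f" % (theta_true, theta_hat))
```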

Journal ArticleDOI
TL;DR: In this article, a threshold autoregressive model, a piecewise constant approximation to nonlinearity, delivers a statistically significant gain over the best fitting AR model for Central England annual mean temperature data.
Abstract: Autoregressive moving average (ARMA) processes are frequently used to model climatological time series. These tools form a broad segment of the class of linear stochastic processes. This paper summarizes formulation of nonlinear models and gives a review of a best developed type of nonlinearity. The main steps of model fitting, i.e. test for nonlinearity, model estimation, and model checking are described. The methodology is applied to Central England annual mean temperature data. A threshold autoregressive model, a piecewise constant approximation to nonlinearity, delivers a statistically significant gain over the best fitting AR model. The forecasting function has three stable points and one limit cycle related to quasi-biennial oscillation.
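A minimal sketch of threshold-autoregressive fitting is shown below: a two-regime SETAR model of order one is estimated by least-squares grid search over candidate thresholds, on simulated data. The actual model fitted to the Central England temperature series in the paper is more elaborate, and the data here are illustrative.

```python
# A minimal sketch of fitting a two-regime SETAR model of order 1 by
# least-squares grid search over the threshold; illustrative data only.
import numpy as np

rng = np.random.default_rng(8)

# illustrative SETAR(2;1,1) data: different AR(1) dynamics above/below 0
n = 1000
y = np.zeros(n)
for t in range(1, n):
    if y[t - 1] <= 0.0:
        y[t] = -0.5 + 0.7 * y[t - 1] + rng.standard_normal()
    else:
        y[t] = 0.5 - 0.4 * y[t - 1] + rng.standard_normal()

def sse_for_threshold(r):
    lo = y[:-1] <= r
    sse = 0.0
    for mask in (lo, ~lo):
        X = np.column_stack([np.ones(mask.sum()), y[:-1][mask]])
        resid = y[1:][mask] - X @ np.linalg.lstsq(X, y[1:][mask], rcond=None)[0]
        sse += resid @ resid
    return sse

# search thresholds over interior sample quantiles of the lagged series
candidates = np.quantile(y[:-1], np.linspace(0.15, 0.85, 71))
r_hat = min(candidates, key=sse_for_threshold)
print("estimated threshold: %.2f (true value 0.0)" % r_hat)
```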

Journal ArticleDOI
TL;DR: In this paper, a time invariant and symmetric vector error correction model was proposed to explain the evolution of FX-returns depending on recent returns and on lagged deviations from the triangular equality.



Journal ArticleDOI
TL;DR: This article considers the problem of selecting a suitable bandwidth when estimating the marginal density function of a moving average process; the smoothed bootstrap method is used to implement a bandwidth selector for a convolution-type density estimator based on the kernel method.
Abstract: This article is concerned with the problem of selecting a suitable bandwidth when estimating the marginal density function of a moving average process. The smoothed bootstrap method is used to implement a bandwidth selector for a convolution-type density estimator based on the kernel method. The relative rate of convergence of this selector with respect to the MISE bandwidth is proved to be O_P(n^(-1/2)). The finite sample size performance of the selector is investigated in a simulation study.
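The sketch below illustrates the smoothed-bootstrap idea with an ordinary Gaussian kernel density estimate of the marginal density of a simulated MA(1) process: bootstrap resamples are drawn from a pilot-smoothed distribution and the bandwidth minimising the bootstrap estimate of the integrated squared error is selected. The convolution-type estimator and the rate results of the paper are not reproduced here; all tuning choices are illustrative assumptions.

```python
# A rough sketch of smoothed-bootstrap bandwidth choice for a kernel density
# estimate of the marginal density of an MA(1) process (ordinary KDE, not
# the paper's convolution-type estimator).
import numpy as np

rng = np.random.default_rng(9)

# illustrative MA(1) data
n, theta = 400, 0.6
e = rng.standard_normal(n + 1)
x = e[1:] + theta * e[:-1]

grid = np.linspace(x.min() - 1, x.max() + 1, 200)
dx = grid[1] - grid[0]

def kde(data, h, pts):
    u = (pts[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

g = 1.06 * x.std() * n ** (-1 / 5)          # pilot bandwidth (rule of thumb)
pilot = kde(x, g, grid)

def boot_mise(h, n_boot=50):
    ise = 0.0
    for _ in range(n_boot):
        # smoothed bootstrap resample: resample data, add pilot-scale noise
        xb = rng.choice(x, size=n) + g * rng.standard_normal(n)
        diff = kde(xb, h, grid) - pilot
        ise += np.sum(diff ** 2) * dx
    return ise / n_boot

hs = np.linspace(0.1, 1.0, 19)
h_hat = min(hs, key=boot_mise)
print("selected bandwidth: %.2f" % h_hat)
```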

Posted Content
TL;DR: A number of commonly used estimates of the inverse autocorrelation function can be modified to deal with outlier contaminated data, and the robust analogues of the orthogonal and interpolation based techniques provide an alternative to the robust autoregressive approach.
Abstract: We show how a number of commonly used estimates of the inverse autocorrelation function can be modified to deal with outlier-contaminated data. The robust analogues of the orthogonal and interpolation-based techniques appear to be new, and provide an alternative to the robust autoregressive approach. We examine the performance of these techniques in a large-scale numerical experiment. This shows significant improvements in performance on outlier-contaminated data when robust techniques are used. While there was no uniformly best robust technique, our experiments support the use of the autoregressive approach to avoid catastrophic reductions in performance, and robust interpolation for short series corrupted by few outliers.
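For reference, the sketch below shows the plain, non-robust autoregressive approach to estimating the inverse autocorrelation function: fit a long autoregression and take the normalised lagged products of its coefficient sequence. The paper's robust modifications would replace this least-squares AR fit; the MA(1) example data and the AR order are assumptions made for illustration.

```python
# A sketch of the autoregressive approach to estimating the inverse
# autocorrelation function (non-robust version only).
import numpy as np

rng = np.random.default_rng(10)

# illustrative MA(1) data; its inverse ACF decays like an AR(1) ACF
n, theta = 2000, 0.6
e = rng.standard_normal(n + 1)
x = e[1:] + theta * e[:-1]

p = 20                                        # long AR order
X = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
phi = np.linalg.lstsq(X, x[p:], rcond=None)[0]

a = np.concatenate(([1.0], -phi))             # AR operator coefficients
def inv_acf(k):
    return np.dot(a[:-k], a[k:]) / np.dot(a, a) if k > 0 else 1.0

for k in range(1, 4):
    print("inverse ACF at lag %d: %+.3f" % (k, inv_acf(k)))
```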

Book ChapterDOI
TL;DR: Mortality data may be analysed by time series methods such as autoregressive integrated moving average (ARIMA) modelling; the confidence intervals for the autoregressive parameters obtained with the continuous-time estimation model are found to be much smaller than those from the other estimation methods.
Abstract: Mortality data may be analysed by time series methods such as autoregressive integrated moving average (ARIMA) modelling. This method is demonstrated by two examples: analysis of the mortality data of diseases of the digestive system, and analysis of the mortality data of bronchitis, emphysema and asthma. Mathematical expressions are given for the results of the analysis. The relationships between time series of mortality rates were studied with ARIMA models. Confidence intervals for the autoregressive parameters were calculated by three methods: the standard normal approximation, the estimation based on White's theory, and the continuous-time estimation. Analysing the confidence intervals of the first-order autoregressive parameters, we conclude that the intervals obtained with the continuous-time estimation model were much smaller than those from the other estimation methods.
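A minimal sketch of the first of the three interval methods, the standard normal approximation for a first-order autoregressive parameter, is given below; the White-theory and continuous-time intervals discussed in the chapter are not shown. The AR(1) series stands in for an annual mortality-rate series, and its parameters are illustrative.

```python
# A minimal sketch of the normal-approximation confidence interval for a
# first-order autoregressive parameter; illustrative data only.
import numpy as np

rng = np.random.default_rng(11)

# illustrative AR(1) series standing in for an annual mortality-rate series
n, phi_true = 60, 0.7
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.standard_normal()

yc = y - y.mean()
phi_hat = np.dot(yc[:-1], yc[1:]) / np.dot(yc[:-1], yc[:-1])
se = np.sqrt((1.0 - phi_hat ** 2) / n)        # asymptotic standard error
lo, hi = phi_hat - 1.96 * se, phi_hat + 1.96 * se
print("phi estimate %.2f, 95%% CI (%.2f, %.2f)" % (phi_hat, lo, hi))
```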


Posted Content
TL;DR: In this paper, the authors examined recursive out-of-sample forecasting of monthly postwar U.S. core inflation and log price levels using the autoregressive fractionally integrated moving average model with explanatory variables (ARFIMAX).
Abstract: We examine recursive out-of-sample forecasting of monthly postwar U.S. core inflation and log price levels. We use the autoregressive fractionally integrated moving average model with explanatory variables (ARFIMAX). Our analysis suggests a significant explanatory power of leading indicators associated with macroeconomic activity and monetary conditions for forecasting horizons up to two years. Even after correcting for the effect of explanatory variables, there is conclusive evidence of both fractional integration and structural breaks in the mean and variance of inflation in the 1970s and 1980s, and we incorporate these breaks in the forecasting model for the 1980s and 1990s. We compare the results of the fractionally integrated ARFIMA(0,d,0) model with those for ARIMA(1,d,1) models with fixed order of d=0 and d=1 for inflation. Comparing mean squared forecast errors, we find that the ARMA(1,1) model performs worse than the other models over our evaluation period 1984-1999. The ARIMA(1,1,1) model provides the best forecasts, but its multi-step forecast intervals are too large.
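The fractional integration referred to above rests on the fractional difference operator (1 - B)^d. The sketch below computes its filter weights by the standard recursion and applies the truncated filter to an illustrative series; d = 0.4 is an assumed value, not an estimate from the paper.

```python
# A short sketch of the fractional difference (1 - B)^d underlying
# ARFIMA/ARFIMAX models: the filter weights follow a simple recursion and
# decay hyperbolically rather than being truncated at an integer order.
import numpy as np

def frac_diff_weights(d, n_weights):
    w = np.zeros(n_weights)
    w[0] = 1.0
    for j in range(1, n_weights):
        w[j] = w[j - 1] * (j - 1 - d) / j
    return w

d = 0.4                                       # illustrative memory parameter
w = frac_diff_weights(d, 1000)
print("first weights:", np.round(w[:5], 3))

# applying the (truncated) filter to a series y gives the fractionally
# differenced series sum_j w[j] * y[t - j]
rng = np.random.default_rng(12)
y = np.cumsum(rng.standard_normal(2000))      # illustrative input
yd = np.convolve(y, w, mode="valid")
print("fractionally differenced length:", yd.size)
```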

Posted Content
TL;DR: Starting from a day-to-day model of hotel-specific guest nights, an integer-valued moving average model is obtained by cross-sectional and temporal aggregation; its two parameters reflect the mean check-in and the check-out probability, and the approach is illustrated on a series of Norwegian guests in Swedish hotels.
Abstract: Starting from a day-to-day model of hotel-specific guest nights, we obtain an integer-valued moving average model by cross-sectional and temporal aggregation. The two parameters of the aggregate model reflect the mean check-in and the check-out probability. Letting the parameters be functions of dummy and economic variables, we demonstrate the potential of the approach in terms of interesting interpretations. Empirical results are presented for a series of Norwegian guests in Swedish hotels. The results indicate strong seasonal patterns in both the mean check-in and the check-out probability. Models based on differenced series are preferred in terms of goodness-of-fit. In a forecast comparison the improvements due to economic variables are small.
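The sketch below simulates the kind of day-to-day guest-nights process described above, with a seasonally varying Poisson check-in mean and check-out probability; the paper's contribution is to aggregate such a process into an integer-valued moving average model and estimate it, which is not reproduced here. The seasonal patterns and parameter values are illustrative assumptions.

```python
# A minimal sketch of a day-to-day guest-nights process: Poisson check-ins
# with a seasonal mean, and each current guest checks out with a seasonal
# probability.  All seasonal patterns and values are illustrative.
import numpy as np

rng = np.random.default_rng(13)

n_days = 3 * 365
day = np.arange(n_days)
checkin_mean = 5.0 + 3.0 * (np.sin(2 * np.pi * day / 365) > 0)   # high season
checkout_prob = np.where(np.sin(2 * np.pi * day / 365) > 0, 0.3, 0.5)

guests = np.zeros(n_days, dtype=int)
for t in range(1, n_days):
    stayers = rng.binomial(guests[t - 1], 1.0 - checkout_prob[t])
    guests[t] = stayers + rng.poisson(checkin_mean[t])

print("mean guests, high season: %.1f" % guests[checkout_prob == 0.3].mean())
print("mean guests, low  season: %.1f" % guests[checkout_prob == 0.5].mean())
```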