
Showing papers on "STAR model published in 2015"


Book
01 Jan 2015
TL;DR: Organized around the EViews workfile, this book as discussed by the authors collects a large body of work on the statistical analysis of growth models with time-related effects and their application to economic forecasting, although it does not cover a complete set of models.
Abstract: Preface. 1 EViews workfile and descriptive data analysis. 1.1 What is the EViews workfile? 1.2 Basic options in EViews. 1.3 Creating a workfile. 1.4 Illustrative data analysis. 1.5 Special notes and comments. 1.6 Statistics as a sample space. 2 Continuous growth models. 2.1 Introduction. 2.2 Classical growth models. 2.3 Autoregressive growth models. 2.4. Residual tests. 2.5 Bounded autoregressive growth models. 2.6 Lagged variables or autoregressive growth models. 2.7 Polynomial growth model. 2.8 Growth models with exogenous variables. 2.9 A Taylor series approximation model. 2.10 Alternative univariate growth models. 2.11 Multivariate growth models. 2.12 Multivariate AR(p) GLM with trend. 2.13 Generalized multivariate models with trend. 2.14 Special notes and comments. 2.15 Alternative multivariate models with trend. 2.16 Generalized multivariate models with time-related effects. 3 Discontinuous growth models. 3.1 Introduction. 3.2 Piecewise growth models. 3.3 Piecewise S-shape growth models. 3.4 Two-piece polynomial bounded growth models. 3.5 Discontinuous translog linear AR(1) growth models. 3.6 Alternative discontinuous growth models. 3.7 Stability test. 3.8 Generalized discontinuous models with trend. 3.9 General two-piece models with time-related effects. 3.10 Multivariate models by states and time periods. 4 Seemingly causal models. 4.1 Introduction. 4.2 Statistical analysis based on a single time series. 4.3 Bivariate seemingly causal models. 4.4 Trivariate seemingly causal models. 4.5 System equations based on trivariate time series. 4.6 General system of equations. 4.7 Seemingly causal models with dummy variables. 4.8 General discontinuous seemingly causal models. 4.9 Additional selected seemingly causal models. 4.10 Final notes in developing models. 5 Special cases of regression models. 5.1 Introduction. 5.2 Specific cases of growth curve models. 5.3 Seemingly causal models. 5.4 Lagged variable models. 5.5 Cases based on the US domestic price of copper. 5.6 Return rate models. 5.7 Cases based on the BASICS workfile. 6 VAR and system estimation methods. 6.1 Introduction. 6.2 The VAR models. 6.3 The vector error correction models. 6.4 Special notes and comments. 7 Instrumental variables models. 7.1 Introduction. 7.2 Should we apply instrumental models? 7.3 Residual analysis in developing instrumental models. 7.4 System equation with instrumental variables. 7.5 Selected cases based on the US-DPOC data. 7.6 Instrumental models with time-related effects. 7.7 Instrumental seemingly causal models. 7.8 Multivariate instrumental models based on the US-DPOC. 7.9 Further extension of the instrumental models. 8 ARCH models. 8.1 Introduction. 8.2 Options of ARCH models. 8.3 Simple ARCH models. 8.4 ARCH models with exogenous variables. 8.5 Alternative GARCH variance series. 9 Additional testing hypotheses. 9.1 Introduction. 9.2 The unit root tests. 9.3 The omitted variables tests. 9.4 Redundant variables test (RV-test). 9.5 Nonnested test (NN-test). 9.6 The Ramsey RESET test. 9.7 Illustrative examples based on the Demo.wf1. 10 Nonlinear least squares models. 10.1 Introduction. 10.2 Classical growth models. 10.3 Generalized Cobb-Douglas models. 10.4 Generalized CES models. 10.5 Special notes and comments. 10.6 Other NLS models. 11 Nonparametric estimation methods. 11.1 What is the nonparametric data analysis. 11.2 Basic moving average estimates. 11.3 Measuring the best fit model. 11.4 Advanced moving average models. 11.5 Nonparametric regression based on a time series. 
11.6 The local polynomial Kernel fit regression. 11.7 Nonparametric growth models. Appendix A: Models for a single time series. A.1 The simplest model. A.2 First-order autoregressive models. A.3 Second-order autoregressive model. A.4 First-order moving average model. A.5 Second-order moving average model. A.6 The simplest ARMA model. A.7 General ARMA model. Appendix B: Simple linear models. B.1 The simplest linear model. B.2 Linear model with basic assumptions. B.3 Maximum likelihood estimation method. B.4 First-order autoregressive linear model. B.5 AR(p) linear model. B.6 Alternative models. B.7 Lagged-variable model. B.8 Lagged-variable autoregressive models. B.9 Special notes and comments. Appendix C: General linear models. C.1 General linear model with i.i.d. Gaussian disturbances. C.2 AR(1) general linear model. C.3 AR(p) general linear model. C.4 General lagged-variable autoregressive model. C.5 General models with Gaussian errors. Appendix D: Multivariate general linear models. D.1 Multivariate general linear models. D.2 Moments of an endogenous multivariate. D.3 Vector autoregressive model. D.4 Vector moving average model. D.5 Vector autoregressive moving average model. D.6 Simple multivariate models with exogenous variables. D.7 General estimation methods. D.8 Maximum likelihood estimation for an MGLM. D.9 MGLM with autoregressive errors. References. Index.

110 citations


Journal ArticleDOI
TL;DR: In this article, the quantile autocorrelation function (QACF) and the quantile partial autocorrelation function (QPACF) were proposed to identify the autoregressive order of a quantile autoregressive model and to assess its adequacy.
Abstract: In this article, we propose two important measures, quantile correlation (QCOR) and quantile partial correlation (QPCOR). We then apply them to quantile autoregressive (QAR) models, and introduce two valuable quantities, the quantile autocorrelation function (QACF) and the quantile partial autocorrelation function (QPACF). This allows us to extend the Box–Jenkins three-stage procedure (model identification, model parameter estimation, and model diagnostic checking) from classical autoregressive models to quantile autoregressive models. Specifically, the QPACF of an observed time series can be employed to identify the autoregressive order, while the QACF of residuals obtained from the fitted model can be used to assess the model adequacy. We not only demonstrate the asymptotic properties of QCOR and QPCOR, but also show the large sample results of QACF, QPACF, and the quantile version of the Box–Pierce test. Moreover, we obtain the bootstrap approximations to the distributions of parameter estimators and p...
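To make the identification step above concrete, the following is a minimal numpy sketch of a sample quantile autocorrelation function at a fixed quantile level; the qcov/qcor normalization follows the construction summarized in the abstract, but the function name and the AR(1) example data are illustrative assumptions rather than the authors' code.

```python
import numpy as np

def qacf(y, tau, max_lag):
    """Sample quantile autocorrelation function of a series y at quantile level tau.

    Uses qcov_tau(y_t, y_{t-k}) = cov(tau - 1{y_t < Q_tau}, y_{t-k}), normalized by
    sqrt((tau - tau**2) * var(y_{t-k})).
    """
    y = np.asarray(y, dtype=float)
    q = np.quantile(y, tau)                      # unconditional tau-th quantile
    psi = tau - (y < q)                          # sign part of the quantile check function
    vals = []
    for k in range(1, max_lag + 1):
        a, b = psi[k:], y[:-k]                   # pairs (y_t, y_{t-k})
        qcov = np.mean((a - a.mean()) * (b - b.mean()))
        vals.append(qcov / np.sqrt((tau - tau**2) * b.var()))
    return np.array(vals)

rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(1, 500):                          # AR(1) data just to exercise the function
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
print(qacf(x, tau=0.5, max_lag=5).round(3))
```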

105 citations


Journal ArticleDOI
TL;DR: In this paper, the authors extend the class of score type change point statistics considered in 2007 by Huskova, Praskova, and Steinebach to the vector autoregressive (VAR) case and the epidemic change alternative.
Abstract: The primary contributions of this article are rigorously developed novel statistical methods for detecting change points in multivariate time series. We extend the class of score type change point statistics considered in 2007 by Huskova, Praskova, and Steinebach to the vector autoregressive (VAR) case and the epidemic change alternative. Our proposed procedures do not require the observed time series to actually follow the VAR model. Instead, following the strategy implicitly employed by practitioners, our approach takes model misspecification into account so that our detection procedure uses the model background merely for feature extraction. We derive the asymptotic distributions of our test statistics and show that our procedure has asymptotic power of 1. The proposed test statistics require the estimation of the inverse of the long-run covariance matrix which is particularly difficult in higher-dimensional settings (i.e., where the dimension of the time series and the dimension of the parameter vecto...

98 citations


Journal ArticleDOI
TL;DR: A novel copula-based model is proposed that allows for the non-linear and non-symmetric modeling of serial as well as between-series dependencies and exploits the flexibility of vine copulas.
Abstract: The analysis of multivariate time series is a common problem in areas like finance and economics. The classical tools for this purpose are vector autoregressive models. These however are limited to the modeling of linear and symmetric dependence. We propose a novel copula-based model that allows for the non-linear and non-symmetric modeling of serial as well as between-series dependencies. The model exploits the flexibility of vine copulas, which are built up by bivariate copulas only. We describe statistical inference techniques for the new model and discuss how it can be used for testing Granger causality. Finally, we use the model to investigate inflation effects on industrial production, stock returns and interest rates. In addition, the out-of-sample predictive ability is compared with relevant benchmark models. Copyright © 2014 John Wiley & Sons, Ltd.

77 citations


Journal ArticleDOI
TL;DR: The theory and application of generalized linear autoregressive moving average observation-driven models for time series of counts with explanatory variables and the estimation of these models using the R package glarma are reviewed.
Abstract: We review the theory and application of generalized linear autoregressive moving average observation-driven models for time series of counts with explanatory variables and describe the estimation of these models using the R package glarma. Forecasting, diagnostic and graphical methods are also illustrated by several examples.

74 citations


Journal ArticleDOI
TL;DR: In this paper, a Tobit model with spatial autoregressive interactions is examined; the maximum likelihood estimator is shown to be consistent and asymptotically normally distributed, and Monte Carlo experiments are performed to verify its finite-sample properties.

56 citations


Journal ArticleDOI
TL;DR: It is found that, depending on the person, approximately 30–50% of the total variance was due to measurement error, and that disregarding this measurement error results in a substantial underestimation of the autoregressive parameters.
Abstract: Measurement error is omnipresent in psychological data. However, the vast majority of applications of autoregressive time series analyses in psychology do not take measurement error into account. Disregarding measurement error when it is present in the data results in a bias of the autoregressive parameters. We discuss two models that take measurement error into account: An autoregressive model with a white noise term (AR+WN), and an autoregressive moving average (ARMA) model. In a simulation study we compare the parameter recovery performance of these models, and compare this performance for both a Bayesian and frequentist approach. We find that overall, the AR+WN model performs better. Furthermore, we find that for realistic (i.e., small) sample sizes, psychological research would benefit from a Bayesian approach in fitting these models. Finally, we illustrate the effect of disregarding measurement error in an AR(1) model by means of an empirical application on mood data in women. We find that, depending on the person, approximately 30-50% of the total variance was due to measurement error, and that disregarding this measurement error results in a substantial underestimation of the autoregressive parameters.
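A small synthetic illustration of the attenuation effect described above, assuming an AR(1) latent process observed with additive white noise; the 30-50% error share quoted in the abstract is specific to the authors' mood data, whereas the numbers below come from simulated data.

```python
import numpy as np

rng = np.random.default_rng(1)
n, phi = 2000, 0.7
latent = np.zeros(n)
for t in range(1, n):                                  # latent AR(1) process
    latent[t] = phi * latent[t - 1] + rng.standard_normal()
observed = latent + rng.normal(scale=1.0, size=n)      # add white-noise measurement error

def lag1_autocorr(x):
    x = x - x.mean()
    return np.dot(x[1:], x[:-1]) / np.dot(x, x)

print("true AR coefficient:     ", phi)
print("estimate without error:  ", round(lag1_autocorr(latent), 3))
print("estimate ignoring error: ", round(lag1_autocorr(observed), 3))  # biased toward zero
```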

56 citations


Journal ArticleDOI
TL;DR: A new hybrid forecasting method with a network structure, called the fuzzy-time-series network (FTS-N), is proposed; the applications reveal that FTS-N produces more accurate forecasts for the 11 real-world time-series data sets.
Abstract: Non-probabilistic forecasting methods are commonly used in various scientific fields. Fuzzy-time-series methods are well-known non-probabilistic and nonlinear forecasting methods. Although these methods can produce accurate forecasts, linear autoregressive models can produce forecasts that are more accurate than those produced by fuzzy-time-series methods for some real-world time series. It is well known that hybrid forecasting methods are useful techniques for forecasting time series and that they have the capabilities of their components. In this study, a new hybrid forecasting method is proposed. The components of the new hybrid method are a high-order fuzzy-time-series forecasting model and autoregressive model. The new hybrid forecasting method has a network structure and is called a fuzzy-time-series network (FTS-N). The fuzzy c-means method is used for the fuzzification of time series in FTS-N, which is trained by particle swarm optimization. Istanbul Stock Exchange daily data sets from 2009 to 2013 and the Taiwan Stock Exchange Capitalization Weighted Stock Index data sets from 1999 to 2004 were used to evaluate the performance of FTS-N. The applications reveal that FTS-N produces more accurate forecasts for the 11 real-world time-series data sets.

54 citations


Journal ArticleDOI
TL;DR: It is shown that the GRBF-AR model not only achieves much more parsimonious structure but also much better prediction performance than that of GRBF network.
Abstract: We propose a gradient radial basis function based varying-coefficient autoregressive (GRBF-AR) model for modeling and predicting time series that exhibit nonlinearity and homogeneous nonstationarity. This GRBF-AR model is a synthesis of the gradient RBF and the functional-coefficient autoregressive (FAR) model. The gradient RBFs, which react to the gradient of the series, are used to construct varying coefficients of the FAR model. The Mackey-Glass chaotic time series are used to evaluate the performance of the proposed method. It is shown that the GRBF-AR model not only achieves much more parsimonious structure but also much better prediction performance than that of GRBF network.
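A rough sketch of the idea, under the assumption that each AR coefficient is a linear combination of radial basis functions evaluated at the series gradient (taken here as the first difference), which keeps the model linear in the unknown weights and estimable by least squares; the centres, widths, and data below are illustrative, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 1000, 2
y = np.zeros(n)
for t in range(2, n):                                  # toy nonlinear AR(2) data
    y[t] = (0.5 + 0.3 * np.tanh(y[t - 1] - y[t - 2])) * y[t - 1] - 0.2 * y[t - 2] \
           + 0.1 * rng.standard_normal()

grad = y[p - 1:-1] - y[p - 2:-2]                       # "gradient" of the series: first difference
centers = np.linspace(grad.min(), grad.max(), 4)       # RBF centres (assumed, not the paper's choice)
width = 0.5 * (centers[1] - centers[0])
rbf = np.exp(-(grad[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

# Each AR coefficient is a linear combination of RBFs of the gradient, so the model
# is linear in the unknown weights and can be fitted by ordinary least squares.
lags = np.column_stack([y[p - 1:-1], y[p - 2:-2]])     # y_{t-1}, y_{t-2}
X = np.column_stack([rbf * lags[:, [j]] for j in range(p)])
target = y[p:]
coef, *_ = np.linalg.lstsq(X, target, rcond=None)
fitted = X @ coef
print("in-sample RMSE:", round(float(np.sqrt(np.mean((target - fitted) ** 2))), 4))
```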

51 citations


Proceedings ArticleDOI
10 Aug 2015
TL;DR: This paper develops a dynamic Poisson autoregressive model with exogenous input variables (DPARX) for flu forecasting and applies this model and the corresponding learning method to historical ILI records for 15 countries around the world using a variety of syndromic surveillance data sources.
Abstract: Influenza-like illness (ILI) is among the most common diseases worldwide, and reliable forecasting of it can have significant public health benefits. Recently, new forms of disease surveillance based upon digital data sources have been proposed and are continuing to attract attention over traditional surveillance methods. In this paper, we focus on short-term ILI case count prediction and develop a dynamic Poisson autoregressive model with exogenous input variables (DPARX) for flu forecasting. In this model, we allow the autoregressive model to change over time. In order to control the variation in the model, we construct a model similarity graph to specify the relationship between pairs of models at two time points and embed prior knowledge in terms of the structure of the graph. We formulate ILI case count forecasting as a convex optimization problem, whose objective balances the autoregressive loss and the model similarity regularization induced by the structure of the similarity graph. We then propose an efficient algorithm to solve this problem by block coordinate descent. We apply our model and the corresponding learning method to historical ILI records for 15 countries around the world using a variety of syndromic surveillance data sources. Our approach provides consistently better forecasting results than state-of-the-art models available for short-term ILI case count forecasting.
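As a much-simplified stand-in for DPARX, the sketch below fits a static Poisson autoregression with one exogenous input via a GLM with log link; it omits the paper's time-varying coefficients and similarity-graph regularization, and all variable names and data are synthetic assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
exog = rng.normal(size=n)                              # e.g. a digital surveillance signal (synthetic)
counts = np.zeros(n)
for t in range(1, n):                                  # toy ILI-like count series
    lam = np.exp(0.5 + 0.4 * np.log1p(counts[t - 1]) + 0.3 * exog[t])
    counts[t] = rng.poisson(lam)

# Design matrix: log of lagged counts plus the exogenous input (static coefficients only).
X = sm.add_constant(np.column_stack([np.log1p(counts[:-1]), exog[1:]]))
fit = sm.GLM(counts[1:], X, family=sm.families.Poisson()).fit()
print(fit.params)                                      # intercept, lagged-count effect, exogenous effect
```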

47 citations


Journal ArticleDOI
TL;DR: In this paper, the hysteretic autoregressive model was proposed, which enjoys the piecewise linear structure of a threshold model but has a more flexible regime switching mechanism, and a sufficient condition is given for geometric ergodicity.
Abstract: This paper extends the classical two-regime threshold autoregressive model by introducing hysteresis to its regime-switching structure, which leads to a new model: the hysteretic autoregressive model. The proposed model enjoys the piecewise linear structure of a threshold model but has a more flexible regime switching mechanism. A sufficient condition is given for geometric ergodicity. Conditional least squares estimation is discussed, and the asymptotic distributions of its estimators and information criteria for model selection are derived. Simulation results and an example support the model.
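A minimal simulation of the hysteresis mechanism described above, assuming two AR(1) regimes and an illustrative band of thresholds: the regime changes only when the lagged value exits the band, and is retained while the value stays inside it.

```python
import numpy as np

rng = np.random.default_rng(4)
n, r_lo, r_hi = 500, -0.5, 0.5                         # hysteresis band (illustrative values)
y = np.zeros(n)
regimes = np.zeros(n, dtype=int)
regime = 0                                             # 0 = lower regime, 1 = upper regime
for t in range(1, n):
    # The regime changes only when the lagged value leaves the band [r_lo, r_hi];
    # inside the band the previous regime is retained -- the hysteresis effect.
    if y[t - 1] > r_hi:
        regime = 1
    elif y[t - 1] < r_lo:
        regime = 0
    regimes[t] = regime
    if regime == 0:
        y[t] = 0.3 + 0.6 * y[t - 1] + 0.5 * rng.standard_normal()
    else:
        y[t] = -0.3 - 0.4 * y[t - 1] + 0.5 * rng.standard_normal()
print("share of time in the upper regime:", np.mean(regimes == 1).round(3))
```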

Journal ArticleDOI
TL;DR: It is shown that Bayesian nonparametric VAR (BayesNP-VAR) is a flexible model that is able to account for nonlinear relationships as well as heteroscedasticity in the data and predictively outperforms competing models.
Abstract: Vector autoregressive (VAR) models are the main work-horse model for macroeconomic forecasting, and provide a framework for the analysis of complex dynamics that are present between macroeconomic variables. Whether a classical or a Bayesian approach is adopted, most VAR models are linear with Gaussian innovations. This can limit the model’s ability to explain the relationships in macroeconomic series. We propose a nonparametric VAR model that allows for nonlinearity in the conditional mean, heteroscedasticity in the conditional variance, and non-Gaussian innovations. Our approach differs from that of previous studies by modelling the stationary and transition densities using Bayesian nonparametric methods. Our Bayesian nonparametric VAR (BayesNP-VAR) model is applied to USA and Eurozone macroeconomic time series, and compared to other Bayesian VAR models. We show that BayesNP-VAR is a flexible model that is able to account for nonlinear relationships as well as heteroscedasticity in the data. In terms of short-run out-of-sample predictions, we show that BayesNP-VAR predictively outperforms competing models.

Journal ArticleDOI
TL;DR: In this article, a nonlinear spatial autoregressive model for share data is developed; the authors consider possible instrumental variable (IV) and maximum likelihood (ML) estimation for this model and analyze the asymptotic properties of the IV and ML estimators based on the notion of spatial near-epoch dependence.

Journal ArticleDOI
TL;DR: This work gives general regularity conditions under which the asymptotic null behavior of the corresponding tests, as well as their behavior under alternatives, is derived; the conditions become particularly simple for sufficiently smooth estimating and monitoring functions.

Journal ArticleDOI
TL;DR: In this paper, the performance of statistical and Bayesian combination models with classical single time series models for short-term traffic forecasting is compared, and it is shown that combining models with different degrees of spatio-temporal complexity and exogeneities is most likely to be the best choice in terms of accuracy.
Abstract: This study compares the performance of statistical and Bayesian combination models with classical single time series models for short-term traffic forecasting. Combinations are based on fractionally integrated autoregressive time series models of travel speed with exogenous variables that consider speed's spatio-temporal evolution, and volume and weather conditions. Several statistical hypotheses on the effectiveness of combinations compared to the single models are also tested. Results show that, in the specific application, linear regression combination techniques may provide more accurate forecasts than Bayesian combination models. Moreover, combining models with different degrees of spatio-temporal complexity and exogeneities is most likely to be the best choice in terms of accuracy. Finally, the risk of combining forecasts is lower than the risk of choosing a single model with increased spatio-temporal complexity.
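A toy sketch of regression-based forecast combination in the spirit of the study, assuming two synthetic travel-speed forecasts; the combination weights are estimated by least squares on a training window and applied out of sample. This illustrates only the combination step, not the paper's fractionally integrated models.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 400
speed = 60 + np.cumsum(rng.normal(scale=0.5, size=n))      # toy travel-speed series
f1 = speed + rng.normal(scale=2.0, size=n)                 # forecast from a simple model
f2 = speed + rng.normal(scale=1.0, size=n) + 1.5           # richer but biased forecast

train = slice(0, 300)
X = np.column_stack([np.ones(n), f1, f2])                  # linear regression combination
w, *_ = np.linalg.lstsq(X[train], speed[train], rcond=None)
combo = X @ w

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

test = slice(300, n)
print("model 1: ", round(rmse(f1[test], speed[test]), 2))
print("model 2: ", round(rmse(f2[test], speed[test]), 2))
print("combined:", round(rmse(combo[test], speed[test]), 2))
```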

Journal ArticleDOI
TL;DR: In this article, a general formulation for the univariate nonlinear autoregressive model discussed by Glasbey (Journal of the Royal Statistical Society: Series C, 50(2001), 143-154) in the first order case is presented.
Abstract: This paper presents a general formulation for the univariate nonlinear autoregressive model discussed by Glasbey (Journal of the Royal Statistical Society: Series C, 50(2001), 143-154) in the first order case, and provides a more thorough treatment of its theoretical properties and practical usefulness. The model belongs to the family of mixture autoregressive models but it differs from its previous alternatives in several advantageous ways. A major theoretical advantage is that, by the definition of the model, conditions for stationarity and ergodicity are always met and these properties are much more straightforward to establish than is common in nonlinear autoregressive models. Moreover, for a pth order model an explicit expression of the (p + 1)-dimensional stationary distribution is known and given by a mixture of Gaussian distributions with constant mixing weights. Lower dimensional stationary distributions have a similar form whereas the conditional distribution given the past observations is a Gaussian mixture with time varying mixing weights that depend on p lagged values of the series in a natural way. Due to the known stationary distribution exact maximum likelihood estimation is feasible, and one can assess the applicability of the model in advance by using a nonparametric estimate of the density function. An empirical example with interest rate series illustrates the practical usefulness of the model.
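A brief simulation sketch of the first-order case, assuming two Gaussian AR(1) components: the mixing weights at each step are proportional to the component weights times the component stationary densities evaluated at the previous observation, which is the mechanism the abstract describes. Parameter values are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)
# Two AR(1) components (intercept, AR coefficient, innovation sd) and mixture weights.
comps = [(0.5, 0.8, 0.3), (-0.5, 0.2, 0.5)]
alphas = np.array([0.6, 0.4])

# Stationary mean/sd of each AR(1) component; the time-varying mixing weights are
# proportional to alpha_m times the component's stationary density at y_{t-1}.
stat = [(c / (1 - a), s / np.sqrt(1 - a ** 2)) for c, a, s in comps]

n = 1000
y = np.zeros(n)
for t in range(1, n):
    dens = np.array([norm.pdf(y[t - 1], m, sd) for m, sd in stat])
    w = alphas * dens
    w /= w.sum()
    m = rng.choice(2, p=w)                       # pick a component with time-varying weights
    c, a, s = comps[m]
    y[t] = c + a * y[t - 1] + s * rng.standard_normal()
print("sample mean/sd:", y.mean().round(3), y.std().round(3))
```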

Journal ArticleDOI
TL;DR: In this paper, the authors provide out-of-sample forecasts of linear and non-linear models of US and Census regions housing prices, including point forecasts, but also include interval and density forecasts of the housing price distributions.
Abstract: This paper provides out-of-sample forecasts of linear and non-linear models of US and Census regions housing prices. The forecasts include the traditional point forecasts, but also include interval and density forecasts of the housing price distributions. The non-linear smooth-transition autoregressive model outperforms the linear autoregressive model in point forecasts at longer horizons, but the linear autoregressive model dominates the non-linear smooth-transition autoregressive model at short horizons. In addition, we generally do not find major differences in performance for the interval and density forecasts between the linear and non-linear models. Finally, in a dynamic 25-step ex-ante and interval forecasting design, we, once again, do not find major differences between the linear and nonlinear models.
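For readers unfamiliar with the non-linear model class used here, the following toy sketch shows a logistic smooth-transition autoregression of order one and a one-step point forecast; the transition parameters and data are invented for illustration and are unrelated to the housing-price application.

```python
import numpy as np

rng = np.random.default_rng(7)

def lstar_step(y_prev, gamma=5.0, c=0.0):
    """One step of a logistic smooth-transition AR(1): the AR coefficient moves
    smoothly between two regimes as the transition function G goes from 0 to 1."""
    G = 1.0 / (1.0 + np.exp(-gamma * (y_prev - c)))    # logistic transition function
    phi = 0.8 * (1 - G) + 0.2 * G                      # regime-dependent AR coefficient
    return phi * y_prev + 0.1 * rng.standard_normal()

y = np.zeros(300)
for t in range(1, 300):
    y[t] = lstar_step(y[t - 1])

# Point forecast at horizon 1 given the last observation (noise suppressed):
G = 1.0 / (1.0 + np.exp(-5.0 * (y[-1] - 0.0)))
print("one-step point forecast:", ((0.8 * (1 - G) + 0.2 * G) * y[-1]).round(4))
```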

Proceedings ArticleDOI
31 Aug 2015
TL;DR: This study proposes a novel prediction method that combines a deep belief net (DBN), composed of RBMs and an MLP, with ARIMA; experimental results show the effectiveness of the proposed method.
Abstract: Time series data analysis and prediction are very important to the study of nonlinear phenomena. Studies of time series prediction have a long history; linear models such as the autoregressive integrated moving average (ARIMA) model and nonlinear models such as the multi-layer perceptron (MLP) are well known. As a state-of-the-art method, a deep belief net (DBN) using multiple restricted Boltzmann machines (RBMs) was proposed recently. In this study, we propose a novel prediction method that combines a DBN, built from RBMs and an MLP, with ARIMA. Prediction experiments on real data and chaotic time series were performed, and the results showed the effectiveness of the proposed method.
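A hedged sketch of a two-stage hybrid in the same spirit: a linear ARIMA stage followed by a neural network fitted to the residuals. The sklearn MLP below is a stand-in for the RBM-based deep belief net used in the paper, and the series is synthetic.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
n = 500
t = np.arange(n)
y = 0.5 * np.sin(t / 8.0) + 0.3 * rng.standard_normal(n)   # toy nonlinear series
train, test = y[:450], y[450:]

# Stage 1: a linear ARIMA model captures the linear, autocorrelated part.
arima = ARIMA(train, order=(2, 0, 1)).fit()
resid = arima.resid

# Stage 2: a neural network models what the linear stage left in the residuals
# (sklearn's MLP stands in for the paper's RBM-based deep belief net).
p = 5
X = np.column_stack([resid[i:len(resid) - p + i] for i in range(p)])
target = resid[p:]
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, target)

linear_fc = arima.forecast(steps=len(test))
resid_fc = mlp.predict(resid[-p:].reshape(1, -1))           # one-step residual correction
print("hybrid one-step forecast:", (linear_fc[0] + resid_fc[0]).round(4))
```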

Journal ArticleDOI
TL;DR: In this article, the performance of alternative methods for calculating in-sample confidence bands and out-of-sample forecast bands for time-varying parameters is studied; the bands are applicable to a large class of observation-driven models and a wide range of estimation procedures.
Abstract: We study the performance of alternative methods for calculating in-sample confidence and out-of-sample forecast bands for time-varying parameters. The in-sample bands reflect parameter uncertainty only. The out-of-sample bands reflect both parameter uncertainty and innovation uncertainty. The bands are applicable to a large class of observation driven models and a wide range of estimation procedures. A Monte Carlo study is conducted for time-varying parameter models such as generalized autoregressive conditional heteroskedasticity and autoregressive conditional duration models. Our results show clear differences between the actual coverage provided by the different methods. We illustrate our findings in a volatility analysis for monthly Standard & Poor’s 500 index returns.
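A generic illustration of a simulation-based out-of-sample band, assuming a plain AR(1) instead of the GARCH/ACD models studied in the paper: the parameter is redrawn from its estimated asymptotic distribution (parameter uncertainty) and the innovations are redrawn (innovation uncertainty) before pointwise quantiles are taken.

```python
import numpy as np

rng = np.random.default_rng(9)
n, phi, sigma = 500, 0.6, 1.0
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + sigma * rng.standard_normal()

# Fit AR(1) by OLS and get the usual asymptotic standard error of the estimate.
x, z = y[:-1], y[1:]
phi_hat = np.dot(x, z) / np.dot(x, x)
resid = z - phi_hat * x
sig_hat = resid.std(ddof=1)
se_phi = sig_hat / np.sqrt(np.dot(x, x))

# Simulation-based out-of-sample bands: redraw the parameter from its asymptotic
# distribution (parameter uncertainty) and redraw innovations (innovation
# uncertainty), then take pointwise quantiles of the simulated paths.
H, B = 10, 2000
paths = np.empty((B, H))
for b in range(B):
    phi_b = rng.normal(phi_hat, se_phi)
    level = y[-1]
    for h in range(H):
        level = phi_b * level + sig_hat * rng.standard_normal()
        paths[b, h] = level
bands = np.quantile(paths, [0.05, 0.95], axis=0)
print("90% forecast band at horizon 10:", bands[:, -1].round(3))
```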

Journal ArticleDOI
TL;DR: In this article, the capability of five classes of nonlinear time series models, namely threshold autoregressive (TAR), smooth transition autoregressive (STAR), exponential autoregressive (EXPAR), bilinear (BL), and Markov-switching autoregressive (MSAR) models, to capture the dynamics of the Colorado River discharge time series was investigated.

Journal ArticleDOI
TL;DR: In this article, a first-order mixed integer-valued autoregressive process with zero-inflated generalized power series innovations is proposed to model zero-inflated time series of counts.
Abstract: To model zero-inflated time series of counts, we propose a first-order mixed integer-valued autoregressive process with zero-inflated generalized power series innovations. These innovations contain the commonly used zero-inflated Poisson and geometric distributions. Strict stationarity, ergodicity of the process, and some important probabilistic properties such as the transition probabilities, the k-step ahead conditional mean and variance are obtained. The conditional maximum likelihood estimators for the parameters in this process are derived and the performances of the estimators are studied via simulation. As illustration, an application to an offence data set is given to show the effectiveness of the proposed model.
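A short simulation of the process class described above, assuming zero-inflated Poisson innovations (one member of the zero-inflated generalized power series family) and binomial thinning; parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(10)

def zip_innovation(lam, pi, rng):
    """Zero-inflated Poisson draw: zero with probability pi, else Poisson(lam)."""
    return 0 if rng.random() < pi else rng.poisson(lam)

n, alpha, lam, pi = 1000, 0.4, 2.0, 0.3
x = np.zeros(n, dtype=int)
for t in range(1, n):
    survivors = rng.binomial(x[t - 1], alpha)      # binomial thinning of the previous count
    x[t] = survivors + zip_innovation(lam, pi, rng)

print("mean count:    ", x.mean().round(3))
print("share of zeros:", np.mean(x == 0).round(3))
```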

Journal ArticleDOI
TL;DR: In this paper, the authors investigate the concept of the autoregressive sieve bootstrap for the case of vector autoregressive (VAR) time series and provide a general check criterion, which allows one to decide whether the VAR sieve bootstrap asymptotically works for a specific statistic or not.
Abstract: The concept of the autoregressive sieve bootstrap is investigated for the case of vector autoregressive (VAR) time series. This procedure fits a finite-order VAR model to the given data and generates residual-based bootstrap replicates of the time series. The paper explores the range of validity of this resampling procedure and provides a general check criterion, which allows one to decide whether the VAR sieve bootstrap asymptotically works for a specific statistic or not. In the latter case, we point out the exact reason that causes the bootstrap to fail. The developed check criterion is then applied to some particularly interesting statistics.
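A hand-rolled sketch of the residual-based VAR sieve bootstrap for a bivariate VAR(1), using a cross-lag coefficient as the statistic of interest; the paper's check criterion for when such intervals are asymptotically valid is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(11)
# Simulate a bivariate VAR(1).
A = np.array([[0.5, 0.1], [0.2, 0.4]])
n = 400
y = np.zeros((n, 2))
for t in range(1, n):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.5, size=2)

def fit_var1(data):
    """OLS fit of a VAR(1) without intercept; returns coefficient matrix and residuals."""
    X, Z = data[:-1], data[1:]
    A_hat = np.linalg.lstsq(X, Z, rcond=None)[0].T
    return A_hat, Z - X @ A_hat.T

A_hat, resid = fit_var1(y)

# Sieve bootstrap: regenerate the series from the fitted VAR using resampled
# (centred) residuals, then recompute the statistic of interest on each replicate.
resid = resid - resid.mean(axis=0)
B = 500
stats = np.empty(B)
for b in range(B):
    e = resid[rng.integers(0, len(resid), size=n)]
    yb = np.zeros((n, 2))
    for t in range(1, n):
        yb[t] = A_hat @ yb[t - 1] + e[t]
    stats[b] = fit_var1(yb)[0][0, 1]               # statistic: the cross-lag coefficient A[0, 1]
print("bootstrap 95% interval for A[0,1]:", np.quantile(stats, [0.025, 0.975]).round(3))
```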

Journal ArticleDOI
TL;DR: In this article, a first order non-negative integer valued autoregressive process with power series innovations based on the binomial thinning is introduced, and the main properties of the model are derived, such as mean, variance and the autocorrelation function.
Abstract: In this paper, we introduce a first order non-negative integer valued autoregressive process with power series innovations based on the binomial thinning. This new model contains, as particular cases, several models such as the Poisson INAR(1) model (Al-Osh and Alzaid (J. Time Series Anal. 8 (1987) 261–275)), the geometric INAR(1) model (Jazi, Jones and Lai (J. Iran. Stat. Soc. (JIRSS) 11 (2012) 173–190)) and many others. The main properties of the model are derived, such as mean, variance and the autocorrelation function. Yule–Walker, conditional least squares and conditional maximum likelihood estimators of the model parameters are derived. An extensive Monte Carlo experiment is conducted to evaluate the performances of these estimators in finite samples. Special sub-models are studied in some detail. Applications to two real data sets are given to show the flexibility and potentiality of the new model.
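For the Poisson sub-model, the Yule-Walker estimators mentioned in the abstract take a particularly simple form: the thinning parameter equals the lag-1 autocorrelation, and the innovation mean equals the process mean times one minus the thinning parameter. A quick numerical check on simulated data (assumed parameter values, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(12)
n, alpha, lam = 2000, 0.5, 1.5
x = np.zeros(n, dtype=int)
for t in range(1, n):                                  # Poisson INAR(1) via binomial thinning
    x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)

# Yule-Walker style estimates for the Poisson INAR(1).
xc = x - x.mean()
alpha_yw = np.dot(xc[1:], xc[:-1]) / np.dot(xc, xc)
lam_yw = x.mean() * (1 - alpha_yw)
print("alpha_hat:", alpha_yw.round(3), " lambda_hat:", lam_yw.round(3))
```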

Journal ArticleDOI
TL;DR: In this article, a unified framework for fixed effects (FE) and random effects (RE) estimation of higher-order spatial autoregressive panel data models with heteroscedasticity of unknown form in the idiosyncratic error component was developed.
Abstract: This paper develops a unified framework for fixed effects (FE) and random effects (RE) estimation of higher-order spatial autoregressive panel data models with spatial autoregressive disturbances and heteroscedasticity of unknown form in the idiosyncratic error component. We derive the moment conditions and optimal weighting matrix without distributional assumptions for a generalized moments (GM) estimation procedure of the spatial autoregressive parameters of the disturbance process and define both an RE and an FE spatial generalized two-stage least squares estimator for the regression parameters of the model. We prove consistency of the proposed estimators and derive their joint asymptotic distribution, which is robust to heteroscedasticity of unknown form in the idiosyncratic error component. Finally, we derive a robust Hausman test of the spatial random against the spatial FE model.

Journal ArticleDOI
30 Jun 2015
TL;DR: The research aims to investigate the potential of artificial neural networks (ANN) in solving the forecast task in the most general case, when the time series are non-stationary, using a feed-forward neural architecture: the nonlinear autoregressive network with exogenous inputs.
Abstract: Considering the fact that markets are generally influenced by different external factors, stock market prediction is one of the most difficult tasks in time series analysis. The research reported in this paper aims to investigate the potential of artificial neural networks (ANN) in solving the forecasting task in the most general case, when the time series are non-stationary. We used a feed-forward neural architecture: the nonlinear autoregressive network with exogenous inputs. The network training function used to update the weight and bias parameters corresponds to the gradient descent with adaptive learning rate variant of the backpropagation algorithm. The results obtained using this technique are compared with those resulting from some ARIMA models. We used the mean square error (MSE) measure to evaluate the performance of these two models. The comparative analysis leads to the conclusion that the proposed model can be successfully applied to forecast financial data.
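A small sketch of a NARX-style setup under stated assumptions: an sklearn MLP trained with stochastic gradient descent and an adaptive learning rate on lagged values of a synthetic non-stationary series plus one exogenous input. It mirrors the architecture described above only loosely and uses none of the paper's data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(13)
n, p = 600, 3
exog = rng.normal(size=n)                              # an external market factor (synthetic)
y = np.zeros(n)
for t in range(1, n):                                  # toy non-stationary target series
    y[t] = y[t - 1] + 0.3 * np.sin(y[t - 1]) + 0.2 * exog[t] + 0.1 * rng.standard_normal()

# NARX-style design matrix: p lags of the target plus the current exogenous input.
X = np.column_stack([y[p - 1 - i:n - 1 - i] for i in range(p)] + [exog[p:]])
target = y[p:]
split = 500
net = MLPRegressor(hidden_layer_sizes=(20,), solver='sgd',
                   learning_rate='adaptive', max_iter=3000, random_state=0)
net.fit(X[:split], target[:split])
pred = net.predict(X[split:])
print("test MSE:", np.mean((pred - target[split:]) ** 2).round(4))
```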

Journal ArticleDOI
TL;DR: In this article, a stochastic volatility (SV) model is proposed that combines autoregressive SV models and Bayesian nonparametric modeling to capture long-range dependence of volatility.
Abstract: This article proposes a novel stochastic volatility (SV) model that draws from the existing literature on autoregressive SV models, aggregation of autoregressive processes, and Bayesian nonparametric modeling to create a SV model that can capture long-range dependence. The volatility process is assumed to be the aggregate of autoregressive processes, where the distribution of the autoregressive coefficients is modeled using a flexible Bayesian approach. The model provides insight into the dynamic properties of the volatility. An efficient algorithm is defined which uses recently proposed adaptive Monte Carlo methods. The proposed model is applied to the daily returns of stocks.

Journal ArticleDOI
TL;DR: This letter suggests analyzing the similarities of two TVAR models, sample after sample, by recursively computing the Jeffreys divergence between the joint distributions of the successive values of each TVAR model, and shows that this divergence tends to the Itakura divergence in the stationary case.
Abstract: Autoregressive (AR) and time-varying AR (TVAR) models are widely used in various applications, from speech processing to biomedical signal analysis. Various dissimilarity measures such as the Itakura divergence have been proposed to compare two AR models. However, they do not take into account the variances of the driving processes and only apply to stationary processes. More generally, the comparison between Gaussian processes is based on the Kullback-Leibler (KL) divergence, but only asymptotic expressions are classically used. In this letter, we suggest analyzing the similarities of two TVAR models, sample after sample, by recursively computing the Jeffreys divergence between the joint distributions of the successive values of each TVAR model. Then, we show that, under some assumptions, this divergence tends to the Itakura divergence in the stationary case.
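In the stationary Gaussian case the computation reduces to a symmetrized KL divergence between two multivariate normal distributions whose covariance matrices are Toeplitz; the sketch below evaluates that quantity for two AR(1) models over an increasing number of successive samples, whereas the letter's contribution is the sample-by-sample recursion for the time-varying case.

```python
import numpy as np
from scipy.linalg import toeplitz

def ar1_cov(phi, sigma, k):
    """Covariance matrix of k successive values of a stationary Gaussian AR(1)."""
    var = sigma ** 2 / (1 - phi ** 2)
    return toeplitz(var * phi ** np.arange(k))

def kl_gauss(S1, S2):
    """KL divergence between two zero-mean Gaussians with covariances S1 and S2."""
    k = S1.shape[0]
    inv2 = np.linalg.inv(S2)
    _, ld1 = np.linalg.slogdet(S1)
    _, ld2 = np.linalg.slogdet(S2)
    return 0.5 * (np.trace(inv2 @ S1) - k + ld2 - ld1)

# Jeffreys divergence = symmetrized KL between the joint distributions of the first
# k samples of two stationary AR(1) models (illustrative parameter values).
for k in (1, 5, 20, 50):
    S1, S2 = ar1_cov(0.6, 1.0, k), ar1_cov(0.3, 1.2, k)
    print(k, round(kl_gauss(S1, S2) + kl_gauss(S2, S1), 4))
```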

Proceedings ArticleDOI
01 Dec 2015
TL;DR: A simplified spectrum prediction scheme based on AR modeling, namely the Forward Backward AR (FBAR) model, is proposed; it contributes to the field of Cognitive Radio (CR) by enhancing the prediction accuracy and the secondary-user throughput compared with existing techniques.
Abstract: This paper proposes a simplified spectrum prediction scheme based on AR modeling, namely the Forward Backward AR (FBAR) model, which contributes to the field of Cognitive Radio (CR) by enhancing the prediction accuracy and the secondary-user throughput compared with existing techniques. Simulation results show that the proposed method improves on the nominal performance measures in the literature, with much reduced computational complexity.
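The abstract does not spell out the estimator, so the sketch below shows one common reading of a forward-backward AR fit (the modified-covariance least-squares method), which stacks forward and backward prediction equations; the FBAR scheme proposed in the paper may differ in detail, and the occupancy-like series is synthetic.

```python
import numpy as np

def fb_ar(x, p):
    """Forward-backward (modified covariance) least-squares AR(p) estimate:
    stacks forward and backward linear prediction equations and solves them jointly."""
    n = len(x)
    rows, targets = [], []
    for t in range(p, n):                       # forward: predict x_t from x_{t-1}, ..., x_{t-p}
        rows.append(x[t - p:t][::-1])
        targets.append(x[t])
    for t in range(n - p):                      # backward: predict x_t from x_{t+1}, ..., x_{t+p}
        rows.append(x[t + 1:t + p + 1])
        targets.append(x[t])
    a, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return a

rng = np.random.default_rng(14)
occ = np.zeros(1000)                            # toy channel-occupancy-like series
for t in range(2, 1000):
    occ[t] = 0.7 * occ[t - 1] - 0.2 * occ[t - 2] + 0.3 * rng.standard_normal()
coeffs = fb_ar(occ, p=2)
print("AR coefficients:    ", coeffs.round(3))
print("one-step prediction:", np.dot(coeffs, occ[-1:-3:-1]).round(3))
```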

Journal ArticleDOI
TL;DR: A large family of such models is considered, which generalizes the well-known Markov-switching autoregressive (MS-AR) models by allowing non-homogeneous switching and encompasses threshold autoregressive (TAR) models.
Abstract: Many nonlinear time series models have been proposed in the last decades. Among them, the models with regime switching provide a class of versatile and interpretable models which have received particular attention in the literature. In this paper, we consider a large family of such models which generalizes the well-known Markov-switching autoregressive (MS-AR) models by allowing non-homogeneous switching and encompasses threshold autoregressive (TAR) models. We prove various theoretical results related to the stability of these models and the asymptotic properties of the maximum likelihood estimates (MLE). The ability of the model to capture complex nonlinearities is then illustrated on various time series.
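A minimal simulation of non-homogeneous switching, assuming two AR(1) regimes and a staying probability that depends on an observed covariate through a logistic link; the covariate, link, and parameter values are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(15)
n = 800
z = np.sin(np.arange(n) / 30.0)                        # covariate driving the switching (synthetic)
y = np.zeros(n)
states = np.zeros(n, dtype=int)
state = 0
for t in range(1, n):
    # Non-homogeneous switching: the probability of staying in the current regime
    # depends on the covariate through a logistic link.
    p_stay = 1.0 / (1.0 + np.exp(-(2.0 + 1.5 * z[t])))
    if rng.random() > p_stay:
        state = 1 - state
    states[t] = state
    if state == 0:
        y[t] = 0.8 * y[t - 1] + 0.2 * rng.standard_normal()
    else:
        y[t] = -0.3 * y[t - 1] + 1.0 * rng.standard_normal()
print("share of time in regime 0:", np.mean(states == 0).round(3))
```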

Journal ArticleDOI
TL;DR: In this paper, a dynamic programming (DP) algorithm is applied to automatically segment multivariate time series, and it is proposed that, besides regression by a constant, autoregression should be taken into account.
Abstract: In this paper, a dynamic programming (DP) algorithm is applied to automatically segment multivariate time series. The definition and recursive formulation of segment errors of univariate time series are extended to multivariate time series, so that the DP algorithm is computationally viable for multivariate time series. The order of autoregression and the segmentation are simultaneously determined by Schwarz's Bayesian information criterion. The segmentation procedure is evaluated with artificially synthesized and hydrometeorological multivariate time series. Synthetic multivariate time series are generated by a threshold autoregressive model, and for the real-world multivariate time series experiment we propose that, besides regression by a constant, autoregression should be taken into account. The experimental studies show that the proposed algorithm performs well.
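A compact sketch of the dynamic-programming segmentation idea for the "regression by constant" case: segment costs are sums of squared deviations from segment means across all variables, and the optimal partition into a fixed number of segments is found by the usual DP recursion. The BIC-based choice of the number of segments and of the autoregressive order described above is omitted.

```python
import numpy as np

def segment_cost(X, i, j):
    """Cost of segment X[i:j]: sum of squared deviations from the segment mean
    in every variable ("regression by constant" for a multivariate series)."""
    seg = X[i:j]
    return float(((seg - seg.mean(axis=0)) ** 2).sum())

def dp_segment(X, n_segments):
    """Optimal segmentation into n_segments contiguous pieces by dynamic programming."""
    n = len(X)
    cost = np.full((n_segments + 1, n + 1), np.inf)
    back = np.zeros((n_segments + 1, n + 1), dtype=int)
    cost[0, 0] = 0.0
    for k in range(1, n_segments + 1):
        for j in range(k, n + 1):
            for i in range(k - 1, j):
                c = cost[k - 1, i] + segment_cost(X, i, j)
                if c < cost[k, j]:
                    cost[k, j], back[k, j] = c, i
    # Backtrack the change points.
    cuts, j = [], n
    for k in range(n_segments, 0, -1):
        j = back[k, j]
        cuts.append(j)
    return sorted(cuts[:-1])

rng = np.random.default_rng(16)
X = np.vstack([rng.normal(0, 1, (60, 2)), rng.normal(3, 1, (50, 2)), rng.normal(-2, 1, (70, 2))])
print("estimated change points:", dp_segment(X, 3))    # true change points at 60 and 110
```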