
Showing papers in "Journal of Time Series Analysis in 2012"


Journal ArticleDOI
TL;DR: In this article, the authors consider N panels, each based on T observations, and test whether the means of the panels remain the same during the observation period against the alternative that the means change at an unknown time.
Abstract: We consider N panels and each panel is based on T observations. We are interested in testing whether the means of the panels remain the same during the observation period against the alternative that the means change at an unknown time. We provide tests which are derived from a likelihood argument and are based on an adaptation of the CUSUM method to panel data. Asymptotic distributions are derived under the no-change null hypothesis, and the consistency of the tests is proven under the alternative. The asymptotic results are shown to work in the case of small and moderate sample sizes via Monte Carlo simulations.

158 citations
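As a rough illustration of the CUSUM-for-panels idea above (a sketch only, not the paper's exact statistic or scaling), the snippet below aggregates squared, variance-scaled partial sums of demeaned observations across panels and reports the largest value together with a candidate break date; all names and constants are illustrative.

```python
# Sketch of a CUSUM-type scan for a common break in panel means.
# Assumes the panels are the rows of an (N, T) array and uses crude
# per-panel variance estimates; not the paper's exact normalization.
import numpy as np

def panel_cusum(X):
    """X: (N, T) array of N panels observed over T time points."""
    N, T = X.shape
    Xc = X - X.mean(axis=1, keepdims=True)        # demean each panel
    S = np.cumsum(Xc, axis=1)[:, :-1]             # partial sums at k = 1,...,T-1
    sigma2 = X.var(axis=1, ddof=1)[:, None]       # rough per-panel variance
    stat_k = (S ** 2 / (sigma2 * T)).sum(axis=0)  # aggregate over panels
    k_hat = int(np.argmax(stat_k)) + 1            # candidate change point
    return stat_k.max(), k_hat

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 100))
X[:, 60:] += 0.5                                  # common mean shift at t = 60
print(panel_cusum(X))
```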


Journal ArticleDOI
TL;DR: In this article, a stationary first-order non-negative integer-valued autoregressive process with zero-inflated Poisson innovations is proposed to model the counts of events at consecutive points in time.
Abstract: The first-order non-negative integer-valued autoregressive process has been applied to model the counts of events at consecutive points in time. It is known that, if the innovations are assumed to follow a Poisson distribution, then the marginal model is also Poisson. This model may, however, not be suitable for overdispersed count data. One frequent manifestation of overdispersion is that the incidence of zero counts is greater than expected from a Poisson model. In this paper, we introduce a new stationary first-order integer-valued autoregressive process with zero-inflated Poisson innovations. We derive some structural properties such as the mean, variance, marginal and joint distribution functions of the process. We consider estimation of the unknown parameters by conditional or approximate full maximum likelihood. We use simulation to study the limiting marginal distribution of the process and the performance of our fitting algorithms. Finally, we demonstrate the usefulness of the proposed model by analyzing some real time series on animal health laboratory submissions.

111 citations
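A minimal simulation sketch of the model class discussed above, assuming binomial thinning and zero-inflated Poisson (ZIP) innovations; the parameter values are arbitrary and the code is not taken from the paper.

```python
# INAR(1) with ZIP innovations: X_t = alpha o X_{t-1} + e_t, where
# e_t equals 0 with probability pi and is Poisson(lam) otherwise.
import numpy as np

def simulate_inar1_zip(n, alpha=0.4, pi=0.3, lam=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(n, dtype=int)
    for t in range(1, n):
        thinned = rng.binomial(x[t - 1], alpha)               # binomial thinning
        innov = 0 if rng.random() < pi else rng.poisson(lam)  # ZIP innovation
        x[t] = thinned + innov
    return x

x = simulate_inar1_zip(500)
print(x[:20])
print(x.mean(), x.var())   # compare: ZIP innovations typically induce overdispersion
```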


Journal ArticleDOI
Zhou Zhou
TL;DR: This article proposes and theoretically verifies a subsampling methodology for the inference of the sample ADCF for dependent data, providing a useful tool for exploring nonlinear dependence structures in time series.
Abstract: We extend the concept of distance correlation of Szekely et al. (2007) and propose the auto distance correlation function (ADCF) to measure the temporal dependence structure of time-series. Unlike the classic measures of correlations such as the autocorrelation function, the proposed measure is zero if and only if the measured time-series components are independent. In this article, we propose and theoretically verify a subsampling methodology for the inference of sample ADCF for dependent data. Our methodology provides a useful tool for exploring nonlinear dependence structures in time-series.

107 citations
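The sample ADCF can be illustrated directly from the definition of distance correlation; the sketch below (an illustrative implementation, not the authors' code or their subsampling inference) applies distance correlation to the lagged pairs (X_t, X_{t+h}) of an ARCH-type series that is uncorrelated but dependent.

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation of Szekely et al. (2007)."""
    x = np.asarray(x, float)[:, None]
    y = np.asarray(y, float)[:, None]
    a, b = np.abs(x - x.T), np.abs(y - y.T)                    # pairwise distances
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()          # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2, dvx, dvy = (A * B).mean(), (A * A).mean(), (B * B).mean()
    denom = np.sqrt(dvx * dvy)
    return 0.0 if denom == 0 else np.sqrt(max(dcov2, 0.0) / denom)

def adcf(x, max_lag):
    """Auto distance correlation at lags 1,...,max_lag."""
    return np.array([distance_correlation(x[:-h], x[h:]) for h in range(1, max_lag + 1)])

rng = np.random.default_rng(1)
n, x = 400, np.zeros(400)
for t in range(1, n):
    x[t] = rng.normal() * np.sqrt(0.2 + 0.7 * x[t - 1] ** 2)   # ARCH(1): uncorrelated, dependent
print(adcf(x, 10).round(3))
```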


Journal ArticleDOI
TL;DR: In this paper, a testing procedure for determining the order p in the FAR(p) model is described; at its core is the representation of the FAR(p) process as a fully functional linear model with dependent regressors.
Abstract: This chapter is concerned with determining the order p in the FAR(p) model $$Z_{i} = \sum\limits_{j = 1}^{p}\phi_j(Z_{i - j}) + \varepsilon_i.$$ We describe a testing procedure proposed by Kokoszka and Reimherr (2011). At its core is the representation of the FAR(p) process as a fully functional linear model with dependent regressors.

73 citations
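To make the model concrete, here is a small simulation sketch of a FAR(1) sample path on a grid, with the integral operator approximated by a Riemann sum; the kernel, its scaling and the noise curves are illustrative assumptions, not the construction used by Kokoszka and Reimherr (2011).

```python
# FAR(1): Z_i(t) = \int psi(t, s) Z_{i-1}(s) ds + eps_i(t), discretized on [0, 1].
import numpy as np

rng = np.random.default_rng(11)
m = 50                                                 # grid points on [0, 1]
grid = np.linspace(0, 1, m)
dx = grid[1] - grid[0]

psi = np.exp(-np.abs(grid[:, None] - grid[None, :]))   # kernel psi(t, s)
psi *= 0.5 / np.sqrt((psi ** 2).sum() * dx * dx)       # Hilbert-Schmidt norm 0.5 < 1

N = 100
Z = np.zeros((N, m))
for i in range(1, N):
    integral = (psi * Z[i - 1][None, :]).sum(axis=1) * dx   # Riemann sum over s
    noise = np.cumsum(rng.normal(size=m)) * np.sqrt(dx)     # rough Brownian error curve
    Z[i] = integral + noise

print(Z.shape, float(np.abs(Z).max()))
```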


Journal ArticleDOI
TL;DR: In this paper, a new portmanteau diagnostic test for vector autoregressive moving average (VARMA) models that is based on the determinant of the standardized multivariate residual autocorrelations is derived.
Abstract: A new portmanteau diagnostic test for vector autoregressive moving average (VARMA) models that is based on the determinant of the standardized multivariate residual autocorrelations is derived. The new test statistic may be considered an extension of the univariate portmanteau test statistic suggested by Peña and Rodríguez (2002). The asymptotic distribution of the test statistic is derived as well as a chi-square approximation. However, the Monte Carlo test is recommended unless the series is very long. Extensive simulation experiments demonstrate the usefulness of this test as well as its improved power performance compared with widely used previous multivariate portmanteau diagnostic checks. Two illustrative applications are given.

62 citations


Journal ArticleDOI
TL;DR: In this article, a test statistic of cumulative sum type for general Poisson autoregressions of order 1 was proposed to detect a changepoint in the structure of an integer-valued time series.
Abstract: In this article, we discuss the problem of testing for a changepoint in the structure of an integer-valued time series. In particular, we consider a test statistic of cumulative sum type for general Poisson autoregressions of order 1. We investigate the asymptotic behaviour of conditional least-squares estimates of the parameters in the presence of a changepoint. Then, we derive the asymptotic distribution of the test statistic under the hypothesis of no change, allowing for the calculation of critical values. We prove consistency of the test, that is, asymptotic power 1, and consistency of the corresponding changepoint estimate. As an application, we have a look at changepoint detection in daily epileptic seizure counts from a clinical study.

62 citations
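A hedged sketch of the general recipe (not the article's exact test statistic or normalization): estimate a first-order Poisson autoregression with intensity lambda_t = d + a*X_{t-1} by conditional least squares and scan the cumulative sums of the resulting residuals.

```python
import numpy as np

def cusum_poisson_ar1(x):
    y, z = x[1:].astype(float), x[:-1].astype(float)
    Z = np.column_stack([np.ones_like(z), z])
    d_hat, a_hat = np.linalg.lstsq(Z, y, rcond=None)[0]   # conditional least squares
    resid = y - (d_hat + a_hat * z)
    S = np.cumsum(resid - resid.mean())                   # residual CUSUM
    stat = np.abs(S).max() / (resid.std(ddof=1) * np.sqrt(len(resid)))
    return stat, int(np.abs(S).argmax()) + 1              # statistic, candidate break

rng = np.random.default_rng(2)
x = np.zeros(400, dtype=int)
for t in range(1, 400):
    lam = (1.0 if t < 250 else 2.5) + 0.4 * x[t - 1]      # intensity changes at t = 250
    x[t] = rng.poisson(lam)
print(cusum_poisson_ar1(x))
```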


Journal ArticleDOI
TL;DR: In this paper, the authors prove the consistency of the averaged periodogram estimator (APE) in two new cases: for negative memory parameters, after suitable tapering, and for a power law in the cross-spectrum and therefore in the coherency, provided that sufficiently many frequencies are used in estimation.
Abstract: We prove the consistency of the averaged periodogram estimator (APE) in two new cases. First, we prove that the APE is consistent for negative memory parameters, after suitable tapering. Second, we prove that the APE is consistent for a power law in the cross-spectrum and therefore for a power law in the coherency, provided that sufficiently many frequencies are used in estimation. Simulation evidence suggests that the lower bound on the number of frequencies is a necessary condition for consistency. For a Taylor series approximation to the estimator of the power law in the cross-spectrum, we consider the rate of convergence, and obtain a central limit theorem under suitable regularity conditions.

45 citations


Journal ArticleDOI
TL;DR: In this article, the authors proposed a methodology for approximating the full posterior distribution of various change point characteristics in the presence of parameter uncertainty, which does not require estimates of the underlying state sequence.
Abstract: Quantifying the uncertainty in the location and nature of change points in time series is important in a variety of applications. Many existing methods for estimation of the number and location of change points fail to capture fully or explicitly the uncertainty regarding these estimates, whilst others require explicit simulation of large vectors of dependent latent variables. This article proposes methodology for approximating the full posterior distribution of various change point characteristics in the presence of parameter uncertainty. The methodology combines recent work on evaluation of exact change point distributions conditional on model parameters via finite Markov chain imbedding in a hidden Markov model setting, and accounting for parameter uncertainty and estimation via Bayesian modelling and sequential Monte Carlo. The combination of the two leads to a flexible and computationally efficient procedure, which does not require estimates of the underlying state sequence. We illustrate that good estimation of the posterior distributions of change point characteristics is provided for simulated data and functional magnetic resonance imaging data. We use the methodology to show that the modelling of relevant physical properties of the scanner can influence detection of change points and their uncertainty.

43 citations


Journal ArticleDOI
TL;DR: In this paper, the authors revisited a time series model introduced by McElroy and Politis (2007a) and generalized it in several ways to encompass a wider class of stationary, nonlinear, heavy-tailed time series with long memory.
Abstract: In this article, we revisit a time series model introduced by McElroy and Politis (2007a) and generalize it in several ways to encompass a wider class of stationary, nonlinear, heavy-tailed time series with long memory. The joint asymptotic distribution for the sample mean and sample variance under the extended model is derived; the associated convergence rates are found to depend crucially on the tail thickness and long memory parameter. A self-normalized sample mean that concurrently captures the tail and memory behaviour is defined. Its asymptotic distribution is approximated by subsampling without knowledge of the tail and/or memory parameters; a result of independent interest regarding subsampling consistency for certain long-range dependent processes is provided. The subsampling-based confidence intervals for the process mean are shown to have good empirical coverage rates in a simulation study. The influence of block size on the coverage and the performance of a data-driven rule for block size selection are assessed. The methodology is further applied to the series of packet counts from ethernet traffic traces.

42 citations


Journal ArticleDOI
TL;DR: In this article, the authors developed testing procedures for the detection of structural changes in nonlinear autoregressive processes, where the regression function is modeled by a single layer feed-forward neural network.
Abstract: In this paper we develop testing procedures for the detection of structural changes in nonlinear autoregressive processes. For the detection procedure we model the regression function by a single-layer feed-forward neural network. We show that CUSUM-type tests based on cumulative sums of estimated residuals, which have been intensively studied for linear regression, can be extended to this case. The limit distribution under the null hypothesis is obtained, which is needed to construct asymptotic tests. For a large class of alternatives it is shown that the tests have asymptotic power one. In this case, we obtain a consistent change-point estimator which is related to the test statistics. Power and size are further investigated in a small simulation study, with particular emphasis on situations where the model is misspecified, i.e. the data are not generated by a neural network but by some other regression function. As an illustration, applications to the Nile data set and to S&P log-returns are given.

36 citations
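An illustrative sketch of the testing idea under stated assumptions: a single-hidden-layer network (scikit-learn's MLPRegressor as a stand-in) is fitted to the lagged series and a CUSUM statistic is formed from the estimated residuals. The scaling is a simple normalization, not the paper's limit-theory construction.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n = 600
x = np.zeros(n)
for t in range(1, n):
    shift = 0.0 if t < 400 else 1.0                        # structural change at t = 400
    x[t] = np.tanh(0.8 * x[t - 1]) + shift + 0.3 * rng.normal()

X_lag, y = x[:-1].reshape(-1, 1), x[1:]
net = MLPRegressor(hidden_layer_sizes=(5,), max_iter=5000, random_state=0).fit(X_lag, y)
resid = y - net.predict(X_lag)                             # estimated residuals

S = np.cumsum(resid - resid.mean())
T_n = np.abs(S).max() / (resid.std(ddof=1) * np.sqrt(len(resid)))
print(T_n, int(np.abs(S).argmax()))                        # CUSUM statistic, candidate break
```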


Journal ArticleDOI
TL;DR: In this article, the authors describe some specific challenges that await the attention of statisticians and applied probabilists, relevant aspects of the physical theory, current inferential efforts and simulation aspects of a central model for the dynamics of nano-scale particles in viscoelastic fluids, the generalized Langevin equation.
Abstract: Microrheology is the study of the properties of a complex fluid through the diffusion dynamics of small particles, typically latex beads, moving through that material. Currently, it is the dominant technique in the study of the physical properties of biological fluids, of the material properties of membranes or the cytoplasm of cells, or of the entire cell. The theoretical underpinning of microrheology was given in Mason and Weitz (Physical Review Letters; 1995), who introduced a framework for the use of path data of diffusing particles to infer viscoelastic properties of its fluid environment. The multi-particle tracking techniques that were subsequently developed have presented numerous challenges for experimentalists and theoreticians. This study describes some specific challenges that await the attention of statisticians and applied probabilists. We describe relevant aspects of the physical theory, current inferential efforts and simulation aspects of a central model for the dynamics of nano-scale particles in viscoelastic fluids, the generalized Langevin equation.

Journal ArticleDOI
TL;DR: In this paper, a similarity-based approach to non-stationary autoregression is proposed, which can display characteristics consistent with stationary, unit root and explosive behaviour, depending on the similarity between the dependent variable and its past values.
Abstract: We suggest in this article a similarity-based approach to time-varying coefficient non-stationary autoregression. In a given sample, the model can display characteristics consistent with stationary, unit root and explosive behaviour, depending on the similarity between the dependent variable and its past values. We establish consistency of the quasi-maximum likelihood estimator of the model, with a general norming factor. Asymptotic score-based hypothesis tests are derived. The model is applied to a data set of dual-listed stocks traded on NASDAQ and the Tokyo Stock Exchange.

Journal ArticleDOI
TL;DR: In this article, it is shown that the Fokker-planck-equation can be solved numerically by using distributed approximating functionals (DAFs), and the approximation is very accurate and resolves the time interval between observations in one calculation step.
Abstract: This paper introduces a new method for nonlinear continuous-discrete filtering. It is shown that the Fokker-Planck-Equation can be solved numerically by using distributed approximating functionals (DAFs). The approximation is very accurate and resolves the time interval between observations in one calculation step. Additionally, the operator matrix has to be evaluated only once and not necessarily online. Therefore the method is very efficient.

Journal ArticleDOI
TL;DR: In this paper, a mixed integer-valued autoregressive model of order p is proposed and the existence of this unique, stationary and ergodic process is proved and its autocorrelation structure and some conditional stochastic characteristics are derived.
Abstract: A mixed integer-valued autoregressive model of order p is proposed. The existence of this unique, stationary and ergodic process is proved and its autocorrelation structure and some conditional stochastic characteristics are derived. Model parameters are estimated via Yule-Walker, conditional least squares and conditional maximum likelihood methods. Finally, possible application of the model to real data sets is considered.

Journal ArticleDOI
TL;DR: A new automatic procedure for the PAR model selection problem based on the genetic algorithm is proposed, with the Bayesian information criterion used as the tool to identify the order of the PAR model.
Abstract: Periodic autoregressive (PAR) models extend the classical autoregressive models by allowing the parameters to vary with seasons. Selecting PAR time-series models can be computationally expensive, and the results are not always satisfactory. In this article, we propose a new automatic procedure for the model selection problem by using the genetic algorithm. The Bayesian information criterion is used as a tool to identify the order of the PAR model. The success of the proposed procedure is illustrated in a small simulation study, and an application with monthly data is presented.
Keywords: Periodic time series; identification; genetic algorithms; parameter constraints; BIC.

Journal ArticleDOI
TL;DR: In this paper, the authors consider a process X = (X_t)_{t∈Z} belonging to a large class of causal models including AR(∞), ARCH(∞), TARCH(∞), ... models and consider the problem of testing for change in the parameter.
Abstract: We consider a process X = (X_t)_{t∈Z} belonging to a large class of causal models including AR(∞), ARCH(∞), TARCH(∞), ... models. We assume that the model depends on a parameter $\theta_0 \in \mathbb{R}^d$ and consider the problem of testing for change in the parameter. Two statistics $\hat{Q}^{(1)}$ and $\hat{Q}^{(2)}$ are constructed using the quasi-maximum likelihood estimator (QMLE) of the parameter. Under the null hypothesis that there is no change, it is shown that each of these two statistics weakly converges to the supremum of the sum of the squares of independent Brownian bridges. Under the local alternative that there is one change, we show that the test statistic $\hat{Q}_n = \max(\hat{Q}^{(1)}, \hat{Q}^{(2)})$ diverges to infinity, so the test is consistent.

Journal ArticleDOI
TL;DR: The purpose of this article is to propose and analyse two new spectral estimation methods that are based on the sample autocovariances in a nonlinear way, and the rate of convergence of the new estimators is quantified.
Abstract: Traditional kernel spectral density estimators are linear as a function of the sample autocovariance sequence. The purpose of the present paper is to propose and analyze two new spectral estimation methods that are based on the sample autocovariances in a nonlinear way. The rate of convergence of the new estimators is quantified, and practical issues such as bandwidth and/or threshold choice are addressed. The new estimators are also compared to traditional ones using flat-top lag-windows in a simulation experiment involving sparse time series models.
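To illustrate the contrast being drawn, the sketch below computes a linear flat-top (trapezoidal) lag-window estimator alongside a simple nonlinear variant that hard-thresholds small sample autocovariances; the threshold rule and bandwidth are illustrative choices, not the ones analysed in the article.

```python
import numpy as np

def sample_acov(x, max_lag):
    x = x - x.mean()
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

def flat_top_weights(lags, l):
    """Trapezoidal flat-top taper: 1 on [0, l], linear decay to 0 at 2l."""
    return np.clip(2.0 - np.abs(lags) / float(l), 0.0, 1.0)

def spectral_estimates(x, l, freqs):
    n = len(x)
    max_lag = min(2 * l, n - 1)
    g = sample_acov(x, max_lag)
    lags = np.arange(max_lag + 1)
    w_lin = flat_top_weights(lags, l)                      # linear: flat-top lag window
    thr = 2.0 * g[0] * np.sqrt(np.log(n) / n)              # illustrative threshold level
    w_nl = (np.abs(g) > thr).astype(float)                 # nonlinear: keep "large" acovs only
    def f_hat(w):
        return np.array([(g[0] * w[0] + 2 * np.sum(w[1:] * g[1:] * np.cos(lags[1:] * om)))
                         / (2 * np.pi) for om in freqs])
    return f_hat(w_lin), f_hat(w_nl)

rng = np.random.default_rng(4)
e = rng.normal(size=501)
x = e[1:] + 0.8 * e[:-1]                                   # a simple MA(1) series
freqs = np.linspace(0.0, np.pi, 8)
f_lin, f_nl = spectral_estimates(x, l=10, freqs=freqs)
print(f_lin.round(3), f_nl.round(3))
```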

Journal ArticleDOI
TL;DR: In this article, a modified version of the AIC (Akaike information criterion) is proposed, which requires the estimation of the matrices involved in the asymptotic variance of the quasi-maximum likelihood estimator of these models.
Abstract: This article considers the problem of order selection for vector autoregressive moving-average models, and for the sub-class of vector autoregressive models, under the assumption that the errors are uncorrelated but not necessarily independent. We propose a modified version of the AIC (Akaike information criterion). This criterion requires the estimation of the matrices involved in the asymptotic variance of the quasi-maximum likelihood estimator of these models. Monte Carlo experiments show that the proposed modified criterion estimates the model orders more accurately than the standard AIC and AICc (corrected AIC) in large samples and often in small samples.

Journal ArticleDOI
TL;DR: In this paper, the serial dependences between the observed time series and the lagged series, taken into account one-by-one, are graphically analyzed by what we have chosen to call the autodependogram.
Abstract: In this article the serial dependences between the observed time series and the lagged series, taken into account one-by-one, are graphically analysed by what we have chosen to call the ‘autodependogram’. This tool is a sort of natural nonlinear counterpart of the well-known autocorrelogram used in the linear context. The autodependogram is based on the simple idea of using, instead of autocorrelations at varying time lags, the χ2-test statistics applied to convenient contingency tables. The efficacy of this graphical device is confirmed by real and artificial time series and by simulations from certain classes of well-known models, characterized by randomness and by different kinds of linear and nonlinear dependences.
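The construction lends itself to a few lines of code: bin the series, cross-tabulate X_t against X_{t-k} for each lag k and record the Pearson χ² statistic. The binning (marginal quintiles) and the number of lags below are illustrative choices, not the article's settings.

```python
import numpy as np

def chi2_stat(table):
    """Pearson chi-square statistic of a contingency table."""
    row, col = table.sum(1, keepdims=True), table.sum(0, keepdims=True)
    expected = row * col / table.sum()
    mask = expected > 0
    return (((table - expected) ** 2)[mask] / expected[mask]).sum()

def autodependogram(x, max_lag, n_bins=5):
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))[1:-1]
    cats = np.digitize(x, edges)                          # bin labels 0,...,n_bins-1
    stats = []
    for k in range(1, max_lag + 1):
        table = np.zeros((n_bins, n_bins))
        np.add.at(table, (cats[k:], cats[:-k]), 1)        # cross-tabulate X_t vs X_{t-k}
        stats.append(chi2_stat(table))
    return np.array(stats)

rng = np.random.default_rng(5)
n, x = 500, np.zeros(500)
for t in range(1, n):
    x[t] = np.sin(x[t - 1]) + 0.5 * rng.normal()          # a mildly nonlinear AR(1)
print(autodependogram(x, 10).round(1))
```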

Journal ArticleDOI
TL;DR: In this paper, statistical tests are introduced for distinguishing between short-range dependent time series with a single change in mean and long-range dependent time series, with the former constituting the null hypothesis.
Abstract: Statistical tests are introduced for distinguishing between short-range dependent time series with a single change in mean and long-range dependent time series, with the former constituting the null hypothesis. The tests are based on estimation of the self-similarity parameter after removing the change in mean from the series. The focus is on the GPH (Geweke and Porter-Hudak, 1983) and local Whittle estimation methods in the spectral domain. Theoretical properties of the resulting estimators are established when testing for a single change in mean, and small sample properties of the tests are examined in simulations. The introduced tests improve on the BHKS (Berkes et al., 2006) test, which is the only other available test for the considered problem. It is argued that the BHKS test has low power against long-range dependence alternatives and that this happens because the BHKS test statistic involves estimation of the long-run variance. The BHKS test could be improved readily by considering its R/S-like regression version, which estimates the self-similarity parameter and does not involve the long-run variance. Still better alternatives are to use more powerful estimation methods (such as GPH or local Whittle), and these lead to the tests introduced here.
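The two ingredients of the tests can be sketched as follows: remove an estimated single change in mean (break date chosen here by a plain CUSUM scan) and then run the GPH log-periodogram regression on the adjusted series. The bandwidth m = n^(1/2) and the example are illustrative, not the paper's recommendations or its test calibration.

```python
import numpy as np

def remove_single_mean_change(x):
    """Demean the two segments split at a CUSUM-estimated break date."""
    S = np.cumsum(x - x.mean())
    k = int(np.argmax(np.abs(S[:-1]))) + 1
    out = x.astype(float).copy()
    out[:k] -= out[:k].mean()
    out[k:] -= out[k:].mean()
    return out

def gph_estimate(x, m=None):
    """GPH log-periodogram regression estimate of the memory parameter d."""
    n = len(x)
    m = m or int(np.sqrt(n))
    I = np.abs(np.fft.fft(x - x.mean())) ** 2 / (2 * np.pi * n)   # periodogram
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    y = np.log(I[1:m + 1])
    z = np.log(4 * np.sin(lam / 2) ** 2)
    return -np.polyfit(z, y, 1)[0]                                # slope equals -d

rng = np.random.default_rng(6)
x = rng.normal(size=1000)
x[500:] += 1.0                                     # short memory plus a mean shift
print(gph_estimate(x))                             # spuriously large "memory" before adjustment
print(gph_estimate(remove_single_mean_change(x)))  # close to 0 after removing the shift
```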

Journal ArticleDOI
TL;DR: In this article, the authors developed a likelihood ratio (LR) test procedure for discriminating between a short-memory time series with a change-point (CP) and a long-memory (LM) time series.
Abstract: We develop a likelihood ratio (LR) test procedure for discriminating between a short-memory time series with a change-point (CP) and a long-memory (LM) time series. Under the null hypothesis, the time series consists of two segments of short-memory time series with different means and possibly different covariance functions. The location of the shift in the mean is unknown. Under the alternative, the time series has no shift in mean but rather is LM. The LR statistic is defined as the normalized log-ratio of the Whittle likelihood between the CP model and the LM model, and is asymptotically normally distributed under the null. The LR test provides a parametric alternative to the CUSUM test proposed by Berkes et al. (2006). Moreover, the LR test is more general than the CUSUM test in the sense that it is applicable to changes in marginal or dependence features other than the mean. We show its good performance in simulations and apply it to two data examples.

Journal ArticleDOI
TL;DR: In this paper, a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting is proposed, where the joint posterior distribution of the model parameters and future values is well defined.
Abstract: This paper proposes a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. We establish that the joint posterior distribution of the model parameters and future values is well defined. The associated Markov chain Monte Carlo algorithm for parameter estimation and forecasting converges to the posterior distribution quickly. We also present a forecast-combination technique that produces more accurate out-of-sample forecasts by using a weighted sequence of fitted QAR models. A moving-window method for checking the quality of the estimated conditional quantiles is developed. We verify our methodology using simulation studies and then apply it to currency exchange rate data. The results obtained show that an unequally weighted combining method performs better than other forecasting methodologies.
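For readers unfamiliar with the QAR specification, the sketch below fits quantile-specific autoregressions by ordinary (frequentist) quantile regression with statsmodels, purely to show the model structure; it does not reproduce the article's Bayesian MCMC estimation or its forecast-combination scheme.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(10)
n = 800
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_t(df=4)     # heavy-tailed AR(1) as toy data

y, X = x[1:], sm.add_constant(x[:-1])                # regress X_t on (1, X_{t-1})
for tau in (0.1, 0.5, 0.9):
    fit = QuantReg(y, X).fit(q=tau)
    print(tau, fit.params.round(3))                  # quantile-specific intercept and slope
```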

Journal ArticleDOI
TL;DR: In this paper, the problem of testing for a copula parameter change based on the cusum test is considered and the authors first handle this issue in i.i.d. samples and extend it to semiparametric copula ARMA-GARCH models.
Abstract: In this article, we consider the problem of testing for a copula parameter change based on the cusum test. We first handle this issue in i.i.d. samples and extend it to semiparametric copula ARMA-GARCH models. We construct the cusum test based on pseudo maximum likelihood estimation of the copula parameter and derive its limiting null distribution. Simulation results are reported for illustration.

Journal ArticleDOI
TL;DR: In this paper, the empirical likelihood method for long-memory time series models is used to obtain an empirical likelihood ratio which is shown to be asymptotically chi-square distributed.
Abstract: This article studies the empirical likelihood method for long-memory time series models. By virtue of the Whittle likelihood, one obtains a score function that can be viewed as an estimating equation of the parameters of a fractional integrated autoregressive moving average (ARFIMA) model. This score function is used to obtain an empirical likelihood ratio which is shown to be asymptotically chi-square distributed. Confidence regions for the parameters are constructed based on the asymptotic distribution of the empirical likelihood ratio. Bartlett correction and finite sample properties of the empirical likelihood confidence regions are examined.

Journal ArticleDOI
TL;DR: A non‐parametric theory for smoothing and prediction in the time domain for circular time‐series data is proposed based on local constant and local linear fitting estimates of a minimizer of an angular risk function.
Abstract: Not much research has been done in the field of circular time-series analysis. We propose a non-parametric theory for smoothing and prediction in the time domain for circular time-series data. Our model is based on local constant and local linear fitting estimates of a minimizer of an angular risk function. Both asymptotic arguments and empirical examples are used to describe the accuracy of our methods.
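A minimal local-constant version of such a smoother, under the usual cosine-based angular risk, reduces to a kernel-weighted circular mean; the Gaussian kernel and bandwidth below are illustrative assumptions, not the article's asymptotic prescriptions.

```python
import numpy as np

def circular_local_constant(times, theta, grid, h):
    """Fitted angle at each grid point = kernel-weighted circular mean."""
    fitted = []
    for t0 in grid:
        w = np.exp(-0.5 * ((times - t0) / h) ** 2)          # Gaussian kernel in time
        fitted.append(np.arctan2(np.sum(w * np.sin(theta)),
                                 np.sum(w * np.cos(theta))))
    return np.array(fitted)

rng = np.random.default_rng(7)
t = np.arange(200.0)
theta = np.angle(np.exp(1j * (0.05 * t + 0.4 * rng.normal(size=t.size))))   # wrapped noisy drift
grid = np.linspace(0.0, 199.0, 50)
print(circular_local_constant(t, theta, grid, h=5.0)[:5].round(2))
```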

Journal ArticleDOI
TL;DR: A feasible and efficient Bayesian estimation method for nonlinear and non‐Gaussian state space models based on Max‐stable processes and a Markov chain Monte Carlo algorithm where the sampling efficiency is improved by the normal mixture sampler are proposed and shown to be highly accurate.
Abstract: Extreme values are often correlated over time, for example, in a financial time series, and these values carry various risks. Max-stable processes such as maxima of moving maxima (M3) processes have been recently considered in the literature to describe time-dependent dynamics, which have been difficult to estimate. This article first proposes a feasible and efficient Bayesian estimation method for nonlinear and non-Gaussian state space models based on these processes and describes a Markov chain Monte Carlo algorithm where the sampling efficiency is improved by the normal mixture sampler. Furthermore, a unique particle filter that adapts to extreme observations is proposed and shown to be highly accurate in comparison with other well-known filters. Our proposed algorithms were applied to daily minima of high-frequency stock return data, and a model comparison was conducted using marginal likelihoods to investigate the time-dependent dynamics in extreme stock returns for financial risk management.

Journal ArticleDOI
TL;DR: In this article, the authors developed a method for estimating the kernel function of a continuous-time moving average (CMA) process Y which takes advantage of the high frequency of the data.
Abstract: Interest in continuous-time processes has increased rapidly in recent years, largely because of the high-frequency data available in many areas of application, particularly in finance and turbulence. We develop a method for estimating the kernel function of a continuous-time moving average (CMA) process Y which takes advantage of the high frequency of the data. In order to do so we examine the relation between the CMA process Y and the discrete-time process $Y^\Delta$ obtained by sampling Y at times which are integer multiples of some small positive $\Delta$. In particular we derive asymptotic results as $\Delta\downarrow 0$ which generalize results of Brockwell, Ferrazzano and Klüppelberg (2011) for high-frequency sampling of CARMA processes. We propose an estimator of the continuous-time kernel based on observations of $Y^\Delta$, investigate its properties and illustrate its performance using simulated data. Particular attention is paid to the performance of the estimator as $\Delta\downarrow 0$. Time-domain and frequency-domain methods are used to obtain insight into CMA processes and their sampled versions.

Journal ArticleDOI
TL;DR: It is shown that multiple independent time series from the same ARMA process can be represented by a single univariate ARMA time series through an interleaving of the original series.
Abstract: This article shows that multiple independent time series from the same ARMA process can be represented by a single univariate ARMA time series through an interleaving of the original series. Using this result, existing univariate modelling software can be used to fit a single ARMA time series model simultaneously to multiple independent realizations of the same ARMA process. The interleaving approach and its properties will be presented and compared with alternative estimation options. It will be applied to the modelling of 66 years of daily maximum temperatures for Perth, Western Australia and to other time series models.
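The interleaving idea is easy to demonstrate in the AR(1) case (an illustrative sketch, not the article's general ARMA treatment): interleaving K independent AR(1) realizations gives a single series that depends on its lag-K past, so one pooled regression on lag K recovers the common coefficient.

```python
import numpy as np

rng = np.random.default_rng(8)
K, n, phi = 3, 300, 0.6
panels = np.zeros((K, n))
for i in range(K):
    for t in range(1, n):
        panels[i, t] = phi * panels[i, t - 1] + rng.normal()   # K independent AR(1) series

y = panels.T.reshape(-1)                  # interleave: x1[0], x2[0], x3[0], x1[1], ...
phi_hat = np.dot(y[K:], y[:-K]) / np.dot(y[:-K], y[:-K])       # regress y_m on y_{m-K}
print(round(phi_hat, 3))                  # pooled estimate of the common phi (about 0.6)
```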


Journal ArticleDOI
TL;DR: In this article, the maximum periodogram method on data segments was used as an estimator of the phase of a stationary process and subsequently a least squares technique was used to estimate the phase.
Abstract: A classical model in time series analysis is a stationary process superposed by one or several deterministic sinusoidal components. Di erent methods are applied to estimate the frequency (w) of those components such as Least Squares Estimation and the maximization of the periodogram. In many applications the assumption of a constant frequency is violated and we turn to a time dependent frequency function (w(s)). For example in the physics literature this is viewed as nonlinearity of the phase of a process. A way to estimate w(s) is the local application of the above methods. In this dissertation we study the maximum periodogram method on data segments as an estimator of w(s) and subsequently a least squares technique for estimating the phase. We prove consistency and asymptotic normality in the context of "infill asymptotics", a concept that off ers a meaningful asymptotic theory in cases of local estimations. Finally, we investigate an estimator based on a local linear approximation of the frequency function, prove its consistency and asymptotic normality in the "infi ll asymptotics" sense and show that it delivers better estimations than the ordinary periodogram. The theoretical results are also supported by some simulations.
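The local estimation scheme can be sketched as follows: cut the series into short segments and take the periodogram argmax on each segment as the local frequency estimate. The segment length and the noisy linear-chirp test signal are illustrative choices and do not reflect the dissertation's infill-asymptotic setup.

```python
import numpy as np

def local_max_periodogram(x, seg_len=128):
    """Estimate a slowly varying frequency by the periodogram argmax per segment."""
    centers, freqs = [], []
    for start in range(0, len(x) - seg_len + 1, seg_len):
        seg = x[start:start + seg_len]
        I = np.abs(np.fft.rfft(seg - seg.mean())) ** 2       # segment periodogram
        f = np.fft.rfftfreq(seg_len, d=1.0)
        j = int(np.argmax(I[1:])) + 1                        # skip the zero frequency
        centers.append(start + seg_len / 2)
        freqs.append(f[j])
    return np.array(centers), np.array(freqs)

rng = np.random.default_rng(9)
n = 2048
t = np.arange(n)
w = 0.05 + 0.10 * t / n                                      # slowly varying frequency w(s)
x = np.cos(2 * np.pi * np.cumsum(w)) + 0.5 * rng.normal(size=n)
centers, w_hat = local_max_periodogram(x)
print(np.c_[centers, w_hat.round(3)])                        # estimates track w(s)
```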