Book ChapterDOI

Statistical Analysis of Time Series

01 Jan 2010-pp 361-385
TL;DR: The chapter presents formulas relevant for time series analysis, including 31.1. Predictions in Time Series, 31.2. Decomposition of (Economic) Time Series, and 31.3. Estimation of Correlation and Spectral Characteristics.
Abstract: Chapter 31 contains formulas relevant for time series analysis: 31.1. Predictions in Time Series, 31.2. Decomposition of (Economic) Time Series, 31.3. Estimation of Correlation and Spectral Characteristics, 31.4. Linear Time Series, 31.5 Nonlinear and Financial Time Series, 31.6 Multivariate Time Series, 31.7. Kalman Filter.
Citations
Journal ArticleDOI
TL;DR: This book covers the modelling of price volatility in financial returns, testing of the random walk hypothesis, forecasting of trends in prices, evidence against the efficiency of futures markets, and option valuation, with an appendix presenting a computer program for modelling financial time series.
Abstract: Features of Financial Returns Modelling Price Volatility Forecasting Standard Deviations The Accuracy of Autocorrelation Estimates Testing the Random Walk Hypothesis Forecasting Trends in Prices Evidence Against the Efficiency of Futures Markets Valuing Options Appendix: A Computer Program for Modelling Financial Time Series.

1,115 citations

Journal ArticleDOI
TL;DR: The theoretical basis, computational strategy, and application of the MVGC Toolbox to empirical G-causality inference are explained, and the Toolbox's advantages over previous methods in computational accuracy and statistical inference are demonstrated.

771 citations


Cites background or methods from "Statistical Analysis of Time Series..."

  • ...Standard theory (Anderson, 1971) yields that a rather general class of covariance-stationary multivariate process—including many nonlinear processes—may be modelled as VARs, albeit of theoretically infinite order (see also the next Section)....


  • ...For a VAR process (1), the autocovariance sequence is related to the VAR parameters via the Yule-Walker equations (Anderson, 1971)...

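The Yule-Walker relation quoted above can be made concrete in the simplest case. Below is a minimal pure-Python sketch with illustrative parameter values (not taken from any of the cited works): for a univariate AR(1) process x_t = φ·x_{t−1} + ε_t, the Yule-Walker equations reduce to γ(0) = σ²/(1 − φ²) and γ(k) = φ·γ(k − 1).

```python
def ar1_autocovariances(phi, sigma2, max_lag):
    """Autocovariance sequence of a stationary AR(1) via the Yule-Walker equations."""
    if not abs(phi) < 1:
        raise ValueError("stationarity requires |phi| < 1")
    gamma = [sigma2 / (1.0 - phi * phi)]  # gamma(0): the process variance
    for _ in range(max_lag):
        gamma.append(phi * gamma[-1])     # gamma(k) = phi * gamma(k-1)
    return gamma

gammas = ar1_autocovariances(phi=0.5, sigma2=1.0, max_lag=3)
# gamma(0) = 1 / (1 - 0.25) = 4/3; each further lag is halved
```

The same recursion generalizes to the matrix-valued case for a VAR(p), which is the form the excerpt refers to.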

Journal ArticleDOI
TL;DR: Granger causality (G-causality) analysis is used for the characterization of functional circuits underpinning perception, cognition, behavior, and consciousness in neuroscience.
Abstract: A key challenge in neuroscience and, in particular, neuroimaging, is to move beyond identification of regional activations toward the characterization of functional circuits underpinning perception, cognition, behavior, and consciousness. Granger causality (G-causality) analysis

657 citations


Cites background from "Statistical Analysis of Time Series..."

  • ...Even substantially nonlinear interactions that unfold over a small number of observations can sometimes be approximated by a (linear) VAR model with a large model order (Anderson, 1971)....


Journal ArticleDOI
TL;DR: Simulations show excellent agreement with the high-dimensional scaling of the error predicted by the theory, and illustrate its consequences for a number of specific learning models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections.
Abstract: We study an instance of high-dimensional inference in which the goal is to estimate a matrix Θ* ∈ ℝ^(m1×m2) on the basis of N noisy observations. The unknown matrix Θ* is assumed to be either exactly low rank, or "near" low-rank, meaning that it can be well-approximated by a matrix with low rank. We consider a standard M-estimator based on regularization by the nuclear or trace norm over matrices, and analyze its performance under high-dimensional scaling. We define the notion of restricted strong convexity (RSC) for the loss function, and use it to derive nonasymptotic bounds on the Frobenius norm error that hold for a general class of noisy observation models and apply to both exactly low-rank and approximately low-rank matrices. We then illustrate consequences of this general theory for a number of specific matrix models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections. These results involve nonasymptotic random matrix theory to establish that the RSC condition holds and to determine an appropriate choice of regularization parameter. Simulation results show excellent agreement with the high-dimensional scaling of the error predicted by our theory.
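Nuclear-norm regularization of the kind described above is typically computed via singular-value soft-thresholding, the proximal operator of the trace norm. A minimal sketch, restricted to a diagonal matrix so that the singular values are just the absolute diagonal entries; the entries and the threshold are illustrative assumptions, not values from the paper:

```python
def soft_threshold_diag(diag_entries, lam):
    """Singular-value soft-thresholding for a diagonal matrix: each singular
    value |d| is shrunk toward zero by lam (floored at zero), which is the
    proximal operator of the nuclear (trace) norm."""
    out = []
    for d in diag_entries:
        s = max(abs(d) - lam, 0.0)       # shrunken singular value
        sign = 1.0 if d >= 0 else -1.0   # restore the original sign
        out.append(sign * s)
    return out

shrunk = soft_threshold_diag([3.0, 1.0, 0.2], lam=0.5)
# singular values below lam are zeroed, so the result has lower rank
```

For a general matrix the same shrinkage is applied to the singular values of its SVD; the diagonal case is just the setting where that decomposition is trivial.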

587 citations

Book ChapterDOI
15 Jul 2012
TL;DR: This chapter presents an overview of machine learning techniques in time series forecasting by focusing on three aspects: the formalization of one-step forecasting problems as supervised learning tasks, the discussion of local learning techniques as an effective tool for dealing with temporal data, and the role of the forecasting strategy when the authors move from one-step to multiple-step forecasting.
Abstract: The increasing availability of large amounts of historical data and the need of performing accurate forecasting of future behavior in several scientific and applied domains demands the definition of robust and efficient techniques able to infer from observations the stochastic dependency between past and future. The forecasting domain has been influenced, from the 1960s on, by linear statistical methods such as ARIMA models. More recently, machine learning models have drawn attention and have established themselves as serious contenders to classical statistical models in the forecasting community. This chapter presents an overview of machine learning techniques in time series forecasting by focusing on three aspects: the formalization of one-step forecasting problems as supervised learning tasks, the discussion of local learning techniques as an effective tool for dealing with temporal data and the role of the forecasting strategy when we move from one-step to multiple-step forecasting.
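The formalization of one-step forecasting as a supervised learning task amounts to embedding the series into (lag-vector, target) pairs that any regression learner can consume. A minimal pure-Python sketch; the window length and toy data are illustrative assumptions:

```python
def make_supervised(series, n_lags):
    """Turn a series into supervised pairs: each input is the n_lags
    values immediately preceding the target observation."""
    pairs = []
    for t in range(n_lags, len(series)):
        pairs.append((series[t - n_lags:t], series[t]))
    return pairs

data = [1, 2, 3, 4, 5, 6]
pairs = make_supervised(data, n_lags=3)
# e.g. first pair: inputs [1, 2, 3] predict target 4
```

Multiple-step forecasting then differs only in the strategy: either iterate a one-step model on its own predictions, or train a separate model per horizon.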

367 citations


Cites background from "Statistical Analysis of Time Series..."

  • ...is a specific realization of a random process, where the randomness arises from many independent degrees of freedom interacting linearly [4]....


References
Book
01 Jan 1970
TL;DR: This is a complete revision of a classic, seminal, and authoritative book that has been the model for most books on the topic written since 1970; it focuses on practical techniques throughout rather than on a rigorous mathematical treatment of the subject.
Abstract: From the Publisher: This is a complete revision of a classic, seminal, and authoritative book that has been the model for most books on the topic written since 1970. It focuses on practical techniques throughout, rather than a rigorous mathematical treatment of the subject. It explores the building of stochastic (statistical) models for time series and their use in important areas of application —forecasting, model specification, estimation, and checking, transfer function modeling of dynamic relationships, modeling the effects of intervention events, and process control. Features sections on: recently developed methods for model specification, such as canonical correlation analysis and the use of model selection criteria; results on testing for unit root nonstationarity in ARIMA processes; the state space representation of ARMA models and its use for likelihood estimation and forecasting; score test for model checking; and deterministic components and structural components in time series models and their estimation based on regression-time series model methods.

19,748 citations

Journal ArticleDOI
TL;DR: This revision of a classic, seminal, and authoritative book explores the building of stochastic models for time series and their use in important areas of application —forecasting, model specification, estimation, and checking, transfer function modeling of dynamic relationships, modeling the effects of intervention events, and process control.

12,650 citations

Journal ArticleDOI

12,005 citations

Journal ArticleDOI
TL;DR: An ordered sequence of events or observations having a time component is called a time series; examples include daily opening and closing stock prices, daily humidity, temperature, and pressure, and the annual gross domestic product of a country.
Abstract: Contents: Preface; 1. Difference Equations; 2. Lag Operators; 3. Stationary ARMA Processes; 4. Forecasting; 5. Maximum Likelihood Estimation; 6. Spectral Analysis; 7. Asymptotic Distribution Theory; 8. Linear Regression Models; 9. Linear Systems of Simultaneous Equations; 10. Covariance-Stationary Vector Processes; 11. Vector Autoregressions; 12. Bayesian Analysis; 13. The Kalman Filter; 14. Generalized Method of Moments; 15. Models of Nonstationary Time Series; 16. Processes with Deterministic Time Trends; 17. Univariate Processes with Unit Roots; 18. Unit Roots in Multivariate Time Series; 19. Cointegration; 20. Full-Information Maximum Likelihood Analysis of Cointegrated Systems; 21. Time Series Models of Heteroskedasticity; 22. Modeling Time Series with Changes in Regime; A. Mathematical Review; B. Statistical Tables; C. Answers to Selected Exercises; D. Greek Letters and Mathematical Symbols Used in the Text; Author Index; Subject Index.

10,011 citations

Book
19 Aug 2009
TL;DR: This book develops the theory of stationary time series, covering the spectral representation and prediction of stationary processes, estimation of the mean and the autocovariance function, estimation and forecasting with ARMA and ARIMA models, multivariate time series, and state-space models with the Kalman recursions.
Abstract: Contents: 1. Stationary Time Series; 2. Hilbert Spaces; 3. Stationary ARMA Processes; 4. The Spectral Representation of a Stationary Process; 5. Prediction of Stationary Processes; 6. Asymptotic Theory; 7. Estimation of the Mean and the Autocovariance Function; 8. Estimation for ARMA Models; 9. Model Building and Forecasting with ARIMA Processes; 10. Inference for the Spectrum of a Stationary Process; 11. Multivariate Time Series; 12. State-Space Models and the Kalman Recursions; 13. Further Topics; Appendix: Data Sets.

5,260 citations