Author

Helmut Lütkepohl

Bio: Helmut Lütkepohl is an academic researcher. The author has contributed to research in topics: Time series & Autoregressive model. The author has an h-index of 1 and has co-authored 1 publication receiving 4,747 citations.

Papers
Book (DOI)
04 Oct 2007
TL;DR: This reference work and graduate-level textbook considers a wide range of models and methods for analyzing and forecasting multiple time series, including vector autoregressive, cointegrated, vector autoregressive moving average, multivariate ARCH and periodic processes, as well as dynamic simultaneous equations and state space models.
Abstract: This reference work and graduate level textbook considers a wide range of models and methods for analyzing and forecasting multiple time series. The models covered include vector autoregressive, cointegrated, vector autoregressive moving average, multivariate ARCH and periodic processes as well as dynamic simultaneous equations and state space models. Least squares, maximum likelihood, and Bayesian methods are considered for estimating these models. Different procedures for model selection and model specification are treated and a wide range of tests and criteria for model checking are introduced. Causality analysis, impulse response analysis and innovation accounting are presented as tools for structural analysis. The book is accessible to graduate students in business and economics. In addition, multiple time series courses in other fields such as statistics and engineering may be based on it. Applied researchers involved in analyzing multiple time series may benefit from the book as it provides the background and tools for their tasks. It bridges the gap to the difficult technical literature on the topic.
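
As a concrete illustration of the kind of analysis the book covers (vector autoregressions, lag-order selection, impulse responses), the following is a minimal Python sketch using the statsmodels library; the simulated two-variable system, its coefficient matrix, and the AIC-based lag choice are illustrative assumptions, not an example taken from the book.

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulate a small two-variable VAR(1) system purely for illustration.
rng = np.random.default_rng(0)
A1 = np.array([[0.5, 0.1], [0.2, 0.4]])        # assumed coefficient matrix
y = np.zeros((200, 2))
for t in range(1, 200):
    y[t] = A1 @ y[t - 1] + rng.normal(size=2)  # y_t = A1 y_{t-1} + u_t
data = pd.DataFrame(y, columns=["y1", "y2"])

results = VAR(data).fit(maxlags=8, ic="aic")   # lag order selected by AIC
print(results.summary())

irf = results.irf(10)                          # impulse responses, 10 steps ahead
fevd = results.fevd(10)                        # forecast error variance decomposition
forecast = results.forecast(data.values[-results.k_ar:], steps=5)

Least squares estimation, lag selection by information criteria, and impulse response analysis as used here correspond to topics treated at length in the book.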

5,244 citations


Cited by
Journal Article (DOI)
TL;DR: This paper gives an overview of the particular challenges presented by time-series data and reviews works that have either applied time-series data to unsupervised feature learning algorithms or have contributed modifications of feature learning algorithms to take those challenges into account.

1,055 citations

Journal Article (DOI)
TL;DR: This review article aims to explain the complexity of available solutions, their strengths and weaknesses, and the opportunities and threats that the forecasting tools offer or that may be encountered.

1,016 citations

Posted Content
TL;DR: This review article aims at explaining the complexity of available solutions, their strengths and weaknesses, and the opportunities and threats that the forecasting tools offer or that may be encountered.
Abstract: A variety of methods and ideas have been tried for electricity price forecasting (EPF) over the last 15 years, with varying degrees of success. This review article aims at explaining the complexity of available solutions, their strengths and weaknesses, and the opportunities and threats that the forecasting tools offer or that may be encountered. The paper also looks ahead and speculates on the directions EPF will or should take in the next decade or so. In particular, it postulates the need for objective comparative EPF studies involving (i) the same datasets, (ii) the same robust error evaluation procedures and (iii) statistical testing of the significance of the outperformance of one model by another.
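
For point (iii), a Diebold-Mariano-type test is one standard way to assess whether one model's forecasts significantly outperform another's. The sketch below is an assumption about the kind of test meant, not code from the article; it uses squared-error loss and an asymptotic normal approximation, and for brevity ignores serial correlation in the loss differential.

import numpy as np
from scipy import stats

def diebold_mariano(errors_a, errors_b):
    """Two-sided DM-type test: does model A's squared-error loss differ from B's?"""
    d = np.asarray(errors_a) ** 2 - np.asarray(errors_b) ** 2   # loss differential
    dm = d.mean() / np.sqrt(d.var(ddof=1) / d.size)             # DM statistic
    p_value = 2 * (1 - stats.norm.cdf(abs(dm)))                 # asymptotic p-value
    return dm, p_value

# Hypothetical hourly price forecast errors from two competing EPF models.
rng = np.random.default_rng(1)
errors_model_a = rng.normal(scale=1.0, size=720)
errors_model_b = rng.normal(scale=1.1, size=720)
print(diebold_mariano(errors_model_a, errors_model_b))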

1,007 citations

Book
28 Mar 2008
TL;DR: This book outlines the nature and uses of forecasting and offers comments on the practical implementation and use of statistical forecasting techniques.
Abstract:
1. Introduction to Forecasting. 1.1 The Nature and Uses of Forecasts. 1.2 Some Examples of Time Series. 1.3 The Forecasting Process. 1.4 Resources for Forecasting.
2. Statistics Background for Forecasting. 2.1 Introduction. 2.2 Graphical Displays. 2.3 Numerical Description of Time Series Data. 2.4 Use of Data Transformations and Adjustments. 2.5 General Approach to Time Series Analysis and Forecasting. 2.6 Evaluating and Monitoring Forecasting Model Performance.
3. Regression Analysis and Forecasting. 3.1 Introduction. 3.2 Least Squares Estimation in Linear Regression Models. 3.3 Statistical Inference in Linear Regression. 3.4 Prediction of New Observations. 3.5 Model Adequacy Checking. 3.6 Variable Selection Methods in Regression. 3.7 Generalized and Weighted Least Squares. 3.8 Regression Models for General Time Series Data.
4. Exponential Smoothing Methods. 4.1 Introduction. 4.2 First-Order Exponential Smoothing. 4.3 Modeling Time Series Data. 4.4 Second-Order Exponential Smoothing. 4.5 Higher-Order Exponential Smoothing. 4.6 Forecasting. 4.7 Exponential Smoothing for Seasonal Data. 4.8 Exponential Smoothers and ARIMA Models.
5. Autoregressive Integrated Moving Average (ARIMA) Models. 5.1 Introduction. 5.2 Linear Models for Stationary Time Series. 5.3 Finite Order Moving Average (MA) Processes. 5.4 Finite Order Autoregressive Processes. 5.5 Mixed Autoregressive-Moving Average (ARMA) Processes. 5.6 Non-stationary Processes. 5.7 Time Series Model Building. 5.8 Forecasting ARIMA Processes. 5.9 Seasonal Processes. 5.10 Final Comments.
6. Transfer Function and Intervention Models. 6.1 Introduction. 6.2 Transfer Function Models. 6.3 Transfer Function-Noise Models. 6.4 Cross Correlation Function. 6.5 Model Specification. 6.6 Forecasting with Transfer Function-Noise Models. 6.7 Intervention Analysis.
7. Survey of Other Forecasting Methods. 7.1 Multivariate Time Series Models and Forecasting. 7.2 State Space Models. 7.3 ARCH and GARCH Models. 7.4 Direct Forecasting of Percentiles. 7.5 Combining Forecasts to Improve Prediction Performance. 7.6 Aggregation and Disaggregation of Forecasts. 7.7 Neural Networks and Forecasting. 7.8 Some Comments on Practical Implementation and Use of Statistical Forecasting Techniques.
Bibliography.
Appendix A Statistical Tables. Table A.1 Cumulative Normal Distribution. Table A.2 Percentage Points of the Chi-Square Distribution. Table A.3 Percentage Points of the t Distribution. Table A.4 Percentage Points of the F Distribution. Table A.5 Critical Values of the Durbin-Watson Statistic.
Appendix B Data Sets for Exercises. Table B.1 Market Yield on U.S. Treasury Securities at 10-Year Constant Maturity. Table B.2 Pharmaceutical Product Sales. Table B.3 Chemical Process Viscosity. Table B.4 U.S. Production of Blue and Gorgonzola Cheeses. Table B.5 U.S. Beverage Manufacturer Product Shipments, Unadjusted. Table B.6 Global Mean Surface Air Temperature Anomaly and Global CO2 Concentration. Table B.7 Whole Foods Market Stock Price, Daily Closing Adjusted for Splits. Table B.8 Unemployment Rate - Full-Time Labor Force, Not Seasonally Adjusted. Table B.9 International Sunspot Numbers. Table B.10 United Kingdom Airline Miles Flown. Table B.11 Champagne Sales. Table B.12 Chemical Process Yield, with Operating Temperature (Uncontrolled). Table B.13 U.S. Production of Ice Cream and Frozen Yogurt. Table B.14 Atmospheric CO2 Concentrations at Mauna Loa Observatory. Table B.15 U.S. National Violent Crime Rate. Table B.16 U.S. Gross Domestic Product. Table B.17 U.S. Total Energy Consumption. Table B.18 U.S. Coal Production. Table B.19 Arizona Drowning Rate, Children 1-4 Years Old. Table B.20 U.S. Internal Revenue Tax Refunds.
Index.
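
As one concrete example of the material in Chapter 4, here is a minimal sketch of first-order (simple) exponential smoothing written directly from the standard recursion s_t = alpha*y_t + (1 - alpha)*s_{t-1}; the sample series and smoothing constant are arbitrary illustrative values, not data from the book.

def simple_exponential_smoothing(y, alpha):
    """Return the first-order exponentially smoothed series for observations y."""
    if not 0 < alpha <= 1:
        raise ValueError("alpha must be in (0, 1]")
    smoothed = [y[0]]                  # initialize with the first observation
    for value in y[1:]:
        smoothed.append(alpha * value + (1 - alpha) * smoothed[-1])
    return smoothed

series = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]   # arbitrary example data
print(simple_exponential_smoothing(series, alpha=0.3)[-1])    # one-step-ahead forecast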

981 citations

Proceedings Article (DOI)
27 Jun 2018
TL;DR: A novel deep learning framework, the Long- and Short-term Time-series Network (LSTNet), is proposed to address the open challenge of multivariate time series forecasting, using a convolutional neural network and a recurrent neural network to extract short-term local dependency patterns among variables and to discover long-term patterns in time series trends.
Abstract: Multivariate time series forecasting is an important machine learning problem across many domains, including predictions of solar plant energy output, electricity consumption, and traffic jam situations. Temporal data arising in these real-world applications often involve a mixture of long-term and short-term patterns, for which traditional approaches such as autoregressive models and Gaussian processes may fail. In this paper, we propose a novel deep learning framework, the Long- and Short-term Time-series Network (LSTNet), to address this open challenge. LSTNet uses a convolutional neural network (CNN) and a recurrent neural network (RNN) to extract short-term local dependency patterns among variables and to discover long-term patterns in time series trends. Furthermore, we leverage a traditional autoregressive model to tackle the scale-insensitivity problem of neural network models. In our evaluation on real-world data with complex mixtures of repetitive patterns, LSTNet achieved significant performance improvements over several state-of-the-art baseline methods. All the data and experiment codes are available online.
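
To make the described architecture concrete, here is a heavily simplified PyTorch sketch of the idea (a CNN for short-term local patterns, an RNN for longer-term dynamics, and a linear autoregressive head for the scale issue); it omits the paper's recurrent-skip and attention components, and all layer sizes are illustrative assumptions rather than the authors' configuration.

import torch
import torch.nn as nn

class LSTNetSketch(nn.Module):
    """Simplified sketch: CNN + GRU for nonlinear patterns, linear AR head for scale."""
    def __init__(self, n_series, window, conv_channels=32, kernel=6, hidden=64, ar_window=8):
        super().__init__()
        self.ar_window = ar_window
        self.conv = nn.Conv1d(n_series, conv_channels, kernel_size=kernel)
        self.gru = nn.GRU(conv_channels, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_series)
        self.ar = nn.Linear(ar_window, 1)                   # one linear AR model per series

    def forward(self, x):                                   # x: (batch, window, n_series)
        c = torch.relu(self.conv(x.transpose(1, 2)))        # short-term convolutional features
        _, h = self.gru(c.transpose(1, 2))                  # recurrent summary over time
        nonlinear = self.fc(h[-1])                          # (batch, n_series)
        ar_input = x[:, -self.ar_window:, :].transpose(1, 2)
        linear = self.ar(ar_input).squeeze(-1)              # autoregressive component
        return nonlinear + linear                           # next-step forecast per series

model = LSTNetSketch(n_series=4, window=24)
y_hat = model(torch.randn(8, 24, 4))                        # output shape: (8, 4)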

878 citations