Open Access Journal ArticleDOI

Text-based crude oil price forecasting: A deep learning approach

TLDR
This study proposes a feature grouping method based on the Latent Dirichlet Allocation (LDA) topic model to distinguish the effects of different online news topics, and suggests that the proposed topic-sentiment synthesis forecasting models perform better than the benchmark models.
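As an illustration of the topic-sentiment synthesis idea, the sketch below builds one sentiment feature per news topic and per day from pre-computed LDA topic proportions and sentiment scores. Every array name, shape, and the weighting scheme are assumptions for illustration, not the paper's actual pipeline.

```python
# Minimal sketch: combine per-item LDA topic proportions with per-item sentiment
# scores into daily, topic-specific sentiment features. Synthetic data throughout.
import numpy as np

rng = np.random.default_rng(0)

n_docs, n_topics = 500, 4
doc_topic = rng.dirichlet(np.ones(n_topics), size=n_docs)   # topic mix per news item (e.g. from LDA)
sentiment = rng.uniform(-1.0, 1.0, size=n_docs)             # sentiment score per news item
day_index = rng.integers(0, 30, size=n_docs)                # trading day each item belongs to

n_days = int(day_index.max()) + 1
features = np.zeros((n_days, n_topics))
weights = np.zeros((n_days, n_topics))
for doc in range(n_docs):
    d = day_index[doc]
    features[d] += doc_topic[doc] * sentiment[doc]
    weights[d] += doc_topic[doc]
features = features / np.maximum(weights, 1e-12)   # topic-wise mean sentiment per day

print(features[:5])   # rows: days, columns: topic-specific sentiment features
```

The resulting matrix gives each forecasting day one sentiment series per news topic rather than a single aggregate sentiment series, which is roughly the kind of feature grouping the TLDR describes.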
About
This article is published in the International Journal of Forecasting. The article was published on 2019-10-01 and is currently open access. It has received 128 citations to date. The article focuses on the topics: Latent Dirichlet allocation & Topic model.


Citations
Journal ArticleDOI

A new forecasting model with wrapper-based feature selection approach using multi-objective optimization technique for chaotic crude oil time series

TL;DR: The empirical results show that the proposed forecasting model can capture the nonlinear properties of crude oil time series and achieves better forecasting performance, in terms of precision and volatility, than other current forecasting models.
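The wrapper idea with two objectives can be sketched with a small exhaustive search standing in for the paper's multi-objective optimizer; here "precision" and "volatility" are taken to mean test RMSE and error variance, the model is a plain linear regression, and the data are simulated, all of which are assumptions.

```python
# Sketch of wrapper-based feature selection with two objectives (test RMSE and
# error variance), keeping the Pareto-optimal feature subsets. Placeholder data/model.
from itertools import combinations

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 6))                                # candidate lagged/technical features
y = X[:, 0] * 0.8 - X[:, 2] * 0.5 + rng.normal(scale=0.3, size=300)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

candidates = []
for k in range(1, 4):
    for subset in combinations(range(X.shape[1]), k):
        cols = list(subset)
        model = LinearRegression().fit(X_tr[:, cols], y_tr)   # "wrapper": score subsets via the model itself
        err = y_te - model.predict(X_te[:, cols])
        candidates.append((subset, float(np.sqrt(np.mean(err ** 2))), float(np.var(err))))

# Pareto front: subsets not dominated on both objectives simultaneously.
pareto = [c for c in candidates
          if not any(o[1] <= c[1] and o[2] <= c[2] and (o[1] < c[1] or o[2] < c[2])
                     for o in candidates)]
for subset, rmse, vol in sorted(pareto, key=lambda c: c[1]):
    print(subset, round(rmse, 3), round(vol, 3))
```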

Cointegration Between Oil Spot and Future Prices of the Same and Different Grades in the Presence of Structural Change

TL;DR: This paper examined whether crude oil spot and futures prices of the same and different grades are cointegrated, using high-frequency data and a residual-based cointegration test that allows for one structural break in the cointegrating vector.
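A standard residual-based cointegration check can be run with statsmodels' Engle-Granger test; note this baseline does not allow for the structural break used in the paper, and the series below are simulated rather than actual spot and futures prices.

```python
# Baseline Engle-Granger cointegration test between two simulated price series.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(2)
common = np.cumsum(rng.normal(size=1000))            # shared stochastic trend
spot = common + rng.normal(scale=0.5, size=1000)
futures = 1.02 * common + rng.normal(scale=0.5, size=1000)

t_stat, p_value, _crit = coint(spot, futures)
print(f"t-statistic: {t_stat:.2f}, p-value: {p_value:.3f}")
print("cointegrated at the 5% level:", p_value < 0.05)
```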
Posted Content

Neural forecasting: Introduction and literature overview.

TL;DR: This paper provides an introduction to the recent literature on neural networks for forecasting and their applications, along with an overview of some of the advances that have enabled the resurgence of neural networks in machine learning.
Journal ArticleDOI

Forecasting of COVID-19 using deep layer Recurrent Neural Networks (RNNs) with Gated Recurrent Units (GRUs) and Long Short-Term Memory (LSTM) cells.

TL;DR: In this paper, the authors proposed state-of-the-art deep learning Recurrent Neural Network (RNN) models to predict country-wise cumulative confirmed cases, cumulative recovered cases, and cumulative fatalities.
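A compact LSTM forecaster in the spirit of the models described above might look like the sketch below; the window length, layer sizes, scaling, and synthetic cumulative series are all placeholders rather than the paper's configuration.

```python
# One-step-ahead LSTM forecast of a scaled cumulative count series (synthetic data).
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(3)
cumulative = np.cumsum(rng.integers(0, 500, size=200)).astype("float32")
series = cumulative / cumulative.max()               # simple scaling to [0, 1]

window = 14
X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),                        # a GRU layer could be swapped in here
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=16, verbose=0)

next_step = model.predict(series[-window:][None, :, None], verbose=0)
print("next-step forecast (scaled):", float(next_step[0, 0]))
```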
Journal ArticleDOI

Forecasting the U.S. oil markets based on social media information during the COVID-19 pandemic

TL;DR: In this paper, the authors collected a large volume of online oil news and used a convolutional neural network to automatically extract relevant information for predicting oil price, production, and consumption during the COVID-19 pandemic.
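A bare-bones version of using a convolutional network over news text as a market signal could look like the following; the vocabulary size, sequence length, tokenisation, and toy regression target are assumptions, not details from the paper.

```python
# 1-D CNN over tokenised news items, regressing a market variable (synthetic data).
import numpy as np
import tensorflow as tf

vocab_size, seq_len = 5000, 100
rng = np.random.default_rng(4)
token_ids = rng.integers(1, vocab_size, size=(256, seq_len))      # tokenised news items
target = rng.normal(size=(256, 1)).astype("float32")              # e.g. next-day price change

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(seq_len,)),
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.Conv1D(128, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalMaxPooling1D(),                         # keep the strongest n-gram signal
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),                                      # regression head
])
model.compile(optimizer="adam", loss="mse")
model.fit(token_ids, target, epochs=3, batch_size=32, verbose=0)
```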
References
Journal ArticleDOI

Random Forests

TL;DR: Internal estimates monitor error, strength, and correlation; these are used to show the response to increasing the number of features used in the forest, and they are also applicable to regression.
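The "internal estimates" mentioned above correspond to out-of-bag (OOB) error in common implementations; a brief scikit-learn example on synthetic regression data:

```python
# Random forest regression with the built-in out-of-bag generalisation estimate.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
X = rng.normal(size=(400, 10))
y = X[:, 0] - 2.0 * X[:, 3] + rng.normal(scale=0.5, size=400)

forest = RandomForestRegressor(
    n_estimators=300,
    max_features=3,        # number of features tried at each split
    oob_score=True,        # internal out-of-bag error estimate
    random_state=0,
).fit(X, y)

print("OOB R^2:", round(forest.oob_score_, 3))
print("feature importances:", np.round(forest.feature_importances_, 3))
```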
Proceedings Article

ImageNet Classification with Deep Convolutional Neural Networks

TL;DR: As discussed by the authors, state-of-the-art performance was achieved by a deep convolutional neural network (DCNN) consisting of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax.
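The architecture described can be written down directly; the sketch below follows the well-known AlexNet layout (five convolutional layers, interleaved max-pooling, three fully-connected layers, 1000-way softmax), with filter counts and sizes that are approximate rather than taken from the paper.

```python
# AlexNet-style architecture: 5 conv layers, max-pooling, 3 dense layers, 1000-way softmax.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(227, 227, 3)),
    tf.keras.layers.Conv2D(96, 11, strides=4, activation="relu"),
    tf.keras.layers.MaxPooling2D(3, strides=2),
    tf.keras.layers.Conv2D(256, 5, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(3, strides=2),
    tf.keras.layers.Conv2D(384, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(384, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(256, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(3, strides=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(4096, activation="relu"),
    tf.keras.layers.Dense(4096, activation="relu"),
    tf.keras.layers.Dense(1000, activation="softmax"),   # final 1000-way softmax
])
model.summary()
```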
Journal ArticleDOI

Latent dirichlet allocation

TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Proceedings Article

Latent Dirichlet Allocation

TL;DR: This paper proposed a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model, also known as probabilistic latent semantic indexing (pLSI).
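Fitting the generative model described above is straightforward with scikit-learn; the toy corpus and number of topics below are placeholders.

```python
# Fit a small LDA topic model and inspect top terms and document-topic proportions.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "opec cuts crude oil production as prices fall",
    "new pipeline raises oil supply in the gulf",
    "central bank leaves interest rates unchanged",
    "inflation and interest rates weigh on markets",
]

vectorizer = CountVectorizer(stop_words="english").fit(docs)
X = vectorizer.transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {k}:", ", ".join(top))
print(lda.transform(X))   # per-document topic proportions
```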
Journal ArticleDOI

Distribution of the Estimators for Autoregressive Time Series with a Unit Root

TL;DR: In this article, the limit distributions of the estimator of ρ and of the regression t test are derived under the assumption that ρ = ±1, for observations generated by Y_t = ρY_{t-1} + e_t, where Y_0 is a fixed constant and {e_t} is a sequence of independent normal random variables.
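In practice this distribution theory underlies unit-root testing; a short illustration with the augmented Dickey-Fuller test from statsmodels on a simulated random walk (a series with ρ = 1):

```python
# Augmented Dickey-Fuller unit-root test on a simulated random walk.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(6)
random_walk = np.cumsum(rng.normal(size=500))    # rho = 1: a unit root should not be rejected

stat, p_value, *_ = adfuller(random_walk)
print(f"ADF statistic: {stat:.2f}, p-value: {p_value:.3f}")
print("unit root rejected at the 5% level:", p_value < 0.05)
```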