Proceedings ArticleDOI

Stock market's price movement prediction with LSTM neural networks

TLDR
This article studies the use of LSTM networks in that scenario, predicting future trends of stock prices based on price history along with technical analysis indicators, and the results are promising.
Abstract
Predictions on stock market prices are a great challenge due to the fact that the market is an immensely complex, chaotic and dynamic environment. There are many studies from various areas aiming to take on that challenge, and Machine Learning approaches have been the focus of many of them. There are many examples of Machine Learning algorithms being able to reach satisfactory results in that type of prediction. This article studies the use of LSTM networks in that scenario, to predict future trends of stock prices based on the price history, along with technical analysis indicators. For that goal, a prediction model was built, and a series of experiments were executed and their results analyzed against a number of metrics to assess whether this type of algorithm presents any improvement when compared to other Machine Learning methods and investment strategies. The results obtained are promising, reaching an average accuracy of up to 55.9% when predicting whether the price of a particular stock will go up in the near future.
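As a concrete illustration of the kind of model described above, the sketch below sets up a small LSTM binary classifier over windows of past prices and a few common technical-analysis indicators. It assumes a Keras/TensorFlow environment; the indicator set (SMA, EMA, one-period returns), the 30-step window, and the layer sizes are illustrative choices, not the authors' exact configuration.

```python
# Minimal sketch (not the authors' exact model): an LSTM binary classifier
# that predicts whether a stock's price will rise over the next period,
# using windows of past prices and technical-analysis indicators as features.
import numpy as np
import pandas as pd
from tensorflow import keras
from tensorflow.keras import layers

def build_features(close: pd.Series) -> pd.DataFrame:
    """Illustrative technical indicators derived from the closing price."""
    feats = pd.DataFrame({"close": close})
    feats["sma_10"] = close.rolling(10).mean()    # simple moving average
    feats["ema_10"] = close.ewm(span=10).mean()   # exponential moving average
    feats["returns"] = close.pct_change()         # one-period return
    return feats.dropna()

def make_windows(feats: pd.DataFrame, lookback: int = 30):
    """Slice the feature matrix into fixed-length windows; the label is 1
    if the close rises on the step after the window, else 0."""
    values = feats.values
    closes = feats["close"].values
    X, y = [], []
    for t in range(lookback, len(feats) - 1):
        X.append(values[t - lookback:t])
        y.append(1.0 if closes[t + 1] > closes[t] else 0.0)
    return np.array(X), np.array(y)

def build_model(lookback: int, n_features: int) -> keras.Model:
    model = keras.Sequential([
        layers.Input(shape=(lookback, n_features)),
        layers.LSTM(64),                       # sequence encoder over the window
        layers.Dense(1, activation="sigmoid")  # probability of an upward move
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```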


Citations
Journal ArticleDOI

Financial time series forecasting with deep learning: A systematic literature review: 2005–2019

TL;DR: A comprehensive literature review of deep learning studies on financial time series forecasting implementations, grouping them by their DL model choices, such as Convolutional Neural Networks (CNNs), Deep Belief Networks (DBNs), and Long Short-Term Memory (LSTM).
Journal ArticleDOI

Application of Long Short-Term Memory (LSTM) Neural Network for Flood Forecasting

TL;DR: In this paper, a Long Short-Term Memory (LSTM) neural network model was used for flood forecasting, with daily discharge and rainfall as input data; characteristics of the data sets that may influence model performance were also examined.
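The description above implies turning the two daily series into supervised windows before feeding an LSTM. The snippet below is a hypothetical sketch of that preparation step; the 14-day lookback and the next-day discharge target are assumptions made for illustration, not details taken from the paper.

```python
# Hypothetical sketch of the input preparation such a flood-forecasting model
# implies: daily rainfall and discharge are stacked as parallel channels and
# sliced into windows, with the next day's discharge as the regression target.
import numpy as np

def windows_from_series(rainfall: np.ndarray, discharge: np.ndarray, lookback: int = 14):
    """rainfall, discharge: 1-D daily arrays of equal length."""
    features = np.stack([rainfall, discharge], axis=-1)  # shape (days, 2)
    X, y = [], []
    for t in range(lookback, len(features) - 1):
        X.append(features[t - lookback:t])  # previous `lookback` days
        y.append(discharge[t + 1])          # next-day discharge target
    return np.array(X), np.array(y)
```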
Journal ArticleDOI

CNNpred: CNN-based stock market prediction using a diverse set of variables

TL;DR: A CNN-based framework is proposed that can be applied to a collection of data from a variety of sources, including different markets, in order to extract features for predicting the future of those markets.
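A rough sketch of the idea, not the CNNpred authors' code: a small CNN that first mixes all variables within each day and then scans for short temporal patterns before a binary movement prediction. The variable count, kernel sizes, and output head are assumptions.

```python
# Hedged sketch: treat a window of market data as a 2-D grid of
# (days x variables) and let convolutions extract features for a
# binary next-day movement prediction.
from tensorflow import keras
from tensorflow.keras import layers

def build_cnn(days: int = 60, n_variables: int = 82) -> keras.Model:
    # n_variables stands in for the "diverse set" of indicators, prices and
    # cross-market series; the exact count here is illustrative.
    model = keras.Sequential([
        layers.Input(shape=(days, n_variables, 1)),
        layers.Conv2D(8, kernel_size=(1, n_variables), activation="relu"),  # mix variables per day
        layers.Conv2D(8, kernel_size=(3, 1), activation="relu"),            # short temporal patterns
        layers.MaxPooling2D(pool_size=(2, 1)),
        layers.Flatten(),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model
```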

A Random Walk Down Wall Street

TL;DR: Burton Malkiel's classic argument that stock prices follow a random walk, implying that past price movements cannot be relied on to consistently predict future ones or to beat the market.
Journal ArticleDOI

ModAugNet: A new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module

TL;DR: ModAugNet-c yields a lower test error than the comparative model (SingleNet), in which the overfitting prevention LSTM module is absent, and is found to be applicable in various instances where it is challenging to artificially augment data, such as medical data analysis and financial time-series modeling.
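A loose, Keras-style interpretation of the two-module idea described above; the branch sizes, the shared input, and the way the two branches are merged are assumptions and do not reproduce the exact ModAugNet wiring.

```python
# Loose sketch of a two-module LSTM regressor: one branch acts as the
# prediction module and a second, smaller branch stands in for the
# overfitting-prevention module; their outputs are merged for the final
# index-value regression.
from tensorflow import keras
from tensorflow.keras import layers

def build_two_module_model(lookback: int, n_features: int) -> keras.Model:
    inputs = keras.Input(shape=(lookback, n_features))
    prediction_branch = layers.LSTM(32)(inputs)   # prediction module
    prevention_branch = layers.LSTM(8)(inputs)    # smaller-capacity regularizing module
    merged = layers.concatenate([prediction_branch, prevention_branch])
    output = layers.Dense(1)(merged)              # predicted index value
    model = keras.Model(inputs, output)
    model.compile(optimizer="adam", loss="mse")
    return model
```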
References

Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
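For readers unfamiliar with the mechanism named here, the numpy sketch below implements one step of the now-standard LSTM cell (the forget gate was added after the original 1997 formulation). The additive update of the cell state c is the "constant error carousel" that lets error flow persist across long time lags.

```python
# Minimal numpy sketch of one LSTM step, illustrating the constant error
# carousel: the cell state c is updated additively (gated), which is what
# lets gradients survive across many time steps.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """x: input at this step; h_prev, c_prev: previous hidden and cell state.
    W, U, b: dicts of weights/biases for gates 'i', 'f', 'o' and candidate 'g'."""
    i = sigmoid(W["i"] @ x + U["i"] @ h_prev + b["i"])  # input gate
    f = sigmoid(W["f"] @ x + U["f"] @ h_prev + b["f"])  # forget gate
    o = sigmoid(W["o"] @ x + U["o"] @ h_prev + b["o"])  # output gate
    g = np.tanh(W["g"] @ x + U["g"] @ h_prev + b["g"])  # candidate cell update
    c = f * c_prev + i * g   # additive cell-state update (the error carousel)
    h = o * np.tanh(c)       # hidden state exposed to the rest of the network
    return h, c
```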
Journal ArticleDOI

Efficient capital markets: a review of theory and empirical work

Eugene F. Fama
- 01 May 1970 - 
TL;DR: Efficient Capital Markets: A Review of Theory and Empirical Work. The Journal of Finance, Vol. 25, No. 2, Papers and Proceedings of the Twenty-Eighth Annual Meeting of the American Finance Association, New York, N.Y., December 28-30, 1969 (May 1970), pp. 383-417.
Journal ArticleDOI

LSTM: A Search Space Odyssey

TL;DR: This paper presents the first large-scale analysis of eight LSTM variants on three representative tasks: speech recognition, handwriting recognition, and polyphonic music modeling. It observes that the studied hyperparameters are virtually independent and derives guidelines for their efficient adjustment.
Book

Supervised Sequence Labelling with Recurrent Neural Networks

Alex Graves
TL;DR: Introduces a new type of output layer that allows recurrent networks to be trained directly for sequence labelling tasks where the alignment between inputs and labels is unknown, as well as an extension of the long short-term memory network architecture to multidimensional data, such as images and video sequences.