Sequence to Sequence Weather Forecasting with Long Short-Term Memory Recurrent Neural Networks
Citations
Cites methods from this paper:
...The sequence-to-sequence approach has previously been employed in speech recognition and machine translation applications [29, 30, 28], and while it has been used in short-term weather forecasting [31], its application in the energy-prediction context has remained largely unexplored....
[...]
Cites background or methods from this paper:
...4) Deep LSTM Encoder–Decoder: We use the model presented in [19], which is a stacked architecture of LSTM layers connected to a time-distributed dense layer....
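A minimal Keras sketch of such a stacked LSTM encoder-decoder ending in a time-distributed dense layer follows. All sizes here (input window, forecast horizon, feature count, layer widths) are illustrative assumptions, not the configuration reported in [19].

```python
# Minimal sketch (assumed sizes) of a deep LSTM encoder-decoder
# with a time-distributed dense output layer, using Keras.
from tensorflow.keras import layers, models

n_in, n_out, n_features = 24, 12, 4   # input steps, forecast steps, variables (illustrative)

model = models.Sequential([
    # Encoder: stacked LSTM layers compress the input sequence.
    layers.LSTM(64, return_sequences=True, input_shape=(n_in, n_features)),
    layers.LSTM(32),                          # final encoder summary vector
    # Feed that summary to the decoder once per forecast step.
    layers.RepeatVector(n_out),
    # Decoder: stacked LSTMs emit one hidden state per output step.
    layers.LSTM(32, return_sequences=True),
    layers.LSTM(64, return_sequences=True),
    # Time-distributed dense maps each decoder state to a forecast vector.
    layers.TimeDistributed(layers.Dense(n_features)),
])
model.compile(optimizer="adam", loss="mse")
```

The RepeatVector layer bridges encoder and decoder by handing the final encoding to every forecast step; it is one common way to wire a stacked sequence-to-sequence model, not necessarily the paper's exact wiring.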
[...]
...8 that the LSTM encoder–decoder model [19] performs better than the other baseline models, as its memory cells capture long-term dependencies in the sequential dataset better than those baselines do....
[...]
...In ML, this is typically framed as a sequential time-series forecasting problem, also known as sequence-to-sequence forecasting [19]....
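As a small, self-contained illustration of that framing, the sketch below slices a series into supervised sequence-to-sequence pairs; the window lengths and the toy sine series are assumptions for demonstration only.

```python
# Sketch: frame a series as supervised sequence-to-sequence pairs.
# Window lengths and the sine series are illustrative assumptions.
import numpy as np

def to_seq2seq(series, n_in, n_out):
    """Map the last n_in observations to the next n_out observations."""
    X, Y = [], []
    for t in range(len(series) - n_in - n_out + 1):
        X.append(series[t : t + n_in])
        Y.append(series[t + n_in : t + n_in + n_out])
    return np.asarray(X), np.asarray(Y)

series = np.sin(np.linspace(0.0, 20.0, 500))   # toy stand-in for a weather variable
X, Y = to_seq2seq(series, n_in=24, n_out=12)
print(X.shape, Y.shape)                        # (465, 24) (465, 12)
```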
[...]
References
"Sequence to Sequence Weather Foreca..." refers methods in this paper
...[3] Alex Graves, Marcus Liwicki, Horst Bunke, Jürgen Schmidhuber, and Santiago Fernández....
[...]
...[2] Douglas Eck and Jürgen Schmidhuber....
[...]
...[4] Alex Graves and Jürgen Schmidhuber....
[...]
...[7] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural Computation, 9(8):1735–1780, 1997....
[...]
...They were introduced by Hochreiter and Schmidhuber [7]....
[...]
"Sequence to Sequence Weather Foreca..." refers background in this paper
...the difference between LSTMs and traditional Recurrent Neural Networks (RNNs) is their ability to process and predict time-series sequences without forgetting important information; LSTMs achieve state-of-the-art results in sequence-related problems such as handwriting recognition [4, 3], speech recognition [6, 1], music composition [2], and grammar learning [8] (in natural language processing)....
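To make the gating behaviour behind that claim concrete, here is a minimal NumPy sketch of a single forward step of a standard LSTM cell in the formulation of [7]; the dimensions and random weights are illustrative assumptions.

```python
# Sketch: one forward step of a standard LSTM cell as in [7].
# Dimensions and random weights are illustrative assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b, n):
    """Advance hidden state h and cell state c by one time step."""
    z = W @ np.concatenate([x, h]) + b   # pre-activations of all four gates
    i = sigmoid(z[:n])                   # input gate: what new info to write
    f = sigmoid(z[n:2 * n])              # forget gate: what old memory to keep
    o = sigmoid(z[2 * n:3 * n])          # output gate: what memory to expose
    g = np.tanh(z[3 * n:])               # candidate cell update
    c_new = f * c + i * g                # gated update preserves long-range info
    h_new = o * np.tanh(c_new)           # hidden state for the next step
    return h_new, c_new

rng = np.random.default_rng(0)
n_x, n_h = 3, 5
W = rng.normal(scale=0.1, size=(4 * n_h, n_x + n_h))
b = np.zeros(4 * n_h)
h, c = np.zeros(n_h), np.zeros(n_h)
h, c = lstm_step(rng.normal(size=n_x), h, c, W, b, n_h)
print(h.shape, c.shape)                  # (5,) (5,)
```

The forget gate f is what distinguishes this cell from a plain RNN update: it lets the cell state carry information across many steps instead of being overwritten at every step.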
[...]