Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting
Citations
2,776 citations
Cites methods from "Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting"
...This includes the Convolutional LSTM (Shi et al., 2015), which replaces the fully-connected layers in an LSTM with convolutional layers to allow for additional structure in the recurrent layers; the Quasi-RNN model (Bradbury et al., 2017) that interleaves convolutional layers with simple recurrent layers; and the dilated RNN (Chang et al., 2017), which adds dilations to recurrent architectures....
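The excerpt above describes the core idea of the Convolutional LSTM: the matrix products in the FC-LSTM gate equations are replaced by convolutions, so the inputs, hidden states, and cell states keep their 2D spatial layout. A minimal NumPy sketch of a single ConvLSTM step, with hypothetical weights and shapes and the paper's Hadamard peephole terms omitted for brevity:

```python
import numpy as np

def conv2d(x, k):
    # naive 'same' cross-correlation with zero padding
    H, W = x.shape
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def convlstm_step(x, h, c, W):
    # Each gate replaces the FC-LSTM matrix products with convolutions,
    # so the gate activations retain the spatial grid of the input.
    i = sigmoid(conv2d(x, W["xi"]) + conv2d(h, W["hi"]))  # input gate
    f = sigmoid(conv2d(x, W["xf"]) + conv2d(h, W["hf"]))  # forget gate
    o = sigmoid(conv2d(x, W["xo"]) + conv2d(h, W["ho"]))  # output gate
    g = np.tanh(conv2d(x, W["xc"]) + conv2d(h, W["hc"]))  # candidate cell
    c_new = f * c + i * g            # Hadamard products, as in FC-LSTM
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
# hypothetical 3x3 kernels for each gate's input and hidden connections
W = {k: rng.normal(scale=0.1, size=(3, 3))
     for k in ["xi", "hi", "xf", "hf", "xo", "ho", "xc", "hc"]}
x = rng.normal(size=(8, 8))
h = np.zeros((8, 8))
c = np.zeros((8, 8))
h, c = convlstm_step(x, h, c, W)
print(h.shape)  # the hidden state keeps the 8x8 spatial dimensions
```

The contrast with FC-LSTM is visible in the shapes: where an FC-LSTM would flatten the 8×8 frame into a length-64 vector and lose its spatial structure, every state here remains an 8×8 grid.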
Cites background from "Convolutional LSTM Network: A Machi..."
...Recently, researchers combined the above networks and proposed a convolutional LSTM network (Xingjian et al. 2015) that learns spatial and temporal dependencies simultaneously....
References
"Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting" refers background or methods in this paper
...Recent advances in deep learning, especially recurrent neural network (RNN) and long short-term memory (LSTM) models [12, 11, 7, 8, 23, 13, 18, 21, 26], provide some useful insights on how to tackle this problem....
...One advantage of using the memory cell and gates to control information flow is that the gradient will be trapped in the cell (also known as constant error carousels [12]) and be prevented from vanishing too quickly, which is a critical problem for the vanilla RNN model [12, 17, 2]....
...For general-purpose sequence modeling, LSTM as a special RNN structure has proven stable and powerful for modeling long-range dependencies in various previous studies [12, 11, 17, 23]....
Additional excerpts
...Also, if we adopt a similar view as [16], the inputs, cell outputs and hidden states of the traditional FC-LSTM represented by (2) may also be seen as 3D tensors with the last two dimensions being 1....
"Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting" refers background or methods in this paper
...The pioneering LSTM encoder-decoder framework proposed in [23] provides a general framework for sequence-to-sequence learning problems by training temporally concatenated LSTMs, one for the input sequence and another for the output sequence....
...We formulate precipitation nowcasting as a spatiotemporal sequence forecasting problem that can be solved under the general sequence-to-sequence learning framework proposed in [23]....
...We can interpret this structure using a similar viewpoint as [23]....
...Such models have been applied to solve many real-life sequence modeling problems [23, 26]....
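The excerpts above repeatedly invoke the LSTM encoder-decoder framework of [23]: one recurrent network consumes the input sequence, and its final state initializes a second network that unrolls the output sequence. A toy NumPy sketch of that structure, where all names, the tiny FC-LSTM, and the direct hidden-state readout are illustrative assumptions rather than the paper's actual model:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, p):
    # one FC-LSTM step; p["W"] stacks the four gate weight matrices
    z = p["W"] @ np.concatenate([x, h]) + p["b"]
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c_new = f * c + i * g
    return o * np.tanh(c_new), c_new

def encode_decode(xs, n_out, enc, dec, d):
    # Encoder: consume the input sequence, keeping only the final (h, c).
    h, c = np.zeros(d), np.zeros(d)
    for x in xs:
        h, c = lstm_step(x, h, c, enc)
    # Decoder: unroll from the encoder state, feeding back its own output.
    ys, y = [], np.zeros(d)
    for _ in range(n_out):
        h, c = lstm_step(y, h, c, dec)
        y = h  # toy readout: hidden state reused directly as the prediction
        ys.append(y)
    return ys

rng = np.random.default_rng(1)
d = 4  # illustrative state dimension

def make_params():
    return {"W": rng.normal(scale=0.2, size=(4 * d, 2 * d)),
            "b": np.zeros(4 * d)}

enc, dec = make_params(), make_params()
xs = [rng.normal(size=d) for _ in range(5)]  # 5 input frames
ys = encode_decode(xs, n_out=3, enc=enc, dec=dec, d=d)
print(len(ys), ys[0].shape)  # 3 predicted frames, each of dimension d
```

The paper's nowcasting model follows this outline with ConvLSTM layers in place of the FC-LSTMs, so the encoder state passed to the forecaster is a stack of spatial grids rather than flat vectors.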