An End-to-End Model for Question Answering over Knowledge Base with Cross-Attention Combining Global Knowledge
Citations
482 citations
Cites background from "An End-to-End Model for Question An..."
...In particular, co-attention or cross-attention mechanisms have been applied to solve complicated NLP tasks [8, 36]....
[...]
348 citations
Cites background from "An End-to-End Model for Question An..."
...Several deep-learning-based models [10, 12, 21, 27, 46] achieved this projection by feeding words in questions into convolutional neural networks [12, 46], LSTM networks [18, 21], or gated recurrent unit (GRU) networks [10, 27]....
[...]
...A series of work [5, 6, 9, 11, 12, 21, 27, 43, 44] proposed to project questions and candidate answers (or entire facts) into a unified low-dimensional space based on the training questions, and measure their matching scores by the similarities between their low-dimensional representations....
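The matching-by-similarity idea in the excerpt above can be sketched as follows. This is a toy illustration, not any cited paper's actual model: the word vectors, the averaging projection, and all names are assumptions made for the example.

```python
# Illustrative sketch: score candidate answers by cosine similarity between
# question and answer embeddings in a shared low-dimensional space.
import numpy as np

def embed(tokens, table, dim=4):
    # Average word vectors as a toy projection into the shared space.
    vecs = [table.get(t, np.zeros(dim)) for t in tokens]
    return np.mean(vecs, axis=0)

def score(q_vec, a_vec):
    # Cosine similarity as the matching score.
    denom = np.linalg.norm(q_vec) * np.linalg.norm(a_vec) + 1e-8
    return float(q_vec @ a_vec / denom)

# Hand-made toy embedding table (assumption, for demonstration only).
table = {"capital": np.array([1.0, 0.0, 0.0, 0.0]),
         "france":  np.array([0.0, 1.0, 0.0, 0.0]),
         "paris":   np.array([0.5, 0.9, 0.0, 0.0]),
         "berlin":  np.array([0.0, 0.0, 1.0, 0.0])}

q = embed(["capital", "france"], table)
candidates = {"paris": embed(["paris"], table),
              "berlin": embed(["berlin"], table)}
best = max(candidates, key=lambda a: score(q, candidates[a]))
print(best)  # "paris"
```

In the trained models of the cited work, the projections are learned from training questions rather than hand-built, but the ranking step has this shape: embed both sides, then rank candidates by similarity.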
[...]
...To bridge the gap, Question Answering over Knowledge Graph (QA-KG) is proposed [10, 21]....
[...]
314 citations
303 citations
285 citations
Cites background from "An End-to-End Model for Question An..."
...Methods like (Dai et al., 2016; Dong et al., 2015; Hao et al., 2017; Lukovnikov et al., 2017; Yin et al., 2016) utilize neural networks to learn scoring functions that rank the candidate answers....
[...]
References
72,897 citations
"An End-to-End Model for Question An..." refers background in this paper
...The hidden unit size of the forward and backward LSTMs is d/2, so the concatenated vector is of dimension d....
[...]
...Similarly, BiLSTM+C-ATT+GKI significantly outperforms BiLSTM+GKI by 2.5 points, and improves over BiLSTM+A-Q-ATT+GKI by 0.3 points....
[...]
...Then, the embeddings are fed into a long short-term memory (LSTM) network (Hochreiter and Schmidhuber, 1997)....
[...]
...To avoid this, we employ a bidirectional LSTM, as Bahdanau et al. (2015) do, which consists of both forward and backward networks....
[...]
...The forward LSTM handles the question from left to right, and the backward LSTM processes in the reverse order....
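The bidirectional encoding described in these excerpts can be sketched as follows: a forward pass reads the question left to right, a backward pass reads it right to left, and the two per-direction hidden states (each of size d/2) are concatenated into a d-dimensional representation. A plain tanh recurrence stands in for the LSTM cell to keep the example self-contained; all dimensions and names are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of bidirectional question encoding with d/2-sized directions.
import numpy as np

d = 8                       # final representation size
h = d // 2                  # per-direction hidden size (d/2, as in the paper)
rng = np.random.default_rng(0)
W_x = rng.normal(size=(h, 3))   # input-to-hidden weights (toy values)
W_h = rng.normal(size=(h, h))   # hidden-to-hidden weights (toy values)

def run(seq):
    # Simple tanh recurrence standing in for an LSTM cell.
    state = np.zeros(h)
    states = []
    for x in seq:
        state = np.tanh(W_x @ x + W_h @ state)
        states.append(state)
    return states

question = [rng.normal(size=3) for _ in range(5)]  # 5 toy token embeddings
fwd = run(question)                  # forward pass: left to right
bwd = run(question[::-1])[::-1]      # backward pass: right to left, realigned
# Concatenate the forward and backward states per token: each is d-dimensional.
encoded = [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]
print(encoded[0].shape)  # (8,)
```

Concatenating the two d/2-sized states yields the d-dimensional vector mentioned in the excerpt above, and gives every token a representation informed by both its left and right context.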
[...]
20,027 citations
Additional excerpts
...Bahdanau et al. (2015) first applied an attention model in NLP....
[...]
14,077 citations
12,299 citations
11,936 citations
"An End-to-End Model for Question An..." refers background in this paper
...be effective in many natural language processing (NLP) tasks such as machine translation (Sutskever et al., 2014) and dependency parsing (Dyer et al., 2015), and it is adept at handling long sentences....
[...]
...(governing officials, government.position_held.office_holder, /m/02qg4z) is a 2-hop connection....
[...]