Learning to Ask: Neural Question Generation for Reading Comprehension
Xinya Du, Junru Shao, Claire Cardie
Vol. 1, pp. 1342–1352
TL;DR: This paper proposes an attention-based sequence learning model for question generation from text passages in reading comprehension, which is trainable end-to-end via sequence-to-sequence learning and significantly outperforms the state-of-the-art rule-based system.

Abstract:
We study automatic question generation for sentences from text passages in reading comprehension. We introduce an attention-based sequence learning model for the task and investigate the effect of encoding sentence- vs. paragraph-level information. In contrast to all previous work, our model does not rely on hand-crafted rules or a sophisticated NLP pipeline; it is instead trainable end-to-end via sequence-to-sequence learning. Automatic evaluation results show that our system significantly outperforms the state-of-the-art rule-based system. In human evaluations, questions generated by our system are also rated as being more natural (i.e., grammaticality, fluency) and as more difficult to answer (in terms of syntactic and lexical divergence from the original text and reasoning needed to answer).
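As an illustration of the global (dot-product) attention such sequence-to-sequence models build on, here is a minimal NumPy sketch of an attention step over encoder states; the function name and shapes are illustrative, not the authors' implementation:

```python
import numpy as np

def dot_attention(decoder_state, encoder_states):
    """Attend over source positions with dot-product scores.

    decoder_state:  (H,)   current decoder hidden state
    encoder_states: (T, H) one hidden state per source token
    """
    scores = encoder_states @ decoder_state        # (T,) alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax over source positions
    context = weights @ encoder_states             # (H,) weighted sum of states
    return context, weights
```

At each decoding step, the context vector is combined with the decoder state to predict the next question word, letting the model focus on different parts of the source sentence.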
Citations
Posted Content
Tackling Climate Change with Machine Learning
David Rolnick, Priya L. Donti, Lynn H. Kaack, K. Kochanski, Alexandre Lacoste, Kris Sankaran, Andrew S. Ross, Nikola Milojevic-Dupont, Natasha Jaques, Anna Waldman-Brown, Alexandra Luccioni, Tegan Maharaj, Evan D. Sherwin, S. Karthik Mukkavilli, Konrad P. Kording, Carla P. Gomes, Andrew Y. Ng, Demis Hassabis, John Platt, Felix Creutzig, Jennifer Chayes, Yoshua Bengio
TL;DR: From smart grids to disaster management, high impact problems where existing gaps can be filled by ML are identified, in collaboration with other fields, to join the global effort against climate change.
Posted Content
Unified Language Model Pre-training for Natural Language Understanding and Generation
Li Dong, Nan Yang, Wenhui Wang, Furu Wei, Xiaodong Liu, Yu Wang, Jianfeng Gao, Ming Zhou, Hsiao-Wuen Hon
TL;DR: A new Unified pre-trained Language Model (UniLM) is presented that can be fine-tuned for both natural language understanding and generation tasks; it compares favorably with BERT on the GLUE benchmark and on the SQuAD 2.0 and CoQA question answering tasks.
Proceedings ArticleDOI
Paragraph-level Neural Question Generation with Maxout Pointer and Gated Self-attention Networks
TL;DR: A maxout pointer mechanism with a gated self-attention encoder is proposed to address the challenges of processing long text inputs for question generation; it outperforms previous approaches with either sentence-level or paragraph-level inputs.
Proceedings ArticleDOI
ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training
TL;DR: A new sequence-to-sequence pre-training model called ProphetNet is presented, which introduces a novel self-supervised objective, future n-gram prediction, together with an n-stream self-attention mechanism that predicts the next n tokens simultaneously based on previous context tokens at each time step.
Proceedings ArticleDOI
Question Generation for Question Answering
TL;DR: Experimental results show that, by using generated questions as an extra signal, significant QA improvement can be achieved.
References
Proceedings Article
Sequence to Sequence Learning with Neural Networks
TL;DR: The authors used a multilayered Long Short-Term Memory (LSTM) to map the input sequence to a vector of a fixed dimensionality, and then another deep LSTM to decode the target sequence from the vector.
Proceedings Article
ROUGE: A Package for Automatic Evaluation of Summaries
TL;DR: Four different ROUGE measures are introduced: ROUGE-N, ROUGE-L, ROUGE-W, and ROUGE-S, all included in the ROUGE summarization evaluation package, along with evaluations of each.
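ROUGE-N is n-gram overlap recall between a candidate and a reference text. A minimal sketch of the idea, illustrative only and not the official ROUGE package:

```python
from collections import Counter

def rouge_n(candidate, reference, n=2):
    """ROUGE-N recall: overlapping n-grams / total reference n-grams."""
    def ngrams(tokens, n):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    cand, ref = ngrams(candidate, n), ngrams(reference, n)
    overlap = sum((cand & ref).values())  # clipped multiset intersection
    total = sum(ref.values())
    return overlap / total if total else 0.0
```

For example, `rouge_n("the cat ran".split(), "the cat sat".split())` matches one of the two reference bigrams, giving 0.5.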
Journal Article
Binary codes capable of correcting deletions, insertions, and reversals
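This entry is the classic edit-distance paper; the distance it defines (where "reversals" correspond to single-symbol substitutions in the standard formulation) can be computed with the usual dynamic program. A minimal illustration, not the paper's coding-theoretic construction:

```python
def levenshtein(a, b):
    """Minimum number of insertions, deletions, and substitutions turning a into b."""
    prev = list(range(len(b) + 1))          # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        curr = [i]                          # distance from a[:i] to ""
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution (or match)
        prev = curr
    return prev[-1]
```

The row-by-row formulation keeps memory at O(len(b)) instead of storing the full table.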
Proceedings ArticleDOI
Effective Approaches to Attention-based Neural Machine Translation
TL;DR: Two approaches are examined: a global approach that always attends to all source words, and a local one that looks at only a subset of source words at a time; both are shown to be effective on the WMT translation tasks between English and German in both directions.
Proceedings ArticleDOI
The Stanford CoreNLP Natural Language Processing Toolkit
Christopher D. Manning, Mihai Surdeanu, John Bauer, Jenny Rose Finkel, Steven Bethard, David McClosky
TL;DR: The design and use of the Stanford CoreNLP toolkit is described, an extensible pipeline that provides core natural language analysis, and it is suggested that this follows from a simple, approachable design, straightforward interfaces, the inclusion of robust and good quality analysis components, and not requiring use of a large amount of associated baggage.