Story Comprehension for Predicting What Happens Next
Snigdha Chaturvedi, Haoruo Peng, Dan Roth
pp. 1603–1614
TL;DR: This paper presents a story comprehension model that explores three distinct semantic aspects: the sequence of events described in the story, its emotional trajectory, and its plot consistency, and uses a hidden variable to weigh the semantic aspects in the context of the story.
Abstract:
Automatic story comprehension is a fundamental challenge in Natural Language Understanding, and can enable computers to learn about social norms, human behavior and commonsense. In this paper, we present a story comprehension model that explores three distinct semantic aspects: (i) the sequence of events described in the story, (ii) its emotional trajectory, and (iii) its plot consistency. We judge the model’s understanding of real-world stories by inquiring if, like humans, it can develop an expectation of what will happen next in a given story. Specifically, we use it to predict the correct ending of a given short story from possible alternatives. The model uses a hidden variable to weigh the semantic aspects in the context of the story. Our experiments demonstrate the potential of our approach to characterize these semantic aspects, and the strength of the hidden variable based approach. The model outperforms the state-of-the-art approaches and achieves the best results on a publicly available dataset.
Citations
What is Narrative Analysis
TL;DR: Recording of presentation introducing narrative analysis, outlining what it is, why it can be a useful approach, how to do it and where to find out more.
Proceedings Article
T-CVAE: Transformer-Based Conditioned Variational Autoencoder for Story Completion
Tianming Wang, Xiaojun Wan
TL;DR: This paper presents a novel conditional variational autoencoder based on Transformer for missing plot generation that generates better story plots than state-of-the-art models in terms of readability, diversity and coherence.
Posted Content
Story Ending Generation with Incremental Encoding and Commonsense Knowledge
TL;DR: A novel model for story ending generation that adopts an incremental encoding scheme to represent context clues spanning the story context, and can generate more reasonable story endings than state-of-the-art baselines.
Proceedings Article
Joint Constrained Learning for Event-Event Relation Extraction
TL;DR: This work proposes a joint constrained learning framework for modeling event-event relations that enforces logical constraints within and across multiple temporal and subevent relations by converting these constraints into differentiable learning objectives.
Proceedings Article
Tackling the Story Ending Biases in The Story Cloze Test
TL;DR: A new crowdsourcing scheme is designed that creates a new SCT dataset overcoming some of the biases; a few models are benchmarked on the new dataset, showing that the top-performing model on the original SCT dataset fails to keep up its performance.
References
Journal Article
Maximum likelihood from incomplete data via the EM algorithm
Proceedings Article
GloVe: Global Vectors for Word Representation
TL;DR: A new global log-bilinear regression model that combines the advantages of the two major model families in the literature, global matrix factorization and local context window methods, and produces a vector space with meaningful substructure.
Proceedings Article
Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis
TL;DR: A new approach to phrase-level sentiment analysis is presented that first determines whether an expression is neutral or polar and then disambiguates the polarity of the polar expressions.
Journal Article
Scripts, Plans, Goals, and Understanding: An Inquiry into Human Knowledge Structures