Open Access · Posted Content

A Survey on Semantic Parsing from the perspective of Compositionality.

TL;DR
This work focuses on (a) meaning composition from syntactical structure (Partee, 1975), and (b) the ability of semantic parsers to handle lexical variation given the context of a knowledge base (KB).
Abstract
Unlike previous surveys on semantic parsing (Kamath and Das, 2018) and knowledge base question answering (KBQA) (Chakraborty et al., 2019; Zhu et al., 2019; Hoffner et al., 2017), we take a different perspective on the study of semantic parsing. Specifically, we focus on (a) meaning composition from syntactic structure (Partee, 1975), and (b) the ability of semantic parsers to handle lexical variation given the context of a knowledge base (KB). After an introduction to the field of semantic parsing and its uses in KBQA, we describe meaning representation using the grammar formalism CCG (Steedman, 1996). We discuss semantic composition using formal languages in Section 2. In Section 3 we consider systems that use formal languages, e.g. λ-calculus (Steedman, 1996) and λ-DCS (Liang, 2013). Sections 4 and 5 consider semantic parsers that use structured language for their logical forms. Section 6 covers the benchmark datasets ComplexQuestions (Bao et al., 2016) and GraphQuestions (Su et al., 2016), which can be used to evaluate semantic parsers on their ability to answer complex questions that are highly compositional in nature.
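To make the notion of meaning composition concrete, here is a minimal sketch in Python of CCG-style semantic construction, where lexical items denote λ-calculus-like functions that combine by function application. The lexicon, helper names, and logical-form notation are invented for this illustration and are not taken from the survey.

```python
# Minimal sketch: composing logical forms by function application,
# in the spirit of CCG-driven semantic construction (Steedman, 1996).
# Lexicon entries and the logical-form notation are illustrative only.

lexicon = {
    "texas": "texas",                                        # proper noun: a constant
    "city": lambda x: f"city({x})",                          # noun: λx.city(x)
    "capital_of": lambda y: lambda x: f"capital({x}, {y})",  # λy.λx.capital(x, y)
}

def intersect(p, q):
    """Intersective modification: λx.(p(x) ∧ q(x))."""
    return lambda x: f"({p(x)} ∧ {q(x)})"

# "city that is the capital of Texas": apply 'capital_of' to the
# constant 'texas', then intersect with the noun 'city'.
capital_of_texas = lexicon["capital_of"](lexicon["texas"])
meaning = intersect(lexicon["city"], capital_of_texas)

print(meaning("x"))  # (city(x) ∧ capital(x, texas))
```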


Citations
Journal Article

It’s the Meaning That Counts: The State of the Art in NLP and Semantics

TL;DR: This work reviews the state of computational semantics in NLP and investigates how different lines of inquiry reflect distinct understandings of semantics and prioritize different layers of linguistic meaning.
References
Journal Article

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
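For concreteness, below is a minimal NumPy sketch of a single LSTM cell step. It follows the now-standard formulation with a forget gate (a later refinement by Gers et al.) rather than the exact 1997 architecture, and all parameter shapes and names are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell.

    W, U, b stack parameters for the input, forget, and output gates
    and the candidate update; shapes are (4H, D), (4H, H), and (4H,).
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:4*H])    # candidate cell update
    c = f * c_prev + i * g     # additive cell-state update: the "carousel"
    h = o * np.tanh(c)         # hidden state exposed to the network
    return h, c

# Toy usage with random parameters.
D, H = 3, 4
rng = np.random.default_rng(0)
W, U, b = rng.normal(size=(4*H, D)), rng.normal(size=(4*H, H)), np.zeros(4*H)
h, c = lstm_step(rng.normal(size=D), np.zeros(H), np.zeros(H), W, U, b)
```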
Proceedings Article

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

TL;DR: BERT pre-trains deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers; the pre-trained model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
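As a hedged illustration of the fine-tuning recipe described above (one additional output layer on top of the pretrained encoder), here is a sketch using the Hugging Face transformers API; the checkpoint name, classification head, and task framing are assumptions for this example, not details from the paper.

```python
import torch
from transformers import BertModel, BertTokenizer

class BertClassifier(torch.nn.Module):
    """Illustrative setup: pretrained BERT plus one new output layer."""
    def __init__(self, num_labels=2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # The single additional layer mentioned in the TL;DR.
        self.head = torch.nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, **inputs):
        out = self.bert(**inputs)
        # Use the pooled [CLS] representation for sentence-level tasks.
        return self.head(out.pooler_output)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertClassifier()
batch = tokenizer("what is the capital of texas?", return_tensors="pt")
logits = model(**batch)  # fine-tune end to end with a standard classification loss
```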
Book Chapter

DBpedia: a nucleus for a web of open data

TL;DR: This paper describes the extraction of the DBpedia datasets, how the resulting information is published on the Web for human and machine consumption, and how DBpedia could serve as a nucleus for an emerging Web of open data.
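Since the survey's KBQA setting ultimately maps questions to executable queries over KBs like DBpedia, here is a small illustrative Python example that runs a hand-written SPARQL query against the public DBpedia endpoint using SPARQLWrapper; the question, query, and endpoint availability are assumptions for this sketch, not material from the paper.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Illustrative KBQA-style query against the public DBpedia endpoint:
# "What is the capital of Texas?" expressed as a structured query.
# (dbr: and dbo: are prefixes predefined by the DBpedia endpoint.)
sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    SELECT ?capital WHERE {
        dbr:Texas dbo:capital ?capital .
    }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["capital"]["value"])  # e.g. http://dbpedia.org/resource/Austin
```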
Proceedings Article

Freebase: a collaboratively created graph database for structuring human knowledge

TL;DR: MQL provides an easy-to-use object-oriented interface to the tuple data in Freebase and is designed to facilitate the creation of collaborative, Web-based data-oriented applications.
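To give a flavor of that interface: MQL queries are JSON templates whose empty slots Freebase fills with matching tuples. The Python snippet below only constructs and prints such a template (the Freebase service itself has since been retired); the schema ids follow historical Freebase conventions and are shown purely for illustration.

```python
import json

# Schematic MQL query (JSON template): empty values ask Freebase to
# fill in matching data, in MQL's query-by-example style.
query = [{
    "type": "/music/artist",  # constrain to nodes of this type
    "name": "The Police",     # match this property exactly
    "album": [],              # empty list: return all matching albums
}]
print(json.dumps(query, indent=2))
# A live Freebase service would return the template with "album" populated.
```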
Proceedings Article

Semantic Parsing on Freebase from Question-Answer Pairs

TL;DR: This paper trains a semantic parser that scales up to Freebase and, despite not using annotated logical forms, outperforms the state-of-the-art parser of Cai and Yates (2013) on their dataset.
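The training signal that paper relies on, answers rather than annotated logical forms, can be sketched as weak supervision over candidate logical forms: candidates whose denotations match the gold answer are treated as positives. The toy Python sketch below illustrates the idea with an invented miniature KB and parser beam; it is not the authors' system.

```python
# Simplified sketch of learning from denotations (weak supervision):
# the trainer never sees gold logical forms, only gold answers.
# The KB and candidate beam below are toy stand-ins for this example.

toy_kb = {"capital(texas)": "austin", "largest_city(texas)": "houston"}

def execute(logical_form):
    """Toy executor: look the logical form up in a tiny 'KB'."""
    return toy_kb.get(logical_form)

def consistent_candidates(candidates, gold_answer):
    """Keep candidates whose denotation matches the gold answer."""
    return [lf for lf in candidates if execute(lf) == gold_answer]

# Candidate logical forms from a (hypothetical) parser beam for
# "What is the capital of Texas?"; the supervision is just "austin".
beam = ["capital(texas)", "largest_city(texas)"]
positives = consistent_candidates(beam, "austin")
print(positives)  # ['capital(texas)'] — these get upweighted in training
```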