Open Access Proceedings Article

Introduction to the special issue on natural language generation

TL;DR: The natural language generation community is a thriving one, with a research base that has been developing steadily--although perhaps at a slower pace because of the smaller size of the community--for just as long as work in natural language understanding.
Abstract
There are two sides to natural language processing. On the one hand, work in natural language understanding is concerned with the mapping from some surface representation of linguistic material expressed as speech or text--to an underlying representation of the meaning carried by that surface representation. But there is also the question of how one maps from some underlying representation of meaning into text or speech: this is the domain of natural language generation. Whether our end-goal is the construction of artifacts that use natural languages intelligently, the formal characterization of phenomena in human languages, or the computational modeling of the human language processing mechanism, we cannot ignore the fact that language is both spoken (or written) and heard (or read). Both are equally large and important problems, but the literature contains much less work on natural language generation (NLG) than it does on natural language understanding (NLU). There are many reasons why this might be so, although clearly an important one is that researchers in natural language understanding in some sense start out with a more well-defined task: the input is known, and there is a lot of it around. This is not the case in natural language generation: there, it is the desired output that is known, but the input is an unknown; and while the world is awash with text waiting to be processed, there are fewer instances of what we might consider appropriate inputs for the process of natural language generation. For researchers in the field, this highlights the fundamental question that always has to be asked: What do we generate from? Despite this problem, the natural language generation community is a thriving one, with a research base that has been developing steadily--although perhaps at a slower pace because of the smaller size of the community--for just as long as work in natural language understanding. 
It should not be forgotten that much of NLP has its origins in the early work on machine translation in the 1950s, and that to carry out machine translation, one has not only to analyze existing texts but also to generate new ones. The early machine translation experiments, however, did not recognize the problems that give modern work in NLG its particular character. The first significant pieces of work in the field appeared during the 1970s; in particular, Goldman's work on the problem of lexicalizing underlying conceptual material (Goldman 1974) and


Citations

Question Generation from Concept Maps

TL;DR: A question generation approach suitable for tutorial dialogues based on previous psychological theories that hypothesize questions are generated from a knowledge representation modeled as a concept map is presented.
Posted Content

Image-to-Markup Generation with Coarse-to-Fine Attention

TL;DR: In this paper, a neural encoder-decoder model was proposed to convert images into presentational markup based on a scalable coarse-to-fine attention mechanism; it outperformed classical mathematical OCR systems by a large margin on in-domain rendered data and also performed well on out-of-domain handwritten data.
Book

Sensorimotor Cognition and Natural Language Syntax

TL;DR: Knott argues that the syntax of a concrete sentence can be interpreted as a description of sensorimotor processes, and that many of the syntactic principles understood in Minimalism as encoding innate linguistic knowledge are actually sensorimotor in origin.

Paronomasic puns: Target recoverability towards automatic generation

TL;DR: The aim of this dissertation is to create a theory to model the factors, prominently, but not exclusively the phonological similarity, important in imperfect punning and to outline the implementation of this measure for the evaluation of possible imperfect puns given an input word and a set of possible target words.
Book Chapter DOI

Experiments on Generating Questions About Facts

TL;DR: A simple attribute-value language and its interpretation engine are enhanced with context-sensitive primitives and a linguistic layer deep enough for the overall system to score well on user satisfaction and the 'linguistically well-founded' criteria used to evaluate language generation systems.
References
Book Chapter DOI

Logic and conversation

H. P. Grice
12 Dec 1975
Journal Article DOI

Computational Interpretations of the Gricean Maxims in the Generation of Referring Expressions

TL;DR: The authors examined the problem of generating definite noun phrases that are appropriate referring expressions, that is, noun phrases which successfully identify the intended referent to the hearer whilst not conveying to him or her any false conversational implicatures.
Journal Article DOI

Building applied natural language generation systems

TL;DR: An overview of Natural Language Generation from an applied system-building perspective, with the emphasis on established techniques that can be used to build simple but practical working systems now.
Monograph DOI

Text generation: using discourse strategies and focus constraints to generate natural language text

TL;DR: Contents: Preface; 1. Introduction; 2. Discourse structure; 3. Focusing in discourse; 4. TEXT system implementation; 5. Discourse history; 6. Related generation research; 7. Summary and conclusions.
Book

Planning English sentences

TL;DR: This book is an investigation into the problems of generating natural language utterances to satisfy specific goals in the speaker's mind and is thus an ambitious and significant contribution to research on language generation in artificial intelligence.