Proceedings ArticleDOI

Probabilistic representation of formal languages

Taylor L. Booth
pp. 74-81
TLDR
It is shown that, under some conditions, it is possible to recognize a non-finite-state language with a finite-state acceptor if one is willing to accept a small probability of making an error.
Abstract
The problem of assigning a probability to each string of a language L(G) generated by a grammar G is considered. Two methods are examined: one assigns a probability to each production associated with G, and the other assigns probabilities on the basis of particular features of the language. Several necessary conditions that these probability assignment techniques must satisfy if they are to be consistent are presented. The problem of recognizing languages is also considered. It is shown that, under some conditions, it is possible to recognize a non-finite-state language with a finite-state acceptor if one is willing to accept a small probability of making an error.
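The production-probability method described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's own construction: the grammar (for the non-finite-state language a^n b^n) and the probability values are hypothetical, and the consistency check shown is only the local necessary condition that the productions for each nonterminal form a probability distribution.

```python
# Hypothetical probabilistic grammar for L = { a^n b^n : n >= 1 }:
#   S -> a S b   with probability 0.6
#   S -> a b     with probability 0.4
# Each production carries a probability; a string's probability is the
# product of the probabilities of the productions used in its derivation.
rules = {
    "S": [
        (("a", "S", "b"), 0.6),
        (("a", "b"), 0.4),
    ],
}

def check_consistent(rules):
    """Local necessary condition for consistency: the probabilities of
    the productions for each nonterminal must sum to 1."""
    return all(
        abs(sum(p for _, p in prods) - 1.0) < 1e-9
        for prods in rules.values()
    )

def string_probability(n, p_recurse=0.6, p_stop=0.4):
    """P(a^n b^n) under this grammar: the unique derivation applies
    S -> a S b exactly (n - 1) times and then S -> a b once."""
    return p_recurse ** (n - 1) * p_stop

assert check_consistent(rules)
```

Because the derivation of each string in this grammar is unique, the string probability is a single product; for ambiguous grammars one would sum over all derivations of the string.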


Citations
Journal ArticleDOI

Expectation-based syntactic comprehension

TL;DR: A simple information-theoretic characterization of processing difficulty as the work incurred by resource reallocation during parallel, incremental, probabilistic disambiguation in sentence comprehension is proposed, and its equivalence to the theory of Hale is demonstrated.
Journal ArticleDOI

Composition in distributional models of semantics.

TL;DR: This article proposes a framework for representing the meaning of word combinations in vector space in terms of additive and multiplicative functions, and introduces a wide range of composition models that are evaluated empirically on a phrase similarity task.
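The additive and multiplicative composition functions mentioned in this TL;DR can be illustrated directly. The vectors below are made-up toy values, not data from the article:

```python
import numpy as np

# Toy word vectors (hypothetical values for illustration).
u = np.array([1.0, 0.5, 0.0])
v = np.array([0.2, 0.8, 1.0])

# Two of the composition functions from the framework:
additive = u + v        # vector addition
multiplicative = u * v  # elementwise (Hadamard) product
```

Addition treats the phrase meaning as a blend of the constituents, while the elementwise product emphasizes dimensions on which both words have non-zero weight.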
Journal ArticleDOI

What do we mean by prediction in language comprehension?

TL;DR: It is argued that the bulk of behavioural and neural evidence suggests that the authors predict probabilistically and at multiple levels and grains of representation, and that all these properties of language understanding can be naturally explained and productively explored within a multi-representational hierarchical actively generative architecture.
Journal ArticleDOI

Representing word meaning and order information in a composite holographic lexicon.

TL;DR: The authors used simple convolution and superposition mechanisms to learn distributed holographic representations for words, which can be used for higher order models of language comprehension, relieving the complexity required at the higher level.