Open Access
Massively-Parallel Inferencing for Natural Language Understanding and Memory Retrieval in Structured Spreading-Activation Networks
TL;DR: A full model of the language understanding and memory retrieval processes must take into account the interaction of the two and how they affect each other.

Abstract
One of the most difficult parts of the natural language understanding process is forming a semantic interpretation of the text. A reader must often make multiple inferences to understand the motives of actors and to causally connect actions that are unrelated on the basis of surface semantics alone. The inference process is complicated by the fact that text is often ambiguous both lexically and pragmatically, and that new context often forces a reinterpretation of the input’s meaning. This language understanding process itself does not exist in a vacuum: as people read text or hold conversations, they are often reminded of analogous stories or episodes. The types of memories that are triggered are influenced by context from the inferences and disambiguations of the understanding process. A full model of the language understanding and memory retrieval processes must take into account the interaction of the two and how they affect each other.
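The spreading-activation mechanism named in the title can be sketched roughly as follows: activation is seeded at concepts mentioned in the text and propagates in parallel along weighted links, so that context (e.g. "loan" vs. "river") boosts one sense of an ambiguous word over another. The network, node names, weights, and parameters below are illustrative assumptions, not details from the paper.

```python
def spread_activation(graph, sources, decay=0.5, threshold=0.01, max_steps=5):
    """Propagate activation from source nodes through weighted links.

    graph: dict mapping node -> list of (neighbor, weight) pairs.
    sources: dict mapping seed node -> initial activation.
    Returns the final activation level of every node reached.
    """
    activation = dict(sources)
    frontier = dict(sources)
    for _ in range(max_steps):
        next_frontier = {}
        for node, act in frontier.items():
            for neighbor, weight in graph.get(node, []):
                pulse = act * weight * decay
                if pulse < threshold:
                    continue  # pulse too weak to keep spreading
                activation[neighbor] = activation.get(neighbor, 0.0) + pulse
                next_frontier[neighbor] = next_frontier.get(neighbor, 0.0) + pulse
        if not next_frontier:
            break
        frontier = next_frontier
    return activation

# Toy network: "bank" is ambiguous between a money sense and a river sense.
network = {
    "loan":       [("bank/money", 0.9)],
    "river":      [("bank/river", 0.9)],
    "bank/money": [("institution", 0.8)],
    "bank/river": [("shore", 0.8)],
}
# Strong "loan" context and weak "river" context favor the money sense.
result = spread_activation(network, {"loan": 1.0, "river": 0.2})
```

With these seeds, the money sense of "bank" ends up more active than the river sense, which is the disambiguation-by-context effect the abstract describes.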
Citations
Journal Article
Review of Trajectories through knowledge space: a dynamic framework for machine comprehension by Lawrence A. Bookman. Kluwer Academic Publishers 1994.
Proceedings Article
A time-constrained architecture for cognition
TL;DR: A new set of requirements for a cognitive architecture is suggested, including strong tractability and avoidance of epistemological commitments; these can be satisfied by a time-constrained model of memory, which takes the form of a massively parallel network of objects exchanging simple signals.
References
Book
The Connection Machine
TL;DR: The Connection Machine describes a fundamentally different kind of computer that Daniel Hillis and others are now developing to perform tasks that no conventional, sequential machine can solve in a reasonable time.
Book
Inside Case-Based Reasoning
TL;DR: CBR tends to be a good approach for rich, complex domains in which there are myriad ways to generalize a case, and is similar to the rule-induction algorithms of machine learning.
Journal Article
From simple associations to systematic reasoning: A connectionist representation of rules, variables and dynamic bindings using temporal synchrony
TL;DR: A computational model is described that takes a step toward addressing the cognitive science challenge and resolving the artificial intelligence paradox and shows how a connectionist network can encode millions of facts and rules involving n-ary predicates and variables and perform a class of inferences in a few hundred milliseconds.
Journal Article
Massively Parallel Parsing: A Strongly Interactive Model of Natural Language Interpretation*
David L. Waltz, Jordan Pollack, et al.
TL;DR: This work describes a parallel model for the representation of context and of the priming of concepts in a natural language processing system with modular knowledge sources but strongly interactive processing.
Journal Article
Analog retrieval by constraint satisfaction
TL;DR: ARCS is a program that demonstrates how a set of semantic, structural, and pragmatic constraints can be used to select relevant analogs by forming a network of hypotheses and attempting to satisfy the constraints simultaneously.
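The parallel constraint satisfaction described in this entry can be illustrated with a minimal relaxation sketch: hypothesis units exchange excitatory and inhibitory signals until their activations settle, and mutually supporting hypotheses win out over rivals. All unit names, link weights, and update parameters here are invented for illustration and are not taken from the cited paper.

```python
def relax(units, links, clamped=(), steps=100, decay=0.1, rate=0.1):
    """Settle a constraint network by iterative relaxation.

    units: dict mapping unit -> initial activation in [-1, 1].
    links: dict mapping (u, v) -> weight; positive weights encode
           mutual support, negative weights encode competition.
           Links are treated as symmetric.
    clamped: units representing external evidence, held fixed.
    """
    act = dict(units)
    for _ in range(steps):
        net = {u: 0.0 for u in act}
        for (u, v), w in links.items():
            net[u] += w * act[v]
            net[v] += w * act[u]
        for u in act:
            if u in clamped:
                continue  # evidence units keep their activation
            a = act[u] * (1 - decay) + rate * net[u]
            act[u] = max(-1.0, min(1.0, a))  # clip to [-1, 1]
    return act

# Two rival mapping hypotheses; external evidence supports the first.
units = {"map1": 0.1, "map2": 0.1, "evidence": 1.0}
links = {
    ("map1", "map2"): -0.5,     # rivals inhibit each other
    ("evidence", "map1"): 0.6,  # evidence supports map1
}
final = relax(units, links, clamped=("evidence",))
```

After relaxation, the supported hypothesis settles at a high positive activation while its rival is driven negative, mirroring how the constraint network selects the relevant analog.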