Open Access

Memory-based language processing

TL;DR
Memory-based language processing as discussed by the authors is based on the idea that the direct re-use of examples using analogical reasoning is more suited for solving language processing problems than the application of rules extracted from those examples.
Abstract
Memory-based language processing--a machine learning and problem solving method for language technology--is based on the idea that the direct re-use of examples using analogical reasoning is more suited for solving language processing problems than the application of rules extracted from those examples. This book discusses the theory and practice of memory-based language processing, showing its comparative strengths over alternative methods of language modelling. Language is complex, with few generalizations, many sub-regularities and exceptions, and the advantage of memory-based language processing is that it does not abstract away from this valuable low-frequency information.
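The core idea, storing examples verbatim and classifying new instances by analogy to their nearest stored neighbours, can be illustrated with a minimal nearest-neighbour classifier. This is only a simplified sketch in the spirit of memory-based learning (the simple feature-overlap metric is real, but the toy part-of-speech memory below is invented for illustration, not taken from the book):

```python
from collections import Counter

def overlap(a, b):
    # Similarity = number of matching feature values (simple overlap metric).
    return sum(x == y for x, y in zip(a, b))

def mbl_classify(memory, instance, k=3):
    # Rank all stored examples by similarity to the new instance ...
    ranked = sorted(memory, key=lambda ex: overlap(ex[0], instance), reverse=True)
    # ... and let the k nearest neighbours vote on the class.
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Toy memory of (previous-word tag, word suffix) -> part-of-speech tag.
memory = [
    (("DET", "og"), "NOUN"),
    (("DET", "at"), "NOUN"),
    (("NOUN", "ly"), "ADV"),
    (("VERB", "ly"), "ADV"),
    (("DET", "ig"), "NOUN"),
]

print(mbl_classify(memory, ("DET", "og")))  # -> NOUN
```

Because every training example is kept in memory, low-frequency exceptions are never abstracted away; a rare case with an exact match in memory wins outright.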



Citations
Book

Natural Language Processing with Python

TL;DR: This book offers a highly accessible introduction to natural language processing, the field that supports a variety of language technologies, from predictive text and email filtering to automatic summarization and translation.
Journal ArticleDOI

MaltParser: A language-independent system for data-driven dependency parsing

TL;DR: Experimental evaluation confirms that MaltParser can achieve robust, efficient and accurate parsing for a wide range of languages without language-specific enhancements and with rather limited amounts of training data.
Proceedings Article

MaltParser: A Data-Driven Parser-Generator for Dependency Parsing

TL;DR: MaltParser is introduced: a data-driven parser generator for dependency parsing that, given a treebank in dependency format, can induce a parser for the language of that treebank.
Book

Dependency Parsing

TL;DR: This book surveys the three major classes of parsing models in current use: transition-based, graph-based, and grammar-based models, and gives a thorough introduction to the methods that are most widely used today.
Journal ArticleDOI

Probabilistic models of language processing and acquisition

TL;DR: A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.
References
Book ChapterDOI

Fast effective rule induction

TL;DR: This paper evaluates the recently proposed rule learning algorithm IREP on a large and diverse collection of benchmark problems, and proposes a number of modifications resulting in an algorithm, RIPPERk, that is very competitive with C4.5 and C4.5rules with respect to error rates, but much more efficient on large samples.
Journal ArticleDOI

Toward memory-based reasoning

TL;DR: The intensive use of memory to recall specific episodes from the past—rather than rules—should be the foundation of machine reasoning.
Posted Content

Text Chunking using Transformation-Based Learning

TL;DR: Transformation-based learning, previously applied to part-of-speech tagging with fairly high accuracy, is applied to text chunking, achieving recall and precision rates of roughly 92% for base NP chunks and 88% for more complex chunks.
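Transformation-based learning starts from a baseline labelling and repeatedly applies error-correcting rules of the form "change tag X to Y in context C". A minimal sketch of applying (not learning) such rules to a chunk-tag sequence, with invented toy rules and IOB-style tags for illustration:

```python
# Apply an ordered list of transformation rules to a tag sequence.
# Each rule is (from_tag, to_tag, condition); the condition inspects
# the words and current tags at position i.
def apply_rules(tags, words, rules):
    tags = list(tags)  # work on a copy; rules fire in the given order
    for from_tag, to_tag, condition in rules:
        for i in range(len(words)):
            if tags[i] == from_tag and condition(words, tags, i):
                tags[i] = to_tag
    return tags

words = ["the", "big", "dog", "barks"]
# Baseline: assign every word the most frequent chunk tag (here, inside-NP).
tags = ["I-NP"] * len(words)

rules = [
    # Toy rule: the first word of the sentence begins a new NP chunk.
    ("I-NP", "B-NP", lambda ws, ts, i: i == 0),
    # Toy rule: a verb-like word (tiny invented lexicon) lies outside any NP.
    ("I-NP", "O", lambda ws, ts, i: ws[i] in {"barks", "runs"}),
]

print(apply_rules(tags, words, rules))  # -> ['B-NP', 'I-NP', 'I-NP', 'O']
```

In the actual algorithm the rules are learned greedily, each chosen to maximally reduce errors against a training corpus; only the application step is sketched here.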

A framework of a mechanical translation between Japanese and English by analogy principle

TL;DR: A model of machine translation based on human language processing, in particular analogical thinking, is defined, grounded in the human ability to find analogies.
Book

Beyond grammar : an experience-based theory of language

Rens Bod
TL;DR: This work presents a Data-Oriented Parsing (DOP) model for tree representations, a formal stochastic language theory, and an extension of the model to non-context-free representations for compositional semantics.
Trending Questions (1)
What is memory processing style?

The provided paper is about memory-based language processing, which is a machine learning and problem-solving method for language technology. It discusses the theory and practice of memory-based language processing and its advantages over alternative methods of language modeling. However, the paper does not specifically mention the term "memory processing style."