Open Access Book Chapter DOI

Text Chunking Using Transformation-Based Learning

Lance Ramshaw, +1 more
- pp 157-176
TL;DR: This work has shown that the transformation-based learning approach can be applied at a higher level of textual interpretation for locating chunks in the tagged text, including non-recursive “baseNP” chunks.
Abstract
Transformation-based learning, a technique introduced by Eric Brill (1993b), has been shown to do part-of-speech tagging with fairly high accuracy. This same method can be applied at a higher level of textual interpretation for locating chunks in the tagged text, including non-recursive “baseNP” chunks. For this purpose, it is convenient to view chunking as a tagging problem by encoding the chunk structure in new tags attached to each word. In automatic tests using Treebank-derived data, this technique achieved recall and precision rates of roughly 93% for baseNP chunks (trained on 950K words) and 88% for somewhat more complex chunks that partition the sentence (trained on 200K words). Working in this new application and with larger template and training sets has also required some interesting adaptations to the transformation-based learning approach.
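To make the tag encoding concrete, here is a minimal sketch (not the authors' code) of how non-recursive baseNP chunks can be written as per-word I/O/B tags, so that chunking reduces to a word-tagging problem; the example sentence and chunk spans are purely illustrative.

```python
# Minimal sketch: encoding non-recursive baseNP chunks as per-word tags,
# so that chunking reduces to a tagging problem.
# Tags follow the I/O/B convention: I = inside a baseNP, O = outside,
# B = first word of a baseNP that immediately follows another baseNP.

def encode_chunks(words, chunk_spans):
    """chunk_spans: list of (start, end) word indices, end exclusive,
    marking non-overlapping baseNP chunks in left-to-right order."""
    tags = ["O"] * len(words)
    prev_end = None
    for start, end in chunk_spans:
        for i in range(start, end):
            tags[i] = "I"
        # A chunk starting right where the previous one ended gets "B",
        # so adjacent baseNPs stay distinguishable.
        if prev_end is not None and start == prev_end:
            tags[start] = "B"
        prev_end = end
    return tags

# Illustrative example (hypothetical sentence and spans):
words = ["The", "cat", "the", "dog", "chased", "ran", "away"]
spans = [(0, 2), (2, 4)]   # "The cat" and "the dog" are adjacent baseNPs
print(list(zip(words, encode_chunks(words, spans))))
# [('The', 'I'), ('cat', 'I'), ('the', 'B'), ('dog', 'I'),
#  ('chased', 'O'), ('ran', 'O'), ('away', 'O')]
```

With chunk structure expressed this way, the same kind of transformation-based rule learner used for part-of-speech tagging can propose rules that rewrite chunk tags based on nearby words, part-of-speech tags, and chunk tags.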


Citations
Journal Article DOI

Leveraging word confusion networks for named entity modeling and detection from conversational telephone speech

TL;DR: This work proposes using Word Confusion Networks (WCNs), sequences of bundled words, for both NE modeling and detection, treating the word bundles as units rather than as independent words, and shows that using WCNs improves NE detection accuracy regardless of the NE modeling method.
Proceedings Article DOI

Target-specified Sequence Labeling with Multi-head Self-attention for Target-oriented Opinion Words Extraction.

TL;DR: This paper proposes Target-Specified sequence labeling with Multi-head Self-Attention (TSMSA) for TOWE, into which any pre-trained language model with multi-head self-attention can be conveniently integrated, and shows that TSMSA significantly outperforms the benchmark methods on TOWE.
Proceedings Article DOI

Learning to Imagine: Integrating Counterfactual Thinking in Neural Discrete Reasoning

TL;DR: This work devises a Learning to Imagine (L2I) module that can be seamlessly incorporated into NDR models to perform imagination of unseen counterfactuals, and applies the proposed L2I to TAGOP, the state-of-the-art solution on TAT-QA, validating the rationality and effectiveness of the approach.
Journal Article DOI

Elicitation and use of relevance feedback information

TL;DR: Two approaches to interactively refining user search formulations are presented, and their evaluation in the new High Accuracy Retrieval from Documents (HARD) track of TREC-12 shows that one of the methods is an effective means of interactive query expansion and yields significant performance improvements.
Journal Article DOI

ATE-SPD: simultaneous extraction of aspect-term and aspect sentiment polarity using Bi-LSTM-CRF neural network

TL;DR: A deep neural network model named ATE-SPD for aspect-based sentiment analysis simultaneously extracts aspect terms and their corresponding polarities in review sentences, using a novel set of sequential tags for extracting aspect terms along with their sentiment polarities.
References
Book Chapter DOI

Parsing By Chunks

TL;DR: The typical chunk consists of a single content word surrounded by a constellation of function words, matching a fixed template, and the relationships between chunks are mediated more by lexical selection than by rigid templates.
Proceedings Article DOI

A Stochastic Parts Program and Noun Phrase Parser for Unrestricted Text

TL;DR: The authors used a linear-time dynamic programming algorithm to find an assignment of parts of speech to words that optimizes the product of (a) lexical probabilities (the probability of observing part of speech i given word i) and (b) contextual probabilities (the probability of observing part of speech i given n following parts of speech).
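The dynamic-programming search described in that summary can be sketched as a simple Viterbi-style pass; the version below is a simplified bigram variant (the original program uses larger contexts), and all names and probabilities are toy values for illustration only.

```python
# Simplified sketch of the dynamic-programming tag search described above.
# It maximizes the product of lexical probabilities P(tag | word) and
# contextual (bigram) probabilities P(tag | previous tag); the original
# program uses wider contexts. All probabilities here are toy values.

def viterbi_tags(words, tagset, lexical, contextual):
    """lexical[(word, tag)] = P(tag | word); contextual[(prev, tag)] = P(tag | prev)."""
    # best[tag] = (score of best path ending in tag, tag sequence of that path)
    best = {t: (lexical.get((words[0], t), 0.0), [t]) for t in tagset}
    for word in words[1:]:
        new_best = {}
        for t in tagset:
            lex = lexical.get((word, t), 0.0)
            score, path = max(
                ((best[p][0] * contextual.get((p, t), 0.0), best[p][1]) for p in tagset),
                key=lambda x: x[0],
            )
            new_best[t] = (score * lex, path + [t])
        best = new_best
    return max(best.values(), key=lambda x: x[0])[1]

# Toy example (hypothetical probabilities):
tagset = {"DT", "NN", "VB"}
lexical = {("the", "DT"): 1.0, ("dog", "NN"): 0.9, ("dog", "VB"): 0.1,
           ("barks", "VB"): 0.8, ("barks", "NN"): 0.2}
contextual = {("DT", "NN"): 0.9, ("DT", "VB"): 0.1, ("NN", "VB"): 0.8,
              ("NN", "NN"): 0.2, ("VB", "NN"): 0.5, ("VB", "VB"): 0.1}
print(viterbi_tags(["the", "dog", "barks"], tagset, lexical, contextual))
# ['DT', 'NN', 'VB']
```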
Proceedings Article

Some advances in transformation-based part of speech tagging

TL;DR: In this article, a rule-based approach to tagging unknown words is described, and the tagger can be extended into a k-best tagger, in which multiple tags can be assigned to words in some cases of uncertainty.
Journal Article DOI

Performance structures: A psycholinguistic and linguistic appraisal

TL;DR: In this paper, two lines of research are combined to deal with a long-standing problem in both fields: why the performance structures of sentences (structures based on experimental data, such as pausing and parsing values) are not fully accounted for by linguistic theories of phrase structure.
Book

A corpus-based approach to language learning

Eric D. Brill
TL;DR: A learning algorithm is described that takes a small structurally annotated corpus of text and a larger unannotated corpus as input, and automatically learns how to assign accurate structural descriptions to sentences not in the training corpus.
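As a rough illustration of the transformation-based learning idea behind this reference and the chunking work above, here is a toy, fully supervised version of the rule-learning loop; it is a sketch only, with made-up rule templates and data (Brill's actual system also exploits a larger unannotated corpus and far richer templates).

```python
# Toy sketch of a transformation-based learning loop (not Brill's code):
# start from an initial tagging, then repeatedly pick the single rewrite
# rule that most reduces errors against the annotated corpus, apply it,
# and record it. The only rule template here is
# "change tag FRM to TO when the previous word is PREV_WORD".

from itertools import product

def apply_rule(tags, words, rule):
    frm, to, prev_word = rule
    out = list(tags)
    for i in range(1, len(words)):
        if tags[i] == frm and words[i - 1] == prev_word:
            out[i] = to
    return out

def errors(tags, gold):
    return sum(t != g for t, g in zip(tags, gold))

def learn(words, gold, initial, tagset, max_rules=5):
    tags, rules = list(initial), []
    vocab = set(words)
    while len(rules) < max_rules:
        candidates = [(f, t, w) for f, t, w in product(tagset, tagset, vocab) if f != t]
        best = min(candidates, key=lambda r: errors(apply_rule(tags, words, r), gold))
        if errors(apply_rule(tags, words, best), gold) >= errors(tags, gold):
            break  # no candidate rule improves the tagging any further
        tags = apply_rule(tags, words, best)
        rules.append(best)
    return rules

# Toy example: "race" is initially tagged NN everywhere, but should be VB after "to".
words   = ["to", "race", "the", "race"]
gold    = ["TO", "VB", "DT", "NN"]
initial = ["TO", "NN", "DT", "NN"]
print(learn(words, gold, initial, tagset={"NN", "VB", "DT", "TO"}))
# [('NN', 'VB', 'to')]
```

The output of learning is just the ordered rule list, which is applied in the same order to new text at run time.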