Open Access Proceedings Article
Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data
John Lafferty, Andrew McCallum, Fernando Pereira
pp. 282–289
TL;DR: This work presents iterative parameter estimation algorithms for conditional random fields and compares the performance of the resulting models to HMMs and MEMMs on synthetic and natural-language data.

Abstract:
We present conditional random fields, a framework for building probabilistic models to segment and label sequence data. Conditional random fields offer several advantages over hidden Markov models and stochastic grammars for such tasks, including the ability to relax strong independence assumptions made in those models. Conditional random fields also avoid a fundamental limitation of maximum entropy Markov models (MEMMs) and other discriminative Markov models based on directed graphical models, which can be biased towards states with few successor states. We present iterative parameter estimation algorithms for conditional random fields and compare the performance of the resulting models to HMMs and MEMMs on synthetic and natural-language data.
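The abstract describes labeling sequences with a linear-chain model. As a rough illustration only (not the paper's own code, and with the emission and transition score matrices assumed to be given rather than learned), the following sketch shows Viterbi decoding for a linear-chain CRF, which recovers the highest-scoring label sequence:

```python
import numpy as np

def crf_viterbi(emissions, transitions):
    """Most likely label sequence under linear-chain CRF scores.

    emissions:   (T, K) array; emissions[t, k] scores label k at position t
    transitions: (K, K) array; transitions[i, j] scores moving from label i to j
    Returns the argmax label sequence as a list of ints.
    """
    T, K = emissions.shape
    score = emissions[0].copy()           # best path score ending in each label at t=0
    back = np.zeros((T, K), dtype=int)    # backpointers for path recovery
    for t in range(1, T):
        # cand[i, j]: best path ending in label i at t-1, then taking label j at t
        cand = score[:, None] + transitions + emissions[t][None, :]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # trace backpointers from the best final label
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

Training the scores (the parameter estimation the paper focuses on) additionally requires the forward–backward algorithm to compute the normalizer and expected feature counts; the decoder above only handles inference given fixed scores.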
Citations
Collective classification with relational dependency networks
TL;DR: This paper presents relational dependency networks (RDNs), a collective classification model that offers simple parameter estimation and efficient structure learning, and shows that collective classification improves performance.
Proceedings ArticleDOI
Facial expression recognition with temporal modeling of shapes
TL;DR: This work proposes a framework for automatic facial expression recognition from continuous video sequence by modeling temporal variations within shapes using Latent-Dynamic Conditional Random Fields, and shows that the proposed approach outperforms CRFs for recognizing facial expressions.
Posted Content
Exploiting saliency for object segmentation from image level labels
TL;DR: This paper proposes using a saliency model as additional information, thereby exploiting prior knowledge on the object extent and image statistics, and shows how to combine both information sources in order to recover 80% of the fully supervised performance of pixel-wise semantic labelling.
Proceedings ArticleDOI
Semi-Supervised Conditional Random Fields for Improved Sequence Segmentation and Labeling
TL;DR: A new semi-supervised training procedure for conditional random fields (CRFs) that can be used to train sequence segmentors and labelers from a combination of labeled and unlabeled training data is presented, based on extending the minimum entropy regularization framework to the structured prediction case.
Journal ArticleDOI
A Combination Approach to Web User Profiling
TL;DR: This article formalizes the profiling problem as several subtasks: profile extraction, profile integration, and user interest discovery, and proposes a combination approach to deal with the profiling tasks.
References
Journal ArticleDOI
A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting
Yoav Freund, Robert E. Schapire
TL;DR: The model studied can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting, and it is shown that the multiplicative weight-update Littlestone–Warmuth rule can be adapted to this model, yielding bounds that are slightly weaker in some cases, but applicable to a considerably more general class of learning problems.
Gradient-based learning applied to document recognition
Yann LeCun, Léon Bottou, Yoshua Bengio, Patrick Haffner
TL;DR: This paper reviews various methods applied to handwritten character recognition and compares them on a standard handwritten digit recognition task; convolutional neural networks are shown to outperform all other techniques.
Book
Foundations of Statistical Natural Language Processing
TL;DR: This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear and provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations.
Book
Biological Sequence Analysis: Probabilistic Models of Proteins and Nucleic Acids
TL;DR: This book gives a unified, up-to-date and self-contained account, with a Bayesian slant, of such methods and, more generally, of probabilistic methods for sequence analysis.
Journal ArticleDOI
A maximum entropy approach to natural language processing
TL;DR: This paper presents a maximum-likelihood approach for automatically constructing maximum entropy models and describes how to implement the approach efficiently, using several problems in natural language processing as examples.