Open Access · Journal Article · DOI

Learning in the presence of concept drift and hidden contexts

TL;DR: Describes a family of learning algorithms that flexibly react to concept drift and can take advantage of situations where contexts reappear, including a heuristic that constantly monitors the system's behavior.
Abstract
On-line learning in domains where the target concept depends on some hidden context poses serious problems. A changing context can induce changes in the target concepts, producing what is known as concept drift. We describe a family of learning algorithms that flexibly react to concept drift and can take advantage of situations where contexts reappear. The general approach underlying all these algorithms consists of (1) keeping only a window of currently trusted examples and hypotheses; (2) storing concept descriptions and reusing them when a previous context reappears; and (3) controlling both of these functions by a heuristic that constantly monitors the system's behavior. The paper reports on experiments that test the systems' performance under various conditions, such as different levels of noise and different extents and rates of concept drift.
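The windowing-and-monitoring scheme summarized in the abstract can be illustrated with a small sketch. The class below is a hypothetical minimal example, not the algorithms described in the paper: it keeps a window of trusted examples, predicts by majority vote over matching stored examples, and uses a simple accuracy-monitoring heuristic to shrink the window when drift is suspected. The concept-storage-and-reuse component is omitted for brevity, and all names and thresholds are illustrative.

```python
from collections import deque

class WindowedMajorityLearner:
    """Hypothetical sketch of window-based drift handling: keep only a
    window of trusted examples, and let a monitoring heuristic shrink
    the window when recent accuracy drops (suspected concept drift)."""

    def __init__(self, max_window=50, acc_threshold=0.6):
        self.window = deque()           # currently trusted (x, y) examples
        self.max_window = max_window
        self.recent = deque(maxlen=20)  # rolling record of prediction hits
        self.acc_threshold = acc_threshold

    def predict(self, x):
        # Majority label among stored examples with the same attribute
        # value; default to 0 when nothing matches yet.
        votes = [y for xi, y in self.window if xi == x]
        return max(set(votes), key=votes.count) if votes else 0

    def update(self, x, y):
        # Monitor performance on the incoming example before learning it.
        self.recent.append(int(self.predict(x) == y))
        self.window.append((x, y))
        # Heuristic: a drop in recent accuracy suggests drift, so distrust
        # (drop) the oldest half of the window and restart the monitor.
        if len(self.recent) == self.recent.maxlen:
            if sum(self.recent) / len(self.recent) < self.acc_threshold:
                for _ in range(len(self.window) // 2):
                    self.window.popleft()
                self.recent.clear()
        # Never trust more than max_window examples.
        while len(self.window) > self.max_window:
            self.window.popleft()
```

Feeding this learner a stream whose labeling rule flips midway shows the intended behavior: the monitor detects the accuracy drop, the window is purged of outdated examples, and predictions then track the new concept.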


Citations
Proceedings Article · DOI

Anomaly detection based on eccentricity analysis

TL;DR: This paper proposes a new, less conservative and more sensitive condition for anomaly detection, quite different from the traditional “nσ”-type conditions, and points to some possible applications that will be the domain of future work.
Journal Article · DOI

Dynamic churn prediction framework with more effective use of rare event data: The case of private banking

TL;DR: A dynamic churn prediction framework for generating training data from customer records is proposed; it significantly increases prediction accuracy across prediction horizons compared to the standard approach of one observation per customer.
Journal Article · DOI

Adaptive Clustering for Dynamic IoT Data Streams

TL;DR: This work proposes a method that determines, from the data distribution, how many different clusters can be found in a stream, and demonstrates how the number of clusters in a real-world data stream can be determined by analyzing its distribution.
Journal Article · DOI

Exploiting Class Hierarchies for Knowledge Transfer in Hyperspectral Data

TL;DR: A knowledge transfer framework is proposed that leverages information extracted from existing labeled data to classify spatially separate and multitemporal test data; in the absence of any labeled data in the new area, the approach outperforms direct application of the original classifier to the new data.
Journal Article · DOI

Real-time data mining of non-stationary data streams from sensor networks

TL;DR: Presents IOLIN (incremental on-line information network), a new real-time data-mining algorithm that saves a significant amount of computational effort by updating an existing model as long as no major concept drift is detected.
References
Proceedings Article · DOI

A theory of the learnable

TL;DR: This paper regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming, and gives a precise methodology for studying this phenomenon from a computational viewpoint.
Journal Article · DOI

Instance-Based Learning Algorithms

TL;DR: This paper extends the nearest neighbor algorithm, which has large storage requirements, and describes how those requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy.
Book

Machine Learning: An Artificial Intelligence Approach

TL;DR: This book contains tutorial overviews and research papers on contemporary trends in the area of machine learning viewed from an AI perspective, including learning from examples, modeling human learning strategies, knowledge acquisition for expert systems, learning heuristics, discovery systems, and conceptual data analysis.
Journal Article · DOI

Learnability and the Vapnik-Chervonenkis dimension

TL;DR: This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
Journal Article · DOI

Queries and Concept Learning

TL;DR: This work considers the problem of using queries to learn an unknown concept; several types of queries are described and studied: membership, equivalence, subset, superset, disjointness, and exhaustiveness queries.