Learning in the presence of concept drift and hidden contexts
Gerhard Widmer, Miroslav Kubat, et al.
TL;DR: A family of learning algorithms that flexibly react to concept drift and can take advantage of situations where contexts reappear is described, including a heuristic that constantly monitors the system's behavior.

Citations
Journal Article
A robust incremental learning method for non-stationary environments
TL;DR: This work proposes a new method for single-layer neural networks, based on introducing a forgetting function into an incremental online learning algorithm, which gives monotonically increasing importance to new data.
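The idea of a forgetting function can be illustrated with a minimal sketch. This is not the paper's method: it is a hypothetical one-dimensional linear model whose sufficient statistics decay by a forgetting factor `lam` at every step, so older examples carry exponentially less weight than newer ones.

```python
class ForgettingLinearModel:
    """Toy sketch (not the paper's algorithm): 1-D linear regression
    through the origin whose sufficient statistics are decayed by a
    forgetting factor `lam` on every update, giving monotonically
    increasing importance to newer data."""

    def __init__(self, lam=0.9):
        self.lam = lam     # forgetting factor in (0, 1]
        self.sxx = 0.0     # decayed sum of x * x
        self.sxy = 0.0     # decayed sum of x * y

    def update(self, x, y):
        # Old statistics fade by `lam` before the new example is added.
        self.sxx = self.lam * self.sxx + x * x
        self.sxy = self.lam * self.sxy + x * y

    @property
    def slope(self):
        return self.sxy / self.sxx if self.sxx else 0.0
```

With `lam=1.0` this reduces to ordinary least squares on all data seen so far; with `lam < 1` the fitted slope tracks a drifting target relationship.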
Patent
Methods, media, and systems for securing communications between a first node and a second node
Salvatore J. Stolfo, Gabriela F. Ciocarlie, Vanessa Frias-Martinez, Janak Parekh, Angelos D. Keromytis, Joseph Sherrick, et al.
TL;DR: In this article, the authors present methods, media, and systems for securing communications between a first node and a second node, in which the first node is authorized to receive traffic from the second node based on the difference between at least one model of the second user's behavior and the first user's behavior.
Book Chapter
Adaptive Learning Rate for Online Linear Discriminant Classifiers
TL;DR: The adaptive learning rate was applied to four online linear classifier models on generated and real streaming data with concept drift; the online linear discriminant classifier (O-LDC) was found to outperform balanced Winnow, the perceptron, and a recently proposed online linear discriminant analysis.
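The general notion of an adaptive learning rate for an online linear classifier can be sketched as follows. This is a hedged illustration, not the O-LDC rule from the paper: a perceptron-style learner whose rate grows after mistakes (to track possible drift) and shrinks after correct predictions (to stabilise); the `up`/`down` multipliers are hypothetical choices.

```python
import numpy as np

class AdaptiveRatePerceptron:
    """Sketch (not the paper's O-LDC): an online linear classifier whose
    learning rate increases after each mistake, so it can re-adapt under
    concept drift, and decays after correct predictions."""

    def __init__(self, dim, eta=0.1, up=1.1, down=0.95):
        self.w = np.zeros(dim)   # weight vector
        self.b = 0.0             # bias term
        self.eta = eta           # current learning rate
        self.up, self.down = up, down

    def step(self, x, y):
        """Predict the label of x (y in {-1, +1}), then update online."""
        pred = 1 if self.w @ x + self.b >= 0 else -1
        if pred != y:
            self.w += self.eta * y * x
            self.b += self.eta * y
            self.eta *= self.up    # mistake: adapt faster
        else:
            self.eta *= self.down  # correct: settle down
        return pred
```

On a stationary, separable stream the rate decays toward zero; after a drift the burst of mistakes drives it back up.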
Journal Article
A Low-Granularity Classifier for Data Streams with Concept Drifts and Biased Class Distribution
TL;DR: This paper shows that reducing model granularity lowers the update cost: fine-grained models make it possible to efficiently pinpoint the local components affected by concept drift and to derive new components that reflect the current data distribution, avoiding expensive updates on a global scale.
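The local-update idea can be made concrete with a small sketch. This is a hypothetical classifier, not the one from the paper: the input range is split into bins, each holding its own class counts, so drift confined to one region only requires rebuilding that region's component.

```python
class BinnedClassifier:
    """Hypothetical sketch of a fine-granularity model: each bin of the
    input range keeps independent class counts, so concept drift local
    to one region is handled by resetting only that bin."""

    def __init__(self, n_bins=10, lo=0.0, hi=1.0):
        self.n_bins = n_bins
        self.lo, self.hi = lo, hi
        self.counts = [{} for _ in range(n_bins)]  # per-bin class counts

    def _bin(self, x):
        i = int((x - self.lo) / (self.hi - self.lo) * self.n_bins)
        return min(max(i, 0), self.n_bins - 1)

    def update(self, x, label):
        c = self.counts[self._bin(x)]
        c[label] = c.get(label, 0) + 1

    def predict(self, x):
        c = self.counts[self._bin(x)]
        return max(c, key=c.get) if c else None

    def reset_bin(self, x):
        # Drift detected near x: rebuild only the local component.
        self.counts[self._bin(x)] = {}
```

A coarse (single-component) model would have to be retrained wholesale after drift; here the other bins are untouched.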
Journal Article
Probabilistic local reconstruction for k-NN regression and its application to virtual metrology in semiconductor manufacturing
TL;DR: A probabilistic local reconstruction (PLR) is proposed as an extended version of LLR in k-NN regression to prevent over-fitting; the uncertainty information on prediction outcomes provided by PLR supports more appropriate decision making.
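For orientation, here is a minimal sketch of the k-NN regression setting that PLR extends, with a crude neighbour-variance uncertainty estimate. The probabilistic local reconstruction itself is not reproduced here; this only illustrates the baseline and why an uncertainty output is useful.

```python
import numpy as np

def knn_regress(train_x, train_y, query, k=3):
    """Plain 1-D k-NN regression sketch (the baseline PLR builds on):
    returns the mean of the k nearest targets plus the neighbours'
    variance as a crude uncertainty estimate."""
    train_x = np.asarray(train_x, dtype=float)
    train_y = np.asarray(train_y, dtype=float)
    idx = np.argsort(np.abs(train_x - query))[:k]  # k nearest neighbours
    neigh = train_y[idx]
    return neigh.mean(), neigh.var()
```

A high variance among the neighbours signals an unreliable prediction, which is the kind of information PLR provides in a principled, probabilistic form.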
References
Proceedings Article
A theory of the learnable
TL;DR: This paper regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming, and gives a precise methodology for studying this phenomenon from a computational viewpoint.
Journal Article
Instance-Based Learning Algorithms
TL;DR: This paper extends the nearest neighbor algorithm, which has large storage requirements, and describes how those requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy.
Book
Machine Learning: An Artificial Intelligence Approach
TL;DR: This book contains tutorial overviews and research papers on contemporary trends in the area of machine learning viewed from an AI perspective, including learning from examples, modeling human learning strategies, knowledge acquisition for expert systems, learning heuristics, discovery systems, and conceptual data analysis.
Journal Article
Learnability and the Vapnik-Chervonenkis dimension
TL;DR: This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
Journal Article
Queries and Concept Learning
TL;DR: This work considers the problem of using queries to learn an unknown concept; several types of queries are described and studied: membership, equivalence, subset, superset, disjointness, and exhaustiveness queries.