Open Access · Journal Article

Instance-Based Learning Algorithms

TLDR
This paper describes an instance-based learning framework that extends the nearest neighbor algorithm, which has large storage requirements, and shows how those requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy.
Abstract
Storing and using specific instances improves the performance of several supervised learning algorithms. These include algorithms that learn decision trees, classification rules, and distributed networks. However, no investigation has analyzed algorithms that use only specific instances to solve incremental learning tasks. In this paper, we describe a framework and methodology, called instance-based learning, that generates classification predictions using only specific instances. Instance-based learning algorithms do not maintain a set of abstractions derived from specific instances. This approach extends the nearest neighbor algorithm, which has large storage requirements. We describe how storage requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy. While the storage-reducing algorithm performs well on several real-world databases, its performance degrades rapidly with the level of attribute noise in training instances. Therefore, we extended it with a significance test to distinguish noisy instances. This extended algorithm's performance degrades gracefully with increasing noise levels and compares favorably with a noise-tolerant decision tree algorithm.
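The storage-reduction idea in the abstract — keep an instance only when the current memory would misclassify it — can be sketched in a few lines of Python. This is an illustrative sketch in the spirit of the authors' storage-reducing algorithm, not their implementation; the class and function names here are invented for the example.

```python
import math

def distance(x, y):
    # Euclidean distance over numeric attributes
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

class StorageReducingNN:
    """1-NN classifier that stores only the training instances it misclassifies."""

    def __init__(self):
        self.stored = []  # list of (attributes, label) pairs kept in memory

    def predict(self, x):
        # Classify x by the label of the nearest stored instance
        if not self.stored:
            return None
        _, label = min(self.stored, key=lambda inst: distance(inst[0], x))
        return label

    def train_incremental(self, x, label):
        # Incremental update: store the instance only if memory misclassifies it
        if self.predict(x) != label:
            self.stored.append((x, label))

clf = StorageReducingNN()
training = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
for x, y in training:
    clf.train_incremental(x, y)

print(clf.predict((0.05, 0.1)))  # prints "a"
print(len(clf.stored))           # prints 2: only the two misclassified instances were kept
```

Because correctly classified instances are discarded, memory tends to accumulate instances near class boundaries; as the abstract notes, this is exactly why noisy instances hurt such an algorithm, motivating the significance test the authors add.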



Citations
Journal Article

A new nearest neighbor classification method based on fuzzy set theory and aggregation operators

TL;DR: FABC is validated on well-known datasets representing a range of classification difficulties and compared against many extensions of the FuzzyNNC approach; the authors conclude that its optimized similarity measure and global voting rule handle uncertainty in the classification process more robustly than the comparison methods.
Proceedings Article

Attribute Selection in Software Engineering Datasets for Detecting Fault Modules

TL;DR: This paper applies different data mining algorithms to select attributes from publicly available datasets (the PROMISE repository) and then uses different classifiers to detect faulty modules, showing that the reduced datasets retain the prediction capability of the originals with a lower number of attributes.
Journal Article

A Feature Subset Selection Method Based On High-Dimensional Mutual Information

TL;DR: It is proved that if the mutual information between a feature set X and the class attribute Y equals the entropy of Y, then X is a Markov blanket of Y; the resulting approach outperforms existing filter feature subset selection methods on most of the 24 selected benchmark data sets.
Journal Article

Comparing Structures Using a Hopfield-Style Neural Network

TL;DR: This work combines earlier approaches to structural matching and constraint relaxation by spreading activation in neural networks with the method of solving optimization tasks using Hopfield-style nets.
Book Chapter

On the use of brain decoded signals for online user adaptive gesture recognition systems

TL;DR: A hand gesture recognition system based on wearable motion sensors adapts online by exploiting the recognition of error-related potentials (ErrP): the system becomes aware of its own performance and can self-improve through recurring detection of ErrP signals.
References
Journal Article

Induction of Decision Trees

J. R. Quinlan
25 Mar 1986
TL;DR: This paper describes an approach to synthesizing decision trees that has been used in a variety of systems, presents one such system, ID3, in detail, and discusses a reported shortcoming of the basic algorithm.
Book

Classification and regression trees

Leo Breiman
TL;DR: This monograph focuses on the methodology used to construct tree-structured rules, covering the use of trees as a data analysis method and, in a more mathematical framework, proving some of their fundamental properties.
Journal Article

Nearest neighbor pattern classification

TL;DR: The nearest neighbor decision rule assigns to an unclassified sample point the classification of the nearest of a set of previously classified points; in this sense, half of the classification information in an infinite sample set is contained in the nearest neighbor.
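The "half the classification information" remark refers to the well-known Cover–Hart asymptotic bound, stated here from that classic result rather than from this page: for an $M$-class problem, the large-sample nearest neighbor risk $R$ is bounded in terms of the Bayes risk $R^*$ by

```latex
R^* \le R \le R^*\left(2 - \frac{M}{M-1}\,R^*\right)
```

For two classes ($M = 2$) this reduces to $R \le 2R^*(1 - R^*)$: the nearest neighbor rule's error is at most twice the Bayes-optimal error, which is the sense in which a single nearest neighbor already carries half the attainable classification information.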