Open Access · Journal Article

Instance-Based Learning Algorithms

TL;DR
This paper extends the nearest neighbor algorithm, which has large storage requirements, and describes how those requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy.
Abstract
Storing and using specific instances improves the performance of several supervised learning algorithms. These include algorithms that learn decision trees, classification rules, and distributed networks. However, no investigation has analyzed algorithms that use only specific instances to solve incremental learning tasks. In this paper, we describe a framework and methodology, called instance-based learning, that generates classification predictions using only specific instances. Instance-based learning algorithms do not maintain a set of abstractions derived from specific instances. This approach extends the nearest neighbor algorithm, which has large storage requirements. We describe how storage requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy. While the storage-reducing algorithm performs well on several real-world databases, its performance degrades rapidly with the level of attribute noise in training instances. Therefore, we extended it with a significance test to distinguish noisy instances. This extended algorithm's performance degrades gracefully with increasing noise levels and compares favorably with a noise-tolerant decision tree algorithm.
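
As a concrete illustration of the storage-reducing idea, here is a minimal Python sketch of an instance-based learner that keeps a training instance only when the instances already in memory misclassify it. The class name, the Euclidean distance on numeric attributes, and the training loop are illustrative assumptions, not the authors' exact algorithm or its noise-tolerant, significance-tested extension.

```python
import math

def euclidean(a, b):
    """Distance between two numeric feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class StorageReducingIBL:
    """Sketch: store only the instances the current memory gets wrong."""

    def __init__(self):
        self.memory = []  # list of (features, label) pairs

    def predict(self, x):
        """1-nearest-neighbor prediction from the stored instances."""
        if not self.memory:
            return None
        features, label = min(self.memory, key=lambda inst: euclidean(inst[0], x))
        return label

    def train_incremental(self, features, label):
        """Store the instance only if the current memory misclassifies it."""
        if self.predict(features) != label:
            self.memory.append((features, label))


# Example: stream a few 2-D instances through the learner
learner = StorageReducingIBL()
for features, label in [((0.0, 0.0), "A"), ((0.2, 0.1), "A"), ((1.0, 1.0), "B")]:
    learner.train_incremental(features, label)
print(learner.predict((0.9, 0.9)))   # -> "B"
print(len(learner.memory))           # typically smaller than the full training set
```
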



Citations
Journal Article

Maintaining Case‐Based Reasoners: Dimensions and Directions

TL;DR: A framework for understanding the general problem of case‐based reasoner maintenance (CBRM) is presented by refining and updating the earlier framework of dimensions for case‐base maintenance, applying the refined dimensions to the entire range of knowledge containers, and extending the theory to include coordinated cross‐container maintenance.
Journal Article

Noise modelling and evaluating learning from examples

TL;DR: The nature of noise in the model is discussed and modelled using information-theoretic ideas, in particular majorisation, and it is shown that increasing noise has a detrimental effect on learning.
Journal Article

Hybrid k-Nearest Neighbor Classifier

TL;DR: A random subspace ensemble framework based on the HBKNN classifier (RS-HBKNN) is proposed to classify high-dimensional datasets with noisy attributes, and it outperforms most state-of-the-art classification approaches.
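
The following is a rough sketch of the random subspace idea behind such an ensemble, assuming plain k-NN members with squared Euclidean distance and simple majority voting; the function names and parameters are hypothetical and do not reproduce the RS-HBKNN classifier itself.

```python
import random
from collections import Counter

def knn_vote(train, x, k, dims):
    """Majority label among the k nearest training instances,
    measuring distance only on the attribute subset `dims`."""
    def sq_dist(inst):
        return sum((inst[0][d] - x[d]) ** 2 for d in dims)
    neighbours = sorted(train, key=sq_dist)[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

def random_subspace_knn(train, x, k=3, n_members=10, subspace_size=2, seed=0):
    """Each ensemble member sees a different random attribute subset;
    the final prediction is the majority vote over all members."""
    rng = random.Random(seed)
    n_attrs = len(train[0][0])
    votes = [
        knn_vote(train, x, k, rng.sample(range(n_attrs), min(subspace_size, n_attrs)))
        for _ in range(n_members)
    ]
    return Counter(votes).most_common(1)[0][0]
```
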
Journal Article

Machine Learning in Amyotrophic Lateral Sclerosis: Achievements, Pitfalls, and Future Directions.

TL;DR: A comprehensive, systematic, and critical review of ML initiatives in ALS to date and their potential in research, clinical, and pharmacological applications finds that combining multiple clinical, biofluid, and imaging biomarkers is likely to increase the accuracy of mathematical modeling and contribute to optimized clinical trial designs.
Journal Article

Voting over multiple condensed nearest neighbors

TL;DR: The condensed nearest neighbor classifier incrementally stores a subset of the sample, decreasing storage and computation requirements; it is proposed to train multiple such subsets and take a vote over them, combining predictions from a set of concept descriptions.
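
Below is a minimal sketch of that scheme under simplifying assumptions (Hart-style condensing, 1-NN members, squared Euclidean distance, majority voting over shuffled orderings); the names and the resampling choice are illustrative rather than the paper's exact procedure.

```python
import random
from collections import Counter

def nearest_label(subset, x):
    """Label of the stored instance closest to x (squared Euclidean)."""
    best = min(subset, key=lambda inst: sum((a - b) ** 2 for a, b in zip(inst[0], x)))
    return best[1]

def condense(sample):
    """Keep an instance only if the subset built so far misclassifies it."""
    subset = [sample[0]]
    for features, label in sample[1:]:
        if nearest_label(subset, features) != label:
            subset.append((features, label))
    return subset

def voting_condensed_nn(sample, x, n_subsets=5, seed=0):
    """Build several condensed subsets from shuffled orderings of the
    sample and take a majority vote over their 1-NN predictions."""
    rng = random.Random(seed)
    votes = []
    for _ in range(n_subsets):
        shuffled = sample[:]
        rng.shuffle(shuffled)
        votes.append(nearest_label(condense(shuffled), x))
    return Counter(votes).most_common(1)[0][0]
```
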
References
Journal Article

Induction of Decision Trees

J. R. Quinlan
25 Mar 1986
TL;DR: An approach to synthesizing decision trees that has been used in a variety of systems is described, one such system, ID3, is presented in detail, and a reported shortcoming of the basic algorithm is discussed.
Book

Classification and Regression Trees

Leo Breiman
TL;DR: The methodology used to construct tree-structured rules is the focus of this monograph, which covers the use of trees as a data analysis method and, in a more mathematical framework, proves some of their fundamental properties.
Journal Article

Nearest neighbor pattern classification

TL;DR: The nearest neighbor decision rule assigns to an unclassified sample point the classification of the nearest of a set of previously classified points; in this sense, half the classification information in an infinite sample set is contained in the nearest neighbor.
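
For completeness, the decision rule itself fits in a few lines of Python; the squared Euclidean distance and the toy data are illustrative assumptions.

```python
def nearest_neighbor_classify(train, x):
    """Assign to x the label of its single nearest training point."""
    best = min(train, key=lambda inst: sum((a - b) ** 2 for a, b in zip(inst[0], x)))
    return best[1]

# Toy example with two classes in two dimensions
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"), ((1.0, 1.0), "B")]
print(nearest_neighbor_classify(train, (0.9, 0.8)))  # -> "B"
```
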