Open Access Journal ArticleDOI

Instance-Based Learning Algorithms

TLDR
This paper extends the nearest neighbor algorithm, which has large storage requirements, and describes how those requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy.
Abstract
Storing and using specific instances improves the performance of several supervised learning algorithms. These include algorithms that learn decision trees, classification rules, and distributed networks. However, no investigation has analyzed algorithms that use only specific instances to solve incremental learning tasks. In this paper, we describe a framework and methodology, called instance-based learning, that generates classification predictions using only specific instances. Instance-based learning algorithms do not maintain a set of abstractions derived from specific instances. This approach extends the nearest neighbor algorithm, which has large storage requirements. We describe how storage requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy. While the storage-reducing algorithm performs well on several real-world databases, its performance degrades rapidly with the level of attribute noise in training instances. Therefore, we extended it with a significance test to distinguish noisy instances. This extended algorithm's performance degrades gracefully with increasing noise levels and compares favorably with a noise-tolerant decision tree algorithm.
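For illustration, the storage-reduction idea can be sketched in a few lines of Python: a learner that classifies by the single nearest stored instance and keeps a new training instance only when the instances stored so far misclassify it. The class name, the Euclidean distance over numeric attributes, and the training loop below are assumptions of this sketch, not the authors' implementation (which also normalizes attributes and, in its extended form, applies a significance test to filter noisy instances).

import math

class StorageReducingIBL:
    # Minimal sketch of a storage-reducing instance-based learner.

    def __init__(self):
        self.stored = []  # (attribute_vector, label) pairs actually retained

    def _distance(self, a, b):
        # Euclidean distance over numeric attributes (an assumption of this sketch).
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def predict(self, x):
        if not self.stored:
            return None
        nearest = min(self.stored, key=lambda inst: self._distance(inst[0], x))
        return nearest[1]

    def train_incrementally(self, x, label):
        # Keep the instance only when the currently stored set misclassifies it;
        # correctly classified instances are discarded, which is where the
        # storage savings come from.
        if self.predict(x) != label:
            self.stored.append((x, label))

# Instances arrive one at a time, as in an incremental learning task.
learner = StorageReducingIBL()
for x, y in [((0.1, 0.2), "a"), ((0.9, 0.8), "b"), ((0.15, 0.25), "a")]:
    learner.train_incrementally(x, y)
print(learner.predict((0.12, 0.22)), len(learner.stored))  # -> a 2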



Citations
Journal Article

Learning in Environments with Unknown Dynamics: Towards more Robust Concept Learners

TL;DR: Proposes an incremental decision tree that is updated with incoming examples and outperforms the evaluated methods at handling concept drift in problems where concept change occurs at different speeds, noise may be present, and examples may arrive from different areas of the problem domain.

Quantifying the Impact of Learning Algorithm Parameter Tuning (short version)

TL;DR: In this paper, the impact of learning algorithm optimization by means of parameter tuning is studied, where two quality attributes, sensitivity and classification performance, are investigated, and two metr...
Proceedings ArticleDOI

Memory-Based Learning: Using Similarity for Smoothing

TL;DR: It is argued that feature weighting methods in the Memory-Based paradigm can offer the advantage of automatically specifying a suitable domain-specific hierarchy between most specific and most general conditioning information without the need for a large number of parameters.
Journal ArticleDOI

Accurate prediction of hot spot residues through physicochemical characteristics of amino acid sequences.

TL;DR: This article investigates the problem of identifying hot spots using only physicochemical characteristics extracted from amino acid sequences; the proposed method significantly outperforms other machine learning algorithms and state-of-the-art hot spot predictors.
Journal ArticleDOI

Image-Based Delineation and Classification of Built Heritage Masonry

TL;DR: The groundwork carried out to make this tool possible is presented: the automatic, image-based delineation of stone masonry as the input to a classifier for a geometrically characterized feature of a built heritage object.
References
Journal ArticleDOI

Induction of Decision Trees

J. R. Quinlan
25 Mar 1986
TL;DR: This paper describes an approach to synthesizing decision trees that has been used in a variety of systems, presents one such system, ID3, in detail, and discusses a reported shortcoming of the basic algorithm.
Book

Classification and regression trees

Leo Breiman
TL;DR: The methodology used to construct tree-structured rules is the focus of this monograph, which covers the use of trees as a data analysis method and, in a more mathematical framework, proves some of their fundamental properties.
Journal ArticleDOI

Nearest neighbor pattern classification

TL;DR: The nearest neighbor decision rule assigns to an unclassified sample point the classification of the nearest of a set of previously classified points; in this sense, it may be said that half the classification information in an infinite sample set is contained in the nearest neighbor.
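As a concrete illustration of the rule summarized above, one-nearest-neighbor classification can be written in a few lines of Python; the Euclidean distance (math.dist, Python 3.8+) and the point representation are assumptions of this sketch, not part of the cited paper.

import math

def nearest_neighbor_rule(labeled_points, query):
    # Assign to the unclassified query point the label of the nearest
    # previously classified point.
    point, label = min(labeled_points, key=lambda pl: math.dist(pl[0], query))
    return label

# The query inherits the label of its closest classified neighbor.
print(nearest_neighbor_rule([((0, 0), "red"), ((1, 1), "blue")], (0.2, 0.1)))  # -> red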