Open Access Proceedings Article

Two case studies in cost-sensitive concept acquisition

TL;DR
Two effective and familiar learning methods, ID3 and IBL, are extended to address the problem of learning from examples when feature measurement costs are significant; the extended methods deal effectively with varying cost distributions and with irrelevant features.
Abstract
This paper explores the problem of learning from examples when feature measurement costs are significant. It then extends two effective and familiar learning methods, ID3 and IBL, to address this problem. The extensions, CS-ID3 and CS-IBL, are described in detail and are tested in a natural robot domain and a synthetic domain. Empirical studies support the hypothesis that the extended methods are indeed sensitive to feature costs: they deal effectively with varying cost distributions and with irrelevant features.
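To make the cost-sensitivity concrete, below is a minimal sketch of cost-sensitive attribute selection of the kind CS-ID3 performs: rank features by an information-gain-per-cost style criterion so that cheap, informative features are tested first. The gain²/cost heuristic, the feature names, and the measurement costs here are illustrative assumptions, not necessarily the exact criterion used in the paper.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, labels, feature):
    """Reduction in entropy obtained by splitting on `feature`."""
    base = entropy(labels)
    remainder = 0.0
    for value in set(ex[feature] for ex in examples):
        subset = [lab for ex, lab in zip(examples, labels) if ex[feature] == value]
        remainder += len(subset) / len(labels) * entropy(subset)
    return base - remainder

def cost_sensitive_score(examples, labels, feature, costs):
    """Gain**2 / cost heuristic: prefer informative features that are cheap to measure.
    (A heuristic commonly associated with CS-ID3; the paper's exact criterion may differ.)"""
    gain = information_gain(examples, labels, feature)
    return gain ** 2 / costs[feature]

# Illustrative data; the feature measurement costs are hypothetical.
examples = [{"sonar": "near", "camera": "red"}, {"sonar": "far", "camera": "red"},
            {"sonar": "near", "camera": "blue"}, {"sonar": "far", "camera": "blue"}]
labels = ["approach", "ignore", "approach", "ignore"]
costs = {"sonar": 1.0, "camera": 10.0}

best = max(costs, key=lambda f: cost_sensitive_score(examples, labels, f, costs))
print(best)  # "sonar": informative and far cheaper to measure than "camera"
```

Dividing by cost penalizes expensive measurements, while squaring the gain keeps highly informative features competitive even when they are costly.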


Citations
Journal Article

Very Simple Classification Rules Perform Well on Most Commonly Used Datasets

TL;DR: On most of the datasets studied, the best of the very simple rules that classify examples on the basis of a single attribute is as accurate as the rules induced by the majority of machine learning systems.
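For intuition, a minimal sketch of such a single-attribute ("1R"-style) classifier is shown below: for each attribute, build a rule that predicts the per-value majority class, then keep the attribute whose rule makes the fewest training errors. The toy data and attribute names are illustrative.

```python
from collections import Counter, defaultdict

def one_rule(examples, labels):
    """Pick the single attribute whose per-value majority-class rule has the lowest training error."""
    best_attr, best_rule, best_errors = None, None, float("inf")
    for attr in examples[0]:
        # Majority class for each observed value of this attribute.
        by_value = defaultdict(Counter)
        for ex, lab in zip(examples, labels):
            by_value[ex[attr]][lab] += 1
        rule = {v: counts.most_common(1)[0][0] for v, counts in by_value.items()}
        errors = sum(rule[ex[attr]] != lab for ex, lab in zip(examples, labels))
        if errors < best_errors:
            best_attr, best_rule, best_errors = attr, rule, errors
    return best_attr, best_rule

# Illustrative toy data.
examples = [{"outlook": "sunny", "windy": "yes"}, {"outlook": "rain", "windy": "no"},
            {"outlook": "sunny", "windy": "no"}, {"outlook": "rain", "windy": "yes"}]
labels = ["play", "stay", "play", "stay"]
print(one_rule(examples, labels))  # ('outlook', {'sunny': 'play', 'rain': 'stay'})
```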
Proceedings Article

Cost-based modeling for fraud and intrusion detection: results from the JAM project

TL;DR: There is clear evidence that state-of-the-art commercial fraud detection systems can be substantially improved in stopping losses due to fraud by combining multiple models of fraudulent transactions shared among banks.
Book Chapter

Prototype and feature selection by sampling and random mutation hill climbing algorithms

TL;DR: On four datasets, it is shown that only three or four prototypes suffice to give predictive accuracy equal or superior to that of a basic nearest neighbor algorithm whose run-time storage costs are approximately 10 to 200 times greater.
Journal Article

Decision Tree Induction Based on Efficient Tree Restructuring

TL;DR: Two approaches to decision tree induction are described: incremental tree induction (ITI) and non-incremental tree induction using a measure of tree quality instead of test quality (DMTI). Both offer new computational and classifier characteristics that lend themselves to particular applications.
Journal Article

Selective Sampling for Nearest Neighbor Classifiers

TL;DR: A look-ahead algorithm for selective sampling of examples for nearest neighbor classifiers is proposed; the algorithm seeks the example with the highest utility, taking its effect on the resulting classifier into account.
References
Journal Article

Induction of Decision Trees

J. R. Quinlan
25 Mar 1986
TL;DR: This paper summarizes an approach to synthesizing decision trees that has been used in a variety of systems, describes one such system, ID3, in detail, and discusses a reported shortcoming of the basic algorithm.
Journal Article

Incremental Induction of Decision Trees

TL;DR: An incremental algorithm, named ID5R, is presented for inducing decision trees equivalent to those formed by Quinlan's nonincremental ID3 algorithm given the same training instances.
Proceedings Article

Noise-tolerant instance-based learning algorithms

TL;DR: This paper describes a simple extension of instance-based learning algorithms for detecting and removing noisy instances from concept descriptions; the extension degrades more slowly in the presence of noise, improves classification accuracy, and further reduces storage requirements in several artificial and real-world applications.
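As a rough illustration of the idea, not the cited paper's specific algorithm, the sketch below applies a generic edited-nearest-neighbor style filter: a training instance is kept only if its label agrees with the majority of its k nearest neighbors. The distance function, the value of k, and the data are assumptions.

```python
from collections import Counter

def edited_nearest_neighbors(points, labels, k=3):
    """Keep only instances whose label agrees with the majority of their k nearest neighbors.
    A generic noise filter in the spirit of noise-tolerant instance-based learning;
    not the specific algorithm from the cited paper."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    kept = []
    for i, (p, lab) in enumerate(zip(points, labels)):
        neighbors = sorted((j for j in range(len(points)) if j != i),
                           key=lambda j: dist(p, points[j]))[:k]
        majority = Counter(labels[j] for j in neighbors).most_common(1)[0][0]
        if majority == lab:
            kept.append(i)
    return kept

# Illustrative data: the last point is mislabeled and sits inside the "a" cluster.
points = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0),
          (5.1, 4.9), (4.9, 5.1), (5.2, 5.0), (0.15, 0.1)]
labels = ["a", "a", "a", "b", "b", "b", "b", "b"]
print(edited_nearest_neighbors(points, labels))  # [0, 1, 2, 3, 4, 5, 6]: the noisy point is dropped
```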
Proceedings Article

Defining operationality for explanation-based learning

TL;DR: This work surveys how operationality is defined and assessed in several explanation-based systems, presents a more comprehensive definition of operationality, and describes an implemented system that incorporates the new definition and overcomes some of the limitations of current operationality assessment schemes.
Book Chapter

Cost-sensitive concept learning of sensor use in approach and recognition

TL;DR: This chapter explores a prototype learning method that complements recent work in incremental learning by considering the role of external costs arising from realistic environmental assumptions.