Open Access Journal Article (DOI)

Wrappers for feature subset selection

Ron Kohavi, +1 more
01 Dec 1997 · Vol. 97, Iss. 1, pp. 273-324
TLDR
The wrapper method searches for an optimal feature subset tailored to a particular learning algorithm and domain, and the article compares the wrapper approach to induction without feature subset selection and to Relief, a filter approach to feature subset selection.
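The wrapper idea summarized above can be sketched in a few lines: candidate feature subsets are scored by cross-validating the target learner itself, here with greedy forward selection as the search strategy. This is a minimal illustration, not the authors' code; scikit-learn, the decision-tree learner, and the Iris dataset are stand-ins chosen for brevity.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import load_iris

def wrapper_forward_selection(X, y, learner, cv=5):
    """Greedy forward search over feature subsets, scored by CV accuracy
    of the very learner that will ultimately use the features."""
    n_features = X.shape[1]
    selected, best_score = [], -np.inf
    improved = True
    while improved:
        improved = False
        # Try adding each unused feature; keep the best improving addition.
        for f in set(range(n_features)) - set(selected):
            candidate = selected + [f]
            score = cross_val_score(learner, X[:, candidate], y, cv=cv).mean()
            if score > best_score:
                best_score, best_candidate = score, candidate
                improved = True
        if improved:
            selected = best_candidate
    return selected, best_score

X, y = load_iris(return_X_y=True)
subset, acc = wrapper_forward_selection(X, y, DecisionTreeClassifier(random_state=0))
```

Because the learner itself is the evaluation function, the selected subset is tailored to that algorithm, which is exactly what distinguishes wrappers from filters such as Relief.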
About
This article is published in Artificial Intelligence. The article was published on 1997-12-01 and is currently open access. It has received 8610 citations to date. The article focuses on the topics: Feature selection & Minimum redundancy feature selection.


Citations
Book

Data Mining: Concepts and Techniques

TL;DR: This book presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects, and provides a comprehensive, practical look at the concepts and techniques you need to get the most out of real business data.
Book

Data Mining: Practical Machine Learning Tools and Techniques

TL;DR: This highly anticipated third edition of the most acclaimed work on data mining and machine learning will teach you everything you need to know about preparing inputs, interpreting outputs, evaluating results, and the algorithmic methods at the heart of successful data mining.
Journal Article (DOI)

An introduction to variable and feature selection

TL;DR: The contributions of this special issue cover a wide range of aspects of variable selection: providing a better definition of the objective function, feature construction, feature ranking, multivariate feature selection, efficient search methods, and feature validity assessment methods.
Journal Article (DOI)

Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy

TL;DR: In this article, feature selection based on the maximal statistical dependency criterion is studied; because maximal dependency is hard to implement directly, the minimal-redundancy-maximal-relevance (mRMR) criterion based on mutual information is used to select good features as a practical approximation of the maximal dependency condition.
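The mRMR rule described above can be illustrated with a small sketch (not the authors' code): at each step, pick the feature with the highest relevance I(f; y) minus its mean redundancy with already-selected features. Mutual information here is estimated with scikit-learn's `mutual_info_score` on discrete-valued columns, an assumption made for simplicity.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def mrmr_select(X, y, k):
    """Greedy mRMR over discrete-valued feature columns of X."""
    n = X.shape[1]
    # Relevance: mutual information between each feature and the label.
    relevance = np.array([mutual_info_score(X[:, j], y) for j in range(n)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in set(range(n)) - set(selected):
            # Redundancy: mean MI with the features already chosen.
            redundancy = np.mean([mutual_info_score(X[:, j], X[:, s])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

# Toy data: feature 0 equals the label, feature 1 duplicates it, feature 2 is noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = np.column_stack([y, y, rng.integers(0, 2, 200)])
order = mrmr_select(X, y, k=2)
```

On this toy data the duplicate feature 1 is heavily penalized for redundancy despite its high relevance, which is the behavior that distinguishes mRMR from pure relevance ranking.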
Journal Article (DOI)

Gene Selection for Cancer Classification using Support Vector Machines

TL;DR: In this article, a Support Vector Machine (SVM) method based on recursive feature elimination (RFE) was proposed to select a small subset of genes from broad patterns of gene expression data recorded on DNA microarrays.
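A hedged sketch of recursive feature elimination with a linear SVM, in the spirit of the cited SVM-RFE method: repeatedly train, rank features by their squared weights, and drop the lowest-ranked one. The scikit-learn classifier and synthetic dataset are stand-ins; the cited paper predates both.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.datasets import make_classification

def svm_rfe(X, y, n_keep):
    """Return indices of the n_keep features surviving recursive elimination."""
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        clf = LinearSVC(dual=False).fit(X[:, active], y)
        weights = clf.coef_.ravel() ** 2     # ranking criterion: w_i^2
        active.pop(int(np.argmin(weights)))  # eliminate the weakest feature
    return active

X, y = make_classification(n_samples=200, n_features=10, n_informative=3,
                           n_redundant=0, shuffle=False, random_state=0)
kept = svm_rfe(X, y, n_keep=3)
```

Retraining after each elimination, rather than ranking once, is what makes the procedure recursive: a feature's weight can change once correlated features have been removed.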
References
Journal Article (DOI)

The B* tree search algorithm: a best-first proof procedure

TL;DR: The algorithm, which is named B*, finds a proof that an arc at the root of a search tree is better than any other by attempting to find both the best arc at the root and the simplest proof, in best-first fashion.
Book Chapter (DOI)

Searching for Dependencies in Bayesian Classifiers

TL;DR: It is shown that the backward sequential elimination and joining algorithm provides the most improvement over the naive Bayesian classifier and that the violations of the independence assumption that affect the accuracy of the classifier can be detected from training data.
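The backward-elimination half of the cited procedure can be sketched as follows: start from all features and greedily drop any feature whose removal improves the cross-validated accuracy of the naive Bayesian classifier. This is an illustrative sketch only; scikit-learn's `GaussianNB` and the Wine dataset stand in for the original setup, and the feature-joining step of the full algorithm is omitted.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.datasets import load_wine

def backward_eliminate(X, y, cv=5):
    """Greedy backward elimination scored by CV accuracy of naive Bayes."""
    active = list(range(X.shape[1]))
    best = cross_val_score(GaussianNB(), X[:, active], y, cv=cv).mean()
    improved = True
    while improved and len(active) > 1:
        improved = False
        # Try deleting each remaining feature; keep any improving deletion.
        for f in list(active):
            trial = [a for a in active if a != f]
            score = cross_val_score(GaussianNB(), X[:, trial], y, cv=cv).mean()
            if score > best:
                best, active, improved = score, trial, True
    return active, best

X, y = load_wine(return_X_y=True)
features, acc = backward_eliminate(X, y)
```

When a deletion helps, it is often because the removed feature violated the independence assumption by duplicating information already carried by another feature, which is the diagnostic observation the chapter builds on.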
Book Chapter (DOI)

Feature Selection Using Rough Sets Theory

TL;DR: The paper is related to one aspect of learning from examples, namely learning to identify the class a given object instance belongs to, and a method of generating a sequence of features allowing such identification is presented.
Journal Article (DOI)

Stochastic discrete optimization

TL;DR: In this paper, a stochastic search method is proposed for finding a global solution to the discrete optimization problem in which the objective function must be estimated by Monte Carlo simulation, and it is shown under mild conditions that the Markov chain is strongly ergodic.
Book

Essentials of Artificial Intelligence

TL;DR: Based on the author's course at Stanford University, Essentials of Artificial Intelligence is an integrated, cohesive introduction to the field that combines clear presentations with humor and AI anecdotes.