Open Access Journal ArticleDOI

Wrappers for feature subset selection

Ron Kohavi, +1 more
- 01 Dec 1997 - 
- Vol. 97, Iss: 1, pp 273-324
TLDR
The wrapper method searches for an optimal feature subset tailored to a particular induction algorithm and domain; the paper compares the wrapper approach to induction without feature subset selection and to Relief, a filter approach to feature subset selection.
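The wrapper idea lends itself to a short sketch: search over feature subsets and score each candidate by cross-validating the induction algorithm itself. The snippet below is a minimal greedy forward-selection sketch assuming scikit-learn, a decision-tree inducer, and an example dataset; these choices are illustrative and do not reproduce the paper's experimental setup or its other search strategies.

```python
# Minimal wrapper-style forward selection (illustrative sketch, not the
# paper's exact experimental setup): every candidate feature subset is
# scored by cross-validating the induction algorithm itself.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)        # assumed example dataset
inducer = DecisionTreeClassifier(random_state=0)  # assumed induction algorithm

def wrapper_score(features):
    """Evaluate a candidate subset by the inducer's cross-validated accuracy."""
    return cross_val_score(inducer, X[:, features], y, cv=5).mean()

selected, remaining, best = [], list(range(X.shape[1])), 0.0
while remaining:
    # Greedy forward step: add the single feature that helps most.
    score, feature = max((wrapper_score(selected + [f]), f) for f in remaining)
    if score <= best:        # stop when no remaining feature improves accuracy
        break
    best = score
    selected.append(feature)
    remaining.remove(feature)

print("selected:", selected, "cv accuracy: %.3f" % best)
```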
About
This article was published in Artificial Intelligence on 1997-12-01 and is currently open access. It has received 8610 citations to date. The article focuses on the topics: Feature selection & Minimum redundancy feature selection.


Citations
Proceedings ArticleDOI

Filter Bank Common Spatial Pattern (FBCSP) in Brain-Computer Interface

TL;DR: A novel filter bank common spatial pattern (FBCSP) algorithm is proposed to perform autonomous selection of key temporal-spatial discriminative EEG characteristics; results show that FBCSP, using a particular combination of feature selection and classification algorithms, yields relatively higher cross-validation accuracies compared to prevailing approaches.
Proceedings ArticleDOI

Unsupervised feature selection for multi-cluster data

TL;DR: Inspired by recent developments in manifold learning and L1-regularized models for subset selection, a new approach called Multi-Cluster Feature Selection (MCFS) is proposed for unsupervised feature selection; it selects features such that the multi-cluster structure of the data is best preserved.
Proceedings ArticleDOI

Adversarial classification

TL;DR: This paper views classification as a game between the classifier and the adversary and produces a classifier that is optimal given the adversary's optimal strategy; experiments show that this approach can greatly outperform a classifier learned in the standard way.
Journal ArticleDOI

Feature Selection for Unsupervised Learning

TL;DR: This paper explores the feature selection problem and its issues through FSSEM (Feature Subset Selection using Expectation-Maximization (EM) clustering) and through two different performance criteria for evaluating candidate feature subsets: scatter separability and maximum likelihood.
Journal ArticleDOI

Data preprocessing techniques for classification without discrimination

TL;DR: This paper surveys and extends existing data preprocessing techniques, namely suppression of the sensitive attribute, massaging the dataset by changing class labels, and reweighing or resampling the data to remove discrimination without relabeling instances, and presents the results of experiments on real-life data.
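The reweighing idea mentioned in this summary can be illustrated compactly: each instance is weighted by the ratio of the probability expected if the sensitive attribute and the class label were independent to the probability actually observed. The sketch below assumes a pandas DataFrame with hypothetical columns named "sensitive" and "label" and toy data, not the real-life datasets used in the experiments.

```python
# Reweighing sketch (illustrative): weight each instance by
# P(s) * P(y) / P(s, y) so that, under the weighted distribution, the
# sensitive attribute and the class label are statistically independent.
import pandas as pd

# Hypothetical toy data; the column names "sensitive" and "label" are assumptions.
df = pd.DataFrame({
    "sensitive": ["a", "a", "a", "a", "b", "b", "b", "b"],
    "label":     [1,   1,   1,   0,   1,   0,   0,   0],
})

p_s  = df["sensitive"].value_counts(normalize=True)         # P(s)
p_y  = df["label"].value_counts(normalize=True)             # P(y)
p_sy = df.groupby(["sensitive", "label"]).size() / len(df)  # P(s, y)

df["weight"] = [p_s[s] * p_y[y] / p_sy[(s, y)]
                for s, y in zip(df["sensitive"], df["label"])]
print(df)
```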
References
Book

Genetic algorithms in search, optimization, and machine learning

TL;DR: This book presents the computer techniques, mathematical tools, and research results that enable both students and practitioners to apply genetic algorithms to problems in many fields, including computer programming and mathematics.
Book

C4.5: Programs for Machine Learning

TL;DR: A complete guide to the C4.5 system as implemented in C for the UNIX environment; it starts from simple core learning methods and shows how they can be elaborated and extended to deal with typical problems such as missing data and overfitting.
Book

Applied Regression Analysis

TL;DR: In this book, the straight-line case is used to introduce fitting a straight line by least squares, and the Durbin-Watson test is used for checking the straight-line fit.
Journal ArticleDOI

Induction of Decision Trees

J. R. Quinlan
- 25 Mar 1986 - 
TL;DR: This paper describes an approach to synthesizing decision trees that has been used in a variety of systems, describes one such system, ID3, in detail, and discusses a reported shortcoming of the basic algorithm.
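The core of ID3-style induction is choosing, at each node, the attribute whose split yields the largest information gain (reduction in class entropy). The short sketch below computes that quantity for a single attribute; the attribute name and toy data are illustrative assumptions, not Quinlan's original examples.

```python
# Information-gain sketch in the spirit of ID3: class entropy minus the
# weighted entropy of the class after splitting on one attribute.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels):
    n = len(labels)
    remainder = 0.0
    for v in set(values):
        # Class labels of the instances that take value v on this attribute.
        part = [y for x, y in zip(values, labels) if x == v]
        remainder += len(part) / n * entropy(part)
    return entropy(labels) - remainder

# Hypothetical toy split: a weather-style attribute against a yes/no label.
outlook = ["sunny", "sunny", "overcast", "rain", "rain", "overcast"]
play    = ["no",    "no",    "yes",      "yes",  "no",   "yes"]
print("gain(outlook) = %.3f" % information_gain(outlook, play))
```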