Open Access · Journal Article (DOI)

Wrappers for feature subset selection

Ron Kohavi, George H. John
- 01 Dec 1997 - 
- Vol. 97, Iss: 1, pp 273-324
TLDR
The wrapper method searches for an optimal feature subset tailored to a particular induction algorithm and domain; the paper compares the wrapper approach to induction without feature subset selection and to Relief, a filter approach to feature subset selection.
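To make the contrast with filter methods concrete, here is a minimal sketch of wrapper-style feature subset selection: a greedy forward search in which every candidate subset is scored by the cross-validated accuracy of the target learner itself. The dataset, classifier, and stopping rule are illustrative choices, not the best-first search engine used in the paper.

```python
# Minimal wrapper-style feature subset selection (greedy forward search).
# The subset search is "wrapped around" the target learner: each candidate
# subset is scored by the learner's own cross-validated accuracy.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def wrapper_forward_selection(X, y, learner, cv=5):
    remaining = list(range(X.shape[1]))
    selected, best_score = [], -np.inf
    while remaining:
        # Score every one-feature extension of the current subset.
        scores = {
            f: cross_val_score(learner, X[:, selected + [f]], y, cv=cv).mean()
            for f in remaining
        }
        f_best, s_best = max(scores.items(), key=lambda kv: kv[1])
        if s_best <= best_score:          # stop when no extension helps
            break
        selected.append(f_best)
        remaining.remove(f_best)
        best_score = s_best
    return selected, best_score

X, y = load_breast_cancer(return_X_y=True)
subset, score = wrapper_forward_selection(X, y, DecisionTreeClassifier(random_state=0))
print("selected features:", subset, "cv accuracy: %.3f" % score)
```

Because every candidate subset requires a full cross-validation run of the learner, the wrapper is far more expensive than a filter such as Relief, which scores features once, independently of the induction algorithm.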
About
This article was published in Artificial Intelligence on 1997-12-01 and is currently open access. It has received 8,610 citations to date. The article focuses on the topics: Feature selection & Minimum redundancy feature selection.


Citations
Journal Article (DOI)

Discrete Bayesian Network Classifiers: A Survey

TL;DR: This article surveys the whole set of discrete Bayesian network classifiers devised to date, organized in increasing order of structure complexity: naive Bayes, selective naive Bayes, semi-naive Bayes, one-dependence Bayesian classifiers, k-dependence Bayesian classifiers, Bayesian network-augmented naive Bayes, Markov blanket-based Bayesian classifiers, unrestricted Bayesian classifiers, and Bayesian multinets.
Journal Article (DOI)

Orthogonal forward selection and backward elimination algorithms for feature subset selection

TL;DR: This study derives an orthogonal forward selection (OFS) algorithm and an orthogonal backward elimination (OBE) algorithm for feature subset selection by incorporating Gram-Schmidt and Givens orthogonal transforms into forward selection and backward elimination procedures, respectively.
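A hedged sketch of the forward-selection half of this idea: each remaining feature is orthogonalised against the columns already selected (a Gram-Schmidt step), and the feature whose orthogonal component explains the largest share of the target is added next. The explained-variance criterion, synthetic data, and function name are assumptions made for illustration; the cited paper develops its criterion for classification and also gives a Givens-rotation-based backward elimination.

```python
# Sketch of orthogonal forward selection in the Gram-Schmidt style: each
# candidate feature is orthogonalised against the already-chosen columns,
# and the one whose orthogonal component explains the most target variance
# is selected next.  Illustrative regression-flavoured criterion only.
import numpy as np

def orthogonal_forward_selection(X, y, k):
    n, d = X.shape
    selected, basis = [], []               # basis holds orthogonalised columns
    for _ in range(k):
        best_f, best_gain, best_w = None, -np.inf, None
        for f in range(d):
            if f in selected:
                continue
            w = X[:, f].astype(float)
            for q in basis:                 # Gram-Schmidt step
                w = w - (q @ w) / (q @ q) * q
            if w @ w < 1e-12:               # numerically dependent column
                continue
            gain = (w @ y) ** 2 / ((w @ w) * (y @ y))  # explained-variance ratio
            if gain > best_gain:
                best_f, best_gain, best_w = f, gain, w
        if best_f is None:
            break
        selected.append(best_f)
        basis.append(best_w)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 2 * X[:, 3] - X[:, 7] + 0.1 * rng.normal(size=200)   # only features 3 and 7 matter
print(orthogonal_forward_selection(X, y, k=2))            # expected: [3, 7] in some order
```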
Book

Modeling the Internet and the Web: Probabilistic Methods and Algorithms

TL;DR: This book discusses commerce on the Web (models and applications) from a Bayesian perspective, and aims to explain the development of models and applications for knowledge representation in a rapidly changing environment.
Journal Article (DOI)

Nearest neighbor classification from multiple feature subsets

TL;DR: MFS, a combining algorithm designed to improve the accuracy of the nearest neighbor (NN) classifier, is presented; it significantly outperformed several standard NN variants and was competitive with boosted decision trees.
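The MFS idea can be illustrated with a short sketch: many 1-NN classifiers, each trained on a random feature subset, vote on the test points. The ensemble size, subset size, and demo dataset below are arbitrary choices for the example, not the settings studied in the paper.

```python
# Sketch of the multiple-feature-subsets (MFS) idea: train several
# nearest-neighbour classifiers, each on a random subset of the features,
# and combine their predictions by simple majority voting.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def mfs_predict(X_train, y_train, X_test, n_members=25, subset_size=5, seed=0):
    rng = np.random.default_rng(seed)
    d = X_train.shape[1]
    votes = []
    for _ in range(n_members):
        feats = rng.choice(d, size=min(subset_size, d), replace=False)
        clf = KNeighborsClassifier(n_neighbors=1).fit(X_train[:, feats], y_train)
        votes.append(clf.predict(X_test[:, feats]))
    votes = np.stack(votes)                       # shape: (n_members, n_test)
    # majority vote across ensemble members
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pred = mfs_predict(X_tr, y_tr, X_te)
print("MFS accuracy: %.3f" % (pred == y_te).mean())
```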
Journal Article (DOI)

Sparse linear discriminant analysis by thresholding for high dimensional data

TL;DR: In this paper, a sparse linear discriminant analysis (LDA) method is proposed and applied to classify human cancer into two classes of leukemia based on a set of 7,129 genes and a training sample of size 72.
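As a rough illustration of sparsity by thresholding in a two-class, high-dimensional setting, the sketch below keeps only features whose standardised mean difference exceeds a threshold and classifies with a diagonal linear discriminant on the survivors. The threshold value, the diagonal-covariance simplification, and the synthetic data are assumptions for this example, not the estimator or tuning procedure of the cited paper.

```python
# Sketch of thresholded, sparse two-class discrimination: features whose
# standardised mean difference falls below a threshold are dropped, and
# the remaining ones drive a simple (diagonal) linear discriminant rule.
import numpy as np

def sparse_lda_fit(X, y, threshold):
    mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
    s = X.std(0) + 1e-8                                  # per-feature scale
    keep = np.abs(mu1 - mu0) / s > threshold             # thresholding step
    w = np.zeros(X.shape[1])
    w[keep] = (mu1 - mu0)[keep] / (s[keep] ** 2)         # diagonal LDA direction
    b = -w @ (mu0 + mu1) / 2
    return w, b

def sparse_lda_predict(X, w, b):
    return (X @ w + b > 0).astype(int)

rng = np.random.default_rng(1)
n, p = 72, 7129                                          # sample size / gene count as in the TLDR
y = np.arange(n) % 2
X = rng.normal(size=(n, p))
X[y == 1, :20] += 1.5                                    # only the first 20 "genes" are informative
w, b = sparse_lda_fit(X, y, threshold=1.0)
print("nonzero coefficients:", int((w != 0).sum()))
print("training accuracy: %.3f" % (sparse_lda_predict(X, w, b) == y).mean())
```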
References
Book

Genetic algorithms in search, optimization, and machine learning

TL;DR: In this book, the authors present the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields, including computer programming and mathematics.
Book

C4.5: Programs for Machine Learning

TL;DR: A complete guide to the C4.5 system as implemented in C for the UNIX environment, which starts from simple core learning methods and shows how they can be elaborated and extended to deal with typical problems such as missing data and overfitting.
Book

Applied Regression Analysis

TL;DR: In this book, the straight-line case is used to introduce fitting by least squares, and the Durbin-Watson test is used for checking the straight-line fit.
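For reference, fitting a straight line by least squares and computing the Durbin-Watson statistic on its residuals takes only a few lines; the synthetic data below are purely illustrative.

```python
# Least-squares fit of a straight line y = b0 + b1*x, plus the Durbin-Watson
# statistic on the residuals (values near 2 suggest uncorrelated residuals).
import numpy as np

def fit_straight_line(x, y):
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    return b0, b1

def durbin_watson(residuals):
    return np.sum(np.diff(residuals) ** 2) / np.sum(residuals ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 3.0 + 0.7 * x + rng.normal(scale=0.5, size=x.size)
b0, b1 = fit_straight_line(x, y)
resid = y - (b0 + b1 * x)
print("intercept %.2f, slope %.2f, Durbin-Watson %.2f" % (b0, b1, durbin_watson(resid)))
```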
Journal Article (DOI)

Induction of Decision Trees

J. R. Quinlan
- 25 Mar 1986 - 
TL;DR: This paper describes an approach to synthesizing decision trees that has been used in a variety of systems, presents one such system, ID3, in detail, and discusses a reported shortcoming of the basic algorithm.
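A compact sketch of the basic ID3-style induction step described here: at each node, split on the categorical attribute with the highest information gain and recurse. The toy data and the dictionary representation of the tree are illustrative choices; this omits the refinements (noise handling, attribute-value grouping, etc.) discussed in the paper.

```python
# Minimal ID3-style decision tree induction on categorical attributes:
# at each node, split on the attribute with the highest information gain.
import numpy as np
from collections import Counter

def entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def id3(X, y, attrs):
    if len(set(y)) == 1:                    # pure node -> leaf
        return y[0]
    if not attrs:                           # no attributes left -> majority leaf
        return Counter(y).most_common(1)[0][0]
    def gain(a):
        rem = 0.0
        for v in set(row[a] for row in X):
            sub = [i for i, row in enumerate(X) if row[a] == v]
            rem += len(sub) / len(y) * entropy([y[i] for i in sub])
        return entropy(y) - rem
    best = max(attrs, key=gain)
    tree = {}
    for v in set(row[best] for row in X):
        idx = [i for i, row in enumerate(X) if row[best] == v]
        tree[(best, v)] = id3([X[i] for i in idx], [y[i] for i in idx],
                              [a for a in attrs if a != best])
    return tree

# Toy data: attribute 0 = outlook, attribute 1 = windy; label = play or not.
X = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"), ("rain", "yes"), ("overcast", "no")]
y = ["yes", "no", "yes", "no", "yes"]
print(id3(X, y, attrs=[0, 1]))              # splits on "windy" in this toy case
```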