Open Access · Journal Article · DOI

Wrappers for feature subset selection

Ron Kohavi, +1 more
01 Dec 1997
Vol. 97, Iss. 1, pp. 273–324
TLDR
The wrapper method searches for an optimal feature subset tailored to a particular induction algorithm and domain, and the paper compares the wrapper approach to induction without feature subset selection and to Relief, a filter approach to feature subset selection.
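The defining trait of the wrapper approach is that candidate feature subsets are scored by running the target induction algorithm itself, treated as a black box. A minimal sketch of one such search (greedy forward selection) is below; the function and parameter names are mine, and `evaluate` stands in for any subset scorer, e.g. cross-validated accuracy of the learner.

```python
def wrapper_forward_selection(features, evaluate):
    """Greedy forward search in the wrapper style: at each step, add the
    feature whose inclusion most improves the induction algorithm's
    estimated performance, as measured by the black-box `evaluate`."""
    selected = []
    best_score = evaluate(frozenset())
    improved = True
    while improved:
        improved = False
        best_f = None
        for f in features:
            if f in selected:
                continue
            score = evaluate(frozenset(selected + [f]))
            if score > best_score:
                best_score, best_f, improved = score, f, True
        if best_f is not None:
            selected.append(best_f)
    return selected, best_score
```

Because every step re-runs the learner, the wrapper search is more expensive than a filter such as Relief, but its score reflects the actual learner's bias rather than a learner-independent relevance measure.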
About
This article is published in Artificial Intelligence. The article was published on 1997-12-01 and is currently open access. It has received 8,610 citations to date. The article focuses on the topics: Feature selection & Minimum redundancy feature selection.


Citations
Journal ArticleDOI

Cost-sensitive feature selection using random forest: Selecting low-cost subsets of informative features

TL;DR: A random forest-based feature selection algorithm that incorporates feature cost into the base decision-tree construction process to produce low-cost feature subsets; it achieves better performance than other state-of-the-art feature selection methods on real-world problems.
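One simple way to fold feature cost into tree construction is to discount a candidate split's information gain by the feature's acquisition cost. The sketch below uses a gain/cost ratio, which is a common cost-sensitive criterion but not necessarily the exact formula of the cited paper; all names are illustrative.

```python
def cost_penalized_choice(candidates, alpha=1.0):
    """Pick the splitting feature that maximizes information gain
    discounted by acquisition cost.  `candidates` is a list of
    (name, gain, cost) tuples; alpha trades predictive value against
    cost (alpha=0 ignores cost entirely)."""
    return max(candidates, key=lambda c: c[1] / (c[2] ** alpha))[0]
```

With this criterion, an expensive but highly informative feature (say, a lab test) can lose to a cheap, moderately informative one, steering the forest toward low-cost subsets.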
Journal ArticleDOI

Optimal construction of a fast and accurate polarisable water potential based on multipole moments trained by machine learning.

TL;DR: The predictive ability and speed of two additional machine learning methods, radial basis function neural networks (RBFNN) and Kriging, are assessed with respect to previous MLP-based polarisable water models, and combinations are found that are no less accurate, yet are 58% faster for the dimer, and 26% slower for the pentamer.
Book

Data Mining: Opportunities and Challenges

TL;DR: Chapters include: A Survey of Bayesian Data Mining; Control of Inductive Bias in Supervised Learning Using Evolutionary Computation: A Wrapper-Based Approach; Cooperative Learning and Virtual Reality-Based Visualization for Data Mining.
Journal ArticleDOI

Subspace learning for unsupervised feature selection via matrix factorization

TL;DR: A new unsupervised feature selection criterion developed from the viewpoint of subspace learning; selection is treated as a matrix factorization problem, which provides a sound foundation for embedding kernel tricks into feature selection.
Journal ArticleDOI

Evolutionary Feature Selection for Big Data Classification: A MapReduce Approach

TL;DR: A feature selection algorithm based on evolutionary computation that uses the MapReduce paradigm to obtain subsets of features from big datasets, improving both classification accuracy and runtime when dealing with big data problems.
References
Book

Genetic algorithms in search, optimization, and machine learning

TL;DR: The authors present the computer techniques, mathematical tools, and research results that enable both students and practitioners to apply genetic algorithms to problems in many fields, including computer programming and mathematics.
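The core genetic-algorithm loop described in this book (selection, crossover, mutation over bit strings) is compact enough to sketch directly; a bit string is also a natural encoding of a feature subset. All parameter values below are illustrative, and the function names are mine.

```python
import random

def genetic_search(fitness, n_bits, pop_size=30, generations=40,
                   p_mut=0.02, seed=0):
    """Bare-bones generational GA: tournament selection, one-point
    crossover, bit-flip mutation, and elitism (so the best fitness
    never decreases across generations)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = [best[:]]                        # elitism: carry the champion over
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)     # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ 1 if rng.random() < p_mut else b
                     for b in child]           # bit-flip mutation
            nxt.append(child)
        pop = nxt
        best = max(pop, key=fitness)
    return best
```

For example, passing `sum` as the fitness function makes this a one-max search; for feature selection, the fitness would instead be a wrapper-style evaluation of the subset encoded by the 1-bits.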
Book

C4.5: Programs for Machine Learning

TL;DR: A complete guide to the C4.5 system as implemented in C for the UNIX environment, which starts from simple core learning methods and shows how they can be elaborated and extended to deal with typical problems such as missing data and overfitting.
Book

Applied Regression Analysis

TL;DR: Covers the straight-line case of fitting a line by least squares, with the Durbin-Watson test used to check the straight-line fit.
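The straight-line case has a closed-form least-squares solution: the slope is the ratio of the cross-deviation sum to the x-deviation sum, and the intercept follows from the means. A minimal sketch (function name mine):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = b0 + b1*x:
    b1 = S_xy / S_xx, b0 = ybar - b1 * xbar."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    s_xy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    s_xx = sum((x - xbar) ** 2 for x in xs)
    b1 = s_xy / s_xx
    return ybar - b1 * xbar, b1   # (intercept, slope)
```

For points lying exactly on y = 2x + 1, this recovers intercept 1 and slope 2; diagnostics such as the Durbin-Watson test then assess whether the residuals are serially correlated.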
Journal ArticleDOI

Induction of Decision Trees

J. R. Quinlan
- 25 Mar 1986 - 
TL;DR: This paper summarizes an approach to synthesizing decision trees that has been used in a variety of systems, describes one such system, ID3, in detail, and discusses a reported shortcoming of the basic algorithm.
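ID3 grows a tree by repeatedly splitting on the attribute with the highest information gain, i.e. the largest reduction in class entropy. A minimal sketch of that criterion (helper names mine):

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels):
    """ID3's splitting criterion: the reduction in entropy obtained by
    partitioning the examples on one attribute's values."""
    n = len(labels)
    parts = {}
    for v, y in zip(values, labels):
        parts.setdefault(v, []).append(y)
    remainder = sum(len(p) / n * entropy(p) for p in parts.values())
    return entropy(labels) - remainder
```

An attribute that separates the classes perfectly has gain equal to the full class entropy, while one whose partitions mirror the overall class mix has gain zero; the reported shortcoming is that plain gain favors attributes with many distinct values, which C4.5 later addresses with gain ratio.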