Journal ArticleDOI
Floating search methods in feature selection
TLDR
Sequential search methods characterized by a dynamically changing number of features included or eliminated at each step, henceforth "floating" methods, are presented and are shown to give very good results and to be computationally more effective than the branch and bound method.
This article was published in Pattern Recognition Letters on 1994-11-01 and has received 3,104 citations to date. The article focuses on the topics: Beam search & Jump search.
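The "floating" idea can be sketched in a few lines. This is a simplified illustration rather than the paper's exact algorithm: `score` stands in for any subset-evaluation criterion (e.g. cross-validated classifier accuracy), and the conditional exclusion step is what makes the search "float":

```python
def sffs(features, score, target_size):
    """Simplified sequential forward floating selection (SFFS) sketch.
    score(subset): hypothetical evaluation function; higher is better."""
    selected = []
    best = {}  # size -> (score, subset) of the best subset found per size
    while len(selected) < target_size:
        # Inclusion: add the most significant remaining feature.
        f = max((x for x in features if x not in selected),
                key=lambda x: score(selected + [x]))
        selected = selected + [f]
        if len(selected) not in best or score(selected) > best[len(selected)][0]:
            best[len(selected)] = (score(selected), selected)
        # Conditional exclusion ("floating"): keep dropping the least
        # significant feature while the reduced subset beats the best
        # subset of that smaller size found so far.
        while len(selected) > 2:
            reduced = max(([x for x in selected if x != f] for f in selected),
                          key=score)
            if score(reduced) > best[len(reduced)][0]:
                selected = reduced
                best[len(selected)] = (score(selected), selected)
            else:
                break
    return best[target_size][1]
```

Unlike plain sequential forward selection, the number of features is not monotonically increasing: a bad early inclusion can be undone later, which is what makes the method competitive with branch and bound at far lower cost.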
Citations
Journal ArticleDOI
Fuzzy Mutual Information Based min-Redundancy and Max-Relevance Heterogeneous Feature Selection
Daren Yu, Shuang An, Qinghua Hu +2 more
TL;DR: This paper introduces fuzzy information entropy and fuzzy mutual information for computing the relevance between numerical or fuzzy features and the decision, and combines them with the "min-Redundancy-Max-Relevance", "Max-Dependency" and "min-Redundancy-Max-Dependency" algorithms.
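The mRMR greedy rule itself is compact. A minimal Python sketch, assuming a generic symmetric `mi(a, b)` mutual-information estimate (the paper substitutes fuzzy mutual information): at each step, pick the feature whose relevance to the decision most exceeds its average redundancy with the features already chosen:

```python
def mrmr(features, decision, mi, k):
    """Greedy min-Redundancy-Max-Relevance selection.
    mi(a, b): any mutual-information estimate between two variables."""
    selected = []
    while len(selected) < k:
        def criterion(f):
            relevance = mi(f, decision)
            redundancy = (sum(mi(f, s) for s in selected) / len(selected)
                          if selected else 0.0)
            return relevance - redundancy
        selected.append(max((f for f in features if f not in selected),
                            key=criterion))
    return selected
```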
A Wrapper-Based Feature Selection for Analysis of Large Data Sets
TL;DR: This paper presents a novel method to investigate the quality and structure of data sets, i.e., whether noisy and irrelevant features are embedded in them, using a wrapper-based feature selection method that combines a genetic algorithm with an external classifier to select the discriminative features.
Journal ArticleDOI
AIEOU: Automata-based improved equilibrium optimizer with U-shaped transfer function for feature selection
TL;DR: An improved version of EO is proposed, which includes learning automata to find proper values of its parameters and Adaptive β-Hill Climbing to find a better equilibrium pool; the results illustrate the supremacy of the proposed method over the other state-of-the-art methods in the literature.
Journal ArticleDOI
W-operator window design by minimization of mean conditional entropy
TL;DR: This paper presents a technique that finds a minimal window W for the estimation of a W-operator from training data, choosing the subset of variables that maximizes the information observed in the training set.
Journal ArticleDOI
An Evolutionary Computation Based Feature Selection Method for Intrusion Detection
TL;DR: The self-adaptive differential evolution (SaDE) algorithm is adopted in this paper to deal with feature selection problems for IDS, and experimental results demonstrate that SaDE is more promising than the algorithms it is compared with.
References
Journal ArticleDOI
A Branch and Bound Algorithm for Feature Subset Selection
TL;DR: In this paper, a branch-and-bound feature subset selection algorithm is proposed to select the best subset of m features from an n-feature set without exhaustive search, which is computationally infeasible.
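The key assumption behind branch and bound in feature selection is monotonicity of the criterion: removing features never increases it, so whole branches of the search tree can be pruned against the current bound. A minimal Python sketch under that assumption, with a hypothetical criterion `J` (not the paper's exact formulation):

```python
def branch_and_bound(features, J, m):
    """Best m-of-n feature subset under a monotone criterion J
    (J(S') <= J(S) whenever S' is a subset of S)."""
    best = {"score": float("-inf"), "subset": None}

    def search(current, start):
        # Prune: by monotonicity, no subset of `current` can beat the bound.
        if J(current) <= best["score"]:
            return
        if len(current) == m:
            best["score"], best["subset"] = J(current), current
            return
        # Branch: remove one more feature; `start` avoids visiting the
        # same m-subset along two different removal orders.
        for i in range(start, len(current)):
            search(current[:i] + current[i + 1:], i)

    search(list(features), 0)
    return best["subset"]
```

Despite the pruning, the result is guaranteed optimal whenever J is truly monotone, which is exactly what makes the floating methods above attractive: they trade that guarantee for much lower cost.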
Journal ArticleDOI
A note on genetic algorithms for large-scale feature selection
W. Siedlecki, Jack Sklansky +1 more
TL;DR: The preliminary results suggest that GA is a powerful means of reducing the time for finding near-optimal subsets of features from large sets.
Journal ArticleDOI
A Direct Method of Nonparametric Measurement Selection
TL;DR: A direct method of measurement selection is proposed to determine the best subset of d measurements out of a set of D total measurements, using a nonparametric estimate of the probability of error given a finite design sample set.
Journal ArticleDOI
On the effectiveness of receptors in recognition systems
T. Marill, D. Green +1 more
TL;DR: Some of the theoretical problems encountered in trying to determine a more formal measure of the effectiveness of a set of tests are discussed; such a measure might serve as a practical substitute for empirical evaluation.
Journal ArticleDOI
On automatic feature selection
W. Siedlecki, Jack Sklansky +1 more
TL;DR: In this paper, a review of feature selection for multidimensional pattern classification is presented, and the potential benefits of Monte Carlo approaches such as simulated annealing and genetic algorithms are compared.