Particle swarm optimisation for feature selection in classification
Citations
Cites background from "Particle swarm optimisation for feature selection in classification"
...But when the best feature subset contains a relatively large number of features, backward selection has a larger chance of obtaining the best solution [41]....
[...]
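The excerpt above contrasts backward selection with other wrapper approaches. As a rough illustration only (not the paper's method), a greedy sequential backward selection loop can be sketched as follows; the `score` callable and the toy objective are hypothetical stand-ins for a classifier-accuracy estimate:

```python
def backward_selection(features, score, min_size=1):
    """Greedy sequential backward selection (illustrative sketch).

    features: list of feature indices; score: callable mapping a
    feature subset (tuple) to a quality estimate (higher is better).
    """
    current = tuple(features)
    best_subset, best_score = current, score(current)
    while len(current) > min_size:
        # Try removing each remaining feature; keep the best reduction.
        candidates = [tuple(f for f in current if f != r) for r in current]
        current = max(candidates, key=score)
        s = score(current)
        if s >= best_score:
            best_subset, best_score = current, s
    return best_subset

# Toy score: a hypothetical stand-in for classification accuracy,
# highest when exactly features {0, 2} are selected.
toy = lambda subset: -len(set(subset) ^ {0, 2})
print(backward_selection([0, 1, 2, 3], toy))  # -> (0, 2)
```

Starting from the full set and pruning one feature at a time is why backward selection tends to reach good large subsets: every candidate it evaluates early on still contains most of the features.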
References
"Particle swarm optimisation for feature selection in classification" refers to methods in this paper
...Since NSM is slow and can be applied only to problems with low dimensionality, this initialisation method may not be appropriate for feature selection problems, where the dimensionality is typically large....
[...]
...Parsopoulos and Vrahatis [33] use the Nonlinear Simplex method (NSM) [34] to generate initial particles in PSO....
[...]
...Experiments show that the NSM-based initialisation can improve the performance of PSO on 14 benchmark functions....
[...]
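The excerpts above concern initialising PSO particles for feature selection. As a rough, self-contained illustration of the underlying technique (a standard binary PSO over 0/1 feature masks in the style of Kennedy and Eberhart, not the paper's algorithm, and using plain random initialisation rather than NSM), where the `fitness` callable and the toy objective are hypothetical stand-ins for classifier accuracy:

```python
import math
import random

def binary_pso_feature_selection(n_features, fitness, n_particles=10,
                                 iterations=30, w=0.7, c1=1.5, c2=1.5,
                                 seed=0):
    """Minimal binary PSO for feature selection (illustrative sketch).

    Each particle is a 0/1 mask over features; a sigmoid of the velocity
    gives the per-bit probability of setting that bit to 1.
    """
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(n_particles)]
    vel = [[0.0] * n_features for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]

    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(n_features):
                r1, r2 = rng.random(), rng.random()
                # Velocity update pulled toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                prob = 1.0 / (1.0 + math.exp(-vel[i][d]))
                pos[i][d] = 1 if rng.random() < prob else 0
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest

# Toy fitness: a hypothetical stand-in for classifier accuracy that
# rewards selecting features 0 and 2 and penalises extra features.
toy = lambda mask: 2 * (mask[0] + mask[2]) - sum(mask)
best = binary_pso_feature_selection(6, toy)
```

The initialisation step is the loop that fills `pos` with random bits; replacing it with a smarter scheme (as in the NSM-based approach discussed above) is what the cited work proposes, and the excerpt notes why that particular scheme scales poorly with dimensionality.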
"Particle swarm optimisation for feature selection in classification" refers to background in this paper
...However, irrelevant and redundant features are not useful for classification and they may even reduce the classification performance due to the large search space, which is termed “the curse of dimensionality” [1, 2]....
[...]
...The size of the search space increases exponentially with respect to the number of available features in the dataset [1]....
[...]
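The exponential growth mentioned in the excerpt is easy to make concrete: a dataset with n available features has 2**n - 1 non-empty candidate subsets, so exhaustive search becomes infeasible quickly. A quick illustration:

```python
# Number of non-empty candidate feature subsets for n features: 2**n - 1.
for n in (10, 20, 50):
    print(n, 2**n - 1)
# Already at n = 20 there are over a million subsets, and at n = 50
# more than 10**15, which rules out exhaustive evaluation.
```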
...There could be two-way or multi-way interactions among features [1, 5]....
[...]
...Although many different search techniques have been applied to feature selection, most of these algorithms still suffer from the problems of stagnation in local optima or being computationally expensive [2, 1, 4]....
[...]