Journal Article

Multi-Objective Particle Swarm Optimization Approach for Cost-Based Feature Selection in Classification

TL;DR: Experimental results show that the proposed PSO-based multi-objective feature selection algorithm can automatically evolve a set of nondominated solutions, and it is a highly competitive feature selection method for solving cost-based feature selection problems.
Abstract
Feature selection is an important data-preprocessing technique in classification problems such as bioinformatics and signal processing. Generally, there are some situations where a user is interested in not only maximizing the classification performance but also minimizing the cost that may be associated with features. This kind of problem is called cost-based feature selection. However, most existing feature selection approaches treat this task as a single-objective optimization problem. This paper presents the first study of multi-objective particle swarm optimization (PSO) for cost-based feature selection problems. The task of this paper is to generate a Pareto front of nondominated solutions, that is, feature subsets, to meet different requirements of decision-makers in real-world applications. In order to enhance the search capability of the proposed algorithm, a probability-based encoding technique and an effective hybrid operator, together with the ideas of the crowding distance, the external archive, and the Pareto domination relationship, are applied to PSO. The proposed PSO-based multi-objective feature selection algorithm is compared with several multi-objective feature selection algorithms on five benchmark datasets. Experimental results show that the proposed algorithm can automatically evolve a set of nondominated solutions, and it is a highly competitive feature selection method for solving cost-based feature selection problems.
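As a rough illustration of the ideas named in the abstract, the sketch below shows how a probability-encoded particle could be decoded into a feature subset, scored on the two objectives (classification error and total feature cost, both minimized), and filtered into an external archive by Pareto dominance. It is not the paper's implementation: the 0.5 decoding threshold, the per-feature costs, and the toy error function are assumptions for illustration only, and the PSO velocity/position update itself is omitted.

```python
import random

def decode(probabilities, threshold=0.5):
    """Probability-based encoding: each particle dimension holds a selection
    probability; features whose probability exceeds the threshold are kept."""
    return [1 if p > threshold else 0 for p in probabilities]

def evaluate(selection, feature_costs, error_fn):
    """Score a binary feature mask on the two objectives to be minimized:
    classification error (supplied by any classifier) and total feature cost."""
    return error_fn(selection), sum(c for bit, c in zip(selection, feature_costs) if bit)

def dominates(a, b):
    """Pareto dominance for minimization: a dominates b if it is no worse in
    every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert an (objectives, subset) pair into the external archive, keeping
    only nondominated entries."""
    objs, _ = candidate
    if any(dominates(kept_objs, objs) for kept_objs, _ in archive):
        return archive
    archive = [entry for entry in archive if not dominates(objs, entry[0])]
    return archive + [candidate]

if __name__ == "__main__":
    random.seed(1)
    costs = [2.0, 1.0, 5.0, 0.5]                    # hypothetical per-feature costs
    toy_error = lambda sel: 0.4 - 0.05 * sum(sel)   # stand-in for a real classifier's error
    archive = []
    for _ in range(20):                             # stand-in for the PSO main loop
        particle = [random.random() for _ in costs]
        subset = decode(particle)
        archive = update_archive(archive, (evaluate(subset, costs, toy_error), subset))
    for objectives, subset in archive:
        print(objectives, subset)
```

With the toy error function, selecting more features lowers the error but raises the cost, so the archive ends up holding a small trade-off front rather than a single best subset.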


Citations
Journal Article

Binary differential evolution with self-learning for multi-objective feature selection

TL;DR: Experimental results on a series of public datasets show that the effective combination of the binary mutation and OPS makes the MOFS-BDE achieve a trade-off between local exploitation and global exploration.
Journal Article

Self-Adaptive Particle Swarm Optimization for Large-Scale Feature Selection in Classification

TL;DR: The experimental results show that the solution size obtained by the SaPSO algorithm is smaller than its EC counterparts on all datasets, and it performs better than its non-EC and EC counterparts in terms of classification accuracy not only on most training sets but also on most test sets.
Journal Article

A survey on swarm intelligence approaches to feature selection in data mining

TL;DR: A comprehensive survey on the state-of-the-art works applying swarm intelligence to achieve feature selection in classification, with a focus on the representation and search mechanisms.
Journal Article

Review of swarm intelligence-based feature selection methods

TL;DR: A comparative analysis of different feature selection methods is presented, and a general categorization of these methods is performed, which shows the strengths and weaknesses of the studied swarm intelligence-based feature selection methods.
Journal Article

Variable-Size Cooperative Coevolutionary Particle Swarm Optimization for Feature Selection on High-Dimensional Data

TL;DR: The experimental results show that VS-CCPSO has the capability of obtaining good feature subsets, suggesting its competitiveness for tackling FS problems with high dimensionality.
References
Journal Article

A fast and elitist multiobjective genetic algorithm: NSGA-II

TL;DR: This paper suggests a non-dominated sorting-based MOEA, called NSGA-II (Non-dominated Sorting Genetic Algorithm II), which alleviates all of the above three difficulties, and modifies the definition of dominance in order to solve constrained multi-objective problems efficiently.
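The featured article reuses NSGA-II's crowding distance and nondominated comparison, so a compact sketch of the crowding-distance computation is included below; it follows the standard description of NSGA-II (minimization, objective tuples as input) rather than any particular library's API.

```python
def crowding_distance(front):
    """Crowding distance in the spirit of NSGA-II: for each objective, sort the
    front, assign boundary solutions an infinite distance, and accumulate the
    normalized gap between each interior solution's neighbors.

    `front` is a list of objective tuples; returns one distance per solution."""
    n = len(front)
    distance = [0.0] * n
    if n == 0:
        return distance
    for m in range(len(front[0])):
        order = sorted(range(n), key=lambda i: front[i][m])
        distance[order[0]] = distance[order[-1]] = float("inf")
        span = front[order[-1]][m] - front[order[0]][m]
        if span == 0.0:
            continue
        for k in range(1, n - 1):
            distance[order[k]] += (front[order[k + 1]][m] - front[order[k - 1]][m]) / span
    return distance
```

For instance, crowding_distance([(0.1, 9.0), (0.2, 4.0), (0.3, 1.0)]) returns [inf, 2.0, inf]: the boundary solutions are always preserved, while interior solutions are ranked by how isolated they are on the front.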
Proceedings Article

Particle swarm optimization

TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced, the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Journal Article

Textural Features for Image Classification

TL;DR: These results indicate that the easily computable textural features based on gray-tone spatial dependencies probably have a general applicability for a wide variety of image-classification applications.
Proceedings Article

A modified particle swarm optimizer

TL;DR: A new parameter, called inertia weight, is introduced into the original particle swarm optimizer, which resembles a flock of flying birds in that each particle adjusts its flight according to its own flying experience and its companions' flying experience.
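For context, the inertia-weight velocity update introduced in this reference is conventionally written as below (standard notation, not a quotation from the paper): $w$ is the inertia weight, $c_1$ and $c_2$ are acceleration coefficients, $r_1$ and $r_2$ are uniform random numbers in $[0, 1]$, $p_{id}$ is the particle's personal best, and $p_{gd}$ the global (or neighborhood) best.

```latex
v_{id}^{t+1} = w\, v_{id}^{t} + c_1 r_1 \bigl(p_{id} - x_{id}^{t}\bigr) + c_2 r_2 \bigl(p_{gd} - x_{id}^{t}\bigr),
\qquad
x_{id}^{t+1} = x_{id}^{t} + v_{id}^{t+1}
```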