Journal ArticleDOI

Multiobjective Particle Swarm Optimization for Feature Selection With Fuzzy Cost

TLDR
A fuzzy multiobjective FS method based on particle swarm optimization, called PSOMOFS, is studied; it develops a fuzzy dominance relationship to compare the goodness of candidate particles and defines a fuzzy crowding-distance measure to prune the elitist archive and determine the particles' global leader.
Abstract
Feature selection (FS) is an important data-processing technique in the field of machine learning. Various FS methods exist, but all assume that the cost associated with a feature is precise, which restricts their real-world applications. Focusing on the FS problem with fuzzy cost, this article studies a fuzzy multiobjective FS method with particle swarm optimization, called PSOMOFS. The proposed method develops a fuzzy dominance relationship to compare the goodness of candidate particles and defines a fuzzy crowding-distance measure to prune the elitist archive and determine the global leader of the particles. A tolerance coefficient is also introduced to ensure that the obtained Pareto-optimal solutions satisfy decision makers’ preferences. The developed method is applied to a series of UCI datasets and compared with three fuzzy multiobjective evolutionary methods and three typical multiobjective FS methods. Experimental results show that the proposed method achieves feature sets with superior approximation, diversity, and feature cost.
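For intuition, the following minimal Python sketch shows how a dominance check over (classification error, fuzzy feature cost) with a tolerance coefficient could look. The triangular fuzzy-number representation, the centroid defuzzification, and the way the tolerance relaxes the cost comparison are illustrative assumptions, not the paper's exact definitions.

from dataclasses import dataclass

@dataclass
class TriangularFuzzyCost:
    # Illustrative fuzzy cost with lower bound a, modal value b, upper bound c.
    a: float
    b: float
    c: float

    def defuzzify(self) -> float:
        # Centroid of a triangular membership function.
        return (self.a + self.b + self.c) / 3.0

def fuzzy_dominates(err1, cost1, err2, cost2, tolerance=0.0):
    # Hypothetical dominance check on (error, fuzzy cost): solution 1 may be
    # worse in defuzzified cost by at most `tolerance` and still dominate,
    # provided it is no worse overall and strictly better in at least one
    # objective.  A sketch, not the published fuzzy-dominance relation.
    c1, c2 = cost1.defuzzify(), cost2.defuzzify()
    no_worse = err1 <= err2 and c1 <= c2 + tolerance
    strictly_better = err1 < err2 or c1 < c2
    return no_worse and strictly_better

In a PSOMOFS-style loop, a check of this kind would decide which particles enter the elitist archive, and a crowding measure over the archive would then pick the swarm's global leader.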


Citations
Journal ArticleDOI

A comprehensive survey on recent metaheuristics for feature selection

TL;DR: This paper surveys the most outstanding metaheuristic feature selection algorithms of the last two decades, examining their exploration/exploitation operators, selection methods, transfer functions, fitness evaluations, and parameter-setting techniques.
Journal ArticleDOI

bSSA: Binary Salp Swarm Algorithm With Hybrid Data Transformation for Feature Selection

TL;DR: In this article, a new binary Salp Swarm Algorithm (bSSA) was proposed for selecting the best feature set from transformed datasets: the original dataset is first transformed with hybrid data-transformation methods based on Principal Component Analysis (PCA) and fast independent component analysis (FastICA), and a binary salp swarm optimizer is then used to find the best features.
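A minimal sketch of the hybrid transformation step described above, assuming scikit-learn is available; the component counts and the concatenation of the two projected views are illustrative choices, not the paper's configuration.

import numpy as np
from sklearn.decomposition import PCA, FastICA

def hybrid_transform(X, n_components=10):
    # Project the data with PCA and FastICA and concatenate both views;
    # a binary optimizer (such as a binary salp swarm algorithm) would then
    # search 0/1 masks over the transformed feature columns.
    pca_feats = PCA(n_components=n_components).fit_transform(X)
    ica_feats = FastICA(n_components=n_components).fit_transform(X)
    return np.hstack([pca_feats, ica_feats])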
Journal ArticleDOI

Multi-Objective Feature Selection With Missing Data in Classification

TL;DR: A novel modelling of FS is proposed that adds reliability as a third objective of the problem, and the non-dominated sorting genetic algorithm-III (NSGA-III) is applied to address the modified problem.
Journal ArticleDOI

Boosting Arithmetic Optimization Algorithm with Genetic Algorithm Operators for Feature Selection: Case Study on Cox Proportional Hazards Model

TL;DR: The findings of this paper illustrate that the proposed AOAGA method finds new best solutions for several test cases and obtains promising results compared with other methods published in the literature.
Journal ArticleDOI

Multi-objective PSO based online feature selection for multi-label classification

TL;DR: The proposed algorithm outperforms state-of-the-art approaches in most cases and automatically determines the subset of features best suited to multi-label classification.
References
Journal ArticleDOI

A fast and elitist multiobjective genetic algorithm: NSGA-II

TL;DR: This paper suggests a non-dominated sorting-based MOEA, called NSGA-II (Non-dominated Sorting Genetic Algorithm II), which alleviates the three main difficulties of earlier nondominated-sorting approaches (high computational complexity, nonelitism, and reliance on a sharing parameter) and modifies the definition of dominance in order to solve constrained multi-objective problems efficiently.
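Since PSOMOFS adapts the crowding idea from NSGA-II, a minimal sketch of the standard crowding-distance computation is given below; objective vectors are plain tuples, and this is the textbook formulation rather than any particular published implementation.

import math

def crowding_distance(front):
    # Crowding distance for a list of equal-length objective vectors
    # (NSGA-II style).  Boundary solutions get infinite distance so that
    # they are always preserved during truncation.
    n = len(front)
    if n == 0:
        return []
    dist = [0.0] * n
    for obj in range(len(front[0])):
        order = sorted(range(n), key=lambda i: front[i][obj])
        lo, hi = front[order[0]][obj], front[order[-1]][obj]
        dist[order[0]] = dist[order[-1]] = math.inf
        if hi == lo:
            continue
        for k in range(1, n - 1):
            gap = front[order[k + 1]][obj] - front[order[k - 1]][obj]
            dist[order[k]] += gap / (hi - lo)
    return dist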
Journal ArticleDOI

Wrappers for feature subset selection

TL;DR: The wrapper method searches for an optimal feature subset tailored to a particular algorithm and domain, and the paper compares the wrapper approach with induction without feature subset selection and with Relief, a filter approach to feature subset selection.
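As a concrete illustration of the wrapper idea (not the cited paper's exact setup), the sketch below greedily grows a feature subset using the cross-validated accuracy of a k-NN classifier as the evaluation function; scikit-learn is assumed to be available, and the estimator and stopping rule are arbitrary choices.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def forward_select(X, y, max_features=5, cv=5):
    # Greedy forward selection: at each step, add the feature whose inclusion
    # most improves cross-validated accuracy; stop when nothing improves.
    selected, remaining = [], list(range(X.shape[1]))
    best_score = -np.inf
    clf = KNeighborsClassifier(n_neighbors=5)
    while remaining and len(selected) < max_features:
        scores = {
            f: cross_val_score(clf, X[:, selected + [f]], y, cv=cv).mean()
            for f in remaining
        }
        f_best, s_best = max(scores.items(), key=lambda kv: kv[1])
        if s_best <= best_score:
            break
        selected.append(f_best)
        remaining.remove(f_best)
        best_score = s_best
    return selected, best_score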
Journal ArticleDOI

MOEA/D: A Multiobjective Evolutionary Algorithm Based on Decomposition

TL;DR: Experimental results have demonstrated that MOEA/D with simple decomposition methods outperforms or performs similarly to MOGLS and NSGA-II on multiobjective 0-1 knapsack problems and continuous multiobjective optimization problems.
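MOEA/D's decomposition replaces the multiobjective problem with many scalar subproblems; a minimal sketch of its commonly used weighted Tchebycheff scalarization follows (plain Python, illustrative).

def tchebycheff(objectives, weights, ideal):
    # Weighted Tchebycheff scalarization: each weight vector defines one
    # subproblem that minimizes the largest weighted deviation from the
    # ideal point.  Arguments are plain sequences of equal length.
    return max(w * abs(f - z) for f, w, z in zip(objectives, weights, ideal))

Each such subproblem is then optimized cooperatively, reusing information from the current solutions of its neighboring subproblems.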

SPEA2: Improving the strength pareto evolutionary algorithm

TL;DR: An improved version of SPEA, namely SPEA2, is proposed, which, in contrast to its predecessor, incorporates a fine-grained fitness assignment strategy, a density estimation technique, and an enhanced archive truncation method.
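For context, SPEA2's density estimate is the inverse of the distance to the k-th nearest neighbor in objective space; a minimal sketch follows, with k defaulting to the square root of the set size as the paper suggests (otherwise illustrative).

import math

def spea2_density(points, k=None):
    # Density D(i) = 1 / (sigma_k + 2), where sigma_k is the distance from
    # point i to its k-th nearest neighbour in objective space.
    n = len(points)
    if k is None:
        k = int(math.sqrt(n))  # SPEA2 suggests k = sqrt(population + archive size)
    densities = []
    for i, p in enumerate(points):
        dists = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        sigma_k = dists[min(k, len(dists)) - 1] if dists else 0.0
        densities.append(1.0 / (sigma_k + 2.0))
    return densities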
Journal ArticleDOI

A Survey on Evolutionary Computation Approaches to Feature Selection

TL;DR: This paper presents a comprehensive survey of state-of-the-art work on evolutionary computation (EC) for feature selection and identifies the contributions of the different algorithms.