A Fuzzy Mutual Information-based Feature Selection Method for Classification
TLDR
TL;DR: Experimental results show that the proposed feature selection method yields high classification accuracy on most high-dimensional datasets, and that the proposed classifier outperforms the traditional KNN classifier.
About:
This article is published in Fuzzy Information and Engineering. The article was published on 2016-09-01 and is currently open access. It has received 31 citations. The article focuses on the topics: Feature selection & Naive Bayes classifier.
Citations
Journal ArticleDOI
EFS-MI: an ensemble feature selection method for classification
TL;DR: This paper provides an ensemble feature selection method using feature-class and feature-feature mutual information to select an optimal subset of features by combining multiple subsets of features.
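The feature-class mutual information mentioned above can be sketched as follows. This is a minimal illustration, not code from the paper: the discrete toy features `f1`, `f2`, and the label vector `y` are invented for demonstration, and features are ranked by their estimated mutual information with the class.

```python
# Hedged sketch: ranking discrete features by feature-class mutual information,
# I(X;Y) = sum over (x,y) of p(x,y) * log2(p(x,y) / (p(x) * p(y))).
# The dataset below is illustrative only, not from the paper.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Estimate I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))          # joint counts
    px, py = Counter(xs), Counter(ys)   # marginal counts
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy data: feature f1 determines the class exactly, f2 is noise.
f1 = [0, 0, 1, 1, 0, 1, 0, 1]
f2 = [0, 1, 0, 1, 1, 0, 1, 0]
y  = [0, 0, 1, 1, 0, 1, 0, 1]

ranked = sorted({"f1": f1, "f2": f2}.items(),
                key=lambda kv: mutual_information(kv[1], y), reverse=True)
print([name for name, _ in ranked])  # → ['f1', 'f2']
```

Because `f1` matches the class exactly, its mutual information is 1 bit and it ranks above the noisy `f2`; an ensemble method would additionally penalize features that carry high mutual information with already-selected features.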
Journal ArticleDOI
Real-time DDoS attack detection using FPGA
TL;DR: A real-time DDoS attack detection method is proposed that uses a novel correlation measure to identify DDoS attacks; the FPGA implementation requires less than one microsecond to identify an attack.
Journal ArticleDOI
Detection of Parkinson’s disease based on voice patterns ranking and optimized support vector machine
TL;DR: An assessment of eight different pattern ranking techniques, each coupled with a nonlinear support vector machine (SVM) to distinguish PD patients from healthy control subjects, shows that the receiver operating characteristic and Wilcoxon-based ranking techniques provide the highest sensitivity and specificity.
Journal ArticleDOI
A novel hybrid wrapper–filter approach based on genetic algorithm, particle swarm optimization for feature subset selection
Fateme Moslehi, Abdorrahman Haeri +1 more
TL;DR: A hybrid filter-wrapper method for feature subset selection is proposed, integrating evolutionary genetic algorithms (GA) and particle swarm optimization (PSO) to reduce the computational cost and search time needed to reach an optimal solution to the feature selection problem on high-dimensional datasets.
Journal ArticleDOI
An efficient multivariate feature ranking method for gene selection in high-dimensional microarray data
TL;DR: Classification of microarray data plays a significant role in the diagnosis and prediction of cancer; however, its high dimensionality relative to the number of observations makes classification challenging, motivating a multivariate feature ranking method for gene selection.
References
Journal ArticleDOI
An introduction to variable and feature selection
Isabelle Guyon, André Elisseeff +1 more
TL;DR: The contributions of this special issue cover a wide range of aspects of variable selection: providing a better definition of the objective function, feature construction, feature ranking, multivariate feature selection, efficient search methods, and feature validity assessment methods.
Journal ArticleDOI
Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy
TL;DR: The maximal statistical dependency criterion based on mutual information is used to select good features; since max-dependency is hard to implement directly, an equivalent form, the minimal-redundancy-maximal-relevance (mRMR) criterion, is derived for first-order incremental feature selection, and a two-stage feature selection algorithm is presented that combines mRMR with more sophisticated feature selectors (e.g., wrappers).
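The first-order incremental selection described above can be sketched as a greedy loop: at each step, pick the candidate feature maximizing relevance I(f; class) minus the mean redundancy with already-selected features. This is a hedged illustration with invented toy data, not the authors' implementation.

```python
# Hedged sketch of first-order incremental mRMR selection on discrete data:
# score(f) = I(f; class) - (1/|S|) * sum over selected s of I(f; s).
from collections import Counter
from math import log2

def mi(xs, ys):
    """Mutual information in bits between two discrete sample vectors."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def mrmr(features, labels, k):
    selected, remaining = [], dict(features)
    while remaining and len(selected) < k:
        def score(name):
            relevance = mi(remaining[name], labels)
            redundancy = (sum(mi(remaining[name], features[s]) for s in selected)
                          / len(selected)) if selected else 0.0
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        del remaining[best]
    return selected

# Toy data: f1 is informative, f2 duplicates f1 (redundant), f3 is noise.
feats = {"f1": [0, 0, 1, 1, 0, 0], "f2": [0, 0, 1, 1, 0, 0],
         "f3": [0, 1, 0, 1, 1, 0]}
labels = [0, 0, 1, 1, 0, 1]
print(mrmr(feats, labels, 2))  # → ['f1', 'f3']
```

After `f1` is chosen, the duplicate `f2` is penalized by its full redundancy with `f1`, so the non-redundant `f3` is selected instead; a pure relevance ranking would have picked `f2` second.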
Journal Article
Understanding interobserver agreement: the kappa statistic.
TL;DR: Items such as physical exam findings, radiographic interpretations, or other diagnostic tests often rely on some degree of subjective interpretation by observers; studies that measure the agreement between two or more observers should therefore include a statistic that takes into account the fact that observers will sometimes agree or disagree simply by chance.
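The chance-corrected agreement statistic described above is Cohen's kappa, which can be computed directly from two raters' labels. The ratings below are invented toy data for illustration.

```python
# Hedged sketch: Cohen's kappa for two raters over the same items,
# kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and
# p_e is the agreement expected by chance from each rater's marginals.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Toy ratings: two observers label 10 findings as "pos"/"neg", agreeing on 9.
a = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg", "neg", "pos"]
b = ["pos", "pos", "neg", "neg", "pos", "neg", "neg", "neg", "neg", "pos"]
print(round(cohens_kappa(a, b), 3))  # → 0.8
```

Here observed agreement is 0.9 but chance agreement is 0.5, so kappa corrects the raw 90% figure down to 0.8.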
Book ChapterDOI
A Fast Elitist Non-dominated Sorting Genetic Algorithm for Multi-objective Optimisation: NSGA-II
TL;DR: Simulation results on five difficult test problems show that the proposed NSGA-II, in most problems, is able to find a much better spread of solutions and better convergence near the true Pareto-optimal front compared to PAES and SPEA, two other elitist multi-objective EAs that pay special attention to creating a diverse Pareto-optimal front.
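The core ranking step shared by elitist multi-objective EAs such as NSGA-II is extracting non-dominated fronts. The sketch below shows first-front extraction for minimization on invented bi-objective points; it is an illustration of the concept, not the paper's full algorithm (which also assigns subsequent fronts and crowding distances).

```python
# Hedged sketch: extracting the first non-dominated (Pareto) front,
# assuming all objectives are minimized. Toy points only.
def dominates(p, q):
    """p dominates q if p is no worse in every objective and better in one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def first_front(points):
    """Keep the points that no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Toy bi-objective values (both minimized): (3, 4) and (5, 5) are dominated.
pts = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
print(first_front(pts))  # → [(1, 5), (2, 3), (4, 1)]
```

NSGA-II repeats this peeling to rank the whole population into successive fronts, then breaks ties within a front by crowding distance to keep the front diverse.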