Journal ArticleDOI

Label correlation in multi-label classification using local attribute reductions with fuzzy rough sets

TLDR
To search for the indispensable and representative features of each label-related subset, a local-reduction-based feature selection method (LRFS-α) is designed; comprehensive experimental results on fifteen multi-label datasets characterize the performance of the methods against seven other multi-label learning methods.
About
This article is published in Fuzzy Sets and Systems. The article was published on 2022-01-15 and has received 14 citations to date. The article focuses on the topics: Correlation & Multi-label classification.


Citations
Journal ArticleDOI

Multi-label feature selection based on fuzzy neighborhood rough sets

TL;DR: Wang et al. proposed a multi-label feature selection method based on fuzzy neighborhood rough sets that considers the importance of features from multiple perspectives, addressing the fact that existing methods do not analyze features comprehensively enough.
Journal ArticleDOI

Glee: A granularity filter for feature selection

TL;DR: In this article, a novel framework named Glee (a granularity filter for feature selection) is proposed to improve the efficiency and effectiveness of feature selection; it not only eliminates the iterative calculation of information granulation throughout the selection process, but also yields a feature ordering that may be insensitive to data perturbation.
Journal ArticleDOI

Local Indiscernibility Relation Reduction for Information Tables

Xu Li, +2 more · 01 Jan 2022
TL;DR: The discernibility matrix for the proposed reduction is established, and examples for single- and multi-decision classes are presented to illustrate that the proposed local indiscernibility relation reduction can be applied to decision tables.
Journal ArticleDOI

A novel multi-label feature selection method with association rules and rough set

TL;DR: In this paper, a feature selection method was proposed to select a more relevant and compact feature subset by considering the label distribution and inter-label correlations, effectively removing irrelevant and redundant features from the feature space.
Journal ArticleDOI

Feature selection based on double-hierarchical and multiplication-optimal fusion measurement in fuzzy neighborhood rough sets

TL;DR: In this paper, two measurement strategies, class-hierarchical fusion and multiplication-optimal fusion, are proposed, and three measure-based heuristic feature selection algorithms are developed.
References
Book

Fuzzy sets

TL;DR: A separation theorem for convex fuzzy sets is proved without requiring that the fuzzy sets be disjoint.
Journal ArticleDOI

Multiple Comparisons among Means

TL;DR: In this paper, the authors considered the possibility of picking in advance a number (say m) of linear contrasts among k means, and then estimating these m linear contrasts by confidence intervals based on a Student t statistic, in such a way that the overall confidence level for the m intervals is greater than or equal to a preassigned value.
Journal ArticleDOI

ML-KNN: A lazy learning approach to multi-label learning

TL;DR: Experiments on three different real-world multi-label learning problems, i.e. yeast gene functional analysis, natural scene classification and automatic web page categorization, show that ML-KNN achieves superior performance to some well-established multi-label learning algorithms.
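The lazy-learning idea behind ML-KNN can be sketched in a few lines: for each label, count how many of the query's k nearest neighbours carry it, then apply a MAP rule whose likelihoods are estimated from neighbour counts on the training set. This is a simplified, hypothetical rendering for illustration only, not the authors' reference implementation (which treats neighbour statistics and smoothing more carefully):

```python
import numpy as np

def ml_knn_predict(X_train, Y_train, x, k=3, s=1.0):
    """Simplified ML-KNN prediction for a single instance x.
    Y_train is an (n_samples, n_labels) binary matrix; s is the
    Laplace smoothing parameter used in the MAP estimate."""
    n, q = Y_train.shape
    # Prior probability that each label is relevant.
    prior = (s + Y_train.sum(axis=0)) / (2 * s + n)

    # k nearest training neighbours of x (Euclidean distance).
    dists = np.linalg.norm(X_train - x, axis=1)
    c = Y_train[np.argsort(dists)[:k]].sum(axis=0)  # neighbours per label

    y_pred = np.zeros(q, dtype=int)
    for l in range(q):
        # For every training instance, count how many of its own
        # k neighbours (excluding itself) carry label l.
        counts = np.zeros(n, dtype=int)
        for i in range(n):
            d = np.linalg.norm(X_train - X_train[i], axis=1)
            d[i] = np.inf
            counts[i] = Y_train[np.argsort(d)[:k], l].sum()
        # Histograms of those counts, split by whether label l is present.
        kj = np.bincount(counts[Y_train[:, l] == 1], minlength=k + 1)
        kj_bar = np.bincount(counts[Y_train[:, l] == 0], minlength=k + 1)
        # Smoothed likelihoods of observing c[l] label-l neighbours.
        p1 = (s + kj[c[l]]) / (s * (k + 1) + kj.sum())
        p0 = (s + kj_bar[c[l]]) / (s * (k + 1) + kj_bar.sum())
        # MAP decision: label relevant iff the posterior favours it.
        y_pred[l] = int(prior[l] * p1 > (1 - prior[l]) * p0)
    return y_pred
```

On two well-separated clusters, each tied to its own label, the rule assigns a query the label of its surrounding cluster, which is the "lazy" behaviour the TL;DR refers to.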
Journal ArticleDOI

A Review On Multi-Label Learning Algorithms

TL;DR: This paper aims to provide a timely review on this area with emphasis on state-of-the-art multi-label learning algorithms with relevant analyses and discussions.
Journal ArticleDOI

Rough fuzzy sets and fuzzy rough sets

TL;DR: It is argued that both notions of a rough set and a fuzzy set aim to different purposes, and it is more natural to try to combine the two models of uncertainty (vagueness and coarseness) rather than to have them compete on the same problems.
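The combination of vagueness and coarseness summarized above is usually expressed through a pair of approximation operators. The following is a standard formulation from the fuzzy-rough literature (the choice of connectives varies by author, so treat it as a representative definition rather than the exact one used in this paper):

```latex
% Fuzzy rough approximations of a fuzzy set A over universe U,
% under a fuzzy relation R, with implicator I and t-norm T:
(\underline{R}A)(x) = \inf_{y \in U} \mathcal{I}\bigl(R(x,y),\, A(y)\bigr)
\qquad
(\overline{R}A)(x) = \sup_{y \in U} \mathcal{T}\bigl(R(x,y),\, A(y)\bigr)
```

The lower approximation measures how strongly everything similar to x belongs to A; the upper approximation measures whether anything similar to x belongs to A.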