scispace - formally typeset
Topic

Class (philosophy)

About: Class (philosophy) is a research topic. Over the lifetime, 821 publications have been published within this topic, receiving 28,000 citations.


Papers
Journal ArticleDOI
TL;DR: The experimental results show that DBRF can classify objects located on the class boundary, including objects of minority classes, by taking into account the density of objects in space.
Abstract: Many machine learning problem domains, such as the detection of fraud, spam, outliers, and anomalies, tend to involve inherently imbalanced class distributions of samples. However, most classification algorithms assume equivalent sample sizes for each class, so imbalanced classification datasets pose a significant challenge in prediction modeling. Herein, we propose a density-based random forest algorithm (DBRF) to improve prediction performance, especially for minority classes. DBRF is designed to recognize boundary samples as the most difficult to classify and then to augment them with a density-based method. Subsequently, two random forest classifiers are constructed to model the augmented boundary samples and the original dataset, respectively, and the final output is determined using a bagging technique. A real-world material classification dataset and 33 public imbalanced datasets were used to evaluate the performance of DBRF. Across the 34 datasets, DBRF achieved improvements of 2–15% over random forest in terms of the F1-measure and G-mean. The experimental results demonstrate the ability of DBRF to classify objects located on the class boundary, including objects of minority classes, by taking into account the density of objects in space.

9 citations
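The boundary-focused pipeline described in the abstract can be sketched in a few lines. This is a minimal illustration only, assuming a k-nearest-neighbour definition of "boundary sample" and scikit-learn forests; all names (`find_boundary_samples`, `fit_dbrf_like`, `predict_bagged`) are hypothetical, and the authors' method additionally augments the boundary samples with a density-based technique before training.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import NearestNeighbors

def find_boundary_samples(X, y, k=5):
    """Flag a sample as a boundary sample if any of its k nearest
    neighbours carries a different class label."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)
    # idx[:, 0] is the sample itself; inspect the remaining neighbours.
    return np.array([(y[neigh[1:]] != y[i]).any() for i, neigh in enumerate(idx)])

def fit_dbrf_like(X, y, k=5, seed=0):
    """Train one forest on the full data and one on the boundary region."""
    boundary = find_boundary_samples(X, y, k)
    rf_all = RandomForestClassifier(random_state=seed).fit(X, y)
    rf_boundary = RandomForestClassifier(random_state=seed).fit(X[boundary], y[boundary])
    return rf_all, rf_boundary

def predict_bagged(models, X):
    """Bagging-style combination: average the class probabilities."""
    probs = np.mean([m.predict_proba(X) for m in models], axis=0)
    return probs.argmax(axis=1)
```

The second forest concentrates capacity on exactly the region where minority-class errors occur, which is the intuition behind the reported F1 and G-mean gains.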

Proceedings ArticleDOI
16 Apr 2022
TL;DR: ProbExpan, a novel probabilistic ESE framework, is proposed; it uses the entity representations obtained by a contrastively trained language model to expand entities and outperforms previous state-of-the-art methods.
Abstract: Entity Set Expansion (ESE) is a promising task that aims to expand the entities of a target semantic class described by a small seed entity set. Various NLP and IR applications will benefit from ESE due to its ability to discover knowledge. Although previous ESE methods have achieved great progress, most of them still lack the ability to handle hard negative entities (i.e., entities that are difficult to distinguish from the target entities), since two entities may or may not belong to the same semantic class depending on the granularity level at which they are analyzed. To address this challenge, we devise an entity-level masked language model with contrastive learning to refine the representation of entities. In addition, we propose ProbExpan, a novel probabilistic ESE framework that uses the entity representations obtained by the aforementioned language model to expand entities. Extensive experiments and detailed analyses on three datasets show that our method outperforms previous state-of-the-art methods.

9 citations
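The basic expansion loop that any ESE framework builds on can be sketched with fixed vectors. This is a toy baseline under assumed names (`expand_entity_set`, `vocab_embeddings`): score every candidate by cosine similarity to the mean seed embedding and return the top-k. ProbExpan itself derives its entity representations from a contrastive masked language model and ranks probabilistically; none of that is reproduced here.

```python
import numpy as np

def expand_entity_set(seed, vocab_embeddings, top_k=3):
    """Rank candidate entities by cosine similarity to the mean
    embedding of the seed set and return the top_k non-seed entities."""
    names = list(vocab_embeddings)
    E = np.array([vocab_embeddings[n] for n in names], dtype=float)
    E = E / np.linalg.norm(E, axis=1, keepdims=True)
    # Prototype of the target semantic class: normalised mean of the seeds.
    proto = E[[names.index(s) for s in seed]].mean(axis=0)
    proto = proto / np.linalg.norm(proto)
    scores = E @ proto
    ranked = [n for _, n in sorted(zip(-scores, names))]
    return [n for n in ranked if n not in seed][:top_k]
```

The hard-negative problem the abstract describes shows up precisely here: a candidate can sit close to the prototype at one granularity (e.g. "city") while belonging to a different class at a finer one (e.g. "capital city").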

Book ChapterDOI
06 Jul 1999
TL;DR: The Extensional Set (XS) library is an extension of ECLiPSe that solves set-theoretical constraints over extensional sets containing variables with numeric domains; a domain representation and an approximate unification algorithm are proposed.
Abstract: The Extensional Set (XS) library is an extension of ECLiPSe that solves set-theoretical constraints over extensional sets containing variables with numeric domains. To process this class of set domains efficiently, the XS library employs a constraint programming method called Subdefinite Computations. Within that framework, a domain representation and an approximate unification algorithm are proposed. The abilities of the library are illustrated by a geometric application.

9 citations
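The subdefinite idea can be illustrated with a tiny set-interval domain: a set variable is represented by a pair (glb, lub) of elements known to be in the set and elements that may be in the set, and constraints narrow both bounds. This is a generic constraint-propagation sketch with a hypothetical function name (`propagate_subset`), not the XS library's API or its unification algorithm.

```python
def propagate_subset(x, y):
    """Enforce X ⊆ Y on set intervals x = (glb_x, lub_x), y = (glb_y, lub_y),
    where glb ⊆ actual set ⊆ lub. Returns the narrowed intervals."""
    glb_x, lub_x = x
    glb_y, lub_y = y
    lub_x = lub_x & lub_y   # X can only contain what Y may contain
    glb_y = glb_y | glb_x   # Y must contain everything X must contain
    if not glb_x <= lub_x or not glb_y <= lub_y:
        raise ValueError("inconsistent set constraint")
    return (glb_x, lub_x), (glb_y, lub_y)
```

Repeating such narrowing steps over all constraints until a fixed point is reached is the essence of propagation over subdefinite (interval-bounded) set domains.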

Journal ArticleDOI
TL;DR: This paper proposes a Label-guided Self-training approach to Semi-supervised Learning (LaSSL), which improves pseudo-label generation through two mutually reinforcing strategies; LaSSL is evaluated on several classification benchmarks under partially labeled settings and demonstrates its superiority over state-of-the-art approaches.
Abstract: The key to semi-supervised learning (SSL) is to extract adequate information from the unlabeled data. Current dominant approaches generate pseudo-labels on weakly augmented instances and train models on their strongly augmented variants when the pseudo-labels have high confidence. However, such methods are limited by the exclusion of samples with low-confidence pseudo-labels and the under-utilization of the label information. In this paper, we emphasize the crucial role of the label information and propose a Label-guided Self-training approach to Semi-supervised Learning (LaSSL), which improves pseudo-label generation through two mutually reinforcing strategies. First, with the ground-truth labels and iteratively polished pseudo-labels, we explore instance relations among all samples and minimize a class-aware contrastive loss to learn discriminative feature representations that pull same-class samples together and push different-class samples apart. Second, on top of the improved feature representations, we propagate the label information to the unlabeled samples across the potential data manifold at the feature-embedding level, which further improves the labelling of samples with reference to their neighbours. These two strategies are seamlessly integrated and promote each other across the whole training process. We evaluate LaSSL on several classification benchmarks under partially labeled settings and demonstrate its superiority over state-of-the-art approaches.

9 citations
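The class-aware contrastive loss of the first strategy can be sketched in NumPy in the spirit of supervised contrastive learning: for each anchor, same-label samples (ground truth or pseudo-label) are positives and all others are negatives. This is a minimal stand-in under assumed names (`class_aware_contrastive_loss`, temperature `tau`), not the paper's exact loss.

```python
import numpy as np

def class_aware_contrastive_loss(Z, labels, tau=0.5):
    """Supervised contrastive loss over L2-normalised embeddings Z (n, d).
    Same-class pairs are pulled together, different-class pairs pushed apart."""
    sim = Z @ Z.T / tau
    np.fill_diagonal(sim, -np.inf)                       # exclude self-pairs
    # Log-softmax of each row over all other samples.
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    same = labels[:, None] == labels[None, :]
    np.fill_diagonal(same, False)
    # Average log-probability over each anchor's positives.
    pos = np.where(same, log_prob, 0.0).sum(axis=1)
    per_anchor = pos / np.maximum(same.sum(axis=1), 1)
    return -per_anchor.mean()
```

The loss is small when embeddings cluster by label and large when labels are mixed within a cluster, which is what drives the representation toward the "gathered/scattered" geometry the abstract describes.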

Journal ArticleDOI
TL;DR: In this paper, the authors introduce conditional PINNs (physics-informed neural networks) for estimating the solutions of classes of eigenvalue problems; the approach learns not only the solution of one particular differential equation but also the solutions to a whole class of problems.

9 citations
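The "conditional" idea is that the unknown function takes the equation parameter λ as an extra input, so one model covers the whole class of problems. As a toy illustration (names and the model problem are assumptions, not from the paper), instead of training a network we plug the analytic family u(x, λ) = sin(√λ·x), which solves u″ + λu = 0 for every λ, into the PDE residual that a conditional PINN would minimise.

```python
import numpy as np

def residual(u_fn, lam, x, h=1e-4):
    """Residual r(x; λ) = u''(x) + λ·u(x) of the model eigenvalue problem
    u'' + λu = 0, with u'' estimated by central differences."""
    u = lambda t: u_fn(t, lam)
    upp = (u(x + h) - 2.0 * u(x) + u(x - h)) / h**2
    return upp + lam * u(x)

# Analytic solution family standing in for a trained network u_theta(x, λ).
u_family = lambda x, lam: np.sin(np.sqrt(lam) * x)
```

The residual vanishes (up to discretisation error) for every λ simultaneously, which is exactly the property a conditional PINN is trained to achieve across the parameter range rather than at a single fixed λ.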


Performance
Metrics
No. of papers in the topic in previous years
Year	Papers
2024	1
2023	11,771
2022	23,753
2021	380
2020	186
2019	62