Hui Xiong

Researcher at Rutgers University

Publications: 564
Citations: 24,993

Hui Xiong is an academic researcher at Rutgers University. He has contributed to research on topics including computer science and cluster analysis, has an h-index of 69, and has co-authored 470 publications receiving 16,776 citations. His previous affiliations include the Hong Kong University of Science and Technology and the National University of Singapore.

Papers
Journal ArticleDOI

A Comprehensive Survey on Transfer Learning

TL;DR: Transfer learning aims to improve the performance of learners on target domains by transferring knowledge from different but related source domains, reducing the dependence on large amounts of target-domain data when constructing target learners.
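
A common instance of this idea is parameter transfer: pretrain a network on plentiful source-domain data, then freeze its feature extractor and retrain only a small head on the scarce target-domain data. The PyTorch sketch below uses synthetic data and a toy MLP purely for illustration; the architecture, data, and hyperparameters are assumptions, not the survey's benchmark setup.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_domain(n, shift):
    # Synthetic binary task; `shift` makes source and target related but different.
    X = torch.randn(n, 20) + shift
    y = (X.sum(dim=1) > shift * 20).long()
    return X, y

backbone = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU())
head = nn.Linear(64, 2)

def train(model, X, y, epochs, lr=1e-2):
    opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(X), y)
        loss.backward()
        opt.step()

# 1) Pretrain backbone + head on the large source domain.
Xs, ys = make_domain(5000, shift=0.0)
train(nn.Sequential(backbone, head), Xs, ys, epochs=50)

# 2) Transfer: freeze the backbone, fit a fresh head on few target samples.
for p in backbone.parameters():
    p.requires_grad = False
target_head = nn.Linear(64, 2)
Xt, yt = make_domain(100, shift=0.5)
train(nn.Sequential(backbone, target_head), Xt, yt, epochs=100)
```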
Posted Content

Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting

TL;DR: Informer, an efficient transformer-based model for long sequence time-series forecasting (LSTF), whose ProbSparse self-attention mechanism achieves $O(L \log L)$ time complexity and memory usage while retaining comparable performance on sequence dependency alignment.
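
The core idea of ProbSparse self-attention is that only a few "active" queries dominate the attention distribution, so full attention is computed for just the top-u queries (ranked by a max-minus-mean sparsity measure of their scores), while the remaining queries fall back to the mean of the values. A minimal single-head NumPy sketch under those assumptions; unlike the paper, it computes the full score matrix for the measure rather than sampling keys, so it illustrates the selection rule, not the $O(L \log L)$ cost:

```python
import numpy as np

def probsparse_attention(Q, K, V, u):
    """Simplified ProbSparse self-attention (single head).

    Q, K, V: (L, d) arrays; u: number of "active" queries kept.
    Inactive queries receive the mean of V as a cheap fallback.
    """
    L, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                # (L, L) raw attention scores
    # Sparsity measure M(q, K): max score minus mean score per query.
    M = scores.max(axis=1) - scores.mean(axis=1)
    top = np.argsort(M)[-u:]                     # indices of the u most active queries
    out = np.tile(V.mean(axis=0), (L, 1))        # default output: mean of values
    sel = scores[top]                            # full softmax attention only for active queries
    weights = np.exp(sel - sel.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    out[top] = weights @ V
    return out

rng = np.random.default_rng(0)
L, d = 96, 16
Q, K, V = (rng.standard_normal((L, d)) for _ in range(3))
print(probsparse_attention(Q, K, V, u=int(np.ceil(np.log(L)))).shape)  # (96, 16)
```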
Proceedings ArticleDOI

Understanding of Internal Clustering Validation Measures

TL;DR: A detailed study of 11 widely used internal clustering validation measures for crisp clustering, showing that S_Dbw is the only internal validation measure that performs well in all five studied aspects, while the other measures have limitations in different application scenarios.
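
Internal validation measures score a clustering using only the data and the partition, typically trading off intra-cluster compactness against inter-cluster separation. S_Dbw itself is not available in scikit-learn, so the sketch below uses the silhouette and Davies-Bouldin indices as illustrative stand-ins to compare candidate values of k; this is an assumption for demonstration, not the paper's protocol:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score, davies_bouldin_score

# Toy data with 4 well-separated clusters.
X, _ = make_blobs(n_samples=500, centers=4, cluster_std=0.8, random_state=0)

for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    # Silhouette: higher is better; Davies-Bouldin: lower is better.
    print(f"k={k}  silhouette={silhouette_score(X, labels):.3f}  "
          f"davies_bouldin={davies_bouldin_score(X, labels):.3f}")
```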
Journal ArticleDOI

Discovering colocation patterns from spatial data sets: a general approach

TL;DR: A transaction-free approach to mining colocation patterns using the concept of proximity neighborhoods and a new interest measure, the participation index, which possesses an anti-monotone property that can be exploited for computational efficiency.
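
The participation ratio of a feature in a colocation is the fraction of that feature's instances that have at least one neighboring instance of every other feature in the pattern; the participation index is the minimum ratio across the pattern's features, and since adding a feature can only shrink this minimum, the index is anti-monotone. A minimal sketch for a two-feature colocation under a Euclidean distance-threshold neighborhood (the toy data and threshold are assumptions):

```python
import numpy as np

def participation_index(instances_a, instances_b, radius):
    """Participation index of the pair colocation {A, B}.

    instances_a, instances_b: (n, 2) arrays of point coordinates.
    Two instances are neighbors if their Euclidean distance <= radius.
    """
    # Pairwise distances between every A instance and every B instance.
    d = np.linalg.norm(instances_a[:, None, :] - instances_b[None, :, :], axis=2)
    near = d <= radius
    # Participation ratio: fraction of a feature's instances that have a
    # neighbor of the other feature; the index is the minimum over features.
    pr_a = near.any(axis=1).mean()
    pr_b = near.any(axis=0).mean()
    return min(pr_a, pr_b)

rng = np.random.default_rng(1)
A = rng.uniform(0, 10, size=(30, 2))
B = A[:20] + rng.normal(0, 0.2, size=(20, 2))  # B instances clustered near some A
print(f"PI({{A, B}}) = {participation_index(A, B, radius=0.5):.2f}")
```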
Proceedings Article

Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting

TL;DR: Informer proposes a ProbSparse self-attention mechanism that achieves $O(L \log L)$ time complexity and memory usage while maintaining comparable performance on sequence dependency alignment.