Yi Zhen
Researcher at Georgia Institute of Technology
Publications - 24
Citations - 1060
Yi Zhen is an academic researcher from the Georgia Institute of Technology. The author has contributed to research on the topics of hash functions and nearest neighbor search, has an h-index of 13, and has co-authored 23 publications receiving 988 citations. Previous affiliations of Yi Zhen include Duke University and the University of Georgia.
Papers
Proceedings ArticleDOI
A probabilistic model for multimodal hash function learning
Yi Zhen, Dit-Yan Yeung +1 more
TL;DR: This paper proposes a probabilistic model, called MLBE, to learn hash functions from multimodal data automatically, and devises an efficient algorithm for learning the binary latent factors, which corresponds to learning the hash functions.
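For readers unfamiliar with the setting these papers share, the sketch below illustrates the generic cross-modal hashing setup, not MLBE's learning algorithm itself: each modality gets its own hash function mapping features into a common Hamming space, and retrieval is done by Hamming distance. The random-projection hash functions, toy data, and all names here are illustrative assumptions, standing in for the learned functions a method like MLBE would produce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy paired data: 100 image-like vectors (64-d) and 100 text-like vectors (32-d).
X_img = rng.standard_normal((100, 64))
X_txt = rng.standard_normal((100, 32))

n_bits = 16

# One linear hash function per modality: sign of a random projection.
# (A learning-based method would fit these projections to the data instead.)
W_img = rng.standard_normal((64, n_bits))
W_txt = rng.standard_normal((32, n_bits))

def hash_codes(X, W):
    """Map real-valued features to n_bits binary codes in {0, 1}."""
    return (X @ W > 0).astype(np.uint8)

B_img = hash_codes(X_img, W_img)
B_txt = hash_codes(X_txt, W_txt)

def hamming_search(query_code, db_codes, k=5):
    """Return indices of the k database codes closest in Hamming distance."""
    dists = np.count_nonzero(db_codes != query_code, axis=1)
    return np.argsort(dists)[:k]

# Cross-modal query: use a text code to retrieve nearby image codes.
neighbors = hamming_search(B_txt[0], B_img, k=5)
print(neighbors)
```

Because both modalities land in the same Hamming space, a query from either modality can retrieve items from the other with cheap bitwise comparisons, which is what makes hashing attractive for large-scale multimedia search.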
Proceedings Article
Co-Regularized Hashing for Multimodal Data
Yi Zhen, Dit-Yan Yeung +1 more
TL;DR: This paper proposes a novel multimodal hash function learning method, called Co-Regularized Hashing (CRH), based on a boosted co-regularization framework, and empirically compares CRH with two state-of-the-art multimodal hash function learning methods on two publicly available data sets.
Proceedings ArticleDOI
TagiCoFi: tag informed collaborative filtering
Yi Zhen, Wu-Jun Li, Dit-Yan Yeung +2 more
TL;DR: This paper proposes a novel framework, called tag informed collaborative filtering (TagiCoFi), to seamlessly integrate tagging information into the CF procedure, and demonstrates that TagiCoFi outperforms its counterpart that discards tagging information even when it is available, achieving state-of-the-art performance.
Journal ArticleDOI
Spectral Multimodal Hashing and Its Application to Multimedia Retrieval
TL;DR: This paper proposes a new hashing-based method for fast multimodal multimedia retrieval, based on spectral analysis of the correlation matrix across modalities, and develops an efficient algorithm that learns the parameters of the binary-code mapping from the data distribution.
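As a rough illustration of the spectral idea described above, and not the paper's exact algorithm, the sketch below takes two paired toy modalities, computes their cross-correlation matrix, uses its top singular vectors as one projection per modality into a shared low-dimensional space, and binarizes the projections by sign to obtain Hamming-space codes. All data and names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy paired modalities: 200 samples with 20-d and 15-d features.
X = rng.standard_normal((200, 20))
Y = rng.standard_normal((200, 15))

# Center each modality so the cross-correlation is meaningful.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)

# Spectral step: singular vectors of the cross-correlation matrix
# give one linear projection per modality into a shared subspace.
C = Xc.T @ Yc / len(Xc)          # 20 x 15 cross-correlation matrix
U, s, Vt = np.linalg.svd(C, full_matrices=False)

n_bits = 8
P_x = U[:, :n_bits]              # projection for modality X
P_y = Vt[:n_bits].T              # projection for modality Y

# Binarize the shared-space embeddings by sign to get binary codes.
B_x = (Xc @ P_x > 0).astype(np.uint8)
B_y = (Yc @ P_y > 0).astype(np.uint8)

print(B_x.shape, B_y.shape)      # (200, 8) (200, 8)
```

The sign thresholding is the standard trick for turning a real-valued spectral embedding into binary codes; actual spectral hashing methods refine both the projection and the binarization beyond this sketch.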
Proceedings Article
Parametric local multimodal hashing for cross-view similarity search
TL;DR: A novel multimodal hash function learning (HFL) method, called Parametric Local Multimodal Hashing (PLMH), which learns a set of hash functions that locally adapt to the data structure of each modality, balancing locality and computational efficiency.