Open Access Proceedings Article

A multi-label least-squares hashing for scalable image search

TLDR
A Multi-label Least-Squares Hashing (MLSH) method is proposed for hashing multi-label data; it outperforms several state-of-the-art hashing methods, both supervised and unsupervised.
Abstract
Recently, hashing methods have attracted increasing attention for their effectiveness in large-scale data search, e.g., on image and video data. For different scenarios, unsupervised, supervised and semi-supervised hashing methods have been proposed. In particular, when semantic information is available, supervised hashing methods show better performance than unsupervised ones. In many practical applications, one sample usually has more than one label, a setting addressed by multi-label learning; however, few supervised hashing methods consider this scenario. In this paper, we propose a Multi-label Least-Squares Hashing (MLSH) method for multi-label data hashing. It works directly on multi-label data; moreover, unlike other hashing methods that learn hashing functions on the original data, MLSH first utilizes the equivalent form of CCA and least squares to project the original multi-label data into a lower-dimensional space; it then learns the projection matrix in that space and obtains the final binary codes. MLSH is evaluated on NUS-WIDE and CIFAR-100, which are widely used for search tasks. The results show that MLSH outperforms several state-of-the-art hashing methods, both supervised and unsupervised.
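The pipeline the abstract describes — project multi-label data via the least-squares form of CCA, then binarize in the reduced space — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' exact formulation: the class-normalized target Y(YᵀY)^{-1/2} is the standard least-squares CCA target, and the mean-thresholding binarization is an assumed placeholder for MLSH's learned projection step.

```python
import numpy as np

def ls_cca_projection(X, Y, reg=1e-6):
    """Least-squares form of CCA: regress features X onto the
    class-normalized label target T = Y (Y^T Y)^{-1/2}."""
    G = Y.T @ Y + reg * np.eye(Y.shape[1])        # regularized label Gram matrix
    vals, vecs = np.linalg.eigh(G)                # G is symmetric positive definite
    G_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    T = Y @ G_inv_sqrt                            # normalized multi-label target
    W, *_ = np.linalg.lstsq(X, T, rcond=None)     # argmin_W ||X W - T||^2
    return W

def binary_codes(X, W):
    Z = X @ W                   # lower-dimensional projection
    return Z > Z.mean(axis=0)   # placeholder binarization: threshold at the mean

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))               # 200 samples, 64-d features
Y = (rng.random((200, 10)) < 0.2).astype(float)  # 10 binary labels per sample
W = ls_cca_projection(X, Y)
B = binary_codes(X, W)
print(B.shape)  # (200, 10)
```

The code length here equals the number of label dimensions; in practice the target code length would be a free parameter chosen independently of the label count.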


Citations

Pattern Recognition and Machine Learning

TL;DR: Probability distributions and linear models for regression and classification are presented, along with a discussion of combining models in the context of machine learning.
Proceedings Article

Supervised Robust Discrete Multimodal Hashing for Cross-Media Retrieval

TL;DR: A novel supervised hashing framework for cross-modal retrieval, Supervised Robust Discrete Multimodal Hashing (SRDMH), which makes the final binary codes preserve the label information present in the original data, so that more label information can be leveraged to supervise binary code learning.
Proceedings Article

Discrete Multi-view Hashing for Effective Image Retrieval

TL;DR: A novel hashing method, Discrete Multi-view Hashing (DMVH), which works directly on multi-view data and makes full use of the rich information it contains, together with a novel approach to constructing the similarity matrix that preserves both local similarity structure and semantic similarity between data points.
Proceedings Article

Dictionary Learning Based Hashing for Cross-Modal Retrieval

TL;DR: DLCMH learns dictionaries and generates a sparse representation for each instance, which is more suitable for projection into the latent space; it outperforms or is comparable to several state-of-the-art hashing models.
Journal Article

Linear unsupervised hashing for ANN search in Euclidean space

TL;DR: An unsupervised hashing method, Unsupervised Euclidean Hashing (USEH), which learns and generates hashing codes that preserve the Euclidean distance relationships between data points and is comparable to state-of-the-art unsupervised hashing methods.
References
Book

Pattern Recognition and Machine Learning

TL;DR: Probability Distributions, Linear Models for Regression, Linear Models for Classification, Neural Networks, Graphical Models, Mixture Models and EM, Sampling Methods, Continuous Latent Variables, and Sequential Data are studied.
Book

The Elements of Statistical Learning: Data Mining, Inference, and Prediction

TL;DR: The authors describe the important ideas in data mining, inference, and prediction within a common conceptual framework; the emphasis is on concepts rather than mathematics, with liberal use of color graphics.
Dissertation

Learning Multiple Layers of Features from Tiny Images

TL;DR: The author describes how to train a multi-layer generative model of natural images using a dataset of millions of tiny colour images.
Book Chapter

Relations Between Two Sets of Variates

TL;DR: The concepts of correlation and regression may be applied not only to ordinary one-dimensional variates but also to variates of two or more dimensions; for such variates, the correlation of the horizontal components is ordinarily discussed, whereas the complex consisting of horizontal and vertical deviations may be even more interesting.
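The relation this reference (Hotelling's CCA) formalizes — correlation between two whole sets of variates rather than two scalars — can be computed as the singular values of the whitened cross-covariance matrix. A minimal sketch, assuming two noisy views of a shared signal; all names here are illustrative:

```python
import numpy as np

def canonical_correlations(X, Y, reg=1e-8):
    """Canonical correlations between two centered variate sets:
    singular values of Cxx^{-1/2} Cxy Cyy^{-1/2}."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Cxx = X.T @ X / len(X) + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / len(Y) + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / len(X)

    def inv_sqrt(C):  # inverse matrix square root via eigendecomposition
        w, V = np.linalg.eigh(C)
        return V @ np.diag(w ** -0.5) @ V.T

    M = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    return np.linalg.svd(M, compute_uv=False)

rng = np.random.default_rng(1)
Z = rng.standard_normal((500, 2))              # shared latent signal
X = Z + 0.1 * rng.standard_normal((500, 2))    # view 1: noisy copy of Z
Y = Z + 0.1 * rng.standard_normal((500, 2))    # view 2: noisy copy of Z
rho = canonical_correlations(X, Y)
print(rho)  # both canonical correlations close to 1
```

MLSH relies on the equivalence between this eigenproblem formulation and a plain least-squares regression onto normalized targets, which avoids forming the whitening matrices explicitly on large data.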