Open Access · Posted Content
Supervised Deep Hashing for Hierarchical Labeled Data
TL;DR: Zhang et al. propose a novel deep hashing method, supervised hierarchical deep hashing (SHDH), to perform hash code learning for hierarchical labeled data: a similarity for hierarchical labels is defined by weighting each layer of the hierarchy, and a deep convolutional neural network is designed to obtain a hash code for each data point.
Abstract:
Recently, hashing methods have been widely used in large-scale image retrieval. However, most existing hashing methods do not consider the hierarchical relation of labels, and thus ignore the rich information stored in the hierarchy. Moreover, most previous works treat each bit in a hash code equally, which does not suit the scenario of hierarchical labeled data. In this paper, we propose a novel deep hashing method, called supervised hierarchical deep hashing (SHDH), to perform hash code learning for hierarchical labeled data. Specifically, we define a novel similarity formula for hierarchical labeled data by weighting each layer, and design a deep convolutional neural network to obtain a hash code for each data point. Extensive experiments on several real-world public datasets show that the proposed method outperforms the state-of-the-art baselines in the image retrieval task.
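The abstract's core idea of weighting each layer of the label hierarchy can be illustrated with a short sketch. This is a hypothetical formula, not the paper's exact definition: coarser layers receive larger weights, and agreement stops counting once the two label paths diverge.

```python
# Hypothetical sketch (not SHDH's exact formula) of a layer-weighted
# similarity for hierarchical labels: coarser layers get larger weights,
# and agreement only counts until the two label paths diverge.

def hierarchical_similarity(path_a, path_b, weights):
    """path_a, path_b: label paths from root to leaf; weights: one per layer."""
    assert len(path_a) == len(path_b) == len(weights)
    sim = 0.0
    for la, lb, w in zip(path_a, path_b, weights):
        if la != lb:
            break  # deeper layers cannot match once the paths diverge
        sim += w
    return sim / sum(weights)  # normalize to [0, 1]

# Two images sharing the 'animal' and 'dog' layers but differing at the leaf:
print(hierarchical_similarity(['animal', 'dog', 'husky'],
                              ['animal', 'dog', 'poodle'],
                              weights=[4, 2, 1]))  # -> 6/7 ≈ 0.857
```

Under such a similarity, two images agreeing only at coarse layers are still partially similar, which is exactly the information a flat label similarity discards.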
Citations
Posted Content
Deep Class-Wise Hashing: Semantics-Preserving Hashing via Class-wise Loss
Xuefei Zhe, Shifeng Chen, Hong Yan, +2 more
TL;DR: This model is motivated by deep metric learning: it directly takes semantic labels as supervision during training and generates a discriminative hash code that preserves within-class semantic variation while penalizing overlap between different classes in the embedding space.
Journal Article (DOI)
Piecewise supervised deep hashing for image retrieval
TL;DR: A novel hash code generation method based on a convolutional neural network (CNN), called piecewise supervised deep hashing (PSDH), which directly uses a latent layer and the output layer of the classification network to generate a two-segment hash code for every input image.
Posted Content
Semantic Hierarchy Preserving Deep Hashing for Large-scale Image Retrieval
TL;DR: This paper presents an effective algorithm to train a deep hashing model that can preserve a semantic hierarchy structure for large-scale image retrieval and achieves state-of-the-art results in terms of hierarchical retrieval.
Posted Content
SHREWD: Semantic Hierarchy-based Relational Embeddings for Weakly-supervised Deep Hashing.
Heikki Arponen, Tom E. Bishop, +1 more
TL;DR: This work builds upon the idea of using semantic hierarchies to define distance metrics between all available sample labels, encouraging the deep neural network embeddings to respect these distances, and introduces an empirical Kullback-Leibler divergence loss term to promote binarization and uniformity of the embeddings.
Journal Article (DOI)
Online Enhanced Semantic Hashing: Towards Effective and Efficient Retrieval for Streaming Multi-Modal Data
TL;DR: Wang et al. propose a new model, termed Online enhAnced SemantIc haShing (OASIS), which designs a semantic-enhanced representation of the data that helps handle newly arriving classes and thereby constructs an enhanced semantic objective function.
References
Proceedings Article (DOI)
ImageNet: A large-scale hierarchical image database
TL;DR: A new database called “ImageNet” is introduced, a large-scale ontology of images built upon the backbone of the WordNet structure, much larger in scale and diversity and much more accurate than the current image datasets.
Proceedings Article (DOI)
Locality-sensitive hashing scheme based on p-stable distributions
TL;DR: A novel locality-sensitive hashing scheme for the approximate nearest neighbor problem under the lp norm, based on p-stable distributions, which improves the running time of the earlier algorithm and yields the first known provably efficient approximate NN algorithm for the case p < 1.
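The p-stable construction summarized above hashes a vector v to floor((a · v + b) / w), where a is drawn from a p-stable distribution. A minimal sketch for the l2 case (p = 2, where the Gaussian is 2-stable) follows; the parameter names are illustrative, not taken from the paper:

```python
import math
import random

# Minimal sketch of an l2 locality-sensitive hash from p-stable distributions:
# h(v) = floor((a . v + b) / w), with a drawn from a Gaussian (2-stable for l2)
# and offset b uniform in [0, w). Nearby vectors are likely (not guaranteed)
# to land in the same bucket.

def make_hash(dim, w, rng):
    a = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    b = rng.uniform(0.0, w)
    def h(v):
        return math.floor((sum(ai * vi for ai, vi in zip(a, v)) + b) / w)
    return h

rng = random.Random(0)
h = make_hash(dim=3, w=4.0, rng=rng)
# A single hash collides too often on distant points; practical schemes
# concatenate several such functions and use multiple tables.
print(h([1.0, 2.0, 3.0]), h([1.05, 2.0, 3.0]))
```

The bucket width w trades off collision probability for near versus far points; the paper analyzes how to choose it for a target approximation ratio.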
Proceedings Article (DOI)
Iterative quantization: A procrustean approach to learning binary codes
Yunchao Gong, Svetlana Lazebnik, +1 more
TL;DR: A simple and efficient alternating minimization scheme for finding a rotation of zero-centered data so as to minimize the quantization error of mapping this data to the vertices of a zero-centered binary hypercube.
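The alternating scheme summarized above can be sketched in a few lines; this is a minimal re-implementation for illustration (assuming NumPy and random zero-centered toy data), not the authors' code:

```python
import numpy as np

# Sketch of ITQ's alternating minimization: given zero-centered projected
# data V (n x c), find an orthogonal rotation R and binary codes B that
# minimize the quantization error ||B - V R||_F.

def itq_rotation(V, n_iter=50, seed=0):
    c = V.shape[1]
    rng = np.random.default_rng(seed)
    R, _ = np.linalg.qr(rng.standard_normal((c, c)))  # random orthogonal start
    for _ in range(n_iter):
        B = np.sign(V @ R)                 # fix R: update the binary codes
        U, _, Vt = np.linalg.svd(B.T @ V)  # fix B: orthogonal Procrustes step
        R = Vt.T @ U.T
    return R, np.sign(V @ R)

# Toy usage on random zero-centered data (8-bit codes for 100 points):
rng = np.random.default_rng(1)
V = rng.standard_normal((100, 8))
V -= V.mean(axis=0)
R, B = itq_rotation(V)
```

Each half-step has a closed-form optimum (sign thresholding and an SVD-based Procrustes solution), which is why the alternation is both simple and efficient.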
Proceedings Article (DOI)
Kernelized locality-sensitive hashing for scalable image search
Brian Kulis, Kristen Grauman, +1 more
TL;DR: It is shown how to generalize locality-sensitive hashing to accommodate arbitrary kernel functions, making it possible to preserve the algorithm's sub-linear time similarity search guarantees for a wide class of useful similarity functions.
Proceedings Article
Minimal Loss Hashing for Compact Binary Codes
Mohammad Norouzi, David J. Fleet
TL;DR: The formulation is based on structured prediction with latent variables and a hinge-like loss function; it is efficient to train on large datasets, scales well to long code lengths, and outperforms state-of-the-art methods.