Open Access Proceedings Article

Boosting Complementary Hash Tables for Fast Nearest Neighbor Search

TL;DR
Extensive experiments on two popular tasks, Euclidean and semantic nearest neighbor search, demonstrate that the proposed boosted complementary hash-tables method achieves strong table complementarity and significantly outperforms state-of-the-art methods.
Abstract
Hashing has been proven a promising technique for fast nearest neighbor search over massive databases. In many practical tasks, one usually builds multiple hash tables to reach a desired level of recall. However, existing multi-table hashing methods suffer from heavy table redundancy, lacking strong table complementarity and effective hash code learning. To address this problem, this paper proposes a multi-table learning method that pursues a specified number of complementary and informative hash tables from the perspective of ensemble learning. By regarding each hash table as a neighbor prediction model, the multi-table search procedure boils down to a linear assembly of the predictions stemming from multiple tables. Therefore, a sequential updating and learning framework is naturally established via a boosting mechanism, theoretically guaranteeing table complementarity and algorithmic convergence. Furthermore, each boosting round pursues discriminative hash functions for each table through a discrete optimization in the binary code space. Extensive experiments on two popular tasks, Euclidean and semantic nearest neighbor search, demonstrate that the proposed boosted complementary hash-tables method achieves strong table complementarity and significantly outperforms state-of-the-art methods.
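
As a rough illustration of the mechanism the abstract describes, here is a minimal Python sketch. Everything in it is an assumption for illustration: the function name, the 2-bit collision threshold, and the weighted-PCA hash functions that stand in for the paper's discrete code optimization; only the boosting-style reweighting over neighbor pairs follows the abstract.

```python
import numpy as np

def train_boosted_tables(X, pos_pairs, neg_pairs, n_tables=4, n_bits=16):
    """Boosting-style sketch of complementary table learning (hypothetical
    simplification): each round fits one hash table on reweighted pairs,
    then upweights the pairs the new table mispredicts.
    X: (n, d); pos_pairs/neg_pairs: lists of (i, j) index pairs."""
    pairs = pos_pairs + neg_pairs
    y = np.array([1.0] * len(pos_pairs) + [-1.0] * len(neg_pairs))
    w = np.ones(len(pairs)) / len(pairs)        # boosting weights over pairs
    tables = []
    for _ in range(n_tables):
        pt_w = np.full(len(X), 1e-12)           # per-point weight from hard pairs
        for (i, j), wij in zip(pairs, w):
            pt_w[i] += wij
            pt_w[j] += wij
        mu = np.average(X, axis=0, weights=pt_w)
        Xc = X - mu
        cov = (Xc * pt_w[:, None]).T @ Xc       # weighted covariance
        _, vecs = np.linalg.eigh(cov)
        P = vecs[:, -n_bits:]                   # top eigenvectors as projections
        tables.append((mu, P))
        codes = Xc @ P > 0
        # a table "predicts neighbor" when codes nearly collide (<= 2 bits apart)
        pred = np.array([1.0 if np.count_nonzero(codes[i] != codes[j]) <= 2
                         else -1.0 for i, j in pairs])
        err = np.clip(w[pred != y].sum(), 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)   # AdaBoost-style table weight
        w *= np.exp(-alpha * y * pred)          # emphasize mispredicted pairs
        w /= w.sum()
    return tables
```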


Citations
Journal Article

Shared Predictive Cross-Modal Deep Quantization

TL;DR: A shared predictive deep quantization (SPDQ) approach is proposed that explicitly formulates a shared subspace across different modalities and two private subspaces for individual modalities; representations in the shared and private subspaces are learned simultaneously by embedding them into a reproducing kernel Hilbert space.
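
A minimal sketch of the shared/private decomposition, under heavy assumptions: the linear projections, function name, and codebook quantization step are hypothetical stand-ins; SPDQ itself learns these mappings in a reproducing kernel Hilbert space within a deep network.

```python
import numpy as np

def spdq_style_encode(x, S, P, codebook):
    """Hypothetical linear sketch: S projects x into the subspace shared
    across modalities, P into this modality's private subspace; the shared
    part is quantized against a codebook so cross-modal search compares
    compact shared codes. codebook: (K, m) codewords."""
    shared, private = S @ x, P @ x
    k = int(np.argmin(((codebook - shared) ** 2).sum(axis=1)))  # nearest codeword
    return k, private
```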
Journal Article

Adversarial Examples for Hamming Space Search

TL;DR: This work proposes hash adversary generation (HAG), a novel method for crafting adversarial examples for Hamming space search, whose nearest neighbors returned by a targeted hashing model are semantically irrelevant to the original queries.
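
A hedged sketch of the attack idea on a toy linear hashing model (HAG itself targets deep models); the function name, step sizes, and tanh relaxation are all assumptions for illustration.

```python
import numpy as np

def hag_style_attack(x, W, eps=0.03, steps=10):
    """Sign-gradient sketch: perturb x within an eps-ball so its relaxed
    code tanh(W x') disagrees with the original code sign(W x), pushing
    the query's Hamming neighbors away from the true ones."""
    target = np.sign(W @ x)
    x_adv = x.copy()
    for _ in range(steps):
        z = W @ x_adv
        # gradient of sum(target * tanh(z)) w.r.t. x_adv
        g = W.T @ (target * (1 - np.tanh(z) ** 2))
        x_adv -= (eps / steps) * np.sign(g)        # descend: reduce agreement
        x_adv = np.clip(x_adv, x - eps, x + eps)   # stay in the eps-ball
    return x_adv
```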
Journal Article

Two-Stream Deep Hashing With Class-Specific Centers for Supervised Image Search

TL;DR: This work designs a neural network that leverages label information to output a unified binary representation for each class, together with an image network that learns hash codes from images and forces them to be close to the corresponding class-specific centers.
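
A minimal sketch of the center-pulling term, assuming a simplified squared-error form; the actual two-stream networks and loss are more involved.

```python
import numpy as np

def class_center_loss(codes, labels, centers):
    """Pull relaxed image codes toward the binary center of their class,
    so same-class images share nearby hash codes (hypothetical simplified
    loss). codes: (n, b) relaxed outputs; centers: (C, b) in {-1, +1}."""
    return ((codes - centers[labels]) ** 2).sum(axis=1).mean()
```

In practice the class centers would be chosen or learned to be mutually far apart in Hamming distance, so the pulled-together codes of different classes stay separable.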
Journal Article

Hash Bit Selection for Nearest Neighbor Search

TL;DR: This paper poses an optimal hash bit selection problem, in which an optimal subset of hash bits is selected from a pool of candidate bits generated by different features, algorithms, or parameters; bit reliability and complementarity are adopted as selection criteria that can be tailored to hashing performance in different tasks.
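
A greedy sketch of the selection idea; using bit balance as the reliability proxy and pairwise correlation as the redundancy penalty are assumptions, not the paper's exact criteria.

```python
import numpy as np

def select_bits(bits, k):
    """Greedily pick k bits scoring high on reliability (balance) and low
    on redundancy with already-selected bits.
    bits: (n, m) array in {0, 1}, columns = candidate hash bits."""
    b = bits.astype(float)
    n, m = b.shape
    balance = 1.0 - 2.0 * np.abs(b.mean(axis=0) - 0.5)  # 1 = perfectly balanced
    centered = b - b.mean(axis=0)
    norms = np.linalg.norm(centered, axis=0) + 1e-12
    chosen = []
    for _ in range(k):
        best, best_score = -1, -np.inf
        for j in range(m):
            if j in chosen:
                continue
            corr = max((abs(centered[:, j] @ centered[:, c]) / (norms[j] * norms[c])
                        for c in chosen), default=0.0)
            score = balance[j] - corr               # reliability minus redundancy
            if score > best_score:
                best, best_score = j, score
        chosen.append(best)
    return chosen
```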
Journal Article

Global and local semantics-preserving based deep hashing for cross-modal retrieval

TL;DR: A large margin is enforced between similar and dissimilar hash codes from an inter-modal view to learn discriminative hash codes, so that both the local and the global semantic structure are preserved in the hash codes.
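
A minimal sketch of an inter-modal large-margin term, with a hypothetical squared-distance form and margin value; the paper's full objective also carries the semantics-preserving terms.

```python
import numpy as np

def inter_modal_margin_loss(img_codes, txt_codes, sim, margin=2.0):
    """Similar cross-modal pairs should sit closer in code space than
    dissimilar pairs by at least `margin` (hypothetical form).
    sim[i, j] = 1 if image i and text j are semantically similar, else 0."""
    d = ((img_codes[:, None, :] - txt_codes[None, :, :]) ** 2).sum(-1)
    pos = d[sim == 1].mean()                            # pull similar pairs together
    neg = np.maximum(0.0, margin - d[sim == 0]).mean()  # push dissimilar apart
    return pos + neg
```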
References
Journal Article

Additive Logistic Regression: A Statistical View of Boosting

TL;DR: This work shows that the seemingly mysterious phenomenon of boosting can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood, and develops more direct approximations that exhibit nearly identical results to boosting.
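
The statistical view is easy to see in code: discrete AdaBoost with decision stumps is exactly forward stagewise additive modeling under exponential loss. A self-contained sketch (the brute-force stump search is for clarity, not efficiency):

```python
import numpy as np

def adaboost_stumps(X, y, rounds=50):
    """Discrete AdaBoost with decision stumps, i.e. forward stagewise
    additive modeling under exponential loss. X: (n, d), y in {-1, +1}.
    Returns an additive model as (feature, threshold, sign, alpha) terms."""
    n, d = X.shape
    w = np.ones(n) / n
    ensemble = []
    for _ in range(rounds):
        best = None
        for j in range(d):                          # exhaustive stump search
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] > t, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, s)
        err, j, t, s = best
        err = np.clip(err, 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)       # stagewise coefficient
        ensemble.append((j, t, s, alpha))
        pred = s * np.where(X[:, j] > t, 1, -1)
        w *= np.exp(-alpha * y * pred)              # reweight toward errors
        w /= w.sum()
    return ensemble
```

Prediction is the sign of the additive fit, sign(sum of alpha times each stump's output), which is what makes the "additive modeling" reading literal.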
Proceedings Article

Locality-sensitive hashing scheme based on p-stable distributions

TL;DR: A novel Locality-Sensitive Hashing scheme for the Approximate Nearest Neighbor Problem under the lp norm, based on p-stable distributions, that improves the running time of the earlier algorithm and yields the first known provably efficient approximate NN algorithm for the case p < 1.
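
The scheme's hash function has a well-known closed form for the Euclidean case (p = 2, where the Gaussian is 2-stable); a minimal sketch, with hypothetical class name and parameter defaults:

```python
import numpy as np

class PStableLSH:
    """E2LSH-style hash for Euclidean distance:
    h(v) = floor((a . v + b) / w), with a ~ N(0, I) and b ~ U[0, w)."""

    def __init__(self, dim, n_hashes=8, w=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.A = rng.normal(size=(n_hashes, dim))   # one row per hash function
        self.b = rng.uniform(0.0, w, size=n_hashes)
        self.w = w

    def hash(self, v):
        # nearby points land in the same integer bucket with high probability
        return tuple(np.floor((self.A @ v + self.b) / self.w).astype(int))
```

In practice one builds several such tables, each keyed by the tuple of bucket indices, and scans the query's buckets across tables for candidate neighbors.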
Proceedings Article

Spectral Hashing

TL;DR: The problem of finding the best code for a given dataset is closely related to the problem of graph partitioning and can be shown to be NP-hard; a spectral method is obtained whose solutions are simply a subset of thresholded eigenvectors of the graph Laplacian.
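
A minimal in-sample sketch of that spectral relaxation; the Gaussian affinity, sigma, and dense eigendecomposition are illustrative choices, and the paper's out-of-sample extension uses analytical eigenfunctions instead.

```python
import numpy as np

def spectral_codes(X, n_bits=8, sigma=1.0):
    """Binary codes as thresholded eigenvectors of the graph Laplacian
    (in-sample sketch only). X: (n, d)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
    W = np.exp(-sq / (2 * sigma ** 2))                    # Gaussian affinity
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W                        # unnormalized Laplacian
    _, vecs = np.linalg.eigh(L)
    # skip the trivial constant eigenvector; threshold the next n_bits at zero
    return vecs[:, 1:n_bits + 1] > 0
```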
Proceedings Article

Iterative quantization: A procrustean approach to learning binary codes

TL;DR: A simple and efficient alternating minimization scheme is proposed for finding a rotation of zero-centered data that minimizes the quantization error of mapping the data to the vertices of a zero-centered binary hypercube.
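
The alternation is compact enough to sketch directly; this assumes V is already zero-centered and PCA-reduced to n_bits dimensions, as in the paper's setup.

```python
import numpy as np

def itq(V, n_iter=50, seed=0):
    """Iterative Quantization sketch: alternate between binary codes
    B = sign(V R) and the rotation R solving an orthogonal Procrustes
    problem, minimizing ||B - V R||_F. V: (n, n_bits), zero-centered."""
    rng = np.random.default_rng(seed)
    R, _ = np.linalg.qr(rng.normal(size=(V.shape[1], V.shape[1])))  # random rotation
    for _ in range(n_iter):
        B = np.sign(V @ R)                     # fix R, update the codes
        U, _, Vt = np.linalg.svd(B.T @ V)      # fix B, update rotation (Procrustes)
        R = (U @ Vt).T
    return (V @ R > 0), R                      # final codes and learned rotation
```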
Proceedings Article

Hashing with Graphs

TL;DR: This paper proposes a novel graph-based hashing method that automatically discovers the neighborhood structure inherent in the data to learn appropriate compact codes, and describes a hierarchical threshold learning procedure in which each eigenfunction yields multiple bits, leading to higher search accuracy.
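
A hedged sketch of the hierarchical-threshold idea for one eigenvector; thresholding each side at its mean is a simplified stand-in for the paper's learned thresholds.

```python
import numpy as np

def two_bit_codes(eigvec):
    """Each eigenvector yields two bits: bit 1 thresholds at zero, bit 2
    re-thresholds each side at its own mean (hypothetical simplification)."""
    b1 = eigvec > 0
    t_pos = eigvec[b1].mean() if b1.any() else 0.0
    t_neg = eigvec[~b1].mean() if (~b1).any() else 0.0
    b2 = np.where(b1, eigvec > t_pos, eigvec > t_neg)
    return b1, b2
```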