Topic
Locality-sensitive hashing
About: Locality-sensitive hashing is a research topic. Over its lifetime, 1894 publications have been published within this topic, receiving 69362 citations.
Papers published on a yearly basis
Papers
27 Feb 2009
TL;DR: Presents a method and apparatus for generating a hash value using any number of cryptographic hash functions; the hashing process receives an input value to be hashed.
Abstract: A method and apparatus for a system and process for generating a hash value using any number of cryptographic hash functions. The hashing process receives an input value to be hashed. The input value is cryptographically hashed and augmented. The augmented value is then cryptographically hashed. The process then iteratively applies a set of non-linear functions to these values, with each iteration maintaining a 'left half' and a 'right half.' After the last iteration, the left and right portions are concatenated to form the output hash value.
15 citations
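The left-half/right-half iteration described in the abstract can be sketched as follows. This is a hypothetical illustration only: SHA-256 stands in for both the cryptographic hash and the non-linear round functions, and the augmentation step is a made-up byte append, none of which is specified by the patent itself.

```python
import hashlib

def iterative_hash(data: bytes, rounds: int = 4) -> bytes:
    """Hypothetical sketch: hash the input, hash an augmented copy,
    then iterate a non-linear mix while maintaining separate
    'left' and 'right' halves, concatenating them at the end."""
    left = hashlib.sha256(data).digest()             # hash of the input
    right = hashlib.sha256(data + b"\x01").digest()  # hash of an augmented copy (assumed form)
    for i in range(rounds):
        # SHA-256 stands in for the patent's unspecified non-linear round function
        new_left = hashlib.sha256(right + bytes([i])).digest()
        new_right = bytes(l ^ r for l, r in zip(left, new_left))
        left, right = new_left, new_right
    return left + right  # concatenate the halves to form the output hash
```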
TL;DR: A comprehensive survey of image hashing is given, which presents an overview of various image hashing schemes and discusses their advantages and limitations in terms of security, robustness, and discrimination under different types of operations on the image.
Abstract: Traditional cryptographic hash functions are sensitive to even a one-bit difference in the input message. Multimedia data, however, routinely undergo compression and other signal processing operations, which makes cryptographic hashes unsuitable for multimedia authentication. Image hashing has emerged recently to capture the visual essentials of an image for robust authentication. In this paper, we give a comprehensive survey of image hashing. We present an overview of various image hashing schemes and discuss their advantages and limitations in terms of security, robustness, and discrimination under different types of operations on the image.
15 citations
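To make the contrast with cryptographic hashing concrete, here is a toy average-hash, a minimal perceptual hash in the spirit of the schemes the survey covers (this particular scheme is an illustration, not one drawn from the paper): small distortions such as compression noise move few pixels across the mean brightness, so few hash bits flip.

```python
def average_hash(pixels):
    """Toy perceptual image hash over an 8x8 grayscale grid.
    Each bit records whether a pixel is brighter than the mean,
    so mild distortions rarely flip many bits."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p >= mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))
```

Authentication then compares hashes by Hamming distance against a threshold, rather than requiring the exact equality a cryptographic hash would demand.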
01 Jul 2017
TL;DR: This paper proposes supervised cross-modal hashing without relaxation (SCMH-WR), which not only makes use of label information but also generates the final binary codes directly, i.e., without relaxing the binary constraints.
Abstract: Recently, hashing-based approximate nearest neighbor search has attracted much attention in large-scale data search tasks. Several cross-modal hashing methods have also been proposed to enable efficient search across different modalities. However, some problems remain. For example, some methods cannot make use of label information, which is helpful for generating hash codes; others first relax the binary constraints during optimization and then threshold the continuous outputs to binary, which can produce large quantization error. To address these problems, we propose supervised cross-modal hashing without relaxation (SCMH-WR). It not only makes use of label information but also generates the final binary codes directly, i.e., without relaxing the binary constraints. Specifically, it maps different modalities into a common low-dimensional subspace while preserving label similarity; at the same time, it learns a rotation matrix that minimizes the quantization error and yields the final binary codes. In addition, an iterative algorithm is proposed to tackle the optimization problem. SCMH-WR is tested on three benchmark data sets. Experimental results demonstrate that SCMH-WR outperforms state-of-the-art hashing methods on cross-modal search tasks.
15 citations
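The rotation step can be sketched in the style of iterative quantization (ITQ). Whether SCMH-WR's update takes exactly this form is an assumption, but the idea of alternating binarization with an orthogonal Procrustes solve to shrink the quantization error ||B - VR||_F is the same.

```python
import numpy as np

def learn_rotation(V, iters=20, seed=0):
    """ITQ-style sketch (assumed analogue of SCMH-WR's rotation step):
    alternately binarize the projected data and solve the orthogonal
    Procrustes problem to reduce ||B - V R||_F over orthogonal R."""
    rng = np.random.default_rng(seed)
    d = V.shape[1]
    R, _ = np.linalg.qr(rng.standard_normal((d, d)))  # random orthogonal init
    for _ in range(iters):
        B = np.sign(V @ R)                  # fix R: binarize the projections
        B[B == 0] = 1.0
        U, _, Wt = np.linalg.svd(V.T @ B)   # fix B: Procrustes solution R = U W^T
        R = U @ Wt
    return np.sign(V @ R), R
```

The Procrustes step is what keeps R orthogonal, so the rotation changes only how the subspace is aligned to the binary hypercube, not the pairwise geometry of the embedded points.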
25 Feb 2012
TL;DR: LHlf is a new hash table designed to allow very high levels of concurrency; it adopts recursive split-ordering of the items within a bucket so that lists can be split and merged in a lock-free manner.
Abstract: LHlf is a new hash table designed to allow very high levels of concurrency. The table is lock-free and grows and shrinks automatically according to the number of items it holds. Insertions, lookups, and deletions are never blocked. LHlf is based on linear hashing but adopts recursive split-ordering of the items within a bucket in order to split and merge lists in a lock-free manner. LHlf is as fast as the best previous lock-free design; in addition, it offers stable performance, uses less space, and supports both expansions and contractions.
15 citations
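The key trick behind split-ordering, which LHlf borrows, is to keep list nodes sorted by the bit-reversed key: all keys belonging to one bucket at any table size then form a contiguous run, so doubling the bucket count splits a list by inserting a new head pointer instead of moving nodes. A minimal sketch of the ordering (not LHlf's actual lock-free list code):

```python
def reverse_bits(key: int, width: int = 8) -> int:
    """Bit-reverse `key` within `width` bits; split-ordered lists keep
    nodes sorted by this value."""
    out = 0
    for _ in range(width):
        out = (out << 1) | (key & 1)
        key >>= 1
    return out

def bucket(key: int, level: int) -> int:
    """Bucket index when the table has 2**level buckets; linear hashing
    increments `level` as the table grows."""
    return key % (1 << level)
```

Because bucket membership is determined by the low `level` bits of the key, and those become the high bits of the reversed key, each bucket's keys stay contiguous in split order at every level.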
TL;DR: Building on a result of Andoni, Indyk, Laarhoven, Razenshteyn, and Schmidt (2015), this paper proposes a variant of cross-polytope locality-sensitive hashing for angular distance that is provably optimal in asymptotic sensitivity.
Abstract: We provide a variant of cross-polytope locality-sensitive hashing with respect to angular distance which is provably optimal in asymptotic sensitivity and enjoys $\mathcal{O}(d \ln d)$ hash computation time. Building on a recent result (by Andoni, Indyk, Laarhoven, Razenshteyn, Schmidt, 2015), we show that optimal asymptotic sensitivity for cross-polytope LSH is retained even when the dense Gaussian matrix is replaced by a fast Johnson-Lindenstrauss transform followed by a discrete pseudo-rotation, reducing the hash computation time from $\mathcal{O}(d^2)$ to $\mathcal{O}(d \ln d)$. Moreover, our scheme achieves the optimal rate of convergence for sensitivity. By incorporating a low-randomness Johnson-Lindenstrauss transform, our scheme can be modified to require only $\mathcal{O}(\ln^9(d))$ random bits.
15 citations
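The baseline cross-polytope hash that the paper accelerates fits in a few lines: rotate the point by a random matrix and return the nearest cross-polytope vertex {±e_i}. This sketch uses a dense Gaussian matrix, which is exactly the $\mathcal{O}(d^2)$ step the paper replaces with a fast Johnson-Lindenstrauss transform and a discrete pseudo-rotation.

```python
import numpy as np

def cross_polytope_hash(x, G):
    """Dense-Gaussian cross-polytope LSH: the hash is the index and
    sign of the largest-magnitude coordinate of G @ x, i.e. the
    cross-polytope vertex closest in angle to the rotated point."""
    y = G @ x
    i = int(np.argmax(np.abs(y)))
    return i, 1 if y[i] >= 0 else -1
```

Points at a small angular distance tend to land on the same vertex, which is what makes the family locality-sensitive for angular distance; the fast-transform variant changes only how the rotation is computed, not this decoding step.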