Journal ArticleDOI

Non-Expansive Hashing

Nathan Linial, +1 more
- 01 Jan 1998
- Vol. 18, Iss. 1, pp. 121–132
TLDR
A non-expansive hashing scheme wherein any set of size n from a large universe may be stored in a memory of size (any , and ), and where retrieval takes operations.
Abstract
In a non-expansive hashing scheme, similar inputs are stored in memory locations which are close. We develop a non-expansive hashing scheme wherein any set of size n from a large universe may be stored in a memory of size (any , and ), and where retrieval takes operations. We explain how to use non-expansive hashing schemes for efficient storage and retrieval of noisy data. A dynamic version of this hashing scheme is presented as well.
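The defining property above — hashing may not increase the distance between inputs — can be illustrated with a deliberately simple map. This is not the paper's construction, just a minimal sketch: projecting a binary string onto a fixed subset of coordinates can never increase Hamming distance, so nearby inputs hash to nearby locations.

```python
# Toy illustration of non-expansiveness (NOT the paper's construction):
# keeping a fixed subset of coordinates of a bit string can never
# increase Hamming distance, so similar inputs stay similar after hashing.

def hamming(x: str, y: str) -> int:
    """Hamming distance between two equal-length bit strings."""
    return sum(a != b for a, b in zip(x, y))

def projection_hash(x: str, coords=(0, 2, 5, 7)) -> str:
    """Hash a bit string by keeping only the chosen coordinates."""
    return "".join(x[i] for i in coords)

x = "10110100"
y = "10110110"  # differs from x in one position
hx, hy = projection_hash(x), projection_hash(y)
assert hamming(hx, hy) <= hamming(x, y)  # the non-expansive property
```

The coordinate set here is arbitrary; the paper's contribution is achieving non-expansiveness together with small memory and fast retrieval, which a naive projection does not.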


Citations
Proceedings ArticleDOI

Similarity estimation techniques from rounding algorithms

TL;DR: It is shown that rounding algorithms for LPs and SDPs used in the context of approximation algorithms can be viewed as locality sensitive hashing schemes for several interesting collections of objects.
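The best-known rounding scheme in this line of work is the random-hyperplane ("SimHash") construction for angular similarity: each hash bit records which side of a random hyperplane a vector falls on, and the fraction of agreeing bits estimates the angle between two vectors. A minimal sketch, with illustrative parameter choices:

```python
# Random-hyperplane LSH sketch: Pr[bits differ] = angle(u, v) / pi,
# so counting agreements over many random planes estimates the angle.
import math
import random

def simhash(v, planes):
    """One bit per random hyperplane: which side of the plane v lies on."""
    return [1 if sum(p_i * v_i for p_i, v_i in zip(p, v)) >= 0 else 0
            for p in planes]

random.seed(0)
dim, nbits = 8, 256
planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(nbits)]

u = [1.0] * dim
v = [1.0] * 7 + [-1.0]          # close to u (cosine 0.75)
agree = sum(a == b for a, b in zip(simhash(u, planes), simhash(v, planes)))
angle_est = math.pi * (1 - agree / nbits)
```

With 256 bits the estimate concentrates near the true angle (about 0.72 radians here); more planes tighten the estimate at the cost of longer signatures.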
Proceedings ArticleDOI

Efficient search for approximate nearest neighbor in high dimensional spaces

TL;DR: Significantly improving and extending recent results of Kleinberg, the authors construct data structures whose size is polynomial in the size of the database, with search algorithms that run in time nearly linear or nearly quadratic in the dimension.
Proceedings ArticleDOI

Uniform hashing in constant time and linear space

TL;DR: This paper presents an almost ideal solution to this problem: a hash function that, on any set of n inputs, behaves like a truly random function with high probability, can be evaluated in constant time on a RAM, and can be stored in O(n) words, which is optimal.
Journal ArticleDOI

Simulating Uniform Hashing in Constant Time and Optimal Space

Anna Östlin, +1 more
- 05 Jun 2002
TL;DR: In this paper it is shown how to implement hash functions that can be evaluated on a RAM in constant time and, with high probability, behave like truly random functions on any set of n inputs.
Proceedings Article

Fast local searches and updates in bounded universes.

TL;DR: In this paper, it was shown how to perform predecessor searches in O(log log Δ) expected time, where Δ is the difference between the element being searched for and its predecessor in the structure.
References
Journal ArticleDOI

Universal classes of hash functions

TL;DR: An input-independent, average linear-time algorithm for storage and retrieval of keys, obtained by choosing a hash function at random from a suitable class of hash functions.
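The classic member of such a universal class is h(x) = ((a·x + b) mod p) mod m with p prime and a, b drawn at random: for any fixed pair of distinct keys, the collision probability is at most about 1/m. A minimal sketch (the prime and table size are illustrative):

```python
# One hash function drawn at random from a Carter-Wegman universal class:
# h_{a,b}(x) = ((a*x + b) mod p) mod m, with p prime, a in [1, p), b in [0, p).
import random

P = 2_147_483_647  # a Mersenne prime larger than any key we hash

def make_universal_hash(m: int, rng: random.Random):
    """Draw one function at random from the universal class."""
    a = rng.randrange(1, P)
    b = rng.randrange(0, P)
    return lambda x: ((a * x + b) % P) % m

rng = random.Random(42)
h = make_universal_hash(16, rng)
assert 0 <= h(123456) < 16   # always lands in the table
```

The randomness lives in the choice of (a, b), not in the keys — which is exactly what makes the average-case guarantee input-independent.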
Book

Sparse Distributed Memory

TL;DR: Pentti Kanerva's Sparse Distributed Memory presents a mathematically elegant theory of human long term memory that resembles the cortex of the cerebellum, and provides an overall perspective on neural systems.
Journal ArticleDOI

Storing a Sparse Table with O(1) Worst Case Access Time

TL;DR: A data structure for representing a set of n items from a universe of m items, which uses space n+o(n) and accommodates membership queries in constant time and is easy to implement.
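The scheme behind this paper is the two-level (FKS) construction: a first-level universal hash splits the n keys into buckets, and each bucket of size b gets its own table of size b², retried with fresh randomness until collision-free. A compact sketch, with illustrative constants and hash functions of the form ((a·x) mod p) mod m:

```python
# Two-level FKS-style static dictionary: first level buckets the keys,
# second level gives each bucket a quadratic-size, collision-free table.
import random

P = 1_000_000_007  # prime larger than any key

def perfect_table(keys, size, rng):
    """Retry random multipliers until no two keys collide in the table."""
    while True:
        a = rng.randrange(1, P)
        table = [None] * size
        ok = True
        for k in keys:
            i = (a * k % P) % size
            if table[i] is not None:
                ok = False
                break
            table[i] = k
        if ok:
            return a, table

def build_fks(keys, rng):
    n = len(keys)
    a0 = rng.randrange(1, P)
    buckets = [[] for _ in range(n)]
    for k in keys:
        buckets[(a0 * k % P) % n].append(k)
    # Squaring each bucket makes a random function collision-free
    # with probability > 1/2 (birthday bound), so few retries are needed.
    second = [perfect_table(b, max(1, len(b) ** 2), rng) for b in buckets]
    return a0, second

def lookup(struct, k):
    a0, second = struct
    a, table = second[(a0 * k % P) % len(second)]
    return table[(a * k % P) % len(table)] == k

rng = random.Random(1)
S = [3, 17, 42, 99, 1000, 123456]
fks = build_fks(S, rng)
assert all(lookup(fks, k) for k in S)
```

Lookup probes exactly one slot, giving the O(1) worst-case access the title promises; the space analysis bounding total second-level size to O(n) needs the first-level function to be rechosen if the bucket sizes come out badly, which this sketch omits.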
Proceedings ArticleDOI

Dynamic perfect hashing: upper and lower bounds

TL;DR: In this article, a randomized algorithm for the dictionary problem is given, with O(1) worst-case time for lookup and O(1) amortized expected time for insertion and deletion.
Proceedings ArticleDOI

Multi-index hashing for information retrieval

TL;DR: A technique for building hash indices for a large dictionary of strings that permits robust retrieval of strings from the dictionary even when the query pattern has a significant number of errors.
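The core idea behind multi-index schemes of this kind is the pigeonhole principle: split every string into k+1 pieces and index each piece; a query with at most k errors must then agree with a stored string exactly on at least one piece, so the piece indices yield a small candidate set to verify. A hedged sketch, assuming fixed-length strings, substitution errors only, and k = 2:

```python
# Pigeonhole-based multi-index lookup: with K errors and K+1 pieces,
# some piece of a true match is error-free and hits the index exactly.
from collections import defaultdict

K = 2            # maximum number of substitution errors tolerated
PIECES = K + 1

def split(s: str):
    """Cut s into PIECES contiguous chunks (last chunk takes the slack)."""
    step = len(s) // PIECES
    return [s[i * step:(i + 1) * step] if i < PIECES - 1
            else s[(PIECES - 1) * step:]
            for i in range(PIECES)]

def build_index(words):
    index = defaultdict(set)
    for w in words:
        for pos, piece in enumerate(split(w)):
            index[(pos, piece)].add(w)
    return index

def search(index, query: str):
    """Return dictionary words within K substitutions of the query."""
    candidates = set()
    for pos, piece in enumerate(split(query)):
        candidates |= index.get((pos, piece), set())
    return {w for w in candidates
            if len(w) == len(query)
            and sum(a != b for a, b in zip(w, query)) <= K}

index = build_index(["hashing", "mashing", "hacking", "packing"])
assert search(index, "hashing") == {"hashing", "mashing", "hacking"}
```

Only words sharing at least one piece with the query are ever verified, which is what keeps retrieval robust to errors without scanning the whole dictionary.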