scispace - formally typeset
Topic

Feature hashing

About: Feature hashing (also known as the "hashing trick") is a fast, memory-efficient technique that vectorizes features by using a hash function to map them directly to indices in a fixed-length vector, avoiding an explicit feature dictionary. Over the lifetime, 993 publications have been published within this topic receiving 51462 citations.
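A minimal sketch of the idea, assuming a bag-of-tokens input: each token is hashed to an index in a fixed-length vector, and a second hash chooses a +/-1 sign so that collisions cancel in expectation. The function names (`_h`, `hash_features`) are illustrative; production libraries typically use a fast non-cryptographic hash such as MurmurHash.

```python
import hashlib

def _h(s: str) -> int:
    # Deterministic string hash; Python's built-in hash() is salted per process,
    # so it is unsuitable for reproducible feature indices.
    return int(hashlib.md5(s.encode()).hexdigest(), 16)

def hash_features(tokens, dim=16):
    """Hashing trick: each token contributes +/-1 at index _h(token) % dim.
    The separate sign hash makes colliding tokens cancel in expectation."""
    vec = [0.0] * dim
    for tok in tokens:
        idx = _h(tok) % dim
        sign = 1.0 if _h("sign:" + tok) % 2 == 0 else -1.0
        vec[idx] += sign
    return vec
```

Because the vector length is fixed up front, new tokens never grow the representation; they only add (or subtract) mass at a hashed index.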


Papers
Patent
26 Feb 2013
TL;DR: In this article, a device is configured to determine a contour vector that delineates an object in an image from the remaining portion of the image, and to generate a hash value by applying a hash function to the contour vector.
Abstract: A device is configured to determine a contour vector that delineates an object in an image from a remaining portion of the image, and to generate a hash value by applying a hash function to the contour vector. The device is configured to compare the hash value to a previous hash value generated by applying the hash function to a previous contour vector, where the previous contour vector is determined prior to determining the contour vector. The device is configured to determine that the hash value matches the previous hash value and, based on determining that the hash value matches the previous hash value, segment the image using the contour vector.
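The comparison step above can be sketched as follows. This is a hedged illustration of the hash-and-compare pattern, not the patent's implementation: the quantization of coordinates before hashing and the cache-reuse policy are assumptions added so the example is self-contained.

```python
import hashlib

def contour_hash(contour):
    """Hash a contour vector (a list of (x, y) points) to a short digest.
    Coordinates are rounded first (an assumed quantization step) so that
    sub-pixel jitter does not change the hash."""
    data = ",".join(f"{round(x)}:{round(y)}" for x, y in contour)
    return hashlib.sha256(data.encode()).hexdigest()

def segment_if_unchanged(contour, prev_hash, cached_segmentation, recompute):
    """If the current contour's hash matches the previous hash, reuse the
    cached segmentation; otherwise recompute it from the contour."""
    h = contour_hash(contour)
    if h == prev_hash and cached_segmentation is not None:
        return cached_segmentation, h
    return recompute(contour), h
```

Comparing short digests instead of full contour vectors makes the unchanged-object check cheap per frame.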
Proceedings ArticleDOI
01 Oct 2012
TL;DR: A key-dependent robust speech hash based on a speech production model is proposed in this article, built on line spectral frequencies; it performs well in identification and verification tests and is robust to a wide range of attacks.
Abstract: Perceptual hash functions provide a tool for fast and reliable content identification and authentication. Robust hashing for multimedia authentication is an emerging research area. This article proposes a key-dependent robust speech hash based on a speech production model. The robust hash is computed from line spectral frequencies (LSFs), which model the vocal tract. The correlation between LSFs is decoupled by a discrete wavelet transform (DWT). A randomization structure controlled by a secret key performs random feature selection during hash generation, making the hash function key-dependent and collision resistant. At the same time, the hash is highly robust to content-preserving operations while offering accurate tamper localization. The scheme is found to perform very well in identification and verification tests and to be very robust to a wide range of attacks. Furthermore, the article addresses the security of hashes through the proposed keying technique, yielding a key-dependent hash function.
Posted Content
TL;DR: In this paper, two new LSH families for the angular distance based on feature hashing are proposed. These families show a considerable performance improvement over other LSH families and also work well in practice for the Euclidean distance.
Abstract: In this paper we propose the creation of generic LSH families for the angular distance based on Johnson-Lindenstrauss projections. We show that feature hashing is a valid J-L projection and propose two new LSH families based on feature hashing. These new LSH families are tested on both synthetic and real datasets with very good results and a considerable performance improvement over other LSH families. While the theoretical analysis is done for the angular distance, these families can also be used in practice for the Euclidean distance with excellent results [2]. Our tests using real datasets show that the proposed LSH functions work well for the Euclidean distance.
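To make the connection between hashing and angular LSH concrete, here is a SimHash-style sketch in which feature hashing supplies the +/-1 projection entries, playing the role of the J-L projection described above. This is an illustrative construction under assumed details, not the paper's exact families.

```python
import hashlib

def _h(s: str) -> int:
    # Deterministic string hash (built-in hash() is salted per process).
    return int(hashlib.md5(s.encode()).hexdigest(), 16)

def angular_lsh_signature(tokens, n_bits=16):
    """SimHash-style signature: bit i is the sign of a hashed +/-1
    projection of the token multiset onto direction i."""
    acc = [0] * n_bits
    for tok in tokens:
        for i in range(n_bits):
            acc[i] += 1 if _h(f"{i}:{tok}") % 2 == 0 else -1
    return tuple(1 if a >= 0 else 0 for a in acc)

def hamming(sig_a, sig_b):
    """Hamming distance between signatures; for sign-projection LSH this
    grows with the angular distance between the underlying vectors."""
    return sum(a != b for a, b in zip(sig_a, sig_b))
```

Near-duplicate inputs tend to produce signatures with small Hamming distance, which is what makes such families usable for approximate nearest-neighbor search.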
Journal ArticleDOI
TL;DR: AMSH, as described in this paper, uses adaptive margin matrices to enlarge the gap between positive and negative bits, which improves the discrimination and robustness of hash functions without requiring strict one-to-one data correspondence.
Abstract: In recent years, Cross-Modal Hashing (CMH) has attracted much attention due to its fast query speed and efficient storage. Previous studies have achieved promising results for Cross-Modal Retrieval (CMR) by discovering discriminative hash codes and modality-specific hash functions. Nonetheless, most existing CMR works are subjected to some restrictions: 1) It is assumed that data of different modalities are fully paired, which is impractical in real applications due to sample missing and false data alignment, and 2) binary regression targets including the label matrix and binary codes are too rigid to effectively learn semantic-preserving hash codes and hash functions. To address these problems, this paper proposes an Adaptive Marginalized Semantic Hashing (AMSH) method which not only enhances the discrimination of latent representations and hash codes by adaptive margins, but can also be used for both paired and unpaired CMR. As a two-step method, in the first step, AMSH generates semantic-aware modality-specific latent representations with adaptively marginalized labels, thereby enlarging the distances between different classes, and exploiting the labels to preserve the inter-modal and intra-modal semantic similarities into latent representations and hash codes. In the second step, adaptive margin matrices are embedded into the hash codes, and enlarge the gaps between positive and negative bits, which improves the discrimination and robustness of hash functions. On this basis, AMSH generates similarity-preserving hash codes and robust hash functions without the strict one-to-one data correspondence requirement. Experiments are conducted on several benchmark datasets to demonstrate the superiority and flexibility of AMSH over some state-of-the-art CMR methods. The source code is available at https://github.com/LKYLKYZ/AMSH .

Network Information
Related Topics (5)
Feature extraction: 111.8K papers, 2.1M citations (84% related)
Convolutional neural network: 74.7K papers, 2M citations (84% related)
Feature (computer vision): 128.2K papers, 1.7M citations (84% related)
Deep learning: 79.8K papers, 2.1M citations (83% related)
Support vector machine: 73.6K papers, 1.7M citations (83% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    33
2022    89
2021    11
2020    16
2019    16
2018    38