Open Access · Posted Content
Circulant Binary Embedding
TLDR
In this article, the authors propose Circulant Binary Embedding (CBE), which generates binary codes by projecting the data with a circulant matrix; the circulant structure enables the use of the Fast Fourier Transform to speed up the computation.
Abstract:
Binary embedding of high-dimensional data requires long codes to preserve the discriminative power of the input space. Traditional binary coding methods often suffer from very high computation and storage costs in such a scenario. To address this problem, we propose Circulant Binary Embedding (CBE), which generates binary codes by projecting the data with a circulant matrix. The circulant structure enables the use of the Fast Fourier Transform to speed up the computation. Compared to methods that use unstructured matrices, the proposed method improves the time complexity from $\mathcal{O}(d^2)$ to $\mathcal{O}(d\log{d})$, and the space complexity from $\mathcal{O}(d^2)$ to $\mathcal{O}(d)$, where $d$ is the input dimensionality. We also propose a novel time-frequency alternating optimization to learn data-dependent circulant projections, which alternately minimizes the objective in the original and Fourier domains. We show by extensive experiments that the proposed approach gives much better performance than the state-of-the-art approaches for a fixed time budget, and provides much faster computation with no performance degradation for a fixed number of bits.
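The core trick in the abstract, multiplying by a circulant matrix via the FFT, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function name and the randomized sign-flip vector are assumptions for illustration, following the randomized variant described in the paper.

```python
import numpy as np

def circulant_binary_embedding(x, r, sign_flips):
    """Binary code from a circulant projection, computed with the FFT.

    Multiplying by the circulant matrix C(r) (first column r) is a
    circular convolution, so C(r) x = ifft(fft(r) * fft(x)), reducing
    the projection cost from O(d^2) to O(d log d). sign_flips is an
    assumed random +/-1 vector applied to the input first.
    """
    proj = np.fft.ifft(np.fft.fft(r) * np.fft.fft(sign_flips * x)).real
    return (proj >= 0).astype(np.int8)  # binarize by sign
```

The FFT route can be checked against an explicit $\mathcal{O}(d^2)$ multiplication by the circulant matrix $C_{ij} = r_{(i-j) \bmod d}$; both give the same projection and hence the same bits.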
Citations
Journal Article
Recent advances in convolutional neural networks
Jiuxiang Gu,Zhenhua Wang,Jason Kuen,Lianyang Ma,Amir Shahroudy,Bing Shuai,Ting Liu,Xingxing Wang,Gang Wang,Jianfei Cai,Tsuhan Chen +10 more
TL;DR: A broad survey of the recent advances in convolutional neural networks can be found in this article, where the authors discuss the improvements of CNN on different aspects, namely, layer design, activation function, loss function, regularization, optimization and fast computation.
Posted Content
Recent Advances in Convolutional Neural Networks
Jiuxiang Gu,Zhenhua Wang,Jason Kuen,Lianyang Ma,Amir Shahroudy,Bing Shuai,Ting Liu,Xingxing Wang,Li Wang,Gang Wang,Jianfei Cai,Tsuhan Chen +11 more
TL;DR: This paper details the improvements of CNN on different aspects, including layer design, activation function, loss function, regularization, optimization and fast computation, and introduces various applications of convolutional neural networks in computer vision, speech and natural language processing.
Journal Article
A Survey on Learning to Hash
TL;DR: In this paper, a comprehensive survey of learning-to-hash algorithms is presented, categorizing them by how they preserve similarity: pairwise similarity preserving, multiwise similarity preserving, implicit similarity preserving, and quantization, and discussing their relations.
Proceedings Article
Deep Hashing Network for Efficient Similarity Retrieval
TL;DR: A novel Deep Hashing Network (DHN) architecture for supervised hashing is proposed, in which a good image representation tailored to hash coding and formal control of the quantization error are jointly learned.
Posted Content
Hashing for Similarity Search: A Survey
TL;DR: This paper presents a survey of hashing, one of the main solutions to approximate nearest neighbor search, which has been widely studied since the pioneering work on locality-sensitive hashing, and divides the hashing algorithms into two main categories.
References
Proceedings Article
ImageNet: A large-scale hierarchical image database
TL;DR: A new database called “ImageNet” is introduced, a large-scale ontology of images built upon the backbone of the WordNet structure, much larger in scale and diversity and much more accurate than the current image datasets.
Proceedings Article
Spectral Hashing
TL;DR: The problem of finding a best code for a given dataset is closely related to the problem of graph partitioning and can be shown to be NP hard and a spectral method is obtained whose solutions are simply a subset of thresholded eigenvectors of the graph Laplacian.
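The relaxed solution mentioned in the summary, thresholding eigenvectors of the graph Laplacian, can be sketched in a few lines of NumPy. This toy version is an assumption-laden illustration, not the paper's method: the Gaussian affinity, the bandwidth parameter, and the function name are all illustrative, and the full algorithm also handles out-of-sample codes, which this omits.

```python
import numpy as np

def spectral_codes(X, n_bits, sigma=1.0):
    """Toy relaxation of Spectral Hashing: threshold graph-Laplacian
    eigenvectors at zero to obtain binary codes for the n points in X.
    X: (n, d) data matrix; sigma: assumed Gaussian affinity bandwidth.
    """
    # dense Gaussian affinity graph over the data points
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2 * sigma ** 2))
    L = np.diag(W.sum(1)) - W               # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
    # skip the trivial constant eigenvector (eigenvalue ~ 0)
    Y = vecs[:, 1:1 + n_bits]
    return (Y >= 0).astype(np.int8)
```

On two well-separated clusters, the first code bit comes from the Fiedler vector, so points in the same cluster share a code and the two clusters get opposite codes.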
Journal Article
Product Quantization for Nearest Neighbor Search
TL;DR: This paper introduces a product-quantization-based approach for approximate nearest neighbor search that decomposes the space into a Cartesian product of low-dimensional subspaces and quantizes each subspace separately.
Proceedings Article
Similarity estimation techniques from rounding algorithms
TL;DR: It is shown that rounding algorithms for LPs and SDPs used in the context of approximation algorithms can be viewed as locality sensitive hashing schemes for several interesting collections of objects.
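The best-known scheme from this line of work, random-hyperplane hashing for angular similarity, fits in one line of NumPy; this sketch is illustrative (the function name is an assumption), showing only the hashing step, not the rounding-algorithm analysis.

```python
import numpy as np

def simhash(x, hyperplanes):
    """Random-hyperplane LSH: one bit per hyperplane, set by which side
    of the hyperplane x falls on. Two inputs disagree on a bit with
    probability theta / pi, where theta is the angle between them, so
    Hamming distance between codes estimates angular distance.
    """
    return (hyperplanes @ x >= 0).astype(np.int8)
```

Because only the sign of each projection matters, the code depends on the direction of x, not its magnitude.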
Book
Toeplitz and circulant matrices
TL;DR: The fundamental theorems on the asymptotic behavior of eigenvalues, inverses, and products of banded Toeplitz matrices and Toeplitz matrices with absolutely summable elements are derived in a tutorial manner, in the hope of making these results available to engineers lacking either the background or endurance to attack the mathematical literature on the subject.