Open Access Journal ArticleDOI

Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance

TLDR
A statistical view of the texture retrieval problem is presented by combining the two related tasks, namely feature extraction (FE) and similarity measurement (SM), into a joint modeling and classification scheme. This scheme leads to a new wavelet-based texture retrieval method based on accurate modeling of the marginal distribution of wavelet coefficients with a generalized Gaussian density (GGD).
Abstract
We present a statistical view of the texture retrieval problem by combining the two related tasks, namely feature extraction (FE) and similarity measurement (SM), into a joint modeling and classification scheme. We show that using a consistent estimator of texture model parameters for the FE step, followed by computing the Kullback-Leibler distance (KLD) between estimated models for the SM step, is asymptotically optimal in terms of retrieval error probability. The statistical scheme leads to a new wavelet-based texture retrieval method that relies on the accurate modeling of the marginal distribution of wavelet coefficients using a generalized Gaussian density (GGD) and on the existence of a closed form for the KLD between GGDs. The proposed method provides greater accuracy and flexibility in capturing texture information, while its simplified form closely resembles existing methods that use energy distributions in the frequency domain to identify textures. Experimental results on a database of 640 texture images indicate that the new method significantly improves retrieval rates, e.g., from 65% to 77%, compared with traditional approaches, while retaining comparable computational complexity.
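
To make the feature extraction and similarity measurement steps concrete, the sketch below outlines the approach in Python. It is an illustrative reimplementation, not the authors' code: parameter estimation here uses simple moment matching rather than the maximum-likelihood fit discussed in the paper, PyWavelets (pywt) is an assumed library choice, and the wavelet ("db4") and decomposition depth (3 levels) are placeholder settings.

```python
import numpy as np
import pywt  # assumed library choice for the wavelet transform
from scipy.optimize import brentq
from scipy.special import gammaln


def fit_ggd(coeffs):
    """Estimate GGD parameters (alpha, beta) for one subband by moment matching."""
    x = np.asarray(coeffs, dtype=float).ravel()
    m1, m2 = np.mean(np.abs(x)), np.mean(x ** 2)
    ratio = m1 ** 2 / m2  # equals Gamma(2/b)^2 / (Gamma(1/b) * Gamma(3/b)) for a GGD

    def f(b):
        return np.exp(2 * gammaln(2 / b) - gammaln(1 / b) - gammaln(3 / b)) - ratio

    beta = brentq(f, 0.05, 10.0)  # assumes the sample ratio falls inside this bracket
    alpha = np.sqrt(m2 * np.exp(gammaln(1 / beta) - gammaln(3 / beta)))
    return alpha, beta


def kld_ggd(a1, b1, a2, b2):
    """Closed-form Kullback-Leibler distance between two GGDs."""
    return (np.log((b1 * a2) / (b2 * a1)) + gammaln(1 / b2) - gammaln(1 / b1)
            + (a1 / a2) ** b2 * np.exp(gammaln((b2 + 1) / b1) - gammaln(1 / b1))
            - 1 / b1)


def texture_signature(image, wavelet="db4", levels=3):
    """Fit a GGD to every detail subband of a wavelet decomposition of the image."""
    coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet, level=levels)
    return [fit_ggd(band) for detail in coeffs[1:] for band in detail]


def texture_distance(sig_query, sig_candidate):
    """Overall distance: sum of subband KLDs (subbands treated as independent)."""
    return sum(kld_ggd(a1, b1, a2, b2)
               for (a1, b1), (a2, b2) in zip(sig_query, sig_candidate))
```

In a retrieval setting, texture_signature would be precomputed for every database image, and a query would be answered by ranking candidates by increasing texture_distance.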


Citations
Journal ArticleDOI

Image retrieval: Ideas, influences, and trends of the new age

TL;DR: Almost 300 key theoretical and empirical contributions of the current decade related to image retrieval and automatic image annotation are surveyed, the spawning of related subfields is discussed, and the adaptation of existing image retrieval techniques to build systems that can be useful in the real world is examined.
Book

Modern image quality assessment

TL;DR: This book covers objective image quality assessment, providing computational models that can automatically predict perceptual image quality and suggesting new directions for future research by introducing recent models and paradigms that differ significantly from those used in the past.
Journal ArticleDOI

Local Tetra Patterns: A New Feature Descriptor for Content-Based Image Retrieval

TL;DR: A novel image indexing and retrieval algorithm for content-based image retrieval (CBIR) is proposed using local tetra patterns (LTrPs), which encode the relationship between a reference pixel and its neighbors based on directions calculated from first-order derivatives in the vertical and horizontal directions.
Journal ArticleDOI

Directional multiscale modeling of images using the contourlet transform

TL;DR: This study reveals the highly non-Gaussian marginal statistics and the strong interlocation, interscale, and interdirection dependencies of contourlet coefficients, and finds that, conditioned on the magnitudes of their generalized neighborhood coefficients, contourlet coefficients can be approximately modeled as Gaussian random variables.
Proceedings ArticleDOI

Content-based image retrieval: approaches and trends of the new age

TL;DR: Some of the key contributions of the current decade related to image retrieval and automated image annotation, spanning 120 references, are discussed, concluding with a study of trends in the volume and impact of publications in the field with respect to venues/journals and sub-topics.
References
Book

Elements of information theory

TL;DR: The authors examine the role of entropy, inequalities, and randomness in the design and construction of codes in a rapidly changing environment.
Journal ArticleDOI

A theory for multiresolution signal decomposition: the wavelet representation

TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions.
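
As a quick numerical illustration of this statement, the sketch below (PyWavelets and the Haar wavelet are assumed choices, and the random image is a stand-in for a texture patch) checks that the detail coefficients at one scale, together with the coarser approximation, reconstruct the finer approximation exactly, i.e., they carry precisely the difference of information between the two resolutions.

```python
import numpy as np
import pywt  # PyWavelets and the Haar wavelet are assumed choices for this illustration

rng = np.random.default_rng(0)
image = rng.standard_normal((64, 64))        # random stand-in for a texture patch

# One decomposition step: coarse approximation plus three detail subbands.
cA, (cH, cV, cD) = pywt.dwt2(image, "haar")

# The inverse transform recovers the finer-resolution signal exactly, so the
# detail coefficients contain all information lost in the coarser approximation.
reconstructed = pywt.idwt2((cA, (cH, cV, cD)), "haar")
print(np.allclose(reconstructed, image))     # True
```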
Book

Ten lectures on wavelets

TL;DR: This book presents ten lectures on wavelets, covering the continuous and discrete wavelet transforms, time-frequency localization, multiresolution analysis, and the construction of orthonormal bases of compactly supported wavelets.
Journal ArticleDOI

Fundamentals of statistical signal processing: estimation theory

TL;DR: Fundamentals of Statistical Signal Processing: Estimation Theory is a seminal work in the field of statistical signal processing, covering topics such as minimum variance unbiased estimation, the Cramér-Rao lower bound, and maximum likelihood estimation, and it has been used extensively in many applications.
Journal ArticleDOI

Ten Lectures on Wavelets

TL;DR: In this article, the regularity of compactly supported wavelets and the symmetry of wavelet bases are discussed, with a focus on orthonormal wavelet bases rather than the continuous wavelet transform.