
Quantization (image processing)

About: Quantization (image processing) is a research topic. Over its lifetime, 7,977 publications have been published within this topic, receiving 126,632 citations.


Papers
Patent
19 May 1992
TL;DR: A method of finding the most likely match for a target facial image within a database of stored facial images: each database image is scored by the closeness of a quantization of selected facial features to those of the target image, the database is ordered for sequential processing by descending score, and each image is then compared against the target until a decision rule is satisfied.
Abstract: A method of finding a most likely match for a target facial image within a data base of stored facial images comprising determining a score for each data base image as a function of closeness of a quantization of selected facial features between each data base image and the target image and ordering the data base for sequential processing according to the potential value score in descending order, sequentially processing each data base image starting from the highest potential value score by an image comparison process to establish a correlation score for each comparison, and applying one or more decision rules to each comparison to reach a decision.
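The rank-then-compare pipeline in the abstract above can be sketched as follows. This is a hedged illustration only: the feature quantization, the scoring function, the comparison routine, and the acceptance threshold are all assumptions for the sketch, not the patent's actual implementation.

```python
# Sketch of the ranking-then-matching pipeline: score every database
# image by quantized-feature closeness (cheap), sort descending, then
# run the expensive image comparison sequentially until a decision
# rule accepts a match.

def quantize(features, step=4):
    """Quantize a feature vector onto a coarse grid (illustrative)."""
    return tuple(round(f / step) for f in features)

def potential_score(db_features, target_features):
    """Closeness of quantized features: higher means more similar."""
    q_db, q_t = quantize(db_features), quantize(target_features)
    return -sum(abs(a - b) for a, b in zip(q_db, q_t))

def find_best_match(database, target_features, compare, threshold=0.9):
    """Rank the database by potential score (descending), then apply
    the expensive comparison in that order; accept on the first image
    whose correlation clears the threshold (a simple decision rule)."""
    ranked = sorted(database,
                    key=lambda img: potential_score(img["features"],
                                                    target_features),
                    reverse=True)
    for img in ranked:
        correlation = compare(img)      # expensive image comparison
        if correlation >= threshold:    # decision rule: accept
            return img, correlation
    return None, 0.0
```

Ordering by the cheap score first means the costly comparison usually terminates after examining only the top few candidates.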

157 citations

Proceedings ArticleDOI
14 Dec 2005
TL;DR: An image-based steganography method that combines Least Significant Bit (LSB), Discrete Cosine Transform (DCT), and compression techniques on raw images to enhance the security of the payload is presented.
Abstract: Steganography is an important area of research in recent years involving a number of applications. It is the science of embedding information (the payload, e.g., text, video, or an image) into a cover image without causing statistically significant modification to the cover image. Modern secure image steganography presents the challenging task of transferring the embedded information to the destination without being detected. In this paper we present an image-based steganography that combines Least Significant Bit (LSB), Discrete Cosine Transform (DCT), and compression techniques on raw images to enhance the security of the payload. Initially, the LSB algorithm is used to embed the payload bits into the cover image to derive the stego-image. The stego-image is then transformed from the spatial domain to the frequency domain using the DCT. Finally, quantization and run-length coding algorithms are used to compress the stego-image and enhance its security. It is observed that secure images with low MSE and BER are transferred without using any password, in comparison with earlier works.
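The first stage of the scheme above, LSB embedding, can be shown in a few lines. This is a minimal sketch on a flat list of 8-bit grayscale pixels; the paper's subsequent DCT and quantization/run-length compression stages are omitted, and the function names are the sketch's own.

```python
# Minimal LSB embed/extract sketch on a list of 8-bit grayscale pixels.
# Each payload bit replaces the least significant bit of one pixel, so
# the change to any pixel value is at most 1.

def embed_lsb(pixels, payload_bits):
    """Replace the LSB of each leading pixel with a payload bit."""
    if len(payload_bits) > len(pixels):
        raise ValueError("payload too large for cover image")
    stego = list(pixels)
    for i, bit in enumerate(payload_bits):
        stego[i] = (stego[i] & ~1) | bit   # clear LSB, set payload bit
    return stego

def extract_lsb(stego, n_bits):
    """Read back the first n_bits least significant bits."""
    return [p & 1 for p in stego[:n_bits]]
```

Because only the lowest bit plane changes, the stego-image is visually and statistically close to the cover image, which is the property the abstract relies on before the DCT and compression stages harden it further.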

155 citations

Journal ArticleDOI
TL;DR: Three fast search routines to be used in the encoding phase of vector quantization (VQ) image compression systems are presented and show that the proposed algorithms need only 3-20% of the number of mathematical operations required by a full search.
Abstract: Three fast search routines to be used in the encoding phase of vector quantization (VQ) image compression systems are presented. These routines, which are based on geometric considerations, provide the same results as an exhaustive (or full) search. Examples show that the proposed algorithms need only 3-20% of the number of mathematical operations required by a full search and fewer than 50% of the operations required by recently proposed alternatives.
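The idea of pruning codewords without changing the result of an exhaustive search can be illustrated with partial-distortion elimination (PDE), one well-known geometry-based speedup. The paper's three specific routines are not reproduced here; PDE is used only as a stand-in for the general technique.

```python
# VQ encoding sketch: full search vs. partial-distortion elimination
# (PDE). PDE abandons a codeword as soon as its accumulated (partial)
# distortion already exceeds the best full distance found so far, so
# it returns the same winner as the exhaustive search.

def full_search(vector, codebook):
    """Exhaustive nearest-codeword search (squared Euclidean distance)."""
    best_i, best_d = 0, float("inf")
    for i, cw in enumerate(codebook):
        d = sum((v - c) ** 2 for v, c in zip(vector, cw))
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d

def pde_search(vector, codebook):
    """Same result as full_search, but with early termination per
    codeword once the partial sum can no longer win."""
    best_i, best_d = 0, float("inf")
    for i, cw in enumerate(codebook):
        d = 0.0
        for v, c in zip(vector, cw):
            d += (v - c) ** 2
            if d >= best_d:        # early exit: cannot beat current best
                break
        else:
            best_i, best_d = i, d
    return best_i, best_d
```

The savings grow with vector dimension and codebook size, which is why such exact-pruning routines can reach the large operation-count reductions the abstract reports while remaining lossless relative to a full search.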

154 citations

Patent
Yung-Lyul Lee1, Hyun Wook Park1
09 Oct 1998
TL;DR: An image data post-processing method, and an apparatus therefor, for reducing the quantization effect induced when image data compressed on a block basis is decoded.
Abstract: An image data post-processing method for reducing the quantization effect induced when image data compressed on a block basis is decoded, and an apparatus therefor. The image data post-processing method includes the steps of: (a) detecting a semaphore representing whether or not post-processing is required, using the distribution of inverse quantization coefficients of the inverse-quantized image data and a motion vector representing the difference between the blocks of a previous video object plane (VOP) and the blocks of the current VOP; and (b) filtering the decoded image data corresponding to the semaphore by a predetermined method, if it is determined by checking the detected semaphore that post-processing is required. Therefore, the quantization effect can be reduced by using the semaphore and an adaptive filter, and the amount of computation for filtering is also reduced. Also, the complexity of the hardware is reduced by a parallel process without multiplication and division.
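The flag-then-filter flow above can be sketched as follows. The actual detection criteria (coefficient distribution, motion vectors) and filter design are the patent's; here they are reduced to illustrative stubs with made-up thresholds, and the multiplication-free filter is a generic shift-and-add smoother in the spirit of the last sentence of the abstract.

```python
# Hedged sketch of the semaphore-then-filter flow: a per-block flag
# ("semaphore") decides whether a smoothing filter runs at all, so
# most blocks skip the filtering cost entirely.

def semaphore(block_coeffs, motion_mag, coeff_thresh=2, motion_thresh=1.0):
    """Flag a block for post-processing when it is mostly flat (few
    nonzero inverse-quantization coefficients) yet moved between VOPs.
    Both thresholds are illustrative assumptions."""
    nonzero = sum(1 for c in block_coeffs if c != 0)
    return nonzero <= coeff_thresh and motion_mag >= motion_thresh

def filter_row(row):
    """3-tap [1 2 1]/4 smoothing using only shifts and adds, i.e. no
    multiplication or division, applied to interior samples."""
    out = list(row)
    for i in range(1, len(row) - 1):
        out[i] = (row[i - 1] + (row[i] << 1) + row[i + 1]) >> 2
    return out
```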

154 citations

Journal ArticleDOI
TL;DR: A hybrid three-dimensional wavelet transform is employed for spectral and spatial decorrelation in the framework of Part 2 of the JPEG 2000 standard; the resulting technique provides competitive performance with respect to state-of-the-art techniques.
Abstract: In this letter we propose a new technique for progressive coding of hyperspectral data. Specifically, we employ a hybrid three-dimensional wavelet transform for spectral and spatial decorrelation in the framework of Part 2 of the JPEG 2000 standard. Both onboard and on-the-ground compression are addressed. The resulting technique is compliant with the JPEG 2000 family of standards and provides competitive performance with respect to state-of-the-art techniques.
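The core idea of spectral decorrelation can be shown with a toy one-level Haar transform along the band axis of a tiny hyperspectral cube. This is only an illustration of why neighboring, highly correlated bands compress well after a spectral transform; JPEG 2000 Part 2 permits far more elaborate wavelet kernels than this.

```python
# Toy spectral decorrelation: one-level Haar transform along the band
# axis of a hyperspectral cube (a list of bands, each a 2-D list).
# Adjacent bands are replaced by their average (approximation) and
# half-difference (detail); correlated bands yield near-zero details.

def haar_spectral(cube):
    """Average/difference pairs of adjacent bands (even band count)."""
    rows, cols = len(cube[0]), len(cube[0][0])
    approx, detail = [], []
    for b in range(0, len(cube), 2):
        lo = [[(cube[b][r][c] + cube[b + 1][r][c]) / 2
               for c in range(cols)] for r in range(rows)]
        hi = [[(cube[b][r][c] - cube[b + 1][r][c]) / 2
               for c in range(cols)] for r in range(rows)]
        approx.append(lo)
        detail.append(hi)
    return approx, detail
```

After the spectral stage, a standard 2-D spatial wavelet transform and JPEG 2000 coding are applied to each resulting band, which is the "hybrid" three-dimensional structure the abstract describes.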

153 citations


Network Information
Related Topics (5)
- Feature extraction: 111.8K papers, 2.1M citations, 84% related
- Image segmentation: 79.6K papers, 1.8M citations, 84% related
- Feature (computer vision): 128.2K papers, 1.7M citations, 84% related
- Image processing: 229.9K papers, 3.5M citations, 83% related
- Robustness (computer science): 94.7K papers, 1.6M citations, 81% related
Performance Metrics

No. of papers in the topic in previous years:

Year  Papers
2022  8
2021  354
2020  283
2019  294
2018  259
2017  295