Author

Gajanan K. Kharate

Bio: Gajanan K. Kharate is an academic researcher. The author has contributed to research in the topics of second-generation wavelet transform and wavelet packet decomposition. The author has an h-index of 2 and has co-authored 3 publications receiving 77 citations.

Papers
Posted Content
TL;DR: It is proposed that proper selection of the mother wavelet, based on the nature of the image, remarkably improves both quality and compression ratio, and the suggested enhanced run-length encoding technique provides better results than RLE.
Abstract: In image compression, the researchers' aim is to reduce the number of bits required to represent an image by removing spatial and spectral redundancies. Recently, the discrete wavelet transform and wavelet packets have emerged as popular techniques for image compression. The wavelet transform is one of the major processing components of image compression, and the result of the compression changes with the basis and tap of the wavelet used. It is proposed that proper selection of the mother wavelet, based on the nature of the image, remarkably improves both quality and compression ratio. We suggest a novel technique based on the wavelet packet best tree, selected by threshold entropy, with enhanced run-length encoding. This method reduces the time complexity of wavelet packet decomposition, since the complete tree is not decomposed. Our algorithm selects the sub-bands that include significant information, based on threshold entropy. The suggested enhanced run-length encoding technique provides better results than RLE, and the result, when compared with JPEG 2000, proves to be better.

46 citations
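
The sub-band selection idea in the entry above can be sketched in a few lines of Python. This is a minimal illustration, assuming PyWavelets (pywt) and a MATLAB-style "threshold" entropy (the count of coefficients whose magnitude exceeds a threshold); the function names, the threshold value and the decompose-then-prune shortcut are illustrative simplifications, not the paper's best-tree construction or its enhanced run-length encoder.

```python
import numpy as np
import pywt

def threshold_entropy(coeffs, thr):
    """'Threshold' entropy: number of coefficients with magnitude above thr."""
    return int(np.count_nonzero(np.abs(coeffs) > thr))

def select_significant_subbands(image, wavelet="db2", maxlevel=2, thr=10.0):
    """Decompose with a 2-D wavelet packet and keep only the sub-bands whose
    threshold entropy is non-zero, i.e. that carry significant information.
    (The paper avoids building the full tree; this sketch builds and prunes it.)"""
    wp = pywt.WaveletPacket2D(data=image, wavelet=wavelet, maxlevel=maxlevel)
    kept = {}
    for node in wp.get_level(maxlevel):            # leaves of the full tree
        if threshold_entropy(node.data, thr) > 0:  # drop insignificant sub-bands
            kept[node.path] = node.data
    return kept

# usage on a random array standing in for a grayscale image
img = np.random.rand(64, 64) * 255
subbands = select_significant_subbands(img)
print("kept", len(subbands), "of 16 level-2 sub-bands")
```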

Journal ArticleDOI
TL;DR: The compression performance of Daubechies, Biorthogonal, Coiflets and other wavelets is compared, along with results for images of different frequency content, and it is proposed that proper selection of the mother wavelet, based on the nature of the image, remarkably improves both quality and compression ratio.
Abstract: Recently, the discrete wavelet transform and wavelet packets have emerged as popular techniques for image compression. The wavelet transform is one of the major processing components of image compression, and the results of the compression change with the basis and tap of the wavelet used. This paper compares the compression performance of Daubechies, Biorthogonal, Coiflets and other wavelets, along with results for images of different frequency content. Based on the results, it is proposed that proper selection of the mother wavelet, based on the nature of the image, remarkably improves both quality and compression ratio. The prime objective is to select the proper mother wavelet during the transform phase to compress the color image. This paper includes a discussion of the principles of image compression, the image compression methodology, the basics of wavelets and orthogonal wavelet transforms, and the selection of the discrete wavelet transform, with results and conclusion.

30 citations
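
A rough way to run the kind of mother-wavelet comparison described above, again assuming PyWavelets; a plain hard threshold on the detail coefficients stands in for the paper's full compression pipeline, and the wavelet names ("db4", "bior4.4", "coif2", "haar"), the decomposition level and the threshold are illustrative choices.

```python
import numpy as np
import pywt

def psnr(original, reconstructed):
    mse = np.mean((original - reconstructed) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

def compress_and_score(image, wavelet, level=3, thr=20.0):
    """Hard-threshold the detail coefficients and report PSNR and the
    fraction of coefficients that survive (a crude compression proxy)."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    kept, total = approx.size, approx.size
    new_details = []
    for (ch, cv, cd) in details:
        bands = tuple(pywt.threshold(b, thr, mode="hard") for b in (ch, cv, cd))
        kept += sum(np.count_nonzero(b) for b in bands)
        total += sum(b.size for b in bands)
        new_details.append(bands)
    rec = pywt.waverec2([approx] + new_details, wavelet)
    rec = rec[:image.shape[0], :image.shape[1]]
    return psnr(image, rec), kept / total

image = np.random.rand(128, 128) * 255          # stand-in for a real image
for w in ("db4", "bior4.4", "coif2", "haar"):
    quality, ratio = compress_and_score(image, w)
    print(f"{w:8s}  PSNR {quality:5.1f} dB  kept {ratio:.1%} of coefficients")
```

On a real photograph it is the relative ranking of the wavelets, not the absolute numbers, that guides the selection of the mother wavelet.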

Journal ArticleDOI
TL;DR: The quality of the image is better and the transmission time is lower than with conventional approaches, and simulation results show the validity of the proposed approach.
Abstract: The Internet has become an indispensable component of today's transacting world. Though a powerful medium, the Internet cannot always quickly transfer a web page containing an image in its current form. Such web pages not only take a long time to reach their destination but sometimes completely slow down or block other traffic on the network. To improve response time, in this paper we propose a real-time technique as an efficient way out. We propose to transmit a low-resolution (LR) image from the transmitter and display a high-resolution (HR) image at the receiver. At the transmitting end, a special filter is applied to the HR image to yield the LR image and its high-frequency components. These high-frequency components are used to build a basis function, which is optimal in size and is sent along with the LR image. The HR image is reconstructed from these two at the receiver's end. For the proposed approach, the quality of the image is better and the transmission time is lower than with conventional approaches. Simulation results show the validity of our approach.

2 citations
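
The split-and-reconstruct idea in the entry above can be illustrated with a single-level 2-D DWT, where the approximation band plays the role of the LR image and the detail bands stand in for the high-frequency side information. This is a hedged sketch assuming PyWavelets; it is not the authors' special filter or their optimal basis-function construction.

```python
import numpy as np
import pywt

def transmitter_split(hr_image, wavelet="haar"):
    """Split an HR image into an LR approximation plus high-frequency bands."""
    lr, (h, v, d) = pywt.dwt2(hr_image, wavelet)
    return lr, (h, v, d)                      # LR image + side information

def receiver_reconstruct(lr, highs, wavelet="haar"):
    """Rebuild the HR image from the LR image and the high-frequency bands."""
    return pywt.idwt2((lr, highs), wavelet)

hr = np.random.rand(256, 256) * 255           # stand-in for a real HR image
lr, highs = transmitter_split(hr)
rec = receiver_reconstruct(lr, highs)
print("max reconstruction error:", np.max(np.abs(rec - hr)))
```

In practice only the LR band plus a compact encoding of the side information would be transmitted; here both halves are kept in full to keep the sketch short.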


Cited by
01 Jan 2012
TL;DR: Experimental results demonstrate that the proposed technique provides sufficiently high compression ratios compared to other compression techniques.
Abstract: Image compression is a key technology in the transmission and storage of digital images because of the vast amount of data associated with them. This research suggests a new image compression scheme with a pruning proposal based on the discrete wavelet transform (DWT). The effectiveness of the algorithm has been justified on some real images, and its performance has been compared with other common compression standards. The algorithm has been implemented using Visual C++ and tested on a Pentium Core 2 Duo 2.1 GHz PC with 1 GB RAM. Experimental results demonstrate that the proposed technique provides sufficiently high compression ratios compared to other compression techniques.

72 citations

Journal ArticleDOI
TL;DR: The results proved that images denoised using the DTCWT (Dual-Tree Complex Wavelet Transform) with a Wiener filter have a better balance between smoothness and accuracy than with the DWT, and are less redundant than with the SWT (Stationary Wavelet Transform).
Abstract: Image denoising is the process of removing the noise from an image naturally corrupted by noise. The wavelet method is one among various methods for recovering infinite-dimensional objects such as curves, densities and images. Wavelet techniques are very effective at removing noise because of their ability to capture the energy of a signal in a few transform values. Wavelet denoising methods are based on shrinking the wavelet coefficients in the wavelet domain. In this paper we propose a denoising approach based on the dual-tree complex wavelet transform and shrinkage combined with the Wiener filter technique (where either hard or soft thresholding operators of the dual-tree complex wavelet transform are used for the denoising of medical images). The results prove that images denoised using the DTCWT (Dual-Tree Complex Wavelet Transform) with a Wiener filter have a better balance between smoothness and accuracy than with the DWT, and are less redundant than with the SWT (Stationary Wavelet Transform). We use the SSIM (Structural Similarity Index Measure), along with the PSNR (Peak Signal-to-Noise Ratio) and the SSIM map, to assess the quality of denoised images.

61 citations
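
A sketch of the shrink-then-filter idea from the entry above, with two stated substitutions: an ordinary DWT (PyWavelets) stands in for the dual-tree complex wavelet transform, which needs a dedicated library, and SciPy's generic Wiener filter is applied to the shrunken reconstruction. The thresholds, decomposition level and window size are illustrative.

```python
import numpy as np
import pywt
from scipy.signal import wiener

def wavelet_shrink_wiener(noisy, wavelet="db4", level=2, thr=15.0, win=5):
    """Soft-threshold the detail coefficients, reconstruct, then Wiener-filter.
    An ordinary DWT is used here as a stand-in for the DTCWT of the paper."""
    coeffs = pywt.wavedec2(noisy, wavelet, level=level)
    shrunk = [coeffs[0]] + [
        tuple(pywt.threshold(band, thr, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    rec = pywt.waverec2(shrunk, wavelet)[:noisy.shape[0], :noisy.shape[1]]
    return wiener(rec, mysize=win)

clean = np.tile(np.linspace(0, 255, 128), (128, 1))   # synthetic test image
noisy = clean + np.random.normal(0, 20, clean.shape)
denoised = wavelet_shrink_wiener(noisy)
print("noisy RMSE:   ", np.sqrt(np.mean((noisy - clean) ** 2)))
print("denoised RMSE:", np.sqrt(np.mean((denoised - clean) ** 2)))
```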

Journal Article
TL;DR: A combination of DCT and fractal image compression techniques is proposed: DCT is employed to compress the color image, while fractal image compression is employed to avoid repetitive compression of analogous blocks.
Abstract: Digital images are used in many domains. A large amount of data is necessary to represent digital images, so the transmission and storage of such images are time-consuming and can be infeasible. Hence, the information in the images is compressed by extracting only the visible elements. Image compression reduces storage and transmission costs: the size of a graphics file is reduced in bytes without disturbing the quality of the image beyond an acceptable level. Several methods, such as the Discrete Cosine Transform (DCT), the DWT, etc., are used for compressing images, but these methods suffer from blocking artifacts. In order to overcome this difficulty and to compress the image efficiently, a combination of DCT and fractal image compression techniques is proposed. DCT is employed to compress the color image, while fractal image compression is employed to avoid repetitive compression of analogous blocks. Analogous blocks are found using the Euclidean distance measure. Finally, the given image is encoded by means of the Huffman encoding technique. The implementation results show the effectiveness of the proposed scheme in compressing color images. A comparative analysis is also performed to show that the system is competent at compressing images in terms of Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Index (SSIM) and Universal Image Quality Index (UIQI) measurements.

54 citations
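
Two ingredients of the scheme above can be reproduced compactly: an 8x8 block DCT with coefficient truncation, and Euclidean-distance matching that re-uses the code of an "analogous" block instead of re-encoding it. This is a sketch assuming SciPy's dctn/idctn; the fractal contractive mappings and the Huffman stage are omitted, and the block size, kept-coefficient count and match tolerance are arbitrary.

```python
import numpy as np
from scipy.fft import dctn, idctn

BLOCK, KEEP, MATCH_TOL = 8, 10, 50.0   # block size, DCT coeffs kept, match tolerance

def encode_blocks(image):
    """Block DCT with truncation; a block close (in Euclidean distance) to an
    already-encoded block is stored only as a reference to that block."""
    h, w = image.shape
    codebook, coded = [], []
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            block = image[y:y + BLOCK, x:x + BLOCK].astype(float)
            match = next((i for i, b in enumerate(codebook)
                          if np.linalg.norm(b - block) < MATCH_TOL), None)
            if match is not None:
                coded.append(("ref", match))       # analogous block: reuse its code
                continue
            c = dctn(block, norm="ortho")
            small = np.argsort(np.abs(c), axis=None)[:-KEEP]   # all but KEEP largest
            c[np.unravel_index(small, c.shape)] = 0.0
            codebook.append(block)
            coded.append(("dct", c))
    return coded

def decode_blocks(coded, shape):
    out, blocks, i = np.zeros(shape), [], 0
    for y in range(0, shape[0], BLOCK):
        for x in range(0, shape[1], BLOCK):
            kind, payload = coded[i]
            i += 1
            block = blocks[payload] if kind == "ref" else idctn(payload, norm="ortho")
            if kind == "dct":
                blocks.append(block)
            out[y:y + BLOCK, x:x + BLOCK] = block
    return out

img = np.random.rand(64, 64) * 255                 # stand-in for one color plane
rec = decode_blocks(encode_blocks(img), img.shape)
print("RMSE:", np.sqrt(np.mean((rec - img) ** 2)))
```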

Proceedings ArticleDOI
04 Aug 2021
TL;DR: In this article, the authors apply the Tree Seed Algorithm, a nature-inspired high-level algorithm, to improve the performance of fractal image compression (FIC).
Abstract: In the field of image compression, and particularly for the process of image encoding, the fractal image compression (FIC) technique plays a vital role. The technique is based on the fractals present in an image and is capable of generating copied blocks through numerical transformations. The main drawback of fractal image compression is the time taken by the encoding process when the data is large. Thus, it is necessary to optimize the encoding process for efficient resource utilization, as optimization algorithms are generally known for their converging behavior. This research work applies the Tree Seed Algorithm, a nature-inspired high-level algorithm, to improve FIC. The results show the improvement brought by this algorithm in several respects (encoding time, compression ratio, peak signal-to-noise ratio, and mean square error). Moreover, a comparison with some of the existing techniques underlines this gain.

39 citations
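
A compact sketch of the Tree Seed Algorithm itself, minimizing a placeholder cost function; in the paper the cost would be the fractal encoding error of a candidate block mapping, which is not reproduced here, and the population size, iteration count and search-tendency (ST) value are illustrative.

```python
import numpy as np

def tree_seed_minimize(cost, dim, bounds, n_trees=20, n_iters=100, st=0.1, rng=None):
    """Tree Seed Algorithm: each tree spawns seeds whose coordinates move either
    toward the best tree (with probability st) or relative to a random other tree."""
    if rng is None:
        rng = np.random.default_rng(0)
    low, high = bounds
    trees = rng.uniform(low, high, (n_trees, dim))
    fitness = np.array([cost(t) for t in trees])
    best = trees[fitness.argmin()].copy()
    for _ in range(n_iters):
        for i in range(n_trees):
            n_seeds = rng.integers(2, max(3, n_trees // 4) + 1)
            for _ in range(n_seeds):
                j = rng.integers(n_trees)
                while j == i:
                    j = rng.integers(n_trees)
                seed = trees[i].copy()
                for d in range(dim):
                    r = rng.uniform(-1, 1)
                    if rng.random() < st:
                        seed[d] = trees[i, d] + r * (best[d] - trees[j, d])
                    else:
                        seed[d] = trees[i, d] + r * (trees[i, d] - trees[j, d])
                seed = np.clip(seed, low, high)
                f = cost(seed)
                if f < fitness[i]:               # keep the seed if it improves the tree
                    trees[i], fitness[i] = seed, f
        best = trees[fitness.argmin()].copy()
    return best, fitness.min()

# placeholder cost (sphere function) standing in for the fractal encoding error
best, val = tree_seed_minimize(lambda x: float(np.sum(x ** 2)), dim=5, bounds=(-10, 10))
print("best value:", val)
```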

Journal ArticleDOI
11 Oct 2019 - Symmetry
TL;DR: This paper presents a detailed analysis of run-length, entropy-based and dictionary-based lossless image compression algorithms, with a common numeric example for a clear comparison, and measures the performance of state-of-the-art techniques with standard metrics.
Abstract: Modern daily-life activities result in a huge amount of data, which creates a big challenge for storing and communicating it. As an example, hospitals produce a huge amount of data on a daily basis, which makes it a big challenge to store it in limited storage or to communicate it through the restricted bandwidth over the Internet. Therefore, there is an increasing demand for more research in data compression and communication theory to deal with such challenges. Such research responds to the requirements of data transmission at high speed over networks. In this paper, we focus on a deep analysis of the most common techniques in image compression. We present a detailed analysis of run-length, entropy-based and dictionary-based lossless image compression algorithms, with a common numeric example for a clear comparison. Following that, the state-of-the-art techniques are discussed based on some benchmark images. Finally, we use standard metrics such as average code length (ACL), compression ratio (CR), peak signal-to-noise ratio (PSNR), efficiency, encoding time (ET) and decoding time (DT) in order to measure the performance of the state-of-the-art techniques.

37 citations
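
As a concrete instance of the numeric walk-through the survey above describes, here is a small sketch of run-length encoding and its compression ratio on an example pixel run; the example values are illustrative, not taken from the paper.

```python
def rle_encode(data):
    """Run-length encoding: collapse runs of repeated symbols into (symbol, count)."""
    encoded = []
    for symbol in data:
        if encoded and encoded[-1][0] == symbol:
            encoded[-1] = (symbol, encoded[-1][1] + 1)
        else:
            encoded.append((symbol, 1))
    return encoded

def rle_decode(encoded):
    return [s for s, n in encoded for _ in range(n)]

# small numeric example: 11 pixel values collapse into 3 (symbol, count) pairs
pixels = [12, 12, 12, 12, 80, 80, 200, 200, 200, 200, 200]
code = rle_encode(pixels)
assert rle_decode(code) == pixels
# compression ratio = original symbol count / (2 stored values per run)
print(code, "CR =", len(pixels) / (2 * len(code)))
```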