
Showing papers by "Tanuja Sarode published in 2010"


Posted Content
TL;DR: The proposed algorithm gives less distortion than the well-known Linde-Buzo-Gray (LBG) algorithm and Kekre's Proportionate Error (KPE) algorithm by introducing a new orientation every time the clusters are split.
Abstract: The paper presents a new clustering algorithm. The proposed algorithm gives less distortion than the well-known Linde-Buzo-Gray (LBG) algorithm and Kekre's Proportionate Error (KPE) algorithm. In LBG a constant error is added every time the clusters are split, so the clusters form along a single direction, which is 135° in the 2-dimensional case. For this reason the clustering is inefficient, resulting in high MSE in LBG. To overcome this drawback of LBG, a proportionate error is added in KPE to change the cluster orientation. Although the cluster orientation in KPE is changed, its variation is limited to ±45° around 135°. The proposed algorithm takes care of this problem by introducing a new orientation every time the clusters are split. The proposed method improves PSNR by 2 dB to 5 dB for codebook sizes 128 to 1024 with respect to LBG. Keywords: Vector Quantization; Codebook; Codevector; Encoding; Compression.

I. INTRODUCTION: World Wide Web applications have grown extensively over the last few decades and have become a requisite tool for education, communication, industry, amusement, etc. All these applications are multimedia-based, consisting of images and videos. Images and videos require enormous volumes of data, creating a serious problem because they need high channel bandwidth for efficient transmission. Furthermore, a high degree of redundancy is observed in digital images. Thus the need for image compression arises, for efficient storage and transmission. Image compression is classified into two categories: lossless and lossy image compression techniques. Vector quantization (VQ) is one of the lossy data compression techniques [1], [2] and has been used in a number of applications, such as pattern recognition [3], speech recognition and face detection [4], [5], image segmentation [6-9], speech data compression [10], Content Based Image Retrieval (CBIR) [11], [12], face recognition [13], [14], iris recognition [15], and tumor detection in mammography images [29]. VQ is a mapping function that maps a k-dimensional vector space onto a finite set of codevectors CB = {C1, C2, ..., CN}.
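To make the splitting idea concrete, here is a minimal sketch of binary-splitting codebook generation in which the orientation of the split perturbation is a parameter: a constant perturbation reproduces the LBG-style split along one fixed direction, while supplying a different orientation at every split follows the idea described in the abstract. This is an illustrative sketch, not the authors' exact algorithm; the function and parameter names (split_codebook, orientation_fn) are assumptions.

```python
import numpy as np

def split_codebook(training, target_size, orientation_fn, n_iters=10):
    """Generic binary-splitting VQ codebook generation (LBG-style).

    orientation_fn(level, dim) returns the perturbation vector used to
    split each codevector; keeping its direction constant mimics LBG,
    changing it at every split mimics the idea described in the abstract.
    """
    codebook = training.mean(axis=0, keepdims=True)            # start with the global centroid
    while codebook.shape[0] < target_size:
        eps = orientation_fn(codebook.shape[0], training.shape[1])
        codebook = np.vstack([codebook + eps, codebook - eps])  # split every codevector in two
        for _ in range(n_iters):                                 # Lloyd refinement of the split
            d = ((training[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
            labels = d.argmin(axis=1)
            for k in range(codebook.shape[0]):
                members = training[labels == k]
                if len(members):
                    codebook[k] = members.mean(axis=0)
    return codebook

# Constant error vector (same split direction every time, as in LBG):
lbg_eps = lambda level, dim: np.full(dim, 1e-3)
# A new orientation for every split, in the spirit of the proposed method:
rotating_eps = lambda level, dim: 1e-3 * np.cos(np.arange(dim) * np.pi / dim + level)

data = np.random.rand(2000, 2)
cb = split_codebook(data, 16, rotating_eps)
mse = ((data[:, None, :] - cb[None, :, :]) ** 2).sum(-1).min(axis=1).mean()
print(cb.shape, mse)
```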

36 citations


Proceedings ArticleDOI
26 Feb 2010
TL;DR: A new technique for image retrieval is proposed, using color-texture features extracted from images by vector quantization with Kekre's Fast Codebook Generation; it gives better discrimination capability for CBIR.
Abstract: A new technique for image retrieval is proposed, using color-texture features extracted from images by vector quantization with Kekre's Fast Codebook Generation. This gives better discrimination capability for CBIR. Here the database image is divided into 2x2 pixel windows to obtain 12 color descriptors per window (Red, Green and Blue per pixel), which form a vector. The collection of all such vectors is the training set. The Kekre's Fast Codebook Generation (KFCG) algorithm is then applied to this set to obtain 16 codevectors. The Walsh transform is applied to each column of the codebook, followed by Kekre's transform applied to each row of the Walsh-transformed codebook. This transformed vector is then used as the image signature (feature vector) for image retrieval. The method requires fewer computations than the conventional Walsh transform applied to the complete image, and gives the color-texture features of the image database at a reduced feature-set size. The proposed method gives better precision and recall than full-Walsh-based CBIR, and avoids the resizing of images that is required for any transform-based feature extraction method.
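A rough sketch of this feature pipeline is given below, purely for illustration: scikit-learn's KMeans stands in for Kekre's Fast Codebook Generation, the Walsh transform is realized with a Hadamard matrix from SciPy, and the final Kekre-transform step on the rows is omitted; the function name color_texture_signature is an assumption.

```python
import numpy as np
from scipy.linalg import hadamard
from sklearn.cluster import KMeans

def color_texture_signature(img):
    """Illustrative sketch of the color-texture signature described above.

    img: H x W x 3 uint8 array with even H and W.
    KMeans stands in for KFCG; the Kekre-transform step is omitted here.
    """
    h, w, _ = img.shape
    # 2x2 pixel windows -> 12 color descriptors (R, G, B per pixel) per vector
    blocks = (img.reshape(h // 2, 2, w // 2, 2, 3)
                 .transpose(0, 2, 1, 3, 4)
                 .reshape(-1, 12)
                 .astype(np.float64))
    # 16 codevectors from the training set of window vectors
    codebook = KMeans(n_clusters=16, n_init=4, random_state=0).fit(blocks).cluster_centers_
    # Walsh (Hadamard) transform applied to each column of the 16 x 12 codebook
    walsh = hadamard(16).astype(np.float64)
    transformed = walsh @ codebook
    return transformed.flatten()           # used as the image signature (feature vector)

sig = color_texture_signature(np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8))
print(sig.shape)   # (192,)
```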

30 citations


Posted Content
TL;DR: This paper proposes segmentation of MRI images using a vector quantization technique based on the Linde-Buzo-Gray (LBG) algorithm, and displays results of watershed segmentation and entropy segmentation using the Gray Level Co-occurrence Matrix alongside this method.
Abstract: Segmenting MRI images into homogeneous texture regions representing disparate tissue types is often a useful preprocessing step in the computer-assisted detection of breast cancer. That is why we proposed a new algorithm to detect cancer in mammogram breast cancer images. In this paper we proposed segmentation using a vector quantization technique. Here we used the Linde-Buzo-Gray (LBG) algorithm for segmentation of MRI images. Initially a codebook of size 128 was generated for the MRI images. These codevectors were further clustered into 8 clusters using the same LBG algorithm. These 8 images were displayed as the result. This approach does not lead to over-segmentation or under-segmentation. For comparison purposes we displayed the results of watershed segmentation and of entropy segmentation using the Gray Level Co-occurrence Matrix along with this method.
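The two-stage clustering described here (a 128-codevector codebook, re-clustered into 8 segments) can be sketched as follows, with KMeans standing in for the LBG algorithm; the function and parameter names are illustrative, not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def vq_segmentation(gray, n_codevectors=128, n_segments=8, block=2):
    """Sketch of VQ-based segmentation: returns one binary image per segment."""
    h, w = gray.shape
    h, w = h - h % block, w - w % block
    vecs = (gray[:h, :w].reshape(h // block, block, w // block, block)
                        .transpose(0, 2, 1, 3)
                        .reshape(-1, block * block)
                        .astype(np.float64))
    # Stage 1: codebook of 128 codevectors from the image blocks (KMeans in place of LBG)
    km1 = KMeans(n_clusters=n_codevectors, n_init=2, random_state=0).fit(vecs)
    # Stage 2: cluster those codevectors into 8 groups (tissue classes)
    km2 = KMeans(n_clusters=n_segments, n_init=4, random_state=0).fit(km1.cluster_centers_)
    block_labels = km2.labels_[km1.labels_].reshape(h // block, w // block)
    # One binary mask per segment, upsampled back to pixel resolution
    return [np.kron((block_labels == s).astype(np.uint8),
                    np.ones((block, block), np.uint8)).astype(bool)
            for s in range(n_segments)]

masks = vq_segmentation(np.random.randint(0, 256, (128, 128)).astype(np.uint8))
print(len(masks), masks[0].shape)   # 8 (128, 128)
```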

27 citations


Journal ArticleDOI
TL;DR: A vector quantization segmentation method to detect cancerous masses in mammogram images and improve radiologists' diagnostic performance.
Abstract: Breast cancer is one of the major causes of death among women. An improvement of early diagnostic techniques is critical for women's quality of life. Mammography is the main test used for screening and early diagnosis, and contrast-enhanced magnetic resonance of the breast is the most attractive alternative to standard mammography. This paper presents a vector quantization segmentation method to detect cancerous masses in mammogram images. In order to increase radiologists' diagnostic performance, several computer-aided diagnosis (CAD) schemes have been developed to improve the detection of the primary signatures of this disease: masses and microcalcifications.

15 citations


Journal ArticleDOI
TL;DR: In this paper, the authors use the probability image of a mammogram as input for vector quantization; Kekre's Proportionate Error (KPE) algorithm is used and a codebook of size 128 is formed.
Abstract: Mammography is a well-known method for the detection of breast tumors. Early detection and removal of the primary tumor is an essential and effective way to enhance the survival rate and reduce mortality. In this paper, the proposed algorithm uses the probability image of a mammogram as input for vector quantization. For region forming, Kekre's Proportionate Error (KPE) algorithm is used and a codebook of size 128 is formed. These 128 clusters are further used for region merging, applying the KPE algorithm again for re-clustering. To separate the tumor, post-processing is done with morphological operations; the tumor's sectional area and center point are then calculated. The results are compared with the LBG algorithm for segmentation of mammographic images.
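The morphological post-processing step, in which the tumor region is isolated and its sectional area and center point are computed, might look roughly like the sketch below. It uses SciPy's ndimage module; the KPE-based clustering itself is not reproduced, and the function name is an assumption.

```python
import numpy as np
from scipy import ndimage

def tumor_region_stats(segment_mask):
    """Morphological post-processing of a candidate tumor segment.

    segment_mask: boolean image marking the candidate tumor segment.
    Returns (area_in_pixels, centroid) of the largest connected component
    after a binary opening removes small spurious regions.
    """
    cleaned = ndimage.binary_opening(segment_mask, structure=np.ones((3, 3)))
    labels, n = ndimage.label(cleaned)
    if n == 0:
        return 0, None
    sizes = ndimage.sum(cleaned, labels, index=range(1, n + 1))
    largest = int(np.argmax(sizes)) + 1
    area = int(sizes[largest - 1])                         # sectional area in pixels
    centroid = ndimage.center_of_mass(labels == largest)   # (row, col) center point
    return area, centroid

mask = np.zeros((64, 64), bool)
mask[20:35, 25:45] = True
print(tumor_region_stats(mask))
```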

12 citations


Journal ArticleDOI
TL;DR: Comparison of all three transformation techniques on spectrograms shows that the number of mathematical computations required for the Walsh transform is much smaller than for the DCT, whereas the Haar transform drastically reduces the number of mathematical computations with an almost equal identification rate.
Abstract: This paper provides different approaches to text-dependent speaker identification using various transformation techniques, namely the DCT, Walsh and Haar transforms, along with the use of spectrograms. The set of spectrograms obtained from speech samples is used as the image database for the study. This image database is then subjected to the various transforms. Using Euclidean distance as the measure of similarity, the most appropriate speaker match is obtained and declared as the identified speaker. Each transform is applied to the spectrograms in two different ways: on the full image and on the Row Mean of the image. In both cases, the effect of keeping different numbers of coefficients of the transformed image is observed. A comparison of all three transformation techniques on spectrograms shows that the number of mathematical computations required for the Walsh transform is much smaller than for the DCT, while the Haar transform drastically reduces the number of computations with an almost equal identification rate. The transformation techniques applied to the Row Mean give a better identification rate than the same techniques applied to the full image.
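A minimal sketch of the Row Mean approach, shown for the DCT variant only, is given below; the Walsh and Haar variants would simply replace the transform. The spectrogram parameters, feature length and function names are assumptions, not the authors' settings.

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.fft import dct

def row_mean_dct_feature(signal, fs, n_coeffs=32):
    """Row Mean feature: spectrogram rows are averaged across time, a 1-D DCT
    is applied, and the first n_coeffs coefficients form the feature vector."""
    _, _, sxx = spectrogram(signal, fs=fs, nperseg=256)
    row_mean = sxx.mean(axis=1)                    # one value per frequency row
    return dct(row_mean, norm='ortho')[:n_coeffs]

def identify(test_signal, fs, enrolled):
    """Nearest enrolled speaker by Euclidean distance between feature vectors."""
    f = row_mean_dct_feature(test_signal, fs)
    return min(enrolled, key=lambda name: np.linalg.norm(enrolled[name] - f))

fs = 8000
enrolled = {'speaker_a': row_mean_dct_feature(np.random.randn(fs), fs),
            'speaker_b': row_mean_dct_feature(np.random.randn(fs), fs)}
print(identify(np.random.randn(fs), fs, enrolled))
```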

11 citations


Journal ArticleDOI
TL;DR: This paper proposes a simple yet effective image-based technique for fingerprint identification in which feature vectors are extracted by sectorization of the cepstrum of a fingerprint and matched with those stored in the database.
Abstract: Using biometrics to verify a person's identity has several advantages over the present practices of personal identification numbers (PINs) and passwords. Minutiae-based automated fingerprint identification systems are more popular, but they are computationally complex and time consuming. In this paper we propose a simple yet effective technique for fingerprint identification. The method is image-based: feature vectors of a fingerprint are extracted after sectorization of its cepstrum and matched with those stored in the database. The experimental results show that this algorithm can correctly identify fingerprints with an accuracy of more than 96% when a larger number of sectors is used.
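One plausible reading of this pipeline, sketched below purely for illustration, computes the 2-D real cepstrum of the fingerprint image, averages it over angular sectors about the center, and matches the resulting feature vectors by Euclidean distance; the sector count and function names are assumptions, not the authors' exact formulation.

```python
import numpy as np

def cepstrum_sector_features(img, n_sectors=16):
    """Mean cepstrum magnitude per angular sector, used as the feature vector."""
    img = img.astype(np.float64)
    spectrum = np.fft.fft2(img)
    cep = np.real(np.fft.ifft2(np.log(np.abs(spectrum) + 1e-9)))   # 2-D real cepstrum
    cep = np.fft.fftshift(cep)                                     # center the cepstrum
    h, w = cep.shape
    y, x = np.mgrid[0:h, 0:w]
    angle = np.arctan2(y - h / 2, x - w / 2)                       # angle of each pixel
    sector = ((angle + np.pi) / (2 * np.pi) * n_sectors).astype(int) % n_sectors
    return np.array([np.abs(cep[sector == s]).mean() for s in range(n_sectors)])

def match(probe, database):
    """Identify by minimum Euclidean distance between feature vectors."""
    f = cepstrum_sector_features(probe)
    return min(database, key=lambda k: np.linalg.norm(database[k] - f))

db = {'id_1': cepstrum_sector_features(np.random.rand(128, 128)),
      'id_2': cepstrum_sector_features(np.random.rand(128, 128))}
print(match(np.random.rand(128, 128), db))
```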

11 citations


Journal ArticleDOI
TL;DR: Experimental results indicate that the proposed scheme provides a hiding capacity of 100% or more, meaning the secret message can be the same size as or larger than the cover image, with better image quality compared with existing schemes based on VQ compressed images.
Abstract: Many researchers have studied reversible data hiding techniques in recent years, and most have proposed reversible data hiding schemes that guarantee only that the original cover image can be reconstructed completely. Once the secret data are embedded in the compression domain and the receiver wants to store the cover image in compressed form to save storage space, the receiver must extract the secret data, reconstruct the cover image, and compress the cover image again to generate the compression codes. In this paper, we propose a novel data hiding method based on VQ compressed images, in which the codebooks of the secret message and the cover image are combined using a shuffle algorithm. Experimental results indicate that our proposed scheme provides a hiding capacity of 100% or more, meaning the secret message can be of the same or larger size than the cover image, and better image quality compared with existing schemes based on VQ compressed images. The technique is robust against steganalysis.

7 citations


Proceedings ArticleDOI
26 Feb 2010
TL;DR: This paper proposes segmentation of mammographic images using a vector quantization technique based on Kekre's Fast Codebook Generation (KFCG) algorithm; results of other algorithms are shown for comparison.
Abstract: Segmenting mammographic images into homogeneous texture regions representing disparate tissue types is often a useful preprocessing step in the computer-assisted detection of breast cancer. That is why we proposed a new algorithm to detect cancer in mammogram breast cancer images. In this paper we propose segmentation using a vector quantization technique. Here we used Kekre's Fast Codebook Generation (KFCG) algorithm for segmentation of mammographic images. Initially a codebook of size 128 was generated for the mammographic images. These codevectors were further clustered into 8 clusters using the same KFCG algorithm, and a segmented image was obtained for each of the 8 clusters. These 8 images were displayed as the result. This approach does not lead to over-segmentation or under-segmentation, as watershed segmentation and entropy segmentation using the Gray Level Co-occurrence Matrix do; results of those algorithms are shown for comparison.

7 citations


Journal ArticleDOI
TL;DR: This paper proposes a modified genetic algorithm that gives the optimal value, although it depends on the initial selection of the codevectors; the results show that the KFCG codebook gives a slightly lower minimized error than LBG, indicating that the KFCG codebook is closer to the optimum.
Abstract: Vector Quantization (VQ) is a lossy data compression technique with various applications. The key to VQ is a good codebook. Once the codebook size is fixed, the mean squared error (MSE) reaches a value beyond which it cannot be reduced by codebook generation algorithms. In this paper we propose a modified genetic algorithm that gives the optimal value, although it depends on the initial selection of the codevectors and hence takes an extremely long time to reach the optimum. For demonstration, we have used codebooks obtained from the Linde-Buzo-Gray (LBG) and Kekre's Fast Codebook Generation (KFCG) algorithms. It is observed that the optimal error obtained from both LBG and KFCG is almost the same, indicating that they have converged to an optimal value. From the results it is evident that the KFCG codebook gives a slightly lower minimized error than LBG, indicating that the KFCG codebook is closer to the optimum. The proposed method is general and can be applied to any clustering algorithm.
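A generic genetic-algorithm refinement of a VQ codebook, seeded with an initial codebook such as one produced by LBG or KFCG, is sketched below. It is not the authors' modified genetic algorithm; the population size, operators and names are illustrative assumptions, with codebook MSE over the training vectors as the fitness.

```python
import numpy as np

def codebook_mse(codebook, training):
    """Mean squared error of quantizing the training vectors with the codebook."""
    d = ((training[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d.min(axis=1).mean()

def ga_refine(initial_codebook, training, pop_size=20, generations=50,
              mutation_scale=0.01, rng=np.random.default_rng(0)):
    """Refine a codebook (e.g. from LBG or KFCG) with a simple genetic algorithm."""
    base = np.asarray(initial_codebook, dtype=np.float64)
    pop = [base + mutation_scale * rng.standard_normal(base.shape) for _ in range(pop_size)]
    pop[0] = base.copy()                                    # keep the seed codebook unchanged
    for _ in range(generations):
        fitness = np.array([codebook_mse(cb, training) for cb in pop])
        order = np.argsort(fitness)                         # lower MSE = fitter
        parents = [pop[i] for i in order[:pop_size // 2]]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.choice(len(parents), size=2, replace=False)
            mask = rng.random(base.shape[0]) < 0.5          # crossover: swap whole codevectors
            child = np.where(mask[:, None], parents[a], parents[b])
            child = child + mutation_scale * rng.standard_normal(base.shape)
            children.append(child)
        pop = parents + children
    fitness = np.array([codebook_mse(cb, training) for cb in pop])
    return pop[int(np.argmin(fitness))], fitness.min()

data = np.random.rand(1000, 4)
seed_cb = data[np.random.choice(len(data), 16, replace=False)]   # stand-in initial codebook
best_cb, best_mse = ga_refine(seed_cb, data)
print(best_mse)
```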

5 citations


01 Jan 2010
TL;DR: Different approaches to text-dependent speaker identification using the DCT, Walsh and Haar transforms along with spectrograms are presented; applying the transforms to image blocks yields better or equal identification rates with reduced computational complexity.
Abstract: This paper provides different approaches to text-dependent speaker identification using the DCT, Walsh and Haar transforms along with the use of spectrograms. Spectrograms obtained from speech samples are used as the image database for the study. This image database is then subjected to the various transforms. Using Euclidean distance as the measure of similarity, the most appropriate speaker match is obtained and declared as the identified speaker. Each transform is applied to the spectrograms in two different ways: on the full image and on image blocks. In both cases, the effect of keeping different numbers of coefficients of the transformed image is observed. The Haar transform on the full image requires 28 times fewer multiplications than the DCT and Walsh transforms, whereas applying the Haar transform on image blocks requires 18 times fewer mathematical computations than the DCT and Walsh transforms on image blocks. When applied to image blocks, the transforms yield better or equal identification rates with reduced computational complexity.

Proceedings ArticleDOI
26 Feb 2010
TL;DR: The paper introduces the use of Kekre's Fast Codebook Generation (KFCG) algorithm to generate a codebook in Kekre's LUV color space and produce a color palette; it is observed that the proposed method gives good, acceptable results.
Abstract: This paper presents a novel colorization method for coloring gray-scale digital images. The paper introduces the use of Kekre's Fast Codebook Generation (KFCG) algorithm to generate a codebook in Kekre's LUV color space and produce a color palette. It is observed that the proposed method gives good, acceptable results. The algorithm needs a similar color image of the same class for coloring the gray-scale image: the color palette is prepared from that color image using the KFCG algorithm, and its colors are then transferred to the gray-scale image.
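A simplified stand-in for this palette-based colorization, sketched below for illustration, learns a color palette from the reference image with KMeans (in place of KFCG) and matches each gray pixel to the palette entry with the closest luminance, skipping Kekre's LUV color space entirely; all names and parameters are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def colorize(gray, reference_rgb, palette_size=64):
    """Palette-based colorization sketch: learn a palette from a reference
    image of the same class, then give each gray pixel the color of the
    palette entry whose luminance is closest to the pixel value."""
    ref = reference_rgb.reshape(-1, 3).astype(np.float64)
    palette = KMeans(n_clusters=palette_size, n_init=2, random_state=0).fit(ref).cluster_centers_
    palette_lum = palette.mean(axis=1)                 # rough luminance of each palette color
    idx = np.abs(gray.astype(np.float64)[..., None] - palette_lum).argmin(axis=-1)
    return palette[idx].astype(np.uint8)               # H x W x 3 colorized image

gray = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
ref = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
print(colorize(gray, ref).shape)    # (64, 64, 3)
```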