
Showing papers by "James Z. Wang published in 1998"


Journal ArticleDOI
TL;DR: WBIIS (Wavelet-Based Image Indexing and Searching) is a new image indexing and retrieval algorithm with partial-sketch image searching capability for large image databases; it captures image coherence, object granularity, local color/texture, and bias avoidance much better than traditional color layout algorithms.
Abstract: This paper describes WBIIS (Wavelet-Based Image Indexing and Searching), a new image indexing and retrieval algorithm with partial-sketch image searching capability for large image databases. The algorithm characterizes the color variations over the spatial extent of the image in a manner that provides semantically meaningful image comparisons. The indexing algorithm applies a Daubechies' wavelet transform to each of the three opponent color components. The wavelet coefficients in the lowest few frequency bands, and their variances, are stored as feature vectors. To speed up retrieval, a two-step procedure is used that first makes a crude selection based on the variances and then refines the search by performing a feature-vector match between the selected images and the query. For better search accuracy, two-level multiresolution matching may also be used. Masks are used for partial-sketch queries. This technique captures image coherence, object granularity, local color/texture, and bias avoidance much better than traditional color layout algorithms, and WBIIS is much faster and more accurate than those algorithms. When tested on a database of more than 10,000 general-purpose images, the best 100 matches were found in 3.3 seconds.
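The two-step retrieval the abstract describes — a crude cut on coefficient variances, then a full feature-vector match against the survivors — can be sketched as follows. This is an illustrative toy, not the published system: it uses a single channel and a one-level Haar transform as a stand-in for Daubechies' wavelets on opponent color components, and every name (`haar_2d`, `feature`, `search`) is invented for this example.

```python
# Illustrative sketch of WBIIS-style two-step retrieval (not the published code).

def haar_2d(img):
    """One level of a 2-D Haar transform; returns only the low-frequency band."""
    n = len(img)
    low = [[0.0] * (n // 2) for _ in range(n // 2)]
    for i in range(0, n, 2):
        for j in range(0, n, 2):
            low[i // 2][j // 2] = (img[i][j] + img[i][j + 1] +
                                   img[i + 1][j] + img[i + 1][j + 1]) / 4.0
    return low

def feature(img):
    """Feature vector (flattened low band) plus its variance."""
    flat = [v for row in haar_2d(img) for v in row]
    mean = sum(flat) / len(flat)
    var = sum((v - mean) ** 2 for v in flat) / len(flat)
    return flat, var

def search(db, query, keep=2):
    """Step 1: crude selection on variance. Step 2: full feature-vector match."""
    qvec, qvar = feature(query)
    candidates = sorted(db, key=lambda e: abs(feature(e[1])[1] - qvar))[:keep]
    def dist(img):
        vec, _ = feature(img)
        return sum((a - b) ** 2 for a, b in zip(vec, qvec)) ** 0.5
    return sorted(candidates, key=lambda e: dist(e[1]))

db = [
    ("flat",  [[10] * 4 for _ in range(4)]),
    ("noisy", [[(i * 7 + j * 13) % 50 for j in range(4)] for i in range(4)]),
    ("near",  [[11] * 4 for _ in range(4)]),
]
query = [[10] * 4 for _ in range(4)]
result = [name for name, _ in search(db, query)]
# the variance cut discards "noisy"; the exact match ranks first
```

The point of the two-step structure is that the variance comparison is a single scalar per image, so the expensive full-vector distance is only computed for the small candidate set.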

346 citations


Proceedings ArticleDOI
05 Oct 1998
TL;DR: This paper describes RIME (Replicated IMage dEtector), an alternative approach to watermarking for detecting unauthorized image copying on the Internet and shows that it can detect image copies effectively.
Abstract: This paper describes RIME (Replicated IMage dEtector), an alternative approach to watermarking for detecting unauthorized image copying on the Internet. RIME profiles internet images and stores the feature vectors of the images and their URLs in its repository. When a copy detection request is received, RIME matches the requested image's feature vector with the vectors stored in the repository and returns a list of suspect URLs. RIME characterizes each image using Daubechies' wavelets. The wavelet coefficients are stored as the feature vector. RIME uses a multidimensional extensible hashing scheme to index these high-dimensional feature vectors. Our preliminary result shows that it can detect image copies effectively: It can find the top suspects and copes well with image format conversion, resampling, and requantization.
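The repository lookup can be sketched with a much-simplified stand-in for the multidimensional extensible hashing scheme the abstract mentions — here a coarse grid quantization of the feature vector serves as the bucket key. All names (`feature_hash`, `CopyDetector`) and parameters are illustrative, not from the paper.

```python
# Simplified stand-in for RIME's feature-vector repository and copy lookup.

def feature_hash(vec, step=10.0):
    """Quantize each dimension so near-duplicate copies (resampled,
    requantized, format-converted) tend to land in the same bucket."""
    return tuple(int(v // step) for v in vec)

class CopyDetector:
    def __init__(self):
        self.buckets = {}  # bucket key -> list of (url, feature vector)

    def add(self, url, vec):
        self.buckets.setdefault(feature_hash(vec), []).append((url, vec))

    def suspects(self, vec, tol=4.0):
        """Return URLs in the query's bucket within Euclidean distance tol."""
        hits = []
        for url, stored in self.buckets.get(feature_hash(vec), []):
            if sum((a - b) ** 2 for a, b in zip(stored, vec)) ** 0.5 <= tol:
                hits.append(url)
        return hits

rime = CopyDetector()
rime.add("http://example.com/a.jpg", [12.0, 25.0])
rime.add("http://example.com/b.jpg", [55.0, 90.0])
matches = rime.suspects([13.0, 26.0])   # slightly perturbed copy of a.jpg
```

A fixed grid like this misses near-duplicates that straddle a bucket boundary; the extensible hashing in the paper is more elaborate precisely to index high-dimensional vectors robustly.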

150 citations


Journal ArticleDOI
TL;DR: WIPE™ (Wavelet Image Pornography Elimination), a system capable of classifying an image as objectionable or benign, is described; it is practical for real-world applications and has demonstrated 96% sensitivity over a test set of 1076 digital photographs found on objectionable news groups.

132 citations



Book ChapterDOI
08 Sep 1998
TL;DR: The system uses WIPE™ (Wavelet Image Pornography Elimination) and statistics to provide robust classification of on-line objectionable World Wide Web sites, and has demonstrated 97% sensitivity and 97% specificity in classifying a Web site based solely on images.
Abstract: This paper describes IBCOW (Image-based Classification of Objectionable Websites), a system capable of classifying a website as objectionable or benign based on image content. The system uses WIPE™ (Wavelet Image Pornography Elimination) and statistics to provide robust classification of on-line objectionable World Wide Web sites. Semantically meaningful feature-vector matching is carried out so that comparisons between a given on-line image and images marked as "objectionable" and "benign" in a training set can be performed efficiently and effectively in the WIPE module. If more than a certain number of images sampled from a site are found to be objectionable, the site is considered objectionable. The statistical analysis for determining the size of the image sample and the threshold number of objectionable images is given in this paper. The system is practical for real-world applications, classifying a Web site in less than 2 minutes, including the time to compute the feature vectors for the images downloaded from the site, on a Pentium Pro PC. Besides its exceptional speed, it has demonstrated 97% sensitivity and 97% specificity in classifying a Web site based solely on images. Both the sensitivity and the specificity in real-world applications are expected to be higher because our performance evaluation is relatively conservative and surrounding text can be used to assist the classification process.
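The sample-size/threshold analysis the abstract refers to can be illustrated with a simple binomial model: a site is flagged if at least k of n sampled images are flagged by the per-image classifier. The per-image rates (`sens`, `fpr`) and the fraction of objectionable images on an objectionable site (`frac_obj`) below are made-up illustrative numbers, and the paper's exact model may differ.

```python
# Toy version of the IBCOW-style sample-size/threshold analysis.
from math import comb

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

def site_rates(n, k, sens, fpr, frac_obj):
    """Site-level sensitivity and specificity under a simple binomial model."""
    p_obj = frac_obj * sens + (1 - frac_obj) * fpr  # flag rate on an objectionable site
    p_ben = fpr                                     # flag rate on a benign site
    return binom_tail(n, k, p_obj), 1.0 - binom_tail(n, k, p_ben)

# Sample 20 images, flag the site if 5 or more are flagged.
site_sens, site_spec = site_rates(n=20, k=5, sens=0.96, fpr=0.09, frac_obj=0.5)
```

Even with an imperfect per-image classifier, sampling amplifies accuracy: one can sweep n and k with this kind of model to pick the smallest sample meeting a target site-level sensitivity and specificity.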

59 citations


Book ChapterDOI
01 Jan 1998
TL;DR: This chapter addresses the family of progressive image compression algorithms; a progressive transmission algorithm with automatic security filtering features for on-line medical image distribution, using Daubechies' wavelets, has been developed and is discussed.
Abstract: With the rapid expansion of computer networks, communication and storage of medical information has entered a new era. Teleclinical practice and computer digital storage are two important medical information technologies that have made data compression an issue of crucial importance. Efficiently compressing data is the key to making teleclinical practice feasible, since the bandwidth provided by computer media is too limited for the huge amount of medical data that must be transmitted. Because of the high compressibility of medical images, data compression is also desirable for digital storage despite the availability of inexpensive hardware for mass storage. This chapter addresses the family of progressive image compression algorithms. The progressive property is preferred because it allows users to gradually recover an image from low to high quality and to stop at any desired bit rate, including lossless recovery. A progressive transmission algorithm with automatic security filtering features for on-line medical image distribution using Daubechies’ wavelets has been developed and is discussed in this chapter. The system is practical for real-world applications, processing and coding each 12-bit image of size 512 × 512 within 2 seconds on a Pentium Pro. Besides its exceptional speed, the security filter has demonstrated remarkable accuracy in detecting sensitive textual information within current or digitized previous medical images.
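The progressive property — recover a coarse image early, refine it with each batch of bits, and end lossless — can be sketched by sending transform coefficients in decreasing magnitude order. A one-level 1-D Haar transform stands in for the Daubechies wavelets used in the chapter, and all names are invented for this sketch.

```python
# Sketch of progressive (embedded) transmission: largest coefficients first.

def haar_1d(x):
    """One-level Haar transform: pairwise averages, then differences."""
    avg = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    dif = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return avg + dif

def inv_haar_1d(c):
    """Invert haar_1d: each (average, difference) pair restores two samples."""
    n = len(c) // 2
    out = []
    for a, d in zip(c[:n], c[n:]):
        out += [a + d, a - d]
    return out

def progressive_stream(signal):
    """Yield successively refined reconstructions, biggest coefficients first."""
    coeffs = haar_1d(signal)
    order = sorted(range(len(coeffs)), key=lambda i: -abs(coeffs[i]))
    received = [0.0] * len(coeffs)
    for i in order:
        received[i] = coeffs[i]     # one more coefficient arrives
        yield inv_haar_1d(received)

signal = [8.0, 6.0, 2.0, 4.0]
errors = [sum((a - b) ** 2 for a, b in zip(rec, signal))
          for rec in progressive_stream(signal)]
# reconstruction error shrinks at every stage and reaches 0 (lossless) at the end
```

Because the largest-energy coefficients arrive first, the viewer can stop at any bit rate with the best reconstruction that budget allows, which is exactly why the progressive family suits teleclinical links.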

18 citations


Journal Article
TL;DR: WaveMark is a novel wavelet-based multiresolution digital watermarking system for color images that uses discrete wavelet transforms and error-correcting coding schemes to provide robust watermarking.
Abstract: As more and more digital images are distributed on-line via the Internet and WWW, many copyright owners are concerned about protecting the copyright of digital images. This paper describes WaveMark, a novel wavelet-based multiresolution digital watermarking system for color images. The algorithm in WaveMark uses discrete wavelet transforms and error-correcting coding schemes to provide robust watermarking of digital images. Unlike other wavelet-based algorithms, our watermark recovery procedure does not require a match with an uncorrupted original image. Our algorithm uses Daubechies' advanced wavelets and extended Hamming codes to deal with problems associated with JPEG compression and random additive noise. In addition, the algorithm is able to sustain intentional disturbances introduced by professional robustness-testing programs such as StirMark. The use of Daubechies' advanced wavelets makes the watermarked images more perceptually faithful than images watermarked with the Haar wavelet transform. The watermark is adaptively applied to different areas of the image, based on the smoothness of the areas, to increase robustness within the limits of perception. The system is practical for real-world applications, encoding or decoding images in less than one second each on a Pentium Pro PC.
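The error-correcting side of the watermark can be illustrated with a plain Hamming(7,4) code. The paper uses *extended* Hamming codes (an extra overall-parity bit for double-error detection); that bit is omitted here, and all function names are invented for this sketch.

```python
# Minimal Hamming(7,4): 4 watermark bits survive a single flipped bit.

def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword (parity at positions 1, 2, 4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4   # covers positions 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s4      # 1-based error position; 0 means clean
    if pos:
        c[pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
damaged = list(codeword)
damaged[4] ^= 1                      # simulate one bit corrupted by compression/noise
recovered = hamming74_decode(damaged)
```

Spreading each watermark bit across a codeword like this is what lets the mark survive the bit errors that JPEG requantization and additive noise introduce.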

15 citations


01 Jan 1998
TL;DR: The question of the extent to which judgments of similarity/identity can be made essentially error-free in support of obtaining a relatively dense depth model of a natural outdoor scene is explored.
Abstract: Normal human vision is nearly infallible in modeling the visually sensed physical environment in which it evolved. In contrast, most currently available computer vision systems fall far short of human performance in this task, and further, they are generally not capable of asserting the correctness of their judgments. In computerized stereo matching systems, correctness of the similarity/identity matching is almost never guaranteed. In this paper, we explore the question of the extent to which judgments of similarity/identity can be made essentially error-free in support of obtaining a relatively dense depth model of a natural outdoor scene. We argue for the necessity of simultaneously producing a crude scene-specific semantic "overlay". For our experiments, we designed a wavelet-based stereo matching algorithm and used "classification trees" to create a primitive semantic overlay of the scene. A series of mutually independent filters was designed and implemented based on the study of different error sources. Photometric appearance, camera imaging geometry, and scene constraints are utilized in these filters. When tested on different sets of stereo images, our system has demonstrated above 97% correctness on asserted matches. Finally, we provide a principled basis for relatively dense depth recovery.

15 citations


Proceedings Article
01 Jan 1998
TL;DR: A progressive transmission algorithm with automatic security filtering features for on-line medical image distribution using Daubechies' wavelets has been developed and is discussed in this paper.
Abstract: Because of the high compressibility of medical images, data compression is desirable for digital storage despite the availability of inexpensive hardware for mass storage. A progressive transmission algorithm with automatic security filtering features for on-line medical image distribution using Daubechies' wavelets has been developed and is discussed in this paper. The system is practical for real-world applications, processing and coding each 12-bit image of size 512 × 512 within 2 seconds on a Pentium Pro. Besides its exceptional speed, the security filter has demonstrated remarkable accuracy in detecting sensitive textual information within current or digitized previous medical images. The algorithm runs in linear time.

8 citations


01 Jan 1998
TL;DR: The system is practical for real-world applications, classifying a Web site at a speed of less than 2 minutes each, including the time to compute the feature vector for the images downloaded from the site, on a Pentium Pro PC.
Abstract: This paper describes IBCOW (Image-based Classification of Objectionable Websites), a system capable of classifying a website as objectionable or benign based on image content. The system uses WIPE™ (Wavelet Image Pornography Elimination) and statistics to provide robust classification of on-line objectionable World Wide Web sites. Semantically meaningful feature-vector matching is carried out so that comparisons between a given on-line image and images marked as "objectionable" and "benign" in a training set can be performed efficiently and effectively in the WIPE module. If more than a certain number of images sampled from a site are found to be objectionable, the site is considered objectionable. The statistical analysis for determining the size of the image sample and the threshold number of objectionable images is given in this paper. The system is practical for real-world applications, classifying a Web site in less than 2 minutes, including the time to compute the feature vectors for the images downloaded from the site, on a Pentium Pro PC. Besides its exceptional speed, it has demonstrated 97% sensitivity and 97% specificity in classifying a Web site based solely on images. Both the sensitivity and the specificity in real-world applications are expected to be higher because our performance evaluation is relatively conservative and surrounding text can be used to assist the classification process.

6 citations