Author

John Todd

Bio: John Todd is an academic researcher. The author has an h-index of 1 and has co-authored 1 publication, receiving 1,855 citations.

Papers

Cited by
Journal ArticleDOI
TL;DR: The proposed scheme can remove salt-and-pepper noise at levels as high as 90%, and the restored images show a significant improvement over those produced by nonlinear filters or regularization methods alone.
Abstract: This paper proposes a two-phase scheme for removing salt-and-pepper impulse noise. In the first phase, an adaptive median filter is used to identify pixels which are likely to be contaminated by noise (noise candidates). In the second phase, the image is restored using a specialized regularization method that applies only to those selected noise candidates. In terms of edge preservation and noise suppression, our restored images show a significant improvement compared to those restored by nonlinear filters or regularization methods alone. Our scheme can remove salt-and-pepper noise with a noise level as high as 90%.

1,078 citations
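The two-phase idea above can be sketched in a few lines. This is a deliberately simplified stand-in: noise candidates are detected with a plain extreme-value test rather than the paper's adaptive median filter, and the restoration uses a local median in place of the edge-preserving regularization.

```python
import numpy as np

def local_median(img, i, j, r=1):
    """Median of the (2r+1)x(2r+1) window around (i, j), clipped at borders."""
    h, w = img.shape
    win = img[max(i - r, 0):min(i + r + 1, h), max(j - r, 0):min(j + r + 1, w)]
    return np.median(win)

def remove_salt_and_pepper(img, low=0, high=255):
    """Phase 1: flag extreme-valued pixels as noise candidates.
    Phase 2: restore only the candidates (here with a local median;
    the paper applies an edge-preserving regularization instead)."""
    out = img.astype(float).copy()
    candidates = (img == low) | (img == high)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            if candidates[i, j]:
                out[i, j] = local_median(img, i, j)
    return out
```

Because only candidate pixels are touched, uncorrupted pixels pass through unchanged, which is the key to the scheme's edge preservation.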

Journal ArticleDOI
24 Apr 2014-Cell
TL;DR: CUBIC enables time-course expression profiling of whole adult brains with single-cell resolution; the authors also develop a whole-brain cell-nuclear counterstaining protocol and a computational image-analysis pipeline that enable the visualization and quantification of neural activities induced by environmental stimulation.

1,070 citations

Journal ArticleDOI
TL;DR: A unified theory of neighborhood filters and reliable criteria for comparing them to other filter classes are presented; it is demonstrated that computing trajectories and restricting the neighborhood to them is harmful for denoising purposes, and that space-time NL-means preserves more movie detail.
Abstract: Neighborhood filters are nonlocal image and movie filters which reduce the noise by averaging similar pixels. The first object of the paper is to present a unified theory of these filters and reliable criteria to compare them to other filter classes. A CCD noise model will be presented justifying the involvement of neighborhood filters. A classification of neighborhood filters will be proposed, including classical image and movie denoising methods and discussing further a recently introduced neighborhood filter, NL-means. In order to compare denoising methods, three principles will be discussed. The first principle, "method noise", specifies that only noise must be removed from an image. A second principle will be introduced, "noise to noise", according to which a denoising method must transform a white noise into a white noise. Contrary to "method noise", this principle, which characterizes artifact-free methods, eliminates any subjectivity and can be checked by mathematical arguments and Fourier analysis. "Noise to noise" will be proven to rule out most denoising methods, with the exception of neighborhood filters. This is why a third and new comparison principle, "statistical optimality", is needed and will be introduced to compare the performance of all neighborhood filters. The three principles will be applied to compare ten different image and movie denoising methods. It will first be shown that only wavelet thresholding methods and NL-means give an acceptable method noise. Second, that neighborhood filters are the only ones to satisfy the "noise to noise" principle. Third, that among them NL-means is closest to statistical optimality. Particular attention will be paid to the application of the statistical optimality criterion for movie denoising methods. It will be pointed out that current movie denoising methods are motion-compensated neighborhood filters. This amounts to saying that they are neighborhood filters and that the ideal neighborhood of a pixel is its trajectory. Unfortunately, the aperture problem makes it impossible to estimate ground-truth trajectories. It will be demonstrated that computing trajectories and restricting the neighborhood to them is harmful for denoising purposes and that space-time NL-means preserves more movie details.

763 citations
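NL-means, the filter the abstract singles out, averages pixels whose surrounding patches look alike rather than pixels that are merely nearby. A minimal sketch follows; the patch radius, search-window radius, and filtering parameter h are illustrative choices, not values from the paper.

```python
import numpy as np

def nl_means(img, patch=1, search=2, h=10.0):
    """Minimal NL-means: each pixel becomes a weighted average of pixels
    in a search window, weighted by the similarity of the small patches
    around them (exponential of the negative squared patch distance)."""
    pad = patch + search
    padded = np.pad(img.astype(float), pad, mode='reflect')
    H, W = img.shape
    out = np.zeros((H, W), dtype=float)
    for i in range(H):
        for j in range(W):
            pi, pj = i + pad, j + pad
            ref = padded[pi - patch:pi + patch + 1, pj - patch:pj + patch + 1]
            weights, values = [], []
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    qi, qj = pi + di, pj + dj
                    cand = padded[qi - patch:qi + patch + 1,
                                  qj - patch:qj + patch + 1]
                    d2 = np.mean((ref - cand) ** 2)   # patch dissimilarity
                    weights.append(np.exp(-d2 / (h * h)))
                    values.append(padded[qi, qj])
            w = np.array(weights)
            out[i, j] = np.dot(w, values) / w.sum()
    return out
```

The "method noise" principle from the abstract can be checked directly with such an implementation: the difference `img - nl_means(img)` should look like noise, not like image structure.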

Journal ArticleDOI
TL;DR: This paper proposes a new method, known as brightness preserving dynamic histogram equalization (BPDHE), an extension to HE that produces an output image whose mean intensity is almost equal to the mean intensity of the input, thus fulfilling the requirement of maintaining the mean brightness of the image.
Abstract: Histogram equalization (HE) is one of the common methods used for improving contrast in digital images. However, this technique is not well suited to implementation in consumer electronics, such as television, because it tends to introduce unnecessary visual deterioration such as the saturation effect. One solution to this weakness is to preserve the mean brightness of the input image in the output image. This paper proposes a new method, known as brightness preserving dynamic histogram equalization (BPDHE), which is an extension to HE that can produce an output image with a mean intensity almost equal to the mean intensity of the input, thus fulfilling the requirement of maintaining the mean brightness of the image. First, the method smoothes the input histogram with a one-dimensional Gaussian filter, and then partitions the smoothed histogram based on its local maxima. Next, each partition is assigned a new dynamic range. After that, the histogram equalization process is applied independently to these partitions, based on this new dynamic range. Naturally, the changes in dynamic range, and the histogram equalization process itself, will alter the mean brightness of the image. Therefore, the last step in this method is to normalize the output image to the input mean brightness. Our results from 80 test images show that this method outperforms other existing mean brightness preserving histogram equalization methods. In most cases, BPDHE successfully enhances the image without severe side effects, and at the same time maintains the mean input brightness.

739 citations
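The step-by-step procedure in the abstract can be sketched as follows. This is a simplified reading, not the authors' exact algorithm: the histogram is split at the valleys between its local maxima, each partition keeps its own grey-level range as the "new dynamic range", and the brightness normalization is a plain rescaling to the input mean.

```python
import numpy as np

def bpdhe(img, sigma=5):
    """Simplified BPDHE sketch.
    1. Smooth the grey-level histogram with a 1-D Gaussian.
    2. Partition the smoothed histogram at its local minima
       (the valleys between the local maxima).
    3. Apply histogram equalization independently inside each
       partition's grey-level range.
    4. Rescale the result so its mean matches the input mean."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    # step 1: one-dimensional Gaussian smoothing of the histogram
    x = np.arange(-3 * sigma, 3 * sigma + 1)
    g = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    smooth = np.convolve(hist, g / g.sum(), mode='same')
    # step 2: partition boundaries at local minima of the smoothed histogram
    minima = [k for k in range(1, 255)
              if smooth[k] <= smooth[k - 1] and smooth[k] <= smooth[k + 1]]
    bounds = [0] + minima + [256]
    mapping = np.arange(256, dtype=float)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        seg = hist[lo:hi]
        total = seg.sum()
        if total == 0:
            continue
        cdf = np.cumsum(seg) / total
        # step 3: equalization restricted to this partition's range
        mapping[lo:hi] = lo + cdf * (hi - 1 - lo)
    out = mapping[img]
    # step 4: normalize the output mean to the input mean brightness
    out *= img.mean() / max(out.mean(), 1e-9)
    return np.clip(out, 0, 255).astype(np.uint8)
```

The final rescaling is what distinguishes BPDHE from plain HE: contrast is stretched per partition, but the overall brightness the viewer perceives stays close to the original.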

Book ChapterDOI
01 Nov 2008
TL;DR: Content-based image retrieval (CBIR) has emerged as a promising means of retrieving images and browsing large image databases; it is the process of retrieving images from a collection based on automatically extracted features.
Abstract: "A picture is worth one thousand words". This proverb is attributed to Confucius, a Chinese philosopher who lived about 2,500 years ago. Now, the essence of these words is universally understood. A picture can be magical in its ability to quickly communicate a complex story or a set of ideas that can be recalled by the viewer later in time. Visual information plays an important role in our society, it will play an increasingly pervasive role in our lives, and there will be a growing need to have these sources processed further. Pictures and images are used in many application areas like architectural and engineering design, fashion, journalism, advertising, entertainment, etc. This provides the opportunity for us to use the abundance of images. However, that knowledge will be useless if one can't find it. In the face of the large and rapidly increasing number of images, how to search for and retrieve the images we are interested in, with ease, is a critical problem: it creates a need for image retrieval systems. As we know, visual features of images provide a description of their content. Content-based image retrieval (CBIR) has emerged as a promising means of retrieving images and browsing large image databases. CBIR has been a topic of intensive research in recent years. It is the process of retrieving images from a collection based on automatically extracted features.

727 citations
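The closing definition, retrieval based on automatically extracted features, can be illustrated with the simplest possible feature: a grey-level histogram compared by L1 distance. Real CBIR systems use much richer colour, texture, and shape descriptors; the function names here are illustrative.

```python
import numpy as np

def grey_histogram(img, bins=16):
    """Automatically extracted feature: a normalized grey-level histogram."""
    h, _ = np.histogram(img, bins=bins, range=(0, 256))
    return h / h.sum()

def retrieve(query, collection, k=2):
    """Rank the collection by L1 histogram distance to the query and
    return the indices of the k closest images."""
    qf = grey_histogram(query)
    dists = [np.abs(qf - grey_histogram(img)).sum() for img in collection]
    return list(np.argsort(dists)[:k])
```

The pattern, extract a feature vector per image offline, then rank by a distance in feature space at query time, is the core of every CBIR system, whatever the features.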