Topic

Bilateral filter

About: Bilateral filter is a research topic. Over the lifetime, 3500 publications on this topic have been published, receiving 75582 citations.
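For orientation, the bilateral filter smooths an image with a weighted average over each pixel's neighbourhood, where the weight combines a spatial Gaussian (closeness) with a range Gaussian (intensity similarity), so that edges are preserved. Below is a minimal brute-force sketch in NumPy; the window radius and sigma values are illustrative choices, not taken from any of the papers listed here.

```python
import numpy as np

def bilateral_filter(img, radius=3, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter for a 2-D float image scaled to [0, 1]."""
    h, w = img.shape
    padded = np.pad(img, radius, mode="reflect")
    out = np.zeros_like(img)
    # Spatial (domain) Gaussian weights, shared by every pixel.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    for y in range(h):
        for x in range(w):
            window = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            # Range kernel: down-weight neighbours whose intensity differs
            # from the centre pixel.
            rng = np.exp(-((window - img[y, x]) ** 2) / (2 * sigma_r**2))
            weights = spatial * rng
            out[y, x] = np.sum(weights * window) / np.sum(weights)
    return out
```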


Papers
Journal ArticleDOI
TL;DR: A comprehensive comparative study of image denoising techniques is presented, covering the Anisotropic Diffusion filter, the Wiener filter, TV (Total Variation), NLM (Non-Local Means), and Bilateral filtering.

20 citations
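For readers wanting to set up a similar comparison, the sketch below runs four of the five families of methods through off-the-shelf implementations (OpenCV, SciPy, scikit-image) and scores them by PSNR; anisotropic diffusion is omitted because it lives in the opencv-contrib module. The file names and parameter values are placeholders, not the paper's settings.

```python
import cv2
import numpy as np
from scipy.signal import wiener
from skimage.restoration import denoise_tv_chambolle
from skimage.metrics import peak_signal_noise_ratio

# Placeholder inputs: an 8-bit grayscale noisy image and its clean reference.
noisy = cv2.imread("noisy.png", cv2.IMREAD_GRAYSCALE)
clean = cv2.imread("clean.png", cv2.IMREAD_GRAYSCALE)

denoised = {
    "bilateral": cv2.bilateralFilter(noisy, d=9, sigmaColor=50, sigmaSpace=5),
    "nlm": cv2.fastNlMeansDenoising(noisy, None, h=10),
    "wiener": wiener(noisy.astype(np.float64), mysize=5),
    "tv": denoise_tv_chambolle(noisy / 255.0, weight=0.1) * 255.0,
}

for name, result in denoised.items():
    result = np.clip(result, 0, 255).astype(np.uint8)
    score = peak_signal_noise_ratio(clean, result)
    print(f"{name}: PSNR = {score:.2f} dB")
```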

Patent
10 Mar 2003
TL;DR: In this article, a method was proposed for encoding pixels of digital or digitized images, i.e. images consisting of a set of image dots, named pixels in 2D images and voxels in 3D images.
Abstract: A method for encoding pixels of digital or digitized images, i.e. images consisting of a set of image dots, named pixels in two-dimensional images and voxels in three-dimensional images, each of said pixels or voxels being represented by a set of values which correspond to a visual aspect of the pixel on a display screen or in a printed image. According to the invention, the pixels or voxels (5, 14) of at least one portion of interest of the digital or digitized image, or each pixel or voxel (5, 14) of the set of pixels or voxels which form the image, is uniquely identified with a vector whose components are given by the data of the pixels or voxels to be encoded (5, 14) and by the data of at least one, some, or all of the pixels (1, 2, 3, 4, 6, 7, 8, 9; 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27) around the pixels to be encoded and arranged within a predetermined subset of pixels or voxels included in the whole set of pixels or voxels which form the image.

20 citations
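The core of the claim above is that every pixel (or voxel) is identified with a vector assembled from its own value and the values of a predetermined surrounding window, such as the 3x3 neighbourhood numbered 1-9 in the 2-D case. A minimal illustration of that vectorisation step (not the patented encoder itself) might look as follows.

```python
import numpy as np

def neighborhood_vectors(img, radius=1):
    """Map each pixel of a 2-D image to the vector of values in its
    (2*radius + 1)^2 window, centre pixel included (toy illustration)."""
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    h, w = img.shape
    vectors = np.empty((h, w, k * k), dtype=img.dtype)
    for dy in range(k):
        for dx in range(k):
            # Each channel of `vectors` holds the image shifted by one offset.
            vectors[:, :, dy * k + dx] = padded[dy:dy + h, dx:dx + w]
    return vectors  # shape (H, W, 9) when radius=1
```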

Proceedings ArticleDOI
TL;DR: The modified McKinnon-Bates (MKB) algorithm described in this paper was implemented on a graphical processing unit (GPU) to maximize efficiency, and the results showed that a nearly 4x improvement in SNR was obtained compared to the conventional FDK phase-correlated reconstruction.
Abstract: A challenge in using on-board cone beam computed tomography (CBCT) to image lung tumor motion prior to radiation therapy treatment is acquiring and reconstructing high quality 4D images in a sufficiently short time for practical use. For the 1 minute rotation times typical of Linacs, severe view aliasing artifacts, including streaks, are created if a conventional phase-correlated FDK reconstruction is performed. The McKinnon-Bates (MKB) algorithm provides an efficient means of reducing streaks from static tissue but can suffer from low SNR and other artifacts due to data truncation and noise. We have added truncation correction and bilateral nonlinear filtering to the MKB algorithm to reduce streaking and improve image quality. The modified MKB algorithm was implemented on a graphical processing unit (GPU) to maximize efficiency. Results show that a nearly 4x improvement in SNR is obtained compared to the conventional FDK phase-correlated reconstruction and that high quality 4D images with 0.4 second temporal resolution and 1 mm³ isotropic spatial resolution can be reconstructed in less than 20 seconds after data acquisition completes.

20 citations
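The reconstruction pipeline itself (phase binning, the McKinnon-Bates prior and update, truncation correction, GPU kernels) cannot be reproduced from this summary; the sketch below only illustrates what the added bilateral filtering step might look like when applied slice-by-slice to an intermediate volume, with made-up parameter values.

```python
import cv2
import numpy as np

def bilateral_filter_volume(volume, d=5, sigma_color=0.05, sigma_space=2.0):
    """Edge-preserving smoothing of a float32 volume, one axial slice at a time."""
    out = np.empty_like(volume)
    for z in range(volume.shape[0]):
        # OpenCV's bilateral filter accepts single-channel float32 slices.
        out[z] = cv2.bilateralFilter(volume[z], d, sigma_color, sigma_space)
    return out
```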

Journal ArticleDOI
TL;DR: In this paper, a vectorization pattern with kernel subsampling was proposed for general finite impulse response image filtering, which was shown to be effective for various filters, such as Gaussian range filtering, bilateral filtering, adaptive Gaussian filtering, randomly-kernel-subsampled Gaussian range filtering, and randomly-kernel-subsampled bilateral filtering.
Abstract: This study examines vectorized programming for finite impulse response image filtering. Finite impulse response image filtering occupies a fundamental place in image processing and has several approximated acceleration algorithms. However, no sophisticated acceleration method exists for parameter-adaptive filters or other complex filters; for these cases, simple subsampling with code optimization is the only practical solution. Although Moore's law still holds, increases in central processing unit frequency have stopped, and exploiting ever more transistors has become difficult due to power and thermal constraints. Most central processing units therefore have multi-core architectures, complicated cache memories, and short vector processing units, which has complicated vectorized programming. We therefore first organize vectorization patterns that draw out the computing performance of central processing units by revisiting general finite impulse response filtering. Furthermore, we propose a new vectorization pattern, which we term loop vectorization. Moreover, these vectorization patterns mesh well with kernel subsampling as an acceleration method for general finite impulse response filters. Experimental results reveal that the vectorization patterns are appropriate for general finite impulse response filtering, and that the new vectorization pattern with kernel subsampling is effective for various filters, including Gaussian range filtering, bilateral filtering, adaptive Gaussian filtering, randomly-kernel-subsampled Gaussian range filtering, randomly-kernel-subsampled bilateral filtering, and randomly-kernel-subsampled adaptive Gaussian filtering.

20 citations
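As a rough illustration of the kernel-subsampling idea (not the paper's vectorised implementation), the sketch below evaluates a bilateral filter using only a random subset of the kernel taps, shared across all pixels; the window radius, sigmas, and sampling ratio are arbitrary.

```python
import numpy as np

def subsampled_bilateral(img, radius=5, sigma_s=3.0, sigma_r=0.1, ratio=0.25, seed=0):
    """Bilateral filter that evaluates only a random fraction of its kernel taps."""
    rng = np.random.default_rng(seed)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    offsets = np.stack([ys.ravel(), xs.ravel()], axis=1)
    keep = rng.random(len(offsets)) < ratio
    keep[len(keep) // 2] = True          # always keep the centre tap
    offsets = offsets[keep]
    spatial = np.exp(-(offsets[:, 0] ** 2 + offsets[:, 1] ** 2) / (2 * sigma_s**2))

    padded = np.pad(img, radius, mode="reflect")
    h, w = img.shape
    num = np.zeros_like(img)
    den = np.zeros_like(img)
    for (dy, dx), w_s in zip(offsets, spatial):
        # Image shifted by the sampled offset, aligned with the original grid.
        shifted = padded[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
        w_r = np.exp(-((shifted - img) ** 2) / (2 * sigma_r**2))
        num += w_s * w_r * shifted
        den += w_s * w_r
    return num / den
```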

Patent
Peter Amon1, Jürgen Pandel1
14 Aug 2009
TL;DR: In this article, a valuation parameter is computed for each trajectory, in inverse proportion to the deviations among the uncoded pixel values associated with the other pixels along that trajectory, and the predicted value of the pixel to be encoded is determined from the other pixels of the trajectory with the highest valuation.
Abstract: Pixels with associated pixel values form a sequence of digitized images that are encoded by predicting pixels of the images and encoding the prediction errors. At least a part of the pixels are encoded by determining trajectories, each running through a pixel to be encoded and through other, previously encoded pixels from the image of the pixel being encoded and/or from one or more images that are temporally proximate to that image. For each of the determined trajectories, a valuation parameter is computed in inverse proportion to the deviations in the uncoded pixel values associated with the other pixels along the trajectory. A predicted pixel value of the pixel to be encoded is then determined based on the other pixels of the trajectory with the highest valuation.

20 citations
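A toy rendering of the selection rule described above: each candidate trajectory is a list of pixel values already available to the decoder, its valuation is taken as inversely proportional to the spread of those values, and the prediction comes from the best-valued trajectory. How trajectories are traced through temporally neighbouring images is not modelled here.

```python
import numpy as np

def predict_from_trajectories(trajectories):
    """Pick the trajectory whose pixel values deviate least and use it to
    predict the current pixel (toy illustration of the selection rule)."""
    best_valuation, best_prediction = -1.0, None
    for values in trajectories:                  # values along one candidate trajectory
        values = np.asarray(values, dtype=np.float64)
        valuation = 1.0 / (1.0 + values.std())   # inverse proportion to the deviation
        if valuation > best_valuation:
            best_valuation = valuation
            best_prediction = values.mean()      # simple predictor from that trajectory
    return best_prediction

# Example: three candidate trajectories through previously encoded pixels.
print(predict_from_trajectories([[120, 122, 119], [80, 140, 60], [100, 101, 99]]))
# -> 100.0 (the third trajectory has the smallest deviation)
```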


Network Information
Related Topics (5)
Image processing: 229.9K papers, 3.5M citations, 87% related
Image segmentation: 79.6K papers, 1.8M citations, 87% related
Feature (computer vision): 128.2K papers, 1.7M citations, 86% related
Feature extraction: 111.8K papers, 2.1M citations, 86% related
Pixel: 136.5K papers, 1.5M citations, 84% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    21
2022    57
2021    116
2020    145
2019    203
2018    204