
Showing papers on "Edge-preserving smoothing published in 2013"


Journal ArticleDOI
TL;DR: The guided filter is a novel explicit image filter derived from a local linear model that can be used as an edge-preserving smoothing operator like the popular bilateral filter, but it has better behaviors near edges.
Abstract: In this paper, we propose a novel explicit image filter called guided filter. Derived from a local linear model, the guided filter computes the filtering output by considering the content of a guidance image, which can be the input image itself or another different image. The guided filter can be used as an edge-preserving smoothing operator like the popular bilateral filter [1], but it has better behaviors near edges. The guided filter is also a more generic concept beyond smoothing: It can transfer the structures of the guidance image to the filtering output, enabling new filtering applications like dehazing and guided feathering. Moreover, the guided filter naturally has a fast and nonapproximate linear time algorithm, regardless of the kernel size and the intensity range. Currently, it is one of the fastest edge-preserving filters. Experiments show that the guided filter is both effective and efficient in a great variety of computer vision and computer graphics applications, including edge-aware smoothing, detail enhancement, HDR compression, image matting/feathering, dehazing, joint upsampling, etc.

4,730 citations
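The local linear model above lends itself to a compact illustration. The following grayscale sketch (illustrative names and defaults, not the authors' code) computes window means with an integral image, which is why the running time is independent of the kernel radius:

```python
import numpy as np

def box_mean(img, r):
    # Mean over a (2r+1)x(2r+1) window using an integral image,
    # so the cost per pixel does not depend on r.
    h, w = img.shape
    padded = np.pad(img, r, mode='edge')
    s = np.zeros((h + 2 * r + 1, w + 2 * r + 1))
    s[1:, 1:] = padded.cumsum(0).cumsum(1)
    k = 2 * r + 1
    return (s[k:, k:] - s[:-k, k:] - s[k:, :-k] + s[:-k, :-k]) / k ** 2

def guided_filter(I, p, r=2, eps=1e-2):
    # Per-window local linear model q = a*I + b; the fitted coefficients
    # are averaged over all windows covering each pixel.
    mean_I, mean_p = box_mean(I, r), box_mean(p, r)
    var_I = box_mean(I * I, r) - mean_I ** 2
    cov_Ip = box_mean(I * p, r) - mean_I * mean_p
    a = cov_Ip / (var_I + eps)   # ~1 where variance >> eps (edges), ~0 in flat areas
    b = mean_p - a * mean_I
    return box_mean(a, r) * I + box_mean(b, r)
```

For a color guidance image, the scalar variance becomes a 3x3 covariance matrix per window; that extension is omitted here.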


Journal ArticleDOI
TL;DR: The performance of the proposed method is superior to wavelet thresholding, the bilateral filter and the non-local means filter, and superior or comparable to the multi-resolution bilateral filter in terms of method noise, visual quality, PSNR and Image Quality Index.
Abstract: The non-local means filter uses all the possible self-predictions and self-similarities the image can provide to determine the pixel weights for filtering the noisy image, under the assumption that the image contains an extensive amount of self-similarity. As the pixels are highly correlated and the noise is typically independently and identically distributed, averaging these pixels results in noise suppression, thereby yielding a pixel that is similar to its original value. The non-local means filter removes noise and cleans edges without losing too much fine structure and detail. But as the noise increases, the performance of the non-local means filter deteriorates and the denoised image suffers from blurring and loss of image details. This is because the similar local patches used to find the pixel weights contain noisy pixels. In this paper, a blend of the non-local means filter and thresholding of its method noise using wavelets is proposed for better image denoising. The performance of the proposed method is compared with wavelet thresholding, the bilateral filter, the non-local means filter and the multi-resolution bilateral filter. It is found that the performance of the proposed method is superior to wavelet thresholding, the bilateral filter and the non-local means filter, and superior or comparable to the multi-resolution bilateral filter in terms of method noise, visual quality, PSNR and Image Quality Index.

125 citations
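The patch-weighted averaging at the core of the abstract above can be sketched as a naive non-local means pass; the wavelet method-noise thresholding stage the paper adds is omitted, and parameter names and defaults are illustrative:

```python
import numpy as np

def nl_means(img, patch=1, search=3, h=0.1):
    # Each pixel becomes a weighted average over a search window; the weight
    # of a candidate pixel decays with the mean squared difference between
    # the patch around the candidate and the patch around the current pixel.
    H, W = img.shape
    m = patch + search
    pad = np.pad(img, m, mode='reflect')
    out = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            cy, cx = y + m, x + m
            ref = pad[cy - patch:cy + patch + 1, cx - patch:cx + patch + 1]
            wsum = vsum = 0.0
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    qy, qx = cy + dy, cx + dx
                    cand = pad[qy - patch:qy + patch + 1, qx - patch:qx + patch + 1]
                    w = np.exp(-np.mean((ref - cand) ** 2) / h ** 2)
                    wsum += w
                    vsum += w * pad[qy, qx]
            out[y, x] = vsum / wsum
    return out
```

As noise grows, candidate patches themselves become noisy and the weights less reliable, which is exactly the failure mode the paper's method-noise thresholding targets.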


Journal ArticleDOI
TL;DR: The performance of the proposed methods is compared with existing denoising methods; it is inferior to the Bayesian least squares estimate using a Gaussian scale mixture and superior or comparable to wavelet thresholding, the bilateral filter and the multi-resolution bilateral filter.
Abstract: The Gaussian filter is a local, linear filter that smoothes the whole image irrespective of its edges or details, whereas the bilateral filter is local but non-linear and considers both the gray-level similarity and the geometric closeness of neighboring pixels without smoothing edges. The multi-resolution bilateral filter extends the bilateral filter by applying it to the approximation subbands of a decomposed image and after each level of wavelet reconstruction. Applying the bilateral filter to the approximation subband results in the loss of some image details, whereas applying it after each level of wavelet reconstruction flattens the gray levels, resulting in a cartoon-like appearance. To tackle these issues, we propose to use a blend of the Gaussian/bilateral filter and thresholding of its method noise using wavelets. In Gaussian noise scenarios, the performance of the proposed methods is compared with existing denoising methods; it is found to be inferior to the Bayesian least squares estimate using a Gaussian scale mixture and superior or comparable to wavelet thresholding, the bilateral filter, the multi-resolution bilateral filter, NL-means and kernel-based methods. Further, the proposed methods require less computational time than the other methods, except for wavelet thresholding and the bilateral filter.

92 citations
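The bilateral filter these methods build on combines the two similarity measures named in the abstract. A brute-force sketch (illustrative defaults, not the paper's implementation):

```python
import numpy as np

def bilateral(img, r=2, sigma_s=1.5, sigma_r=0.1):
    # Weight = geometric closeness (fixed Gaussian over spatial offsets)
    #        x gray-level similarity (Gaussian over intensity differences).
    H, W = img.shape
    pad = np.pad(img, r, mode='edge')
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    g_s = np.exp(-(yy ** 2 + xx ** 2) / (2 * sigma_s ** 2))
    out = np.empty((H, W))
    for y in range(H):
        for x in range(W):
            win = pad[y:y + 2 * r + 1, x:x + 2 * r + 1]
            g_r = np.exp(-(win - img[y, x]) ** 2 / (2 * sigma_r ** 2))
            w = g_s * g_r
            out[y, x] = (w * win).sum() / w.sum()
    return out
```

With a small `sigma_r`, pixels across a strong edge receive near-zero weight, which is what keeps the edge from being smoothed.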


Journal ArticleDOI
TL;DR: A novel edge-preserving texture suppression filter that exploits the joint bilateral filter as a bridge to achieve both texture smoothing and edge preservation, extended to a variety of image processing applications.
Abstract: Obtaining a texture-smoothing and edge-preserving filtered output is significant for image decomposition. Although edges and textures differ saliently in human vision, automatically distinguishing them is difficult, because they have similar intensity differences and gradient responses. State-of-the-art edge-preserving smoothing (EPS) based decomposition approaches struggle to obtain satisfactory results. We propose a novel edge-preserving texture suppression filter that exploits the joint bilateral filter as a bridge to achieve both texture smoothing and edge preservation. We develop an iterative asymmetric sampling and a local linear model to produce a degenerative image that suppresses the texture, and apply an edge correction operator to achieve edge preservation. An efficient accelerated implementation is introduced to improve the performance of the filtering response. Experiments demonstrate that our filter produces satisfactory outputs with both texture-smoothing and edge-preserving properties when compared with other popular EPS approaches in signal, visual and time analysis. Finally, we extend our filter to a variety of image processing applications.

59 citations


Journal ArticleDOI
TL;DR: A simple explicit image filter which can filter out noise while preserving edges and fine-scale details is derived, which has a fast and exact linear-time algorithm whose computational complexity is independent of the filtering kernel size; thus, it can be applied to real time image processing tasks.
Abstract: In this paper, we propose a novel approach for performing high-quality edge-preserving image filtering. Based on a local linear model and using the principle of Stein's unbiased risk estimate as an estimator for the mean squared error from the noisy image only, we derive a simple explicit image filter which can filter out noise while preserving edges and fine-scale details. Moreover, this filter has a fast and exact linear-time algorithm whose computational complexity is independent of the filtering kernel size; thus, it can be applied to real time image processing tasks. The experimental results demonstrate the effectiveness of the new filter for various computer vision applications, including noise reduction, detail smoothing and enhancement, high dynamic range compression, and flash/no-flash denoising.

53 citations


Proceedings ArticleDOI
26 May 2013
TL;DR: An approach that enhances hazy images using the 'Dark-Channel Prior' method with image refinement by 'Weighted Least Squares' based edge-preserving smoothing, effectively improving contrast, color and detail over the entire image domain.
Abstract: Images captured under hazy conditions have low contrast and poor color. This is primarily due to air-light, which degrades image quality according to the transmission map. The approach we introduce here to enhance these hazy images is based on the 'Dark-Channel Prior' method with image refinement by 'Weighted Least Squares' based edge-preserving smoothing. Local contrast is further enhanced by multi-scale tone manipulation. The proposed method effectively improves the contrast, color and detail over the entire image domain. In experiments, we compare the proposed method with conventional methods to validate its performance.

34 citations


Journal ArticleDOI
TL;DR: This paper integrates Burt-Adelson's Laplacian pyramids with lifting schemes for the construction of slightly redundant decompositions, and discusses several alternatives in the design of non-stationary finite impulse response filters for a stable multiresolution smoothing system.
Abstract: This paper integrates Burt-Adelson's Laplacian pyramids with lifting schemes for the construction of slightly redundant decompositions. These decompositions implement multiscale smoothing on possibly non-equidistant point sets. Thanks to the slight redundancy and to the smoothing operations in the lifting scheme, the proposed construction unifies sparsity of the analysis, smoothness of the reconstruction and stability of the transforms. The decomposition is of linear computational complexity, with just a slightly larger constant than the fast lifted wavelet transform. This paper also discusses several alternatives in the design of non-stationary finite impulse response filters for a stable multiresolution smoothing system. These filters are adapted to each other and to the locations of the observations.

17 citations
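For reference, the classical Burt-Adelson Laplacian pyramid that this paper starts from can be sketched in one dimension; the decomposition is slightly redundant because each detail signal is stored at the resolution of its level's input. The lifting-based construction for non-equidistant point sets is not reproduced here, and the names are illustrative:

```python
import numpy as np

def smooth(x):
    # [1, 2, 1] / 4 lowpass with edge padding.
    xp = np.pad(x, 1, mode='edge')
    return (xp[:-2] + 2 * xp[1:-1] + xp[2:]) / 4

def reduce_(x):
    return smooth(x)[::2]

def expand(x, n):
    # Upsample by zero insertion, then interpolate with the same lowpass.
    up = np.zeros(n)
    up[::2] = x
    return 2 * smooth(up)

def lp_analyze(x, levels=2):
    details = []
    for _ in range(levels):
        c = reduce_(x)
        details.append(x - expand(c, len(x)))  # detail at full length -> redundancy
        x = c
    return x, details

def lp_reconstruct(c, details):
    # Exact by construction: each level adds back precisely what was removed.
    for d in reversed(details):
        c = expand(c, len(d)) + d
    return c
```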


Proceedings ArticleDOI
25 Aug 2013
TL;DR: A new edge-based method, called the edge-ray filter, to detect scene characters, which can accommodate dark-on-bright and bright-on-dark characters simultaneously and provides accurate character segmentation masks.
Abstract: Edges are a type of valuable clue for the scene character detection task. Generally, the existing edge-based methods rely on the assumption of a straight text line to prune away non-character candidates. This paper proposes a new edge-based method, called the edge-ray filter, to detect scene characters. The main contribution of the proposed method lies in filtering out complex backgrounds by fully utilizing the essential spatial layout of edges instead of the assumption of a straight text line. Edges are extracted by a combination of Canny and an Edge Preserving Smoothing Filter (EPSF). To effectively boost the filtering strength of the designed edge-ray filter, we employ a new Edge Quasi-Connectivity Analysis (EQCA) to unify complex edges as well as the contours of broken characters. Label Histogram Analysis (LHA) then filters out non-character edges and redundant rays by setting proper thresholds. Finally, two frequently used heuristic rules, namely aspect ratio and occupation, are exploited to remove distinct false alarms. In addition to handling special scenarios, the proposed method can accommodate dark-on-bright and bright-on-dark characters simultaneously, and provides accurate character segmentation masks. We perform experiments on the benchmark ICDAR 2011 Robust Reading Competition dataset as well as scene images with special scenarios. The experimental results demonstrate the validity of our proposal.

16 citations


Proceedings ArticleDOI
15 Jul 2013
TL;DR: A content adaptive bilateral filter that preserves edges and smoothes flat areas better than the existing bilateral filter and is applied to design a local tone mapping algorithm for high dynamic range images as well as a noise reduction algorithm for low dynamic range color images.
Abstract: The bilateral filter is perhaps the most popular and simplest edge-preserving local filter. However, it may produce halos near some edges due to unwanted smoothing of those edges. In this paper, a content adaptive bilateral filter is proposed to overcome this problem. Both the spatial similarity and range similarity parameters of the proposed bilateral filter adapt to the content of the image to be filtered rather than being fixed as in the existing bilateral filter. The proposed filter preserves edges and smoothes flat areas better than the existing bilateral filter. It is applied to design a local tone mapping algorithm for high dynamic range images as well as a noise reduction algorithm for low dynamic range color images. Experimental results show that the resultant tone mapping algorithm produces images with better visual quality while preventing halo artifacts from appearing in the tone-mapped image. The resultant noise reduction algorithm can also produce images with higher structural similarity index values and better visual quality: noise in smooth regions is reduced more and edges are better preserved in the denoised image.

15 citations


Patent
Jaewon Shin, Brian Schoner
07 Feb 2013
TL;DR: In this patent, edge smoothing block filtering identifies candidate pixels whose surrounding pixels are similar to those surrounding the current pixel, and filters the current pixel based on the cost difference.
Abstract: Aspects of an edge smoothing block filter and combinations of the filter with a motion compensated temporal filter are described. In one embodiment, edge smoothing block filtering includes selecting a current pixel to be filtered and selecting a candidate pixel within a search area about the current pixel. The edge smoothing block filtering generally seeks to identify candidate pixels having surrounding pixels similar to pixels surrounding the current pixel. The edge smoothing block filtering further computes a cost difference between pixels within a candidate pixel neighborhood and pixels within a current pixel neighborhood, and filters the current pixel based on the cost difference. Aspects of the filters and filter element combinations may preserve edges and textures adaptively based on image content. For example, diagonal or curved image edges may be filtered along edges while texture is preserved along the edges.

12 citations


Proceedings ArticleDOI
22 Mar 2013
TL;DR: The main objective of the paper is to portray a clear idea of the Savitzky-Golay filter and to study its design based on the concepts of linear algebra.
Abstract: Smoothing and differentiation are among the most important and necessary steps in signal processing, image processing and analytical chemistry. The search for an efficient image smoothing and edge detection method is a challenging task in the image processing sector. Savitzky-Golay filters are among the most widely used filters in analytical chemistry. Even though they have exceptional features, they are rarely used in the field of image processing. The designed filter is applied to image smoothing, and a mathematical model based on partial derivatives is proposed to extract the edges in images. The smoothing technique of the SG filter offers an extremely simple aid in extracting edge information, which is valuable for classification tasks in the domain of Optical Character Recognition. The paper focuses on designing the Savitzky-Golay filter using the concepts of linear algebra.
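The linear-algebra view of the Savitzky-Golay filter is concrete: fit a low-order polynomial to each window by least squares and read off the fitted value at the window centre. A sketch using the pseudo-inverse (illustrative, not the paper's code):

```python
import numpy as np

def savgol_coeffs(window, order):
    # Least-squares fit of a degree-`order` polynomial over `window` samples;
    # the smoothing weights evaluate the fit at the window centre (x = 0),
    # i.e. they are the first row of the pseudo-inverse of the Vandermonde matrix.
    half = window // 2
    x = np.arange(-half, half + 1)
    A = np.vander(x, order + 1, increasing=True)  # columns 1, x, x^2, ...
    return np.linalg.pinv(A)[0]

def savgol_smooth(y, window=5, order=2):
    c = savgol_coeffs(window, order)
    ypad = np.pad(y, window // 2, mode='edge')
    return np.convolve(ypad, c[::-1], mode='valid')  # correlation with c
```

For a 5-point quadratic fit this recovers the classic coefficients (-3, 12, 17, 12, -3)/35, and polynomials up to the fitted order pass through the filter unchanged (away from the padded borders).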

Proceedings ArticleDOI
01 Sep 2013
TL;DR: This paper aims to find a good trade-off between high smoothing quality and fast processing time for real-time computing for machine vision, entertainment industry with smart TVs or consumer cameras, or surveillance and reconnaissance with different imaging sensors.
Abstract: Image smoothing is widely used for enhancing the quality of single images or videos. There are many application areas, such as machine vision, the entertainment industry with smart TVs or consumer cameras, and surveillance and reconnaissance with different imaging sensors. In many cases it is not easy to find the trade-off between high smoothing quality and fast processing time. However, this is necessary for the mentioned applications as they depend on real-time computing. In this paper, we aim to find a good trade-off. Local texture is analyzed with Local Binary Patterns (LBPs), which are used to adapt the size of a Gaussian smoothing kernel for each pixel. Real-time requirements are met by an implementation on a Graphics Processing Unit (GPU). An image of 512 × 512 pixels is processed in 2.6 ms.
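The abstract does not spell out how LBP codes map to kernel sizes, so the following only sketches a texture measure that could drive such an adaptation: the number of 0/1 transitions around the circular 8-bit pattern is low on flat regions and clean edges and high in texture (all names are hypothetical):

```python
import numpy as np

def lbp(img):
    # 8-neighbour local binary pattern codes for interior pixels,
    # neighbours visited in circular (clockwise) order.
    c = img[1:-1, 1:-1]
    h, w = img.shape
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=int)
    for i, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (nb >= c).astype(int) << i
    return code

def transitions(code):
    # 0/1 changes around the circular 8-bit pattern: 0 on flat regions,
    # 2 at a clean edge, larger values in texture.
    rot = ((code << 1) | (code >> 7)) & 0xFF
    diff = (code ^ rot).astype(np.uint8)
    return np.unpackbits(diff[..., None], axis=-1).sum(-1)
```

A per-pixel kernel size could then shrink as the transition count grows, smoothing flat areas strongly while leaving textured areas largely untouched.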

Journal ArticleDOI
TL;DR: A novel smoothing approach, called the nonstationarity adaptive filtering, which estimates the intensity of a pixel by averaging intensities in its adaptive homogeneous neighborhood, thus leading to homogeneously consistent tensor fields and consequently more coherent fibers.
Abstract: Although promising for studying the microstructure of in vivo tissues, the performance and the potentiality of diffusion tensor magnetic resonance imaging are hampered by the presence of high-level noise in diffusion weighted (DW) images. This paper proposes a novel smoothing approach, called the nonstationarity adaptive filtering, which estimates the intensity of a pixel by averaging intensities in its adaptive homogeneous neighborhood. The latter is determined according to five constraints and spatiodirectional nonstationarity measure maps. The proposed approach is compared with an anisotropic diffusion method used in DW image smoothing. Experimental results on both synthetic and real human DW images show that the proposed method achieves a better compromise between the smoothness of homogeneous regions and the preservation of desirable features such as boundaries, even for highly noisy data, thus leading to homogeneously consistent tensor fields and consequently more coherent fibers.

Proceedings ArticleDOI
TL;DR: A new structure-preserving smoothing method which drives the diffusion by a function of the angle between the two edge directions and the gradient value, enabling the preservation of edges and corners, contrary to other anisotropic diffusion methods.
Abstract: This paper is dedicated to a new anisotropic diffusion approach to image regularization based on a gradient and two diffusion directions obtained from half Gaussian kernels. This approach smoothes an image while preserving edges. From an anisotropic edge detector, built from half Gaussian derivative kernels, we introduce a new structure-preserving smoothing method which drives the diffusion by a function of the angle between the two edge directions and the gradient value. Due to the two diffusion directions used in the control function, our diffusion scheme is able to preserve edges and corners, contrary to other anisotropic diffusion methods. Moreover, the parameters of the Gaussian kernel can be tuned: it can be made sufficiently thin to extract edges precisely, while its length allows detecting contour orientations, which leads to a coherent image regularization. Finally, we present some experimental results and discuss the choice of the different parameters.

Proceedings ArticleDOI
25 Aug 2013
TL;DR: A method which performs an isotropic morphological dilation via implicit smoothing for the purpose of restoring the degraded character shapes in binarized images, by exploiting the idea from geodesic morphology that the binary image and its distance-transformed image are interconvertible.
Abstract: We propose a method which performs an isotropic morphological dilation via implicit smoothing for the purpose of restoring the degraded character shapes in binarized images. Exploiting the idea from geodesic morphology that the binary image and its distance-transformed image are interconvertible, we apply a smoothing method not to the binary image but to the distance-transformed image, and then reconvert it by binarization. This allows us to implicitly apply conventional smoothing methods for continuous-intensity (i.e., gray-scale) images to the discrete-intensity (i.e., binary) image. For instance, by using isotropic diffusion together with geodesic dilation, an isotropic dilation along the stroke direction is obtained and brings better results.

Journal ArticleDOI
TL;DR: A new face relighting method using an illuminance template generated from a single reference portrait, warped according to the shape of the target, which relights the target with the template in CIELAB color space.
Abstract: We propose a new face relighting method using an illuminance template generated from a single reference portrait. First, the reference is warped according to the shape of the target. Second, we employ a new spatially variant edge-preserving smoothing filter to remove the facial identity and texture details of the warped reference, and obtain the illuminance template. Finally, we relight the target with the template in CIELAB color space. Experiments show the effectiveness of our method for both grayscale and color faces taken from different databases, and comparisons with previous works demonstrate that a better relighting effect is produced.

Book ChapterDOI
Wilhelm Burger, Mark J. Burge
01 Jan 2013
TL;DR: The filters described in this chapter are “edge preserving” in the sense that they change their smoothing behavior adaptively depending upon the local image structure, typically characterized by high intensity gradients.
Abstract: Noise reduction in images is a common objective in image processing, not only for producing pleasing results for human viewing but also to facilitate easier extraction of meaningful information in subsequent steps, for example, in segmentation or feature detection. Simple smoothing filters, such as the Gaussian filter and the filters discussed in Chapter 3 of this volume effectively perform low-pass filtering and thus remove high-frequency noise. However, they also tend to suppress high-rate intensity variations that are part of the original signal, thereby destroying image structures that are visually important. The filters described in this chapter are “edge preserving” in the sense that they change their smoothing behavior adaptively depending upon the local image structure. In general, maximum smoothing is performed over “flat” (uniform) image regions, while smoothing is reduced near or across edge-like structures, typically characterized by high intensity gradients.

Proceedings ArticleDOI
01 Dec 2013
TL;DR: A multi-direction based discretization along with a selection strategy for choosing the best direction of possible edge pixels is introduced to improve the denoising capability of nonlinear anisotropic diffusion schemes.
Abstract: Anisotropic diffusion based schemes are widely used in image smoothing and noise removal. Typically, the partial differential equation (PDE) used is based on computing image gradients or isotropically smoothed version of the gradient image. To improve the denoising capability of such nonlinear anisotropic diffusion schemes, we introduce a multi-direction based discretization along with a selection strategy for choosing the best direction of possible edge pixels. This strategy avoids the directionality based bias which can over-smooth features that are not aligned with the coordinate axis. The proposed hybrid discretization scheme helps in preserving multi-scale features present in the images via selective smoothing of the PDE. Experimental results indicate such an adaptive modification provides improved restoration results on noisy images.
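The baseline such schemes improve on is the classic Perona-Malik discretization with four axis-aligned differences; the paper's multi-direction discretization and direction-selection strategy are not reproduced here. A sketch with illustrative parameters:

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    # Classic 4-neighbour Perona-Malik scheme: the conductance g decays with
    # the local difference, so flat regions diffuse while strong edges do not.
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)
    for _ in range(n_iter):
        dn = np.roll(u, 1, 0) - u;  dn[0] = 0      # difference to north neighbour
        ds = np.roll(u, -1, 0) - u; ds[-1] = 0     # south
        de = np.roll(u, -1, 1) - u; de[:, -1] = 0  # east
        dw = np.roll(u, 1, 1) - u;  dw[:, 0] = 0   # west
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

Because the four differences are axis-aligned, diagonal features are treated anisotropically, which is the directionality bias the paper's multi-direction strategy addresses.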

Journal ArticleDOI
30 Nov 2013
TL;DR: A selective smoothing regularization technique developed to address the weakness of smoothing filters with a constant coefficient, which make layer interfaces and fault structures vague because they do not consider information about geologic structures and velocity variations.
Abstract: In general, smoothing filters regularize functions by reducing differences between adjacent values. Smoothing filters can therefore regularize inverse solutions and produce more accurate subsurface structure when applied to full waveform inversion. If we apply a smoothing filter with a constant coefficient to a subsurface image or velocity model, it will make layer interfaces and fault structures vague because it does not consider any information about geologic structures and velocity variations. In this study, we develop a selective smoothing regularization technique, which adapts the smoothing coefficients across inversion iterations, to address this weakness. First, we determine appropriate frequencies and analyze the corresponding wavenumber coverage. Then, we define the effective maximum wavenumber as the 99th percentile of the wavenumber spectrum in order to choose smoothing coefficients which can effectively limit the wavenumber coverage. By adapting the chosen smoothing coefficients across iterations, we can implement multi-scale full waveform inversion while inverting multi-frequency components simultaneously. Through a successful inversion example on a salt model with high-contrast velocity structures, we note that our method effectively regularizes the inverse solution. We also verify that our scheme is applicable to field data through a numerical example on synthetic data containing random noise.

Proceedings ArticleDOI
03 Jul 2013
TL;DR: It is shown how circular integral invariants (II) may be adapted for feature-preserving smoothing to facilitate segmentation and it is shown that it leads to considerably less feature deterioration than Gaussian blurring and it improves segmentation of regions of interest as compared to anisotropic diffusion.
Abstract: Medical images pose a major challenge for image analysis: often they have poor signal-to-noise ratios, necessitating smoothing; yet such smoothing needs to preserve the boundaries of regions of interest and small features such as mammogram microcalcifications. We show how circular integral invariants (II) may be adapted for feature-preserving smoothing to facilitate segmentation. Though II is isotropic, we show that it leads to considerably less feature deterioration than Gaussian blurring and improves segmentation of regions of interest compared to anisotropic diffusion, particularly for hierarchical contour-based segmentation methods.

Proceedings Article
01 Jan 2013
TL;DR: This paper proposes a novel edge-preserving texture filter, which smudges the color values inside uniformly textured areas, thus making the processed image more workable for color-based image segmentation.
Abstract: Presented at the 8th International Conference on Computer Vision Theory and Applications, held in Barcelona, 21-24 February 2013.

Journal ArticleDOI
TL;DR: Evaluation results for the digital smoothing polynomial filter showed its efficiency in smoothing and echo-signal reproduction, and the proposed cascade filter gave considerable advantages.
Abstract: The digital smoothing polynomial filter, or Savitzky-Golay filter, which is assumed to be appropriate for ultrasonic IRIS applications due to its ability to balance a high smoothing level with peak retention, fits a set of data points to a polynomial by applying least-squares fitting. In this study, the performance of the digital smoothing polynomial filter is evaluated in terms of signal-to-noise ratio and interface-echo preservation. Moreover, a smoothing polynomial filter that works in a cascade setup is proposed to take advantage of the balancing ability of the standard filter. The theory and experiments of the polynomial smoothing filter are presented, and graphical results are given. The evaluation results showed its efficiency in smoothing and echo-signal reproduction, and the proposed cascade filter gave considerable advantages.

Proceedings ArticleDOI
01 Sep 2013
TL;DR: Experimental results for real images show that the proposed patch-based bilateral filter outperforms the bilateral filter in all cases, is comparable to the nonlocal-(NL-)means filter, and outperforms the more elaborate variants of the NL-means filter in some cases.
Abstract: The bilateral filter has a weakness in its denoising ability, because the intensity similarity is evaluated by comparing just the intensity values of a given pixel and another pixel. To alleviate this problem, this paper proposes a patch-based bilateral (PBL) filter in which the intensity similarity is evaluated by comparing a whole patch around each pixel. Experimental results for real images show that this method outperforms the bilateral filter in all cases, is comparable to the nonlocal-(NL-)means filter, and outperforms the more elaborate variants of the NL-means filter in some cases.

Journal ArticleDOI
TL;DR: The structural-context-preserving image coarsening capability of the proposed method is verified by experimental results and examples, which also show the new method's characteristics in practical application to decomposition-based image enhancement.

Book ChapterDOI
13 Dec 2013
TL;DR: Experimental results prove that the proposed content-adaptive L0 smoothing method achieves better results than the existing L0 smoothing method.
Abstract: Edge-preserving smoothing is a technique to decompose an image into two layers: one smoothing layer and one detail layer. It is an important image editing tool. The edges are preserved in the smoothing layer and details are decomposed into the detail layer. In this paper, we propose a content-adaptive L0 smoothing method. Unlike common smoothing schemes, we use a perception-based content-adaptive weighted fidelity term. The algorithm gives a larger weight to regions with more information, which are most likely edges, and a smaller weight to regions with less information, which are most likely flat areas. The resulting smoothed image can therefore preserve more edges while smoothing flat areas better. Experimental results prove that the proposed method achieves better results than the existing L0 smoothing method.
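The underlying L0 gradient minimization can be sketched in one dimension with the usual half-quadratic splitting, solving the quadratic subproblem in the Fourier domain on a circular signal. This is the uniform-weight baseline, not the paper's content-adaptive weighted fidelity term, and names and defaults are illustrative:

```python
import numpy as np

def l0_smooth_1d(g, lam=0.02, beta0=0.04, beta_max=1e5, kappa=2.0):
    # min_f |f - g|^2 + lam * ||grad f||_0 on a circular 1-D signal, by
    # alternating a hard-threshold step on an auxiliary gradient h with a
    # quadratic subproblem solved exactly in the Fourier domain.
    n = len(g)
    f = np.asarray(g, dtype=float).copy()
    otf = np.exp(2j * np.pi * np.arange(n) / n) - 1.0  # DFT of the forward difference
    Fg = np.fft.fft(f)
    beta = beta0
    while beta < beta_max:
        grad = np.roll(f, -1) - f
        h = np.where(grad ** 2 > lam / beta, grad, 0.0)  # L0 "shrinkage": keep or kill
        Ff = (Fg + beta * np.conj(otf) * np.fft.fft(h)) / (1.0 + beta * np.abs(otf) ** 2)
        f = np.real(np.fft.ifft(Ff))
        beta *= kappa                                    # gradually enforce h = grad f
    return f
```

The content-adaptive variant would replace the uniform data term with per-sample weights derived from local information content.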

Proceedings ArticleDOI
13 May 2013
TL;DR: An adaptive difference of Gaussian (DoG) filter with a window size that varies with the edge strengths in the image, with advantages over other similar techniques such as the simple Gaussian filter and the DoG filter, and comparable to the bilateral filter in terms of edge enhancement.
Abstract: We present a physiologically inspired adaptive algorithm for noise removal in an image while preserving a significant amount of edge detail. The algorithm is motivated by the classical lateral-inhibition-based receptive field in the visual system as well as the holistic approach of the well-known bilateral filter. We propose an adaptive difference of Gaussian (DoG) filter whose window size varies with the edge strengths in the image. Our algorithm has advantages over other similar techniques such as the simple Gaussian filter and the DoG filter, and is comparable to the bilateral filter in terms of edge enhancement. Furthermore, the time complexity of our algorithm is much lower than that of the bilateral filter.
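The fixed-window difference-of-Gaussians kernel at the heart of this method is easy to sketch; the paper's edge-strength-dependent window adaptation is omitted, and parameter names are illustrative:

```python
import numpy as np

def gauss1d(sigma, radius):
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def dog_kernel(sigma_c=1.0, sigma_s=1.6, radius=4):
    # Centre minus surround: zero net weight, so flat regions give no
    # response while intensity steps do (lateral-inhibition-style weighting).
    return gauss1d(sigma_c, radius) - gauss1d(sigma_s, radius)
```

Because both Gaussians are normalized, the kernel sums to zero, which is what makes it respond to edges rather than to absolute intensity.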

Proceedings ArticleDOI
01 Jul 2013
TL;DR: A novel explicit image filter called the guided depth image filter for natural image matting, used as an edge-preserving smoothing operator like the bilateral filter but with better behavior near edges.
Abstract: In this paper we propose a novel explicit image filter called the guided depth image filter for natural image matting. Different from the traditional matting model, the guided image filter computes the filtering output by considering the content of a depth image. The guided depth image filter can be used as an edge-preserving smoothing operator like the bilateral filter, but behaves better near edges. The proposed filter uses nonlocal neighborhoods, and we contribute a simple and fast algorithm giving competitive results. Experimental results indicate that our matting results are comparable to state-of-the-art methods.

Book ChapterDOI
13 Dec 2013
TL;DR: This work proposes an extrema interpolation formulation for edge-aware smoothing, in which the extrema of the sought signal are fixed and the final smoothing result is the one that has the least distance from the original signal.
Abstract: The ability of blurring details while preserving salient edges is valuable for many image processing applications. We propose an extrema interpolation formulation for edge-aware smoothing, in which the extrema of the sought signal are fixed. A set of extrema cannot determine a unique signal, so the final smoothing result is the one that has the least distance with the original signal. Our method can follow significant edges more closely, as can be understood intuitively from the standpoint of monotonic approximation. Experimental results on one- and two-dimensional signals validate the distinctive effects of our method. We also demonstrate the effectiveness of our method for image detail enhancement and image abstraction.

Proceedings ArticleDOI
Hua Zhong, Lu Lu, Jingjing Zhang, Shuang Wang, Xiaojing Hou
21 Jul 2013
TL;DR: A novel nonlocal Lee (NL-Lee) filter based on both structure similarity and homogeneity similarity, namely hybrid patch similarity, which can effectively enhance the patch regularity assumption and leads to state-of-the-art results.
Abstract: This paper presents a novel nonlocal Lee (NL-Lee) filter based on both structure similarity and homogeneity similarity, namely hybrid patch similarity, which can effectively enhance the patch regularity assumption. The proposed NL-Lee filter shares the framework of the NLM filter but combines the traditional Lee filter in a distributive way. Structure similarity from the NLM filter and homogeneity similarity from the Lee filter are well balanced in the new filter. As a result, the new NL-Lee filter obtains good trade-off between speckle smoothing and detail preservation and leads to state-of-the-art results.

Proceedings ArticleDOI
10 Jun 2013
TL;DR: A dip-oriented Gaussian filter, with its smoothing axes oriented along the local dip directions, used as a regularization operator so that reflection tomography produces models that conform to the reflector structures.
Abstract: Reflection tomography is non-unique, and a regularization term is usually added to its objective function as an additional constraint. The anisotropic Gaussian filter has been successfully employed as such a regularization operator. By orienting the smoothing axes along the local dip directions, the new Gaussian filter helps reflection tomography produce models that conform to the reflector structures. For efficiency, the 3D Gaussian filter is factorized into three 1D filters in a non-orthogonal coordinate system. The dip-oriented Gaussian filter also provides the freedom to apply spatially variable smoothing.
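The factorization into 1D passes can be sketched for the axis-aligned case; the paper additionally rotates the axes into a non-orthogonal, dip-oriented frame and varies the smoothing spatially, both of which are omitted here (names and defaults are illustrative):

```python
import numpy as np

def gauss1d(sigma, radius):
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def smooth3d(vol, sigmas=(2.0, 2.0, 1.0), radius=6):
    # Three 1-D passes, one per axis: O(radius) work per voxel per axis
    # instead of O(radius^3) for a direct 3-D convolution.
    out = np.asarray(vol, dtype=float)
    for axis, sigma in enumerate(sigmas):
        k = gauss1d(sigma, radius)
        out = np.apply_along_axis(
            lambda v: np.convolve(np.pad(v, radius, mode='edge'), k, mode='valid'),
            axis, out)
    return out
```

Separability holds because the 3D Gaussian is a product of 1D Gaussians; in the dip-oriented case the same trick applies after a change of coordinates.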