Book

Digital Image Restoration

16 Nov 2012
TL;DR: The article introduces digital image restoration to the reader who is just beginning in this field, and provides a review and analysis for the readers who may already be well-versed in image restoration.
Abstract: The article introduces digital image restoration to the reader who is just beginning in this field, and provides a review and analysis for the reader who may already be well-versed in image restoration. The perspective on the topic is one that comes primarily from work done in the field of signal processing. Thus, many of the techniques and works cited relate to classical signal processing approaches to estimation theory, filtering, and numerical analysis. In particular, the emphasis is placed primarily on digital image restoration algorithms that grow out of an area known as "regularized least squares" methods. It should be noted, however, that digital image restoration is a very broad field, as we discuss, and thus contains many other successful approaches that have been developed from different perspectives, such as optics, astronomy, and medical imaging, just to name a few. In the process of reviewing this topic, we address a number of very important issues in this field that are not typically discussed in the technical literature.
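The "regularized least squares" methods the abstract emphasizes balance a data-fidelity term against a penalty on the estimate. As a minimal illustrative sketch (not the book's own algorithm), 1-D Tikhonov-regularized deblurring with an identity regularizer and periodic boundaries has a closed-form FFT solution:

```python
import numpy as np

def tikhonov_deblur(y, psf, lam=1e-2):
    """Regularized least squares deblurring in one pass of FFTs:
    argmin_x ||h * x - y||^2 + lam * ||x||^2  (periodic boundaries)."""
    H = np.fft.fft(psf, y.size)                      # eigenvalues of the blur
    X = np.conj(H) * np.fft.fft(y) / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft(X))

x_true = np.sin(np.linspace(0, 6 * np.pi, 128))      # smooth test signal
psf = np.array([0.25, 0.5, 0.25])                    # simple blur kernel
y = np.real(np.fft.ifft(np.fft.fft(psf, 128) * np.fft.fft(x_true)))
x_hat = tikhonov_deblur(y, psf)
print(float(np.mean(np.abs(x_hat - x_true))))        # restoration error
```

With a smoothness operator L in place of the identity, the same one-line Fourier solve applies by putting |L̂|² in the denominator instead of 1.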
Citations
Journal ArticleDOI
TL;DR: The goal of this article is to introduce the concept of SR algorithms to readers who are unfamiliar with this area and to provide experts with a technical review of the SR methodologies most often employed.
Abstract: A new approach toward increasing spatial resolution is required to overcome the limitations of the sensors and optics manufacturing technology. One promising approach is to use signal processing techniques to obtain a high-resolution (HR) image (or sequence) from observed multiple low-resolution (LR) images. Such a resolution enhancement approach has been one of the most active research areas, and it is called super resolution (SR) (or HR) image reconstruction or simply resolution enhancement. In this article, we use the term "SR image reconstruction" to refer to a signal processing approach toward resolution enhancement, because the term "super" in "super resolution" represents very well the characteristic of the technique: overcoming the inherent resolution limitation of LR imaging systems. The major advantage of the signal processing approach is that it may cost less and existing LR imaging systems can still be utilized. SR image reconstruction has proved useful in many practical cases where multiple frames of the same scene can be obtained, including medical imaging, satellite imaging, and video applications. The goal of this article is to introduce the concept of SR algorithms to readers who are unfamiliar with this area and to provide a review for experts. To this end, we present a technical review of the SR methodologies most often employed. Before reviewing existing SR algorithms, we first model the LR image acquisition process.
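The LR acquisition model the abstract refers to is typically a warp-blur-decimate-noise chain applied to the HR scene. A toy 1-D sketch (the operators and parameters here are illustrative assumptions, not the article's exact model):

```python
import numpy as np

rng = np.random.default_rng(0)

def acquire_lr(x, shift, blur_len=3, factor=2, noise_std=0.01):
    """Simulate one LR observation: warp -> blur -> downsample -> noise."""
    warped = np.roll(x, shift)                    # M_k: integer shift (toy warp)
    kernel = np.ones(blur_len) / blur_len         # B: moving-average PSF
    blurred = np.convolve(warped, kernel, mode="same")
    lr = blurred[::factor]                        # D: decimation by `factor`
    return lr + noise_std * rng.standard_normal(lr.size)

x_hr = np.sin(np.linspace(0, 4 * np.pi, 64))      # the "scene" (1-D for brevity)
frames = [acquire_lr(x_hr, s) for s in range(4)]  # 4 shifted LR frames
print(len(frames), frames[0].shape)
```

SR reconstruction inverts this chain: because each frame carries a different sub-sample shift, the ensemble contains more information than any single LR frame.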

3,491 citations

Journal ArticleDOI
TL;DR: It is shown that various inverse problems in signal recovery can be formulated as the generic problem of minimizing the sum of two convex functions with certain regularity properties, which makes it possible to derive existence, uniqueness, characterization, and stability results in a unified and standardized fashion for a large class of apparently disparate problems.
Abstract: We show that various inverse problems in signal recovery can be formulated as the generic problem of minimizing the sum of two convex functions with certain regularity properties. This formulation makes it possible to derive existence, uniqueness, characterization, and stability results in a unified and standardized fashion for a large class of apparently disparate problems. Recent results on monotone operator splitting methods are applied to establish the convergence of a forward-backward algorithm to solve the generic problem. In turn, we recover, extend, and provide a simplified analysis for a variety of existing iterative methods. Applications to geometry/texture image decomposition schemes are also discussed. A novelty of our framework is to use extensively the notion of a proximity operator, which was introduced by Moreau in the 1960s.

2,645 citations

Proceedings ArticleDOI
21 Jul 2017
TL;DR: It is concluded that the NTIRE 2017 challenge pushes the state-of-the-art in single-image super-resolution, reaching the best results to date on the popular Set5, Set14, B100, Urban100 datasets and on the authors' newly proposed DIV2K.
Abstract: This paper introduces a novel large dataset for example-based single image super-resolution and studies the state-of-the-art as emerged from the NTIRE 2017 challenge. The challenge is the first challenge of its kind, with 6 competitions, hundreds of participants and tens of proposed solutions. Our newly collected DIVerse 2K resolution image dataset (DIV2K) was employed by the challenge. In our study we compare the solutions from the challenge to a set of representative methods from the literature and evaluate them using diverse measures on our proposed DIV2K dataset. Moreover, we conduct a number of experiments and draw conclusions on several topics of interest. We conclude that the NTIRE 2017 challenge pushes the state-of-the-art in single-image super-resolution, reaching the best results to date on the popular Set5, Set14, B100, Urban100 datasets and on our newly proposed DIV2K.

2,388 citations


Cites background from "Digital Image Restoration"

  • ...Single image super-resolution as well as image restoration research literature spans over decades [36, 20, 4, 13, 16, 3, 15, 14, 6, 32, 54, 30, 17, 23, 12, 47, 48, 10, 21]....


Journal ArticleDOI
TL;DR: An alternating minimization algorithm is proposed for recovering images from blurry and noisy observations with total variation (TV) regularization, derived from a new half-quadratic model applicable to both the anisotropic and isotropic forms of TV discretization.
Abstract: We propose, analyze, and test an alternating minimization algorithm for recovering images from blurry and noisy observations with total variation (TV) regularization. This algorithm arises from a new half-quadratic model applicable to not only the anisotropic but also the isotropic forms of TV discretizations. The per-iteration computational complexity of the algorithm is three fast Fourier transforms. We establish strong convergence properties for the algorithm including finite convergence for some variables and relatively fast exponential (or $q$-linear in optimization terminology) convergence for the others. Furthermore, we propose a continuation scheme to accelerate the practical convergence of the algorithm. Extensive numerical results show that our algorithm performs favorably in comparison to several state-of-the-art algorithms. In particular, it runs orders of magnitude faster than the lagged diffusivity algorithm for TV-based deblurring. Some extensions of our algorithm are also discussed.
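The half-quadratic splitting described above alternates a closed-form shrinkage step with a single Fourier-domain linear solve. A 1-D sketch with periodic boundaries and anisotropic TV (the penalty weights and iteration count are illustrative assumptions, and the paper's continuation scheme is omitted):

```python
import numpy as np

def ftvd_deblur_1d(f, kernel, mu=1000.0, beta=10.0, n_iter=50):
    """Alternating minimization for TV deblurring (1-D, periodic boundaries).

    Half-quadratic split of  min_u TV(u) + (mu/2)||k*u - f||^2 :
        min_{u,w} ||w||_1 + (beta/2)||w - Du||^2 + (mu/2)||k*u - f||^2
    """
    n = f.size
    K = np.fft.fft(kernel, n)                     # blur operator eigenvalues
    d = np.zeros(n)
    d[0], d[-1] = 1.0, -1.0                       # circulant difference operator
    D = np.fft.fft(d)
    denom = beta * np.abs(D) ** 2 + mu * np.abs(K) ** 2
    F = np.fft.fft(f)
    u = f.copy()
    for _ in range(n_iter):
        Du = np.real(np.fft.ifft(D * np.fft.fft(u)))
        w = np.sign(Du) * np.maximum(np.abs(Du) - 1.0 / beta, 0.0)  # shrinkage
        rhs = beta * np.conj(D) * np.fft.fft(w) + mu * np.conj(K) * F
        u = np.real(np.fft.ifft(rhs / denom))     # one FFT solve per iteration
    return u

# piecewise-constant signal, blurred by a 5-tap average (noiseless)
u_true = np.repeat([0.0, 1.0, 0.2, 0.8], 32)
k = np.ones(5) / 5
f = np.real(np.fft.ifft(np.fft.fft(k, 128) * np.fft.fft(u_true)))
u_rec = ftvd_deblur_1d(f, k)
print(float(np.mean(np.abs(u_rec - u_true))))
```

The per-iteration cost is the three FFTs the abstract mentions; all operators are diagonalized once, outside the loop.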

1,883 citations


Cites methods from "Digital Image Restoration"

  • ...There are different approaches based on statistics [14, 5], Fourier and/or wavelet transforms [25, 31], or variational analysis [38, 7, 11] for image deblurring....


Journal ArticleDOI
TL;DR: This paper introduces two-step IST (TwIST) algorithms, which exhibit a much faster convergence rate than IST for ill-conditioned problems, and a monotonic version of TwIST (MTwIST); although the convergence proof does not apply to MTwIST, the effectiveness of the new methods is experimentally confirmed on problems of image deconvolution and of restoration with missing samples.
Abstract: Iterative shrinkage/thresholding (IST) algorithms have recently been proposed to handle a class of convex unconstrained optimization problems arising in image restoration and other linear inverse problems. This class of problems results from combining a linear observation model with a nonquadratic regularizer (e.g., total variation or wavelet-based regularization). The convergence rate of these IST algorithms depends heavily on the linear observation operator, becoming very slow when this operator is ill-conditioned or ill-posed. In this paper, we introduce two-step IST (TwIST) algorithms, exhibiting a much faster convergence rate than IST for ill-conditioned problems. For a vast class of nonquadratic convex regularizers (ℓp norms, some Besov norms, and total variation), we show that TwIST converges to a minimizer of the objective function for a given range of values of its parameters. For noninvertible observation operators, we introduce a monotonic version of TwIST (MTwIST); although the convergence proof does not apply to this scenario, we give experimental evidence that MTwIST exhibits similar speed gains over IST. The effectiveness of the new methods is experimentally confirmed on problems of image deconvolution and of restoration with missing samples.
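The two-step structure means each TwIST iterate is a weighted combination of the two previous iterates and one IST step. A sketch on a small circulant (blur-like) operator with an ℓ1 regularizer; the values of alpha and beta below are simple stable choices, not the tuned settings the paper derives from the spectral bounds of AᵀA:

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def twist_l1(A, y, lam, alpha=1.8, beta=1.0, n_iter=300):
    """Two-step IST for min 0.5||Ax-y||^2 + lam||x||_1: each new iterate
    mixes the two previous iterates with one IST step from the newest."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x_prev = np.zeros(A.shape[1])
    x = soft(step * (A.T @ y), step * lam)          # single IST step to start
    for _ in range(n_iter):
        ist = soft(x - step * (A.T @ (A @ x - y)), step * lam)
        x, x_prev = (1 - alpha) * x_prev + (alpha - beta) * x + beta * ist, x
    return x

# ill-conditioned but invertible "blur": circulant matrix of a 3-tap kernel
n = 64
k = np.zeros(n)
k[[0, 1, -1]] = [0.6, 0.2, 0.2]
A = np.stack([np.roll(k, i) for i in range(n)], axis=1)

x_true = np.zeros(n)
x_true[[10, 30, 50]] = [1.0, -1.0, 0.5]             # sparse ground truth
y = A @ x_true                                       # noiseless blurred data
x_hat = twist_l1(A, y, lam=1e-3)
print(np.flatnonzero(np.abs(x_hat) > 0.1))           # recovered support
```

Setting alpha = beta = 1 collapses the recursion to plain IST, which makes the speed comparison on ill-conditioned operators easy to reproduce.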

1,870 citations


Additional excerpts

  • ...[1], [5], [35]....


References
Journal ArticleDOI
TL;DR: An analogy is made between images and statistical mechanics systems; simulated annealing applied under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, resulting in a highly parallel ``relaxation'' algorithm for MAP estimation.
Abstract: We make an analogy between images and statistical mechanics systems. Pixel gray levels and the presence and orientation of edges are viewed as states of atoms or molecules in a lattice-like physical system. The assignment of an energy function in the physical system determines its Gibbs distribution. Because of the Gibbs distribution, Markov random field (MRF) equivalence, this assignment also determines an MRF image model. The energy function is a more convenient and natural mechanism for embodying picture attributes than are the local characteristics of the MRF. For a range of degradation mechanisms, including blurring, nonlinear deformations, and multiplicative or additive noise, the posterior distribution is an MRF with a structure akin to the image model. By the analogy, the posterior distribution defines another (imaginary) physical system. Gradual temperature reduction in the physical system isolates low energy states (``annealing''), or what is the same thing, the most probable states under the Gibbs distribution. The analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations. The result is a highly parallel ``relaxation'' algorithm for MAP estimation. We establish convergence properties of the algorithm and we experiment with some simple pictures, for which good restorations are obtained at low signal-to-noise ratios.
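The annealing procedure described in the abstract can be illustrated on a toy binary denoising problem: a Gibbs sampler sweeps the lattice while the temperature is gradually lowered, driving the sample toward the MAP estimate. The Ising-style energy, its parameters, and the cooling schedule below are illustrative assumptions:

```python
import numpy as np

def anneal_map_denoise(obs, beta=1.0, h=0.9, sweeps=30, seed=0):
    """Simulated annealing (Gibbs sampling + cooling) for the MAP estimate of
    a binary (+1/-1) image under an Ising smoothness prior and a data term.

    Energy: E(x) = -beta * sum_<ij> x_i x_j - h * sum_i x_i y_i
    """
    rng = np.random.default_rng(seed)
    x = obs.copy()
    m, n = x.shape
    for sweep in range(sweeps):
        T = max(0.1, 4.0 * 0.85 ** sweep)             # geometric cooling
        for i in range(m):
            for j in range(n):
                nb = (x[i-1, j] if i else 0) + (x[i+1, j] if i < m-1 else 0) \
                   + (x[i, j-1] if j else 0) + (x[i, j+1] if j < n-1 else 0)
                dE = -2.0 * (beta * nb + h * obs[i, j])   # E(+1) - E(-1)
                p_plus = 1.0 / (1.0 + np.exp(dE / T))     # Gibbs conditional
                x[i, j] = 1 if rng.random() < p_plus else -1
    return x

rng = np.random.default_rng(42)
truth = -np.ones((16, 16), dtype=int)
truth[4:12, 4:12] = 1                                 # a square blob
flips = rng.random(truth.shape) < 0.15                # flip 15% of the pixels
noisy = np.where(flips, -truth, truth)
clean = anneal_map_denoise(noisy)
print(int(np.sum(noisy != truth)), int(np.sum(clean != truth)))
```

Each pixel update needs only its four neighbors and its own observation, which is what makes the relaxation algorithm highly parallel.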

18,761 citations

Journal ArticleDOI
TL;DR: It is proven that the local maxima of the wavelet transform modulus detect the locations of irregular structures and provide numerical procedures to compute their Lipschitz exponents.
Abstract: The mathematical characterization of singularities with Lipschitz exponents is reviewed. Theorems that estimate local Lipschitz exponents of functions from the evolution across scales of their wavelet transform are reviewed. It is then proven that the local maxima of the wavelet transform modulus detect the locations of irregular structures and provide numerical procedures to compute their Lipschitz exponents. The wavelet transform of singularities with fast oscillations has a particular behavior that is studied separately. The local frequency of such oscillations is measured from the wavelet transform modulus maxima. It has been shown numerically that one- and two-dimensional signals can be reconstructed, with a good approximation, from the local maxima of their wavelet transform modulus. As an application, an algorithm is developed that removes white noises from signals by analyzing the evolution of the wavelet transform maxima across scales. In two dimensions, the wavelet transform maxima indicate the location of edges in images.

4,064 citations

Journal ArticleDOI
TL;DR: An iterative method of restoring degraded images was developed by treating images, point spread functions, and degraded images as probability-frequency functions and by applying Bayes’s theorem.
Abstract: An iterative method of restoring degraded images was developed by treating images, point spread functions, and degraded images as probability-frequency functions and by applying Bayes’s theorem. The method functions effectively in the presence of noise and is adaptable to computer operation.
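The iteration described here is what is now known as the Richardson-Lucy algorithm: a multiplicative update in which the current estimate is corrected by the back-projected ratio of the observed image to its predicted blur. A 1-D sketch (the PSF, signal, and iteration count are illustrative):

```python
import numpy as np

def richardson_lucy(y, psf, n_iter=50):
    """Richardson-Lucy: multiplicative Bayes update
    x <- x * correlate(psf, y / convolve(psf, x))."""
    psf_flip = psf[::-1]                           # adjoint of the convolution
    x = np.full_like(y, y.mean())                  # flat nonnegative start
    for _ in range(n_iter):
        est = np.convolve(x, psf, mode="same")     # predicted blurred image
        ratio = y / np.maximum(est, 1e-12)         # guard against divide-by-0
        x = x * np.convolve(ratio, psf_flip, mode="same")
    return x

psf = np.array([0.1, 0.2, 0.4, 0.2, 0.1])          # normalized blur kernel
x_true = np.zeros(64)
x_true[[20, 40]] = [5.0, 3.0]                      # two point sources
y = np.convolve(x_true, psf, mode="same")          # noiseless blurred data
x_rec = richardson_lucy(y, psf)
print(float(x_rec[20]), float(y[20]))              # peak sharpens toward 5.0
```

Because the update is multiplicative and the data are nonnegative, the iterates stay nonnegative automatically, one reason the method remains popular in astronomy and microscopy.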

3,869 citations

Journal ArticleDOI
TL;DR: An iterative technique is presented for rectifying observed distributions, a problem that arises throughout astronomy, from inferring star densities along a line of sight to correcting observations for the instrumental profile.
Abstract: Examples of the rectification problems considered include: (1) the determination of the variation of star density along a line of sight from the distribution of proper motions in that direction; (2) the determination of the space distribution of radio sources from number counts; (3) the determination of the radial variation of star density in a globular cluster from star counts; (4) the correction of radioastronomical and spectrographic observations for the effect of the instrumental profile; and (5) the determination of the temperature stratification in the solar atmosphere from limb-darkening data. These examples suffice to show that the problem under consideration arises in many branches of astronomy and that its solution is vital to the process of extracting useful information from observations.

3,670 citations

Book
11 Aug 2011
TL;DR: The authors describe an algorithm that reconstructs a close approximation of 1-D and 2-D signals from their multiscale edges and show that the evolution of wavelet local maxima across scales characterizes the local shape of irregular structures.
Abstract: A multiscale Canny edge detection is equivalent to finding the local maxima of a wavelet transform. The authors study the properties of multiscale edges through the wavelet theory. For pattern recognition, one often needs to discriminate different types of edges. They show that the evolution of wavelet local maxima across scales characterizes the local shape of irregular structures. Numerical descriptors of edge types are derived. The completeness of a multiscale edge representation is also studied. The authors describe an algorithm that reconstructs a close approximation of 1-D and 2-D signals from their multiscale edges. For images, the reconstruction errors are below visual sensitivity. As an application, a compact image coding algorithm that selects important edges and compresses the image data by factors over 30 has been implemented.

3,187 citations