Journal ArticleDOI

Nonlinear total variation based noise removal algorithms

01 Nov 1992 · Physica D: Nonlinear Phenomena (Elsevier North-Holland, Inc.) · Vol. 60, pp. 259-268
TL;DR: In this article, a constrained optimization type of numerical algorithm for removing noise from images is presented, where the total variation of the image is minimized subject to constraints involving the statistics of the noise.
About: This article is published in Physica D: Nonlinear Phenomena. The article was published on 1992-11-01 and has received 15,225 citations to date. The article focuses on the topics: Image processing & Total variation denoising.
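To make the method concrete, here is a minimal sketch of TV denoising in the ROF spirit: gradient descent on a smoothed total variation plus a quadratic fidelity term, the unconstrained Lagrangian form commonly used in place of the paper's explicit noise-statistics constraints. All parameter names and values (lam, eps, tau, n_iter) are illustrative choices, not taken from the paper.

```python
# Sketch of Rudin-Osher-Fatemi-style TV denoising:
# minimize TV(u) + (lam/2)*||u - f||^2 by explicit gradient descent.
# eps smooths |grad u| to avoid division by zero in flat regions.
import numpy as np

def tv_denoise(f, lam=0.1, eps=1e-3, tau=0.05, n_iter=300):
    u = f.astype(float).copy()
    for _ in range(n_iter):
        # forward differences of u (periodic boundary via roll)
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux**2 + uy**2 + eps)      # smoothed gradient magnitude
        px, py = ux / mag, uy / mag             # normalized gradient field
        # divergence via backward differences: div(grad u / |grad u|)
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        # curvature term flattens noise; fidelity term tracks the data f
        u = u + tau * (div - lam * (u - f))
    return u

rng = np.random.default_rng(0)
clean = np.kron(np.eye(4), np.ones((16, 16)))            # piecewise-constant test image
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
denoised = tv_denoise(noisy)
```

The divergence term drives u toward a piecewise-constant image while the fidelity term keeps it near the data; the paper itself instead enforces the known noise variance as a constraint, with the Lagrange multiplier chosen dynamically.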
Citations
Proceedings ArticleDOI
07 Jun 2015
TL;DR: Inception as mentioned in this paper is a deep convolutional neural network architecture that achieves the new state of the art for classification and detection in the ImageNet Large-Scale Visual Recognition Challenge 2014 (ILSVRC14).
Abstract: We propose a deep convolutional neural network architecture codenamed Inception that achieves the new state of the art for classification and detection in the ImageNet Large-Scale Visual Recognition Challenge 2014 (ILSVRC14). The main hallmark of this architecture is the improved utilization of the computing resources inside the network. By a carefully crafted design, we increased the depth and width of the network while keeping the computational budget constant. To optimize quality, the architectural decisions were based on the Hebbian principle and the intuition of multi-scale processing. One particular incarnation used in our submission for ILSVRC14 is called GoogLeNet, a 22-layer-deep network, the quality of which is assessed in the context of classification and detection.
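As a rough illustration of the architecture described, here is one Inception block in PyTorch: parallel 1x1, 3x3, and 5x5 convolutions plus a pooled path, concatenated along the channel axis, with 1x1 "reduction" convolutions keeping the computational budget in check. The channel counts below are illustrative, not necessarily GoogLeNet's.

```python
# A minimal sketch of one Inception block: multi-scale branches concatenated.
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    def __init__(self, in_ch, c1x1, c3x3_red, c3x3, c5x5_red, c5x5, pool_proj):
        super().__init__()
        self.b1 = nn.Sequential(nn.Conv2d(in_ch, c1x1, 1), nn.ReLU(inplace=True))
        # 1x1 reductions shrink channels before the expensive wide filters
        self.b2 = nn.Sequential(
            nn.Conv2d(in_ch, c3x3_red, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c3x3_red, c3x3, 3, padding=1), nn.ReLU(inplace=True))
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, c5x5_red, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c5x5_red, c5x5, 5, padding=2), nn.ReLU(inplace=True))
        self.b4 = nn.Sequential(
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Conv2d(in_ch, pool_proj, 1), nn.ReLU(inplace=True))

    def forward(self, x):
        # every branch sees the same input; outputs stack along channels
        return torch.cat([self.b1(x), self.b2(x), self.b3(x), self.b4(x)], dim=1)

x = torch.randn(1, 192, 28, 28)
y = InceptionBlock(192, 64, 96, 128, 16, 32, 32)(x)   # -> (1, 256, 28, 28)
```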

40,257 citations

Book
23 May 2011
TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
Abstract: Many problems of recent interest in statistics and machine learning can be posed in the framework of convex optimization. Due to the explosion in size and complexity of modern datasets, it is increasingly important to be able to solve problems with a very large number of features or training examples. As a result, both the decentralized collection or storage of these datasets as well as accompanying distributed solution methods are either necessary or at least highly desirable. In this review, we argue that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas. The method was developed in the 1970s, with roots in the 1950s, and is equivalent or closely related to many other algorithms, such as dual decomposition, the method of multipliers, Douglas–Rachford splitting, Spingarn's method of partial inverses, Dykstra's alternating projections, Bregman iterative algorithms for l1 problems, proximal methods, and others. After briefly surveying the theory and history of the algorithm, we discuss applications to a wide variety of statistical and machine learning problems of recent interest, including the lasso, sparse logistic regression, basis pursuit, covariance selection, support vector machines, and many others. We also discuss general distributed optimization, extensions to the nonconvex setting, and efficient implementation, including some details on distributed MPI and Hadoop MapReduce implementations.
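A minimal sketch of ADMM on one of the problems the review covers, the lasso: minimize (1/2)||Ax - b||^2 + lam*||z||_1 subject to x = z. The penalty parameter rho and the fixed iteration count are illustrative simplifications; the review discusses stopping criteria and parameter scaling in detail.

```python
# Sketch of ADMM for the lasso: alternate a ridge-like x-update,
# a soft-thresholding z-update, and a scaled dual update.
import numpy as np

def soft_threshold(v, k):
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(A, b, lam=0.1, rho=1.0, n_iter=200):
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    AtA_rhoI = A.T @ A + rho * np.eye(n)   # x-update matrix is fixed across iterations
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(AtA_rhoI, Atb + rho * (z - u))   # quadratic subproblem
        z = soft_threshold(x + u, lam / rho)                  # proximal step for the l1 term
        u = u + x - z                                         # scaled dual (running residual)
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = lasso_admm(A, b)   # sparse estimate; nonzeros concentrate on the first 5 entries
```

Because the x-update reuses the same matrix every iteration, a single factorization can be cached, which is one reason the method scales well in the distributed settings the book emphasizes.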

17,433 citations


Cites background from "Nonlinear total variation based noise removal algorithms"

  • ...There is a vast literature, but some important modern papers are those on total variation denoising [145], soft thresholding [49], the lasso [156], basis pursuit [34], compressed sensing [50, 28, 29], and structure learning of sparse graphical models [123]....

  • ...This problem is often called total variation denoising [145], and has applications in signal processing....

Journal ArticleDOI
TL;DR: In this paper, the authors considered the model problem of reconstructing an object from incomplete frequency samples and showed that with probability at least 1 - O(N^-M), f can be reconstructed exactly as the solution to the ℓ1 minimization problem.
Abstract: This paper considers the model problem of reconstructing an object from incomplete frequency samples. Consider a discrete-time signal f ∈ C^N and a randomly chosen set of frequencies Ω. Is it possible to reconstruct f from the partial knowledge of its Fourier coefficients on the set Ω? A typical result of this paper is as follows. Suppose that f is a superposition of |T| spikes, f(t) = Σ_{τ∈T} f(τ) δ(t - τ), obeying |T| ≤ C_M · (log N)^-1 · |Ω| for some constant C_M > 0. We do not know the locations of the spikes nor their amplitudes. Then with probability at least 1 - O(N^-M), f can be reconstructed exactly as the solution to the ℓ1 minimization problem. In short, exact recovery may be obtained by solving a convex optimization problem. We give numerical values for C_M which depend on the desired probability of success. Our result may be interpreted as a novel kind of nonlinear sampling theorem. In effect, it says that any signal made out of |T| spikes may be recovered by convex programming from almost every set of frequencies of size O(|T| · log N). Moreover, this is nearly optimal in the sense that any method succeeding with probability 1 - O(N^-M) would in general require a number of frequency samples at least proportional to |T| · log N. The methodology extends to a variety of other situations and higher dimensions. For example, we show how one can reconstruct a piecewise constant (one- or two-dimensional) object from incomplete frequency samples, provided that the number of jumps (discontinuities) obeys the condition above, by minimizing other convex functionals such as the total variation of f.
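To ground the recovery claim, here is a small numerical sketch of the setting: a spike train observed on a random set of frequencies, reconstructed by iterative soft-thresholding (ISTA) on the ℓ1-penalized least-squares surrogate. ISTA is a stand-in for the paper's exact ℓ1 minimization; lam, the signal length, and the sparsity level are illustrative choices.

```python
# Recover a sparse spike train from a random subset of its Fourier coefficients
# by minimizing (1/2)||F_Omega x - y||^2 + lam*||x||_1 with plain ISTA.
import numpy as np

rng = np.random.default_rng(0)
N, n_spikes, n_freq = 256, 8, 80

f = np.zeros(N)                                     # sparse signal: a few signed spikes
f[rng.choice(N, n_spikes, replace=False)] = rng.choice([-1.0, 1.0], n_spikes)

omega = rng.choice(N, n_freq, replace=False)        # observed frequency set Omega
y = np.fft.fft(f, norm="ortho")[omega]              # partial Fourier data

def A(x):                                           # sample the unitary DFT on omega
    return np.fft.fft(x, norm="ortho")[omega]

def At(v):                                          # adjoint: zero-fill, inverse DFT
    full = np.zeros(N, dtype=complex)
    full[omega] = v
    return np.fft.ifft(full, norm="ortho")

x = np.zeros(N, dtype=complex)
lam = 0.01
for _ in range(500):    # step size 1 is safe: A is a row-subset of a unitary map
    g = x - At(A(x) - y)
    x = g * np.maximum(1 - lam / np.maximum(np.abs(g), 1e-12), 0)  # complex soft-threshold

print("max reconstruction error:", np.max(np.abs(x.real - f)))    # small: near-exact recovery
```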

14,587 citations

Journal ArticleDOI
TL;DR: A new model for active contours to detect objects in a given image is proposed, based on techniques of curve evolution, the Mumford-Shah (1989) functional for segmentation, and level sets; it can detect objects whose boundaries are not necessarily defined by the gradient.
Abstract: We propose a new model for active contours to detect objects in a given image, based on techniques of curve evolution, the Mumford-Shah (1989) functional for segmentation, and level sets. Our model can detect objects whose boundaries are not necessarily defined by the gradient. We minimize an energy which can be seen as a particular case of the minimal partition problem. In the level set formulation, the problem becomes a "mean-curvature flow"-like evolution of the active contour, which will stop on the desired boundary. However, the stopping term does not depend on the gradient of the image, as in the classical active contour models, but is instead related to a particular segmentation of the image. We give a numerical algorithm using finite differences. Finally, we present various experimental results, and in particular some examples for which the classical snake methods based on the gradient are not applicable. Also, the initial curve can be placed anywhere in the image, and interior contours are automatically detected.
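A minimal sketch of the model's core loop: the image is summarized by one mean value inside the evolving contour and another outside, and the level set moves to reduce the resulting fitting energy, regularized by curvature. The parameters (mu, dt, the width of the smoothed delta) are illustrative; the paper's finite-difference scheme is more careful.

```python
# Sketch of a Chan-Vese-style two-phase segmentation with a level set phi.
import numpy as np

def curvature(phi, eps=1e-8):
    # curvature of level lines: div(grad(phi)/|grad(phi)|), central differences
    px = (np.roll(phi, -1, 1) - np.roll(phi, 1, 1)) / 2
    py = (np.roll(phi, -1, 0) - np.roll(phi, 1, 0)) / 2
    mag = np.sqrt(px**2 + py**2 + eps)
    nx, ny = px / mag, py / mag
    return ((np.roll(nx, -1, 1) - np.roll(nx, 1, 1)) / 2 +
            (np.roll(ny, -1, 0) - np.roll(ny, 1, 0)) / 2)

def chan_vese(img, mu=0.2, dt=0.5, n_iter=300):
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    phi = 20.0 - np.sqrt((yy - h / 2) ** 2 + (xx - w / 2) ** 2)   # initial circle
    for _ in range(n_iter):
        inside, outside = phi > 0, phi <= 0
        c1 = img[inside].mean() if inside.any() else 0.0     # mean inside the contour
        c2 = img[outside].mean() if outside.any() else 0.0   # mean outside
        delta = 1.0 / (np.pi * (1.0 + phi ** 2))             # smoothed Dirac delta
        force = mu * curvature(phi) - (img - c1) ** 2 + (img - c2) ** 2
        phi = phi + dt * delta * force
    return phi   # the zero level set approximates the object boundary

img = np.zeros((64, 64)); img[20:45, 15:50] = 1.0   # flat object: no gradient needed to stop
phi = chan_vese(img)
```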

10,404 citations


Cites methods from "Nonlinear total variation based noise removal algorithms"

  • ...The algorithm is as follows (we essentially adopt the method from [23] for the discretization of the divergence operator and the iterative algorithm from [1]): knowing φ^n, we first compute c_1(φ^n) and c_2(φ^n) using (6) and (7), respectively....

Journal ArticleDOI
TL;DR: Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest ℓ1 norm of coefficients among all such decompositions.
Abstract: The time-frequency and time-scale communities have recently developed a large number of overcomplete waveform dictionaries: stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for decomposition have been proposed, including the method of frames (MOF), matching pursuit (MP), and, for special dictionaries, the best orthogonal basis (BOB). Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest ℓ1 norm of coefficients among all such decompositions. We give examples exhibiting several advantages over MOF, MP, and BOB, including better sparsity and superresolution. BP has interesting relations to ideas in areas as diverse as ill-posed problems, abstract harmonic analysis, total variation denoising, and multiscale edge denoising. BP in highly overcomplete dictionaries leads to large-scale optimization problems. With signals of length 8192 and a wavelet packet dictionary, one gets an equivalent linear program of size 8192 by 212,992. Such problems can be attacked successfully only because of recent advances in linear programming by interior-point methods. We obtain reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.
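The linear-programming connection the abstract mentions can be made concrete with the standard reduction: writing c = u - v with u, v ≥ 0 turns min ||c||_1 subject to Φc = s into an LP. SciPy's general-purpose LP solver stands in here for the paper's primal-dual logarithmic barrier method; the dictionary and signal are synthetic.

```python
# Basis pursuit as a linear program:
# min ||c||_1  s.t.  Phi c = s   becomes   min 1'(u+v)  s.t.  [Phi, -Phi][u; v] = s, u, v >= 0.
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi, s):
    m, n = Phi.shape
    cost = np.ones(2 * n)                    # objective: sum of u and v, i.e. ||c||_1
    A_eq = np.hstack([Phi, -Phi])            # enforces Phi @ (u - v) = s
    res = linprog(cost, A_eq=A_eq, b_eq=s, bounds=(0, None))
    u, v = res.x[:n], res.x[n:]
    return u - v

rng = np.random.default_rng(0)
Phi = rng.standard_normal((30, 100))         # overcomplete "dictionary"
c_true = np.zeros(100); c_true[[5, 40, 77]] = [1.0, -2.0, 0.5]
s = Phi @ c_true
c_hat = basis_pursuit(Phi, s)                # recovers the sparse coefficient vector
```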

9,950 citations


Cites methods from "Nonlinear total variation based noise removal algorithms"

  • ...Experiments with standard time-frequency dictionaries indicate some of the potential benefits of BP. Experiments with some nonstandard dictionaries, like the stationary wavelet dictionary and the Heaviside dictionary, indicate important connections between BP and methods like Mallat and Zhong's Multi-Scale Edge Representation and Osher, Rudin and Fatemi's Total Variation-based De-Noising methods....

  • ...Recently, Rudin, Osher and Fatemi [31] have called attention to the possibility of de-noising images using total-variation penalized least-squares....

References
Journal ArticleDOI
TL;DR: The PSC algorithm approximates Hamilton-Jacobi equations with parabolic right-hand sides using techniques from hyperbolic conservation laws, and can also be applied to more general surface motion problems.

13,020 citations

Journal ArticleDOI
TL;DR: In this article, a new version of the Perona and Malik theory for edge detection and image restoration is proposed, which keeps all the improvements of the original model and avoids its drawbacks.
Abstract: A new version of the Perona and Malik theory for edge detection and image restoration is proposed. This new version keeps all the improvements of the original model and avoids its drawbacks: it is proved to be stable in the presence of noise, with existence and uniqueness results. Numerical experiments on natural images are presented.
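A minimal sketch of the stabilizing idea described: Perona-Malik-style nonlinear diffusion in which the edge-stopping coefficient is computed from a Gaussian-smoothed gradient, so noise is not mistaken for edges. This is the usual way the regularization is realized, and the choice of g, sigma, K, and dt here are illustrative; the paper's precise operator and its existence/uniqueness analysis are more involved.

```python
# Sketch of regularized Perona-Malik diffusion: the conductance g is
# evaluated on a pre-smoothed gradient so it responds to edges, not noise.
import numpy as np
from scipy.ndimage import gaussian_filter

def regularized_diffusion(img, sigma=1.0, K=0.1, dt=0.1, n_iter=50):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        us = gaussian_filter(u, sigma)               # smooth before edge detection
        gx = (np.roll(us, -1, 1) - np.roll(us, 1, 1)) / 2
        gy = (np.roll(us, -1, 0) - np.roll(us, 1, 0)) / 2
        g = 1.0 / (1.0 + (gx**2 + gy**2) / K**2)     # edge-stopping coefficient in [0, 1]
        # divergence form: forward difference, conductance, backward difference
        flux_x = g * (np.roll(u, -1, 1) - u)
        flux_y = g * (np.roll(u, -1, 0) - u)
        u += dt * (flux_x - np.roll(flux_x, 1, 1) + flux_y - np.roll(flux_y, 1, 0))
    return u
```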

2,565 citations


"Nonlinear total variation based noi..." refers background or methods in this paper

  • ...Additionally, in [8], Alvarez, Lions and Morel devised an interesting stable image restoration algorithm based on mean curvature motion, see also ref....

  • ...[8] as a model for smoothing and edge detection....

Journal ArticleDOI
TL;DR: Here the authors will consider only nonsingular linear integral equations of the first kind, where the known functions h(x), K(x, y) and g(x) are assumed to be bounded and usually to be continuous.
Abstract: where the known functions h(x), K(x, y) and g(x) are assumed to be bounded and usually to be continuous. If h(x) ≡ 0 the equation is of the first kind; if h(x) ≠ 0 for a ≤ x ≤ b, the equation is of the second kind; if h(x) vanishes somewhere but not identically, the equation is of the third kind. If the range of integration is infinite or if the kernel K(x, y) is not bounded, the equation is singular. Here we will consider only nonsingular linear integral equations of the first kind.
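To see why first-kind equations demand the special treatment the paper develops, here is a small numerical sketch: discretizing ∫ K(x, y) f(y) dy = g(x) with a smooth kernel yields a severely ill-conditioned linear system, and a Phillips-style smoothing penalty restores stability. The Gaussian kernel, the noise level, and alpha are illustrative choices.

```python
# Discretize a first-kind integral equation and contrast naive inversion
# with a roughness-penalized (Phillips-style) regularized solve.
import numpy as np

n = 100
x = np.linspace(0, 1, n)
# rectangle-rule quadrature: (K f)(x_i) ~ sum_j K(x_i, y_j) f(y_j) / n
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01) / n
f_true = np.sin(2 * np.pi * x)                        # the unknown we hope to recover
g = K @ f_true + 1e-6 * np.random.default_rng(0).standard_normal(n)

print("condition number:", np.linalg.cond(K))         # enormous for a smooth kernel

f_naive = np.linalg.solve(K, g)                       # tiny noise is amplified catastrophically

D = np.diff(np.eye(n), 2, axis=0)                     # second-difference (roughness) operator
alpha = 1e-6
f_reg = np.linalg.solve(K.T @ K + alpha * D.T @ D, K.T @ g)   # smooth, stable estimate
```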

1,879 citations


Additional excerpts

  • ...The first attempt along these lines was made by Phillips [1] and later refined by Twomey [2] [3] in the one-dimensional case....

Journal ArticleDOI
TL;DR: The authors examine prior smoothness constraints of a different form, which permit the recovery of discontinuities without introducing auxiliary variables for marking the location of jumps and suspending the constraints in their vicinity.
Abstract: The linear image restoration problem is to recover an original brightness distribution X^0 given the blurred and noisy observations Y = KX^0 + B, where K and B represent the point spread function and measurement error, respectively. This problem is typical of ill-conditioned inverse problems that frequently arise in low-level computer vision. A conventional method to stabilize the problem is to introduce a priori constraints on X^0 and design a cost functional H(X) over images X, which is a weighted average of the prior constraints (regularization term) and posterior constraints (data term); the reconstruction is then the image X which minimizes H. A prominent weakness in this approach, especially with quadratic-type stabilizers, is the difficulty in recovering discontinuities. The authors therefore examine prior smoothness constraints of a different form, which permit the recovery of discontinuities without introducing auxiliary variables for marking the location of jumps and suspending the constraints in their vicinity. In this sense, discontinuities are addressed implicitly rather than explicitly.
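A minimal 1D sketch of the kind of prior the paper studies: a smoothness penalty whose cost saturates for large neighbor differences, so a genuine jump is cheap to keep while small-amplitude noise is smoothed away. The potential φ(t) = |t| / (1 + |t|/T) and all parameters are illustrative stand-ins, not the authors' exact functional.

```python
# Edge-preserving restoration by gradient descent on
# (1/2)||x - y||^2 + lam * sum_i phi(x[i+1] - x[i]), with a saturating phi.
import numpy as np

def restore(y, lam=1.0, T=0.1, step=0.05, n_iter=500):
    x = y.copy()
    for _ in range(n_iter):
        d = np.diff(x)                                # neighbor differences
        # phi'(t) for phi(t) = |t| / (1 + |t|/T): influence decays at large |t|
        dphi = np.sign(d) / (1 + np.abs(d) / T) ** 2
        # gradient of the prior w.r.t. each sample (boundary terms are zero)
        prior_grad = np.concatenate([[0.0], dphi]) - np.concatenate([dphi, [0.0]])
        x -= step * ((x - y) + lam * prior_grad)
    return x

rng = np.random.default_rng(0)
truth = np.where(np.arange(200) < 100, 0.0, 1.0)      # a step edge
y = truth + 0.1 * rng.standard_normal(200)
x_hat = restore(y)                                     # noise removed, edge preserved
```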

1,205 citations


"Nonlinear total variation based noi..." refers background in this paper

  • ...We remark that Geman and Reynolds, in a very interesting paper [10], proposed minimizing various nonlinear functionals of the form ∫ φ(√(u_x² + u_y²)) dx dy with constraints....

Journal ArticleDOI
TL;DR: The gradient projection method is presented for solving nonlinear programming problems with linear constraints; since a linear objective function is a special case of a nonlinear one, the method also solves linear programming problems.
Abstract: more constraints or equations, with either a linear or nonlinear objective function. This distinction is made primarily on the basis of the difficulty of solving these two types of nonlinear problems. The first type is the less difficult of the two, and in this, Part I of the paper, it is shown how it is solved by the gradient projection method. It should be noted that since a linear objective function is a special case of a nonlinear objective function, the gradient projection method will also solve a linear programming problem. In Part II of the paper [16], the extension of the gradient projection method to the more difficult problem of nonlinear constraints and equations will be described. The basic paper on linear programming is the paper by Dantzig [5] in which the simplex method for solving the linear programming problem is presented. The nonlinear programming problem is formulated and a necessary and sufficient condition for a constrained maximum is given in terms of an equivalent saddle value problem in the paper by Kuhn and Tucker [10]. Further developments motivated by this paper, including a computational procedure, have been published recently [1]. The gradient projection method was originally presented to the American Mathematical Society
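A minimal sketch of the gradient projection idea for the linearly constrained case: project the gradient onto the null space of the constraint matrix so that every iterate stays feasible. The quadratic example and step size are illustrative; Rosen's method additionally manages an active set of inequality constraints.

```python
# Gradient projection for min F(x) subject to A x = b:
# move along P @ grad(F), where P projects onto directions that keep A x = b.
import numpy as np

def gradient_projection(grad, A, b, x0, step=0.1, n_iter=500):
    # projector onto the null space of A
    P = np.eye(A.shape[1]) - A.T @ np.linalg.solve(A @ A.T, A)
    x = x0.copy()
    for _ in range(n_iter):
        x = x - step * (P @ grad(x))    # projected step: A x = b is preserved exactly
    return x

# example: minimize ||x - c||^2 subject to sum(x) = 1
c = np.array([3.0, -1.0, 2.0])
A = np.ones((1, 3)); b = np.array([1.0])
x0 = np.array([1.0, 0.0, 0.0])                          # feasible starting point
x_star = gradient_projection(lambda x: 2 * (x - c), A, b, x0)   # -> [2, -2, 1]
```

This feasible-direction structure is exactly what makes the method attractive for constrained denoising, where the constraint encodes the known noise statistics.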

1,142 citations


"Nonlinear total variation based noi..." refers methods in this paper

  • ...The theoretical justification for this approach comes from the fact that it is merely the gradient-projection method of Rosen [14]....
