Author

Laure Blanc-Féraud

Bio: Laure Blanc-Féraud is an academic researcher at the French Institute for Research in Computer Science and Automation (Inria). Her research focuses on topics including image restoration and deconvolution. She has an h-index of 30 and has co-authored 176 publications receiving 5,743 citations. Her previous affiliations include the Centre national de la recherche scientifique and the University of Nice Sophia Antipolis.


Papers
Journal ArticleDOI
TL;DR: This paper proposes a deterministic strategy based on alternate minimizations over the image and an auxiliary variable, leading to an original reconstruction algorithm, called ARTUR, that can be applied to a wide range of image processing problems.
Abstract: Many image processing problems are ill-posed and must be regularized. Usually, a roughness penalty is imposed on the solution. The difficulty is to avoid the smoothing of edges, which are very important attributes of the image. In this paper, we first give conditions for the design of such an edge-preserving regularization. Under these conditions, we show that it is possible to introduce an auxiliary variable whose role is twofold. First, it marks the discontinuities and ensures their preservation from smoothing. Second, it makes the criterion half-quadratic. The optimization is then easier. We propose a deterministic strategy, based on alternate minimizations on the image and the auxiliary variable. This leads to the definition of an original reconstruction algorithm, called ARTUR. Some theoretical properties of ARTUR are discussed. Experimental results illustrate the behavior of the algorithm. These results are shown in the field of 2D single photon emission tomography, but this method can be applied in a large number of applications in image processing.
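
To make the half-quadratic idea concrete, here is a minimal sketch of the alternation for 1D denoising with the edge-preserving potential phi(t) = 2*sqrt(1+t^2) - 2: the auxiliary variable reduces to per-edge weights b = phi'(t)/(2t), and the image step becomes a weighted linear system. The setting (1D denoising rather than SPECT reconstruction) and all parameter values are illustrative assumptions, not the paper's configuration:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def artur_denoise_1d(y, lam=1.0, n_iter=20):
    """Half-quadratic alternation (multiplicative form) for 1D denoising."""
    n = len(y)
    # forward-difference operator D: (Df)_i = f[i+1] - f[i]
    D = sp.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1], shape=(n - 1, n))
    f = y.astype(float).copy()
    for _ in range(n_iter):
        t = D @ f
        # b-step: closed form b = phi'(t)/(2t) for phi(t) = 2*sqrt(1+t^2) - 2
        b = 1.0 / np.sqrt(1.0 + t**2)
        # f-step: weighted quadratic problem (I + lam * D^T B D) f = y
        A = sp.eye(n) + lam * (D.T @ sp.diags(b) @ D)
        f = spla.spsolve(A.tocsc(), y)
    return f
```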

1,360 citations

Proceedings ArticleDOI
13 Nov 1994
TL;DR: The authors propose a deterministic strategy based on alternate minimizations over the image and the auxiliary variable, yielding two algorithms, ARTUR and LEGEND, which are applied to the problem of SPECT reconstruction.
Abstract: Many image processing problems are ill-posed and must be regularized. Usually, a roughness penalty is imposed on the solution. The difficulty is to avoid the smoothing of edges, which are very important attributes of the image. The authors first give sufficient conditions for the design of such an edge-preserving regularization. Under these conditions, it is possible to introduce an auxiliary variable whose role is twofold. Firstly, it marks the discontinuities and ensures their preservation from smoothing. Secondly, it makes the criterion half-quadratic. The optimization is then easier. The authors propose a deterministic strategy, based on alternate minimizations on the image and the auxiliary variable. This yields two algorithms, ARTUR and LEGEND. The authors apply these algorithms to the problem of SPECT reconstruction.
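
For contrast with ARTUR's reweighting, the additive half-quadratic form, with which LEGEND is generally associated in the half-quadratic literature, shifts the quadratic term instead, so the system matrix is constant and can be factored once. A minimal sketch under the same 1D denoising assumptions as the ARTUR example above, not the paper's SPECT setting:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def legend_denoise_1d(y, lam=1.0, n_iter=30):
    """Half-quadratic alternation (additive form): constant system matrix."""
    n = len(y)
    D = sp.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1], shape=(n - 1, n))
    # the f-step matrix never changes, so factor it once up front
    solve = spla.factorized((sp.eye(n) + lam * (D.T @ D)).tocsc())
    f = y.astype(float).copy()
    for _ in range(n_iter):
        t = D @ f
        # b-step: b = t - phi'(t)/2 for phi(t) = 2*sqrt(1+t^2) - 2
        b = t * (1.0 - 1.0 / np.sqrt(1.0 + t**2))
        # f-step: minimize ||f - y||^2 + lam * ||Df - b||^2
        f = solve(y + lam * (D.T @ b))
    return f
```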

628 citations

Journal ArticleDOI
TL;DR: This work proposes to combine the Richardson–Lucy algorithm with a regularization constraint based on Total Variation, which suppresses unstable oscillations while preserving object edges, and shows that this constraint improves the deconvolution results as compared with the unregularized Richardson–Lucy algorithm, both visually and quantitatively.
Abstract: Confocal laser scanning microscopy is a powerful and popular technique for 3D imaging of biological specimens. Although confocal microscopy images are much sharper than standard epifluorescence ones, they are still degraded by residual out-of-focus light and by Poisson noise due to photon-limited detection. Several deconvolution methods have been proposed to reduce these degradations, including the Richardson-Lucy iterative algorithm, which computes maximum likelihood estimation adapted to Poisson statistics. As this algorithm tends to amplify noise, regularization constraints based on some prior knowledge on the data have to be applied to stabilize the solution. Here, we propose to combine the Richardson-Lucy algorithm with a regularization constraint based on Total Variation, which suppresses unstable oscillations while preserving object edges. We show on simulated and real images that this constraint improves the deconvolution results as compared with the unregularized Richardson-Lucy algorithm, both visually and quantitatively.
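
A minimal sketch of the regularized iteration: the multiplicative Richardson-Lucy update is divided by 1 - lam * div(grad f / |grad f|), the Total Variation term. The Gaussian PSF (chosen so the blur operator is self-adjoint), lam, and the iteration count are illustrative assumptions, not the paper's confocal setup:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def rl_tv(y, sigma=2.0, lam=0.002, n_iter=50, eps=1e-8):
    """Richardson-Lucy iterations with a Total Variation prior (sketch)."""
    H = lambda x: gaussian_filter(x, sigma)  # Gaussian PSF: self-adjoint blur
    f = np.full_like(y, y.mean(), dtype=float)
    for _ in range(n_iter):
        # TV term: divergence of the normalized gradient of the estimate
        fy, fx = np.gradient(f)
        mag = np.sqrt(fx**2 + fy**2) + eps
        div = np.gradient(fy / mag, axis=0) + np.gradient(fx / mag, axis=1)
        # multiplicative RL step, damped by the TV denominator
        ratio = y / (H(f) + eps)
        f = f * H(ratio) / (1.0 - lam * div)
        f = np.clip(f, 0.0, None)  # guard: keep the estimate nonnegative
    return f
```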

420 citations

Journal ArticleDOI
TL;DR: This work constructs an algorithm to split an image into a sum u + v of a bounded variation component and a component containing the textures and the noise, inspired by a recent work of Y. Meyer.
Abstract: We construct an algorithm to split an image into a sum u + v of a bounded variation component and a component containing the textures and the noise. This decomposition is inspired by a recent work of Y. Meyer. We find this decomposition by minimizing a convex functional which depends on the two variables u and v, alternately in each variable. Each minimization is based on a projection algorithm to minimize the total variation. We carry out the mathematical study of our method. We present some numerical results. In particular, we show how the u component can be used in nontextured SAR image restoration.
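
A hedged sketch of the alternating scheme, using Chambolle's projection algorithm as the inner total-variation solver in both steps. Periodic boundary conditions (via np.roll) and the values of lam and mu are simplifying assumptions:

```python
import numpy as np

def grad(u):
    gx = np.roll(u, -1, axis=1) - u
    gy = np.roll(u, -1, axis=0) - u
    return gx, gy

def div(px, py):
    return (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))

def tv_projection(f, lam, n_iter=50, tau=0.125):
    """Chambolle's projection: f - tv_projection(f, lam) solves the ROF problem."""
    px, py = np.zeros_like(f), np.zeros_like(f)
    for _ in range(n_iter):
        gx, gy = grad(div(px, py) - f / lam)
        norm = 1.0 + tau * np.sqrt(gx**2 + gy**2)
        px, py = (px + tau * gx) / norm, (py + tau * gy) / norm
    return lam * div(px, py)

def uv_decompose(f, lam=10.0, mu=100.0, n_outer=5):
    """Alternating minimization: u keeps the geometry, v the textures and noise."""
    u, v = np.zeros_like(f), np.zeros_like(f)
    for _ in range(n_outer):
        v = tv_projection(f - u, mu)              # v-step: projection with mu
        u = (f - v) - tv_projection(f - v, lam)   # u-step: ROF denoising of f - v
    return u, v
```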

369 citations

Journal ArticleDOI
TL;DR: A supervised classification model based on a variational approach, devoted to finding an optimal partition composed of homogeneous classes with regular interfaces, in which the forces driving the interfaces are defined through the minimization of a single functional.
Abstract: We present a supervised classification model based on a variational approach. This model is devoted to finding an optimal partition composed of homogeneous classes with regular interfaces. The originality of the proposed approach concerns the definition of a partition by the use of level sets. Each set of regions and boundaries associated with a class is defined by a unique level set function. We use as many level sets as there are classes, and all these level sets move together thanks to forces which interact in order to reach an optimal partition. We show how these forces can be defined through the minimization of a single functional. The coupled partial differential equations (PDEs) related to the minimization of the functional are considered through a dynamical scheme. Given an initial interface set (zero level set), the different terms of the PDEs govern the motion of the interfaces such that, at convergence, we get an optimal partition as defined above. Each interface is guided by internal forces (regularity of the interface) and external ones (data term, no vacuum, no region overlap). Several experiments were conducted on both synthetic and real images.
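
A simplified sketch of the coupled evolution for a grayscale image with known class means: each level set function moves under a data force, a curvature (regularity) force, and a coupling force penalizing vacuum and overlap. The regularized Heaviside and delta functions, the random initialization, and all constants are illustrative assumptions:

```python
import numpy as np

def heaviside(phi, eps=1.0):
    return 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(phi / eps))

def delta(phi, eps=1.0):
    return (eps / np.pi) / (eps**2 + phi**2)

def curvature(phi, tiny=1e-8):
    gy, gx = np.gradient(phi)
    mag = np.sqrt(gx**2 + gy**2) + tiny
    return np.gradient(gy / mag, axis=0) + np.gradient(gx / mag, axis=1)

def classify(img, means, n_iter=200, dt=0.1, nu=0.5, gamma=1.0):
    """One level set per class; the sets evolve jointly toward a partition."""
    rng = np.random.default_rng(0)
    phis = [rng.standard_normal(img.shape) for _ in means]
    for _ in range(n_iter):
        coverage = sum(heaviside(p) for p in phis)  # should converge to 1
        for k, mu_k in enumerate(means):
            data = (img - mu_k) ** 2                # supervised data term
            force = nu * curvature(phis[k]) - data - gamma * (coverage - 1.0)
            phis[k] = phis[k] + dt * delta(phis[k]) * force
    return np.argmax(np.stack(phis), axis=0)        # label = most positive set
```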

287 citations


Cited by
Journal ArticleDOI
TL;DR: This paper develops a simple first-order and easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank, and provides a framework in which one can understand these algorithms in terms of well-known Lagrange multiplier algorithms.
Abstract: This paper introduces a novel algorithm to approximate the matrix with minimum nuclear norm among all matrices obeying a set of convex constraints. This problem may be understood as the convex relaxation of a rank minimization problem and arises in many important applications as in the task of recovering a large matrix from a small subset of its entries (the famous Netflix problem). Off-the-shelf algorithms such as interior point methods are not directly amenable to large problems of this kind with over a million unknown entries. This paper develops a simple first-order and easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank. The algorithm is iterative, produces a sequence of matrices $\{\boldsymbol{X}^k,\boldsymbol{Y}^k\}$, and at each step mainly performs a soft-thresholding operation on the singular values of the matrix $\boldsymbol{Y}^k$. There are two remarkable features making this attractive for low-rank matrix completion problems. The first is that the soft-thresholding operation is applied to a sparse matrix; the second is that the rank of the iterates $\{\boldsymbol{X}^k\}$ is empirically nondecreasing. Both these facts allow the algorithm to make use of very minimal storage space and keep the computational cost of each iteration low. On the theoretical side, we provide a convergence analysis showing that the sequence of iterates converges. On the practical side, we provide numerical examples in which $1,000\times1,000$ matrices are recovered in less than a minute on a modest desktop computer. We also demonstrate that our approach is amenable to very large scale problems by recovering matrices of rank about 10 with nearly a billion unknowns from just about 0.4% of their sampled entries. Our methods are connected with the recent literature on linearized Bregman iterations for $\ell_1$ minimization, and we develop a framework in which one can understand these algorithms in terms of well-known Lagrange multiplier algorithms.
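
The iteration itself is short: soft-threshold the singular values of Y, then take a gradient step on the observed entries. The sketch below uses a full SVD for clarity where the paper exploits low-rank structure with a partial SVD; the tau heuristic and step size are loose defaults, not prescriptions:

```python
import numpy as np

def svt_complete(M, mask, tau=None, delta=1.2, n_iter=200, tol=1e-4):
    """Singular value thresholding for matrix completion.
    M: matrix with observed values; mask: boolean, True where observed."""
    if tau is None:
        tau = 5.0 * np.mean(M.shape)  # heuristic scale, in the spirit of the paper
    Y = np.zeros(M.shape)
    norm_obs = np.linalg.norm(M[mask])
    X = Y
    for _ in range(n_iter):
        # shrinkage step: soft-threshold the singular values of Y
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt
        # gradient step on the observed entries only
        residual = np.where(mask, M - X, 0.0)
        if np.linalg.norm(residual[mask]) / norm_obs < tol:
            break
        Y = Y + delta * residual
    return X
```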

5,276 citations

Journal ArticleDOI
TL;DR: This paper defines a general mathematical and experimental methodology to compare and classify classical image denoising algorithms, and proposes a nonlocal means (NL-means) algorithm addressing the preservation of structure in a digital image.
Abstract: The search for efficient image denoising methods is still a valid challenge at the crossing of functional analysis and statistics. In spite of the sophistication of the recently proposed methods, most algorithms have not yet attained a desirable level of applicability. […]
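
The NL-means idea fits in a short sketch: each pixel is replaced by a weighted average of pixels whose surrounding patches look similar, with weights decaying exponentially in the patch distance. This naive version favors clarity over speed (practical implementations vectorize or restrict the comparison); the patch size, search window, and filtering parameter h are illustrative:

```python
import numpy as np

def nl_means(img, patch=3, search=7, h=0.1):
    """Naive non-local means: average pixels whose patches look alike."""
    img = np.asarray(img, dtype=float)
    pr, sr = patch // 2, search // 2
    pad = np.pad(img, pr + sr, mode='reflect')
    out = np.empty_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pr + sr, j + pr + sr
            ref = pad[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            weights, values = [], []
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    cand = pad[ci + di - pr:ci + di + pr + 1,
                               cj + dj - pr:cj + dj + pr + 1]
                    d2 = np.mean((ref - cand) ** 2)   # patch (dis)similarity
                    weights.append(np.exp(-d2 / h**2))
                    values.append(pad[ci + di, cj + dj])
            out[i, j] = np.dot(weights, values) / np.sum(weights)
    return out
```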

4,153 citations

Journal ArticleDOI
TL;DR: ImageJ2 is the next generation of ImageJ, providing a host of new functionality and separating concerns by fully decoupling the data model from the user interface.
Abstract: ImageJ is an image analysis program extensively used in the biological sciences and beyond. Due to its ease of use, recordable macro language, and extensible plug-in architecture, ImageJ enjoys contributions from non-programmers, amateur programmers, and professional developers alike. Enabling such a diversity of contributors has resulted in a large community that spans the biological and physical sciences. However, a rapidly growing user base, diverging plugin suites, and technical limitations have revealed a clear need for a concerted software engineering effort to support emerging imaging paradigms, to ensure the software’s ability to handle the requirements of modern science. We rewrote the entire ImageJ codebase, engineering a redesigned plugin mechanism intended to facilitate extensibility at every level, with the goal of creating a more powerful tool that continues to serve the existing community while addressing a wider range of scientific requirements. This next-generation ImageJ, called “ImageJ2” in places where the distinction matters, provides a host of new functionality. It separates concerns, fully decoupling the data model from the user interface. It emphasizes integration with external applications to maximize interoperability. Its robust new plugin framework allows everything from image formats, to scripting languages, to visualization to be extended by the community. The redesigned data model supports arbitrarily large, N-dimensional datasets, which are increasingly common in modern image acquisition. Despite the scope of these changes, backwards compatibility is maintained such that this new functionality can be seamlessly integrated with the classic ImageJ interface, allowing users and developers to migrate to these new methods at their own pace. Scientific imaging benefits from open-source programs that advance new method development and deployment to a diverse audience. ImageJ has continuously evolved with this idea in mind; however, new and emerging scientific requirements have posed corresponding challenges for ImageJ’s development. The described improvements provide a framework engineered for flexibility, intended to support these requirements as well as accommodate future needs. Future efforts will focus on implementing new algorithms in this framework and expanding collaborations with other popular scientific software suites.

4,093 citations