Author

Yoann Le Montagner

Bio: Yoann Le Montagner is an academic researcher from the Pasteur Institute. The author has contributed to research on topics including compressed sensing and iterative reconstruction. The author has an h-index of 6 and has co-authored 11 publications receiving 1,167 citations. Previous affiliations of Yoann Le Montagner include the Centre national de la recherche scientifique and Télécom ParisTech.

Papers
Journal ArticleDOI
TL;DR: Icy is a collaborative bioimage informatics platform that combines a community website for contributing and sharing tools and material with software built around a high-end visual programming framework for the seamless development of sophisticated imaging workflows.
Abstract: Icy is a collaborative platform for biological image analysis that extends reproducible research principles by facilitating and stimulating the contribution and sharing of algorithm-based tools and protocols between researchers. Current research in biology uses ever more complex computational and imaging tools. Here we describe Icy, a collaborative bioimage informatics platform that combines a community website for contributing and sharing tools and material with software built around a high-end visual programming framework for the seamless development of sophisticated imaging workflows. Icy extends reproducible research principles by encouraging and facilitating the reusability, modularity, standardization and management of algorithms and protocols. Icy is free, open-source and available at http://icy.bioimageanalysis.org/.

1,261 citations

Journal ArticleDOI
TL;DR: This paper introduces a new Poisson-Gaussian unbiased risk estimator (PG-URE) of the MSE, applicable to a mixed Poisson-Gaussian noise model that unifies the Gaussian and Poisson noise models widely used in fluorescence bioimaging applications.
Abstract: The behavior and performance of denoising algorithms are governed by one or several parameters, whose optimal settings depend on the content of the processed image and the characteristics of the noise, and are generally chosen to minimize the mean squared error (MSE) between the denoised image returned by the algorithm and a virtual ground truth. In this paper, we introduce a new Poisson-Gaussian unbiased risk estimator (PG-URE) of the MSE, applicable to a mixed Poisson-Gaussian noise model that unifies the Gaussian and Poisson noise models widely used in fluorescence bioimaging applications. We propose a stochastic methodology to evaluate this estimator when little is known about the internal machinery of the considered denoising algorithm, and we analyze both theoretically and empirically the characteristics of the PG-URE estimator. Finally, we evaluate the PG-URE-driven parametrization for three standard denoising algorithms, with and without variance-stabilizing transforms, and for different characteristics of the Poisson-Gaussian noise mixture.
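To make the black-box evaluation idea concrete, here is a minimal sketch of the same Monte Carlo probing principle applied to the simpler, Gaussian-only SURE criterion: the denoiser's divergence is estimated by perturbing its input with a random probe instead of differentiating it analytically. This is an illustrative simplification, not the PG-URE estimator from the paper; the Gaussian-blur denoiser, noise level and test image are placeholder assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mc_sure(denoise, y, sigma, eps=1e-3, rng=None):
    """Monte Carlo SURE for a black-box denoiser under pure Gaussian noise.

    Sketch of the stochastic (probing) evaluation principle only; NOT the
    full PG-URE estimator, which additionally handles the Poisson component.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = y.size
    f_y = denoise(y)
    b = rng.choice([-1.0, 1.0], size=y.shape)             # random binary probe
    div = np.sum(b * (denoise(y + eps * b) - f_y)) / eps  # ~ divergence of the denoiser
    # SURE = ||f(y) - y||^2 / n - sigma^2 + 2 * sigma^2 * div / n
    return np.mean((f_y - y) ** 2) - sigma ** 2 + 2.0 * sigma ** 2 * div / n

# Toy usage: pick the Gaussian-blur strength that minimizes the estimated risk,
# without ever touching the (unknown) clean image.
rng = np.random.default_rng(0)
xx, yy = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
clean = np.sin(6 * np.pi * xx) * np.cos(4 * np.pi * yy)
noisy = clean + rng.normal(scale=0.3, size=clean.shape)
best = min(np.linspace(0.5, 3.0, 6),
           key=lambda s: mc_sure(lambda im: gaussian_filter(im, s), noisy, sigma=0.3, rng=rng))
print("selected blur strength:", best)
```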

76 citations

Journal ArticleDOI
TL;DR: Light Sheet Fluorescence Microscopy (LSFM), an emerging and attractive technique for 3D optical sectioning of large samples, appears to be a particularly well-suited approach for imaging TMs, and an accurate and fair comparative assessment of illumination modalities is provided.
Abstract: Tissue mimics (TMs) on the scale of several hundred microns provide a beneficial cell culture configuration for in vitro engineered tissue and are currently under the spotlight in tissue engineering and regenerative medicine. Due to their cell density and size, TMs are fairly inaccessible to optical observation, and imaging within these samples remains challenging. Light Sheet Fluorescence Microscopy (LSFM), an emerging and attractive technique for 3D optical sectioning of large samples, appears to be a particularly well-suited approach for dealing with them. In this work, we compared the effectiveness of different light sheet illumination modalities reported in the literature to improve resolution and/or light exposure for complex 3D samples. In order to provide an accurate and fair comparative assessment, we also developed a systematic, computerized benchmarking method. The outcomes of our experiments provide meaningful information for valid comparisons and highlight the main differences between the modalities when imaging different types of TMs.

38 citations

Proceedings ArticleDOI
09 Jun 2011
TL;DR: A short presentation of the compressed sensing imaging framework is given, along with a review of recent applications in the biomedical imaging field and a comparison of three reconstruction algorithms evaluated for data fidelity and computational efficiency.
Abstract: In this paper, we give a short presentation of the compressed sensing imaging framework, along with a review of recent applications in the biomedical imaging field. One of the critical issues that has hindered the application of compressed sensing in a bioimaging context is the computational cost of the underlying image reconstruction process. However, some recently published algorithms manage to overcome this difficulty, leading to acceptable reconstruction times. Using simulations on biological images from fluorescence microscopy, we compare three reconstruction algorithms, evaluating data fidelity and computational efficiency.
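For orientation, the sketch below shows one generic l1-based solver, ISTA (iterative soft-thresholding), recovering a sparse signal from random Gaussian measurements. It is a textbook illustration under arbitrary assumed dimensions, sparsity level and regularization weight, and is not necessarily one of the three algorithms compared in the paper.

```python
import numpy as np

def ista(A, y, lam, n_iter=300):
    """ISTA for min_x 0.5 * ||A @ x - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L             # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding
    return x

# Toy compressed-sensing experiment: a 20-sparse signal of length 400
# recovered from 120 random Gaussian measurements (all sizes are arbitrary).
rng = np.random.default_rng(0)
n, m, k = 400, 120, 20
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true
x_hat = ista(A, y, lam=0.01)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```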

24 citations

Proceedings ArticleDOI
01 Sep 2012
TL;DR: This paper investigates how this framework can be extended to perform an efficient joint reconstruction of a sequence of time-correlated 2D images using 3D total variation regularization, and evaluates its performance on test sequences from the bio-imaging field.
Abstract: The theory of compressed sensing (CS) predicts that random (or pseudo-random) linear measurements together with non-linear reconstruction can be used to sample and recover structured signals in a compressive manner. Many previous results have demonstrated the efficiency of CS in recovering 2D images acquired with dedicated CS devices (single-pixel cameras, accelerated MRI, etc.). In this paper, we investigate how this framework can be extended to perform an efficient joint reconstruction of a sequence of time-correlated 2D images, using 3D total variation regularization. We also evaluate the performance of this framework on test sequences from the bio-imaging field.
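The regularizer at the core of this joint reconstruction is a total variation penalty applied jointly over the two spatial dimensions and time. The sketch below only evaluates such an isotropic 3D TV term on a (time, y, x) stack; the smoothing constant and the synthetic test data are placeholder assumptions, and the paper's actual reconstruction solver is not reproduced here.

```python
import numpy as np

def tv3d(v, eps=1e-8):
    """Isotropic 3D total variation of a (time, y, x) stack, smoothed by eps."""
    dt = np.diff(v, axis=0, append=v[-1:, :, :])   # temporal finite differences
    dy = np.diff(v, axis=1, append=v[:, -1:, :])   # vertical finite differences
    dx = np.diff(v, axis=2, append=v[:, :, -1:])   # horizontal finite differences
    return np.sum(np.sqrt(dt ** 2 + dy ** 2 + dx ** 2 + eps))

# A spatio-temporally smooth stack scores much lower than a noisy one,
# which is why this penalty favors time-coherent reconstructions.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 16)[:, None, None]
smooth = np.sin(2 * np.pi * t) * np.ones((16, 64, 64))
noisy = smooth + rng.normal(scale=0.5, size=smooth.shape)
print(tv3d(smooth), tv3d(noisy))
```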

18 citations


Cited by
Journal ArticleDOI
TL;DR: The origins, challenges and solutions of the NIH Image and ImageJ software are discussed, along with how their history can serve to advise and inform other software projects.
Abstract: For the past 25 years, the NIH Image and ImageJ software packages have been pioneers as open tools for the analysis of scientific images. We discuss the origins, challenges and solutions of these two programs, and how their history can serve to advise and inform other software projects.

44,587 citations

Journal ArticleDOI
TL;DR: ImageJ2 is the next generation of ImageJ; it provides a host of new functionality and separates concerns, fully decoupling the data model from the user interface.
Abstract: ImageJ is an image analysis program extensively used in the biological sciences and beyond. Due to its ease of use, recordable macro language, and extensible plug-in architecture, ImageJ enjoys contributions from non-programmers, amateur programmers, and professional developers alike. Enabling such a diversity of contributors has resulted in a large community that spans the biological and physical sciences. However, a rapidly growing user base, diverging plugin suites, and technical limitations have revealed a clear need for a concerted software engineering effort to support emerging imaging paradigms, to ensure the software’s ability to handle the requirements of modern science. We rewrote the entire ImageJ codebase, engineering a redesigned plugin mechanism intended to facilitate extensibility at every level, with the goal of creating a more powerful tool that continues to serve the existing community while addressing a wider range of scientific requirements. This next-generation ImageJ, called “ImageJ2” in places where the distinction matters, provides a host of new functionality. It separates concerns, fully decoupling the data model from the user interface. It emphasizes integration with external applications to maximize interoperability. Its robust new plugin framework allows everything from image formats, to scripting languages, to visualization to be extended by the community. The redesigned data model supports arbitrarily large, N-dimensional datasets, which are increasingly common in modern image acquisition. Despite the scope of these changes, backwards compatibility is maintained such that this new functionality can be seamlessly integrated with the classic ImageJ interface, allowing users and developers to migrate to these new methods at their own pace. Scientific imaging benefits from open-source programs that advance new method development and deployment to a diverse audience. ImageJ has continuously evolved with this idea in mind; however, new and emerging scientific requirements have posed corresponding challenges for ImageJ’s development. The described improvements provide a framework engineered for flexibility, intended to support these requirements as well as accommodate future needs. Future efforts will focus on implementing new algorithms in this framework and expanding collaborations with other popular scientific software suites.
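As a small taste of the scripting extensibility described above, the snippet below is a minimal Jython script driving the classic ImageJ API, which ImageJ2 keeps usable through its backwards-compatibility layer; the file path and filter settings are placeholders.

```python
# Minimal Jython script for ImageJ/ImageJ2; run it from the built-in script editor.
# The image path and filter parameters are placeholders.
from ij import IJ

imp = IJ.openImage("/path/to/sample.tif")    # load an image as an ImagePlus
IJ.run(imp, "Gaussian Blur...", "sigma=2")   # apply a built-in filter by command name
IJ.run(imp, "8-bit", "")                     # convert the result to 8-bit
imp.show()                                   # display the processed image
```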

4,093 citations

Journal ArticleDOI
TL;DR: QuPath provides researchers with powerful batch-processing and scripting functionality, and an extensible platform with which to develop and share new algorithms to analyze complex tissue images, making it suitable for a wide range of additional image analysis applications across biomedical research.
Abstract: QuPath is new bioimage analysis software designed to meet the growing need for a user-friendly, extensible, open-source solution for digital pathology and whole slide image analysis. In addition to offering a comprehensive panel of tumor identification and high-throughput biomarker evaluation tools, QuPath provides researchers with powerful batch-processing and scripting functionality, and an extensible platform with which to develop and share new algorithms to analyze complex tissue images. Furthermore, QuPath’s flexible design makes it suitable for a wide range of additional image analysis applications across biomedical research.

2,838 citations

Journal ArticleDOI
15 Feb 2017 - Methods
TL;DR: TrackMate is an extensible platform in which developers can easily write their own detection, particle-linking, visualization or analysis algorithms; it is validated here for quantitative lifetime analysis of clathrin-mediated endocytosis in plant cells.

2,356 citations

Posted Content
TL;DR: The entire ImageJ codebase was rewritten, engineering a redesigned plugin mechanism intended to facilitate extensibility at every level, with the goal of creating a more powerful tool that continues to serve the existing community while addressing a wider range of scientific requirements.
Abstract: ImageJ is an image analysis program extensively used in the biological sciences and beyond. Due to its ease of use, recordable macro language, and extensible plug-in architecture, ImageJ enjoys contributions from non-programmers, amateur programmers, and professional developers alike. Enabling such a diversity of contributors has resulted in a large community that spans the biological and physical sciences. However, a rapidly growing user base, diverging plugin suites, and technical limitations have revealed a clear need for a concerted software engineering effort to support emerging imaging paradigms, to ensure the software's ability to handle the requirements of modern science. Due to these new and emerging challenges in scientific imaging, ImageJ is at a critical development crossroads. We present ImageJ2, a total redesign of ImageJ offering a host of new functionality. It separates concerns, fully decoupling the data model from the user interface. It emphasizes integration with external applications to maximize interoperability. Its robust new plugin framework allows everything from image formats, to scripting languages, to visualization to be extended by the community. The redesigned data model supports arbitrarily large, N-dimensional datasets, which are increasingly common in modern image acquisition. Despite the scope of these changes, backwards compatibility is maintained such that this new functionality can be seamlessly integrated with the classic ImageJ interface, allowing users and developers to migrate to these new methods at their own pace. ImageJ2 provides a framework engineered for flexibility, intended to support these requirements as well as accommodate future needs.

2,156 citations