Author

Miguel Ángel González Ballester

Bio: Miguel Ángel González Ballester is an academic researcher at Pompeu Fabra University. He has contributed to research on topics including segmentation and point distribution models. He has an h-index of 25 and has co-authored 194 publications receiving 2,913 citations. His previous affiliations include T-Systems and the Catalan Institution for Research and Advanced Studies.


Papers
Journal ArticleDOI
TL;DR: The paper measures how far state-of-the-art deep learning methods can go at assessing CMRI, i.e., segmenting the myocardium and the two ventricles as well as classifying pathologies, and finds that the best methods open the door to highly accurate and fully automatic analysis of cardiac MRI.
Abstract: Delineation of the left ventricular cavity, myocardium, and right ventricle from cardiac magnetic resonance images (multi-slice 2-D cine MRI) is a common clinical task to establish diagnosis. The automation of the corresponding tasks has thus been the subject of intense research over the past decades. In this paper, we introduce the “Automatic Cardiac Diagnosis Challenge” dataset (ACDC), the largest publicly available and fully annotated dataset for the purpose of cardiac MRI (CMR) assessment. The dataset contains data from 150 multi-equipment CMRI recordings with reference measurements and classification from two medical experts. The overarching objective of this paper is to measure how far state-of-the-art deep learning methods can go at assessing CMRI, i.e., segmenting the myocardium and the two ventricles as well as classifying pathologies. In the wake of the 2017 MICCAI-ACDC challenge, we report results from deep learning methods provided by nine research groups for the segmentation task and four groups for the classification task. Results show that the best methods faithfully reproduce the expert analysis, leading to a mean correlation score of 0.97 for the automatic extraction of clinical indices and an accuracy of 0.96 for automatic diagnosis. These results clearly open the door to highly accurate and fully automatic analysis of CMRI. We also identify scenarios in which deep learning methods are still failing. Both the dataset and detailed results are publicly available online, while the platform will remain open for new submissions.

1,056 citations
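As a concrete illustration of the kind of evaluation used in challenges such as ACDC above, here is a minimal sketch of a per-structure Dice overlap computation; the label convention, array names, and random inputs are assumptions made for the example, not the challenge's actual evaluation code.

```python
# Minimal sketch: per-structure Dice overlap between a predicted and a reference
# label map. Label values (1=RV, 2=myocardium, 3=LV cavity) are an assumed convention.
import numpy as np

def dice_score(pred: np.ndarray, ref: np.ndarray, label: int) -> float:
    """Dice overlap for one structure (e.g. RV, myocardium, LV cavity)."""
    p = pred == label
    r = ref == label
    denom = p.sum() + r.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(p, r).sum() / denom

# Hypothetical 3-D label maps: 0 = background, 1 = RV, 2 = myocardium, 3 = LV cavity
rng = np.random.default_rng(0)
pred = rng.integers(0, 4, size=(10, 256, 256))
ref = rng.integers(0, 4, size=(10, 256, 256))
for name, lbl in [("RV", 1), ("myocardium", 2), ("LV cavity", 3)]:
    print(f"{name}: Dice = {dice_score(pred, ref, lbl):.3f}")
```

In the challenge itself, overlap metrics of this kind are complemented by clinical indices (volumes, ejection fraction) whose agreement with expert measurements underlies the 0.97 correlation score quoted in the abstract.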

Journal ArticleDOI
TL;DR: This paper provides a statistical estimation framework to quantify the partial volume effect (PVE) and to propagate voxel-based estimates in order to compute global magnitudes, such as volume, with associated estimates of uncertainty.

184 citations
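To make the idea of the partial-volume paper above concrete, here is a minimal sketch of propagating per-voxel fractional tissue estimates to a global volume with an uncertainty estimate; the independence assumption, voxel size, and array contents are illustrative choices, not the paper's model.

```python
# Minimal sketch: propagate per-voxel tissue-fraction estimates (and their
# variances) to a global volume estimate with an uncertainty. All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
voxel_volume_mm3 = 1.0 * 1.0 * 1.5             # assumed voxel dimensions (mm)
frac_mean = rng.random((64, 64, 32))           # hypothetical per-voxel tissue fractions in [0, 1]
frac_var = np.full_like(frac_mean, 0.01)       # hypothetical per-voxel estimation variances

volume_mean = voxel_volume_mm3 * frac_mean.sum()
# Assuming statistically independent voxel estimates, variances simply add:
volume_std = voxel_volume_mm3 * np.sqrt(frac_var.sum())

print(f"estimated volume = {volume_mean:.0f} +/- {volume_std:.0f} mm^3")
```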

Journal ArticleDOI
TL;DR: This paper presents a 2D/3D correspondence building method based on a non-rigid 2D point matching process, which iteratively uses a symmetric injective nearest-neighbor mapping operator and 2D thin-plate-spline-based deformations to find a fraction of best-matched 2D point pairs between features extracted from the X-ray images and those extracted from the 3D model.

165 citations
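A rough sketch of one ingredient of the 2D/3D correspondence method above: a symmetric (mutual) nearest-neighbor matching step that keeps only point pairs that select each other, yielding an injective set of correspondences. The thin-plate-spline deformation and the iterative loop are omitted, and the point sets are synthetic.

```python
# Minimal sketch: symmetric (mutual) nearest-neighbour matching between two 2-D point sets.
import numpy as np
from scipy.spatial import cKDTree

def mutual_nearest_pairs(a: np.ndarray, b: np.ndarray) -> list:
    """Return index pairs (i, j) such that b[j] is nearest to a[i] and a[i] is nearest to b[j]."""
    nn_ab = cKDTree(b).query(a)[1]   # for each point of a, index of its nearest neighbour in b
    nn_ba = cKDTree(a).query(b)[1]   # for each point of b, index of its nearest neighbour in a
    return [(i, j) for i, j in enumerate(nn_ab) if nn_ba[j] == i]

rng = np.random.default_rng(0)
a = rng.random((100, 2))   # e.g. contour features extracted from an X-ray image
b = rng.random((120, 2))   # e.g. projected silhouette points of the 3D model
pairs = mutual_nearest_pairs(a, b)
print(f"{len(pairs)} mutually nearest point pairs kept")
```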

Journal ArticleDOI
TL;DR: This paper proposes a novel method to construct a patient-specific three-dimensional model that provides an appropriate intra-operative visualization without the need for pre- or intra-operative imaging.

137 citations

Journal ArticleDOI
TL;DR: A new fully automatic approach based on Deep Convolutional Neural Networks (DCNN) for robust and reproducible thrombus region-of-interest detection and subsequent fine thrombus segmentation is presented, together with a new segmentation network architecture based on Fully Convolutional Networks and a Holistically-Nested Edge Detection network.

114 citations
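As a toy illustration of the "holistically-nested" idea mentioned above, the following PyTorch module takes side outputs at several encoder depths, upsamples them to input resolution, and fuses them into a single contour map. Channel counts, depths, and layer choices are arbitrary stand-ins, not the paper's architecture.

```python
# Toy sketch of holistically-nested side-output fusion (not the paper's network).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyHED(nn.Module):
    def __init__(self, in_ch: int = 1):
        super().__init__()
        self.block1 = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU())
        self.block2 = nn.Sequential(nn.MaxPool2d(2), nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.block3 = nn.Sequential(nn.MaxPool2d(2), nn.Conv2d(32, 64, 3, padding=1), nn.ReLU())
        self.side1 = nn.Conv2d(16, 1, 1)   # side output at full resolution
        self.side2 = nn.Conv2d(32, 1, 1)   # side output at 1/2 resolution
        self.side3 = nn.Conv2d(64, 1, 1)   # side output at 1/4 resolution
        self.fuse = nn.Conv2d(3, 1, 1)     # learned fusion of the side outputs

    def forward(self, x):
        f1 = self.block1(x)
        f2 = self.block2(f1)
        f3 = self.block3(f2)
        size = x.shape[2:]
        s1 = self.side1(f1)
        s2 = F.interpolate(self.side2(f2), size=size, mode="bilinear", align_corners=False)
        s3 = F.interpolate(self.side3(f3), size=size, mode="bilinear", align_corners=False)
        return torch.sigmoid(self.fuse(torch.cat([s1, s2, s3], dim=1)))

out = TinyHED()(torch.randn(1, 1, 64, 64))
print(out.shape)   # torch.Size([1, 1, 64, 64])
```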


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
31 Jan 2002 - Neuron
TL;DR: In this paper, a technique for automatically assigning a neuroanatomical label to each voxel in an MRI volume based on probabilistic information automatically estimated from a manually labeled training set is presented.

7,120 citations
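The voxel-wise labelling idea summarized above can be illustrated with a toy maximum-a-posteriori rule: at each voxel, pick the label that maximizes the atlas prior times an intensity likelihood. The Gaussian class parameters and array shapes below are invented for the example, and the paper's actual model also exploits spatial (Markov) dependencies that this sketch ignores.

```python
# Toy sketch: per-voxel MAP labelling from a probabilistic atlas prior and
# Gaussian intensity likelihoods. All parameters and data are invented.
import numpy as np

rng = np.random.default_rng(0)
n_labels, shape = 4, (32, 32, 32)
prior = rng.dirichlet(np.ones(n_labels), size=shape)   # atlas prior p(label | location)
image = rng.random(shape) * 100.0                      # hypothetical MRI intensities
mu = np.array([10.0, 40.0, 70.0, 95.0])                # assumed class intensity means
sigma = np.array([5.0, 8.0, 8.0, 6.0])                 # assumed class intensity std devs

# Gaussian likelihood of the observed intensity under each class, per voxel
lik = np.exp(-0.5 * ((image[..., None] - mu) / sigma) ** 2) / sigma
labels = np.argmax(prior * lik, axis=-1)               # MAP label per voxel
print(labels.shape, np.bincount(labels.ravel(), minlength=n_labels))
```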

Journal ArticleDOI

6,278 citations

Journal ArticleDOI
TL;DR: nnU-Net is a deep learning-based segmentation method that automatically configures itself, including preprocessing, network architecture, training, and post-processing, for any new task.
Abstract: Biomedical imaging is a driver of scientific discovery and a core component of medical care and is being stimulated by the field of deep learning. While semantic segmentation algorithms enable image analysis and quantification in many applications, the design of respective specialized solutions is non-trivial and highly dependent on dataset properties and hardware conditions. We developed nnU-Net, a deep learning-based segmentation method that automatically configures itself, including preprocessing, network architecture, training and post-processing for any new task. The key design choices in this process are modeled as a set of fixed parameters, interdependent rules and empirical decisions. Without manual intervention, nnU-Net surpasses most existing approaches, including highly specialized solutions on 23 public datasets used in international biomedical segmentation competitions. We make nnU-Net publicly available as an out-of-the-box tool, rendering state-of-the-art segmentation accessible to a broad audience by requiring neither expert knowledge nor computing resources beyond standard network training.

2,040 citations
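To give a flavour of the "self-configuring" behaviour described above, here is a conceptual sketch that derives a target spacing and patch size from a dataset fingerprint using fixed rules; the specific rules and numbers are simplified stand-ins, not nnU-Net's actual heuristics.

```python
# Conceptual sketch only: derive preprocessing/architecture settings from simple
# dataset statistics with fixed rules. Not nnU-Net's real configuration logic.
import numpy as np

def configure(spacings: np.ndarray, shapes: np.ndarray, max_voxels: int = 128**3) -> dict:
    """Derive a target spacing (mm) and patch size (voxels) from per-case spacings/shapes."""
    target_spacing = np.median(spacings, axis=0)           # resample all cases to the median spacing
    median_shape = np.median(shapes, axis=0).astype(int)   # typical image size
    patch = np.minimum(median_shape, 192)                  # cap the patch at the typical image size
    while np.prod(patch) > max_voxels:                     # shrink until it fits an assumed memory budget
        patch = np.maximum(patch // 2 * 2 - 16, 16)
    return {"target_spacing": target_spacing.tolist(), "patch_size": patch.tolist()}

# Hypothetical dataset fingerprint: five cases with (z, y, x) spacing and shape
spacings = np.array([[3.0, 1.0, 1.0]] * 5)
shapes = np.array([[20, 320, 320], [18, 288, 288], [24, 320, 320], [20, 256, 256], [22, 320, 320]])
print(configure(spacings, shapes))
```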