Author

Gemma Piella

Bio: Gemma Piella is an academic researcher from Pompeu Fabra University. The author has contributed to research in topics including Computer science and Lifting scheme. The author has an h-index of 25 and has co-authored 143 publications receiving 4,411 citations. Previous affiliations of Gemma Piella include the Autonomous University of Barcelona and the Polytechnic University of Catalonia.


Papers
Journal ArticleDOI
TL;DR: The proposed TTTS (twin-to-twin transfusion syndrome) fetal surgery planning and simulation platform is integrated into a flexible C++ and MITK-based application, providing full exploration of the intrauterine environment by simulating both the fetoscope camera and the laser ablation.

17 citations

Journal ArticleDOI
TL;DR: Two ensembling strategies, namely stacking and cascading, are explored to combine the strengths of both families; results show that either combination strategy outperforms all of the individual methods, demonstrating the capability of learning systematic combinations that lead to an overall improvement.
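For readers unfamiliar with the two strategies, here is a minimal sketch of stacking and cascading on synthetic data using scikit-learn. The base learners (logistic regression and a random forest) and the dataset are placeholder choices, not those of the paper.

```python
# Minimal sketch of stacking vs. cascading on synthetic data (placeholder
# models and data, not the paper's setup).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stacking: a meta-learner is trained on the base models' predictions.
stack = StackingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(random_state=0))],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_tr, y_tr)
print("stacking accuracy:", stack.score(X_te, y_te))

# Cascading: the first model's output probability is appended to the
# feature vector and passed on to the second model.
first = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
X_tr_casc = np.hstack([X_tr, first.predict_proba(X_tr)[:, [1]]])
X_te_casc = np.hstack([X_te, first.predict_proba(X_te)[:, [1]]])
second = RandomForestClassifier(random_state=0).fit(X_tr_casc, y_tr)
print("cascading accuracy:", second.score(X_te_casc, y_te))
```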

17 citations

Proceedings ArticleDOI
10 Dec 2002
TL;DR: A method is presented for the construction of nonlinear 2D wavelet decompositions using an adaptive update lifting scheme, which allows perfect reconstruction and better preservation of edges, even at low resolutions.
Abstract: The paper discusses a method for the construction of nonlinear 2D wavelet decompositions using an adaptive update lifting scheme. A very interesting aspect is that the decomposition does not require any bookkeeping, i.e., it is nonredundant, but that it nevertheless allows perfect reconstruction. The major ingredient of the construction is the so-called decision map, which triggers the choice of the update filter. Another interesting point is the possibility of better preserving edges, even at low resolutions.
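To make the idea concrete, below is a simplified 1D sketch of an adaptive update-lifting step. It is not the paper's 2D scheme or its gradient-seminorm decision criterion: the decision map here is a simple threshold on the odd samples, which the update step never modifies, so the decoder can recompute the same decisions and invert the transform exactly without storing any bookkeeping.

```python
# Simplified 1D adaptive update-lifting sketch (not the paper's 2D scheme).
import numpy as np

THRESHOLD = 0.5  # hypothetical edge-detection threshold

def forward(x):
    e, o = x[0::2].astype(float), x[1::2].astype(float)
    o_left, o_right = np.roll(o, 1), o                  # odd neighbours of each even sample
    decision = np.abs(o_left - o_right) > THRESHOLD     # True near an "edge": do not smooth across it
    w = np.where(decision, 0.0, 0.25)                   # adaptive update weight
    a = e + w * (o_left + o_right)                      # adaptive update
    d = o - a                                           # simple prediction of the odd samples
    return a, d

def inverse(a, d):
    o = d + a                                           # undo the prediction
    o_left, o_right = np.roll(o, 1), o
    decision = np.abs(o_left - o_right) > THRESHOLD     # recomputed, identical decision map
    w = np.where(decision, 0.0, 0.25)
    e = a - w * (o_left + o_right)                      # undo the update
    x = np.empty(e.size + o.size)
    x[0::2], x[1::2] = e, o
    return x

x = np.array([1.0, 1.1, 0.9, 1.0, 5.0, 5.2, 4.9, 5.1])  # signal with a sharp edge
a, d = forward(x)
assert np.allclose(inverse(a, d), x)                    # perfect reconstruction, no bookkeeping
```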

17 citations

Journal ArticleDOI
TL;DR: In HCM patients, segments with normal contraction coexisted with segments exhibiting no or significantly reduced deformation, resulting in greater heterogeneity of strain values; strain distribution thus characterized specific patterns of myocardial deformation in patients with LVH of different etiologies.
Abstract: Introduction and objectives: It has been suggested that in hypertrophic cardiomyopathy (HCM) the regional fiber disarray results in segments of no or severely reduced deformation, distributed non-uniformly within the left ventricle (LV). This is in contrast with other types of hypertrophy, such as athlete's heart or hypertensive left ventricular hypertrophy (HT-LVH), which may show abnormal cardiac deformation but never so reduced that deformation is absent in certain segments. Hence, we propose to use the distribution of strain values to study deformation in HCM. Methods: Using tagged magnetic resonance imaging, we reconstructed the LV systolic deformation of 12 controls, 10 athletes, 12 patients with HCM and 10 patients with HT-LVH. Deformation was quantified using a fast nonrigid registration algorithm and measuring radial and circumferential peak systolic strain values from 16 LV segments. Results: Hypertrophic cardiomyopathy patients showed significantly lower average strain values compared to the other groups. However, while the deformation in healthy subjects and HT-LVH was concentrated around the mean value, in HCM segments with normal contraction coexisted with segments with no or significantly reduced deformation, resulting in greater heterogeneity of the strain values. Some nondeformed segments were also found in the absence of fibrosis or hypertrophy. Conclusions: Strain distribution characterizes specific patterns of myocardial deformation in patients with different etiologies of LVH. HCM patients had significantly lower average strain as well as greater strain heterogeneity (compared to controls, athletes and HT-LVH), and they presented nondeformed regions.
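As a toy illustration of the distribution-based analysis (not the tagged-MRI registration pipeline itself), the snippet below summarizes 16 invented segmental peak strain values per subject by their mean and by a heterogeneity measure, here the standard deviation across segments.

```python
# Toy illustration only: segmental strain values are invented, and the
# standard deviation is used as a stand-in heterogeneity measure.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical circumferential peak systolic strain (%) for one control-like
# and one HCM-like subject (16 LV segments each).
control_segments = rng.normal(loc=-18.0, scale=2.0, size=16)
hcm_segments = np.concatenate([
    rng.normal(loc=-17.0, scale=2.0, size=10),   # segments that still contract normally
    rng.normal(loc=-3.0, scale=1.5, size=6),     # segments with little or no deformation
])

for name, seg in [("control", control_segments), ("HCM", hcm_segments)]:
    print(f"{name:8s} mean strain = {seg.mean():6.1f} %   "
          f"heterogeneity (SD) = {seg.std(ddof=1):4.1f} %")
```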

17 citations

Journal ArticleDOI
TL;DR: Discusses the state of the art, the current clinical status, and the challenges associated with the two latter tasks of interpretation and decision support, together with the challenges related to the learning process, auditability/traceability, system infrastructure and integration within clinical processes in cardiovascular imaging.
Abstract: The use of machine learning (ML) approaches to target clinical problems is poised to revolutionize clinical decision-making in cardiology. The success of these tools depends on understanding the intrinsic processes of the conventional pathway by which clinicians make decisions. In parallel with this pathway, ML can have an impact at four levels: data acquisition, predominantly by extracting standardized, high-quality information with the smallest possible learning curve; feature extraction, by relieving healthcare practitioners from performing tedious measurements on raw data; interpretation, by digesting complex, heterogeneous data in order to augment the understanding of the patient status; and decision support, by leveraging the previous steps to predict clinical outcomes or response to treatment, or to recommend a specific intervention. This paper discusses the state of the art, as well as the current clinical status and challenges associated with the two latter tasks of interpretation and decision support, together with the challenges related to the learning process, auditability/traceability, system infrastructure and integration within clinical processes in cardiovascular imaging.

16 citations


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: This article reviews the reasons why people want to love or leave the venerable (but perhaps hoary) MSE, surveys emerging alternative signal fidelity measures, and discusses their potential application to a wide variety of problems.
Abstract: In this article, we have reviewed the reasons why we (collectively) want to love or leave the venerable (but perhaps hoary) MSE. We have also reviewed emerging alternative signal fidelity measures and discussed their potential application to a wide variety of problems. The message we are trying to send here is not that one should abandon use of the MSE nor to blindly switch to any other particular signal fidelity measure. Rather, we hope to make the point that there are powerful, easy-to-use, and easy-to-understand alternatives that might be deployed depending on the application environment and needs. While we expect (and indeed, hope) that the MSE will continue to be widely used as a signal fidelity measure, it is our greater desire to see more advanced signal fidelity measures being used, especially in applications where perceptual criteria might be relevant. Ideally, the performance of a new signal processing algorithm might be compared to other algorithms using several fidelity criteria. Lastly, we hope that we have given further motivation to the community to consider recent advanced signal fidelity measures as design criteria for optimizing signal processing algorithms and systems. It is in this direction that we believe that the greatest benefit eventually lies.
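A small demonstration of the article's central point follows: two distortions with roughly the same MSE can receive very different scores from an alternative fidelity measure such as SSIM. The test image and distortion types below are arbitrary choices using scikit-image, not examples taken from the article.

```python
# Same (approximate) MSE, different perceptual quality: compare MSE and SSIM.
import numpy as np
from skimage import data, img_as_float
from skimage.metrics import mean_squared_error, structural_similarity

ref = img_as_float(data.camera())          # grayscale test image in [0, 1]

# Distortion 1: additive Gaussian noise.
rng = np.random.default_rng(0)
noisy = np.clip(ref + rng.normal(scale=0.05, size=ref.shape), 0, 1)

# Distortion 2: a constant luminance shift chosen to give roughly the same
# MSE (clipping at 1.0 makes the match approximate).
target_mse = mean_squared_error(ref, noisy)
shifted = np.clip(ref + np.sqrt(target_mse), 0, 1)

for name, img in [("gaussian noise", noisy), ("luminance shift", shifted)]:
    mse = mean_squared_error(ref, img)
    ssim = structural_similarity(ref, img, data_range=1.0)
    print(f"{name:16s} MSE = {mse:.4f}   SSIM = {ssim:.3f}")
```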

2,601 citations

Proceedings Article
01 Jan 1999

2,010 citations

Journal ArticleDOI
TL;DR: This article places data fusion into the greater context of data integration, precisely defines the goals of data fusion, namely complete, concise, and consistent data, and highlights the challenges of data fusion, namely uncertain and conflicting data values.
Abstract: The development of the Internet in recent years has made it possible and useful to access many different information systems anywhere in the world to obtain information. While there is much research on the integration of heterogeneous information systems, most commercial systems stop short of the actual integration of available data. Data fusion is the process of fusing multiple records representing the same real-world object into a single, consistent, and clean representation.This article places data fusion into the greater context of data integration, precisely defines the goals of data fusion, namely, complete, concise, and consistent data, and highlights the challenges of data fusion, namely, uncertain and conflicting data values. We give an overview and classification of different ways of fusing data and present several techniques based on standard and advanced operators of the relational algebra and SQL. Finally, the article features a comprehensive survey of data integration systems from academia and industry, showing if and how data fusion is performed in each.
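The following toy example (using pandas rather than the relational-algebra and SQL operators surveyed in the article) shows one common fusion strategy: records describing the same real-world entity are grouped, and conflicting or missing attribute values are resolved, here by taking the most frequent non-null value. Column names and data are invented for illustration.

```python
# Toy record-level data fusion: one resolution strategy among the many
# the survey classifies (invented data).
import pandas as pd

# Several source records describing the same real-world entities,
# with missing and conflicting attribute values.
records = pd.DataFrame({
    "entity_id": [1, 1, 1, 2, 2],
    "name":      ["G. Piella", "Gemma Piella", "Gemma Piella", "J. Doe", "J. Doe"],
    "city":      ["Barcelona", None, "Barcelona", "Madrid", "Madrid"],
    "year":      [2001, 2003, None, 1999, 1999],
})

def resolve(values: pd.Series):
    """Conflict resolution: most frequent non-null value, else None."""
    non_null = values.dropna()
    return non_null.mode().iloc[0] if not non_null.empty else None

# One complete, concise, consistent record per entity.
fused = records.groupby("entity_id").agg(resolve)
print(fused)
```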

1,797 citations

Journal ArticleDOI
TL;DR: Experimental results demonstrate that the proposed method can obtain state-of-the-art performance for fusion of multispectral, multifocus, multimodal, and multiexposure images.
Abstract: A fast and effective image fusion method is proposed for creating a highly informative fused image through merging multiple images. The proposed method is based on a two-scale decomposition of an image into a base layer containing large scale variations in intensity, and a detail layer capturing small scale details. A novel guided filtering-based weighted average technique is proposed to make full use of spatial consistency for fusion of the base and detail layers. Experimental results demonstrate that the proposed method can obtain state-of-the-art performance for fusion of multispectral, multifocus, multimodal, and multiexposure images.
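A condensed sketch of the two-scale, guided-filtering-based fusion idea is given below for two grayscale inputs. The base/detail split, the Laplacian-based saliency and the guided-filter weight refinement follow the description above, but the filter sizes and eps values are typical choices rather than the paper's exact settings, and a minimal guided filter is implemented from box filters instead of being taken from a library.

```python
# Condensed two-scale fusion sketch; parameters are typical choices,
# not the paper's exact settings.
import numpy as np
from scipy.ndimage import gaussian_filter, laplace, uniform_filter

def guided_filter(guide, src, radius, eps):
    """Minimal grayscale guided filter (box-filter formulation)."""
    size = 2 * radius + 1
    mean_g = uniform_filter(guide, size)
    mean_s = uniform_filter(src, size)
    cov_gs = uniform_filter(guide * src, size) - mean_g * mean_s
    var_g = uniform_filter(guide * guide, size) - mean_g * mean_g
    a = cov_gs / (var_g + eps)
    b = mean_s - a * mean_g
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

def fuse_two_scale(images, base_size=31, r_base=45, eps_base=0.3,
                   r_detail=7, eps_detail=1e-6):
    """Fuse grayscale images in [0, 1] via a two-scale decomposition."""
    images = [np.asarray(img, dtype=float) for img in images]
    bases = [uniform_filter(img, base_size) for img in images]   # large-scale intensity variations
    details = [img - b for img, b in zip(images, bases)]         # small-scale details
    # Per-pixel saliency and "winner-takes-all" initial weight maps.
    saliency = np.stack([gaussian_filter(np.abs(laplace(img)), 5) for img in images])
    initial = (saliency == saliency.max(axis=0)).astype(float)
    # Refine the weight maps by guided filtering, guided by each source image.
    w_base = np.stack([guided_filter(img, p, r_base, eps_base)
                       for img, p in zip(images, initial)])
    w_detail = np.stack([guided_filter(img, p, r_detail, eps_detail)
                         for img, p in zip(images, initial)])
    w_base /= w_base.sum(axis=0) + 1e-12
    w_detail /= w_detail.sum(axis=0) + 1e-12
    # Weighted average of the base and detail layers, then recombine.
    fused = sum(w * b for w, b in zip(w_base, bases)) + \
            sum(w * d for w, d in zip(w_detail, details))
    return np.clip(fused, 0.0, 1.0)

# Quick smoke test on random data, just to show the call signature.
rng = np.random.default_rng(0)
print(fuse_two_scale([rng.random((128, 128)), rng.random((128, 128))]).shape)
```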

1,300 citations