Author

Gemma Piella

Bio: Gemma Piella is an academic researcher from Pompeu Fabra University. The author has contributed to research in topics: Computer science & Lifting scheme. The author has an h-index of 25 and has co-authored 143 publications receiving 4,411 citations. Previous affiliations of Gemma Piella include Autonomous University of Barcelona & Polytechnic University of Catalonia.


Papers
Journal ArticleDOI
TL;DR: A new method for the automatic comparison of myocardial motion patterns and the characterization of their degree of abnormality, based on a statistical atlas of motion built from a reference healthy population is presented.
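As an illustration of the atlas idea, here is a minimal sketch in which the degree of abnormality is taken as the Mahalanobis distance between a subject's motion descriptor and the statistics of the healthy reference population. This is a common choice for atlas-based outlier scoring, not necessarily the paper's exact formulation, and the descriptor size and data below are invented for the example.

```python
import numpy as np

def abnormality_index(motion, atlas_mean, atlas_cov):
    # Mahalanobis distance of a subject's motion descriptor to the
    # statistics of a healthy reference population (illustrative score;
    # the paper builds its atlas from registered myocardial motion data).
    diff = motion - atlas_mean
    return float(np.sqrt(diff @ np.linalg.solve(atlas_cov, diff)))

# Hypothetical atlas: each row is one healthy subject's motion descriptor.
healthy = np.random.default_rng(1).normal(size=(50, 8))
mu, cov = healthy.mean(axis=0), np.cov(healthy, rowvar=False)

print(abnormality_index(healthy[0], mu, cov))  # close to the atlas: low score
print(abnormality_index(mu + 5.0, mu, cov))    # far from it: high score
```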

75 citations

Journal ArticleDOI
TL;DR: This review covers, for the first time, state-of-the-art segmentation and classification methodologies for the whole fetus and, more specifically, the fetal brain, lungs, liver, heart and placenta in magnetic resonance imaging and (3D) ultrasound.

70 citations

Journal ArticleDOI
TL;DR: Five 3D ultrasound tracking algorithms are evaluated regarding their ability to quantify abnormal deformation in timing or amplitude; radial strain was found to have low accuracy in comparison to the longitudinal and circumferential components.
Abstract: This paper evaluates five 3D ultrasound tracking algorithms regarding their ability to quantify abnormal deformation in timing or amplitude. A synthetic database of B-mode image sequences modeling healthy, ischemic and dyssynchrony cases was generated for that purpose. This database is made publicly available to the community. It combines recent advances in electromechanical and ultrasound modeling. For modeling heart mechanics, the Bestel-Clement-Sorine electromechanical model was applied to a realistic geometry. For ultrasound modeling, we applied a fast simulation technique to produce realistic images on a set of scatterers moving according to the electromechanical simulation result. Tracking and strain accuracies were computed and compared for all evaluated algorithms. For tracking, all methods estimated myocardial displacements with an error below 1 mm on the ischemic sequences. The introduction of a dilated geometry was found to have a significant impact on accuracy. Regarding strain, all methods were able to recover timing differences between segments, as well as low strain values. In all cases, radial strain was found to have low accuracy in comparison to the longitudinal and circumferential components.
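For context on the quantities being compared, strain follows from the tracked point trajectories. Below is a minimal sketch of Lagrangian strain along a tracked myocardial contour, assuming positions are available per frame with frame 0 as the reference; decomposing strain into radial, longitudinal and circumferential components additionally requires a local cardiac coordinate system, which is omitted here.

```python
import numpy as np

def lagrangian_strain(points):
    # points: shape (T, N, 3), N tracked contour points over T frames;
    # frame 0 is the reference configuration (e.g., end-diastole).
    seg = np.diff(points, axis=1)                     # segment vectors
    length = np.linalg.norm(seg, axis=2).sum(axis=1)  # contour length per frame
    return (length - length[0]) / length[0]           # dimensionless strain
```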

63 citations

01 Jan 2002
TL;DR: In this article, the authors propose a technique for building adaptive wavelet decompositions by means of an extension of the lifting scheme, and derive necessary and sufficient conditions for the invertibility of such an adaptive system in various scenarios.

Abstract: Adaptive wavelet decompositions appear useful in various applications in image and video processing, such as image analysis, compression, feature extraction, denoising and deconvolution, or optic flow estimation. For such tasks it may be important that the multiresolution representations take into account the characteristics of the underlying signal and leave intact important signal characteristics such as sharp transitions, edges, singularities or other regions of interest. In this paper, we propose a technique for building adaptive wavelets by means of an extension of the lifting scheme. The classical lifting scheme provides a simple yet flexible method for building new, possibly nonlinear, wavelets from existing ones. It comprises a given wavelet transform, followed by a prediction and an update step. The update step in such a scheme computes a modification of the approximation signal, using information in the detail band. It is obvious that such an operation can be inverted, and therefore the perfect reconstruction property is guaranteed. In this paper we propose a lifting scheme comprising an adaptive update lifting step and a fixed prediction lifting step. The adaptivity consists in the fact that the system can choose between two different update filters, with the choice triggered by the local gradient of the original signal: if the gradient is large (in some seminorm sense), the system chooses one filter; if it is small, the other. We derive necessary and sufficient conditions for the invertibility of such an adaptive system in various scenarios.
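The following runnable sketch illustrates an update-then-predict lifting step with an adaptive update in the spirit described above, with one loud simplification: the gradient trigger is computed from the odd-sample band, which the update step leaves untouched, so the synthesis side can recompute the same decisions and invertibility is immediate. The paper analyzes the harder setting where the trigger depends on the original signal itself, which is exactly why the invertibility conditions it derives are needed.

```python
import numpy as np

def analysis(s, T=1.0):
    # Polyphase split: x (even samples) is updated, y (odd samples)
    # drives the filter decisions and is later predicted.
    x, y = s[0::2].astype(float), s[1::2].astype(float)
    assert len(x) == len(y), "even-length input assumed for brevity"
    g = np.abs(np.diff(y, prepend=y[0]))   # local gradient seminorm
    smooth = g <= T                        # choose between two update filters
    upd = 0.5 * (y + np.roll(y, 1))        # smoothing update (circular boundary)
    xu = np.where(smooth, x + upd, x)      # adaptive update lifting step
    d = y - xu                             # fixed prediction lifting step
    return xu, d

def synthesis(xu, d, T=1.0):
    y = d + xu                             # invert the prediction
    g = np.abs(np.diff(y, prepend=y[0]))   # recompute the identical decisions
    smooth = g <= T
    upd = 0.5 * (y + np.roll(y, 1))
    x = np.where(smooth, xu - upd, xu)     # invert the adaptive update
    s = np.empty(2 * len(x))
    s[0::2], s[1::2] = x, y
    return s

s = np.array([1, 1, 1, 9, 9, 9, 9, 9])    # a signal with one sharp edge
xu, d = analysis(s)
assert np.allclose(synthesis(xu, d), s)   # perfect reconstruction
```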

63 citations

Journal ArticleDOI
TL;DR: This letter treats a class of adaptive update-lifting schemes that do not require bookkeeping for perfect reconstruction. The update filter is triggered by a binary threshold criterion based on a generalized gradient, which can be chosen so that it ignores portions of a signal that are polynomial up to a given order.

Abstract: This letter treats a class of adaptive update-lifting schemes that do not require bookkeeping for perfect reconstruction. The choice of the update-lifting filter is triggered by a binary threshold criterion based on a generalized gradient, chosen in such a way that the scheme smooths only homogeneous regions. The criterion can be chosen so that it ignores portions of a signal that are polynomial up to a given order: the update-lifting filter modifies the signal in these polynomial regions but leaves other portions unaffected.
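A small worked example of why such a criterion can ignore polynomial portions: the (p+1)-th finite difference of a degree-p polynomial is identically zero, so a generalized gradient built from it never fires on such regions. The difference order and test signals below are illustrative choices, not the letter's exact operator.

```python
import numpy as np

def generalized_gradient(s, order):
    # (order+1)-th finite difference: vanishes on any signal segment
    # that is polynomial of degree <= order.
    return np.abs(np.diff(s, n=order + 1))

k = np.arange(20, dtype=float)
quadratic = 3.0 - 2.0 * k + 0.5 * k**2    # degree-2 polynomial
step = np.where(k < 10, 0.0, 5.0)         # a sharp edge

print(generalized_gradient(quadratic, 2).max())  # 0: never trips the threshold
print(generalized_gradient(step, 2).max())       # large near the edge
```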

59 citations


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: This article reviews the reasons why people want to love or leave the venerable (but perhaps hoary) MSE, surveys emerging alternative signal fidelity measures, and discusses their potential application to a wide variety of problems.
Abstract: In this article, we have reviewed the reasons why we (collectively) want to love or leave the venerable (but perhaps hoary) MSE. We have also reviewed emerging alternative signal fidelity measures and discussed their potential application to a wide variety of problems. The message we are trying to send here is not that one should abandon use of the MSE nor to blindly switch to any other particular signal fidelity measure. Rather, we hope to make the point that there are powerful, easy-to-use, and easy-to-understand alternatives that might be deployed depending on the application environment and needs. While we expect (and indeed, hope) that the MSE will continue to be widely used as a signal fidelity measure, it is our greater desire to see more advanced signal fidelity measures being used, especially in applications where perceptual criteria might be relevant. Ideally, the performance of a new signal processing algorithm might be compared to other algorithms using several fidelity criteria. Lastly, we hope that we have given further motivation to the community to consider recent advanced signal fidelity measures as design criteria for optimizing signal processing algorithms and systems. It is in this direction that we believe that the greatest benefit eventually lies.
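To make the article's point concrete, the sketch below contrasts MSE with a single-window variant of SSIM (the standard index averages this quantity over local sliding windows; the global form here is a simplification). Additive noise and a uniform brightness shift are constructed to have essentially the same MSE, yet the structural measure ranks the brightness shift as far less damaging, in line with perception.

```python
import numpy as np

def mse(a, b):
    return np.mean((a - b) ** 2)

def global_ssim(a, b, L=255.0):
    # Single-window SSIM (luminance term * contrast-structure term).
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    lum = (2 * mu_a * mu_b + c1) / (mu_a**2 + mu_b**2 + c1)
    cs = (2 * cov + c2) / (a.var() + b.var() + c2)
    return lum * cs

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, size=(64, 64))
noisy = img + rng.normal(0, 50, img.shape)    # additive Gaussian noise
shifted = img + 50.0                          # uniform brightness shift

print(mse(img, noisy), mse(img, shifted))     # both close to 2500
print(global_ssim(img, noisy), global_ssim(img, shifted))  # noise scores lower
```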

2,601 citations

Proceedings Article
01 Jan 1999

2,010 citations

Journal ArticleDOI
TL;DR: This article places data fusion into the greater context of data integration, precisely defines the goals of data fusion, namely, complete, concise, and consistent data, and highlights the challenges of data Fusion.
Abstract: The development of the Internet in recent years has made it possible and useful to access many different information systems anywhere in the world to obtain information. While there is much research on the integration of heterogeneous information systems, most commercial systems stop short of the actual integration of available data. Data fusion is the process of fusing multiple records representing the same real-world object into a single, consistent, and clean representation.This article places data fusion into the greater context of data integration, precisely defines the goals of data fusion, namely, complete, concise, and consistent data, and highlights the challenges of data fusion, namely, uncertain and conflicting data values. We give an overview and classification of different ways of fusing data and present several techniques based on standard and advanced operators of the relational algebra and SQL. Finally, the article features a comprehensive survey of data integration systems from academia and industry, showing if and how data fusion is performed in each.
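As a toy illustration of the fusion step in this sense, the sketch below merges two conflicting sources with per-attribute conflict-resolution functions, echoing the article's resolution strategies (voting, preferring a maximum, and so on). The tables and the pandas phrasing are illustrative stand-ins for the article's relational-algebra and SQL operators.

```python
import pandas as pd

# Two sources describing the same real-world objects, with conflicts.
a = pd.DataFrame({"id": [1, 2], "name": ["Alice", "Bob"],    "age": [30, None]})
b = pd.DataFrame({"id": [1, 2], "name": ["Alice", "Robert"], "age": [31, 25]})

fused = (
    pd.concat([a, b])
      .groupby("id")
      .agg({
          "name": lambda s: s.mode().iloc[0],  # VOTE: most frequent value
          "age": "max",                        # MAX: prefer the larger value
      })
      .reset_index()
)
print(fused)  # one complete, consistent record per real-world object
```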

1,797 citations

Journal ArticleDOI
TL;DR: Experimental results demonstrate that the proposed method can obtain state-of-the-art performance for fusion of multispectral, multifocus, multimodal, and multiexposure images.
Abstract: A fast and effective image fusion method is proposed for creating a highly informative fused image through merging multiple images. The proposed method is based on a two-scale decomposition of an image into a base layer containing large scale variations in intensity, and a detail layer capturing small scale details. A novel guided filtering-based weighted average technique is proposed to make full use of spatial consistency for fusion of the base and detail layers. Experimental results demonstrate that the proposed method can obtain state-of-the-art performance for fusion of multispectral, multifocus, multimodal, and multiexposure images.
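A compact sketch of that pipeline for grayscale float images of equal size: box-blur base layers, Laplacian-energy saliency for initial winner-take-all weight maps, and guided filtering (He et al.) to make the weights spatially consistent before the per-layer weighted average. Window sizes and regularization values are illustrative, not the paper's tuned parameters.

```python
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter, laplace

def guided_filter(I, p, r, eps):
    # Standard guided filter: the output is locally a linear function
    # a*I + b of the guidance image I, with a box window of radius r.
    w = 2 * r + 1
    mean_I, mean_p = uniform_filter(I, w), uniform_filter(p, w)
    var_I = uniform_filter(I * I, w) - mean_I * mean_I
    cov_Ip = uniform_filter(I * p, w) - mean_I * mean_p
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return uniform_filter(a, w) * I + uniform_filter(b, w)

def fuse(images):
    # Two-scale decomposition: base = large-scale intensity, detail = rest.
    bases = [uniform_filter(im, 31) for im in images]
    details = [im - b for im, b in zip(images, bases)]
    # Saliency (smoothed Laplacian energy) and winner-take-all weight maps.
    sal = np.stack([gaussian_filter(np.abs(laplace(im)), 5) for im in images])
    masks = [(np.argmax(sal, axis=0) == k).astype(float)
             for k in range(len(images))]
    # Spatial consistency: large window for the base, small for the detail.
    wb = [guided_filter(im, m, r=15, eps=0.3) for im, m in zip(images, masks)]
    wd = [guided_filter(im, m, r=3, eps=1e-6) for im, m in zip(images, masks)]
    base = sum(w * b for w, b in zip(wb, bases)) / (sum(wb) + 1e-12)
    detail = sum(w * d for w, d in zip(wd, details)) / (sum(wd) + 1e-12)
    return base + detail
```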

1,300 citations