Author

Gemma Piella

Bio: Gemma Piella is an academic researcher from Pompeu Fabra University. The author has contributed to research in the topics of computer science and the lifting scheme. The author has an h-index of 25 and has co-authored 143 publications receiving 4,411 citations. Previous affiliations of Gemma Piella include the Autonomous University of Barcelona and the Polytechnic University of Catalonia.


Papers
Journal ArticleDOI
TL;DR: In this paper, the effects of quantization in an adaptive wavelet decomposition are discussed, and conditions are provided for recovering the original decisions at the synthesis and for relating the reconstruction error to the quantization error.
Abstract: Classical linear wavelet representations of images have the drawback that they are not optimally suited to represent edge information. To overcome this problem, nonlinear multiresolution decompositions have been designed to take into account the characteristics of the input signal/image. In our previous work [20, 22, 23] we have introduced an adaptive lifting framework that does not require bookkeeping but has the property that it processes edges and homogeneous image regions in a different fashion. The current paper discusses the effects of quantization in such an adaptive wavelet decomposition. We provide conditions for recovering the original decisions at the synthesis and for relating the reconstruction error to the quantization error. Such an analysis is essential for the application of these adaptive decompositions in image compression.

11 citations
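
To make the adaptive-lifting idea concrete, here is a minimal one-dimensional Python sketch (an illustration only, with an assumed threshold and averaging update, not the authors' exact operators). The decision is a function of the detail signal itself, so it can be re-derived at synthesis without bookkeeping:

import numpy as np

def analysis(x, T=10.0):
    # One adaptive update-lifting step on an even-length 1-D signal.
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    d = odd - even                     # detail ~ local gradient
    smooth = np.abs(d) < T             # decision: homogeneous vs. edge
    a = even + np.where(smooth, 0.5 * d, 0.0)  # update only where smooth
    return a, d

def synthesis(a, d, T=10.0):
    # The decision is recomputed from the details themselves,
    # so no side information has to be stored or transmitted.
    smooth = np.abs(d) < T
    even = a - np.where(smooth, 0.5 * d, 0.0)
    odd = even + d
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([10, 11, 12, 80, 82, 81, 15, 14], dtype=float)
assert np.allclose(synthesis(*analysis(x)), x)   # perfect reconstruction

Once d is quantized, however, |d| can cross the threshold T and flip the decision at synthesis; the paper's conditions rule this out, keeping the reconstruction error controlled by the quantization error.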

Book ChapterDOI
14 Sep 2017
TL;DR: A novel descriptor is presented that uses the similarity between local image patches to encode local displacements due to atrophy between a pair of longitudinal MRI scans and achieves 76% accuracy in predicting which MCI patients will progress to AD up to 3 years before conversion.
Abstract: Alzheimer’s disease (AD) is characterized by a progressive decline in the cognitive functions accompanied by an atrophic process which can already be observed in the early stages using magnetic resonance images (MRI). Individualized prediction of future progression to AD, when patients are still in the mild cognitive impairment (MCI) stage, has potential impact for preventive treatment. Atrophy patterns extracted from longitudinal MRI sequences provide valuable information to identify MCI patients at higher risk of developing AD in the future. We present a novel descriptor that uses the similarity between local image patches to encode local displacements due to atrophy between a pair of longitudinal MRI scans. Using a conventional logistic regression classifier, our descriptor achieves 76% accuracy in predicting which MCI patients will progress to AD up to 3 years before conversion.

11 citations
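
A rough sketch of such a pipeline, assuming registered scan pairs and using normalized patch correlation as a stand-in for the paper's displacement-encoding descriptor (function names, patch size, and the 2-D toy data are all illustrative):

import numpy as np
from sklearn.linear_model import LogisticRegression

def patch_similarity_descriptor(img0, img1, patch=5, stride=8):
    # Normalized correlation between corresponding patches of a
    # baseline/follow-up pair; low similarity hints at local change.
    r = patch // 2
    feats = []
    for i in range(r, img0.shape[0] - r, stride):
        for j in range(r, img0.shape[1] - r, stride):
            p0 = img0[i - r:i + r + 1, j - r:j + r + 1].ravel()
            p1 = img1[i - r:i + r + 1, j - r:j + r + 1].ravel()
            p0 = (p0 - p0.mean()) / (p0.std() + 1e-8)
            p1 = (p1 - p1.mean()) / (p1.std() + 1e-8)
            feats.append(float(p0 @ p1) / p0.size)
    return np.array(feats)

# Synthetic stand-ins for registered longitudinal scan pairs (2-D here;
# real MRI volumes are 3-D) and synthetic progression labels.
rng = np.random.default_rng(0)
pairs = [(rng.random((64, 64)), rng.random((64, 64))) for _ in range(20)]
X = np.stack([patch_similarity_descriptor(a, b) for a, b in pairs])
y = rng.integers(0, 2, size=20)
clf = LogisticRegression(max_iter=1000).fit(X, y)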

01 Jan 2002
TL;DR: In this article, the authors present an overview of image fusion techniques using multiresolution decompositions and propose a new region-based approach which combines aspects of both object and pixel-level fusion.
Abstract: This paper presents an overview of image fusion techniques using multiresolution decompositions. The aim is two-fold: (i) to reframe the multiresolution-based fusion methodology into a common formalism and, within this framework, (ii) to develop a new region-based approach which combines aspects of both object and pixel-level fusion. To this end, we first present a general framework which encompasses most of the existing multiresolution-based fusion schemes and provides freedom to create new ones. Then, we extend this framework to allow a region-based fusion approach. The basic idea is to make a multiresolution segmentation based on all different input images and to use this segmentation to guide the fusion process. Performance assessment is also addressed and future directions and open problems are discussed as well.

11 citations
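
The simplest pixel-level instance of this common formalism can be written in a few lines with PyWavelets (a sketch only: the wavelet, level, and 'choose-max' rule are typical defaults, not prescriptions from the paper; the region-based variant would drive the rule with a segmentation instead):

import numpy as np
import pywt  # PyWavelets

def mr_fuse(img_a, img_b, wavelet="db2", level=3):
    # Average the coarse approximations; keep the detail coefficient
    # with the larger magnitude at each position and scale.
    ca = pywt.wavedec2(img_a, wavelet, level=level)
    cb = pywt.wavedec2(img_b, wavelet, level=level)
    fused = [0.5 * (ca[0] + cb[0])]                    # approximation
    for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
        fused.append(tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                           for x, y in [(ha, hb), (va, vb), (da, db)]))
    return pywt.waverec2(fused, wavelet)

a = np.random.default_rng(0).random((128, 128))
b = np.roll(a, 8, axis=1)          # stand-in for a second sensor image
f = mr_fuse(a, b)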

Proceedings ArticleDOI
24 Nov 2003
TL;DR: The effects of quantization in such an adaptive wavelet decomposition are discussed, and conditions are provided for recovering the original decisions at the synthesis and for relating the reconstruction error to the quantization error.
Abstract: Classical linear wavelet representations of images have the drawback that they are not well suited to represent edge information. To overcome this problem, nonlinear multiresolution decompositions are being designed that can take into account the characteristics of the input signal/image. In our previous work [(G. Piella et al., July 2002), (H.J.A.M. Heijmans et al., 2002)] we have introduced an adaptive lifting framework that does not require bookkeeping but has the property that it processes edges and homogeneous regions in an image in a different fashion. The current paper discusses the effects of quantization in such an adaptive wavelet decomposition. We provide conditions for recovering the original decisions at the synthesis and for relating the reconstruction error to the quantization error. Such an analysis is essential for the application of these adaptive decompositions in image compression algorithms.

11 citations
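
In the same spirit as the sketch above, a tiny numerical check of the phenomenon the paper analyzes: when the details that drive the decisions are quantized, some decisions can flip at synthesis (threshold, step size, and data are assumed):

import numpy as np

rng = np.random.default_rng(0)
T, step = 10.0, 4.0                  # decision threshold, quantizer step
d = rng.uniform(-30, 30, size=100000)        # stand-in detail coefficients
dq = step * np.round(d / step)               # uniform quantization
flips = (np.abs(d) < T) != (np.abs(dq) < T)  # decisions that disagree
print(f"{flips.mean():.2%} of decisions flip")
# A flip is only possible when |d| lies within step/2 of T, so, for
# example, placing T on a quantizer decision boundary avoids flips.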

Book ChapterDOI
05 Oct 2012
TL;DR: This paper presents motion and deformation quantification results obtained from synthetic and in vitro phantom data provided by the second cardiac Motion Analysis Challenge at STACOM-MICCAI, using the Temporal Diffeomorphic Free Form Deformation (TDFFD) algorithm.
Abstract: This paper presents motion and deformation quantification results obtained from synthetic and in vitro phantom data provided by the second cardiac Motion Analysis Challenge at STACOM-MICCAI. We applied the Temporal Diffeomorphic Free Form Deformation (TDFFD) algorithm to the datasets. This algorithm builds upon a diffeomorphic version of the FFD to provide a 3D+t continuous and differentiable transform. The similarity metric includes a comparison between consecutive images, and between a reference and each of the following images. Motion and strain accuracy were evaluated on synthetic 3D ultrasound sequences with known ground truth motion. Experiments were also conducted on in vitro acquisitions.

10 citations
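
The diffeomorphic ingredient can be illustrated in a few lines of Python: composing many small displacements along a velocity field yields a smooth, invertible map. This toy (stationary, analytic) velocity field merely stands in for TDFFD's B-spline-parameterized, time-varying one:

import numpy as np

def integrate_velocity(points, velocity, t1=1.0, n_steps=32):
    # Forward-Euler integration: x(t + dt) = x(t) + dt * v(x(t)).
    # Many small steps compose into a smooth, invertible mapping.
    x = np.array(points, dtype=float)
    dt = t1 / n_steps
    for _ in range(n_steps):
        x += dt * velocity(x)
    return x

# Toy velocity: rigid rotation about the origin (assumed for illustration).
def v(x):
    return np.stack([-x[:, 1], x[:, 0]], axis=1)

pts = np.array([[1.0, 0.0], [0.0, 1.0]])
print(integrate_velocity(pts, v))   # ~ the points rotated by 1 radian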


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: This article reviews the reasons why people want to love or leave the venerable (but perhaps hoary) MSE, surveys emerging alternative signal fidelity measures, and discusses their potential application to a wide variety of problems.
Abstract: In this article, we have reviewed the reasons why we (collectively) want to love or leave the venerable (but perhaps hoary) MSE. We have also reviewed emerging alternative signal fidelity measures and discussed their potential application to a wide variety of problems. The message we are trying to send here is not that one should abandon use of the MSE nor to blindly switch to any other particular signal fidelity measure. Rather, we hope to make the point that there are powerful, easy-to-use, and easy-to-understand alternatives that might be deployed depending on the application environment and needs. While we expect (and indeed, hope) that the MSE will continue to be widely used as a signal fidelity measure, it is our greater desire to see more advanced signal fidelity measures being used, especially in applications where perceptual criteria might be relevant. Ideally, the performance of a new signal processing algorithm might be compared to other algorithms using several fidelity criteria. Lastly, we hope that we have given further motivation to the community to consider recent advanced signal fidelity measures as design criteria for optimizing signal processing algorithms and systems. It is in this direction that we believe that the greatest benefit eventually lies.

2,601 citations
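
The article's central point is easy to reproduce with scikit-image (an illustration, not the authors' code): two distortions with nearly identical MSE can receive very different SSIM scores.

import numpy as np
from skimage import data
from skimage.metrics import mean_squared_error, structural_similarity

img = data.camera().astype(float)
rng = np.random.default_rng(0)
brighter = img + 10.0                           # luminance shift, MSE = 100
noisy = img + rng.normal(0.0, 10.0, img.shape)  # white noise, MSE ~ 100

for name, d in [("luminance shift", brighter), ("white noise", noisy)]:
    mse = mean_squared_error(img, d)
    ssim = structural_similarity(img, d, data_range=255.0)
    print(f"{name:15s} MSE={mse:6.1f} SSIM={ssim:.3f}")
# Similar MSE, very different perceived quality; SSIM reflects this.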

Proceedings Article
01 Jan 1999

2,010 citations

Journal ArticleDOI
TL;DR: This article places data fusion into the greater context of data integration, precisely defines the goals of data fusion, namely, complete, concise, and consistent data, and highlights the challenges of data Fusion.
Abstract: The development of the Internet in recent years has made it possible and useful to access many different information systems anywhere in the world to obtain information. While there is much research on the integration of heterogeneous information systems, most commercial systems stop short of the actual integration of available data. Data fusion is the process of fusing multiple records representing the same real-world object into a single, consistent, and clean representation.This article places data fusion into the greater context of data integration, precisely defines the goals of data fusion, namely, complete, concise, and consistent data, and highlights the challenges of data fusion, namely, uncertain and conflicting data values. We give an overview and classification of different ways of fusing data and present several techniques based on standard and advanced operators of the relational algebra and SQL. Finally, the article features a comprehensive survey of data integration systems from academia and industry, showing if and how data fusion is performed in each.

1,797 citations
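
In this spirit, conflict handling with per-attribute resolution functions can be sketched with pandas (toy records and resolution choices assumed; the survey itself works in relational algebra and SQL):

import pandas as pd

# Toy duplicate records describing the same real-world objects.
df = pd.DataFrame({
    "isbn":  ["111", "111", "222", "222"],
    "title": ["Data Fusion", "Data Fusion", None, "DB Systems"],
    "year":  [2009, 2008, 2001, None],
    "price": [30.0, 28.0, 50.0, 45.0],
})

# One resolution function per attribute: first non-null, max, average.
fused = df.groupby("isbn").agg(
    title=("title", "first"),   # take the first non-null value
    year=("year", "max"),       # prefer the most recent claim
    price=("price", "mean"),    # average conflicting values
).reset_index()
print(fused)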

Journal ArticleDOI
TL;DR: Experimental results demonstrate that the proposed method can obtain state-of-the-art performance for fusion of multispectral, multifocus, multimodal, and multiexposure images.
Abstract: A fast and effective image fusion method is proposed for creating a highly informative fused image through merging multiple images. The proposed method is based on a two-scale decomposition of an image into a base layer containing large scale variations in intensity, and a detail layer capturing small scale details. A novel guided filtering-based weighted average technique is proposed to make full use of spatial consistency for fusion of the base and detail layers. Experimental results demonstrate that the proposed method can obtain state-of-the-art performance for fusion of multispectral, multifocus, multimodal, and multiexposure images.

1,300 citations
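
A condensed sketch of the two-scale, guided-filtering idea (simplified to a single weight map; the paper refines separate weights for the base and detail layers, and the parameter values here are assumptions):

import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r=8, eps=1e-3):
    # Gray-scale guided filter (He et al.): the output is locally a
    # linear function of the guide I, so it follows I's edges.
    w = 2 * r + 1
    mean_I, mean_p = uniform_filter(I, w), uniform_filter(p, w)
    cov_Ip = uniform_filter(I * p, w) - mean_I * mean_p
    var_I = uniform_filter(I * I, w) - mean_I ** 2
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return uniform_filter(a, w) * I + uniform_filter(b, w)

def fuse(A, B):
    # Coarse binary weights from detail-layer energy, then guided
    # filtering to make them spatially consistent with source A.
    detA, detB = A - uniform_filter(A, 31), B - uniform_filter(B, 31)
    w0 = (uniform_filter(detA**2, 7) > uniform_filter(detB**2, 7)).astype(float)
    w = np.clip(guided_filter(A, w0), 0.0, 1.0)
    return w * A + (1.0 - w) * B

A = np.random.default_rng(0).random((128, 128))
B = np.roll(A, 5, axis=0)        # stand-in for a second source image
F = fuse(A, B)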