Author

Rémy Prost

Bio: Rémy Prost is an academic researcher from the University of Lyon. The author has contributed to research in topics: Wavelet & Wavelet transform. The author has an h-index of 24 and has co-authored 144 publications receiving 2,372 citations. Previous affiliations of Rémy Prost include the Intelligence and National Security Alliance & the Institut National des Sciences Appliquées de Lyon.


Papers
Journal ArticleDOI
TL;DR: A generic framework for 3D surface remeshing, based on a metric-driven Discrete Voronoi Diagram construction, that combines the robustness and theoretical strength of Delaunay criteria with the efficiency of entirely discrete geometry processing.
Abstract: In this paper, we propose a generic framework for 3D surface remeshing. Based on a metric-driven Discrete Voronoi Diagram construction, our output is an optimized 3D triangular mesh with a user-defined vertex budget. Our approach can deal with a wide range of applications, from high-quality mesh generation to shape approximation. By using appropriate metric constraints, the method generates isotropic or anisotropic elements. Based on point sampling, our algorithm combines the robustness and theoretical strength of Delaunay criteria with the efficiency of entirely discrete geometry processing. Beyond the general framework described, we show experimental results using isotropic, quadric-enhanced isotropic, and anisotropic metrics, which demonstrate the efficiency of our method on large meshes at a low computational cost.
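To make the clustering at the core of the framework concrete, the following is a minimal sketch of Lloyd-style relaxation of a discrete Voronoi partition over mesh vertices. It is illustrative only, not the authors' implementation: it uses plain Euclidean distance in ambient space, whereas the paper grows cells on the surface under isotropic or anisotropic metrics, and all names are hypothetical.

```python
import numpy as np

def discrete_voronoi_relaxation(verts, n_seeds, n_iters=10, seed=0):
    """Lloyd-style relaxation of a discrete Voronoi partition over mesh
    vertices (illustrative stand-in for the paper's metric-driven,
    on-surface construction)."""
    rng = np.random.default_rng(seed)
    sites = verts[rng.choice(len(verts), n_seeds, replace=False)].copy()
    for _ in range(n_iters):
        # Assign each vertex to its nearest site: a discrete Voronoi cell.
        dists = np.linalg.norm(verts[:, None, :] - sites[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each site to the centroid of its cell (Lloyd update).
        for k in range(n_seeds):
            cell = verts[labels == k]
            if len(cell):
                sites[k] = cell.mean(axis=0)
    return sites, labels
```

A remesher of this family then extracts the output triangles from the adjacency of the final cells, the discrete analogue of taking the Delaunay dual, which is where the Delaunay criteria mentioned above come in.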

213 citations

Journal ArticleDOI
TL;DR: Two oblivious watermarking methods for 3-D polygonal mesh models are proposed that modify the distribution of vertex norms according to the watermark bit to be embedded; both are remarkably robust against distortionless attacks.
Abstract: Although oblivious (or blind) watermarking schemes are known to be less robust than nonoblivious ones, they are more useful for various applications where a host signal is not available in the watermark detection procedure. From the viewpoint of oblivious watermarking for a three-dimensional (3-D) polygonal mesh model, distortionless attacks, such as similarity transforms and vertex reordering, might be more serious than distortion attacks such as adding noise, smoothing, simplification, remeshing, clipping, and so on. An oblivious watermarking scheme that is robust against distortionless as well as distortion attacks is therefore clearly needed. In this paper, we propose two oblivious watermarking methods for 3-D polygonal mesh models, which modify the distribution of vertex norms according to the watermark bit to be embedded. One method is to shift the mean value of the distribution and the other is to change its variance. Histogram mapping functions are introduced to modify the distribution. These mapping functions are devised to reduce the visibility of the watermark as much as possible. Since the statistical features of vertex norms are invariant to the distortionless attacks, the proposed methods are robust against such attacks. In addition, our methods employ an oblivious watermark detection scheme, which can extract the watermark without referring to the cover mesh model. Through simulations, we demonstrate that the proposed approaches are remarkably robust against distortionless attacks. In addition, they are fairly robust against various distortion attacks.
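As a rough sketch of the mean-shifting variant described above (not the authors' exact procedure; the power mapping x**k and every parameter here are assumptions standing in for the paper's histogram mapping functions), one bin of vertex norms can encode one bit as follows:

```python
import numpy as np

def embed_bit(norms, bit, alpha=0.05, step=0.005):
    """Encode one watermark bit in one bin of vertex norms by pushing the
    mean of the normalized values above (bit 1) or below (bit 0) 0.5.
    Sketch only: the power mapping x**k stands in for the paper's
    histogram mapping functions."""
    norms = np.asarray(norms, dtype=float)
    lo, hi = norms.min(), norms.max()
    x = (norms - lo) / (hi - lo)                 # normalize the bin to [0, 1]
    k = 1.0
    if bit:
        while (x ** k).mean() < 0.5 + alpha and k > step:
            k -= step                            # k < 1 raises the mean
    else:
        while (x ** k).mean() > 0.5 - alpha and k < 100.0:
            k += step                            # k > 1 lowers the mean
    return (x ** k) * (hi - lo) + lo             # map back to the original range

def detect_bit(norms):
    """Blind detection: only the bin's own statistics are needed, never
    the cover mesh."""
    norms = np.asarray(norms, dtype=float)
    x = (norms - norms.min()) / (norms.max() - norms.min())
    return int(x.mean() > 0.5)
```

Because the decision uses only the (normalized) statistics of the watermarked mesh itself, detection survives similarity transforms and vertex reordering, which matches the robustness claim above.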

209 citations

Journal ArticleDOI
TL;DR: The study confirmed the fractal nature of the geometry of the epicardial coronary artery tree, and gave a simple and accurate fractal ratio between the diameters of the mother and two daughter vessels such that D_m = 0.678 (D_d1 + D_d2), which makes it easy to calculate the precise diameter of any of the three vessels when those of the other two are known.
Abstract: Aims: Coronary artery bifurcations present a harmonious asymmetric geometry that is fractal in nature. Interventional treatment of bifurcation lesions is a major technical issue. The present study aimed at a precise quantification of this geometry in the hope of deriving a formulation that would be simple to calculate. Methods and results: Forty-seven patients with strictly normal coronary angiography results obtained ahead of valve replacement were enrolled, and 27 of these underwent IVUS examination to confirm that their arteries were indeed normal. Three reference diameters were measured: those of the mother vessel (D_m) and of either daughter vessel (D_d1, D_d2). One hundred and seventy-three bifurcations were thus subjected to quantitative analysis. The mean diameter of the mother vessels was 3.33±0.94 mm, of the major daughter vessels 2.70±0.77 mm, and of the minor daughter vessels 2.23±0.68 mm. The ratio R = D_m/(D_d1 + D_d2) of mother-vessel diameter to the sum of the two daughter-vessel diameters was 3.39/(2.708 + 2.236) = 0.678. This ratio held at all levels of bifurcation, i.e., whatever the diameter of the mother vessel. Conclusion: The study confirmed the fractal nature of the geometry of the epicardial coronary artery tree, and gave a simple and accurate fractal ratio between the diameters of the mother and two daughter vessels such that D_m = 0.678 (D_d1 + D_d2). This makes it easy to calculate the precise diameter of any of the three vessels when those of the other two are known.
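The law is simple enough to apply directly; a small helper (hypothetical code, with the constant and diameters taken from the abstract) recovers any one diameter from the other two:

```python
def mother_diameter(d1_mm, d2_mm, ratio=0.678):
    """Fractal bifurcation law from the study: D_m = 0.678 * (D_d1 + D_d2)."""
    return ratio * (d1_mm + d2_mm)

def missing_daughter(dm_mm, d_other_mm, ratio=0.678):
    """The same law solved for an unknown daughter diameter."""
    return dm_mm / ratio - d_other_mm

# With the study's mean daughter diameters, the law essentially recovers the
# mean mother diameter: 0.678 * (2.70 + 2.23) ≈ 3.34 mm vs. 3.33 mm measured.
print(mother_diameter(2.70, 2.23))
```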

180 citations

Journal ArticleDOI
TL;DR: A new lossy-to-lossless progressive compression scheme for triangular meshes, based on a wavelet multiresolution theory for irregular 3D meshes, is proposed; the algorithm performs better than previously published approaches for both lossless and progressive compression.
Abstract: We propose a new lossy-to-lossless progressive compression scheme for triangular meshes, based on a wavelet multiresolution theory for irregular 3D meshes. Although remeshing techniques obtain better compression ratios for geometric compression, this approach can be very effective when one wants to keep the connectivity and geometry of the processed mesh completely unchanged. The simplification is based on solving an inverse problem. Optimizing both the connectivity and geometry of the processed mesh improves the approximation quality and the compression ratio of the scheme at each resolution level. We show why this algorithm provides an efficient means of compressing both the connectivity and geometry of 3D meshes, and we illustrate this with experimental results on various sets of reference meshes, where our algorithm performs better than previously published approaches for both lossless and progressive compression.
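The wavelet transform for irregular meshes is too involved for a short example, but its predict/detail structure is easy to show on the 1D analogue. The sketch below (an illustrative lifting scheme, not the paper's algorithm; it assumes an even-length signal) makes the lossy-to-lossless behavior visible: keeping all detail coefficients reconstructs exactly, while dropping or quantizing them gives progressively coarser approximations.

```python
import numpy as np

def lifting_forward(signal):
    """Split samples into even/odd bands, predict each odd sample from its
    even neighbours, and keep only the prediction residual as the detail."""
    even, odd = signal[::2].copy(), signal[1::2]
    pred = 0.5 * (even + np.roll(even, -1))[: len(odd)]
    return even, odd - pred

def lifting_inverse(even, detail):
    """Exact inverse of lifting_forward: recompute the prediction, add back
    the residuals, and re-interleave the two bands."""
    pred = 0.5 * (even + np.roll(even, -1))[: len(detail)]
    out = np.empty(len(even) + len(detail))
    out[::2], out[1::2] = even, detail + pred
    return out

x = np.arange(8, dtype=float)
even, detail = lifting_forward(x)
assert np.allclose(lifting_inverse(even, detail), x)  # lossless round trip
```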

126 citations

Journal ArticleDOI
TL;DR: This paper proposes to perform and assess CS reconstruction of channel RF data using the recently introduced wave atom [1] representation, which exhibits advantageous properties for sparsely representing such oscillatory patterns; the results show the superiority of the wave atom representation.
Abstract: Compressive sensing (CS) theory makes it possible (under certain assumptions) to recover a signal or an image sampled below the Nyquist sampling limit. In medical ultrasound imaging, CS could lower the amount of acquired data needed to reconstruct the echographic image. CS thus offers the prospect of speeding up echographic acquisitions and could have many applications, e.g. triplex acquisitions for CFM/B-mode/Doppler imaging, high-frame-rate echocardiography, 3D imaging using matrix probes, etc. The objective of this paper is to study the feasibility of CS for the reconstruction of channel RF data, i.e. the 2D set of raw RF lines gathered at the receive elements. Successful application of CS implies selecting a representation basis in which the data to be reconstructed have a sparse expansion. Because they consist mainly of warped oscillatory patterns, channel RF data do not easily lend themselves to a sparse representation and thus pose a specific challenge. Within this perspective, we propose to perform and assess CS reconstruction of channel RF data using the recently introduced wave atom [1] representation, which exhibits advantageous properties for sparsely representing such oscillatory patterns. Reconstructions obtained using wave atoms are compared with reconstructions performed with two conventional representation bases, namely Fourier and Daubechies wavelets. The first experiment was conducted on simulated channel RF data acquired from a numerical cyst phantom. The quality of the reconstructions was quantified through the mean absolute error at varying subsampling rates, removing 50–90% of the original samples. The results obtained for channel RF data reconstruction yield error ranges of [0.6–3.0] × 10^-2, [0.2–2.6] × 10^-2, and [0.1–1.5] × 10^-2 for wavelets, Fourier, and wave atoms, respectively. The error ranges observed for the associated beamformed log-envelope images are [2.4–20.6] dB, [1.1–12.2] dB, and [0.5–8.8] dB using wavelets, Fourier, and wave atoms, respectively. These results show the superiority of the wave atom representation and the feasibility of CS for the reconstruction of US RF data. The second experiment aimed at showing the experimental feasibility of the proposed method using a data set acquired by imaging a general-purpose phantom (CIRS Model 054GS) with an Ultrasonix MDP scanner. The reconstruction was performed by removing 80% of the initial samples and using wave atoms. The reconstructed image was found to reliably preserve the speckle structure and was associated with an error of 5.5 dB.
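For readers new to the reconstruction side, the sketch below shows generic masked-sampling CS recovery by iterative soft-thresholding (ISTA). It is an assumption-laden stand-in: an orthonormal DCT replaces the Fourier, Daubechies-wavelet, and wave-atom dictionaries compared in the paper, and the threshold and iteration count are arbitrary.

```python
import numpy as np
from scipy.fft import dct, idct

def ista_cs(y, mask, n_iter=300, lam=0.02):
    """Recover x from y = mask * x (zeros at unmeasured positions) by ISTA,
    assuming x has a sparse orthonormal-DCT expansion."""
    x = np.zeros(mask.size)
    for _ in range(n_iter):
        x = x + mask * (y - mask * x)            # gradient step on ||mask*x - y||^2
        c = dct(x, norm='ortho')                 # to the sparsifying domain
        c = np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)  # soft-threshold
        x = idct(c, norm='ortho')                # back to the signal domain
    return x

# Toy run: drop 80% of the samples of a DCT-sparse signal, mirroring the
# 80%-removal setting of the paper's phantom experiment.
rng = np.random.default_rng(0)
n = 512
x_true = np.cos(2 * np.pi * 40 * np.arange(n) / n)
mask = (rng.random(n) < 0.2).astype(float)       # keep ~20% of samples
x_rec = ista_cs(mask * x_true, mask)
```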

124 citations


Cited by
Journal ArticleDOI
David J. Thomson
01 Sep 1982
TL;DR: In this article, a local eigenexpansion is proposed to estimate the spectrum of a stationary time series from a finite sample of the process; it is equivalent to using the weighted average of a series of direct-spectrum estimates based on orthogonal data windows to treat both the bias and smoothing problems.
Abstract: In the choice of an estimator for the spectrum of a stationary time series from a finite sample of the process, the problems of bias control and consistency, or "smoothing," are dominant. In this paper we present a new method based on a "local" eigenexpansion to estimate the spectrum in terms of the solution of an integral equation. Computationally this method is equivalent to using the weighted average of a series of direct-spectrum estimates based on orthogonal data windows (discrete prolate spheroidal sequences) to treat both the bias and smoothing problems. Some of the attractive features of this estimate are: there are no arbitrary windows; it is a small-sample theory; it is consistent; it provides an analysis-of-variance test for line components; and it has high resolution. We also show relations of this estimate to maximum-likelihood estimates, show that the estimation capacity of the estimate is high, and show applications to coherence and polyspectrum estimates.
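With modern tooling, the basic (unweighted) form of the estimator is compact. The sketch below averages the direct spectrum estimates from K = 2NW − 1 DPSS windows via scipy.signal.windows.dpss; Thomson's adaptive weighting and the F-test for line components are omitted, and the parameter choices are illustrative.

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, NW=4.0, fs=1.0):
    """Average the eigenspectra from K = 2*NW - 1 orthogonal discrete
    prolate spheroidal (Slepian) data windows: the basic, unweighted
    multitaper spectrum estimate."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    K = int(2 * NW - 1)
    tapers = dpss(N, NW, Kmax=K)                  # shape (K, N)
    eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    freqs = np.fft.rfftfreq(N, d=1.0 / fs)
    return freqs, eigenspectra.mean(axis=0) / fs
```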

3,921 citations

Journal ArticleDOI
Leon Cohen
01 Jul 1989
TL;DR: A review and tutorial of the fundamental ideas and methods of joint time-frequency distributions is presented with emphasis on the diversity of concepts and motivations that have gone into the formation of the field.
Abstract: A review and tutorial of the fundamental ideas and methods of joint time-frequency distributions is presented. The objective of the field is to describe how the spectral content of a signal changes in time and to develop the physical and mathematical ideas needed to understand what a time-varying spectrum is. The basic goal is to devise a distribution that represents the energy or intensity of a signal simultaneously in time and frequency. Although the basic notions have been developing steadily over the last 40 years, there have recently been significant advances. This review is intended to be understandable to the nonspecialist, with emphasis on the diversity of concepts and motivations that have gone into the formation of the field.
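The Wigner distribution is the canonical member of the class this review surveys; the direct discrete sketch below (unwindowed and illustrative only) computes it as the FFT over lag of the instantaneous autocorrelation x[n+m]·conj(x[n−m]):

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a (preferably analytic) signal:
    W[n, :] = FFT over lag m of x[n+m] * conj(x[n-m]). Note that frequency
    bins correspond to twice the usual FFT bin frequency (standard
    discrete-WVD scaling)."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        m_max = min(n, N - 1 - n)                # lags that stay in bounds
        m = np.arange(-m_max, m_max + 1)
        r = np.zeros(N, dtype=complex)
        r[m % N] = x[n + m] * np.conj(x[n - m])  # instantaneous autocorrelation
        W[n] = np.fft.fft(r).real                # Hermitian in m, so FFT is real
    return W
```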

3,568 citations

Proceedings Article
01 Jan 1994
TL;DR: The main focus in MUCKE is on cleaning large-scale Web image corpora and on proposing image representations that are closer to the human interpretation of images.
Abstract: MUCKE aims to mine a large volume of images, to structure them conceptually, and to use this conceptual structuring to improve large-scale image retrieval. The last decade witnessed important progress concerning low-level image representations. However, a number of problems need to be solved in order to unleash the full potential of image mining in applications. The central problem with low-level representations is the mismatch between them and the human interpretation of image content. This problem can be instantiated, for instance, by the inability of existing descriptors to capture spatial relationships between the concepts represented, or by their inability to convey an explanation of why two images are similar in a content-based image retrieval framework. We start by assessing existing local descriptors for image classification and by proposing to use co-occurrence matrices to better capture spatial relationships in images. The main focus in MUCKE is on cleaning large-scale Web image corpora and on proposing image representations which are closer to the human interpretation of images. Consequently, we introduce methods which tackle these two problems and compare results to state-of-the-art methods. Note: some aspects of this deliverable are withheld at this time as they are pending review. Please contact the authors for a preview.
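The co-occurrence idea can be illustrated with a gray-level co-occurrence matrix; whether the deliverable computes co-occurrence over pixels or over detected concepts is not stated here, so the scikit-image snippet below is only a plausible, hypothetical rendering of the idea:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Count how often gray level i co-occurs with gray level j at a given
# spatial offset; the offsets are what inject the spatial relationships
# that bag-of-features descriptors discard.
rng = np.random.default_rng(0)
img = rng.integers(0, 8, size=(64, 64), dtype=np.uint8)   # stand-in image, 8 levels
glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                    levels=8, symmetric=True, normed=True)
print(glcm.shape)                       # (levels, levels, n_distances, n_angles)
print(graycoprops(glcm, 'contrast'))    # one texture statistic per offset
```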

2,134 citations

Journal ArticleDOI
TL;DR: A tutorial review of both linear and quadratic representations is given, and examples of the application of these representations to typical problems encountered in time-varying signal processing are provided.
Abstract: A tutorial review of both linear and quadratic representations is given. The linear representations discussed are the short-time Fourier transform and the wavelet transform. The discussion of quadratic representations concentrates on the Wigner distribution, the ambiguity function, smoothed versions of the Wigner distribution, and various classes of quadratic time-frequency representations. Examples of the application of these representations to typical problems encountered in time-varying signal processing are provided.
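Of the linear representations reviewed, the short-time Fourier transform is the quickest to demonstrate; this illustrative snippet (with arbitrary parameters) computes the spectrogram of a linear chirp, where the window length trades time resolution against frequency resolution:

```python
import numpy as np
from scipy.signal import stft

fs = 1000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
x = np.cos(2 * np.pi * (50.0 + 100.0 * t) * t)   # instantaneous frequency rises from 50 Hz
f, frames, Zxx = stft(x, fs=fs, nperseg=256)
spectrogram = np.abs(Zxx) ** 2                   # energy as a function of time and frequency
print(spectrogram.shape)                         # (frequency bins, time frames)
```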

1,587 citations

Book
02 Jan 1991

1,377 citations