Author

Didier J. LeGall

Bio: Didier J. LeGall is an academic researcher from Telcordia Technologies. The author has contributed to research in topics: Deinterlacing & Filter (video). The author has an h-index of 8 and has co-authored 12 publications receiving 739 citations.

Papers
Journal ArticleDOI
TL;DR: A multiresolution representation for video signals is introduced, and interpolation in an FIR (finite impulse response) scheme solves uncovered area problems, considerably improving the temporal prediction.
Abstract: A multiresolution representation for video signals is introduced. A three-dimensional spatiotemporal pyramid algorithm for high-quality compression of advanced television sequences is presented. The scheme utilizes a finite memory structure, is robust to channel errors, provides compatible subchannels, and can handle different scan formats, making it well suited for the broadcast environment. Additional features such as fast random access and reverse playback make it suitable for digital storage as well. Model-based processing is used over both space and time, with motion-based interpolation in the temporal direction. Interpolation in an FIR (finite impulse response) scheme solves uncovered area problems, considerably improving the temporal prediction. The complexity is comparable to that of previous recursive schemes. Computer simulations indicate that high compression factors (about an order of magnitude) are easily achieved with no apparent loss of quality. The scheme also has a number of commonalities with the emerging MPEG standard.

204 citations
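
To make the FIR-versus-recursive point above concrete, here is a toy, hypothetical sketch (not the paper's pyramid coder): a 1-D "frame" containing a moving object is predicted either from the previous frame alone (recursive-style) or by averaging both temporal neighbours (FIR interpolation), optionally with a crude motion shift standing in for the paper's motion-based interpolation.

```python
import numpy as np

def make_frame(t, width=32):
    """Toy 1-D 'frame': a flat background with a bright object that moves right."""
    frame = np.full(width, 10.0)
    pos = 5 + 3 * t                       # object position at time t
    frame[pos:pos + 4] = 200.0            # 4-pixel-wide moving object
    return frame

prev_frame, cur_frame, next_frame = (make_frame(t) for t in (0, 1, 2))

def mse(a, b):
    return float(np.mean((a - b) ** 2))

# Recursive-style prediction: the current frame is predicted from the past only,
# so the background just uncovered by the moving object cannot be predicted.
pred_recursive = prev_frame
print("MSE, prediction from previous frame:  ", mse(pred_recursive, cur_frame))

# FIR interpolation: both temporal neighbours contribute, so the uncovered area
# is at least partially visible in the future frame.
pred_fir = 0.5 * (prev_frame + next_frame)
print("MSE, bidirectional FIR interpolation: ", mse(pred_fir, cur_frame))

# Motion-compensated FIR interpolation: shift each neighbour by the (here known)
# object displacement of 3 pixels before averaging, a crude stand-in for the
# paper's motion-based interpolation.
pred_mc_fir = 0.5 * (np.roll(prev_frame, 3) + np.roll(next_frame, -3))
print("MSE, motion-compensated FIR:          ", mse(pred_mc_fir, cur_frame))
```

With the motion shift, the background uncovered behind the object is filled in correctly from the future frame, which is the improvement the abstract attributes to FIR interpolation.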

Patent
09 Jun 1988
TL;DR: In this patent, a class of filters for use in connection with the sub-band coding of a video signal is disclosed; the filters are of short length, symmetric, and have coefficients in the form of an integer divided by a power of two.
Abstract: A class of filters for use in connection with the sub-band coding of a video signal is disclosed. The filters are of short length, symmetric and have coefficients in the form of an integer divided by a power of two. The filters may be implemented with a minimum of circuitry and permit exact reconstruction of a sub-band coded video signal.

103 citations
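
The description above (short, symmetric filters with coefficients that are integers divided by powers of two, permitting exact reconstruction) matches the well-known 5/3 "LeGall" filter bank. As a hedged illustration only, the sketch below runs one level of that filter bank in its integer lifting form, where exact reconstruction is easy to verify; the lifting formulation and the NumPy code are illustrative choices, not the circuit implementation the patent describes.

```python
import numpy as np

def legall53_forward(x):
    """One level of the 5/3 (LeGall) wavelet via integer lifting.

    x: 1-D integer array of even length. Returns (lowpass s, highpass d).
    Symmetric extension handles the boundaries, so the inverse transform
    reconstructs x exactly.
    """
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2], x[1::2]
    # Predict step: each odd sample minus half the sum of its even neighbours.
    right = np.append(even[1:], even[-1])          # symmetric extension at the end
    d = odd - ((even + right) >> 1)
    # Update step: each even sample plus a quarter of the neighbouring details.
    left = np.insert(d[:-1], 0, d[0])              # symmetric extension at the start
    s = even + ((left + d + 2) >> 2)
    return s, d

def legall53_inverse(s, d):
    """Exact inverse of legall53_forward (same integer operations, undone in order)."""
    left = np.insert(d[:-1], 0, d[0])
    even = s - ((left + d + 2) >> 2)
    right = np.append(even[1:], even[-1])
    odd = d + ((even + right) >> 1)
    x = np.empty(even.size + odd.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.integers(0, 256, size=64)              # one "scan line" of 8-bit pixels
    s, d = legall53_forward(x)
    print("perfect reconstruction:", np.array_equal(legall53_inverse(s, d), x))
```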

Patent
19 Jul 1989
TL;DR: In this patent, an adaptive transform coding algorithm for a still image is proposed, where the image is divided into small blocks of pixels and each block of pixels is transformed using an orthogonal transform such as a discrete cosine transform.
Abstract: In accordance with our adaptive transform coding algorithm for a still image, the image is divided into small blocks of pixels and each block of pixels is transformed using an orthogonal transform such as a discrete cosine transform. The resulting transform coefficients are compressed and coded to form a bit stream for transmission to a remote receiver. The compression parameters for each block of pixels are chosen based on a busyness measure for the block, such as the magnitude of the (K+1)th most significant transform coefficient. This enables busy blocks, whose degradation the human visual system is not sensitive to, to be transmitted at low bit rates, while other blocks, whose degradation the human visual system is sensitive to, are transmitted at higher bit rates. Thus, the algorithm achieves a tradeoff between image quality and bit rate.

94 citations
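
As a rough sketch of the idea described above, the code below transforms 8x8 blocks with an orthonormal DCT, measures busyness as the magnitude of the (K+1)th most significant coefficient, and maps that measure to a quantizer step so that busier blocks are coded more coarsely. The block size, the value K = 4, and the linear step-size mapping are made-up illustrative parameters, not values from the patent.

```python
import numpy as np

BLOCK = 8      # assumed block size
K = 4          # busyness = magnitude of the (K+1)-th largest coefficient (assumed K)

def dct_matrix(n=BLOCK):
    """Orthonormal DCT-II basis matrix C, so that C @ block @ C.T is the 2-D DCT."""
    idx = np.arange(n)
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * idx[None, :] + 1) * idx[:, None] / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C

C = dct_matrix()

def code_block(block):
    """Transform one block, pick a step size from its busyness, and quantize."""
    coeffs = C @ block @ C.T
    # Busyness measure: magnitude of the (K+1)-th most significant coefficient.
    busyness = np.sort(np.abs(coeffs).ravel())[::-1][K]
    # Illustrative mapping: busier blocks (stronger visual masking) get a coarser quantizer.
    step = 4.0 + 0.5 * busyness
    return np.round(coeffs / step), step

def decode_block(quantized, step):
    """Dequantize and inverse-transform one block."""
    return C.T @ (quantized * step) @ C

# Toy usage: a smooth block and a busy (noisy) block receive different step sizes.
rng = np.random.default_rng(0)
smooth = np.tile(np.linspace(0, 50, BLOCK), (BLOCK, 1))
busy = rng.uniform(0, 255, (BLOCK, BLOCK))
for name, blk in (("smooth", smooth), ("busy", busy)):
    q, step = code_block(blk)
    rec = decode_block(q, step)
    print(f"{name:6s} block: step={step:6.1f}  nonzero coeffs={np.count_nonzero(q):3d}  "
          f"max error={np.max(np.abs(rec - blk)):.1f}")
```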

01 Jan 1989
TL;DR: In this paper, perfect reconstruction filter banks in multiple dimensions are reviewed in the context of arbitrary sampling patterns; for the special case of quincunx subsampling, filter banks are derived to go from progressive to interlaced scanning and from interlaced back to progressive.
Abstract: Subband decomposition of HDTV signals is important both for representation purposes (to create compatible subchannels) and for coding (several proposed compression schemes include some subband division). We first review perfect reconstruction filter banks in multiple dimensions in the context of arbitrary sampling patterns. Then we concentrate on the special case of quincunx subsampling and derive filter banks to go from progressive to interlaced scanning (with a highpass which contains deinterlacing information) as well as from interlaced to progressive. We apply this decomposition to a sequence and indicate bit rates.

88 citations
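
As a hypothetical illustration of the sampling-pattern side of the abstract above (the analysis/synthesis filters themselves are not reproduced), the sketch below shows that interlaced scanning of a progressive sequence is exactly quincunx subsampling of the vertical-temporal plane: at each time instant, only the lines whose index has the same parity as the frame index are kept.

```python
import numpy as np

# A toy progressive sequence: 4 frames of 8x8 "video".
T, H, W = 4, 8, 8
progressive = np.arange(T * H * W, dtype=np.int32).reshape(T, H, W)

# Interlaced scanning keeps, at time t, only the lines y with (y + t) even.
# In the vertical-temporal (y, t) plane the retained samples form a quincunx
# (diamond) lattice.
keep = (np.arange(H)[None, :] + np.arange(T)[:, None]) % 2 == 0   # shape (T, H)

fields = [progressive[t, keep[t], :] for t in range(T)]           # each field: H//2 lines

# The interlaced signal alone is the simplest (filterless) lowpass channel; a real
# subband scheme would filter before subsampling and keep a highpass channel that
# carries the deinterlacing information discarded here.
for t, field in enumerate(fields):
    parity = "even" if t % 2 == 0 else "odd"
    print(f"frame {t}: kept {field.shape[0]} {parity} lines of {H}")
```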

Journal ArticleDOI
TL;DR: This work applies subband decomposition of HDTV signals to a sequence and indicates bit rates for the special case of quincunx subsampling, deriving filter banks to go from progressive to interlaced scanning as well as from interlaced to progressive.
Abstract: Subband decomposition of HDTV signals is important both for representation purposes (to create compatible subchannels) and for coding (several proposed compression schemes include some subband division). We first review perfect reconstruction filter banks in multiple dimensions in the context of arbitrary sampling patterns. Then we concentrate on the special case of quincunx subsampling and derive filter banks to go from progressive to interlaced scanning (with a highpass which contains deinterlacing information) as well as from interlaced to progressive. We apply this decomposition to a sequence and indicate bit rates.

84 citations


Cited by
Journal ArticleDOI
J.M. Shapiro
TL;DR: The embedded zerotree wavelet algorithm (EZW) is a simple, yet remarkably effective, image compression algorithm, having the property that the bits in the bit stream are generated in order of importance, yielding a fully embedded code.
Abstract: The embedded zerotree wavelet algorithm (EZW) is a simple, yet remarkably effective, image compression algorithm, having the property that the bits in the bit stream are generated in order of importance, yielding a fully embedded code. The embedded code represents a sequence of binary decisions that distinguish an image from the "null" image. Using an embedded coding algorithm, an encoder can terminate the encoding at any point, thereby allowing a target rate or target distortion metric to be met exactly. Also, given a bit stream, the decoder can cease decoding at any point in the bit stream and still produce exactly the same image that would have been encoded at the bit rate corresponding to the truncated bit stream. In addition to producing a fully embedded bit stream, the EZW consistently produces compression results that are competitive with virtually all known compression algorithms on standard test images. Yet this performance is achieved with a technique that requires absolutely no training, no pre-stored tables or codebooks, and no prior knowledge of the image source. The EZW algorithm is based on four key concepts: (1) a discrete wavelet transform or hierarchical subband decomposition, (2) prediction of the absence of significant information across scales by exploiting the self-similarity inherent in images, (3) entropy-coded successive-approximation quantization, and (4) universal lossless data compression which is achieved via adaptive arithmetic coding.

5,559 citations
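
Of the four concepts listed above, the successive-approximation quantization (concept 3) is what makes the bit stream embedded, and it can be sketched in a few lines. The code below is a simplified, hypothetical illustration on made-up coefficients; it omits the zerotree prediction across scales and the adaptive arithmetic coder that the full EZW algorithm uses.

```python
import numpy as np

def successive_approximation(coeffs, num_passes=5):
    """Embedded, threshold-halving quantization of transform coefficients.

    Each pass announces newly significant coefficients (dominant pass) and sends
    one refinement bit for the already-significant ones (subordinate pass).
    Stopping after any pass still yields a usable approximation, which is what
    makes the resulting bit stream embedded.
    """
    c = coeffs.ravel().astype(float)
    T = 2.0 ** np.floor(np.log2(np.max(np.abs(c))))   # initial threshold
    significant = np.zeros(c.size, dtype=bool)
    rec = np.zeros_like(c)
    for p in range(1, num_passes + 1):
        # Dominant pass: coefficients crossing the threshold become significant and
        # are reconstructed at the centre of the uncertainty interval [T, 2T).
        newly = (~significant) & (np.abs(c) >= T)
        rec[newly] = np.sign(c[newly]) * 1.5 * T
        # Subordinate pass: halve the uncertainty of previously significant coefficients.
        refine = significant & ~newly                  # significant before this pass
        up = np.abs(c[refine]) >= np.abs(rec[refine])
        rec[refine] += np.sign(c[refine]) * np.where(up, 0.5 * T, -0.5 * T)
        significant |= newly
        print(f"pass {p}: threshold={T:8.2f}  MSE={np.mean((c - rec) ** 2):10.3f}")
        T /= 2.0
    return rec

# Toy usage on made-up "wavelet coefficients": distortion drops pass by pass.
rng = np.random.default_rng(1)
successive_approximation(rng.laplace(scale=20.0, size=256))
```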

Journal ArticleDOI
Olivier Rioul, Martin Vetterli
TL;DR: A simple, nonrigorous, synthetic view of wavelet theory is presented for both review and tutorial purposes, which includes nonstationary signal analysis, scale versus frequency,Wavelet analysis and synthesis, scalograms, wavelet frames and orthonormal bases, the discrete-time case, and applications of wavelets in signal processing.
Abstract: A simple, nonrigorous, synthetic view of wavelet theory is presented for both review and tutorial purposes. The discussion includes nonstationary signal analysis, scale versus frequency, wavelet analysis and synthesis, scalograms, wavelet frames and orthonormal bases, the discrete-time case, and applications of wavelets in signal processing. The main definitions and properties of wavelet transforms are covered, and connections among the various fields where results have been developed are shown.

2,945 citations

Book
01 Mar 1995
TL;DR: Wavelets and Subband Coding offered a unified view of the exciting field of wavelets and their discrete-time cousins, filter banks, or subband coding and developed the theory in both continuous and discrete time.
Abstract: First published in 1995, Wavelets and Subband Coding offered a unified view of the exciting field of wavelets and their discrete-time cousins, filter banks, or subband coding. The book developed the theory in both continuous and discrete time, and presented important applications. During the past decade, it filled a useful need in explaining a new view of signal processing based on flexible time-frequency analysis and its applications. Since 2007, the authors now retain the copyright and allow open access to the book.

2,793 citations

Journal ArticleDOI
TL;DR: The article provides arguments in favor of an alternative approach that uses splines, which is equally justifiable on a theoretical basis and offers many practical advantages, and it brings out the connection with the multiresolution theory of the wavelet transform.
Abstract: The article provides arguments in favor of an alternative approach that uses splines, which is equally justifiable on a theoretical basis, and which offers many practical advantages. To reassure the reader who may be afraid to enter new territory, it is emphasized that one is not losing anything because the traditional theory is retained as a particular case (i.e., a spline of infinite degree). The basic computational tools are also familiar to a signal processing audience (filters and recursive algorithms), even though their use in the present context is less conventional. The article also brings out the connection with the multiresolution theory of the wavelet transform. This article attempts to fulfil three goals. The first is to provide a tutorial on splines that is geared to a signal processing audience. The second is to gather all their important properties and provide an overview of the mathematical and computational tools available; i.e., a road map for the practitioner with references to the appropriate literature. The third goal is to give a review of the primary applications of splines in signal and image processing.

1,732 citations
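
As a small, hedged illustration of the "filters and recursive algorithms" mentioned above, the sketch below uses SciPy's one-dimensional cubic-spline helpers (an assumed stand-in, not code from the article): cspline1d applies the recursive prefilter that turns samples into B-spline coefficients, and cspline1d_eval then evaluates the continuous spline model at arbitrary, e.g. fractional, positions.

```python
import numpy as np
from scipy.signal import cspline1d, cspline1d_eval

# A toy signal sampled on the integer grid.
n = np.arange(32)
x = np.sin(2 * np.pi * n / 16.0)

# Recursive prefilter: cubic B-spline coefficients c[k] such that the spline
# sum_k c[k] * beta3(t - k) passes through every sample x[n].
c = cspline1d(x)

# Evaluate the continuous spline model on a 4x finer grid (fractional positions).
t = np.arange(0.0, 31.0, 0.25)
x_fine = cspline1d_eval(c, t)

# The spline model reproduces the original samples at the integer positions.
err = np.max(np.abs(cspline1d_eval(c, n.astype(float)) - x))
print(f"resampled {x.size} samples to {x_fine.size}; max error at original samples = {err:.2e}")
```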

Patent
03 Jan 1992
TL;DR: In this paper, a system of distributing video and audio information employs digital signal processing to achieve high rates of data compression, and the compressed and encoded audio and video information is sent over standard telephone, cable or satellite broadcast channels to a receiver specified by a subscriber of the service, preferably in less than real time, for later playback and optional recording on standard audio and/or video tape.
Abstract: A system of distributing video and/or audio information employs digital signal processing to achieve high rates of data compression. The compressed and encoded audio and/or video information is sent over standard telephone, cable or satellite broadcast channels to a receiver specified by a subscriber of the service, preferably in less than real time, for later playback and optional recording on standard audio and/or video tape.

1,032 citations