
Showing papers on "Wavelet published in 1998"


Book
01 Jan 1998
TL;DR: An introduction to a Transient World and an Approximation Tour of Wavelet Packet and Local Cosine Bases.
Abstract: Introduction to a Transient World. Fourier Kingdom. Discrete Revolution. Time Meets Frequency. Frames. Wavelet Zoom. Wavelet Bases. Wavelet Packet and Local Cosine Bases. An Approximation Tour. Estimations are Approximations. Transform Coding. Appendix A: Mathematical Complements. Appendix B: Software Toolboxes.

17,693 citations


Journal ArticleDOI
TL;DR: In this article, a step-by-step guide to wavelet analysis is given, with examples taken from time series of the El Nino-Southern Oscillation (ENSO).
Abstract: A practical step-by-step guide to wavelet analysis is given, with examples taken from time series of the El Nino–Southern Oscillation (ENSO). The guide includes a comparison to the windowed Fourier transform, the choice of an appropriate wavelet basis function, edge effects due to finite-length time series, and the relationship between wavelet scale and Fourier frequency. New statistical significance tests for wavelet power spectra are developed by deriving theoretical wavelet spectra for white and red noise processes and using these to establish significance levels and confidence intervals. It is shown that smoothing in time or scale can be used to increase the confidence of the wavelet spectrum. Empirical formulas are given for the effect of smoothing on significance levels and confidence intervals. Extensions to wavelet analysis such as filtering, the power Hovmoller, cross-wavelet spectra, and coherence are described. The statistical significance tests are used to give a quantitative measure of change...

12,803 citations
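As a companion to the guide above, here is a minimal sketch of a Morlet-wavelet power spectrum computed in Fourier space; the scale grid, the normalization, and the omega0 = 6 choice are illustrative assumptions rather than the paper's exact recipe, and no significance test is included.

```python
import numpy as np

def morlet_cwt_power(x, dt, scales, omega0=6.0):
    """Return |CWT|^2 of x (rows: scales, columns: time) using a Morlet wavelet."""
    n = len(x)
    omega = np.fft.fftfreq(n, d=dt) * 2.0 * np.pi      # angular frequencies
    x_hat = np.fft.fft(x - np.mean(x))
    power = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Fourier transform of the analytic Morlet wavelet at scale s
        psi_hat = (np.pi ** -0.25) * np.sqrt(2.0 * np.pi * s / dt) \
                  * np.exp(-0.5 * (s * omega - omega0) ** 2) * (omega > 0)
        w = np.fft.ifft(x_hat * np.conj(psi_hat))       # wavelet coefficients at scale s
        power[i] = np.abs(w) ** 2
    return power

# usage on a toy series sampled every dt = 0.25 "years", dyadic scale grid
x = np.random.randn(512)
dt = 0.25
scales = 2.0 * dt * 2.0 ** (0.25 * np.arange(32))
wps = morlet_cwt_power(x, dt, scales)
```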


Journal ArticleDOI
TL;DR: Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions.
Abstract: The time-frequency and time-scale communities have recently developed a large number of overcomplete waveform dictionaries --- stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for decomposition have been proposed, including the method of frames (MOF), matching pursuit (MP), and, for special dictionaries, the best orthogonal basis (BOB). Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions. We give examples exhibiting several advantages over MOF, MP, and BOB, including better sparsity and superresolution. BP has interesting relations to ideas in areas as diverse as ill-posed problems, abstract harmonic analysis, total variation denoising, and multiscale edge denoising. BP in highly overcomplete dictionaries leads to large-scale optimization problems. With signals of length 8192 and a wavelet packet dictionary, one gets an equivalent linear program of size 8192 by 212,992. Such problems can be attacked successfully only because of recent advances in linear programming by interior-point methods. We obtain reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.

9,950 citations
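Below is a minimal sketch of the l1 principle behind Basis Pursuit, posed as the standard equivalent linear program and solved with SciPy's general-purpose linprog; the toy random dictionary and problem sizes are illustrative assumptions, not the paper's wavelet-packet dictionaries or its interior-point solver.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi, x):
    """min ||c||_1 subject to Phi @ c = x, via the split c = u - v with u, v >= 0."""
    n, m = Phi.shape
    res = linprog(c=np.ones(2 * m),
                  A_eq=np.hstack([Phi, -Phi]),
                  b_eq=x,
                  bounds=(0, None))
    u, v = res.x[:m], res.x[m:]
    return u - v

rng = np.random.default_rng(0)
Phi = rng.standard_normal((32, 128))        # toy overcomplete dictionary
c_true = np.zeros(128)
c_true[[5, 40]] = [1.0, -2.0]               # sparse ground truth
c_bp = basis_pursuit(Phi, Phi @ c_true)     # recovers a sparse decomposition
```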


Journal ArticleDOI
TL;DR: The lifting scheme, as discussed by the authors, is a simple construction of second generation wavelets that can be adapted to intervals, domains, surfaces, weights, and irregular samples, and it leads to a faster, in-place calculation of the wavelet transform.
Abstract: We present the lifting scheme, a simple construction of second generation wavelets; these are wavelets that are not necessarily translates and dilates of one fixed function. Such wavelets can be adapted to intervals, domains, surfaces, weights, and irregular samples. We show how the lifting scheme leads to a faster, in-place calculation of the wavelet transform. Several examples are included.

2,082 citations
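A minimal sketch of the split/predict/update structure in the Haar case, showing how lifting yields an in-place, invertible transform; the general second-generation constructions in the paper replace these simple predict and update operators, so this is only an illustration.

```python
import numpy as np

def haar_lifting_forward(x):
    """One lifting step: split into even/odd samples, predict the odd ones, update the even ones."""
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2].copy(), x[1::2].copy()
    odd -= even               # predict: detail = odd - P(even)
    even += 0.5 * odd         # update: even becomes a running average
    return even, odd          # approximation, detail

def haar_lifting_inverse(even, odd):
    even = even - 0.5 * odd   # undo update
    odd = odd + even          # undo predict
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

x = np.arange(8.0)
a, d = haar_lifting_forward(x)
assert np.allclose(haar_lifting_inverse(a, d), x)
```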


Journal ArticleDOI
TL;DR: A new framework for statistical signal processing based on wavelet-domain hidden Markov models (HMMs) that concisely models the statistical dependencies and non-Gaussian statistics encountered in real-world signals is developed.
Abstract: Wavelet-based statistical signal processing techniques such as denoising and detection typically model the wavelet coefficients as independent or jointly Gaussian. These models are unrealistic for many real-world signals. We develop a new framework for statistical signal processing based on wavelet-domain hidden Markov models (HMMs) that concisely models the statistical dependencies and non-Gaussian statistics encountered in real-world signals. Wavelet-domain HMMs are designed with the intrinsic properties of the wavelet transform in mind and provide powerful, yet tractable, probabilistic signal models. Efficient expectation maximization algorithms are developed for fitting the HMMs to observational signal data. The new framework is suitable for a wide range of applications, including signal estimation, detection, classification, prediction, and even synthesis. To demonstrate the utility of wavelet-domain HMMs, we develop novel algorithms for signal denoising, classification, and detection.

1,783 citations


Proceedings ArticleDOI
04 Jan 1998
TL;DR: A general trainable framework for object detection in static images of cluttered scenes is presented, based on a wavelet representation of an object class derived from a statistical analysis of the class instances, together with a motion-based extension that enhances the performance of the detection algorithm over video sequences.
Abstract: This paper presents a general trainable framework for object detection in static images of cluttered scenes. The detection technique we develop is based on a wavelet representation of an object class derived from a statistical analysis of the class instances. By learning an object class in terms of a subset of an overcomplete dictionary of wavelet basis functions, we derive a compact representation of an object class which is used as an input to a support vector machine classifier. This representation both overcomes the problem of in-class variability and provides a low false detection rate in unconstrained environments. We demonstrate the capabilities of the technique in two domains whose inherent information content differs significantly. The first system is face detection and the second is the domain of people, who, in contrast to faces, vary greatly in color, texture, and patterns. Unlike previous approaches, this system learns from examples and does not rely on any a priori (hand-crafted) models or motion-based segmentation. The paper also presents a motion-based extension to enhance the performance of the detection algorithm over video sequences. The results presented here suggest that this architecture may well be quite general.

1,594 citations


Journal ArticleDOI
TL;DR: Two approaches to build integer-to-integer wavelet transforms are presented; the precoder of Laroia et al., used in information transmission, is adapted and combined with expansion factors for the high- and low-pass bands in subband filtering.

1,269 citations
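A minimal sketch of one integer-to-integer transform in this spirit, the S transform (Haar with rounding inside the lifting steps); it maps integers to integers and inverts exactly, but it is only a toy stand-in for the paper's two general constructions.

```python
import numpy as np

def s_transform_forward(x):
    """Integer Haar step: detail d = odd - even, approximation a = even + floor(d/2)."""
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2], x[1::2]
    d = odd - even
    a = even + d // 2                  # floor division keeps everything integer, also for negative d
    return a, d

def s_transform_inverse(a, d):
    even = a - d // 2
    odd = even + d
    x = np.empty(a.size + d.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([3, 7, 2, 9, 4, 4, 1, 0])
a, d = s_transform_forward(x)
assert np.array_equal(s_transform_inverse(a, d), x)
```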


Journal ArticleDOI
TL;DR: A nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients is developed, and variants of this method based on simple threshold nonlinear estimators are nearly minimax.
Abstract: We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets, we develop a nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients. The shrinkage can be tuned to be nearly minimax over any member of a wide range of Triebel- and Besov-type smoothness constraints and asymptotically minimax over Besov bodies with $p \leq q$. Linear estimates cannot achieve even the minimax rates over Triebel and Besov classes with $p<2$, so the method can significantly outperform every linear method (e.g., kernel, smoothing spline, sieve) in a minimax sense. Variants of our method based on simple threshold nonlinear estimators are nearly minimax. Our method possesses the interpretation of spatial adaptivity; it reconstructs using a kernel which may vary in shape and bandwidth from point to point, depending on the data. Least favorable distributions for certain of the Triebel and Besov scales generate objects with sparse wavelet transforms. Many real objects have similarly sparse transforms, which suggests that these minimax results are relevant for practical problems. Sequels to this paper, which was first drafted in November 1990, discuss practical implementation, spatial adaptation properties, universal near minimaxity and applications to inverse problems.

1,066 citations
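A minimal sketch of the simplest thresholding variant mentioned above: soft shrinkage of empirical wavelet coefficients with the universal threshold sigma*sqrt(2 log n). It assumes the PyWavelets package and a median-based noise estimate; the wavelet, decomposition level, and threshold rule here are illustrative choices, not the paper's.

```python
import numpy as np
import pywt

def soft_shrink_denoise(y, wavelet="db4", level=5):
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # noise scale from finest details
    thr = sigma * np.sqrt(2.0 * np.log(len(y)))               # universal threshold
    shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(shrunk, wavelet)

t = np.linspace(0.0, 1.0, 1024)
y = np.sin(8 * np.pi * t) + 0.3 * np.random.randn(t.size)     # noisy samples of a smooth function
y_hat = soft_shrink_denoise(y)
```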


Journal ArticleDOI
TL;DR: A wavelet-based tool for the analysis of long-range dependence and a related semi-parametric estimator of the Hurst parameter are introduced; the estimator is shown to be unbiased under very general conditions, and efficient under Gaussian assumptions.
Abstract: A wavelet-based tool for the analysis of long-range dependence and a related semi-parametric estimator of the Hurst parameter are introduced. The estimator is shown to be unbiased under very general conditions, and efficient under Gaussian assumptions. It can be implemented very efficiently allowing the direct analysis of very large data sets, and is highly robust against the presence of deterministic trends, as well as allowing their detection and identification. Statistical, computational, and numerical comparisons are made against traditional estimators including that of Whittle. The estimator is used to perform a thorough analysis of the long-range dependence in Ethernet traffic traces. New features are found with important implications for the choice of valid models for performance evaluation. A study of mono versus multifractality is also performed, together with a preliminary study of stationarity with respect to the Hurst parameter and deterministic trends.

1,034 citations
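A minimal sketch of the wavelet log-variance regression behind such an estimator, assuming PyWavelets and a plain unweighted least-squares fit; the published estimator uses a weighted fit over a chosen range of octaves with bias corrections, which are omitted here.

```python
import numpy as np
import pywt

def hurst_wavelet(x, wavelet="db3", level=8):
    """Regress log2 of the per-scale detail energy on the octave j; slope = 2H - 1."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    details = coeffs[1:][::-1]                        # reorder so j = 1 is the finest scale
    j = np.arange(1, len(details) + 1)
    log_energy = np.log2([np.mean(d ** 2) for d in details])
    slope, _ = np.polyfit(j, log_energy, 1)
    return (slope + 1.0) / 2.0

x = np.random.randn(2 ** 14)                          # white noise: expect H close to 0.5
print(hurst_wavelet(x))
```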


Journal ArticleDOI
C.I. Podilchuk1, Wenjun Zeng2
TL;DR: This work proposes perceptually based watermarking schemes in two frameworks, the block-based discrete cosine transform and the multiresolution wavelet framework, discusses the merits of each, and shows that both provide very good results in terms of image transparency and robustness.
Abstract: The huge success of the Internet allows for the transmission, wide distribution, and access of electronic data in an effortless manner. Content providers are faced with the challenge of how to protect their electronic data. This problem has generated a flurry of research activity in the area of digital watermarking of electronic content for copyright protection. The challenge here is to introduce a digital watermark that does not alter the perceived quality of the electronic content, while being extremely robust to attack. For instance, in the case of image data, editing the picture or illegal tampering should not destroy or transform the watermark into another valid signature. Equally important, the watermark should not alter the perceived visual quality of the image. From a signal processing perspective, the two basic requirements for an effective watermarking scheme, robustness and transparency, conflict with each other. We propose two watermarking techniques for digital images that are based on utilizing visual models which have been developed in the context of image compression. Specifically, we propose watermarking schemes where visual models are used to determine image dependent upper bounds on watermark insertion. This allows us to provide the maximum strength transparent watermark which, in turn, is extremely robust to common image processing and editing such as JPEG compression, rescaling, and cropping. We propose perceptually based watermarking schemes in two frameworks, the block-based discrete cosine transform and the multiresolution wavelet framework, and discuss the merits of each one. Our schemes are shown to provide very good results both in terms of image transparency and robustness.

962 citations


Journal ArticleDOI
TL;DR: In this paper, a wavelet transform method is proposed to combine the high spectral resolution of Landsat TM images and the high spatial resolution of SPOT panchromatic images (SPOT PAN).
Abstract: To take advantage of the high spectral resolution of Landsat TM images and the high spatial resolution of SPOT panchromatic images (SPOT PAN), we present a wavelet transform method to merge the two data types. In a pyramidal fashion, each TM reflective band or SPOT PAN image was decomposed into an orthogonal wavelet representation at a given coarser resolution, which consisted of a low frequency approximation image and a set of high frequency, spatially-oriented detail images. Band-by-band, the merged images were derived by performing an inverse wavelet transform using the approximation image from each TM band and detail images from SPOT PAN. The spectral and spatial features of the merged results of the wavelet methods were compared quantitatively with those of intensity-hue-saturation (IHS), principal component analysis (PCA), and the Brovey transform. It was found that multisensor data merging is a trade-off between the spectral information from a low spatial-high spectral resolution sensor an...
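A minimal sketch of the band-by-band merge described above, assuming PyWavelets and co-registered arrays of equal size: keep the low-frequency approximation of the TM band and take the detail images from the panchromatic band before inverting. The wavelet, decomposition level, and resampling step are assumptions, not the paper's exact protocol.

```python
import numpy as np
import pywt

def wavelet_merge(tm_band, pan, wavelet="db2", level=2):
    c_tm = pywt.wavedec2(tm_band, wavelet, level=level)
    c_pan = pywt.wavedec2(pan, wavelet, level=level)
    merged = [c_tm[0]] + list(c_pan[1:])        # TM approximation + PAN detail images
    return pywt.waverec2(merged, wavelet)

tm = np.random.rand(256, 256)                   # stand-in for a resampled TM band
pan = np.random.rand(256, 256)                  # stand-in for the SPOT PAN image
fused = wavelet_merge(tm, pan)
```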

Journal ArticleDOI
TL;DR: Multiscale Principal Component Analysis (MSPCA), as mentioned in this paper, combines the ability of PCA to decorrelate the variables by extracting a linear relationship with that of wavelet analysis to extract deterministic features and approximately decorrelate autocorrelated measurements.
Abstract: Multiscale principal-component analysis (MSPCA) combines the ability of PCA to decorrelate the variables by extracting a linear relationship with that of wavelet analysis to extract deterministic features and approximately decorrelate autocorrelated measurements. MSPCA computes the PCA of wavelet coefficients at each scale and then combines the results at relevant scales. Due to its multiscale nature, MSPCA is appropriate for the modeling of data containing contributions from events whose behavior changes over time and frequency. Process monitoring by MSPCA involves combining only those scales where significant events are detected, and is equivalent to adaptively filtering the scores and residuals, and adjusting the detection limits for easiest detection of deterministic changes in the measurements. Approximate decorrelation of wavelet coefficients also makes MSPCA effective for monitoring autocorrelated measurements without matrix augmentation or time-series modeling. In addition to improving the ability to detect deterministic changes, monitoring by MSPCA also simultaneously extracts those features that represent abnormal operation. The superior performance of MSPCA for process monitoring is illustrated by several examples.
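A minimal MSPCA-style sketch, assuming PyWavelets and an SVD-based PCA: each variable is decomposed with the same wavelet, and PCA is computed on the coefficient matrix of each scale separately. The scale-selection, detection-limit, and reconstruction logic of the full method is omitted.

```python
import numpy as np
import pywt

def mspca_scores(X, wavelet="haar", level=3, n_pc=2):
    """X has shape (samples, variables); returns a list of per-scale PCA score matrices."""
    per_var = [pywt.wavedec(X[:, j], wavelet, level=level) for j in range(X.shape[1])]
    scores = []
    for s in range(level + 1):                         # scale 0 is the coarse approximation
        C = np.column_stack([per_var[j][s] for j in range(X.shape[1])])
        C = C - C.mean(axis=0)
        _, _, Vt = np.linalg.svd(C, full_matrices=False)
        scores.append(C @ Vt[:n_pc].T)                 # project onto the leading loadings
    return scores

X = np.random.randn(512, 5)
scale_scores = mspca_scores(X)
```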

Journal ArticleDOI
TL;DR: Extensive computations are presented that support the hypothesis that near-optimal shrinkage parameters can be derived if one knows (or can estimate) only two parameters about an image F: the largest $\alpha$ for which $F \in B^{\alpha}_{q}(L_q(I))$, $1/q = \alpha/2 + 1/2$, and the norm $|F|_{B^{\alpha}_{q}(L_q(I))}$.
Abstract: This paper examines the relationship between wavelet-based image processing algorithms and variational problems. Algorithms are derived as exact or approximate minimizers of variational problems; in particular, we show that wavelet shrinkage can be considered the exact minimizer of the following problem: given an image $F$ defined on a square $I$, minimize over all $g$ in the Besov space $B^{1}_{1}(L_1(I))$ the functional $\|F-g\|^{2}_{L_2(I)} + \lambda |g|_{B^{1}_{1}(L_1(I))}$. We use the theory of nonlinear wavelet image compression in $L_2(I)$ to derive accurate error bounds for noise removal through wavelet shrinkage applied to images corrupted with i.i.d., mean zero, Gaussian noise. A new signal-to-noise ratio (SNR), which we claim more accurately reflects the visual perception of noise in images, arises in this derivation. We present extensive computations that support the hypothesis that near-optimal shrinkage parameters can be derived if one knows (or can estimate) only two parameters about an image $F$: the largest $\alpha$ for which $F \in B^{\alpha}_{q}(L_q(I))$, $1/q = \alpha/2 + 1/2$, and the norm $|F|_{B^{\alpha}_{q}(L_q(I))}$. Both theoretical and experimental results indicate that our choice of shrinkage parameters yields uniformly better results than Donoho and Johnstone's VisuShrink procedure; an example suggests, however, that Donoho and Johnstone's (1994, 1995, 1996) SureShrink method, which uses a different shrinkage parameter for each dyadic level, achieves a lower error than our procedure.

Book
01 Sep 1998
TL;DR: This chapter discusses the MRA, Orthonormal Wavelets, and Their Relationship to Filter Banks, and the Definition of the CWT, as well as other applications of Wavelet Transforms, including Communication Applications.
Abstract: Preface. Acknowledgments. 1. Continuous Wavelet Transform. Introduction. Continuous-Time Wavelets. Definition of the CWT. The CWT as a Correlation. Constant Q-Factor Filtering Interpretation and Time-Frequency Resolution. The CWT as an Operator. Inverse CWT. Problems. 2. Introduction to the Discrete Wavelet Transform and Orthogonal Wavelet Decomposition. Introduction. Approximations of Vectors in Nested Linear Vector Subspaces. Example of Approximating Vectors in Nested Subspaces of a Finite-Dimensional Linear Vector Space. Example of Approximating Vectors in Nested Subspaces of an Infinite-Dimensional Linear Vector Space. Example of an MRA. Bases for the Approximation Subspaces and Haar Scaling Function. Bases for the Detail Subspaces and Haar Wavelet. Digital Filter Implementation of the Haar Wavelet Decomposition. Problems. 3. MRA, Orthonormal Wavelets, and Their Relationship to Filter Banks. Introduction. Formal Definition of an MRA. Construction of a General Orthonormal MRA. Scaling Function and Subspaces. Implications of the Dilation Equation and Orthogonality. A Wavelet Basis for the MRA. Two-scale Relation for psi (t). Basis for the Detail Subspaces. Direct Sum Decomposition. Digital Filtering Interpretation. Decomposition Filters. Reconstructing the Signal. Examples of Orthogonal Basis-Generating Wavelets. Daubechies D4 Scaling Function and Wavelet. Bandlimited Wavelets. Interpreting Orthonormal MRAs for Discrete-Time Signals. Continuous-time MRA Interpretation for the DTWT. Discrete-Time MRA. Basis Functions for the DTWT. Miscellaneous Issues Related to PRQMF Filter Banks. Generating Scaling Functions and Wavelets from Filter Coefficients. Problems. 4. Alternative Wavelet Representations. Introduction. Biorthogonal Wavelet Bases. Filtering Relationship for Biorthogonal Filters. Examples of Biorthogonal Scaling Functions and Wavelets. Two-Dimensional Wavelets. Nonseparable Multidimensional Wavelets. Wavelet Packets. Problems. 5. Wavelet Transform and Data Compression. Introduction. Transform Coding. DTWT for Image Compression. Image Compression Using DTWT and Run-length Encoding. Embedded Tree Image Coding. Comparison with JPEG. Audio Compression. Audio Masking. Standards Specifying Subband Implementation: ISO/MPEG Coding for Audio. Wavelet-Based Audio Coding. Video Coding Using Multiresolution Techniques: A Brief Introduction. 6. Other Applications of Wavelet Transforms. Introduction. Wavelet Denoising. Speckle Removal. Edge Detection and Object Isolation. Image Fusion. Object Detection by Wavelet Transforms of Projections. Communication Applications. Scaling Functions as Signaling Pulses. Discrete Wavelet Multitone Modulation. 7. Advanced Topics. Introduction. CWTs Revisited. Parseval's Identity for the CWT. Inverse CWT Is a Many-to-One Operation. Wavelet Inner Product as a Projection Operation. Bridging the Gap Between CWTs and DWTs. CWT with an Orthonormal Basis-Generating Wavelet. A Trous Algorithm. Regularity and Convergence. Daubechies Construction of Orthonormal Scaling Functions. Bandlimited Biorthogonal Decomposition. Scaling Function Pair Construction. Wavelet Pair Construction. Design and Selection of Wavelets. The Lifting Scheme. Best Basis Selection. Wavelet Matching. Perfect Reconstruction Circular Convolution Filter Banks. Downsampling. Upsampling. Procedure for Implementation. Conditions for Perfect Reconstruction. Procedure for Constructing PRCC Filter Banks. Interpolators Matched to the Input Process. Interpolation Sampling. 
Frequency-Sampled Implementation of Bandlimited DWTs. The Scaling Operation and Self-Similar Signals. LTI Systems and Eigenfunctions. Continuous-Time Linear Scale-Invariant System. Scaling in Discrete Time. Discrete-time LSI Systems. Appendix A. Fundamentals of Multirate Systems. The Downsampler. The Upsampler. Noble Identities. Appendix B. Linear Algebra and Vector Spaces. Brief Review of Vector Spaces. Vector Subspace. Linear Independence and Bases. Inner Product Spaces. Hilbert Space and Riesz Bases. Index.

Book
30 Apr 1998
TL;DR: This book develops wavelet theory from the Haar basis and multiresolution analysis through the construction of wavelet bases, and covers wavelet thresholding, the cascade algorithm, and the discrete wavelet transform for statistical estimation.
Abstract: 1 Wavelets.- 1.1 What can wavelets offer?.- 1.2 General remarks.- 1.3 Data compression.- 1.4 Local adaptivity.- 1.5 Nonlinear smoothing properties.- 1.6 Synopsis.- 2 The Haar basis wavelet system.- 3 The idea of multiresolution analysis.- 3.1 Multiresolution analysis.- 3.2 Wavelet system construction.- 3.3 An example.- 4 Some facts from Fourier analysis.- 5 Basic relations of wavelet theory.- 5.1 When do we have a wavelet expansion?.- 5.2 How to construct mothers from a father.- 5.3 Additional remarks.- 6 Construction of wavelet bases.- 6.1 Construction starting from Riesz bases.- 6.2 Construction starting from m0.- 7 Compactly supported wavelets.- 7.1 Daubechies' construction.- 7.2 Coiflets.- 7.3 Symmlets.- 8 Wavelets and Approximation.- 8.1 Introduction.- 8.2 Sobolev Spaces.- 8.3 Approximation kernels.- 8.4 Approximation theorem in Sobolev spaces.- 8.5 Periodic kernels and projection operators.- 8.6 Moment condition for projection kernels.- 8.7 Moment condition in the wavelet case.- 9 Wavelets and Besov Spaces.- 9.1 Introduction.- 9.2 Besov spaces.- 9.3 Littlewood-Paley decomposition.- 9.4 Approximation theorem in Besov spaces.- 9.5 Wavelets and approximation in Besov spaces.- 10 Statistical estimation using wavelets.- 10.1 Introduction.- 10.2 Linear wavelet density estimation.- 10.3 Soft and hard thresholding.- 10.4 Linear versus nonlinear wavelet density estimation.- 10.5 Asymptotic properties of wavelet thresholding estimates.- 10.6 Some real data examples.- 10.7 Comparison with kernel estimates.- 10.8 Regression estimation.- 10.9 Other statistical models.- 11 Wavelet thresholding and adaptation.- 11.1 Introduction.- 11.2 Different forms of wavelet thresholding.- 11.3 Adaptivity properties of wavelet estimates.- 11.4 Thresholding in sequence space.- 11.5 Adaptive thresholding and Stein's principle.- 11.6 Oracle inequalities.- 11.7 Bibliographic remarks.- 12 Computational aspects and software.- 12.1 Introduction.- 12.2 The cascade algorithm.- 12.3 Discrete wavelet transform.- 12.4 Statistical implementation of the DWT.- 12.5 Translation invariant wavelet estimation.- 12.6 Main wavelet commands in XploRe.- A Tables.- A.1 Wavelet Coefficients.- A.2.- B Software Availability.- C Bernstein and Rosenthal inequalities.- D A Lemma on the Riesz basis.- Author Index.

Journal ArticleDOI
TL;DR: In this article, the use of wavelet transforms for analyzing power system fault transients in order to determine the fault location is described; the wavelet transform provides time resolution for the high-frequency components, which is related to the travel time of the signals after they are decomposed into their modal components.
Abstract: This paper describes the use of wavelet transforms for analyzing power system fault transients in order to determine the fault location. Traveling wave theory is utilized in capturing the travel time of the transients along the monitored lines between the fault point and the relay. Time resolution for the high frequency components of the fault transients is provided by the wavelet transform. This information is related to the travel time of the signals which are already decomposed into their modal components. The aerial mode is used for all fault types, whereas the ground mode is used to resolve problems associated with certain special cases. The wavelet transform is found to be an excellent discriminant for identifying the traveling wave reflections from the fault, irrespective of the fault type and impedance. EMTP simulations are used to test and validate the proposed fault location approach for typical power system faults.
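A minimal sketch of the traveling-wave timing idea, under several assumptions not taken from the paper: a single-ended recording of the aerial-mode transient, PyWavelets for the finest-scale detail, a crude threshold-based arrival detector, and a nominal propagation speed. The fault distance then follows from d = v * (t2 - t1) / 2 for the first two wavefront arrivals.

```python
import numpy as np
import pywt

def fault_distance(aerial_mode, fs, wave_speed=2.9e8, wavelet="db4"):
    _, d1 = pywt.dwt(aerial_mode, wavelet)           # finest-scale detail coefficients
    mag = np.abs(d1)
    thr = 5.0 * np.median(mag)                       # crude wavefront detector
    hits = np.flatnonzero(mag > thr)
    gaps = np.flatnonzero(np.diff(hits) > 10)        # separate the first two bursts
    t1, t2 = hits[0], hits[gaps[0] + 1]
    delay = 2.0 * (t2 - t1) / fs                     # factor 2: detail coefficients are downsampled
    return wave_speed * delay / 2.0                  # distance = speed * delay / 2
```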

Journal ArticleDOI
TL;DR: In this paper, a prior distribution is imposed on the wavelet coefficients of the unknown response function, designed to capture the sparseness of wavelet expansion that is common to most applications.
Abstract: We discuss a Bayesian formalism which gives rise to a type of wavelet threshold estimation in nonparametric regression. A prior distribution is imposed on the wavelet coefficients of the unknown response function, designed to capture the sparseness of wavelet expansion that is common to most applications. For the prior specified, the posterior median yields a thresholding procedure. Our prior model for the underlying function can be adjusted to give functions falling in any specific Besov space. We establish a relationship between the hyperparameters of the prior model and the parameters of those Besov spaces within which realizations from the prior will fall. Such a relationship gives insight into the meaning of the Besov space parameters. Moreover, the relationship established makes it possible in principle to incorporate prior knowledge about the function's regularity properties into the prior model for its wavelet coefficients. However, prior knowledge about a function's regularity properties might be difficult to elicit; with this in mind, we propose a standard choice of prior hyperparameters that works well in our examples. Several simulated examples are used to illustrate our method, and comparisons are made with other thresholding methods. We also present an application to a data set that was collected in an anaesthesiological study.

Proceedings ArticleDOI
12 May 1998
TL;DR: In this article, a multiresolution wavelet fusion technique is proposed for the digital watermarking of still images; the technique is robust to a variety of signal distortions, and the original unmarked image is not required for watermark extraction.
Abstract: We present a novel technique for the digital watermarking of still images based on the concept of multiresolution wavelet fusion. The algorithm is robust to a variety of signal distortions. The original unmarked image is not required for watermark extraction. We provide analysis to describe the behaviour of the method for varying system parameter values. We compare our approach with another transform domain watermarking method. Simulation results show the superior performance of the technique and demonstrate its potential for the robust watermarking of photographic imagery.

Proceedings ArticleDOI
01 Jun 1998
TL;DR: This paper presents a technique based upon a multiresolution wavelet decomposition for building histograms on the underlying data distributions, with applications to databases, statistics, and simulation.
Abstract: Query optimization is an integral part of relational database management systems. One important task in query optimization is selectivity estimation, that is, given a query P, we need to estimate the fraction of records in the database that satisfy P. Many commercial database systems maintain histograms to approximate the frequency distribution of values in the attributes of relations. In this paper, we present a technique based upon a multiresolution wavelet decomposition for building histograms on the underlying data distributions, with applications to databases, statistics, and simulation. Histograms built on the cumulative data distributions give very good approximations with limited space usage. We give fast algorithms for constructing histograms and using them in an on-line fashion for selectivity estimation. Our histograms also provide quick approximate answers to OLAP queries when the exact answers are not required. Our method captures the joint distribution of multiple attributes effectively, even when the attributes are correlated. Experiments confirm that our histograms offer substantial improvements in accuracy over random sampling and other previous approaches.
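A minimal sketch of the idea, assuming a 1-D value domain of power-of-two size and a simple Haar average/difference decomposition: build the cumulative frequency distribution, decompose it, keep only the B largest detail coefficients as the "histogram", and answer a range query from the reconstruction. The coefficient-selection rules and multi-attribute machinery of the paper are omitted.

```python
import numpy as np

def haar_decompose(v):
    """Return the overall average and per-level detail arrays (coarse to fine)."""
    c = np.asarray(v, dtype=float)
    details = []
    while c.size > 1:
        a = (c[0::2] + c[1::2]) / 2.0
        details.append((c[0::2] - c[1::2]) / 2.0)
        c = a
    return c, details[::-1]

def haar_reconstruct(avg, details):
    c = avg.copy()
    for d in details:                       # coarse to fine
        up = np.empty(2 * c.size)
        up[0::2], up[1::2] = c + d, c - d
        c = up
    return c

freqs = np.random.poisson(20, size=256)     # toy frequency distribution of one attribute
avg, det = haar_decompose(np.cumsum(freqs))

B = 16                                      # space budget: keep the B largest detail coefficients
cutoff = np.sort(np.abs(np.concatenate(det)))[-B]
det_kept = [np.where(np.abs(d) >= cutoff, d, 0.0) for d in det]

approx_cum = haar_reconstruct(avg, det_kept)
print(approx_cum[100] / approx_cum[-1])     # estimated selectivity of "value <= 100"
```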

Journal ArticleDOI
TL;DR: The robustness of a watermarking procedure that embeds copyright protection into digital video is demonstrated against several video degradations and distortions.
Abstract: We present a watermarking procedure to embed copyright protection into digital video. Our watermarking procedure is scene-based and video dependent. It directly exploits spatial masking, frequency masking, and temporal properties to embed an invisible and robust watermark. The watermark consists of static and dynamic temporal components that are generated from a temporal wavelet transform of the video scenes. The resulting wavelet coefficient frames are modified by a perceptually shaped pseudorandom sequence representing the author. The noise-like watermark is statistically undetectable to thwart unauthorized removal. Furthermore, the author representation resolves the deadlock problem. The multiresolution watermark may be detected on single frames without knowledge of the location of the frames in the video scene. We demonstrate the robustness of the watermarking procedure to several video degradations and distortions.

Journal ArticleDOI
TL;DR: A method for detection and identification of polar gases and gas mixtures based on the technique of terahertz time-domain spectroscopy is presented, and the utility of a wavelet-based signal analysis for tasks such as denoising is demonstrated.
Abstract: A method for detection and identification of polar gases and gas mixtures based on the technique of terahertz time-domain spectroscopy is presented. This relatively new technology promises to be the first portable far-infrared spectrometer, providing a means for real-time spectroscopic measurements over a broad bandwidth up to several THz. The measured time-domain waveforms can be efficiently parameterized using standard tools from signal processing, including procedures developed for speech recognition applications. These are generally more efficient than conventional methods based on Fourier analysis, and are easier to implement in a real-time sensing system. Preliminary results of real-time gas mixture analysis using a linear predictive coding algorithm are presented. A number of possible avenues for improved signal processing schemes are discussed. In particular, the utility of a wavelet-based signal analysis for tasks such as denoising is demonstrated.

Journal ArticleDOI
TL;DR: In this article, the authors used wavelets to decompose economic time series into orthogonal timescale components, and showed the importance of recognizing variations in phase between time series variables when investigating the relationships between them and shed light on the conflicting results that have been obtained in the literature using Granger causality tests.
Abstract: Economists have long known that timescale matters in that the structure of decisions as to the relevant time horizon, degree of time aggregation, strength of relationship, and even the relevant variables differ by timescale. Unfortunately, until recently it was difficult to decompose economic time series into orthogonal timescale components except for the short or long run in which the former is dominated by noise. Wavelets are used to produce an orthogonal decomposition of some economic variables by timescale over six different timescales. The relationship of interest is that between money and income, i.e., velocity. We confirm that timescale decomposition is very important for analyzing economic relationships. The analysis indicates the importance of recognizing variations in phase between variables when investigating the relationships between them and throws considerable light on the conflicting results that have been obtained in the literature using Granger causality tests.

Book
01 Jan 1998
TL;DR: This unique resource examines the conceptual, computational, and practical aspects of applied signal processing using wavelets to understand and use the power and utility of new wavelet methods in science and engineering problems and analysis.
Abstract: This unique resource examines the conceptual, computational, and practical aspects of applied signal processing using wavelets. With this book, readers will understand and be able to use the power and utility of new wavelet methods in science and engineering problems and analysis. The text is written in a clear, accessible style avoiding unnecessary abstractions and details. From a computational perspective, wavelet signal processing algorithms are presented and applied to signal compression, noise suppression, and signal identification. Numerical illustrations of these computational techniques are further provided, with interactive software (MATLAB code) available on the World Wide Web. Topics and Features:
* Continuous wavelet and Gabor transforms
* Frame-based theory of discretization and reconstruction of analog signals is developed
* New and efficient "overcomplete" wavelet transform is introduced and applied
* Numerical illustrations with an object-oriented computational perspective using the Wavelet Signal Processing Workstation (MATLAB code) available
This book is an excellent resource for information and computational tools needed to use wavelets in many types of signal processing problems. Graduates, professionals, and practitioners in engineering, computer science, geophysics, and applied mathematics will benefit from using the book and software tools.

Book
01 Jan 1998
TL;DR: This edited volume incorporates the most recent developments in the field to illustrate thoroughly how the use of these time-frequency methods is currently improving the quality of medical diagnosis, including technologies for assessing pulmonary and respiratory conditions.
Abstract: Brimming with top articles from experts in signal processing and biomedical engineering, Time Frequency and Wavelets in Biomedical Signal Processing introduces time-frequency, time-scale, wavelet transform methods, and their applications in biomedical signal processing. This edited volume incorporates the most recent developments in the field to illustrate thoroughly how the use of these time-frequency methods is currently improving the quality of medical diagnosis, including technologies for assessing pulmonary and respiratory conditions, EEGs, hearing aids, MRIs, mammograms, X rays, evoked potential signals analysis, neural networks applications, among other topics. Time Frequency and Wavelets in Biomedical Signal Processing will be of particular interest to signal processing engineers, biomedical engineers, and medical researchers.

Journal ArticleDOI
TL;DR: The authors show that the algorithm presented is capable of not only reducing speckle, but also enhancing features of diagnostic importance, such as myocardial walls in two-dimensional echocardiograms obtained from the parasternal short-axis view.
Abstract: This paper presents an algorithm for speckle reduction and contrast enhancement of echocardiographic images. Within a framework of multiscale wavelet analysis, the authors apply wavelet shrinkage techniques to eliminate noise while preserving the sharpness of salient features. In addition, nonlinear processing of feature energy is carried out to enhance contrast within local structures and along object boundaries. The authors show that the algorithm is capable of not only reducing speckle, but also enhancing features of diagnostic importance, such as myocardial walls in two-dimensional echocardiograms obtained from the parasternal short-axis view. Shrinkage of wavelet coefficients via soft thresholding within finer levels of scale is carried out on coefficients of logarithmically transformed echocardiograms. Enhancement of echocardiographic features is accomplished via nonlinear stretching followed by hard thresholding of wavelet coefficients within selected (midrange) spatial-frequency levels of analysis. The authors formulate the denoising and enhancement problem, introduce a class of dyadic wavelets, and describe their implementation of a dyadic wavelet transform. Their approach for speckle reduction and contrast enhancement was shown to be less affected by pseudo-Gibbs phenomena. The authors show experimentally that this technique produced superior results both qualitatively and quantitatively when compared to results obtained from existing denoising methods alone. A study using a database of clinical echocardiographic images suggests that such denoising and enhancement may improve the overall consistency of expert observers to manually defined borders.

Journal ArticleDOI
Brani Vidakovic1
TL;DR: A wavelet shrinkage by coherent Bayesian inference in the wavelet domain is proposed and the methods are tested on standard Donoho-Johnstone test functions.
Abstract: Wavelet shrinkage, the method proposed by the seminal work of Donoho and Johnstone, is a disarmingly simple and efficient way of denoising data. Shrinking wavelet coefficients was proposed from several optimality criteria. In this article a wavelet shrinkage by coherent Bayesian inference in the wavelet domain is proposed. The methods are tested on standard Donoho-Johnstone test functions.

Journal ArticleDOI
TL;DR: Calculations based on high-resolution quantizations prove that the distortion rate D(R) of an image transform coding is proportional to $2^{-2R}$ when R is large enough, and show that the compression performance of an orthonormal basis depends mostly on its ability to approximate images with a few nonzero vectors.
Abstract: Calculations based on high-resolution quantizations prove that the distortion rate D(R) of an image transform coding is proportional to $2^{-2R}$ when R is large enough. In wavelet and block cosine bases, we show that if R<1 bit/pixel, then D(R) varies like $R^{1-2\gamma}$, where $\gamma$ remains of the order of 1 for most natural images. The improved performance of embedded codings in wavelet bases is analyzed. At low bit rates, we show that the compression performance of an orthonormal basis depends mostly on its ability to approximate images with a few nonzero vectors.

Proceedings ArticleDOI
08 Sep 1998
TL;DR: It is shown how the dual-tree complex wavelet transform can provide a good basis for multi-resolution image denoising and de-blurring.
Abstract: A new implementation of the Discrete Wavelet Transform is presented for applications such as image restoration and enhancement. It employs a dual tree of wavelet filters to obtain the real and imaginary parts of the complex wavelet coefficients. This introduces limited redundancy (4 : 1 for 2-dimensional signals) and allows the transform to provide approximate shift invariance and directionally selective filters (properties lacking in the traditional wavelet transform) while preserving the usual properties of perfect reconstruction and computational efficiency. We show how the dual-tree complex wavelet transform can provide a good basis for multi-resolution image denoising and de-blurring.

Proceedings ArticleDOI
04 Oct 1998
TL;DR: A new method for digital image watermarking which does not require the original image for watermark detection is presented; it features improved resistance to attacks on the watermark, implicit visual masking utilizing the time-frequency localization property of the wavelet transform, and a robust definition for the threshold which validates the watermark.
Abstract: A new method for digital image watermarking which does not require the original image for watermark detection is presented. Assuming that we are using a transform domain spread spectrum watermarking scheme, it is important to add the watermark in select coefficients with significant image energy in the transform domain in order to ensure non-erasability of the watermark. Previous methods, which did not use the original in the detection process, could not selectively add the watermark to the significant coefficients, since the locations of such selected coefficients can change due to image manipulations. Since watermark verification typically consists of a process of correlation which is extremely sensitive to the relative order in which the watermark coefficients are placed within the image, such changes in the location of the watermarked coefficients were unacceptable. We present a scheme which overcomes this problem of "order sensitivity". Advantages of the proposed method include (i) improved resistance to attacks on the watermark, (ii) implicit visual masking utilizing the time-frequency localization property of the wavelet transform and (iii) a robust definition for the threshold which validates the watermark. We present results comparing our method with previous techniques, which clearly validate our claims.

Book
01 Jul 1998
TL;DR: In this article, the wavelet system, wavelet sets, and operator interpolation of wavelets are discussed, and examples of interpolation maps are given in an appendix.
Abstract: Introduction. The local commutant. Structural theorems. The wavelet system $\langle D,T\rangle$. Wavelet sets. Operator interpolation of wavelets. Concluding remarks. Appendix: Examples of interpolation maps. Bibliography.