
Showing papers on "Wavelet published in 2002"


Journal ArticleDOI
TL;DR: In this paper, the authors describe approximate digital implementations of two new mathematical transforms, namely, the ridgelet transform and the curvelet transform, which offer exact reconstruction, stability against perturbations, ease of implementation, and low computational complexity.
Abstract: We describe approximate digital implementations of two new mathematical transforms, namely, the ridgelet transform and the curvelet transform. Our implementations offer exact reconstruction, stability against perturbations, ease of implementation, and low computational complexity. A central tool is Fourier-domain computation of an approximate digital Radon transform. We introduce a very simple interpolation in the Fourier space which takes Cartesian samples and yields samples on a rectopolar grid, which is a pseudo-polar sampling set based on a concentric squares geometry. Despite the crudeness of our interpolation, the visual performance is surprisingly good. Our ridgelet transform applies to the Radon transform a special overcomplete wavelet pyramid whose wavelets have compact support in the frequency domain. Our curvelet transform uses our ridgelet transform as a component step, and implements curvelet subbands using a filter bank of à trous wavelet filters. Our philosophy throughout is that transforms should be overcomplete, rather than critically sampled. We apply these digital transforms to the denoising of some standard images embedded in white noise. In the tests reported here, simple thresholding of the curvelet coefficients is very competitive with "state of the art" techniques based on wavelets, including thresholding of decimated or undecimated wavelet transforms and also including tree-based Bayesian posterior mean methods. Moreover, the curvelet reconstructions exhibit higher perceptual quality than wavelet-based reconstructions, offering visually sharper images and, in particular, higher quality recovery of edges and of faint linear and curvilinear features. Existing theory for curvelet and ridgelet transforms suggests that these new approaches can outperform wavelet methods in certain image reconstruction problems. The empirical results reported here are in encouraging agreement.
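The full digital ridgelet/curvelet pipeline (pseudo-polar Radon transform, ridgelet pyramid, à trous subbands) is involved; as a minimal sketch of the overcomplete-transform-plus-thresholding philosophy the paper advocates, the following hard-thresholds the detail bands of an undecimated (stationary) wavelet transform in PyWavelets. The function name, wavelet choice and MAD threshold heuristic are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import pywt

def overcomplete_threshold_denoise(img, wavelet="sym8", levels=3, k=3.0):
    """Hard-threshold detail bands of an undecimated wavelet transform.

    Image dimensions should be divisible by 2**levels (a pywt.swt2
    requirement).  This illustrates overcomplete-transform thresholding,
    not the curvelet/ridgelet transform itself.
    """
    coeffs = pywt.swt2(np.asarray(img, dtype=float), wavelet, level=levels)
    # Robust noise estimate from the finest diagonal detail band (MAD).
    sigma = np.median(np.abs(coeffs[-1][1][2])) / 0.6745
    out = [(approx, tuple(pywt.threshold(d, k * sigma, mode="hard")
                          for d in details))
           for approx, details in coeffs]
    return pywt.iswt2(out, wavelet)
```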

2,244 citations


Journal ArticleDOI
TL;DR: A statistical view of the texture retrieval problem is presented by combining the two related tasks, namely feature extraction (FE) and similarity measurement (SM), into a joint modeling and classification scheme that leads to a new wavelet-based texture retrieval method that is based on the accurate modeling of the marginal distribution of wavelet coefficients using generalized Gaussian density (GGD).
Abstract: We present a statistical view of the texture retrieval problem by combining the two related tasks, namely feature extraction (FE) and similarity measurement (SM), into a joint modeling and classification scheme. We show that using a consistent estimator of texture model parameters for the FE step followed by computing the Kullback-Leibler distance (KLD) between estimated models for the SM step is asymptotically optimal in terms of retrieval error probability. The statistical scheme leads to a new wavelet-based texture retrieval method that is based on the accurate modeling of the marginal distribution of wavelet coefficients using the generalized Gaussian density (GGD) and on the existence of a closed form for the KLD between GGDs. The proposed method provides greater accuracy and flexibility in capturing texture information, while its simplified form closely resembles existing methods that use energy distribution in the frequency domain to identify textures. Experimental results on a database of 640 texture images indicate that the new method significantly improves retrieval rates, e.g., from 65% to 77%, compared with traditional approaches, while it retains comparable levels of computational complexity.
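A sketch of the closed-form KLD between two generalized Gaussian densities that drives the similarity measurement, assuming the parameterization p(x; a, b) ∝ exp(−(|x|/a)^b); the subband parameter estimation (ML fitting of alpha and beta per subband) is omitted, and the function names are ours.

```python
import numpy as np
from scipy.special import gammaln

def kld_ggd(a1, b1, a2, b2):
    """Closed-form KLD between generalized Gaussian densities
    p(x; a, b) = b / (2 a Gamma(1/b)) * exp(-(|x|/a)**b)."""
    return (np.log(b1 / b2) + np.log(a2 / a1)
            + gammaln(1.0 / b2) - gammaln(1.0 / b1)
            + (a1 / a2) ** b2
            * np.exp(gammaln((b2 + 1.0) / b1) - gammaln(1.0 / b1))
            - 1.0 / b1)

def texture_distance(query_params, candidate_params):
    """Sum subband KLDs (subbands treated as independent), where each
    params list holds (alpha, beta) pairs fitted to the wavelet subbands."""
    return sum(kld_ggd(a1, b1, a2, b2)
               for (a1, b1), (a2, b2) in zip(query_params, candidate_params))
```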

1,228 citations


Journal ArticleDOI
TL;DR: This work proposes new non-Gaussian bivariate distributions and derives the corresponding nonlinear threshold (shrinkage) functions from the models using Bayesian estimation theory; unlike earlier rules, the new shrinkage functions do not assume the independence of wavelet coefficients.
Abstract: Most simple nonlinear thresholding rules for wavelet-based denoising assume that the wavelet coefficients are independent. However, wavelet coefficients of natural images have significant dependencies. We only consider the dependencies between the coefficients and their parents in detail. For this purpose, new non-Gaussian bivariate distributions are proposed, and corresponding nonlinear threshold functions (shrinkage functions) are derived from the models using Bayesian estimation theory. The new shrinkage functions do not assume the independence of wavelet coefficients. We show three image denoising examples in order to show the performance of these new bivariate shrinkage rules. In the second example, a simple subband-dependent data-driven image denoising system is described and compared with effective data-driven techniques in the literature, namely VisuShrink, SureShrink, BayesShrink, and hidden Markov models. In the third example, the same idea is applied to the dual-tree complex wavelet coefficients.
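A minimal sketch of a bivariate shrinkage rule of the kind derived in the paper, applied element-wise to a coefficient array and its parent band (the parent must already be upsampled to the child's grid); the noise and signal standard deviations are assumed to have been estimated separately (e.g. by MAD and local variance), and the helper name is ours.

```python
import numpy as np

def bivariate_shrink(y, y_parent, sigma_n, sigma):
    """Shrink noisy coefficients y given their parent coefficients y_parent:
    w_hat = max(r - sqrt(3) * sigma_n**2 / sigma, 0) / r * y,
    with r = sqrt(y**2 + y_parent**2).  sigma_n is the noise std and sigma
    the marginal signal std (both assumed estimated beforehand)."""
    r = np.sqrt(y ** 2 + y_parent ** 2)
    gain = np.maximum(r - np.sqrt(3.0) * sigma_n ** 2 / sigma, 0.0)
    return gain / np.maximum(r, 1e-12) * y
```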

1,048 citations


BookDOI
15 Jul 2002
TL;DR: The Illustrated Wavelet Transform Handbook: Introductory Theory and Applications in Science, Engineering, Medicine and Finance as discussed by the authors is a comprehensive overview of wavelet transform applications in science, engineering, medicine and finance.
Abstract: The Illustrated Wavelet Transform Handbook: Introductory Theory and Applications in Science, Engineering, Medicine and Finance. CRC Press, Boca Raton.

942 citations


Journal ArticleDOI
TL;DR: Wavelet Methods for Time Series Analysis, as discussed by the authors, provides a comprehensive treatment of wavelet-based methods for the analysis of time series.
Abstract: (2002). Wavelet Methods for Time Series Analysis. Journal of the American Statistical Association: Vol. 97, No. 457, pp. 362-363.

843 citations


Journal ArticleDOI
TL;DR: In this paper, the authors compare two general and formal solutions to the problem of fusion of multispectral images with high-resolution panchromatic observations, and compare the results on SPOT data.
Abstract: This paper compares two general and formal solutions to the problem of fusion of multispectral images with high-resolution panchromatic observations. The former exploits the undecimated discrete wavelet transform, which is an octave bandpass representation achieved from a conventional discrete wavelet transform by omitting all decimators and upsampling the wavelet filter bank. The latter relies on the generalized Laplacian pyramid, which is another oversampled structure obtained by recursively subtracting from an image an expanded decimated lowpass version. Both methods selectively perform spatial-frequency spectrum substitution from one image to another. In both schemes, context dependency is exploited by thresholding the local correlation coefficient between the images to be merged, to avoid injection of spatial details that are not likely to occur in the target image. Unlike other multiscale fusion schemes, neither of the present decompositions is critically subsampled, thus avoiding possible impairments in the fused images due to uncancelled aliasing terms. Results are presented and discussed on SPOT data.
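A minimal sketch of undecimated-wavelet detail injection for pansharpening, assuming the multispectral band is already co-registered and upsampled to the panchromatic grid; the paper's context-dependent gating by a thresholded local correlation coefficient is omitted here, and the wavelet and level count are illustrative.

```python
import numpy as np
import pywt

def fuse_ms_pan(ms_band_up, pan, wavelet="db4", levels=2):
    """Add the high-frequency detail of the panchromatic image to an
    upsampled, co-registered multispectral band.  Detail is obtained by
    zeroing the detail planes of an undecimated decomposition of the pan
    image and subtracting the resulting low-pass from the original."""
    coeffs = pywt.swt2(np.asarray(pan, dtype=float), wavelet, level=levels)
    lowpass = pywt.iswt2([(a, tuple(np.zeros_like(d) for d in dets))
                          for a, dets in coeffs], wavelet)
    return ms_band_up + (pan - lowpass)
```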

662 citations


Journal ArticleDOI
TL;DR: In this article, the authors proposed a wavelet-based source detection algorithm that uses the Mexican Hat wavelet function, but may be adapted for use with other wavelet functions, and demonstrate the robustness of the algorithm by applying it to an image from an idealized detector with a spatially invariant Gaussian PSF and an exposure map similar to that of the Einstein IPC.
Abstract: Wavelets are scalable, oscillatory functions that deviate from zero only within a limited spatial regime and have average value zero, and thus may be used to simultaneously characterize the shape, location, and strength of astronomical sources. But in addition to their use as source characterizers, wavelet functions are rapidly gaining currency within the source detection field. Wavelet-based source detection involves the correlation of scaled wavelet functions with binned, two-dimensional image data. If the chosen wavelet function exhibits the property of vanishing moments, significantly nonzero correlation coefficients will be observed only where there are high-order variations in the data; e.g., they will be observed in the vicinity of sources. Source pixels are identified by comparing each correlation coefficient with its probability sampling distribution, which is a function of the (estimated or a priori known) background amplitude. In this paper, we describe the mission-independent, wavelet-based source detection algorithm "WAVDETECT," part of the freely available Chandra Interactive Analysis of Observations (CIAO) software package. Our algorithm uses the Marr, or "Mexican Hat" wavelet function, but may be adapted for use with other wavelet functions. Aspects of our algorithm include: (1) the computation of local, exposure-corrected normalized (i.e., flat-fielded) background maps; (2) the correction for exposure variations within the field of view (due to, e.g., telescope support ribs or the edge of the field); (3) its applicability within the low-counts regime, as it does not require a minimum number of background counts per pixel for the accurate computation of source detection thresholds; (4) the generation of a source list in a manner that does not depend upon a detailed knowledge of the point spread function (PSF) shape; and (5) error analysis. These features make our algorithm considerably more general than previous methods developed for the analysis of X-ray image data, especially in the low-counts regime. We demonstrate the robustness of WAVDETECT by applying it to an image from an idealized detector with a spatially invariant Gaussian PSF and an exposure map similar to that of the Einstein IPC; to Pleiades Cluster data collected by the ROSAT PSPC; and to a simulated Chandra ACIS-I image of the Lockman Hole region.
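A highly simplified sketch of Mexican-hat correlation for source detection: WAVDETECT derives per-pixel thresholds from the correlation coefficient's sampling distribution given an exposure-corrected background map, whereas this stand-in uses a robust global threshold; the scales, threshold level and function name are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def mexican_hat_detect(image, scales=(1.0, 2.0, 4.0), nsigma=5.0):
    """Flag candidate source pixels whose Mexican-hat (negative
    Laplacian-of-Gaussian) correlation exceeds a robust threshold at any
    of the given scales."""
    image = np.asarray(image, dtype=float)
    detected = np.zeros(image.shape, dtype=bool)
    for s in scales:
        corr = -ndimage.gaussian_laplace(image, sigma=s)   # Mexican-hat response
        mad = np.median(np.abs(corr - np.median(corr))) / 0.6745
        detected |= corr > nsigma * mad
    return detected
```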

630 citations


Journal ArticleDOI
TL;DR: This letter presents a locally adaptive denoising algorithm using the bivariate shrinkage function and is illustrated using both the orthogonal and dual tree complex wavelet transforms.
Abstract: The performance of image-denoising algorithms using wavelet transforms can be improved significantly by taking into account the statistical dependencies among wavelet coefficients as demonstrated by several algorithms presented in the literature. In two earlier papers by the authors, a simple bivariate shrinkage rule is described using a coefficient and its parent. The performance can also be improved using simple models by estimating model parameters in a local neighborhood. This letter presents a locally adaptive denoising algorithm using the bivariate shrinkage function. The algorithm is illustrated using both the orthogonal and dual tree complex wavelet transforms. Some comparisons with the best available results are given in order to illustrate the effectiveness of the proposed algorithm.

617 citations


Book
15 Jul 2002
TL;DR: This book introduces the continuous and discrete wavelet transforms, covering topics such as the identification of coherent structures, edge detection and the inverse wavelet transform, together with applications in fluids, engineering testing and monitoring, medicine, fractals, finance and geophysics.
Abstract (contents): Preface. Getting Started. The Continuous Wavelet Transform: Introduction; The Wavelet; Requirements for the Wavelet; The Energy Spectrum of the Wavelet; The Wavelet Transform; Identification of Coherent Structures; Edge Detection; The Inverse Wavelet Transform; The Signal Energy: Wavelet-Based Energy and Power Spectra; The Wavelet Transform in Terms of the Fourier Transform; Complex Wavelets: The Morlet Wavelet; The Wavelet Transform, Short Time Fourier Transform and Heisenberg Boxes; Adaptive Transforms: Matching Pursuits; Wavelets in Two or More Dimensions; The CWT: Computation, Boundary Effects and Viewing; Endnotes. The Discrete Wavelet Transform: Introduction; Frames and Orthogonal Wavelet Bases; Discrete Input Signals of Finite Length; Everything Discrete; Daubechies Wavelets; Translation Invariance; Biorthogonal Wavelets; Two-Dimensional Wavelet Transforms; Adaptive Transforms: Wavelet Packets; Endnotes. Fluids: Introduction; Statistical Measures; Engineering Flows; Geophysical Flows; Other Applications in Fluids and Further Resources. Engineering Testing, Monitoring and Characterisation: Introduction; Machining Processes: Control, Chatter, Wear and Breakage; Rotating Machinery; Dynamics; Chaos; Non-Destructive Testing; Surface Characterisation; Other Applications in Engineering and Further Resources. Medicine: Introduction; The Electrocardiogram; Neuroelectric Waveforms; Pathological Sounds, Ultrasounds and Vibrations; Blood Flow and Blood Pressure; Medical Imaging; Other Applications in Medicine. Fractals, Finance, Geophysics and Other Areas: Introduction; Fractals; Finance; Geophysics; Other Areas. Appendix: Useful Books, Papers and Websites. References. Index.

599 citations


Journal ArticleDOI
TL;DR: A new method to detect and count bright spots in fluorescence images coming from biological immunomicroscopy experiments is presented, based on the multiscale product of subband images resulting from the à trous wavelet transform decomposition of the original image, after thresholding of non-significant coefficients.
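A sketch of the multiscale-product idea, using the shift-invariant stationary wavelet transform in place of the à trous transform and pooling the three detail orientations into a single detail magnitude per scale; thresholds and these pooling choices are illustrative assumptions.

```python
import numpy as np
import pywt

def multiscale_product_spots(image, wavelet="haar", levels=3, k=3.0):
    """Mark pixels whose thresholded detail magnitude is non-zero at every
    scale, i.e. where the multiscale product of detail planes survives."""
    img = np.asarray(image, dtype=float)
    coeffs = pywt.swt2(img, wavelet, level=levels)
    product = np.ones_like(img)
    for _, (ch, cv, cd) in coeffs:
        detail = np.abs(ch) + np.abs(cv) + np.abs(cd)   # orientation-pooled detail
        sigma = np.median(np.abs(detail - np.median(detail))) / 0.6745
        product *= pywt.threshold(detail, k * sigma, mode="hard")
    return product > 0
```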

552 citations


Journal ArticleDOI
TL;DR: In this article, rupture complexity is determined from a joint inversion of static and seismic data; data and synthetic seismograms are compared in the wavelet domain using error functions grouped by frequency content, and the objective function is defined as the weighted sum of these functions.
Abstract: We present a new procedure for the determination of rupture complexity from a joint inversion of static and seismic data. Our fault parameterization involves multiple fault segments, variable local slip, rake angle, rise time, and rupture velocity. To separate the spatial and temporal slip history, we introduce a wavelet transform that proves effective at studying the time and frequency characteristics of the seismic waveforms. Both data and synthetic seismograms are transformed into wavelets, which are then separated into several groups based on their frequency content. For each group, we use error functions to compare the wavelet amplitude variation with time between data and synthetic seismograms. The function can be an L1 + L2 norm or a correlative function based on the amplitude and scale of wavelet functions. The objective function is defined as the weighted sum of these functions. Subsequently, we developed a finite-fault inversion routine in the wavelet domain. A simulated annealing algorithm is used to determine the finite-fault model that minimizes the objective function described in terms of wavelet coefficients. With this approach, we can simultaneously invert for the slip amplitude, slip direction, rise time, and rupture velocity efficiently. Extensive experiments conducted on synthetic data are used to assess the ability to recover rupture slip details. We also explore slip-model stability for different choices of layered Earth models assuming the geometry encountered in the 1999 Hector Mine, California, earthquake.

Journal ArticleDOI
TL;DR: The dyadic discrete wavelet transform approach is shown to significantly increase the overall classification accuracy and is tested using hyperspectral data for various agricultural applications.
Abstract: In this paper, the dyadic discrete wavelet transform is proposed for feature extraction from a high-dimensional data space. The wavelet's inherent multiresolutional properties are discussed in terms related to multispectral and hyperspectral remote sensing. Furthermore, various wavelet-based features are applied to the problem of automatic classification of specific ground vegetations from hyperspectral signatures. The wavelet transform features are evaluated using an automated statistical classifier. The system is tested using hyperspectral data for various agricultural applications. The experimental results demonstrate the promising discriminant capability of the wavelet-based features. The automated classification system consistently provides over 95% and 80% classification accuracy for endmember and mixed-signature applications, respectively. When compared to conventional feature extraction methods, the wavelet transform approach is shown to significantly increase the overall classification accuracy.
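A minimal sketch of dyadic-DWT feature extraction from a single hyperspectral signature (log-energy per subband); the automated statistical classifier the paper feeds these features into is not shown, and the wavelet and level count are illustrative.

```python
import numpy as np
import pywt

def dwt_energy_features(spectrum, wavelet="db3", levels=4):
    """Log-energy of the approximation and of each detail subband of a
    single hyperspectral signature (coarsest subband first)."""
    coeffs = pywt.wavedec(np.asarray(spectrum, dtype=float), wavelet, level=levels)
    return np.array([np.log1p(np.sum(c ** 2)) for c in coeffs])
```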

Journal ArticleDOI
TL;DR: In this paper, a modified wavelet transform known as the S-transform is used for power quality analysis with very good time resolution; its phase correction ensures that the amplitude peaks are regions of stationary phase.
Abstract: This paper presents a new approach for power quality analysis using a modified wavelet transform known as the S-transform. The local spectral information of the wavelet transform can, with slight modification, be used to perform local cross spectral analysis with very good time resolution. The "phase correction" absolutely references the phase of the wavelet transform to the zero time point, thus assuring that the amplitude peaks are regions of stationary phase. The excellent time-frequency resolution characteristic of the S-transform makes it an attractive candidate for analysis of power system disturbance signals. Several power quality problems are analyzed using both the S-transform and discrete wavelet transform, showing clearly the advantage of the S-transform in detecting, localizing, and classifying the power quality problems.
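A textbook-style discrete S-transform computed via the FFT, shown to make the "phase correction" of the frequency-localized Gaussian voices concrete; this is a generic Stockwell-transform sketch, not the authors' power-quality detection and classification system.

```python
import numpy as np

def s_transform(x):
    """Discrete S-transform of a 1-D signal via the FFT.  Returns an
    (N//2 + 1, N) complex array: row n is frequency n/N cycles/sample,
    column j is time sample j.  Row 0 is set to the signal mean."""
    x = np.asarray(x, dtype=float)
    N = x.size
    X = np.fft.fft(x)
    m = np.fft.fftfreq(N) * N                     # signed frequency indices
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0, :] = x.mean()
    for n in range(1, N // 2 + 1):
        # Frequency-shifted spectrum times the scalable Gaussian window;
        # the shift is what references the phase to the zero time point.
        voice = np.roll(X, -n) * np.exp(-2.0 * np.pi ** 2 * m ** 2 / n ** 2)
        S[n, :] = np.fft.ifft(voice)
    return S
```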

Journal ArticleDOI
TL;DR: Experimental results show that the proposed speckle reduction algorithm outperforms standard wavelet denoising techniques in terms of the signal-to-noise ratio and the equivalent-number-of-looks measures in most cases and achieves better performance than the refined Lee filter.
Abstract: The granular appearance of speckle noise in synthetic aperture radar (SAR) imagery makes it very difficult to visually and automatically interpret SAR data. Therefore, speckle reduction is a prerequisite for many SAR image processing tasks. In this paper, we develop a speckle reduction algorithm by fusing the wavelet Bayesian denoising technique with Markov-random-field-based image regularization. Wavelet coefficients are modeled independently and identically by a two-state Gaussian mixture model, while their spatial dependence is characterized by a Markov random field imposed on the hidden state of Gaussian mixtures. The Expectation-Maximization algorithm is used to estimate hyperparameters and specify the mixture model, and the iterated-conditional-modes method is implemented to optimize the state configuration. The noise-free wavelet coefficients are finally estimated by a shrinkage function based on local weighted averaging of the Bayesian estimator. Experimental results show that the proposed method outperforms standard wavelet denoising techniques in terms of the signal-to-noise ratio and the equivalent-number-of-looks measures in most cases. It also achieves better performance than the refined Lee filter.

Book ChapterDOI
07 Jan 2002
TL;DR: Long-range dependence-Scaling phenomena-(Multi)fractal-Wavelet analysis Scaling analysis-Scaling parameters estimation-Robustness-Fractional Brownian motion synthesis-Fano factor-Aggregation procedure-Allan variance.
Abstract: Long-range dependence-Scaling phenomena-(Multi)fractal-Wavelet analysis Scaling analysis-Scaling parameters estimation-Robustness-Fractional Brownian motion synthesis-Fano factor-Aggregation procedure-Allan variance.

Journal ArticleDOI
TL;DR: In realistic simulations, it is shown that, in contrast to Fourier coherence, wavelet coherence can detect short, significant episodes of coherence between non-stationary neural signals and can be directly applied for an 'online' quantification of the instantaneous coherence between two signals.
Abstract: This paper introduces the use of wavelet analysis to follow the temporal variations in the coupling between oscillatory neural signals. Coherence, based on Fourier analysis, has been commonly used as a first approximation to track such coupling under the assumption that neural signals are stationary. Yet, stationary neural processing may be the exception rather than the rule. In this context, the recent application to physical systems of a wavelet-based coherence, which does not depend on the stationarity of the signals, is highly relevant. This paper fully develops the method of wavelet coherence and its statistical properties so that it can be practically applied to continuous neural signals. In realistic simulations, we show that, in contrast to Fourier coherence, wavelet coherence can detect short, significant episodes of coherence between non-stationary neural signals. This method can be directly applied for an ‘online’ quantification of the instantaneous coherence between two signals.
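A sketch of magnitude-squared wavelet coherence using complex Morlet CWTs from PyWavelets, with Gaussian smoothing along time only; the scale smoothing and the statistical significance assessment developed in the paper are omitted, and the wavelet name and smoothing width are assumptions.

```python
import numpy as np
import pywt
from scipy.ndimage import gaussian_filter1d

def wavelet_coherence(x, y, scales, fs, wavelet="cmor1.5-1.0", smooth=8):
    """Magnitude-squared wavelet coherence of two equal-length signals,
    smoothing the cross- and auto-spectra along time (per scale) before
    forming the ratio."""
    Wx, _ = pywt.cwt(x, scales, wavelet, sampling_period=1.0 / fs)
    Wy, _ = pywt.cwt(y, scales, wavelet, sampling_period=1.0 / fs)

    def smooth_t(z):  # smooth real and imaginary parts separately
        return (gaussian_filter1d(z.real, smooth, axis=1)
                + 1j * gaussian_filter1d(z.imag, smooth, axis=1))

    Sxy = smooth_t(Wx * np.conj(Wy))
    Sxx = smooth_t(np.abs(Wx) ** 2).real
    Syy = smooth_t(np.abs(Wy) ** 2).real
    return np.abs(Sxy) ** 2 / (Sxx * Syy)
```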

DOI
01 Jan 2002
TL;DR: This thesis focuses on the development of new "true" two-dimensional representations for images using a discrete framework that can lead to algorithmic implementations and a new family of block directional and orthonormal transforms based on the ridgelet idea.
Abstract: Efficient representation of visual information lies at the foundation of many image processing tasks, including compression, filtering, and feature extraction. Efficiency of a representation refers to the ability to capture significant information of an object of interest in a small description. For practical applications, this representation has to be realized by structured transforms and fast algorithms. Recently, it has become evident that commonly used separable transforms (such as wavelets) are not necessarily best suited for images. Thus, there is a strong motivation to search for more powerful schemes that can capture the intrinsic geometrical structure of pictorial information. This thesis focuses on the development of new "true" two-dimensional representations for images. The emphasis is on the discrete framework that can lead to algorithmic implementations. The first method constructs multiresolution, local and directional image expansions by using non-separable filter banks. This discrete transform is developed in connection with the continuous-space curvelet construction in harmonic analysis. As a result, the proposed transform provides an efficient representation for two-dimensional piecewise smooth signals that resemble images. The link between the developed filter banks and the continuous-space constructions is set up in a newly defined directional multiresolution analysis. The second method constructs a new family of block directional and orthonormal transforms based on the ridgelet idea, and thus offers an efficient representation for images that are smooth away from straight edges. Finally, directional multiresolution image representations are employed together with statistical modeling, leading to powerful texture models and successful image retrieval systems.

Journal ArticleDOI
TL;DR: The proposed method for speckle reduction outperforms Kuan's local linear MMSE filtering by almost 3-dB signal-to-noise ratio and the visual quality of the results is excellent in terms of both background smoothing and preservation of edge sharpness, textures, and point targets.
Abstract: Speckle reduction is approached as a minimum mean-square error (MMSE) filtering performed in the undecimated wavelet domain by means of an adaptive rescaling of the detail coefficients, whose amplitude is divided by the variance ratio of the noisy coefficient to the noise-free one. All the above quantities are analytically calculated from the speckled image, the variance and autocorrelation of the fading variable, and the wavelet filters only, without resorting to any model to describe the underlying backscatter. On the test image Lena corrupted by synthetic speckle, the proposed method outperforms Kuan's local linear MMSE filtering by almost 3-dB signal-to-noise ratio. When true synthetic aperture radar (SAR) images are concerned, empirical criteria based on distributions of multiscale local coefficient of variation, calculated in the undecimated wavelet domain, are introduced to mitigate the rescaling of coefficients in highly heterogeneous areas where the speckle does not obey a fully developed model, to avoid blurring strong textures and point targets. Experiments carried out on widespread test SAR images and on a speckled mosaic image, comprising synthetic shapes, textures, and details from optical images, demonstrate that the visual quality of the results is excellent in terms of both background smoothing and preservation of edge sharpness, textures, and point targets. The absence of decimation in the wavelet decomposition avoids typical impairments often introduced by critically subsampled wavelet-based denoising.
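A simplified stand-in for MMSE rescaling of undecimated-wavelet detail coefficients: the gain is an estimated ratio of noise-free to noisy local variance, with a MAD noise estimate replacing the paper's analytical derivation from the speckle statistics, filter responses and autocorrelation, and without the heterogeneity criteria used to protect textures and point targets.

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def undecimated_mmse_despeckle(img, wavelet="db4", levels=3, win=7):
    """Rescale undecimated-wavelet detail coefficients by an estimated
    ratio of noise-free to noisy local variance (LMMSE-style gain)."""
    coeffs = pywt.swt2(np.asarray(img, dtype=float), wavelet, level=levels)
    out = []
    for approx, details in coeffs:
        shrunk = []
        for d in details:
            sigma_n2 = (np.median(np.abs(d)) / 0.6745) ** 2   # MAD noise estimate
            local_var = uniform_filter(d ** 2, size=win)      # noisy local variance
            gain = np.clip(1.0 - sigma_n2 / np.maximum(local_var, 1e-12), 0.0, 1.0)
            shrunk.append(gain * d)
        out.append((approx, tuple(shrunk)))
    return pywt.iswt2(out, wavelet)
```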

Journal ArticleDOI
TL;DR: Design procedures, based on spectral factorization, for the design of pairs of dyadic wavelet bases where the two wavelets form an approximate Hilbert transform pair are described.
Abstract: Several authors have demonstrated that significant improvements can be obtained in wavelet-based signal processing by utilizing a pair of wavelet transforms where the wavelets form a Hilbert transform pair. This paper describes design procedures, based on spectral factorization, for the design of pairs of dyadic wavelet bases where the two wavelets form an approximate Hilbert transform pair. Both orthogonal and biorthogonal FIR solutions are presented, as well as IIR solutions. In each case, the solution depends on an allpass filter having a flat delay response. The design procedure allows for an arbitrary number of vanishing wavelet moments to be specified. A Matlab program for the procedure is given, and examples are also given to illustrate the results.

Journal ArticleDOI
TL;DR: In this article, a wavelet packet transform (WPT) based method is proposed for the damage assessment of structures: the dynamic signals measured from a structure are first decomposed into wavelet packet components, and component energies are then calculated and used as inputs into neural network models for damage assessment.
Abstract: Wavelet transform (WT) is a mathematical tool that can decompose a temporal signal into a summation of time-domain basis functions of various frequency resolutions. This simultaneous time-frequency decomposition gives the WT a special advantage over the traditional Fourier transform in analyzing nonstationary signals. One drawback of the WT is that its resolution is rather poor in the high-frequency region. Since structural damage is typically a local phenomenon captured most likely by high frequency modes, this potential drawback can affect the application of the wavelet-based damage assessment techniques. The wavelet packet transform (WPT) adopts redundant basis functions and hence can provide an arbitrary time-frequency resolution. In this study, a WPT-based method is proposed for the damage assessment of structures. Dynamic signals measured from a structure are first decomposed into wavelet packet components. Component energies are then calculated and used as inputs into neural network models for damage assessment.
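A minimal sketch of the feature-extraction step: relative energies of the wavelet-packet components at a chosen decomposition level, intended as the inputs to a neural-network model (not shown); the wavelet and level are illustrative choices.

```python
import numpy as np
import pywt

def wpt_energy_features(signal, wavelet="db4", level=4):
    """Relative energies of the wavelet-packet components at one level,
    used as the input feature vector for a classifier (not shown)."""
    wp = pywt.WaveletPacket(data=np.asarray(signal, dtype=float),
                            wavelet=wavelet, maxlevel=level)
    energies = np.array([np.sum(node.data ** 2)
                         for node in wp.get_level(level, order="natural")])
    return energies / energies.sum()
```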

Journal ArticleDOI
TL;DR: This paper presents a new wavelet-based image denoising method, which extends a "geometrical" Bayesian framework and combines three criteria for distinguishing supposedly useful coefficients from noise: coefficient magnitudes, their evolution across scales and spatial clustering of large coefficients near image edges.
Abstract: This paper presents a new wavelet-based image denoising method, which extends a "geometrical" Bayesian framework. The new method combines three criteria for distinguishing supposedly useful coefficients from noise: coefficient magnitudes, their evolution across scales and spatial clustering of large coefficients near image edges. These three criteria are combined in a Bayesian framework. The spatial clustering properties are expressed in a prior model. The statistical properties concerning coefficient magnitudes and their evolution across scales are expressed in a joint conditional model. The three main novelties with respect to related approaches are (1) the interscale-ratios of wavelet coefficients are statistically characterized and different local criteria for distinguishing useful coefficients from noise are evaluated, (2) a joint conditional model is introduced, and (3) a novel anisotropic Markov random field prior model is proposed. The results demonstrate an improved denoising performance over related earlier techniques.

Proceedings ArticleDOI
07 Aug 2002
TL;DR: A detailed performance study of the effects of using different wavelets on the performance of similarity searching for time-series data is presented and several wavelets that outperform both the Haar wavelet and the best-known non-wavelet transformations for this application are included.
Abstract: Considers the use of wavelet transformations as a dimensionality reduction technique to permit efficient similarity searching over high-dimensional time-series data. While numerous transformations have been proposed and studied, the only wavelet that has been shown to be effective for this application is the Haar wavelet. In this work, we observe that a large class of wavelet transformations (not only orthonormal wavelets but also bi-orthonormal wavelets) can be used to support similarity searching. This class includes the most popular and most effective wavelets being used in image compression. We present a detailed performance study of the effects of using different wavelets on the performance of similarity searching for time-series data. We include several wavelets that outperform both the Haar wavelet and the best-known non-wavelet transformations for this application. To ensure our results are usable by an application engineer, we also show how to configure an indexing strategy for the best-performing transformations. Finally, we identify classes of data that can be indexed efficiently using these wavelet transformations.
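A sketch of wavelet-based dimensionality reduction for similarity search: keep the first (coarsest) coefficients of a chosen wavelet transform as the indexed signature. The wavelet choice and signature length are illustrative, and the indexing structure itself is not shown; for orthonormal wavelets, distances on these signatures lower-bound the true Euclidean distance, while biorthogonal wavelets need the additional care discussed in the paper.

```python
import numpy as np
import pywt

def wavelet_signature(series, wavelet="bior2.2", keep=16):
    """Concatenate the DWT coefficients (coarsest first) and keep the first
    `keep` of them as the low-dimensional signature to index."""
    coeffs = np.concatenate(pywt.wavedec(np.asarray(series, dtype=float), wavelet))
    return coeffs[:keep]

# Usage: rank candidates by signature distance, then verify the top hits
# against the full series.
# dists = [np.linalg.norm(wavelet_signature(q) - wavelet_signature(c)) for c in candidates]
```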

Journal ArticleDOI
TL;DR: Experimental results show that the proposed approach outperforms methods based on the intensity-hue-saturation transform, principal component analysis and discrete wavelet transform in preserving spectral and spatial information, especially in situations where the source images are not perfectly registered.

Journal ArticleDOI
TL;DR: This book covers inner product spaces, Fourier series, the Fourier transform, discrete Fourier analysis, wavelet analysis, multiresolution analysis and the Daubechies wavelets, together with other wavelet topics.
Abstract: 0. Inner Product Spaces. 1. Fourier Series. 2. The Fourier Transform. 3. Discrete Fourier Analysis. 4. Wavelet Analysis. 5. Multiresolution Analysis. 6. The Daubechies Wavelets. 7. Other Wavelet Topics. Appendix A. Technical Matters. Appendix B. Matlab Routines. Bibliography.

Journal ArticleDOI
TL;DR: In this paper, a discrete wavelet transform (DWT) was used to detect single and multiple faults on the inner race and outer race of ball bearings, as well as combinations of such faults.

Book ChapterDOI
01 Jan 2002
TL;DR: A framework for multiscale image analysis in which line segments play a role analogous to the role played by points in wavelet analysis is described.
Abstract: We describe a framework for multiscale image analysis in which line segments play a role analogous to the role played by points in wavelet analysis.

Journal ArticleDOI
TL;DR: The paper demonstrates that the proposed wavelet-based denoising method can successfully separate PD pulses from electrical noise, and that it can be used in pulse propagation studies of partial discharge in distributed-impedance plant to provide enhanced information and to infer the original site of the PD pulse.
Abstract: The objective of the paper is to discuss a tool which is proving extremely efficient in partial discharge measurement studies. Though the technique itself is not new, its application to partial discharge studies is. It will be demonstrated in this paper that it has tremendous power and this accounts for its rapid growth as an application in this field. The paper begins with the description of the fundamentals of wavelet analysis, wavelet categories and the properties of the associated wavelet transforms. PD pulses as acquired from detectors composed of different detection circuits are investigated and numerically simulated, and a method on how to select optimally the wavelet corresponding to the representative forms of PD pulse is then presented. Finally, applications of wavelet analysis to partial discharge studies are explored. The paper demonstrates that the wavelet based denoising method proposed in the paper can be employed in separating PD pulses from electrical noise successfully and can be used in pulse propagation studies of partial discharge in distributed impedance plant to provide enhanced information and further infer the original site of the PD pulse.

Proceedings ArticleDOI
02 Sep 2002
TL;DR: The novel application of the shift invariant and directionally selective Dual Tree Complex Wavelet Transform (DT-CWT) to image fusion is now introduced and provides improved qualitative and quantitative results compared to previous wavelet fusion methods.
Abstract: The fusion of images is the process of combining two or more images into a single image retaining important features from each. Fusion is an important technique within many disparate fields such as remote sensing, robotics and medical applications. Wavelet based fusion techniques have been reasonably effective in combining perceptually important image features. Shift invariance of the wavelet transform is important in ensuring robust subband fusion. Therefore, the novel application of the shift invariant and directionally selective Dual Tree Complex Wavelet Transform (DT-CWT) to image fusion is now introduced. This novel technique provides improved qualitative and quantitative results compared to previous wavelet fusion methods.
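A sketch of coefficient-level fusion using the shift-invariant stationary wavelet transform as a stand-in for the DT-CWT used in the paper (the DT-CWT additionally provides directional selectivity at lower redundancy): approximation coefficients are averaged and the larger-magnitude detail coefficient is kept at each location. The fusion rules and wavelet are illustrative assumptions.

```python
import numpy as np
import pywt

def fuse_images(a, b, wavelet="db4", levels=3):
    """Average approximation coefficients and keep the larger-magnitude
    detail coefficient at each position, then invert the transform."""
    ca = pywt.swt2(np.asarray(a, dtype=float), wavelet, level=levels)
    cb = pywt.swt2(np.asarray(b, dtype=float), wavelet, level=levels)
    fused = []
    for (aa, da), (ab, db) in zip(ca, cb):
        details = tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                        for x, y in zip(da, db))
        fused.append(((aa + ab) / 2.0, details))
    return pywt.iswt2(fused, wavelet)
```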

Journal ArticleDOI
TL;DR: This paper examines the class of generalized Morse wavelets, which are eigenfunction wavelets suitable for use in time-varying spectrum estimation via averaging of time-scale eigenscalograms, and shows that for γ > 1, generalized Morse wavelets can outperform the Hermites in energy concentration, contrary to a conclusion based on the γ = 1 case.
Abstract: This paper examines the class of generalized Morse wavelets, which are eigenfunction wavelets suitable for use in time-varying spectrum estimation via averaging of time-scale eigenscalograms. Generalized Morse wavelets of order k (the corresponding eigenvalue order) depend on a doublet of parameters (β, γ); we extend results derived for the special case β = γ = 1 and include a proof of "the resolution of identity." The wavelets are easy to compute using the discrete Fourier transform (DFT) and, for (β, γ) = (2m, 2), can be computed exactly. A correction of a previously published eigenvalue formula is given. This shows that for γ > 1, generalized Morse wavelets can outperform the Hermites in energy concentration, contrary to a conclusion based on the γ = 1 case. For complex signals, scalogram analyses must be carried out using both the analytic and anti-analytic complex wavelets or odd and even real wavelets, whereas for real signals, the analytic complex wavelet is sufficient.
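A sketch of the zeroth-order generalized Morse wavelet, built in the frequency domain as ω^β e^{−ω^γ} for ω > 0 and brought to the time domain with the inverse DFT; the higher-order eigenfunction wavelets and the paper's exact normalization are not reproduced, and the unit-energy convention below is an assumption.

```python
import numpy as np

def morse_wavelet(n, beta, gamma, scale=1.0):
    """Zeroth-order generalized Morse wavelet of length n, defined in the
    frequency domain as w**beta * exp(-w**gamma) for w > 0 and returned in
    the time domain (complex, analytic) via the inverse DFT.  Normalized to
    unit energy, a convention chosen here."""
    w = np.maximum(2.0 * np.pi * np.fft.fftfreq(n) * scale, 0.0)
    psi_hat = np.where(w > 0, w ** beta * np.exp(-w ** gamma), 0.0)
    psi_hat /= np.sqrt(np.sum(psi_hat ** 2) / n)   # Parseval: unit time-domain energy
    return np.fft.ifft(psi_hat)
```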

Journal ArticleDOI
TL;DR: This paper applies wavelet (packet) thresholding methods to denoise the function obtained in the previous step before adding it into the new iterate to improve the approximation.
Abstract: High-resolution image reconstruction refers to the reconstruction of high-resolution images from multiple low-resolution, shifted, degraded samples of a true image. In this paper, we analyze this problem from the wavelet point of view. By expressing the true image as a function in ${\cal L}_2({\Bbb R}^2)$, we derive iterative algorithms which recover the function completely in the ${\cal L}_2$ sense from the given low-resolution functions. These algorithms decompose the function obtained from the previous iteration into different frequency components in the wavelet transform domain and add them into the new iterate to improve the approximation. We apply wavelet (packet) thresholding methods to denoise the function obtained in the previous step before adding it into the new iterate. Our numerical results show that the reconstructed images from our wavelet algorithms are better than those from the Tikhonov least-squares approach. Extension to super-resolution image reconstruction, where some of the low-resolution images are missing, is also considered.