
Showing papers on "Wavelet published in 2001"


Journal ArticleDOI
TL;DR: This work gives examples exhibiting several advantages over MOF, MP, and BOB, including better sparsity and superresolution, and obtains reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.
Abstract: The time-frequency and time-scale communities have recently developed a large number of overcomplete waveform dictionaries---stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for decomposition have been proposed, including the method of frames (MOF), matching pursuit (MP), and, for special dictionaries, the best orthogonal basis (BOB). Basis pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions. We give examples exhibiting several advantages over MOF, MP, and BOB, including better sparsity and superresolution. BP has interesting relations to ideas in areas as diverse as ill-posed problems, abstract harmonic analysis, total variation denoising, and multiscale edge denoising. BP in highly overcomplete dictionaries leads to large-scale optimization problems. With signals of length 8192 and a wavelet packet dictionary, one gets an equivalent linear program of size 8192 by 212,992. Such problems can be attacked successfully only because of recent advances in linear and quadratic programming by interior-point methods. We obtain reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.
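
The basis pursuit principle described above reduces, for real signals, to an l1-minimization that can be recast as a linear program. The sketch below illustrates this on a small random overcomplete dictionary using SciPy's general-purpose LP solver; the dictionary, problem sizes, and solver choice are illustrative assumptions, not the paper's wavelet-packet setup or its primal-dual log-barrier implementation.

```python
# Minimal basis-pursuit sketch: minimize ||c||_1 subject to D @ c = s,
# recast as a linear program with c = u - v and u, v >= 0.
# Toy random dictionary; not the paper's wavelet-packet dictionary or its
# primal-dual log-barrier solver.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p = 64, 256                               # signal length, dictionary size (p > n)
D = rng.standard_normal((n, p))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms

c_true = np.zeros(p)
c_true[rng.choice(p, 5, replace=False)] = rng.standard_normal(5)
s = D @ c_true                               # signal with a known 5-sparse representation

# LP: minimize sum(u) + sum(v)  subject to  D @ (u - v) = s,  u >= 0, v >= 0
res = linprog(c=np.ones(2 * p),
              A_eq=np.hstack([D, -D]), b_eq=s,
              bounds=[(0, None)] * (2 * p), method="highs")
c_bp = res.x[:p] - res.x[p:]

print("nonzeros recovered :", int(np.sum(np.abs(c_bp) > 1e-6)))
print("max coefficient err:", float(np.max(np.abs(c_bp - c_true))))
```

At the scale quoted in the abstract (a linear program of size 8192 by 212,992), a generic solver like the one above would not be practical, which is why the paper turns to interior-point methods with conjugate-gradient inner solves.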

4,387 citations


Journal ArticleDOI
TL;DR: In this article, a dual tree of wavelet filters is proposed to obtain the real and imaginary parts of a complex wavelet transform, and the dual tree can be extended to images and other multi-dimensional signals.

1,767 citations


Book
26 Sep 2001
TL;DR: A book covering linear filters, optimum linear estimation, discrete wavelet transforms, wavelets and stationary processes, wavelet denoising, wavelets for variance-covariance estimation, and artificial neural networks.
Abstract: Preface. Introduction. Linear Filters. Optimum Linear Estimation. Discrete Wavelet Transforms. Wavelets and Stationary Processes. Wavelet Denoising. Wavelets for Variance-Covariance Estimation. Artificial Neural Networks.

1,107 citations


Journal ArticleDOI
TL;DR: Just as frequency-domain methods decompose the variance of a time series into frequency components, so wavelets decompose the variance according to scale, and fast transform methods exist for wavelets as well as for Fourier transforms.
Abstract: (2001). Wavelet Methods for Time Series Analysis. Technometrics: Vol. 43, No. 4, pp. 491-491.
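
The scale-based decomposition of variance mentioned in the TL;DR can be illustrated in a few lines. The sketch below uses PyWavelets (an assumed dependency) and an orthonormal DWT with periodic boundary handling, so the squared coefficients at each level partition the total energy of a centred series; the random-walk test series is made up for illustration.

```python
# Wavelet "analysis of variance" sketch: with an orthonormal DWT and periodic
# boundary handling, the squared coefficients at each level partition the
# total energy of a centred series, i.e. they decompose its variance by scale.
# Assumes PyWavelets (pywt); the random-walk series is purely illustrative.
import numpy as np
import pywt

rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(1024))     # a 1/f-like test series
x -= x.mean()

coeffs = pywt.wavedec(x, "db4", level=5, mode="periodization")  # [cA5, cD5, ..., cD1]
total = np.sum(x ** 2)

for level, d in zip(range(5, 0, -1), coeffs[1:]):
    print(f"detail level {level}: {np.sum(d ** 2) / total:6.3f} of total energy")
print(f"approximation (level 5): {np.sum(coeffs[0] ** 2) / total:6.3f}")
```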

1,025 citations


Journal ArticleDOI
TL;DR: A new approach to mask the watermark according to the characteristics of the human visual system (HVS) is presented, which is accomplished pixel by pixel by taking into account the texture and the luminance content of all the image subbands.
Abstract: A watermarking algorithm operating in the wavelet domain is presented. Performance improvement with respect to existing algorithms is obtained by means of a new approach to mask the watermark according to the characteristics of the human visual system (HVS). In contrast to conventional methods operating in the wavelet domain, masking is accomplished pixel by pixel by taking into account the texture and the luminance content of all the image subbands. The watermark consists of a pseudorandom sequence which is adaptively added to the largest detail bands. As usual, the watermark is detected by computing the correlation between the watermarked coefficients and the watermarking code, and the detection threshold is chosen in such a way that the knowledge of the watermark energy used in the embedding phase is not needed, thus permitting one to adapt it to the image at hand. Experimental results and comparisons with other techniques operating in the wavelet domain prove the effectiveness of the new algorithm.
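
A stripped-down sketch of the general embed-and-correlate scheme described above follows: a pseudorandom sequence is added to a detail subband and detected by correlating the received coefficients with the watermark code. PyWavelets, the synthetic "image", and the single global embedding strength are assumptions; the paper's pixel-wise HVS masking and threshold selection are not reproduced.

```python
# Simplified additive wavelet-domain watermark: add a pseudorandom sequence to
# one detail subband and detect it by correlation with the watermark code.
# The paper's pixel-wise HVS masking and adaptive threshold are omitted; a
# single global strength alpha is used.  Assumes PyWavelets and a toy "image".
import numpy as np
import pywt

rng = np.random.default_rng(42)
img = rng.uniform(0, 255, (256, 256))                 # stand-in image

cA, (cH, cV, cD) = pywt.dwt2(img, "db2", mode="periodization")
mark = rng.choice([-1.0, 1.0], size=cH.shape)         # pseudorandom watermark code
alpha = 2.0                                           # embedding strength (illustrative)

watermarked = pywt.idwt2((cA, (cH + alpha * mark, cV, cD)), "db2", mode="periodization")

# Detection: correlate the received detail band with the watermark code.
_, (cH_r, _, _) = pywt.dwt2(watermarked, "db2", mode="periodization")
print("correlation, mark present:", round(float(np.mean(cH_r * mark)), 3))
print("correlation, mark absent :", round(float(np.mean(cH * mark)), 3))
```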

949 citations


Journal ArticleDOI
TL;DR: A direct comparison between two methods for quantification of phase synchrony between neuronal signals is conducted on three signal sets, and it is concluded that they are fundamentally equivalent for the study of neuroelectrical signals.
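
The entry above compares two estimators of phase synchrony. Assuming these are the Hilbert-transform and wavelet (complex Morlet) phase estimates, as is common in this literature, the sketch below computes the phase-locking value (PLV) both ways on a pair of synthetic narrowband signals; the signals, centre frequency, and wavelet width are made-up illustrative choices.

```python
# Phase synchrony two ways: instantaneous phase from (a) the analytic signal
# (Hilbert transform) and (b) convolution with a complex Morlet wavelet,
# summarised by the phase-locking value (PLV).  Signals, centre frequency and
# wavelet width are invented for illustration.
import numpy as np
from scipy.signal import hilbert

fs, f0 = 500.0, 10.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(3)
x = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * f0 * t + 0.5) + 0.3 * rng.standard_normal(t.size)

def plv(ph1, ph2):
    return np.abs(np.mean(np.exp(1j * (ph1 - ph2))))

# (a) Hilbert phases of the (already narrowband) signals
ph_x_h, ph_y_h = np.angle(hilbert(x)), np.angle(hilbert(y))

# (b) Morlet-wavelet phases at f0 (7-cycle wavelet)
sigma_t = 7 / (2 * np.pi * f0)
tw = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
morlet = np.exp(2j * np.pi * f0 * tw) * np.exp(-tw ** 2 / (2 * sigma_t ** 2))
ph_x_w = np.angle(np.convolve(x, morlet, mode="same"))
ph_y_w = np.angle(np.convolve(y, morlet, mode="same"))

print("PLV (Hilbert):", round(plv(ph_x_h, ph_y_h), 3))
print("PLV (wavelet):", round(plv(ph_x_w, ph_y_w), 3))
```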

886 citations


Proceedings ArticleDOI
07 Oct 2001
TL;DR: In the tests reported here, simple thresholding of the curvelet coefficients is very competitive with 'state of the art' techniques based on wavelets, including thresholding of decimated or undecimated wavelet transforms and also including tree-based Bayesian posterior mean methods.
Abstract: Summary form only given, as follows. We present approximate digital implementations of two new mathematical transforms, namely, the ridgelet transform and the curvelet transform. Our implementations offer exact reconstruction, stability against perturbations, ease of implementation, and low computational complexity. We apply these digital transforms to the denoising of some standard images embedded in white noise. In the tests reported here, simple thresholding of the curvelet coefficients is very competitive with 'state of the art' techniques based on wavelets, including thresholding of decimated or undecimated wavelet transforms and also including tree-based Bayesian posterior mean methods. Moreover, the curvelet reconstructions exhibit higher perceptual quality than wavelet-based reconstructions, offering visually sharper images and, in particular, higher quality recovery of edges and of faint linear and curvilinear features.

857 citations


Journal ArticleDOI
TL;DR: This work suggests a two-stage separation process: a priori selection of a possibly overcomplete signal dictionary in which the sources are assumed to be sparsely representable, followed by unmixing the sources by exploiting their sparse representability.
Abstract: The blind source separation problem is to extract the underlying source signals from a set of linear mixtures, where the mixing matrix is unknown. This situation is common in acoustics, radio, medical signal and image processing, hyperspectral imaging, and other areas. We suggest a two-stage separation process: a priori selection of a possibly overcomplete signal dictionary (for instance, a wavelet frame or a learned dictionary) in which the sources are assumed to be sparsely representable, followed by unmixing the sources by exploiting their sparse representability. We consider the general case of more sources than mixtures, but also derive a more efficient algorithm in the case of a nonovercomplete dictionary and an equal number of sources and mixtures. Experiments with artificial signals and musical sounds demonstrate significantly better separation than other known techniques.
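
A toy version of the geometric intuition behind the two-stage process, for the square two-source, two-mixture case, is sketched below: in a wavelet dictionary the sources are sparse, so the largest mixture coefficients cluster along the columns of the mixing matrix, whose directions can be estimated and inverted. PyWavelets, the synthetic sources, and the k-means clustering step are illustrative assumptions, not the paper's algorithm.

```python
# Toy sparse-dictionary separation for the square 2x2 case: the sources are
# sparse in a wavelet dictionary, so the largest mixture coefficients cluster
# along the columns of the (unknown) mixing matrix A; estimate those
# directions by clustering and invert.  Simplified geometric illustration
# only, not the paper's algorithm.  Assumes PyWavelets and SciPy.
import numpy as np
import pywt
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(7)
n = 2048
t = np.linspace(0, 1, n)
s1 = np.sign(np.sin(2 * np.pi * 5 * t))            # piecewise-constant source
s2 = np.zeros(n)
s2[rng.choice(n, 20, replace=False)] = 3 * rng.standard_normal(20)  # spike-train source
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])             # "unknown" mixing matrix
X = A @ S                                          # observed mixtures

def wcoeffs(x):
    return np.concatenate(pywt.wavedec(x, "db4", mode="periodization"))

W = np.vstack([wcoeffs(X[0]), wcoeffs(X[1])])      # mixture wavelet coefficients
mag = np.linalg.norm(W, axis=0)
big = W[:, mag > np.quantile(mag, 0.90)]           # keep only the largest coefficients
dirs = (big * np.sign(big[0] + 1e-12)) / np.linalg.norm(big, axis=0)  # fold to half-plane

centroids, _ = kmeans2(dirs.T, 2, minit="++", seed=0)
A_hat = centroids.T / np.linalg.norm(centroids.T, axis=0)
S_hat = np.linalg.solve(A_hat, X)                  # unmix (square, invertible case)

for i in range(2):
    c = max(abs(np.corrcoef(S_hat[i], S[j])[0, 1]) for j in range(2))
    print(f"recovered source {i}: best |correlation| with a true source = {c:.3f}")
```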

829 citations


Journal ArticleDOI
TL;DR: The major objective of the present work was to characterize, in a quantitative way, the functional dynamics of order/disorder microstates in short-duration EEG signals, with specific quantifiers derived to characterize how a stimulus affects electrical events in terms of frequency synchronization (tuning) in the event-related potentials.

780 citations


Journal ArticleDOI
TL;DR: It is shown that four channels of myoelectric data greatly improve the classification accuracy, as compared to one or two channels, and a robust online classifier is constructed, which produces class decisions on a continuous stream of data.
Abstract: This work represents an ongoing investigation of dexterous and natural control of powered upper limbs using the myoelectric signal. When approached as a pattern recognition problem, the success of a myoelectric control scheme depends largely on the classification accuracy. A novel approach is described that demonstrates greater accuracy than in previous work. Fundamental to the success of this method is the use of a wavelet-based feature set, reduced in dimension by principal components analysis. Further, it is shown that four channels of myoelectric data greatly improve the classification accuracy, as compared to one or two channels. It is demonstrated that exceptionally accurate performance is possible using the steady-state myoelectric signal. Exploiting these successes, a robust online classifier is constructed, which produces class decisions on a continuous stream of data. Although in its preliminary stages of development, this scheme promises a more natural and efficient means of myoelectric control than one based on discrete, transient bursts of activity.
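
The pipeline described above (wavelet-based features, dimensionality reduction by principal components analysis, then classification) can be sketched as follows. The synthetic four-channel windows, the log-energy wavelet features, and the linear discriminant classifier are stand-ins chosen for illustration; PyWavelets and scikit-learn are assumed dependencies and this is not the paper's data or exact classifier.

```python
# Sketch of the pipeline: per-channel wavelet subband energies as features,
# PCA for dimensionality reduction, then a simple classifier.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_windows, n_channels, win_len, n_classes = 400, 4, 256, 4

def synth_window(cls):
    # each "motion class" gets a different band emphasis on one channel
    t = np.arange(win_len)
    x = rng.standard_normal((n_channels, win_len))
    x[cls % n_channels] += 2.0 * np.sin(2 * np.pi * (0.02 + 0.02 * cls) * t)
    return x

def wavelet_features(window):
    feats = []
    for ch in window:                                  # log energy per subband
        coeffs = pywt.wavedec(ch, "db4", level=4)
        feats.extend(np.log(np.sum(c ** 2) + 1e-12) for c in coeffs)
    return np.array(feats)

labels = rng.integers(0, n_classes, n_windows)
X = np.vstack([wavelet_features(synth_window(c)) for c in labels])

Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = make_pipeline(PCA(n_components=8), LinearDiscriminantAnalysis())
clf.fit(Xtr, ytr)
print("held-out accuracy:", round(clf.score(Xte, yte), 3))
```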

690 citations


Journal ArticleDOI
TL;DR: It is concluded that wavelet resampling, based on random permutation after orthogonal transformation of the observed time series to the wavelet domain, may be a generally useful method for inference on naturally complex time series.
Abstract: Even in the absence of an experimental effect, functional magnetic resonance imaging (fMRI) time series generally demonstrate serial dependence. This colored noise or endogenous autocorrelation typically has disproportionate spectral power at low frequencies, i.e., its spectrum is (1/f)-like. Various pre-whitening and pre-coloring strategies have been proposed to make valid inference on standardised test statistics estimated by time series regression in this context of residually autocorrelated errors. Here we introduce a new method based on random permutation after orthogonal transformation of the observed time series to the wavelet domain. This scheme exploits the general whitening or decorrelating property of the discrete wavelet transform and is implemented using a Daubechies wavelet with four vanishing moments to ensure exchangeability of wavelet coefficients within each scale of decomposition. For (1/f)-like or fractal noises, e.g., realisations of fractional Brownian motion (fBm) parameterised by Hurst exponent 0 < H < 1, this resampling algorithm exactly preserves wavelet-based estimates of the second order stochastic properties of the (possibly nonstationary) time series. Performance of the method is assessed empirically using (1/f)-like noise simulated by multiple physical relaxation processes, and experimental fMRI data. Nominal type 1 error control in brain activation mapping is demonstrated by analysis of 13 images acquired under null or resting conditions. Compared to autoregressive pre-whitening methods for computational inference, a key advantage of wavelet resampling seems to be its robustness in activation mapping of experimental fMRI data acquired at 3 Tesla field strength. We conclude that wavelet resampling may be a generally useful method for inference on naturally complex time series.
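
A minimal sketch of the resampling scheme described above follows: transform the series with a Daubechies wavelet with four vanishing moments ('db4'), permute the coefficients independently within each detail scale, and invert. PyWavelets is an assumed dependency, and boundary handling and other details of the published method are simplified here.

```python
# Minimal "wavestrapping" sketch: permute wavelet coefficients within each
# scale and invert the transform to obtain a surrogate series.
import numpy as np
import pywt

def wavelet_resample(x, wavelet="db4", rng=None):
    rng = np.random.default_rng() if rng is None else rng
    coeffs = pywt.wavedec(x, wavelet, mode="periodization")
    resampled = [coeffs[0]] + [rng.permutation(c) for c in coeffs[1:]]
    return pywt.waverec(resampled, wavelet, mode="periodization")

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(512))          # a 1/f-like (random walk) series
surrogates = np.array([wavelet_resample(x, rng=rng) for _ in range(200)])
print("original variance       :", round(float(x.var()), 2))
print("mean surrogate variance :", round(float(surrogates.var(axis=1).mean()), 2))
```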

Journal ArticleDOI
TL;DR: A novel speckle suppression method for medical ultrasound images uses the alpha-stable model of the wavelet subband statistics to design a Bayesian estimator, yielding a blind noise-removal processor that performs a nonlinear operation on the data.
Abstract: A novel speckle suppression method for medical ultrasound images is presented. First, the logarithmic transform of the original image is analyzed into the multiscale wavelet domain. The authors show that the subband decompositions of ultrasound images have significantly non-Gaussian statistics that are best described by families of heavy-tailed distributions such as the alpha-stable. Then, the authors design a Bayesian estimator that exploits these statistics. They use the alpha-stable model to develop a blind noise-removal processor that performs a nonlinear operation on the data. Finally, the authors compare their technique with current state-of-the-art soft and hard thresholding methods applied on actual ultrasound medical images and they quantify the achieved performance improvement.

Journal ArticleDOI
TL;DR: This work greatly simplify the HMT model by exploiting the inherent self-similarity of real-world images, and introduces a Bayesian universal HMT (uHMT) that fixes these nine parameters.
Abstract: Wavelet-domain hidden Markov models have proven to be useful tools for statistical signal and image processing. The hidden Markov tree (HMT) model captures the key features of the joint probability density of the wavelet coefficients of real-world data. One potential drawback to the HMT framework is the need for computationally expensive iterative training to fit an HMT model to a given data set (e.g., using the expectation-maximization algorithm). We greatly simplify the HMT model by exploiting the inherent self-similarity of real-world images. The simplified model specifies the HMT parameters with just nine meta-parameters (independent of the size of the image and the number of wavelet scales). We also introduce a Bayesian universal HMT (uHMT) that fixes these nine parameters. The uHMT requires no training of any kind. While extremely simple, these new models are shown, through a series of image estimation/denoising experiments, to retain nearly all of the key image structure modeled by the full HMT. Finally, we propose a fast shift-invariant HMT estimation algorithm that outperforms other wavelet-based estimators in the current literature, both visually and in mean square error.

Journal ArticleDOI
01 Sep 2001
TL;DR: The use of multi-dimensional wavelets is proposed as an effective tool for general-purpose approximate query processing in modern, high-dimensional applications, and a novel wavelet decomposition algorithm is proposed that can build wavelet-coefficient synopses of the data in an I/O-efficient manner.
Abstract: Approximate query processing has emerged as a cost-effective approach for dealing with the huge data volumes and stringent response-time requirements of today's decision support systems (DSS). Most work in this area, however, has so far been limited in its query processing scope, typically focusing on specific forms of aggregate queries. Furthermore, conventional approaches based on sampling or histograms appear to be inherently limited when it comes to approximating the results of complex queries over high-dimensional DSS data sets. In this paper, we propose the use of multi-dimensional wavelets as an effective tool for general-purpose approximate query processing in modern, high-dimensional applications. Our approach is based on building wavelet-coefficient synopses of the data and using these synopses to provide approximate answers to queries. We develop novel query processing algorithms that operate directly on the wavelet-coefficient synopses of relational tables, allowing us to process arbitrarily complex queries entirely in the wavelet-coefficient domain. This guarantees extremely fast response times since our approximate query execution engine can do the bulk of its processing over compact sets of wavelet coefficients, essentially postponing the expansion into relational tuples until the end-result of the query. We also propose a novel wavelet decomposition algorithm that can build these synopses in an I/O-efficient manner. Finally, we conduct an extensive experimental study with synthetic as well as real-life data sets to determine the effectiveness of our wavelet-based approach compared to sampling and histograms. Our results demonstrate that our techniques: (1) provide approximate answers of better quality than either sampling or histograms; (2) offer query execution-time speedups of more than two orders of magnitude; and (3) guarantee extremely fast synopsis construction times that scale linearly with the size of the data.
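
A one-dimensional toy version of the wavelet-coefficient synopsis idea is sketched below: take an orthonormal Haar transform of a frequency vector, retain only the largest coefficients, and answer a range-sum query approximately from the truncated synopsis. PyWavelets and the synthetic data are assumptions; the paper's multi-dimensional, I/O-efficient construction and its in-domain query algebra are not reproduced.

```python
# Wavelet synopsis toy: Haar-transform a frequency vector, keep only the k
# largest coefficients, and answer a range-sum query from the reconstruction.
import numpy as np
import pywt

rng = np.random.default_rng(0)
data = np.maximum(0, rng.normal(100, 30, 1024)).round()     # toy attribute frequencies

coeffs = pywt.wavedec(data, "haar", mode="periodization")
flat = np.concatenate(coeffs)
k = 32                                                      # synopsis budget
thresh = np.sort(np.abs(flat))[-k]                          # keep the k largest coefficients
pruned = [np.where(np.abs(c) >= thresh, c, 0.0) for c in coeffs]
approx = pywt.waverec(pruned, "haar", mode="periodization")

lo, hi = 100, 400                                           # a range-sum query
exact, estimate = data[lo:hi].sum(), approx[lo:hi].sum()
print(f"exact = {exact:.0f}, synopsis estimate = {estimate:.0f}, "
      f"relative error = {abs(estimate - exact) / exact:.2%}")
```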

Journal ArticleDOI
TL;DR: In this paper, the authors introduce nonlinear regularized wavelet estimators for estimating nonparametric regression functions when sampling points are not uniformly spaced, and obtain Oracle inequalities and universal thresholding parameters for a large class of penalty functions.
Abstract: In this paper, we introduce nonlinear regularized wavelet estimators for estimating nonparametric regression functions when sampling points are not uniformly spaced. The approach can apply readily to many other statistical contexts. Various new penalty functions are proposed. The hard-thresholding and soft-thresholding estimators of Donoho and Johnstone are specific members of nonlinear regularized wavelet estimators. They correspond to the lower and upper envelopes of a class of the penalized least squares estimators. Necessary conditions for penalty functions are given for regularized estimators to possess thresholding properties. Oracle inequalities and universal thresholding parameters are obtained for a large class of penalty functions. The sampling properties of nonlinear regularized wavelet estimators are established and are shown to be adaptively minimax. To efficiently solve penalized least squares problems, nonlinear regularized Sobolev interpolators (NRSI) are proposed as initial estimators, wh...
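
The connection stated in the abstract between thresholding rules and penalized least squares can be written coefficient-wise as follows; the notation (z for an empirical wavelet coefficient, p_lambda for the penalty) is ours, and the display is a standard formulation consistent with the abstract rather than a quotation from the paper.

```latex
% Coefficient-wise penalized least squares; notation ours.
\hat{\theta}(z) \;=\; \arg\min_{\theta}\Bigl\{\tfrac{1}{2}(z-\theta)^{2} + p_{\lambda}(|\theta|)\Bigr\},
\qquad
\begin{aligned}
p_{\lambda}(|\theta|) &= \lambda|\theta|
  &&\Longrightarrow\ \hat{\theta}(z) = \operatorname{sgn}(z)\,(|z|-\lambda)_{+}
  &&\text{(soft thresholding)},\\
p_{\lambda}(|\theta|) &= \tfrac{\lambda^{2}}{2}\,\mathbf{1}\{\theta\neq 0\}
  &&\Longrightarrow\ \hat{\theta}(z) = z\,\mathbf{1}\{|z|>\lambda\}
  &&\text{(hard thresholding)}.
\end{aligned}
```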

Journal ArticleDOI
TL;DR: The main result of the paper is the construction of an adaptive scheme which produces an approximation to u with error O(N^{-s}) in the energy norm, whenever such a rate is possible by N-term approximation.
Abstract: This paper is concerned with the construction and analysis of wavelet-based adaptive algorithms for the numerical solution of elliptic equations. These algorithms approximate the solution u of the equation by a linear combination of N wavelets. Therefore, a benchmark for their performance is provided by the rate of best approximation to u by an arbitrary linear combination of N wavelets (so-called N-term approximation), which would be obtained by keeping the N largest wavelet coefficients of the real solution (which of course is unknown). The main result of the paper is the construction of an adaptive scheme which produces an approximation to u with error O(N^{-s}) in the energy norm, whenever such a rate is possible by N-term approximation. The range of s > 0 for which this holds is only limited by the approximation properties of the wavelets together with their ability to compress the elliptic operator. Moreover, it is shown that the number of arithmetic operations needed to compute the approximate solution stays proportional to N. The adaptive algorithm applies to a wide class of elliptic problems and wavelet bases. The analysis in this paper puts forward new techniques for treating elliptic problems as well as the linear systems of equations that arise from the wavelet discretization.
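
The benchmark referred to in the abstract, best N-term approximation in the energy norm, can be written out explicitly; the notation below is ours, with {psi_lambda} the wavelet basis and the norm taken to be the energy norm, and it summarises the abstract's statement rather than quoting the paper.

```latex
% Best N-term approximation benchmark and the adaptive scheme's guarantee; notation ours.
\sigma_{N}(u) \;=\; \inf_{\#\Lambda\le N,\;(c_{\lambda})}\,
  \Bigl\| u - \sum_{\lambda\in\Lambda} c_{\lambda}\,\psi_{\lambda} \Bigr\|,
\qquad
\bigl\| u - u_{N} \bigr\| \;\le\; C\,N^{-s}
\quad\text{whenever}\quad \sigma_{N}(u) = O\!\left(N^{-s}\right).
```

Here u_N denotes the adaptive approximation, and the abstract adds that the arithmetic work needed to compute it remains proportional to N.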

Journal ArticleDOI
TL;DR: This article builds on the background material in generic transform coding given, shows that boundary effects motivate the use of biorthogonal wavelets, and introduces the symmetric wavelet transform.
Abstract: One of the purposes of this article is to give a general audience sufficient background into the details and techniques of wavelet coding to better understand the JPEG 2000 standard. The focus is on the fundamental principles of wavelet coding and not the actual standard itself. Some of the confusing design choices made in wavelet coders are explained. There are two types of filter choices: orthogonal and biorthogonal. Orthogonal filters have the property that they are energy or norm preserving. Nevertheless, modern wavelet coders use biorthogonal filters, which do not preserve energy. Reasons for these specific design choices are explained. Another purpose of this article is to compare and contrast "early" wavelet coding with "modern" wavelet coding. This article compares the techniques of the modern wavelet coders to the subband coding techniques so that the reader can appreciate how different modern wavelet coding is from early wavelet coding. It discusses basic properties of the wavelet transform which are pertinent to image compression. It builds on the background material in generic transform coding given, shows that boundary effects motivate the use of biorthogonal wavelets, and introduces the symmetric wavelet transform. The subband coding or "early" wavelet coding method is discussed, followed by an explanation of the EZW coding algorithm. Other modern wavelet coders that extend the ideas found in the EZW algorithm are also described.
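
The two design choices discussed above, biorthogonal filters and symmetric handling of image boundaries, can be exercised directly with PyWavelets, which is an assumed dependency; the 'bior4.4' filter pair is used here as a stand-in for the 9/7 filters commonly associated with JPEG 2000, and no quantisation or entropy coding is performed, so this is a transform-only sketch rather than a codec.

```python
# Two-level 2-D wavelet transform with biorthogonal filters ('bior4.4') and
# symmetric boundary extension, followed by perfect reconstruction.
import numpy as np
import pywt

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, (128, 128))                 # stand-in image

coeffs = pywt.wavedec2(img, "bior4.4", mode="symmetric", level=2)
rec = pywt.waverec2(coeffs, "bior4.4", mode="symmetric")
rec = rec[: img.shape[0], : img.shape[1]]             # symmetric extension can add trailing samples
print("max reconstruction error:", float(np.max(np.abs(rec - img))))

# Biorthogonal filters are not norm preserving: with an orthogonal filter bank
# and periodic extension the coefficient energy would equal the image energy,
# but here it does not.
arr, _ = pywt.coeffs_to_array(coeffs)
print("image energy / coefficient energy (bior4.4):",
      round(float(np.sum(img ** 2) / np.sum(arr ** 2)), 3))
```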

Proceedings Article
11 Sep 2001
TL;DR: This work presents general “sketch” based methods for capturing various linear projections of the data and uses them to provide point-wise and range-sum estimation of data streams.
Abstract: We present techniques for computing small space representations of massive data streams. These are inspired by traditional wavelet-based approximations that consist of specific linear projections of the underlying data. We present general “sketch” based methods for capturing various linear projections of the data and use them to provide point-wise and range-sum estimation of data streams. These methods use small amounts of space and per-item time while streaming through the data, and provide accurate representation, as our experiments with real data streams show.
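
The "sketch" idea, maintaining a few random linear projections of the frequency vector while streaming and answering point and range-sum queries from them, can be illustrated with a small Count-Sketch-style toy; the tabulated hash arrays, parameters, and Zipf-distributed stream below are illustrative assumptions and do not reproduce the paper's exact constructions or guarantees.

```python
# Count-Sketch-style toy: keep a few random +/-1 projections of the frequency
# vector while streaming, then answer point (and hence small range-sum)
# queries from them.  Tabulated arrays stand in for hash functions here.
import numpy as np

rng = np.random.default_rng(0)
domain, rows, width = 10000, 5, 256

buckets = rng.integers(0, width, size=(rows, domain))   # "hash" of each item per row
signs = rng.choice([-1.0, 1.0], size=(rows, domain))    # random sign per item per row
table = np.zeros((rows, width))                         # the sketch itself (small space)

stream = np.minimum(rng.zipf(1.3, 50000) - 1, domain - 1)   # skewed stream of item ids
true_freq = np.bincount(stream, minlength=domain)

row_idx = np.arange(rows)
for item in stream:                                     # constant work per update
    table[row_idx, buckets[:, item]] += signs[:, item]

def point_estimate(i):
    return float(np.median(table[row_idx, buckets[:, i]] * signs[:, i]))

heavy = int(np.argmax(true_freq))
print("heavy item: true =", int(true_freq[heavy]),
      " estimate =", round(point_estimate(heavy), 1))
print("range sum f[0:8]: true =", int(true_freq[:8].sum()),
      " estimate =", round(sum(point_estimate(i) for i in range(8)), 1))
```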

Journal ArticleDOI
TL;DR: In this article, the authors describe a mission-independent wavelet-based source detection algorithm called WAVDETECT, which does not require a minimum number of background counts per pixel for the accurate computation of source detection thresholds.
Abstract: Wavelets are scaleable, oscillatory functions that deviate from zero only within a limited spatial regime and have average value zero. In addition to their use as source characterizers, wavelet functions are rapidly gaining currency within the source detection field. Wavelet-based source detection involves the correlation of scaled wavelet functions with binned, two-dimensional image data. If the chosen wavelet function exhibits the property of vanishing moments, significantly non-zero correlation coefficients will be observed only where there are high-order variations in the data; e.g., they will be observed in the vicinity of sources. In this paper, we describe the mission-independent, wavelet-based source detection algorithm WAVDETECT, part of the CIAO software package. Aspects of our algorithm include: (1) the computation of local, exposure-corrected normalized (i.e. flat-fielded) background maps; (2) the correction for exposure variations within the field-of-view; (3) its applicability within the low-counts regime, as it does not require a minimum number of background counts per pixel for the accurate computation of source detection thresholds; (4) the generation of a source list in a manner that does not depend upon a detailed knowledge of the point spread function (PSF) shape; and (5) error analysis. These features make our algorithm considerably more general than previous methods developed for the analysis of X-ray image data, especially in the low count regime. We demonstrate the algorithm's robustness by applying it to various images.

Book
01 Jan 2001
TL;DR: Topics include wavelets and wavelet thresholding, smoothing non-equidistantly spaced data using second-generation wavelets and thresholding, and Bayesian correction with geometrical priors for image noise reduction.
Abstract: Wavelets and wavelet thresholding. The minimum mean squared error threshold. Estimating the minimum MSE threshold. Thresholding and GCV applicability in more realistic situations. Bayesian correction with geometrical priors for image noise reduction. Smoothing non-equidistantly spaced data using second generation wavelets and thresholding.

Journal ArticleDOI
TL;DR: A new image texture segmentation algorithm, HMTseg, based on wavelets and the hidden Markov tree model, is introduced; it can directly segment wavelet-compressed images without the need for decompression into the space domain.
Abstract: We introduce a new image texture segmentation algorithm, HMTseg, based on wavelets and the hidden Markov tree (HMT) model. The HMT is a tree-structured probabilistic graph that captures the statistical properties of the coefficients of the wavelet transform. Since the HMT is particularly well suited to images containing singularities (edges and ridges), it provides a good classifier for distinguishing between textures. Utilizing the inherent tree structure of the wavelet HMT and its fast training and likelihood computation algorithms, we perform texture classification at a range of different scales. We then fuse these multiscale classifications using a Bayesian probabilistic graph to obtain reliable final segmentations. Since HMTseg works on the wavelet transform of the image, it can directly segment wavelet-compressed images without the need for decompression into the space domain. We demonstrate the performance of HMTseg with synthetic, aerial photo, and document image segmentations.

Journal ArticleDOI
TL;DR: In this paper, the authors use wavelet analysis and envelope detection (ED) to detect bearing failures in a motor-pump driven system; since wavelet analysis can detect both periodic and non-periodic signals, it allows the machine operator to more easily detect the types of bearing faults that FFT with ED cannot.
Abstract: The components which often fail in a rolling element bearing are the outer-race, the inner-race, the rollers, and the cage. Such failures generate a series of impact vibrations in short time intervals, which occur at Bearing Characteristic Frequencies (BCF). Since BCF contain very little energy, and are usually overwhelmed by noise and higher levels of macro-structural vibrations, they are difficult to find in their frequency spectra when using the common technique of Fast Fourier Transforms (FFT). Therefore, Envelope Detection (ED) is always used with FFT to identify faults occurring at the BCF. However, the computation of ED is complicated, and requires expensive equipment and experienced operators to process. This, coupled with the incapacity of FFT to detect nonstationary signals, makes wavelet analysis a popular alternative for machine fault diagnosis. Wavelet analysis provides multi-resolution in time-frequency distribution for easier detection of abnormal vibration signals. From the results of extensive experiments performed in a series of motor-pump driven systems, the methods of wavelet analysis and FFT with ED are proven to be efficient in detecting some types of bearing faults. Since wavelet analysis can detect both periodic and non-periodic signals, it allows the machine operator to more easily detect the remaining types of bearing faults which are impossible to detect by the method of FFT with ED. Hence, wavelet analysis is a better fault diagnostic tool for maintenance practice.
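
The envelope detection (ED) step described above is easy to demonstrate on synthetic data: an impact train at a made-up bearing characteristic frequency, exciting a resonance and masked by broadband noise, becomes visible as a peak in the spectrum of the Hilbert-transform envelope. The sample rate, resonance, and BCF below are invented for illustration; SciPy and NumPy are assumed.

```python
# Envelope detection demo: the BCF of a synthetic impact train buried in noise
# shows up clearly in the spectrum of the Hilbert-transform envelope.
import numpy as np
from scipy.signal import hilbert

fs, bcf, f_res = 20000, 87.0, 3000.0          # sample rate, fault rate, resonance (synthetic)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)

impacts = np.zeros_like(t)
impacts[(np.arange(0, 1.0, 1 / bcf) * fs).astype(int)] = 1.0
ringing = np.exp(-800 * t[:200]) * np.sin(2 * np.pi * f_res * t[:200])
signal = np.convolve(impacts, ringing)[: t.size] + 0.5 * rng.standard_normal(t.size)

envelope = np.abs(hilbert(signal))
envelope -= envelope.mean()
spectrum = np.abs(np.fft.rfft(envelope))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

band = (freqs > 10) & (freqs < 500)
bcf_bin = int(np.argmin(np.abs(freqs - bcf)))
print("envelope spectrum at BCF relative to the median in-band level:",
      round(float(spectrum[bcf_bin] / np.median(spectrum[band])), 1), "x")
```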

Book
27 Sep 2001
TL;DR: A comprehensive presentation of the conceptual basis of wavelet analysis, including the construction and analysis of wavelet bases, can be found in this book.
Abstract: This book provides a comprehensive presentation of the conceptual basis of wavelet analysis, including the construction and analysis of wavelet bases. It motivates the central ideas of wavelet theory by offering a detailed exposition of the Haar series, then shows how a more abstract approach allows readers to generalize and improve upon the Haar series. It then presents a number of variations and extensions of Haar construction.

Journal Article
01 Jan 2001-Sankhya
TL;DR: In this article, the authors proposed a wavelet shrinkage method that incorporates information on neighboring coefficients into the decision making, and investigated the asymptotic and numerical performances of two particular versions of the estimator.
Abstract: In standard wavelet methods, the empirical wavelet coefficients are thresholded term by term, on the basis of their individual magnitudes. Information on other coefficients has no influence on the treatment of particular coefficients. We propose a wavelet shrinkage method that incorporates information on neighboring coefficients into the decision making. The coefficients are considered in overlapping blocks; the treatment of coefficients in the middle of each block depends on the data in the whole block. The asymptotic and numerical performances of two particular versions of the estimator are investigated. We show that, asymptotically, one version of the estimator achieves the exact optimal rates of convergence over a range of Besov classes for global estimation, and attains adaptive minimax rate for estimating functions at a point. In numerical comparisons with various methods, both versions of the estimator perform excellently.
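
A sketch of shrinkage that uses neighbouring coefficients, in the spirit of the block rule described above, follows: within each detail level, coefficients are grouped into small blocks and each block is shrunk according to its summed squared energy. The block length, threshold constant, use of non-overlapping blocks, and PyWavelets dependency are illustrative simplifications, not the paper's exact specification.

```python
# Block shrinkage using neighbouring coefficients: shrink each small block of
# detail coefficients according to its summed squared energy.
import numpy as np
import pywt

def block_shrink(d, sigma, block_len=3, lam=4.505):
    out = np.zeros_like(d)
    for start in range(0, d.size, block_len):
        blk = d[start:start + block_len]
        s2 = np.sum(blk ** 2)
        factor = max(0.0, 1.0 - lam * blk.size * sigma ** 2 / s2) if s2 > 0 else 0.0
        out[start:start + block_len] = factor * blk
    return out

rng = np.random.default_rng(0)
n = 1024
t = np.linspace(0, 1, n)
f = np.piecewise(t, [t < 0.4, t >= 0.4], [lambda u: np.sin(6 * np.pi * u), 1.0])
y = f + 0.3 * rng.standard_normal(n)

coeffs = pywt.wavedec(y, "sym8", mode="periodization")
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise level from the finest scale
den = [coeffs[0]] + [block_shrink(d, sigma) for d in coeffs[1:]]
f_hat = pywt.waverec(den, "sym8", mode="periodization")

print("RMSE, noisy   :", round(float(np.sqrt(np.mean((y - f) ** 2))), 4))
print("RMSE, denoised:", round(float(np.sqrt(np.mean((f_hat - f) ** 2))), 4))
```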

Proceedings ArticleDOI
07 Oct 2001
TL;DR: In this paper, experiments with acceleration sensors are described for human activity recognition of a wearable device user, and the use of principal component analysis and independent component analysis with a wavelet transform is tested for feature generation.
Abstract: In this paper, experiments with acceleration sensors are described for human activity recognition of a wearable device user. The use of principal component analysis and independent component analysis with a wavelet transform is tested for feature generation. Recognition of human activity is examined with a multilayer perceptron classifier. The best classification results for recognition of different human motions were 83-90%, and they were achieved by utilizing independent component analysis and principal component analysis. The difference between these methods turned out to be negligible.

Journal ArticleDOI
TL;DR: This paper gives an alternative derivation and explanation for the result by Kingsbury (1999) that the dual-tree DWT is (nearly) shift-invariant when the scaling filters of the two trees are offset by a half sample.
Abstract: This paper considers the design of pairs of wavelet bases where the wavelets form a Hilbert transform pair. The derivation is based on the limit functions defined by the infinite product formula. It is found that the scaling filters should be offset from one another by a half sample. This gives an alternative derivation and explanation for the result by Kingsbury (1999), that the dual-tree DWT is (nearly) shift-invariant when the scaling filters satisfy the same offset.
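
The half-sample offset condition mentioned above can be stated in the frequency domain; the notation below (H_0, H_1 for the two scaling filters, psi_0, psi_1 for the corresponding wavelets, and a calligraphic H for the Hilbert transform) is ours, and the display summarises the standard statement of the result rather than quoting the paper.

```latex
% Half-sample delay condition and its consequence; notation ours.
H_{1}\!\left(e^{j\omega}\right) \;=\; e^{-j\omega/2}\,H_{0}\!\left(e^{j\omega}\right),
\quad |\omega| < \pi
\qquad\Longrightarrow\qquad
\psi_{1}(t) \;=\; \mathcal{H}\{\psi_{0}\}(t).
```

The complex wavelet psi_0(t) + j psi_1(t) is then (approximately) analytic, which is what makes the resulting dual-tree transform nearly shift-invariant.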

Journal ArticleDOI
TL;DR: This paper discusses in detail wavelet methods in nonparametric regression, where the data are modelled as observations of a signal contaminated with additive Gaussian noise, and provides an extensive review of the vast literature of wavelet shrinkage and wavelet thresholding estimators developed to denoise such data.
Abstract: Wavelet analysis has been found to be a powerful tool for the nonparametric estimation of spatially-variable objects. We discuss in detail wavelet methods in nonparametric regression, where the data are modelled as observations of a signal contaminated with additive Gaussian noise, and provide an extensive review of the vast literature of wavelet shrinkage and wavelet thresholding estimators developed to denoise such data. These estimators arise from a wide range of classical and empirical Bayes methods treating either individual or blocks of wavelet coefficients. We compare various estimators in an extensive simulation study on a variety of sample sizes, test functions, signal-to-noise ratios and wavelet filters. Because there is no single criterion that can adequately summarise the behaviour of an estimator, we use various criteria to measure performance in finite sample situations. Insight into the performance of these estimators is obtained from graphical outputs and numerical tables. In order to provide some hints of how these estimators should be used to analyse real data sets, a detailed practical step-by-step illustration of a wavelet denoising analysis on electrical consumption is provided. Matlab codes are provided so that all figures and tables in this paper can be reproduced.

Journal ArticleDOI
TL;DR: The proposed method for the digital watermarking is based on the wavelet transform and is robust to a variety of signal distortions, such as JPEG, image cropping, sharpening, median filtering, and incorporating attacks.
Abstract: In this paper, an image accreditation technique by embedding digital watermarks in images is proposed. The proposed method for the digital watermarking is based on the wavelet transform. This is unlike most previous work, which used a random number of a sequence of bits as a watermark and where the watermark can only be detected by comparing an experimental threshold value to determine whether a sequence of random signals is the watermark. The proposed approach embeds a watermark with visual recognizable patterns, such as binary, gray, or color image in images by modifying the frequency part of the images. In the proposed approach, an original image is decomposed into wavelet coefficients. Then, multi-energy watermarking scheme based on the qualified significant wavelet tree (QSWT) is used to achieve the robustness of the watermarking. Unlike other watermarking techniques that use a single casting energy, QSWT adopts adaptive casting energy in different resolutions. The performance of the proposed watermarking is robust to a variety of signal distortions, such as JPEG, image cropping, sharpening, median filtering, and incorporating attacks.

Journal ArticleDOI
TL;DR: In this article, the limits of the considered methodologies are illustrated by applying them to bearings affected by different pitting failures on the outer race, the inner race, or a rolling element, and subjected to a very low radial load.

Book
16 Nov 2001
TL;DR: Wavelet Transforms and Their Applications is an introduction to wavelet transforms, accessible to a large audience with diverse backgrounds and interests in mathematics, science, and engineering.
Abstract: This textbook is an introduction to wavelet transforms, accessible to a large audience with diverse backgrounds and interests in mathematics, science, and engineering. Emphasis is placed on the logical development of fundamental ideas and the systematic treatment of wavelet analysis and its applications to a wide variety of problems encountered in various interdisciplinary areas. Topics and Features:
* This second edition heavily reworks the chapters on Extensions of Multiresolution Analysis and Newland's Harmonic Wavelets and introduces a new chapter containing new applications of wavelet transforms
* Uses knowledge of Fourier transforms, some elementary ideas of Hilbert spaces, and orthonormal systems to develop the theory and applications of wavelet analysis
* Offers detailed and clear explanations of every concept and method, accompanied by carefully selected worked examples, with special emphasis given to those topics in which students typically experience difficulty
* Includes carefully chosen end-of-chapter exercises directly associated with applications or formulated in terms of the mathematical, physical, and engineering context, and provides answers to selected exercises for additional help
Mathematicians, physicists, computer engineers, and electrical and mechanical engineers will find Wavelet Transforms and Their Applications an exceptionally complete and accessible text and reference. It is also suitable as a self-study or reference guide for practitioners and professionals.