Author

Wendong Qu

Bio: Wendong Qu is an academic researcher from the California Institute of Technology. The author has contributed to research on the Hilbert–Huang transform and the Fourier transform, has an h-index of 2, and has co-authored 2 publications receiving 1,536 citations.

Papers
Journal ArticleDOI
TL;DR: A confidence limit for the method here termed EMD/HSA (empirical mode decomposition/Hilbert spectral analysis) is introduced by using various adjustable stopping criteria in the sifting processes of the EMD step to generate a sample set of intrinsic mode functions (IMFs).
Abstract: The confidence limit is a standard measure of the accuracy of the result in any statistical analysis. Most confidence limits are derived as follows. The data are first divided into subsections and then, under the ergodic assumption, the temporal mean is substituted for the ensemble mean. Next, the confidence limit is defined as a range of standard deviations from this mean. However, such a confidence limit is valid only for linear and stationary processes; furthermore, for the ergodic assumption to be valid, the subsections have to be statistically independent. For non-stationary and nonlinear processes, such an analysis is no longer valid. The confidence limit of the method here termed EMD/HSA (for empirical mode decomposition/Hilbert spectral analysis) is introduced by using various adjustable stopping criteria in the sifting processes of the EMD step to generate a sample set of intrinsic mode functions (IMFs). The EMD technique acts as a pre-processor for HSA on the original data, producing a set of components (IMFs) that equal the original data when added back together. Each IMF represents a scale in the data, from smallest to largest. The ensemble mean and standard deviation of the IMF sample sets obtained with different stopping criteria are calculated, and these form a simple random sample set. The confidence limit for EMD/HSA is then defined as a range of standard deviations from the ensemble mean. Without invoking the ergodic assumption, subdivision of the data stream into short sections is unnecessary; hence, the results and the confidence limit retain the full frequency resolution of the full dataset. This new confidence limit can thus be applied to the analysis of nonlinear and non-stationary processes. Data from length-of-day measurements and a particularly violent recent earthquake are used to demonstrate how the confidence limit is obtained and applied. By providing a confidence limit for this new approach, a stable range of stopping criteria for the decomposition or sifting phase (EMD) has been established, making the results of the final processing with HSA, and the entire EMD/HSA method, more definitive.
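To make the construction concrete, here is a minimal sketch of the ensemble step: decompose the same data once per stopping criterion, then take the ensemble mean of the IMFs and a band of standard deviations around it as the confidence limit. It assumes the third-party PyEMD package rather than the authors' own code, and the stopping-criterion values are illustrative, not the paper's settings.

```python
# Sketch: confidence limit for EMD via an ensemble of decompositions,
# one per sifting stopping criterion. Assumes PyEMD (pip install EMD-signal).
import numpy as np
from PyEMD import EMD

def emd_confidence_limit(signal, criteria=(3, 4, 5, 6, 7, 8, 9, 10)):
    """Return (ensemble mean, lower band, upper band) of the IMFs
    obtained under each stopping criterion in `criteria`."""
    runs, n_common = [], None
    for s in criteria:
        emd = EMD()
        emd.FIXE_H = s            # S-number-like criterion: s extra sifting passes
        imfs = emd(signal)        # shape (n_imfs, len(signal))
        runs.append(imfs)
        n_common = imfs.shape[0] if n_common is None else min(n_common, imfs.shape[0])
    # Criteria may yield different IMF counts; compare the modes all runs share.
    stack = np.stack([r[:n_common] for r in runs])   # (runs, imfs, time)
    mean, std = stack.mean(axis=0), stack.std(axis=0)
    return mean, mean - std, mean + std
```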

1,178 citations

Journal ArticleDOI
TL;DR: The Hilbert–Huang Transform (HHT), originally developed for the natural and engineering sciences, has now been applied to financial data; its first step, the EMD, can decompose any complicated data set into a finite and often small number of intrinsic mode functions (IMFs).
Abstract: A new method, the Hilbert–Huang Transform (HHT), developed initially for the natural and engineering sciences, has now been applied to financial data. The HHT method is specially developed for analysing non-linear and non-stationary data. The method consists of two parts: (1) the empirical mode decomposition (EMD), and (2) the Hilbert spectral analysis. The key part of the method is the first step, the EMD, with which any complicated data set can be decomposed into a finite and often small number of intrinsic mode functions (IMFs). An IMF is defined here as any function having the same number of zero-crossings and extrema, and also having symmetric envelopes defined by the local maxima and minima, respectively. An IMF thus admits a well-behaved Hilbert transform. This decomposition method is adaptive and, therefore, highly efficient. Since the decomposition is based on the local characteristic time scale of the data, it is applicable to non-linear and non-stationary processes. With the Hilbert transform, the IMFs yield instantaneous frequencies as functions of time that give sharp identifications of embedded structures. The final presentation of the results is an energy–frequency–time distribution, which we designate as the Hilbert Spectrum. Comparisons with Wavelet and Fourier analyses show that the new method offers much better temporal and frequency resolution. The EMD is also useful as a filter to extract variability on different scales. In the present application, HHT has been used to examine the changeability of the market as a measure of its volatility. Published in 2003 by John Wiley & Sons, Ltd.
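As a rough illustration of the two-part method, the sketch below decomposes a series into IMFs and then applies the Hilbert transform to each one to obtain instantaneous amplitude and frequency. PyEMD and SciPy stand in for the authors' implementation; `fs`, the sampling rate, and the finite-difference frequency estimate are choices of this illustration.

```python
# Sketch of the HHT pipeline: (1) EMD into IMFs, (2) Hilbert transform of
# each IMF for instantaneous amplitude and frequency.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

def hilbert_spectrum(signal, fs):
    imfs = EMD()(signal)                      # step 1: empirical mode decomposition
    analytic = hilbert(imfs, axis=-1)         # step 2: analytic signal per IMF
    amplitude = np.abs(analytic)
    phase = np.unwrap(np.angle(analytic), axis=-1)
    inst_freq = np.diff(phase, axis=-1) * fs / (2 * np.pi)  # Hz, one fewer sample
    return imfs, amplitude, inst_freq
```

Binning `amplitude` (or its square, for energy) against time and `inst_freq` then gives the energy-frequency-time distribution designated the Hilbert Spectrum.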

489 citations


Cited by
Journal ArticleDOI
TL;DR: The effect of the added white noise is to provide a uniform reference frame in the time–frequency space; therefore, the added noise collates the portion of the signal of comparable scale in one IMF.
Abstract: A new Ensemble Empirical Mode Decomposition (EEMD) is presented. This new approach consists of sifting an ensemble of white-noise-added signals (data) and treating the mean as the final true result. Finite, not infinitesimal, amplitude white noise is necessary to force the ensemble to exhaust all possible solutions in the sifting process, thus making signals of different scales collate in the proper intrinsic mode functions (IMFs) dictated by the dyadic filter banks. As EEMD is a time–space analysis method, the added white noise is averaged out with a sufficient number of trials; the only persistent part that survives the averaging process is the component of the signal (the original data), which is then treated as the true and more physically meaningful answer. The effect of the added white noise is to provide a uniform reference frame in the time–frequency space; therefore, the added noise collates the portion of the signal of comparable scale in one IMF. With this ensemble mean, one can separate scales naturally.
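The recipe translates almost line for line into code. A minimal sketch, assuming PyEMD for the sifting; `trials` and `noise_std` are illustrative values, not recommendations from the paper.

```python
# Sketch of EEMD: add finite-amplitude white noise, sift each noisy copy,
# and average the aligned IMFs so that the noise cancels out.
import numpy as np
from PyEMD import EMD

def eemd(signal, trials=100, noise_std=0.2, seed=0):
    rng = np.random.default_rng(seed)
    scale = noise_std * signal.std()          # finite, not infinitesimal, amplitude
    all_imfs, n_common = [], None
    for _ in range(trials):
        noisy = signal + rng.normal(0.0, scale, size=signal.shape)
        imfs = EMD()(noisy)
        all_imfs.append(imfs)
        n_common = imfs.shape[0] if n_common is None else min(n_common, imfs.shape[0])
    # The ensemble mean is the answer: the only persistent part is the signal.
    return np.mean([imfs[:n_common] for imfs in all_imfs], axis=0)
```

PyEMD also ships a ready-made EEMD class; the explicit loop above just spells out the add-noise, sift, and average steps.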

6,437 citations

Journal ArticleDOI
TL;DR: Hilbert-Huang transform, consisting of empirical mode decomposition and Hilbert spectral analysis, is a newly developed adaptive data analysis method, which has been used extensively in geophysical research.
Abstract: Data analysis has been one of the core activities in scientific research, but limited by the availability of analysis methods in the past, data analysis was often relegated to data processing. To accommodate the variety of data generated by nonlinear and nonstationary processes in nature, the analysis method would have to be adaptive. Hilbert-Huang transform, consisting of empirical mode decomposition and Hilbert spectral analysis, is a newly developed adaptive data analysis method, which has been used extensively in geophysical research. In this review, we will briefly introduce the method, list some recent developments, demonstrate the usefulness of the method, summarize some applications in various geophysical research areas, and finally, discuss the outstanding open problems. We hope this review will serve as an introduction to the method for those new to the concepts, as well as a summary of the present frontiers of its applications for experienced research scientists.

1,533 citations

BookDOI
01 Sep 2005
TL;DR: The principle and limitations of the Hilbert-Huang transform are introduced, several improved strategies are put forward, and some simulations are carried out.
Abstract: The Hilbert-Huang Transform (HHT) represents a desperate attempt to break the suffocating hold on the field of data analysis by the twin assumptions of linearity and stationarity. Unlike spectrograms, wavelet analysis, or the Wigner-Ville Distribution, HHT is truly a time-frequency analysis, but it does not require an a priori functional basis and, therefore, the convolution computation of frequency. The method provides a magnifying glass to examine the data, and also offers a different view of data from nonlinear processes, with the results no longer shackled by spurious harmonics: the artifacts of imposing linearity on a nonlinear system or of the limits set by the uncertainty principle, both consequences of using Fourier transform pairs in data analysis. This is the first HHT book, containing papers covering a wide variety of interests. The chapters are divided into mathematical aspects and applications, with the applications further grouped into geophysics, structural safety and visualization.

847 citations

Journal ArticleDOI
TL;DR: The proposed algorithm uses real-valued projections along multiple directions on hyperspheres to calculate the envelopes and the local mean of multivariate signals, leading to a multivariate extension of EMD.
Abstract: Despite empirical mode decomposition (EMD) becoming a de facto standard for time-frequency analysis of nonlinear and non-stationary signals, its multivariate extensions are only emerging; yet, they are a prerequisite for direct multichannel data analysis. An important step in this direction is the computation of the local mean, as the concept of local extrema is not well defined for multivariate signals. To this end, we propose to use real-valued projections along multiple directions on hyperspheres (n-spheres) in order to calculate the envelopes and the local mean of multivariate signals, leading to a multivariate extension of EMD. To generate a suitable set of direction vectors, unit hyperspheres (n-spheres) are sampled based on both uniform angular sampling methods and quasi-Monte Carlo-based low-discrepancy sequences. The potential of the proposed algorithm to find common oscillatory modes within multivariate data is demonstrated by simulations performed on both hexavariate synthetic and real-world human motion signals.
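A minimal sketch of the local-mean step described above, under stated simplifications: uniform random directions stand in for the paper's quasi-Monte Carlo low-discrepancy sequences, and the surrounding multivariate sifting loop is omitted.

```python
# Sketch: local mean of a multivariate signal via real-valued projections.
# For each direction vector on the unit hypersphere, spline a multivariate
# envelope through the signal values at the projection's maxima, then
# average the envelopes over all directions.
import numpy as np
from scipy.interpolate import CubicSpline

def local_mean(x, n_dirs=64, seed=0):
    """x: array of shape (time, channels). Returns an estimated local mean."""
    rng = np.random.default_rng(seed)
    t = np.arange(x.shape[0])
    dirs = rng.normal(size=(n_dirs, x.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # points on the n-sphere
    envelopes = []
    for d in dirs:
        p = x @ d                                          # scalar projection
        maxima = np.where((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:]))[0] + 1
        if len(maxima) < 4:                                # too few knots to spline
            continue
        envelopes.append(CubicSpline(t[maxima], x[maxima])(t))
    return np.mean(envelopes, axis=0)
```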

800 citations

Journal ArticleDOI
TL;DR: A simple and logical definition of trend is given for any nonlinear and nonstationary time series as an intrinsically determined monotonic function within a certain temporal span, or a function in which there can be at most one extremum within that temporal span.
Abstract: Determining trend and implementing detrending operations are important steps in data analysis. Yet there is no precise definition of “trend” nor any logical algorithm for extracting it. As a result, various ad hoc extrinsic methods have been used to determine trend and to facilitate a detrending operation. In this article, a simple and logical definition of trend is given for any nonlinear and nonstationary time series as an intrinsically determined monotonic function within a certain temporal span (most often that of the data span), or a function in which there can be at most one extremum within that temporal span. Being intrinsic, the method to derive the trend has to be adaptive. This definition of trend also presumes the existence of a natural time scale. All these requirements suggest the Empirical Mode Decomposition (EMD) method as the logical choice of algorithm for extracting various trends from a data set. Once the trend is determined, the corresponding detrending operation can be implemented. With this definition of trend, the variability of the data on various time scales also can be derived naturally. Climate data are used to illustrate the determination of the intrinsic trend and natural variability.
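Under this definition, the trend is what sifting leaves behind: the EMD residual, which is monotonic or has at most one extremum over the data span. A minimal sketch, assuming PyEMD; taking the residual alone as the trend is the narrowest reading of the definition, since coarser trends could also fold in the last few IMFs.

```python
# Sketch: intrinsic trend as the EMD residual, with detrending by subtraction.
from PyEMD import EMD

def emd_trend(signal):
    emd = EMD()
    emd.emd(signal)                               # run the sifting
    imfs, residue = emd.get_imfs_and_residue()    # residue: at most one extremum
    trend = residue
    return trend, signal - trend                  # (trend, detrended data)
```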

787 citations