SciSpace (formerly Typeset)
Author

Samuel S. P. Shen

Bio: Samuel S. P. Shen is an academic researcher at San Diego State University. He has contributed to research on sea surface temperature and empirical orthogonal functions, has an h-index of 32, and has co-authored 116 publications receiving 5,194 citations. His previous affiliations include Goddard Space Flight Center and Texas A&M University.


Papers
Journal ArticleDOI
TL;DR: The confidence limit of the method termed EMD/HSA (empirical mode decomposition/Hilbert spectral analysis) is introduced by using various adjustable stopping criteria in the sifting processes of the EMD step to generate a sample set of intrinsic mode functions (IMFs).
Abstract: The confidence limit is a standard measure of the accuracy of the result in any statistical analysis. Most confidence limits are derived as follows. The data are first divided into subsections and then, under the ergodic assumption, the temporal mean is substituted for the ensemble mean. Next, the confidence limit is defined as a range of standard deviations from this mean. However, such a confidence limit is valid only for linear and stationary processes. Furthermore, in order for the ergodic assumption to be valid, the subsections have to be statistically independent. For non‐stationary and nonlinear processes, such an analysis is no longer valid. The confidence limit of the method here termed EMD/HSA (for empirical mode decomposition/Hilbert spectral analysis) is introduced by using various adjustable stopping criteria in the sifting processes of the EMD step to generate a sample set of intrinsic mode functions (IMFs). The EMD technique acts as a pre‐processor for HSA on the original data, producing a set of components (IMFs) from the original data that equal the original data when added back together. Each IMF represents a scale in the data, from smallest to largest. The ensemble mean and standard deviation of the IMF sample sets obtained with different stopping criteria are calculated, and these form a simple random sample set. The confidence limit for EMD/HSA is then defined as a range of standard deviations from the ensemble mean. Without invoking the ergodic assumption, subdivision of the data stream into short sections is unnecessary; hence, the results and the confidence limit retain the full‐frequency resolution of the full dataset. This new confidence limit can be applied to the analysis of nonlinear and non‐stationary processes by these new techniques. Data from length‐of‐day measurements and a particularly violent recent earthquake are used to demonstrate how the confidence limit is obtained and applied. 
By providing a confidence limit for this new approach, a stable range of stopping criteria for the decomposition or sifting phase (EMD) has been established, making the results of the final processing with HSA, and the entire EMD/HSA method, more definitive.
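The ensemble construction described above can be sketched numerically. In this toy example, a crude smoothing-based sifting pass (a hypothetical stand-in, not the authors' EMD code) is run with several stopping tolerances; the pointwise ensemble mean and standard deviation of the resulting sample set define the confidence band:

```python
import numpy as np

def toy_sift(x, tol, max_iter=50):
    """Stand-in for one EMD sifting pass: repeatedly subtract a
    smoothed version of the signal until the residual changes by
    less than `tol`. (Real EMD subtracts the mean of upper/lower
    envelopes of local extrema; this placeholder only illustrates
    how a stopping criterion parameterizes the result.)"""
    h = x.copy()
    kernel = np.ones(5) / 5.0
    for _ in range(max_iter):
        m = np.convolve(h, kernel, mode="same")  # crude local mean
        h_new = h - m
        if np.max(np.abs(h_new - h)) < tol:
            break
        h = h_new
    return h  # plays the role of the first IMF

t = np.linspace(0, 1, 512)
x = np.sin(2 * np.pi * 40 * t) + np.sin(2 * np.pi * 3 * t)

# Sample set: one decomposition per stopping tolerance.
sample = np.array([toy_sift(x, tol) for tol in (1e-1, 1e-2, 1e-3, 1e-4)])

ens_mean = sample.mean(axis=0)   # ensemble mean of the IMF sample set
ens_std = sample.std(axis=0)     # spread across stopping criteria
lower, upper = ens_mean - ens_std, ens_mean + ens_std  # confidence band
```

Loosening or tightening the tolerance set widens or narrows the band, which is the sense in which the band measures sensitivity to the stopping criterion.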

1,178 citations

BookDOI
01 Sep 2005
TL;DR: The principle and limitations of the Hilbert-Huang transform are introduced, several improved strategies are put forward, and simulations are carried out.
Abstract: The Hilbert-Huang Transform (HHT) represents a desperate attempt to break the suffocating hold on the field of data analysis by the twin assumptions of linearity and stationarity. Unlike spectrograms, wavelet analysis, or the Wigner-Ville Distribution, HHT is truly a time-frequency analysis, but it does not require an a priori functional basis and, therefore, the convolution computation of frequency. The method provides a magnifying glass to examine the data, and also offers a different view of data from nonlinear processes, with the results no longer shackled by spurious harmonics — the artifacts of imposing a linearity property on a nonlinear system or of limiting by the uncertainty principle, and a consequence of Fourier transform pairs in data analysis. This is the first HHT book containing papers covering a wide variety of interests. The chapters are divided into mathematical aspects and applications, with the applications further grouped into geophysics, structural safety and visualization.

847 citations

Journal ArticleDOI
TL;DR: The Third Pole (TP) is experiencing rapid warming and is currently in its warmest period of the past 2,000 years; this paper reviews the latest developments in multidisciplinary TP research.
Abstract: The Third Pole (TP) is experiencing rapid warming and is currently in its warmest period in the past 2,000 years. This paper reviews the latest development in multidisciplinary TP research ...

530 citations

Journal ArticleDOI
TL;DR: The Hilbert-Huang Transform (HHT), originally developed for the natural and engineering sciences, is applied here to financial data. The first step is the EMD, with which any complicated data set can be decomposed into a finite and often small number of intrinsic mode functions (IMFs).
Abstract: A new method, the Hilbert–Huang Transform (HHT), developed initially for natural and engineering sciences has now been applied to financial data. The HHT method is specially developed for analysing non-linear and non-stationary data. The method consists of two parts: (1) the empirical mode decomposition (EMD), and (2) the Hilbert spectral analysis. The key part of the method is the first step, the EMD, with which any complicated data set can be decomposed into a finite and often small number of intrinsic mode functions (IMFs). An IMF is defined here as any function having the same number of zero-crossings and extrema, and also having symmetric envelopes defined by the local maxima and minima, respectively. The IMF also thus admits well-behaved Hilbert transforms. This decomposition method is adaptive, and, therefore, highly efficient. Since the decomposition is based on the local characteristic time scale of the data, it is applicable to non-linear and non-stationary processes. With the Hilbert transform, the IMFs yield instantaneous frequencies as functions of time that give sharp identifications of embedded structures. The final presentation of the results is an energy–frequency–time distribution, which we designate as the Hilbert Spectrum. Comparisons with Wavelet and Fourier analyses show the new method offers much better temporal and frequency resolutions. The EMD is also useful as a filter to extract variability of different scales. In the present application, HHT has been used to examine the changeability of the market, as a measure of volatility of the market. Published in 2003 by John Wiley & Sons, Ltd.
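The Hilbert step described above is straightforward to sketch: the analytic signal has a phase whose time derivative is the instantaneous frequency. A minimal numpy-only illustration (an FFT-based analytic signal, not any code from the paper):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT: keep DC, double positive frequencies,
    zero out negative frequencies (discrete Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000.0                      # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
x = np.cos(2 * np.pi * 5.0 * t)  # a "mono-component" 5 Hz signal

z = analytic_signal(x)
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # Hz, one value per interval
```

For a pure sinusoid the instantaneous frequency is constant at 5 Hz; for an IMF extracted from real data it varies in time, which is exactly what the Hilbert Spectrum displays.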

489 citations

Journal ArticleDOI
TL;DR: In this article, the authors present the first analysis of global and hemispheric surface warming trends that attempts to quantify the major sources of uncertainty, such as urbanization, changing land-based observing practices and SST bias corrections.
Abstract: We present the first analysis of global and hemispheric surface warming trends that attempts to quantify the major sources of uncertainty. We calculate global and hemispheric annual temperature anomalies by combining land surface air temperature and sea surface temperature (SST) through an optimal averaging technique. The technique allows estimation of uncertainties in the annual anomalies resulting from data gaps and random errors. We add independent uncertainties due to urbanisation, changing land-based observing practices and SST bias corrections. We test the accuracy of the SST bias corrections, which represent the largest source of uncertainty in the data, through a suite of climate model simulations. These indicate that the corrections are likely to be fairly accurate on an annual average and on large space scales. Allowing for serial correlation and annual uncertainties, the best linear fit to annual global surface temperature gives an increase of 0.61 ± 0.16°C between 1861 and 2000.
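The final step, a linear trend whose uncertainty allows for serial correlation, can be sketched as follows. This is a generic illustration of the standard effective-sample-size adjustment (n_eff = n(1 - r1)/(1 + r1) from the lag-1 autocorrelation of the residuals), not the paper's optimal-averaging code, and the anomaly series here is synthetic:

```python
import numpy as np

def trend_with_serial_corr(years, y):
    """OLS trend plus a standard error inflated for lag-1
    autocorrelation via the effective sample size
    n_eff = n * (1 - r1) / (1 + r1)."""
    n = len(y)
    slope, intercept = np.polyfit(years, y, 1)
    resid = y - (slope * years + intercept)
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
    n_eff = n * (1 - r1) / (1 + r1)
    # OLS standard error of the slope, with n replaced by n_eff
    sx2 = np.sum((years - years.mean()) ** 2)
    s2 = np.sum(resid ** 2) / (n_eff - 2)
    return slope, np.sqrt(s2 / sx2)

rng = np.random.default_rng(1)
years = np.arange(1861, 2001, dtype=float)
# Synthetic annual anomalies with a built-in trend of 0.0044 degC/yr.
anoms = 0.0044 * (years - 1861) - 0.3 + rng.normal(0, 0.1, len(years))

slope, se = trend_with_serial_corr(years, anoms)
total_change = slope * (2000 - 1861)   # warming over the record, degC
```

Positive serial correlation shrinks n_eff below n and widens the uncertainty, which is why the paper's ±0.16°C is larger than a naive OLS error bar would suggest.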

354 citations


Cited by
Journal ArticleDOI
TL;DR: HadISST1 replaces the global sea ice and sea surface temperature (GISST) data sets and is a unique combination of monthly globally complete fields of SST and sea ice concentration on a 1° latitude-longitude grid from 1871.
Abstract: We present the Met Office Hadley Centre's sea ice and sea surface temperature (SST) data set, HadISST1, and the nighttime marine air temperature (NMAT) data set, HadMAT1. HadISST1 replaces the global sea ice and sea surface temperature (GISST) data sets and is a unique combination of monthly globally complete fields of SST and sea ice concentration on a 1° latitude-longitude grid from 1871. The companion HadMAT1 runs monthly from 1856 on a 5° latitude-longitude grid and incorporates new corrections for the effect on NMAT of increasing deck (and hence measurement) heights. HadISST1 and HadMAT1 temperatures are reconstructed using a two-stage reduced-space optimal interpolation procedure, followed by superposition of quality-improved gridded observations onto the reconstructions to restore local detail. The sea ice fields are made more homogeneous by compensating satellite microwave-based sea ice concentrations for the impact of surface melt effects on retrievals in the Arctic and for algorithm deficiencies in the Antarctic and by making the historical in situ concentrations consistent with the satellite data. SSTs near sea ice are estimated using statistical relationships between SST and sea ice concentration. HadISST1 compares well with other published analyses, capturing trends in global, hemispheric, and regional SST well, containing SST fields with more uniform variance through time and better month-to-month persistence than those in GISST. HadMAT1 is more consistent with SST and with collocated land surface air temperatures than previous NMAT data sets.
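The reduced-space idea behind the two-stage reconstruction can be sketched independently of HadISST itself: fit the few observed points of a gappy field to a small number of EOFs and read the full field off the fit. A toy numpy version on a synthetic field (this is not the Hadley Centre algorithm, which also handles observation error and quality control):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "climate" built from 3 spatial patterns at 200 grid points.
n_grid, n_time, n_modes = 200, 60, 3
patterns = rng.normal(size=(n_grid, n_modes))
amplitudes = rng.normal(size=(n_modes, n_time)) * np.array([[3.0], [2.0], [1.0]])
field = patterns @ amplitudes                  # grid x time training data

# EOFs (spatial patterns) of the complete training field via SVD.
U, s, _ = np.linalg.svd(field, full_matrices=False)
eofs = U[:, :n_modes]                          # the "reduced space"

# A new field in the same space, observed at only 40 grid points.
truth = patterns @ rng.normal(size=(n_modes,))
obs_idx = rng.choice(n_grid, size=40, replace=False)

# Least-squares fit of the observed points to the EOFs, then project
# back to the full grid: the reduced-space reconstruction.
coef, *_ = np.linalg.lstsq(eofs[obs_idx], truth[obs_idx], rcond=None)
recon = eofs @ coef
```

The second stage in the paper then superposes quality-improved gridded observations onto such a reconstruction to restore local detail that the few retained EOFs cannot carry.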

8,958 citations

01 Jan 2007
TL;DR: Drafting Authors: Neil Adger, Pramod Aggarwal, Shardul Agrawala, Joseph Alcamo, Abdelkader Allali, Oleg Anisimov, Nigel Arnell, Michel Boko, Osvaldo Canziani, Timothy Carter, Gino Casassa, Ulisses Confalonieri, Rex Victor Cruz, Edmundo de Alba Alcaraz, William Easterling, Christopher Field, Andreas Fischlin, Blair Fitzharris.
Abstract: Drafting Authors: Neil Adger, Pramod Aggarwal, Shardul Agrawala, Joseph Alcamo, Abdelkader Allali, Oleg Anisimov, Nigel Arnell, Michel Boko, Osvaldo Canziani, Timothy Carter, Gino Casassa, Ulisses Confalonieri, Rex Victor Cruz, Edmundo de Alba Alcaraz, William Easterling, Christopher Field, Andreas Fischlin, Blair Fitzharris, Carlos Gay García, Clair Hanson, Hideo Harasawa, Kevin Hennessy, Saleemul Huq, Roger Jones, Lucka Kajfež Bogataj, David Karoly, Richard Klein, Zbigniew Kundzewicz, Murari Lal, Rodel Lasco, Geoff Love, Xianfu Lu, Graciela Magrín, Luis José Mata, Roger McLean, Bettina Menne, Guy Midgley, Nobuo Mimura, Monirul Qader Mirza, José Moreno, Linda Mortsch, Isabelle Niang-Diop, Robert Nicholls, Béla Nováky, Leonard Nurse, Anthony Nyong, Michael Oppenheimer, Jean Palutikof, Martin Parry, Anand Patwardhan, Patricia Romero Lankao, Cynthia Rosenzweig, Stephen Schneider, Serguei Semenov, Joel Smith, John Stone, Jean-Pascal van Ypersele, David Vaughan, Coleen Vogel, Thomas Wilbanks, Poh Poh Wong, Shaohong Wu, Gary Yohe

7,720 citations

Journal ArticleDOI
TL;DR: The effect of the added white noise is to provide a uniform reference frame in the time–frequency space; therefore, the added noise collates the portion of the signal of comparable scale in one IMF.
Abstract: A new Ensemble Empirical Mode Decomposition (EEMD) is presented. This new approach consists of sifting an ensemble of white noise-added signal (data) and treats the mean as the final true result. Finite, not infinitesimal, amplitude white noise is necessary to force the ensemble to exhaust all possible solutions in the sifting process, thus making the different scale signals collate in the proper intrinsic mode functions (IMF) dictated by the dyadic filter banks. As EEMD is a time–space analysis method, the added white noise is averaged out with a sufficient number of trials; the only persistent part that survives the averaging process is the component of the signal (original data), which is then treated as the true and more physically meaningful answer. The effect of the added white noise is to provide a uniform reference frame in the time–frequency space; therefore, the added noise collates the portion of the signal of comparable scale in one IMF. With this ensemble mean, one can separate scales naturally...
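The EEMD loop itself is short once a sifting routine exists. In this sketch, a heavily simplified single-IMF extraction with linear envelope interpolation stands in for full EMD (which uses cubic splines and several IMFs); only the ensemble-averaging structure is faithful to the method described above:

```python
import numpy as np

def first_imf(x, n_sift=10):
    """Extract a first IMF by sifting: repeatedly subtract the mean
    of the upper and lower envelopes. Envelopes here are linear
    interpolations of the local maxima/minima (a simplified stand-in
    for the cubic-spline envelopes of real EMD)."""
    h = x.copy()
    idx = np.arange(len(x))
    for _ in range(n_sift):
        maxima = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        minima = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxima) < 2 or len(minima) < 2:
            break
        upper = np.interp(idx, maxima, h[maxima])
        lower = np.interp(idx, minima, h[minima])
        h = h - (upper + lower) / 2.0
    return h

def eemd_first_imf(x, n_trials=50, noise_std=0.2, seed=0):
    """EEMD for the first IMF: sift many white-noise-added copies of
    the signal and average; the noise cancels in the ensemble mean."""
    rng = np.random.default_rng(seed)
    acc = np.zeros_like(x)
    for _ in range(n_trials):
        acc += first_imf(x + rng.normal(0.0, noise_std, len(x)))
    return acc / n_trials

t = np.linspace(0, 1, 400)
signal = np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 2 * t)
imf1 = eemd_first_imf(signal)
```

The ensemble mean retains the fast oscillation (the scale the first IMF should carry) while the contributions of the added noise realizations average toward zero.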

6,437 citations

Journal Article
TL;DR: The authors present the document drafted, voted on, and published by the IPCC (Intergovernmental Panel on Climate Change), which summarizes the research carried out on this important topic.
Abstract: Causes, consequences, and mitigation strategies. We offer the first of a series of articles in which we will address the current problem of climate change. We present the document drafted, voted on, and published by the IPCC (Intergovernmental Panel on Climate Change), which summarizes the research carried out on this important topic.

4,187 citations