scispace - formally typeset
Author

Gilbert P. Compo

Bio: Gilbert P. Compo is an academic researcher at the National Oceanic and Atmospheric Administration. He has contributed to research on data assimilation and climate change, has an h-index of 33, and has co-authored 79 publications receiving 18,347 citations. Previous affiliations of Gilbert P. Compo include the Cooperative Institute for Research in Environmental Sciences and the University of Colorado Boulder.


Papers
Journal ArticleDOI
TL;DR: In this article, a step-by-step guide to wavelet analysis is given, with examples taken from time series of the El Niño–Southern Oscillation (ENSO).
Abstract: A practical step-by-step guide to wavelet analysis is given, with examples taken from time series of the El Niño–Southern Oscillation (ENSO). The guide includes a comparison to the windowed Fourier transform, the choice of an appropriate wavelet basis function, edge effects due to finite-length time series, and the relationship between wavelet scale and Fourier frequency. New statistical significance tests for wavelet power spectra are developed by deriving theoretical wavelet spectra for white and red noise processes and using these to establish significance levels and confidence intervals. It is shown that smoothing in time or scale can be used to increase the confidence of the wavelet spectrum. Empirical formulas are given for the effect of smoothing on significance levels and confidence intervals. Extensions to wavelet analysis such as filtering, the power Hovmöller, cross-wavelet spectra, and coherence are described. The statistical significance tests are used to give a quantitative measure of change...
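The core procedure from this guide — convolving the series with scaled Morlet wavelets in the frequency domain and taking the squared modulus as wavelet power — can be sketched in a few lines of NumPy. The function name, the test signal, and the scale set below are illustrative choices, not values from the paper.

```python
import numpy as np

def morlet_cwt_power(x, dt, scales, w0=6.0):
    """Wavelet power of x via a frequency-domain Morlet transform.

    Follows the standard recipe: FFT the series once, multiply by the
    Fourier transform of the daughter wavelet at each scale, inverse FFT.
    """
    n = len(x)
    xf = np.fft.fft(x - x.mean())
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)   # angular frequencies
    power = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Fourier-domain Morlet daughter wavelet at scale s (analytic:
        # nonzero only for positive frequencies)
        psi_hat = (np.pi ** -0.25) * np.sqrt(2.0 * np.pi * s / dt) \
                  * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0)
        w = np.fft.ifft(xf * psi_hat)
        power[i] = np.abs(w) ** 2
    return power

# Toy series: a pure sine of period 16; wavelet power should peak at the
# scale closest to that period (Fourier period ~ 1.03 * scale for w0 = 6).
dt = 1.0
t = np.arange(512) * dt
x = np.sin(2 * np.pi * t / 16.0)
scales = 2.0 ** np.arange(2, 7)            # scales 4, 8, 16, 32, 64
pw = morlet_cwt_power(x, dt, scales)
best = scales[np.argmax(pw.mean(axis=1))]  # scale of maximum mean power
```

For a red-noise significance test as described in the abstract, one would compare `pw` at each scale against the theoretical lag-1 autoregressive spectrum scaled by a chi-squared quantile.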

12,803 citations

Journal ArticleDOI
TL;DR: The Twentieth Century Reanalysis (20CR) dataset as discussed by the authors provides the first estimates of global tropospheric variability, and of the dataset's time-varying quality, from 1871 to the present at 6-hourly temporal and 2° spatial resolutions.
Abstract: The Twentieth Century Reanalysis (20CR) project is an international effort to produce a comprehensive global atmospheric circulation dataset spanning the twentieth century, assimilating only surface pressure reports and using observed monthly sea-surface temperature and sea-ice distributions as boundary conditions. It is chiefly motivated by a need to provide an observational dataset with quantified uncertainties for validations of climate model simulations of the twentieth century on all time-scales, with emphasis on the statistics of daily weather. It uses an Ensemble Kalman Filter data assimilation method with background ‘first guess’ fields supplied by an ensemble of forecasts from a global numerical weather prediction model. This directly yields a global analysis every 6 hours as the most likely state of the atmosphere, and also an uncertainty estimate of that analysis. The 20CR dataset provides the first estimates of global tropospheric variability, and of the dataset's time-varying quality, from 1871 to the present at 6-hourly temporal and 2° spatial resolutions. Intercomparisons with independent radiosonde data indicate that the reanalyses are generally of high quality. The quality in the extratropical Northern Hemisphere throughout the century is similar to that of current three-day operational NWP forecasts. Intercomparisons over the second half-century of these surface-based reanalyses with other reanalyses that also make use of upper-air and satellite data are equally encouraging. It is anticipated that the 20CR dataset will be a valuable resource to the climate research community for both model validations and diagnostic studies. Some surprising results are already evident. For instance, the long-term trends of indices representing the North Atlantic Oscillation, the tropical Pacific Walker Circulation, and the Pacific–North American pattern are weak or non-existent over the full period of record. 
The long-term trends of zonally averaged precipitation minus evaporation also differ in character from those in climate model simulations of the twentieth century. Copyright © 2011 Royal Meteorological Society and Crown Copyright.
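The Ensemble Kalman Filter step at the heart of 20CR — updating an ensemble of first-guess states toward the observations, weighted by ensemble-estimated background covariance — can be illustrated with a toy analysis update. This sketch uses the stochastic ("perturbed observations") EnKF variant, which is one common formulation and not necessarily the exact scheme of the 20CR system; all dimensions, values, and names are toy choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(X, y, H, R):
    """Stochastic EnKF analysis step.

    X : (n, m) ensemble of n-dimensional first-guess states, m members
    y : (p,) observation vector
    H : (p, n) observation operator
    R : (p, p) observation-error covariance
    Returns the (n, m) analysis ensemble.
    """
    n, m = X.shape
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    Pf = A @ A.T / (m - 1)                           # sample background covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    # one perturbed copy of the observations per ensemble member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=m).T
    return X + K @ (Y - H @ X)

# Toy example: 2-variable state, only variable 0 is observed
m = 200
X = rng.normal(0.0, 1.0, size=(2, m)) + np.array([[5.0], [2.0]])
H = np.array([[1.0, 0.0]])   # observe variable 0 only
R = np.array([[0.1]])        # observation-error variance
y = np.array([6.0])
Xa = enkf_update(X, y, H, R)
```

The analysis ensemble mean of the observed variable moves most of the way from the background (about 5) toward the observation (6), since the observation error is much smaller than the background spread, and the analysis spread shrinks accordingly — the ensemble spread is exactly the "uncertainty estimate of that analysis" the abstract refers to.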

3,043 citations

Journal ArticleDOI
TL;DR: In this article, the authors modeled the variability of the Pacific decadal oscillation (PDO) on both interannual and decadal timescales as the sum of direct forcing by El Niño–Southern Oscillation (ENSO), the 'reemergence' of North Pacific sea surface temperature anomalies in subsequent winters, and white noise atmospheric forcing.
Abstract: Variability of the Pacific decadal oscillation (PDO), on both interannual and decadal timescales, is well modeled as the sum of direct forcing by El Niño–Southern Oscillation (ENSO), the 'reemergence' of North Pacific sea surface temperature anomalies in subsequent winters, and white noise atmospheric forcing. This simple model may be taken as a null hypothesis for the PDO, and may also be relevant for other climate integrators that have been previously related to the PDO.
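The null hypothesis described here amounts to a first-order autoregressive model driven by ENSO plus white noise, with the AR(1) term standing in for winter-to-winter reemergence. A minimal simulation of such a model is below; the coefficients are illustrative round numbers, not fitted values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (not fitted values from the paper):
alpha = 0.6   # strength of direct ENSO forcing
beta = 0.5    # winter-to-winter persistence from SST-anomaly reemergence
nyears = 2000

enso = rng.normal(size=nyears)    # stand-in white-noise winter ENSO index
noise = rng.normal(size=nyears)   # white-noise atmospheric forcing
pdo = np.zeros(nyears)
for t in range(1, nyears):
    # this winter's PDO = ENSO forcing + re-emerged anomaly + noise
    pdo[t] = alpha * enso[t] + beta * pdo[t - 1] + noise[t]

# The integrator inflates variance relative to its white-noise drivers and
# imparts year-to-year memory: lag-1 autocorrelation ~ beta here, since the
# ENSO stand-in has no persistence of its own.
r1 = np.corrcoef(pdo[1:], pdo[:-1])[0, 1]
```

This is the sense in which the PDO can look "decadal" without any decadal dynamics of its own: the AR(1) reddening alone concentrates variance at low frequencies.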

712 citations

Journal ArticleDOI
TL;DR: The 20CRv2c dataset as mentioned in this paper is the first ensemble of sub-daily global atmospheric conditions spanning over 100 years, which provides a best estimate of the weather at any given place and time as well as an estimate of its confidence and uncertainty.
Abstract: Historical reanalyses that span more than a century are needed for a wide range of studies, from understanding large‐scale climate trends to diagnosing the impacts of individual historical extreme weather events. The Twentieth Century Reanalysis (20CR) Project is an effort to fill this need. It is supported by the National Oceanic and Atmospheric Administration (NOAA), the Cooperative Institute for Research in Environmental Sciences (CIRES), and the U.S. Department of Energy (DOE), and is facilitated by collaboration with the international Atmospheric Circulation Reconstructions over the Earth initiative. 20CR is the first ensemble of sub‐daily global atmospheric conditions spanning over 100 years. This provides a best estimate of the weather at any given place and time as well as an estimate of its confidence and uncertainty. While extremely useful, version 2c of this dataset (20CRv2c) has several significant issues, including inaccurate estimates of confidence and a global sea level pressure bias in the mid‐19th century. These and other issues can reduce its effectiveness for studies at many spatial and temporal scales. Therefore, the 20CR system underwent a series of developments to generate a significant new version of the reanalysis. The version 3 system (NOAA‐CIRES‐DOE 20CRv3) uses upgraded data assimilation methods including an adaptive inflation algorithm; has a newer, higher‐resolution forecast model that specifies dry air mass; and assimilates a larger set of pressure observations. These changes have improved the ensemble‐based estimates of confidence, removed spin‐up effects in the precipitation fields, and diminished the sea‐level pressure bias. Other improvements include more accurate representations of storm intensity, smaller errors, and large‐scale reductions in model bias. The 20CRv3 system is comprehensively reviewed, focusing on the aspects that have ameliorated issues in 20CRv2c. 
Despite the many improvements, some challenges remain, including a systematic bias in tropical precipitation and time‐varying biases in southern high‐latitude pressure fields.

409 citations

Journal ArticleDOI
TL;DR: In this paper, an ensemble Kalman filter based data assimilation system was used to generate weather maps of the lower-tropospheric extratropical circulation back to 1890 over the Northern Hemisphere, and back to 1930 over the Southern Hemisphere.
Abstract: Climate variability and global change studies are increasingly focused on understanding and predicting regional changes of daily weather statistics. Assessing the evidence for such variations over the last 100 yr requires a daily tropospheric circulation dataset. The only dataset available for the early twentieth century consists of error-ridden hand-drawn analyses of the mean sea level pressure field over the Northern Hemisphere. Modern data assimilation systems have the potential to improve upon these maps, but prior to 1948, few digitized upper-air sounding observations are available for such a “reanalysis.” We investigate the possibility that the additional number of newly recovered surface pressure observations is sufficient to generate useful weather maps of the lower-tropospheric extratropical circulation back to 1890 over the Northern Hemisphere, and back to 1930 over the Southern Hemisphere. Surprisingly, we find that by using an advanced data assimilation system based on an ensemble Kalman filter...

405 citations


Cited by
Journal ArticleDOI
TL;DR: ERA-Interim as discussed by the authors is the latest global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF), conducted in part to prepare for a new reanalysis that will extend back to the early part of the twentieth century.
Abstract: ERA-Interim is the latest global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The ERA-Interim project was conducted in part to prepare for a new atmospheric reanalysis to replace ERA-40, which will extend back to the early part of the twentieth century. This article describes the forecast model, data assimilation method, and input datasets used to produce ERA-Interim, and discusses the performance of the system. Special emphasis is placed on various difficulties encountered in the production of ERA-40, including the representation of the hydrological cycle, the quality of the stratospheric circulation, and the consistency in time of the reanalysed fields. We provide evidence for substantial improvements in each of these aspects. We also identify areas where further work is needed and describe opportunities and objectives for future reanalysis projects at ECMWF. Copyright © 2011 Royal Meteorological Society

22,055 citations

Journal ArticleDOI
TL;DR: ERA-40 is a re-analysis of meteorological observations from September 1957 to August 2002 produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with many institutions as mentioned in this paper.
Abstract: ERA-40 is a re-analysis of meteorological observations from September 1957 to August 2002 produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with many institutions. The observing system changed considerably over this re-analysis period, with assimilable data provided by a succession of satellite-borne instruments from the 1970s onwards, supplemented by increasing numbers of observations from aircraft, ocean-buoys and other surface platforms, but with a declining number of radiosonde ascents since the late 1980s. The observations used in ERA-40 were accumulated from many sources. The first part of this paper describes the data acquisition and the principal changes in data type and coverage over the period. It also describes the data assimilation system used for ERA-40. This benefited from many of the changes introduced into operational forecasting since the mid-1990s, when the systems used for the 15-year ECMWF re-analysis (ERA-15) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) re-analysis were implemented. Several of the improvements are discussed. General aspects of the production of the analyses are also summarized. A number of results indicative of the overall performance of the data assimilation system, and implicitly of the observing system, are presented and discussed. The comparison of background (short-range) forecasts and analyses with observations, the consistency of the global mass budget, the magnitude of differences between analysis and background fields and the accuracy of medium-range forecasts run from the ERA-40 analyses are illustrated. Several results demonstrate the marked improvement that was made to the observing system for the southern hemisphere in the 1970s, particularly towards the end of the decade. 
In contrast, the synoptic quality of the analysis for the northern hemisphere is sufficient to provide forecasts that remain skilful well into the medium range for all years. Two particular problems are also examined: excessive precipitation over tropical oceans and a too strong Brewer-Dobson circulation, both of which are pronounced in later years. Several other aspects of the quality of the re-analyses revealed by monitoring and validation studies are summarized. Expectations that the ‘second-generation’ ERA-40 re-analysis would provide products that are better than those from the first-generation ERA-15 and NCEP/NCAR re-analyses are found to have been met in most cases. © Royal Meteorological Society, 2005. The contributions of N. A. Rayner and R. W. Saunders are Crown copyright.

7,110 citations

Journal ArticleDOI
14 Jun 2007-Nature
TL;DR: Functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project are reported, providing convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts.
Abstract: We report the generation and analysis of functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project. These data have been further integrated and augmented by a number of evolutionary and computational analyses. Together, our results advance the collective knowledge about human genome function in several major areas. First, our studies provide convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts, including non-protein-coding transcripts, and those that extensively overlap one another. Second, systematic examination of transcriptional regulation has yielded new understanding about transcription start sites, including their relationship to specific regulatory sequences and features of chromatin accessibility and histone modification. Third, a more sophisticated view of chromatin structure has emerged, including its inter-relationship with DNA replication and transcriptional regulation. Finally, integration of these new sources of information, in particular with respect to mammalian evolution based on inter- and intra-species sequence comparisons, has yielded new mechanistic and evolutionary insights concerning the functional landscape of the human genome. Together, these studies are defining a path for pursuit of a more comprehensive characterization of human genome function.

5,091 citations

Journal ArticleDOI
TL;DR: It is demonstrated how phase angle statistics can be used to gain confidence in causal relationships and to test mechanistic models of physical relationships between the time series, with Monte Carlo methods used to assess statistical significance against red noise backgrounds.
Abstract: Many scientists have made use of the wavelet method in analyzing time series, often using popular free software. However, at present there are no similar easy-to-use wavelet packages for analyzing two time series together. We discuss the cross wavelet transform and wavelet coherence for examining relationships in time frequency space between two time series. We demonstrate how phase angle statistics can be used to gain confidence in causal relationships and test mechanistic models of physical relationships between the time series. As an example of typical data where such analyses have proven useful, we apply the methods to the Arctic Oscillation index and the Baltic maximum sea ice extent record. Monte Carlo methods are used to assess the statistical significance against red noise backgrounds. A software package has been developed that allows users to perform the cross wavelet transform and wavelet coherence (http://www.pol.ac.uk/home/research/waveletcoherence/). As we are interested in extracting low s/n ratio signals in time series we discuss only CWT in this paper. While CWT is a common tool for analyzing localized intermittent oscillations in a time series, it is very often desirable to examine two time series together that may be expected to be linked in some way. In particular, it is useful to examine whether regions in time frequency space with large common power have a consistent phase relationship and therefore are suggestive of causality between the time series. Many geophysical time series are not Normally distributed and we suggest methods of applying the CWT to such time series. From two CWTs we construct the Cross Wavelet Transform (XWT) which will expose their common power and relative phase in time-frequency space.
We will further define a measure of Wavelet Coherence (WTC) between two CWTs, which can find significant coherence even though the common power is low, and show how confidence levels against red noise backgrounds are calculated. We will present the basic CWT theory before we move on to XWT and WTC. New developments such as quantifying the phase relationship and calculating the WTC significance level will be treated more fully. When using the methods on time series it is important to have solid mechanistic foundations on which to base any relationships found, and we caution against using the methods in a "scatter-gun" approach (particularly if the time series probability density functions are modified). To illustrate how the various methods are used we apply them to two data sets from meteorology and glaciology. Finally, we will provide links to a MATLAB software package.
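The cross-wavelet transform described above is simply the product of one series' CWT with the complex conjugate of the other's: its modulus gives common power and its argument gives relative phase. The sketch below constructs it for two synthetic sines with a known 90-degree lag, so the expected phase is known in advance; the signals, scales, and index choices are all illustrative.

```python
import numpy as np

def morlet_cwt(x, dt, scales, w0=6.0):
    """Complex Morlet CWT computed in the frequency domain."""
    n = len(x)
    xf = np.fft.fft(x - x.mean())
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)
    W = np.empty((len(scales), n), complex)
    for i, s in enumerate(scales):
        # analytic Morlet daughter wavelet at scale s
        psi_hat = (np.pi ** -0.25) * np.sqrt(2.0 * np.pi * s / dt) \
                  * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0)
        W[i] = np.fft.ifft(xf * psi_hat)
    return W

dt = 1.0
t = np.arange(1024) * dt
x = np.sin(2 * np.pi * t / 32.0)
y = np.sin(2 * np.pi * t / 32.0 - np.pi / 2)   # y lags x by 90 degrees

scales = np.array([8.0, 16.0, 31.0, 64.0])     # scale ~31 matches period 32
Wx = morlet_cwt(x, dt, scales)
Wy = morlet_cwt(y, dt, scales)

xwt = Wx * np.conj(Wy)                         # cross-wavelet transform
k = np.argmax(np.abs(xwt).mean(axis=1))        # scale of maximum common power
phase = np.angle(xwt[k, 256:768].mean())       # mean relative phase, away from edges
```

The recovered phase is close to +pi/2, i.e. x leads y by a quarter cycle at the common period — exactly the kind of consistent phase relationship the authors argue is suggestive of a physical link.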

4,586 citations