
Author

Gilbert P. Compo

Bio: Gilbert P. Compo is an academic researcher at the National Oceanic and Atmospheric Administration. The author has contributed to research in the topics of data assimilation and climate change, has an h-index of 33, and has co-authored 79 publications receiving 18,347 citations. Previous affiliations of Gilbert P. Compo include the Cooperative Institute for Research in Environmental Sciences and the University of Colorado Boulder.
Papers


Journal Article
Abstract: The 1933 Atlantic hurricane season was extremely active, with 20 named storms and 11 hurricanes, 6 of them major (category 3+; 1-min maximum sustained winds ≥96 kt). The 1933 hurricane season also generated the most accumulated cyclone energy (an integrated metric that accounts for frequency, intensity, and duration) of any Atlantic hurricane season on record. A total of 8 hurricanes tracked through the Caribbean in 1933, the most on record. In addition, two category 3 hurricanes made landfall in the United States just 23 h apart: the Treasure Coast hurricane in southeast Florida followed by the Cuba–Brownsville hurricane in south Texas. This manuscript examines the large-scale atmospheric and oceanic conditions that likely led to such an active hurricane season. Extremely weak vertical wind shear was prevalent over both the Caribbean and the tropical Atlantic throughout the peak months of the hurricane season, likely due in part to a weak-to-moderate La Niña event. These favorable dynamic conditions, combined with above-normal tropical Atlantic sea surface temperatures, created a very conducive environment for hurricane formation and intensification. The Madden–Julian oscillation was relatively active during the summer and fall of 1933, providing subseasonal conditions that were quite favorable for tropical cyclogenesis during mid- to late August and late September to early October. The current early June and August statistical models used by Colorado State University would have predicted a very active 1933 hurricane season. A better understanding of these extremely active historical Atlantic hurricane seasons may aid in the anticipation of future hyperactive seasons.
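Accumulated cyclone energy (ACE) is computed from 6-hourly maximum sustained winds using a standard formula; a minimal sketch follows, where the storm track values are made-up illustrative numbers, not 1933 data:

```python
# Accumulated cyclone energy (ACE): 1e-4 times the sum of squared 6-hourly
# maximum sustained winds (in knots) while a storm is at tropical-storm
# strength (>= 34 kt). Result is in the customary units of 10^4 kt^2.

def ace(track_winds_kt):
    """ACE for one storm from its 6-hourly max sustained winds (kt)."""
    return 1e-4 * sum(v**2 for v in track_winds_kt if v >= 34)

# Hypothetical season: one 6-hourly wind history per storm.
season = [
    [35, 45, 60, 85, 100, 90, 70, 40],  # a major hurricane
    [30, 35, 40, 35, 30],               # a short-lived tropical storm
]
season_ace = sum(ace(storm) for storm in season)
print(f"Season ACE: {season_ace:.2f} (10^4 kt^2)")
```

Because the metric integrates squared wind over every 6-hour period, a long-lived intense storm contributes far more ACE than several weak, short-lived ones, which is why 1933's record ACE reflects duration and intensity as well as storm count.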

Journal Article
Abstract: The performance of a new historical reanalysis, the NOAA–CIRES–DOE Twentieth Century Reanalysis version 3 (20CRv3), is evaluated via comparisons with other reanalyses and independent observations. This dataset provides global, 3-hourly estimates of the atmosphere from 1806 to 2015 by assimilating only surface pressure observations and prescribing sea surface temperature, sea ice concentration, and radiative forcings. Comparisons with independent observations, other reanalyses, and satellite products suggest that 20CRv3 can reliably produce atmospheric estimates on scales ranging from weather events to long-term climatic trends. Not only does 20CRv3 recreate a “best estimate” of the weather, including extreme events, it also provides an estimate of its confidence through the use of an ensemble. Surface pressure statistics suggest that these confidence estimates are reliable. Comparisons with independent upper-air observations in the Northern Hemisphere demonstrate that 20CRv3 has skill throughout the twentieth century. Upper-air fields from 20CRv3 in the late twentieth century and early twenty-first century correlate well with full-input reanalyses, and the correlation is predicted by the confidence fields from 20CRv3. The skill of analyzed 500-hPa geopotential heights from 20CRv3 for 1979–2015 is comparable to that of modern operational 3–4-day forecasts. Finally, 20CRv3 performs well on climate time scales. Long time series and multidecadal averages of mass, circulation, and precipitation fields agree well with modern reanalyses and station- and satellite-based products. 20CRv3 is also able to capture trends in tropospheric-layer temperatures that correlate well with independent products in the twentieth century, placing recent trends in a longer historical context.

11 citations
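The confidence estimates described in the abstract above come from the reanalysis ensemble; a minimal sketch of how an ensemble yields both a best estimate and an uncertainty map (the grid shape is an illustrative assumption; the 80-member count follows 20CRv3's ensemble design):

```python
import numpy as np

# 20CRv3-style ensemble: each member is one plausible atmospheric state.
# Shapes are illustrative: 80 members on a 181 x 360 lat/lon grid of,
# say, 500-hPa geopotential height in meters (synthetic values here).
rng = np.random.default_rng(0)
members = 5400.0 + 50.0 * rng.standard_normal((80, 181, 360))

best_estimate = members.mean(axis=0)  # ensemble mean: the "best estimate"
confidence = members.std(axis=0)      # ensemble spread: larger = less confident

# A reliable ensemble has spread comparable to the actual analysis error,
# so the spread field can be read as a map of expected uncertainty.
print(best_estimate.shape, float(confidence.mean()))
```

This is the sense in which the paper's surface pressure statistics test "reliability": where the spread matches the observed errors, the confidence fields can be trusted.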


Journal Article
Abstract: While gridded seasonal pressure reconstructions poleward of 60°S extending back to 1905 have recently been completed, their skill has not been assessed prior to 1958. To provide a more thorough evaluation of the skill and performance in the early 20th century, these reconstructions are compared to other gridded datasets, historical data from early Antarctic expeditions, ship records, and temporary bases. Overall, the comparison confirms that the reconstruction uncertainty of 2–4 hPa (evaluated after 1979) over the Southern Ocean is a valid estimate of the reconstruction error in the early 20th century. Over the interior and near the coast of Antarctica, direct comparisons with historical data are challenged by elevation-based reductions to sea level pressure. In a few cases, a simple linear adjustment of the reconstruction to sea level matches the historical data well, but in other cases, the differences remain greater than 10 hPa. Despite these large errors, comparisons with continuous multi-season observations demonstrate that aspects of the interannual variability are often still captured, suggesting that the reconstructions have skill representing variations on this timescale, even if it is difficult to determine how well they capture the mean pressure at these higher elevations. Additional comparisons with various 20th century reanalysis products demonstrate the value of assimilating the historical observations in these datasets, which acts to substantially reduce the reanalysis ensemble spread and bring the reanalysis ensemble mean within the reconstruction and observational uncertainty.

2 citations
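The "elevation-based reductions to sea level pressure" that complicate the comparisons above follow the standard hypsometric relation; a minimal sketch, where the station values and the use of a single constant mean layer temperature are simplifying assumptions:

```python
import math

# Reduce a station pressure to sea level with the hypsometric relation:
#   p_sl = p_station * exp(g * h / (R_d * T_mean))
# Over a high, cold plateau like East Antarctica the reduction term is
# large and very sensitive to the assumed mean temperature, which is why
# such comparisons are difficult.

G = 9.80665    # gravitational acceleration, m s^-2
R_D = 287.05   # specific gas constant for dry air, J kg^-1 K^-1

def reduce_to_sea_level(p_station_hpa, elevation_m, t_mean_k):
    """Sea level pressure (hPa) from station pressure, elevation, mean layer temperature."""
    return p_station_hpa * math.exp(G * elevation_m / (R_D * t_mean_k))

# Illustrative station at 2800 m: a 5 K change in the assumed mean
# temperature shifts the reduced pressure by several hPa, consistent
# with the large differences reported in the abstract above.
print(reduce_to_sea_level(720.0, 2800.0, 243.0))  # colder assumption
print(reduce_to_sea_level(720.0, 2800.0, 248.0))  # warmer assumption
```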


Journal Article
Abstract: Observations from the historical meteorological observing network contain many artefacts of non-climatic origin which must be accounted for prior to using these data in climate applications. State-of-the-art homogenisation approaches use various flavours of pairwise comparison between a target station series and candidate neighbour station series. Such approaches require an adequate number of neighbours of sufficient quality and comparability - a condition that is met for most station series since the mid-20th century. However, pairwise approaches struggle where suitable neighbouring stations are sparse, as remains the case in vast regions of the globe and is common almost everywhere prior to the early 20th century. Modern sparse-input centennial reanalysis products continue to improve and offer a potential alternative to pairwise comparison, particularly where and when observations are sparse. Because they do not directly ingest or use land-based surface temperature observations, they are a formally independent estimate. This may be particularly helpful where structurally similar changes exist across broad networks, which challenges current techniques in the absence of metadata. They also offer a methodologically distinct alternative, which would help explore the structural uncertainty in homogenisation techniques. The present study compares the potential of spatially interpolated sparse-input reanalysis products against neighbour-based approaches for homogenising global monthly land surface air temperature records back to 1850, based upon the statistical properties of station-minus-reanalysis and station-minus-neighbour series. This shows that neighbour-based approaches likely remain preferable in data-dense regions and epochs. However, the most recent reanalysis product, NOAA-CIRES-DOE 20CRv3, is potentially preferable where insufficient neighbours are available. This may particularly affect long-term global average estimates, where a small number of long-term stations in data-sparse regions contribute substantially to the global estimate and may contain data artefacts missed by present homogenisation approaches.

2 citations
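The comparison described above rests on difference series: a well-correlated reference cancels the shared climate signal, leaving noise plus any artefact in the target station. A minimal sketch of forming station-minus-neighbour and station-minus-reanalysis series from synthetic data follows (all values and names are illustrative assumptions, not the study's code):

```python
import numpy as np

# Synthetic monthly temperature anomalies (degC) for 70 years.
rng = np.random.default_rng(1)
n = 70 * 12
signal = rng.standard_normal(n).cumsum() * 0.01        # shared climate signal
station = signal + 0.3 * rng.standard_normal(n)
station[n // 2:] += 0.8                                # an inhomogeneity (e.g. station move)
neighbour = signal + 0.3 * rng.standard_normal(n)      # nearby, well-correlated station
reanalysis = signal + 0.5 * rng.standard_normal(n)     # independent sparse-input estimate

# In each difference series the climate signal cancels; the lower the
# residual noise, the easier the 0.8 degC break is to detect.
for name, ref in [("neighbour", neighbour), ("reanalysis", reanalysis)]:
    diff = station - ref
    step = diff[n // 2:].mean() - diff[:n // 2].mean()
    print(f"{name}: noise sd = {diff.std():.2f}, detected break = {step:.2f} degC")
```

This illustrates the paper's conclusion in miniature: a dense network of close neighbours usually gives the quieter reference, but where no such neighbours exist, a reanalysis series can still expose the break.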


Cited by

Journal Article
Huiwen Tian, Junhua Zhang, Lianqi Zhu, Jingting Qin, +3 more
01 Mar 2022 - Geoderma
Abstract: The spatial variability of soil organic carbon (SOC) is scale- and location-dependent and controlled by various environmental factors. However, the location- and scale-specific factors underpinning SOC variation often demand further investigation. This holds particularly true for topographically complex environments like China's north-south transitional zone. In this paper, wavelet analysis was used to determine the relationship between SOC and environmental factors in the region. The results showed that SOC exhibits an obvious transition effect from north to south, especially at 140 km north of the transect. Elevation was the strongest single factor explaining SOC variations (percent area of significant coherence (PASC) = 25.20%, average wavelet coherence = 0.53 at all scales). Normalized difference vegetation index (NDVI), elevation, land surface temperature (LST), and topographic wetness index (TWI) were the strongest combination of factors explaining variation in SOC (PASC = 43.80%, average wavelet coherence = 0.94 at all scales). Edaphic, vegetative, climatic, and topographic factors and their interactions jointly controlled the variation of SOC and showed increasing predictive power at increasing spatial scales. Our results reveal the spatial sequence changes of SOC and the scale-location dependencies between SOC and environmental factors in China's north-south transitional zone, which can be used for modeling, mapping, and management of SOC at different scales.
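The PASC and average-coherence statistics quoted above summarise a wavelet coherence map over the scale-location plane; a minimal sketch of computing both from a coherence matrix and its significance threshold (the matrices here are synthetic stand-ins, and in practice the threshold comes from Monte Carlo surrogates rather than a constant):

```python
import numpy as np

# Stand-in for a wavelet coherence analysis: coherence values in [0, 1]
# on a (scale x transect location) plane, plus a pointwise 95%
# significance level (illustrative constant here).
rng = np.random.default_rng(2)
coherence = rng.uniform(0.0, 1.0, size=(64, 512))  # scales x locations
sig_level = np.full_like(coherence, 0.75)

significant = coherence > sig_level
pasc = 100.0 * significant.mean()   # percent area of significant coherence
mean_coherence = coherence.mean()   # average wavelet coherence, all scales

print(f"PASC = {pasc:.1f}%, average coherence = {mean_coherence:.2f}")
```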

Journal Article
Abstract: An experimental evaluation of the cyclic variability, or combustion instability, of a compression ignition engine fuelled with waste cooking oil biodiesel–diesel blends is presented in this article. An advanced in-vehicle combustion analyzer equipped with a piezoelectric pressure sensor was used for accurate measurements, minimizing experimental uncertainty. Cyclic variation in combustion was investigated using statistical and wavelet analysis methods, and the results of the two methods agreed with each other, providing mutual validation. The statistical methods were used to calculate means and coefficients of variation, while the wavelet method can characterize the topography of the cyclic variation together with the intensity of variations in the engine combustion cycle, especially at low engine load conditions. The overall combustion analysis, combining the wavelet and statistical methods, indicates quieter and smoother engine operation with biodiesel blending, as it enhances combustion stability in unmodified diesel engines in comparison with conventional diesel fuel.
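The coefficient of variation used above is the standard cycle-to-cycle stability metric, conventionally applied to indicated mean effective pressure (IMEP); a minimal sketch, where the IMEP values are made-up illustrative numbers, not the study's measurements:

```python
import statistics

# Coefficient of variation (COV) of IMEP: a cycle-to-cycle combustion
# stability metric. COV = 100 * std / mean; values above roughly 5%
# are commonly taken to indicate unacceptable cyclic variability.
def cov_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical IMEP (bar) for 10 consecutive cycles under two fuels.
imep_diesel = [5.1, 4.8, 5.3, 4.7, 5.2, 4.9, 5.4, 4.6, 5.0, 5.1]
imep_blend = [5.0, 5.1, 4.9, 5.0, 5.2, 4.9, 5.1, 5.0, 4.9, 5.1]

print(f"diesel COV(IMEP): {cov_percent(imep_diesel):.1f}%")
print(f"blend  COV(IMEP): {cov_percent(imep_blend):.1f}%")  # lower = more stable
```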

Journal Article
Abstract: EEG experiments yield high-dimensional event-related potential (ERP) data in response to stimuli presented repeatedly throughout the experiment. Changes in the high-dimensional ERP signal over the duration of an experiment (longitudinally) are the main quantity of interest in learning paradigms, where they represent the learning dynamics. Typical analyses, whether performed in the time or the frequency domain, average the ERP waveform across all trials, losing the potentially valuable longitudinal information in the data. Longitudinal time-frequency transformation of ERP (LTFT-ERP) is proposed to retain information from both the time and frequency domains, offering distinct but complementary information on the underlying cognitive processes evoked, while still retaining the longitudinal dynamics in the ERP waveforms. LTFT-ERP begins with time-frequency transformations of the ERP data, collected across subjects, electrodes, conditions, and trials throughout the duration of the experiment, followed by a data-driven multidimensional principal components analysis (PCA) approach for dimension reduction. After projection of the data onto the leading directions of variation in the time and frequency domains, longitudinal learning dynamics are modeled within a mixed effects modeling framework. Applications to a learning paradigm in autism depict distinct learning patterns throughout the experiment among children diagnosed with Autism Spectrum Disorder and their typically developing peers. LTFT-ERP joint time-frequency transformations are shown to bring an additional level of specificity to interpretations of the longitudinal learning patterns related to underlying cognitive processes, which is lacking in single-domain analysis (in the time or the frequency domain only). Simulation studies show the efficacy of the proposed methodology.
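The first two stages of the pipeline described above, a time-frequency transform of each trial followed by PCA for dimension reduction, can be sketched with standard tools; this is a simplified single-channel analogue on synthetic trials, not the authors' implementation:

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import PCA

# Synthetic ERP-like data: 200 trials of a 1-second epoch at 256 Hz,
# with a 10 Hz component that grows across trials (a "learning" effect).
rng = np.random.default_rng(3)
fs, n_trials, n_samples = 256, 200, 256
trials = rng.standard_normal((n_trials, n_samples))
t = np.arange(n_samples) / fs
trials += np.sin(2 * np.pi * 10 * t) * np.linspace(0.5, 2.0, n_trials)[:, None]

# Stage 1: time-frequency transform of every trial.
tf = np.stack([spectrogram(x, fs=fs, nperseg=64, noverlap=48)[2] for x in trials])

# Stage 2: PCA on the flattened time-frequency planes. Trial scores on
# the leading components carry the longitudinal (trial-by-trial)
# dynamics that a grand average across trials would destroy.
scores = PCA(n_components=3).fit_transform(tf.reshape(n_trials, -1))
print(scores.shape)  # (200, 3): one low-dimensional trajectory per trial
```

In the full method these scores would then enter a mixed effects model with subject-level random effects, which is the third stage the abstract describes.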

Journal Article
Han Ping Hong, X.Z. Cui, D. Qiao
Abstract: In the present study, we propose a new iterative algorithm based on the discretized continuous wavelet transform (CWT) to simulate nonstationary non-Gaussian vector processes with prescribed marginal probability distribution functions and time-scale power spectral density function matrix. The algorithm is designed for a CWT pair within the wavelet analysis framework. It can be used with analytical wavelets satisfying the admissibility condition and can cope with both time-dependent and time-independent coherence. The proposed algorithm is applied to generate nonstationary downburst winds and seismic ground motions at multiple sites. The numerical validation confirms that the algorithm leads to simulated records matching the prescribed marginal probability distribution, marginal time-scale power spectral density function, and coherence function.
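The core idea of iteratively enforcing both a target spectrum and a target marginal distribution can be illustrated with a much simpler stationary, scalar analogue: an IAAFT-style loop using the Fourier transform rather than the paper's discretized CWT. Everything below is that simplification, not the proposed algorithm:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4096

# Targets: Fourier amplitudes taken from a reference record, and a
# non-Gaussian (lognormal) marginal distribution.
reference = rng.standard_normal(n)
target_amp = np.abs(np.fft.rfft(reference))
target_marginal = np.sort(rng.lognormal(0.0, 0.5, n))

x = rng.permutation(target_marginal)  # start with the correct marginal
for _ in range(100):
    # Step 1: impose the target spectrum, keeping the current phases.
    spec = np.fft.rfft(x)
    x = np.fft.irfft(target_amp * np.exp(1j * np.angle(spec)), n)
    # Step 2: impose the target marginal by rank mapping.
    ranks = np.argsort(np.argsort(x))
    x = target_marginal[ranks]

# The record now has exactly the target marginal and approximately the
# target spectrum; the two constraints converge jointly over iterations.
print(x.mean(), x.std())
```

The paper's contribution is to carry this alternation into the time-scale domain for vector processes, so that nonstationarity and site-to-site coherence are matched as well.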

Journal Article
Abstract: In a changing climate and social context, tools and databases with high spatiotemporal resolution are needed to increase knowledge of the relationship between meteorological events and flood impacts; analysis of high-resolution spatiotemporal databases with detailed information on the frequency, intensity, and impact of floods is therefore necessary. However, the methodological nature of flood databases hinders relating specific flood events to the weather events that cause them, so methodologies for classifying flood cases according to the synoptic patterns that generate them are also necessary. Knowing which synoptic patterns are likely to generate risk situations allows for a probabilistic approach with high spatial resolution regarding the timing of occurrence, affected area, and expected damage from floods. To achieve these objectives, we use the SMC-Flood Database, a high-resolution spatiotemporal flood database covering the 1960–2015 period for all municipalities along the Spanish Mediterranean coast. To relate floods to the synoptic conditions that generated them, we applied a multivariate analysis method to the corrected daily anomalies of the surface pressure fields, 850 hPa temperature, and 500 hPa geopotential height, all obtained from the 20th Century Reanalysis Project V2. Results show that 12 atmospheric synoptic patterns can statistically explain the 3608 flood cases that occurred in the study area between 1960 and 2015. These flood cases were classified into 847 atmospherically induced flood events. These results reduce uncertainty during decision making through the classification of potential risk situations. The Mediterranean Basin is a region where floods have serious socioeconomic impacts; hence, this work helps improve prevention measures and provides information for policymakers, mainly regarding land use planning and early warning systems.
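A common way to realise this kind of multivariate synoptic classification is principal component analysis of the standardized anomaly fields followed by clustering into a fixed number of patterns; the sketch below uses PCA plus k-means on synthetic fields as a plausible stand-in, not necessarily the study's exact method:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Synthetic stand-in for daily anomaly fields: 2000 flood days, each a
# stacked (surface pressure, T850, Z500) anomaly map flattened into one
# vector (here 3 variables x 400 grid points).
rng = np.random.default_rng(5)
n_days, n_features = 2000, 3 * 400
fields = rng.standard_normal((n_days, n_features))

# Reduce dimensionality, then group days into 12 synoptic patterns.
pcs = PCA(n_components=20).fit_transform(fields)
labels = KMeans(n_clusters=12, n_init=10, random_state=0).fit_predict(pcs)

# Each cluster's mean field is one synoptic pattern; each flood case is
# then assigned the pattern of the day on which it occurred.
for k in range(12):
    print(f"pattern {k}: {np.sum(labels == k)} days")
```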

Network Information
Related Authors (5)
Prashant D. Sardeshmukh

100 papers, 10.2K citations

99% related
Jeffrey S. Whitaker

81 papers, 11.2K citations

98% related
Xiaolan L. Wang

74 papers, 8.7K citations

98% related
Rob Allan

97 papers, 10.6K citations

95% related
Alexey Kaplan

75 papers, 16.5K citations

95% related
Performance Metrics

Author's h-index: 33

No. of papers from the Author in previous years
Year    Papers
2021    5
2020    3
2019    6
2018    4
2017    4
2016    6