Journal ArticleDOI

Global analyses of sea surface temperature, sea ice, and night marine air temperature since the late nineteenth century

TL;DR: HadISST1 as mentioned in this paper replaces the global sea ice and sea surface temperature (GISST) data sets and is a unique combination of monthly globally complete fields of SST and sea ice concentration on a 1° latitude-longitude grid from 1871.
Abstract: [1] We present the Met Office Hadley Centre's sea ice and sea surface temperature (SST) data set, HadISST1, and the nighttime marine air temperature (NMAT) data set, HadMAT1. HadISST1 replaces the global sea ice and sea surface temperature (GISST) data sets and is a unique combination of monthly globally complete fields of SST and sea ice concentration on a 1° latitude-longitude grid from 1871. The companion HadMAT1 runs monthly from 1856 on a 5° latitude-longitude grid and incorporates new corrections for the effect on NMAT of increasing deck (and hence measurement) heights. HadISST1 and HadMAT1 temperatures are reconstructed using a two-stage reduced-space optimal interpolation procedure, followed by superposition of quality-improved gridded observations onto the reconstructions to restore local detail. The sea ice fields are made more homogeneous by compensating satellite microwave-based sea ice concentrations for the impact of surface melt effects on retrievals in the Arctic and for algorithm deficiencies in the Antarctic and by making the historical in situ concentrations consistent with the satellite data. SSTs near sea ice are estimated using statistical relationships between SST and sea ice concentration. HadISST1 compares well with other published analyses, capturing trends in global, hemispheric, and regional SST well, containing SST fields with more uniform variance through time and better month-to-month persistence than those in GISST. HadMAT1 is more consistent with SST and with collocated land surface air temperatures than previous NMAT data sets.
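
The reduced-space optimal interpolation (RSOI) named in the abstract estimates the amplitudes of a few leading EOFs from sparse, noisy anomaly observations and rebuilds a globally complete field from them. Below is a minimal one-stage sketch, assuming EOF patterns and mode variances are given and observation errors are uncorrelated; the paper's actual procedure is two-stage and then superposes quality-improved gridded observations to restore local detail.

```python
import numpy as np

def rsoi_reconstruct(eofs, eigvals, obs_anom, obs_idx, obs_err_var):
    """Reduced-space OI: estimate EOF amplitudes from sparse observations.

    eofs        : (n_grid, n_modes) leading EOF patterns
    eigvals     : (n_modes,) variances of the retained modes
    obs_anom    : (n_obs,) observed anomalies at observed grid cells
    obs_idx     : (n_obs,) indices of the observed cells in the grid vector
    obs_err_var : scalar observation-error variance (diagonal R)
    """
    H = eofs[obs_idx, :]                     # maps mode amplitudes to obs points
    Lam_inv = np.diag(1.0 / eigvals)         # prior: amplitudes ~ N(0, Lambda)
    # OI / penalized least-squares solution for the mode amplitudes:
    A = H.T @ H / obs_err_var + Lam_inv
    b = H.T @ obs_anom / obs_err_var
    alpha = np.linalg.solve(A, b)
    return eofs @ alpha                      # globally complete anomaly field

# Toy usage: 500 grid cells, 5 modes, 60 observed cells.
rng = np.random.default_rng(0)
eofs = np.linalg.qr(rng.standard_normal((500, 5)))[0]
eigvals = np.array([5.0, 3.0, 2.0, 1.0, 0.5])
truth = eofs @ (np.sqrt(eigvals) * rng.standard_normal(5))
idx = rng.choice(500, 60, replace=False)
obs = truth[idx] + 0.1 * rng.standard_normal(60)
recon = rsoi_reconstruct(eofs, eigvals, obs, idx, 0.01)
```
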
Citations
Journal ArticleDOI
TL;DR: ERA-40 is a re-analysis of meteorological observations from September 1957 to August 2002 produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with many institutions as mentioned in this paper.
Abstract: ERA-40 is a re-analysis of meteorological observations from September 1957 to August 2002 produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with many institutions. The observing system changed considerably over this re-analysis period, with assimilable data provided by a succession of satellite-borne instruments from the 1970s onwards, supplemented by increasing numbers of observations from aircraft, ocean-buoys and other surface platforms, but with a declining number of radiosonde ascents since the late 1980s. The observations used in ERA-40 were accumulated from many sources. The first part of this paper describes the data acquisition and the principal changes in data type and coverage over the period. It also describes the data assimilation system used for ERA-40. This benefited from many of the changes introduced into operational forecasting since the mid-1990s, when the systems used for the 15-year ECMWF re-analysis (ERA-15) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) re-analysis were implemented. Several of the improvements are discussed. General aspects of the production of the analyses are also summarized. A number of results indicative of the overall performance of the data assimilation system, and implicitly of the observing system, are presented and discussed. The comparison of background (short-range) forecasts and analyses with observations, the consistency of the global mass budget, the magnitude of differences between analysis and background fields and the accuracy of medium-range forecasts run from the ERA-40 analyses are illustrated. Several results demonstrate the marked improvement that was made to the observing system for the southern hemisphere in the 1970s, particularly towards the end of the decade. In contrast, the synoptic quality of the analysis for the northern hemisphere is sufficient to provide forecasts that remain skilful well into the medium range for all years. Two particular problems are also examined: excessive precipitation over tropical oceans and a too strong Brewer-Dobson circulation, both of which are pronounced in later years. Several other aspects of the quality of the re-analyses revealed by monitoring and validation studies are summarized. Expectations that the ‘second-generation’ ERA-40 re-analysis would provide products that are better than those from the first-generation ERA-15 and NCEP/NCAR re-analyses are found to have been met in most cases. © Royal Meteorological Society, 2005. The contributions of N. A. Rayner and R. W. Saunders are Crown copyright.

7,110 citations

Journal ArticleDOI
04 Aug 2005-Nature
TL;DR: An index of the potential destructiveness of hurricanes based on the total dissipation of power, integrated over the lifetime of the cyclone, is defined and shows that this index has increased markedly since the mid-1970s, due to both longer storm lifetimes and greater storm intensities.
Abstract: Theory and modelling predict that hurricane intensity should increase with increasing global mean temperatures, but work on the detection of trends in hurricane activity has focused mostly on their frequency and shows no trend. Here I define an index of the potential destructiveness of hurricanes based on the total dissipation of power, integrated over the lifetime of the cyclone, and show that this index has increased markedly since the mid-1970s. This trend is due to both longer storm lifetimes and greater storm intensities. I find that the record of net hurricane power dissipation is highly correlated with tropical sea surface temperature, reflecting well-documented climate signals, including multi-decadal oscillations in the North Atlantic and North Pacific, and global warming. My results suggest that future warming may lead to an upward trend in tropical cyclone destructive potential, and, taking into account an increasing coastal population, a substantial increase in hurricane-related losses in the twenty-first century.
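
The index described is, in essence, the maximum sustained wind speed cubed, integrated over each storm's lifetime (dissipated power scales with the cube of the surface wind). A minimal sketch for a single storm from 6-hourly best-track winds; constants such as the drag coefficient and air density are omitted, so only relative changes are meaningful:

```python
import numpy as np

def power_dissipation_index(vmax_knots, dt_hours=6.0):
    """Approximate PDI for one storm: integral of Vmax**3 over its lifetime.

    vmax_knots : sequence of best-track maximum sustained winds (kt),
                 typically reported every 6 hours
    dt_hours   : reporting interval
    Returns the integral of (m/s)**3 dt, in m^3 s^-2.
    """
    v = np.asarray(vmax_knots, dtype=float) * 0.5144  # knots -> m/s
    return np.trapz(v**3, dx=dt_hours * 3600.0)

# Toy storm: spin-up, peak, and decay over four days of 6-hourly fixes.
track = [35, 50, 65, 80, 95, 110, 120, 115, 100, 85, 70, 55, 45, 35, 30, 25]
print(f"PDI = {power_dissipation_index(track):.3e} m^3 s^-2")
```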

3,518 citations

Journal ArticleDOI
TL;DR: In this paper, two new high-resolution sea surface temperature (SST) analysis products have been developed using optimum interpolation (OI), which have a spatial grid resolution of 0.25° and a temporal resolution of 1 day.
Abstract: Two new high-resolution sea surface temperature (SST) analysis products have been developed using optimum interpolation (OI). The analyses have a spatial grid resolution of 0.25° and a temporal resolution of 1 day. One product uses the Advanced Very High Resolution Radiometer (AVHRR) infrared satellite SST data. The other uses AVHRR and Advanced Microwave Scanning Radiometer (AMSR) on the NASA Earth Observing System satellite SST data. Both products also use in situ data from ships and buoys and include a large-scale adjustment of satellite biases with respect to the in situ data. Because of AMSR’s near-all-weather coverage, there is an increase in OI signal variance when AMSR is added to AVHRR. Thus, two products are needed to avoid an analysis variance jump when AMSR became available in June 2002. For both products, the results show improved spatial and temporal resolution compared to previous weekly 1° OI analyses. The AVHRR-only product uses Pathfinder AVHRR data (currently available from January 1985 to December 2005) and operational AVHRR data for 2006 onward. Pathfinder AVHRR was chosen over operational AVHRR, when available, because Pathfinder agrees better with the in situ data. The AMSR–AVHRR product begins with the start of AMSR data in June 2002. In this product, the primary AVHRR contribution is in regions near land where AMSR is not available. However, in cloud-free regions, use of both infrared and microwave instruments can reduce systematic biases because their error characteristics are independent.
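
Both products include "a large-scale adjustment of satellite biases with respect to the in situ data". A much-simplified sketch of that idea follows, assuming gridded fields and SciPy for the smoothing; the operational scheme solves a Poisson equation (see the Reynolds and Smith reference below) rather than smoothing collocated differences:

```python
import numpy as np
from scipy.signal import fftconvolve  # assumption: SciPy is available

def adjust_satellite_sst(sat_sst, insitu_sst, insitu_mask, halfwidth=10):
    """Subtract a smoothed, large-scale satellite-minus-in-situ bias.

    sat_sst, insitu_sst : 2-D gridded SST fields (deg C)
    insitu_mask         : boolean grid, True where in situ data exist
    halfwidth           : smoothing half-width in grid cells
    """
    diff = np.where(insitu_mask, sat_sst - insitu_sst, 0.0)
    wgt = insitu_mask.astype(float)
    kernel = np.ones((2 * halfwidth + 1, 2 * halfwidth + 1))
    num = fftconvolve(diff, kernel, mode="same")
    den = fftconvolve(wgt, kernel, mode="same")
    bias = np.where(den > 1e-6, num / den, 0.0)  # broad-scale bias estimate
    return sat_sst - bias
```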

3,422 citations


Cites background or methods from "Global analyses of sea surface temp..."

  • ...In the OI.v2, sea ice concentrations were bias adjusted following the procedure in Rayner et al. (2003) to account for melt pond summer biases....

  • ...In the OI.v2 based on Rayner et al. (2003) a quadratic relationship was defined between sea ice concentration and SST: T_I = aI^2 + bI + c, I ≥ I_0 (1), where T_I is the simulated SST, I is the ice concentration fraction, which varies from 0 (0...

  • ...of the climate-scale SST analyses produced at the National Oceanic and Atmospheric Administration (NOAA) as described by Reynolds and Smith (1994) and Reynolds et al....

Journal ArticleDOI
TL;DR: In this paper, the authors examine observations and model simulations from 1923 to 2010 to test the ability of models to predict drought conditions; the models capture the historical greenhouse-gas forcing and El Nino–Southern Oscillation mode, which inspires confidence in their projections of future drought.
Abstract: Historical records show increased aridity over many land areas since 1950. This study looks at observations and model projections from 1923 to 2010, to test the ability of models to predict future drought conditions. Models are able to capture the greenhouse-gas forcing and El Nino–Southern Oscillation mode for historical periods, which inspires confidence in their projections of drought.

3,385 citations

Journal ArticleDOI
TL;DR: The Twentieth Century Reanalysis (20CR) dataset as discussed by the authors provides the first estimates of global tropospheric variability, and of the dataset's time-varying quality, from 1871 to the present at 6-hourly temporal and 2° spatial resolutions.
Abstract: The Twentieth Century Reanalysis (20CR) project is an international effort to produce a comprehensive global atmospheric circulation dataset spanning the twentieth century, assimilating only surface pressure reports and using observed monthly sea-surface temperature and sea-ice distributions as boundary conditions. It is chiefly motivated by a need to provide an observational dataset with quantified uncertainties for validations of climate model simulations of the twentieth century on all time-scales, with emphasis on the statistics of daily weather. It uses an Ensemble Kalman Filter data assimilation method with background ‘first guess’ fields supplied by an ensemble of forecasts from a global numerical weather prediction model. This directly yields a global analysis every 6 hours as the most likely state of the atmosphere, and also an uncertainty estimate of that analysis. The 20CR dataset provides the first estimates of global tropospheric variability, and of the dataset's time-varying quality, from 1871 to the present at 6-hourly temporal and 2° spatial resolutions. Intercomparisons with independent radiosonde data indicate that the reanalyses are generally of high quality. The quality in the extratropical Northern Hemisphere throughout the century is similar to that of current three-day operational NWP forecasts. Intercomparisons over the second half-century of these surface-based reanalyses with other reanalyses that also make use of upper-air and satellite data are equally encouraging. It is anticipated that the 20CR dataset will be a valuable resource to the climate research community for both model validations and diagnostic studies. Some surprising results are already evident. For instance, the long-term trends of indices representing the North Atlantic Oscillation, the tropical Pacific Walker Circulation, and the Pacific–North American pattern are weak or non-existent over the full period of record. The long-term trends of zonally averaged precipitation minus evaporation also differ in character from those in climate model simulations of the twentieth century. Copyright © 2011 Royal Meteorological Society and Crown Copyright.
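
20CR's Ensemble Kalman Filter updates an ensemble of model forecasts ("first guess" fields) with surface pressure reports; the ensemble mean is the analysis and the spread its uncertainty. Below is a minimal perturbed-observation analysis step, assuming a linear observation operator and uncorrelated observation errors; the production system adds covariance localization, inflation, and a full NWP model for the forecast step.

```python
import numpy as np

def enkf_analysis(X_f, H, y, obs_err_var, rng):
    """Stochastic (perturbed-observation) EnKF analysis step.

    X_f         : (n_state, n_ens) forecast ensemble ('first guess')
    H           : (n_obs, n_state) linear observation operator
    y           : (n_obs,) observations (e.g., surface pressure reports)
    obs_err_var : scalar observation-error variance
    """
    n_ens = X_f.shape[1]
    A = X_f - X_f.mean(axis=1, keepdims=True)     # ensemble perturbations
    P_HT = A @ (H @ A).T / (n_ens - 1)            # P_f H^T from the ensemble
    S = H @ P_HT + obs_err_var * np.eye(len(y))   # innovation covariance
    K = P_HT @ np.linalg.inv(S)                   # Kalman gain
    # Perturb observations so the analysis ensemble keeps correct spread.
    Y = y[:, None] + np.sqrt(obs_err_var) * rng.standard_normal((len(y), n_ens))
    return X_f + K @ (Y - H @ X_f)                # analysis ensemble
```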

3,043 citations


Cites background or methods from "Global analyses of sea surface temp..."

  • ...1 dataset (Rayner et al., 2003)....

  • ...As noted by Rayner et al. (2003), the quality of the HadISST varies throughout the period from a sparse network of marine SST observations in the 1870s to a network comprising additional numerous marine, buoy, and satellite SST observations in the late twentieth century (with the International…...

  • ...…fields with interpolated monthly sea-surface temperature and sea-ice concentration fields from the Hadley Centre Sea Ice and SST dataset (HadISST; Rayner et al., 2003) as prescribed boundary conditions, and newly compiled surface pressure and SLP reports and observations, to produce a…...

References
Journal ArticleDOI
TL;DR: In this article, the authors present an overview of the climate system and its dynamics, including observed climate variability and change, the carbon cycle, atmospheric chemistry and greenhouse gases, and their direct and indirect effects.
Abstract: (Table of contents) Summary for policymakers; Technical summary; 1. The climate system - an overview; 2. Observed climate variability and change; 3. The carbon cycle and atmospheric CO2; 4. Atmospheric chemistry and greenhouse gases; 5. Aerosols, their direct and indirect effects; 6. Radiative forcing of climate change; 7. Physical climate processes and feedbacks; 8. Model evaluation; 9. Projections of future climate change; 10. Regional climate simulation - evaluation and projections; 11. Changes in sea level; 12. Detection of climate change and attribution of causes; 13. Climate scenario development; 14. Advancing our understanding; Glossary; Index; Appendix.

13,366 citations


"Global analyses of sea surface temp..." refers methods in this paper

  • ...It is recommended that the noninterpolated SST data set HadSST [Jones et al., 2001] be used alongside HadISST1 for climate monitoring and climate change detection studies, as was done in the Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) [Folland et al., 2001a]....

  • ...The HadISST1 sea ice analysis has been used both for climate monitoring [Folland et al., 2001a] and for model validation [Gregory et al., 2002]....

  • ...NMAT has been used to monitor climate and detect its changes and to corroborate estimates of climatic variations made using land air temperature and/or SST [e.g., Folland et al., 2001a]....

  • ...Published tests have given support to the Folland and Parker [1995] adjustments [Folland and Salinger, 1995; Folland et al., 1997, 2001b; Hanawa et al., 2000; Smith and Reynolds, 2002]: see also section 7....

Journal ArticleDOI
23 Sep 1999-Nature
TL;DR: An analysis of observational data over the past 40 years shows a dipole mode in the Indian Ocean: a pattern of internal variability with anomalously low sea surface temperatures off Sumatra and high seasurface temperatures in the western Indian Ocean, with accompanying wind and precipitation anomalies.
Abstract: For the tropical Pacific and Atlantic oceans, internal modes of variability that lead to climatic oscillations have been recognized [1,2], but in the Indian Ocean region a similar ocean–atmosphere interaction causing interannual climate variability has not yet been found [3]. Here we report an analysis of observational data over the past 40 years, showing a dipole mode in the Indian Ocean: a pattern of internal variability with anomalously low sea surface temperatures off Sumatra and high sea surface temperatures in the western Indian Ocean, with accompanying wind and precipitation anomalies. The spatio-temporal links between sea surface temperatures and winds reveal a strong coupling through the precipitation field and ocean dynamics. This air–sea interaction process is unique and inherent in the Indian Ocean, and is shown to be independent of the El Nino/Southern Oscillation. The discovery of this dipole mode that accounts for about 12% of the sea surface temperature variability in the Indian Ocean—and, in its active years, also causes severe rainfall in eastern Africa and droughts in Indonesia—brightens the prospects for a long-term forecast of rainfall anomalies in the affected countries.
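
The dipole mode described here is commonly tracked with a dipole mode index: the difference between area-averaged SST anomalies in a western box (about 50°E–70°E, 10°S–10°N) and a southeastern box (about 90°E–110°E, 10°S–0°). A sketch under those box definitions, assuming a gridded monthly anomaly array:

```python
import numpy as np

def dipole_mode_index(sst_anom, lats, lons):
    """DMI: western-box minus southeastern-box mean SST anomaly.

    sst_anom   : (n_time, n_lat, n_lon) monthly SST anomalies (deg C)
    lats, lons : 1-D coordinates in degrees north / degrees east
    """
    def box_mean(lat_lo, lat_hi, lon_lo, lon_hi):
        jj = (lats >= lat_lo) & (lats <= lat_hi)
        ii = (lons >= lon_lo) & (lons <= lon_hi)
        w = np.cos(np.deg2rad(lats[jj]))               # latitude area weights
        box = sst_anom[:, jj, :][:, :, ii]
        return (box * w[None, :, None]).sum(axis=(1, 2)) / (w.sum() * ii.sum())

    west = box_mean(-10.0, 10.0, 50.0, 70.0)           # 50E-70E, 10S-10N
    east = box_mean(-10.0, 0.0, 90.0, 110.0)           # 90E-110E, 10S-0
    return west - east
```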

4,385 citations

Journal ArticleDOI
TL;DR: A weekly 1° spatial resolution optimum interpolation (OI) sea surface temperature (SST) analysis has been produced at the National Oceanic and Atmospheric Administration (NOAA) using both in situ and satellite data from November 1981 to the present as mentioned in this paper.
Abstract: A weekly 1° spatial resolution optimum interpolation (OI) sea surface temperature (SST) analysis has been produced at the National Oceanic and Atmospheric Administration (NOAA) using both in situ and satellite data from November 1981 to the present. The weekly product has been available since 1993 and is widely used for weather and climate monitoring and forecasting. Errors in the satellite bias correction and the sea ice to SST conversion algorithm are discussed, and then an improved version of the OI analysis is developed. The changes result in a modest reduction in the satellite bias that leaves small global residual biases of roughly −0.03°C. The major improvement in the analysis occurs at high latitudes due to the new sea ice algorithm where local differences between the old and new analysis can exceed 1°C. Comparisons with other SST products are needed to determine the consistency of the OI. These comparisons show that the differences among products occur on large time- and space scales wit...
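
The "sea ice to SST conversion algorithm" discussed here simulates SST from ice concentration; in the OI.v2, following Rayner et al. (2003), a quadratic T_I = aI^2 + bI + c is applied above a concentration cutoff I_0 (see the citing snippet above). A sketch with illustrative placeholder coefficients; the operational fits differ by hemisphere and season:

```python
import numpy as np

def sst_from_ice(ice_frac, a=-2.5, b=0.0, c=1.5, i0=0.5, freezing=-1.8):
    """Simulate SST (deg C) from sea ice concentration fraction.

    Applies T_I = a*I**2 + b*I + c where I >= i0. The coefficients here
    are illustrative placeholders, not the operational values, and the
    result is floored at the freezing point of seawater.
    """
    ice = np.asarray(ice_frac, dtype=float)
    t = np.maximum(a * ice**2 + b * ice + c, freezing)
    return np.where(ice >= i0, t, np.nan)   # undefined below the cutoff

print(sst_from_ice([0.5, 0.75, 1.0]))       # cools toward freezing as I -> 1
```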

4,346 citations


"Global analyses of sea surface temp..." refers background or methods in this paper

  • ...[86] We compare the monthly climatology for 1971–2000 derived from HadISST1 with the adjusted OI.v2 1971–2000 climatology [Reynolds et al., 2002]....

  • ...The companion HadMAT1 runs monthly from 1856 on a 5° latitude-longitude grid and incorporates new corrections for the effect on NMAT of increasing deck (and hence measurement) heights....

  • ...HadISST1 has also been used to supply information for the ocean surface for the period 1958 through 1981 in the 40-year ECMWF Reanalysis (ERA40), with 2DVAR and OI.v2 [Reynolds et al., 2002] used thereafter....

  • ...These formed the outer boundary condition for the completion of the global fields on 1° area resolution (section 4.2) using the Poisson blending technique [Reynolds, 1988], which extended the observed and reconstructed data over the remaining data-void regions....

  • ...[4] In HadISST1, broad-scale fields of SST are reconstructed using one of these EOF-based techniques, reduced space optimal interpolation (RSOI)....

Journal ArticleDOI
TL;DR: The new NOAA operational global sea surface temperature (SST) analysis is described in this paper, which uses 7 days of in situ (ship and buoy) and satellite SST.
Abstract: The new NOAA operational global sea surface temperature (SST) analysis is described. The analyses use 7 days of in situ (ship and buoy) and satellite SST. These analyses are produced weekly and daily using optimum interpolation (OI) on a 1° grid. The OI technique requires the specification of data and analysis error statistics. These statistics are derived and show that the SST rms data errors from ships are almost twice as large as the data errors from buoys or satellites. In addition, the average e-folding spatial error scales have been found to be 850 km in the zonal direction and 615 km in the meridional direction. The analysis also includes a preliminary step that corrects any satellite biases relative to the in situ data using Poisson's equation. The importance of this correction is demonstrated using recent data following the 1991 eruptions of Mt. Pinatubo. The OI analysis has been computed using the in situ and bias-corrected satellite data for the period 1985 to present.
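
The abstract quotes e-folding spatial error scales of 850 km zonally and 615 km meridionally. A minimal OI sketch, assuming a separable exponential background-error correlation with those scales and uncorrelated observation errors (the operational scheme also performs the Poisson-equation satellite bias correction described above):

```python
import numpy as np

def oi_analysis(grid_xy, obs_xy, obs_anom, bg_var, obs_var,
                lx_km=850.0, ly_km=615.0):
    """Optimum interpolation of SST anomalies onto analysis points.

    Background-error correlation is taken as exp(-|dx|/lx - |dy|/ly),
    using the zonal/meridional e-folding scales quoted in the abstract.
    grid_xy, obs_xy : (n, 2) arrays of (x, y) positions in km
    obs_anom        : (n_obs,) observed anomalies
    """
    def corr(p, q):
        dx = p[:, None, 0] - q[None, :, 0]
        dy = p[:, None, 1] - q[None, :, 1]
        return np.exp(-np.abs(dx) / lx_km - np.abs(dy) / ly_km)

    B_oo = bg_var * corr(obs_xy, obs_xy)                 # obs-obs covariance
    B_go = bg_var * corr(grid_xy, obs_xy)                # grid-obs covariance
    w = np.linalg.solve(B_oo + obs_var * np.eye(len(obs_anom)), obs_anom)
    return B_go @ w                                      # analysed anomalies
```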

2,766 citations


"Global analyses of sea surface temp..." refers background or methods in this paper

  • ...National Oceanic and Atmospheric Administration (NOAA) optimal interpolation (OI) SST data sets [Reynolds and Smith, 1994; Reynolds et al., 2002] are globally complete, contain varying sea ice, have a spatial resolution of 1° latitude by 1° longitude (hereafter 1° area) and weekly temporal resolution....

  • ...[50] Optimal interpolation techniques tend to the first guess (in this case, zero anomaly, or the 1961–1990 climatology) in areas where there is no information [Reynolds and Smith, 1994]....

Journal ArticleDOI
TL;DR: This paper applies correspondence analysis to transitions between two or more time points; under the independence model, the expected transition matrix is the product of the margins of the table divided by the total sample size.
Abstract: Correspondence analysis is an exploratory tool for the analysis of associations between categorical variables, the results of which may be displayed graphically. For longitudinal data, two types of analysis can be distinguished: the first focuses on transitions, whereas the second investigates trends. For transitional analysis with two time points, an analysis of the transition matrix (showing the relative frequencies for pairs of categories) provides insight into the structure of departures from independence in the transitions. Transitions between more than two time points can also be studied simultaneously. In trend analyses, the trajectories of different groups are often compared. Examples for all these analyses are provided.
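
The two readings of correspondence analysis described in this abstract (a low-rank approximation of the table, and a decomposition of departures from independence) both reduce to an SVD of the table's standardized residuals. A minimal sketch returning principal coordinates, assuming a two-way table of counts with no zero margins; scaling conventions vary across texts:

```python
import numpy as np

def correspondence_analysis(N, n_dims=2):
    """CA of a contingency table via SVD of standardized residuals.

    N : (I, J) array of counts. Returns row and column principal
    coordinates for the first n_dims dimensions, plus singular values.
    """
    P = N / N.sum()                        # correspondence matrix
    r = P.sum(axis=1)                      # row masses
    c = P.sum(axis=0)                      # column masses
    E = np.outer(r, c)                     # rank-1 independence model
    S = (P - E) / np.sqrt(E)               # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row = (U[:, :n_dims] * sv[:n_dims]) / np.sqrt(r)[:, None]
    col = (Vt.T[:, :n_dims] * sv[:n_dims]) / np.sqrt(c)[:, None]
    return row, col, sv[:n_dims]
```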

2,104 citations
