Journal ArticleDOI

High-resolution palaeoclimatic records for the last millennium: Interpretation, integration and comparison with General Circulation Model control-run temperatures

01 May 1998-The Holocene (SAGE Publications)-Vol. 8, Iss: 4, pp 455-471
TL;DR: In this article, 17 temperature reconstructions, all extending back at least to the mid-seventeenth century, are averaged to form two annually resolved hemispheric series (NH10 and SH7), and a Principal Components Analysis (PCA) is performed on their yearly values over the period AD 1660-1970.
Abstract: Palaeoclimatology provides our only means of assessing climatic variations before the beginning of instrumental records. The various proxy variables used, however, have a number of limitations which must be adequately addressed and understood. Besides their obvious spatial and seasonal limitations, different proxies are also potentially limited in their ability to represent climatic variations over a range of different timescales. Simple correlations with instrumental data over the period since AD 1881 give some guide to which are the better proxies, indicating that coral- and ice-core-based reconstructions are poorer than tree-ring and historical ones. However, the quality of many proxy time series can deteriorate during earlier times. Suggestions are made for assessing proxy quality over longer periods than the last century by intercomparing neighbouring proxies and by comparisons with less temporally resolved proxies such as borehole temperatures. We have averaged 17 temperature reconstructions (representing various seasons of the year), all extending back at least to the mid-seventeenth century, to form two annually resolved hemispheric series (NH10 and SH7). Over the 1901-91 period, NH10 has 36% variance in common with average NH summer (June to August) temperatures and 70% on decadal timescales. SH7 has 16% variance in common with average SH summer (December to February) temperatures and 49% on decadal timescales, markedly poorer than the reconstructed NH series. The coldest year of the millennium over the NH is AD 1601, the coldest decade 1691-1700, and the seventeenth century is the coldest century. A Principal Components Analysis (PCA) is performed on yearly values for the 17 reconstructions over the period AD 1660-1970. The correlation between PC1 and NH10 is 0.92, even though PC1 explains only 13.6% of the total variance of all 17 series. A similar PCA is performed on thousand-year-long General Circulation Model (GCM) data from the Geophysical Fluid Dynamics Laboratory (GFDL) and the Hadley Centre (HADCM2), sampling these for the same locations and seasons as the proxy data. For GFDL, the correlation between its PC1 and its NH10 is 0.89, while for HADCM2 the PCs group markedly differently. Cross-spectral analyses are performed on the proxy data and the GFDL model data at two different frequency bands (0.02 and 0.03 cycles per year). Both analyses suggest that there is no large-scale coherency in the series on these timescales. This implies that if the proxy data are meaningful, it should be relatively straightforward to detect a coherent near-global anthropogenic signal in surface temperature data.
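As a rough illustration of the averaging and PCA steps described above, here is a minimal sketch in Python, assuming 17 standardized annual series held in a NumPy array; the data, dimensions, and names are placeholders, not the paper's reconstructions.

```python
# Minimal sketch of the PCA described above: PC1 of 17 annual proxy
# series compared against their simple average (analogous to NH10).
# All data below are synthetic placeholders, not the paper's series.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_series = 311, 17            # AD 1660-1970, 17 reconstructions
proxies = rng.standard_normal((n_years, n_series))

# Standardize each series, as is usual before a correlation-matrix PCA.
z = (proxies - proxies.mean(axis=0)) / proxies.std(axis=0, ddof=1)

# Eigendecomposition of the correlation matrix yields the PCs.
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(z, rowvar=False))
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()
pc1 = z @ eigvecs[:, order[0]]

# Correlate PC1 with the simple mean of all series; shared variance
# with any target is just the squared correlation.
mean_series = z.mean(axis=1)
r = np.corrcoef(pc1, mean_series)[0, 1]
print(f"PC1 explains {explained[0]:.1%}; |r(PC1, mean)| = {abs(r):.2f}")
```

The variance-in-common figures quoted above (36%, 70%, and so on) are squared correlations of exactly this kind.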


Citations
Journal ArticleDOI
14 Jul 2000-Science
TL;DR: A 21st-century global warming projection far exceeds the natural variability of the past 1000 years and is greater than the best estimate of global temperature change for the last interglacial.
Abstract: Recent reconstructions of Northern Hemisphere temperatures and climate forcing over the past 1000 years allow the warming of the 20th century to be placed within a historical context and various mechanisms of climate change to be tested. Comparisons of observations with simulations from an energy balance climate model indicate that as much as 41 to 64% of preanthropogenic (pre-1850) decadal-scale temperature variations was due to changes in solar irradiance and volcanism. Removal of the forced response from reconstructed temperature time series yields residuals that show similar variability to those of control runs of coupled models, thereby lending support to the models' value as estimates of low-frequency variability in the climate system. Removal of all forcing except greenhouse gases from the ∼1000-year time series results in a residual with a very large late-20th-century warming that closely agrees with the response predicted from greenhouse gas forcing. The combination of a unique level of temperature increase in the late 20th century and improved constraints on the role of natural variability provides further evidence that the greenhouse effect has already established itself above the level of natural variability in the climate system. A 21st-century global warming projection far exceeds the natural variability of the past 1000 years and is greater than the best estimate of global temperature change for the last interglacial.
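The removal of the forced response described in this abstract can be sketched as a regression: fit modelled forcing responses to the reconstructed series and inspect the residual. Below is a toy Python version under stated assumptions; the forcing series are synthetic stand-ins, not the study's energy-balance-model output.

```python
# Toy sketch of forced-response removal: regress synthetic solar and
# volcanic response series out of a "reconstruction" and keep the
# residual, whose variability could be compared with model control runs.
import numpy as np

rng = np.random.default_rng(1)
n = 1000                                   # 1,000 years
solar = np.sin(np.linspace(0.0, 30.0, n))  # placeholder solar response
volcanic = -np.abs(rng.standard_normal(n)) * (rng.random(n) < 0.05)
reconstruction = 0.3 * solar + 0.5 * volcanic + 0.1 * rng.standard_normal(n)

# Least-squares fit of the forced responses (plus an intercept) ...
X = np.column_stack([solar, volcanic, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, reconstruction, rcond=None)

# ... and the residual after removing the fitted forced component.
residual = reconstruction - X @ coef
print(f"residual std = {residual.std():.3f}")
```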

1,971 citations

Journal ArticleDOI
TL;DR: In this article, the authors attempt hemispheric temperature reconstructions with proxy data networks for the past millennium, focusing not just on the reconstructions, but the uncertainties therein, and important caveats.
Abstract: Building on recent studies, we attempt hemispheric temperature reconstructions with proxy data networks for the past millennium. We focus not just on the reconstructions, but the uncertainties therein, and important caveats. Though expanded uncertainties prevent decisive conclusions for the period prior to AD 1400, our results suggest that the latter 20th century is anomalous in the context of at least the past millennium. The 1990s was the warmest decade, and 1998 the warmest year, at moderately high levels of confidence. The 20th century warming counters a millennial-scale cooling trend which is consistent with long-term astronomical forcing.

1,742 citations

Journal ArticleDOI
05 Mar 2004-Science
TL;DR: Multiproxy reconstructions of monthly and seasonal surface temperature fields for Europe back to 1500 show that the late 20th- and early 21st-century European climate is very likely (>95% confidence level) warmer than that of any time during the past 500 years.
Abstract: Multiproxy reconstructions of monthly and seasonal surface temperature fields for Europe back to 1500 show that the late 20th- and early 21st-century European climate is very likely (>95% confidence level) warmer than that of any time during the past 500 years. This agrees with findings for the entire Northern Hemisphere. European winter average temperatures during the period 1500 to 1900 were reduced by ∼0.5°C (0.25°C for annual mean temperatures) compared to the 20th century. Summer temperatures did not experience systematic century-scale cooling relative to present conditions. The coldest European winter was 1708/1709; 2003 was by far the hottest summer.

1,665 citations

Journal ArticleDOI
10 Feb 2005-Nature
TL;DR: Northern Hemisphere temperatures for the past 2,000 years are reconstructed by combining low-resolution proxies with tree-ring data, using a wavelet transform technique to achieve timescale-dependent processing of the data.
Abstract: A number of reconstructions of millennial-scale climate variability have been carried out in order to understand patterns of natural climate variability, on decade to century timescales, and the role of anthropogenic forcing. These reconstructions have mainly used tree-ring data and other data sets of annual to decadal resolution. Lake and ocean sediments have a lower time resolution, but provide climate information at multicentennial timescales that may not be captured by tree-ring data. Here we reconstruct Northern Hemisphere temperatures for the past 2,000 years by combining low-resolution proxies with tree-ring data, using a wavelet transform technique to achieve timescale-dependent processing of the data. Our reconstruction shows larger multicentennial variability than most previous multi-proxy reconstructions, but agrees well with temperatures reconstructed from borehole measurements and with temperatures obtained with a general circulation model. According to our reconstruction, high temperatures--similar to those observed in the twentieth century before 1990--occurred around AD 1000 to 1100, and minimum temperatures that are about 0.7 K below the average of 1961-90 occurred around AD 1600. This large natural variability in the past suggests an important role of natural multicentennial variability that is likely to continue.
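A minimal sketch of this kind of timescale-dependent merging, assuming PyWavelets and a simple discrete wavelet split: low-frequency content comes from a slowly varying proxy and high-frequency content from tree rings. The wavelet, split level, and series below are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch of merging two proxy series by timescale with a wavelet
# transform: approximation (slow) coefficients from a low-resolution
# proxy, detail (fast) coefficients from tree rings. Synthetic data.
import numpy as np
import pywt

rng = np.random.default_rng(2)
n = 2048                                       # ~2,000 annual values
tree_rings = rng.standard_normal(n)            # placeholder high-res series
sediments = np.cumsum(rng.standard_normal(n)) * 0.01  # placeholder low-res

level = 6                                      # scales beyond ~64 yr
c_tree = pywt.wavedec(tree_rings, "db4", level=level)
c_sed = pywt.wavedec(sediments, "db4", level=level)

# Take the approximation band from the sediments, detail bands from
# the tree rings, and invert the transform.
merged = [c_sed[0]] + c_tree[1:]
reconstruction = pywt.waverec(merged, "db4")
print(reconstruction.shape)                    # (2048,)
```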

1,573 citations

Journal ArticleDOI
22 Mar 2002-Science
TL;DR: It is demonstrated that carefully selected tree-ring chronologies from 14 sites in the Northern Hemisphere (NH) extratropics can preserve such coherent large-scale, multicentennial temperature trends if proper methods of analysis are used.
Abstract: Preserving multicentennial climate variability in long tree-ring records is critically important for reconstructing the full range of temperature variability over the past 1000 years. This allows the putative “Medieval Warm Period” (MWP) to be described and to be compared with 20th-century warming in modeling and attribution studies. We demonstrate that carefully selected tree-ring chronologies from 14 sites in the Northern Hemisphere (NH) extratropics can preserve such coherent large-scale, multicentennial temperature trends if proper methods of analysis are used. In addition, we show that the average of these chronologies supports the large-scale occurrence of the MWP over the NH extratropics.

1,372 citations

References
Book
01 Jan 1968
TL;DR: This classic text presents the theory and applications of the spectral analysis of time series, including the estimation of power spectra and of cross-spectral quantities such as coherency and phase.
Abstract: (1970). Spectral Analysis and its Applications. Technometrics: Vol. 12, No. 1, pp. 174-175.

4,220 citations


"High-resolution palaeoclimatic reco..." refers methods in this paper

  • ...The coherency squared and phase were estimated from the complex Fourier transform of the cross covariance functions in the standard manner (cf. Jenkins and Watts, 1968 )....

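The excerpt above refers to the standard cross-spectral estimates of coherency squared and phase. As a minimal sketch, the Python snippet below obtains the same quantities from Welch-averaged cross-periodograms (scipy) rather than by transforming the cross-covariance function directly; the series and segment length are placeholders.

```python
# Coherency squared and phase between two series sharing a common
# signal, estimated from smoothed (Welch) spectra. Synthetic data.
import numpy as np
from scipy.signal import csd, welch

rng = np.random.default_rng(3)
n = 1000                                  # 1,000 "annual" values
common = rng.standard_normal(n)
x = common + rng.standard_normal(n)
y = common + rng.standard_normal(n)

f, pxy = csd(x, y, fs=1.0, nperseg=256)   # complex cross-spectrum
_, pxx = welch(x, fs=1.0, nperseg=256)
_, pyy = welch(y, fs=1.0, nperseg=256)

coh2 = np.abs(pxy) ** 2 / (pxx * pyy)     # coherency squared
phase = np.angle(pxy)                     # phase spectrum (radians)

# Inspect the bands discussed in the paper (~0.02 and 0.03 cycles/yr).
for target in (0.02, 0.03):
    i = int(np.argmin(np.abs(f - target)))
    print(f"f={f[i]:.3f}  coh^2={coh2[i]:.2f}  phase={phase[i]:+.2f}")
```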

Book
11 Nov 2012
TL;DR: This classic text gives a lucid summary of basic dendrochronology and of the statistical methods used to extract and calibrate climate information from tree-ring series.
Abstract: This classic title, originally printed in 1976, contains a lucid description and summary of basic dendrochronology, of the statistical methods used in analysing tree-ring series, and of their application in dendroclimatology, where ring growth is calibrated against climate to reconstruct past climate records.

4,206 citations

Journal ArticleDOI
23 Apr 1998-Nature
TL;DR: In this article, spatially resolved global reconstructions of annual surface temperature patterns over the past six centuries are derived from the multivariate calibration of widely distributed high-resolution proxy climate indicators.
Abstract: Spatially resolved global reconstructions of annual surface temperature patterns over the past six centuries are based on the multivariate calibration of widely distributed high-resolution proxy climate indicators. Time-dependent correlations of the reconstructions with time-series records representing changes in greenhouse-gas concentrations, solar irradiance, and volcanic aerosols suggest that each of these factors has contributed to the climate variability of the past 400 years, with greenhouse gases emerging as the dominant forcing during the twentieth century. Northern Hemisphere mean annual temperatures for three of the past eight years are warmer than any other year since (at least) ad 1400.
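As a toy illustration of multivariate calibration, the sketch below regresses an instrumental target on a proxy matrix over an overlap window and applies the fitted weights to the full period. This is a deliberate simplification with synthetic data, not the paper's actual method.

```python
# Toy multivariate calibration: fit proxy weights against instrumental
# temperatures over a calibration window, then reconstruct the full
# period with those weights. All data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(4)
n_total, n_cal, n_proxy = 600, 100, 12     # 600 yrs, last 100 instrumental
signal = np.cumsum(rng.standard_normal(n_total)) * 0.05
proxies = signal[:, None] + rng.standard_normal((n_total, n_proxy))
instrumental = signal[-n_cal:] + 0.1 * rng.standard_normal(n_cal)

# Least-squares fit (with intercept) over the calibration window.
X_cal = np.column_stack([proxies[-n_cal:], np.ones(n_cal)])
beta, *_ = np.linalg.lstsq(X_cal, instrumental, rcond=None)

# Apply the fitted weights to the whole proxy matrix.
X_full = np.column_stack([proxies, np.ones(n_total)])
reconstruction = X_full @ beta
print(reconstruction.shape)                # (600,)
```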

1,720 citations

Journal ArticleDOI
TL;DR: In this paper, a simple relative homogeneity test is presented and applied to a precipitation data set from south-western Sweden; the significant breaks varied from 5 to 25 per cent for this data set and probably reflect a serious source of uncertainty in studies of climatic trends and climatic change all over the world.
Abstract: In climate research it is important to have access to reliable data which are free from artificial trends or changes. One way of checking the reliability of a climate series is to compare it with surrounding stations. This is the idea behind all tests of relative homogeneity. Here we present a simple homogeneity test and apply it to a precipitation data set from south-western Sweden. More precisely, we apply it to ratios between station values and some reference values. The reference value is a form of mean value from surrounding stations. It is found valuable to include short and incomplete series in the reference value. The test can be used as an instrument for quality control as far as the mean level of, for instance, precipitation is concerned. In practice it should be used along with the available station history. Several non-homogeneities are present in these series and probably reflect a serious source of uncertainty in studies of climatic trends and climatic change all over the world. The significant breaks varied from 5 to 25 per cent for this data set. An example illustrates the importance of using relevant climatic normals that refer to the present measurement conditions when constructing maps of anomalies.
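A compact sketch of the ratio-based test the abstract describes, in the single-shift form commonly known as Alexandersson's standard normal homogeneity test; the break statistic below is the usual textbook version and the station data are synthetic.

```python
# Relative homogeneity test on ratios of a candidate station to a
# reference series (mean of neighbours), with an SNHT-style statistic
# that scans all split points for a shift in the mean. Synthetic data.
import numpy as np

rng = np.random.default_rng(5)
n = 80                                     # 80 years of precipitation
reference = 100.0 + 10.0 * rng.standard_normal(n)   # neighbour mean
candidate = reference * rng.normal(1.0, 0.05, n)
candidate[40:] *= 1.15                     # artificial 15% break

# Standardized ratios of station to reference values.
q = candidate / reference
z = (q - q.mean()) / q.std(ddof=1)

# T(a) = a * zbar1^2 + (n - a) * zbar2^2; its maximum locates the break.
t = np.array([a * z[:a].mean() ** 2 + (n - a) * z[a:].mean() ** 2
              for a in range(1, n)])
a_hat = int(np.argmax(t)) + 1
print(f"T_max = {t.max():.1f} at year index {a_hat}")
```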

1,474 citations


"High-resolution palaeoclimatic reco..." refers methods in this paper

  • ...Numerous methods have been developed to assess homogeneity, both of the mean and the variance, of a temperature series (e.g. Bradley and Jones, 1985; Karl and Williams, 1987; Alexandersson, 1986; Rhoades and Salinger, 1993; Peterson and Easterling, 1994; Alexandersson and Moberg, 1997)....
