
Explaining Extreme Events of 2012 from a Climate Perspective

01 Sep 2013-Bulletin of the American Meteorological Society (American Meteorological Society)-Vol. 94, Iss: 9
TL;DR: This paper presents 19 analyses, by 18 different research groups often using quite different methodologies, of 12 extreme events that occurred in 2012; the differences between analyses also provide insight into the structural uncertainty of event attribution, the uncertainty that arises directly from differences in analysis methodology.
Abstract: Attribution of extreme events is a challenging science and one that is currently undergoing considerable evolution. In this paper are 19 analyses by 18 different research groups, often using quite different methodologies, of 12 extreme events that occurred in 2012. In addition to investigating the causes of these extreme events, the multiple analyses of four of the events, the high temperatures in the United States, the record low levels of Arctic sea ice, and the heavy rain in northern Europe and eastern Australia, provide an opportunity to compare and contrast the strengths and weaknesses of the various methodologies. The differences also provide insights into the structural uncertainty of event attribution, that is, the uncertainty that arises directly from the differences in analysis methodology. In these cases, there was considerable agreement between the different assessments of the same event. However, different events had very different causes. Approximately half the analyses found some evidence that anthropogenically caused climate change was a contributing factor to the extreme event examined, though the effects of natural fluctuations of weather and climate on the evolution of many of the extreme events played key roles as well.

Summary

Introduction

  • Furthermore, it is also desirable that the samples are uniformly distributed and/or well separated on the sphere [10]–[12].
  • These schemes have well separated points and therefore the sampling points exhibit superior geometrical properties.
  • In section IV, the authors formulate the geometrical properties and carry out the comparative analysis.

III. SAMPLING SCHEMES ON THE SPHERE

  • The authors focus on the recently developed sampling schemes on the sphere which permit accurate computation of SHT of a bandlimited signal from its samples.
  • The authors first review these sampling schemes before analysing their geometrical properties in the next section.
  • For a signal band-limited at L, the authors use N to denote the spatial dimensionality, that is, the number of samples required by each sampling scheme to compute the SHT or, equivalently, to represent the band-limited signal accurately.

A. Gauss-Legendre Quadrature based Sampling

  • This sampling scheme is devised on the basis of the well known Gauss-Legendre quadrature on the sphere [21] and is therefore referred to as Gauss-Legendre (GL) sampling scheme.
  • For a signal band-limited at L, this scheme takes samples on L iso-latitude rings, each with 2L − 1 equiangularly placed samples along longitude φ, resulting in a total requirement of NGL = L(2L − 1) samples for the exact computation of the SHT.
  • The location of the rings along colatitude θ is given by the roots of the Legendre polynomials of order L as dictated by the Gauss-Legendre quadrature to discretize the integral given in (3).
  • Variants of the Gauss-Legendre quadrature scheme that require fewer samples have also been proposed (e.g., [22]).
  • However, these variants do not support exact or sufficiently accurate computation of the SHT.
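As a concrete sketch of the GL grid just described (an illustration, not the authors' code), the ring colatitudes and per-ring longitudes can be generated with NumPy's Gauss-Legendre routine; the function and variable names below are our own.

```python
import numpy as np

def gauss_legendre_sampling(L):
    """Illustrative Gauss-Legendre (GL) sampling grid for band-limit L.

    The L ring colatitudes are the arccos of the roots of the Legendre
    polynomial of degree L; each ring carries 2L - 1 equiangularly
    placed longitudes, so NGL = L * (2L - 1) samples in total.
    """
    nodes, weights = np.polynomial.legendre.leggauss(L)  # roots of P_L in (-1, 1)
    thetas = np.arccos(nodes)                            # ring colatitudes
    phis = 2.0 * np.pi * np.arange(2 * L - 1) / (2 * L - 1)
    return thetas, phis, weights

L = 10
thetas, phis, _ = gauss_legendre_sampling(L)
print(thetas.size * phis.size)  # NGL = L * (2L - 1) = 190 samples for L = 10
```

The returned quadrature weights are the ones used to discretize the SHT integral in (3).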

C. Optimal-Dimensionality Sampling Scheme

  • Like the GL and equiangular sampling schemes, it is also an iso-latitude sampling scheme on the sphere, taking samples on L rings along colatitude.
  • Let θk, k = 0, 1, . . . , L − 1, denote the locations of the rings along colatitude, chosen such that the accuracy of the computation of the SHT is maximized.
  • In total, the number of samples required by the optimal-dimensionality sampling scheme is NO = Σ_{k=0}^{L−1} (2k + 1) = L². (7) As an example, the samples on the sphere for the optimal-dimensionality sampling scheme are shown in Fig. 1(c) for L = 10.

D. Spherical Designs

  • A set of points on the sphere is called a spherical design if the integral over the sphere of a signal of maximum spherical polynomial degree t (equivalently, maximum band-limit t + 1) can be evaluated as an average over the samples of the signal [11].
  • For the computation of the SHT using the points given by a spherical design, the authors first note that the SHT requires the evaluation of the integral given in (3), where the integrand is the product of a signal band-limited at L and a spherical harmonic Yℓm(θ, φ).
  • Since the authors require the integral for all ℓ < L, |m| ≤ ℓ, the maximum polynomial degree of the integrand is 2L − 2. Consequently, they require a (2L − 2)-spherical design for the sampling of a band-limited signal such that the SHT can be computed accurately.

E. Extremal Points on the Sphere

  • For a given band-limit L, the extremal (maximum determinant) systems are sets of L2 extremal points on the sphere which, by definition, maximize the determinant of a basis matrix (see [10] for details).
  • For the spherical harmonic basis, extremal points are supported by an interpolatory cubature rule with positive weights and therefore enable the accurate computation of the SHT of a signal band-limited at L using NES = L² sampling points of the extremal system.
  • The authors analyse the accuracy of SHT computation later in the paper.
  • The sampling scheme based on the points² of the extremal system will be referred to as the extremal system sampling scheme.

A. Sampling Efficiency

  • The sampling efficiency is a fundamental property of any sampling scheme. It is defined as the ratio of the dimensionality of the subspace formed by the band-limited signals (that is, the number of coefficients required to represent a band-limited signal in the harmonic domain) to the number of samples required to accurately compute the SHT.
  • For a band-limit L, the authors define the sampling efficiency of any sampling scheme, denoted by EL, as the ratio of the dimension of the subspace HL formed by the band-limited signals to the number of samples, denoted by N, required to compute the SHT of a band-limited signal f ∈ HL.
  • It is evident that the optimal-dimensionality sampling and extremal points attain almost twice (exactly, as L → ∞) the sampling efficiency achieved by the equiangular, GL and spherical design sampling schemes.
  ¹The spherical designs are available at http://web.maths.unsw.edu.au/~rsw/.
  ²We use the points of extremal systems publicly available at http://web.maths.unsw.edu.au/~rsw/Sphere/Extremal/New/extremal1.html.
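Using only the sample counts stated in the text (NGL = L(2L − 1) and NO = L²), the efficiency gap noted above can be checked numerically; this is a sketch with our own function names.

```python
def sampling_efficiency(L, N):
    """E_L: dimension of the band-limited subspace, L^2, over sample count N."""
    return L**2 / N

def N_GL(L):
    """Gauss-Legendre sampling: L rings of 2L - 1 samples each."""
    return L * (2 * L - 1)

def N_opt(L):
    """Optimal-dimensionality sampling: sum_{k=0}^{L-1} (2k + 1) = L^2."""
    return sum(2 * k + 1 for k in range(L))

for L in (10, 50, 1000):
    print(L, sampling_efficiency(L, N_GL(L)), sampling_efficiency(L, N_opt(L)))
# E_L for GL tends to 1/2 from below as L grows, while optimal-dimensionality
# sampling stays at 1: roughly twice the GL value, exactly twice as L -> infinity.
```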

B. Minimum Geodesic Distance and Packing Radius

  • For a set of sampling points on the sphere, the minimum geodesic distance is defined as the minimum distance between any two points in the set.
  • It is also defined as twice the packing radius on the sphere.
  • For each of the sampling schemes presented in Section III, the authors plot the normalized minimum geodesic distance σn(S) for different band-limits 10 ≤ L ≤ 50 in Fig. 2, where it can be observed that the extremal system of points, the spherical design and the optimal-dimensionality sampling all have well separated points on the sphere.
  • The normalized minimum geodesic distance curves obtained using the points of the equiangular and Gauss-Legendre quadrature based sampling schemes are well below the lower bound values for all band-limits 10 ≤ L ≤ 50.
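The minimum geodesic distance defined above is straightforward to compute for any candidate point set; below is a small NumPy sketch (the paper's normalization σn(S) is not reproduced), checked on the six octahedron vertices.

```python
import numpy as np

def min_geodesic_distance(points):
    """Minimum great-circle distance over all pairs of unit vectors.

    `points` is an (N, 3) array of unit vectors; the geodesic distance
    between unit vectors u and v on the unit sphere is arccos(u . v),
    so the minimum distance corresponds to the largest off-diagonal dot.
    """
    dots = points @ points.T
    np.fill_diagonal(dots, -1.0)  # exclude each point's pairing with itself
    return float(np.arccos(np.clip(dots.max(), -1.0, 1.0)))

# Octahedron vertices: nearest pairs are orthogonal, i.e. pi/2 apart.
octahedron = np.array([[1, 0, 0], [-1, 0, 0],
                       [0, 1, 0], [0, -1, 0],
                       [0, 0, 1], [0, 0, -1]], dtype=float)
print(min_geodesic_distance(octahedron))  # pi / 2, about 1.5708
```

The packing radius mentioned above is then simply half of this value.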

D. Mesh Ratio

  • The mesh ratio is the ratio of the covering radius to the packing radius of identical spherical caps on the surface of a sphere.
  • It can also be observed that the extremal system sampling scheme has the smallest mesh ratio.
  • The authors also normalize with the sampling efficiency for a meaningful comparison.
  • However, as s is increased, the Riesz s-energy of both the equiangular and GL sampling points moves away from that of the extremal system sampling scheme.
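A numerical sketch of the mesh ratio (covering radius over packing radius): the packing radius is half the minimum geodesic distance, while the covering radius is estimated here by Monte Carlo probing, so the result is approximate; the octahedron test set is purely illustrative.

```python
import numpy as np

def packing_radius(points):
    """Half the minimum pairwise geodesic distance of the unit vectors."""
    dots = points @ points.T
    np.fill_diagonal(dots, -1.0)
    return 0.5 * np.arccos(np.clip(dots.max(), -1.0, 1.0))

def covering_radius(points, n_probe=20000, seed=0):
    """Monte Carlo estimate: geodesic distance from the worst-covered
    probe point on the sphere to its nearest sample point."""
    rng = np.random.default_rng(seed)
    probes = rng.normal(size=(n_probe, 3))
    probes /= np.linalg.norm(probes, axis=1, keepdims=True)  # uniform on sphere
    nearest = np.arccos(np.clip(probes @ points.T, -1.0, 1.0).max(axis=1))
    return nearest.max()

def mesh_ratio(points):
    return covering_radius(points) / packing_radius(points)

octahedron = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                       [0, -1, 0], [0, 0, 1], [0, 0, -1]], dtype=float)
print(mesh_ratio(octahedron))  # about 1.2: covering arccos(1/sqrt(3)), packing pi/4
```

A mesh ratio close to 1 indicates points that pack and cover the sphere almost equally well, which is the sense in which smaller is better above.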

F. Discussion

  • Among the geometrical properties analysed for the different sampling schemes, the sampling efficiency, mesh norm and Riesz s-energy encapsulate the other properties and therefore serve as measures of the uniform distribution of sampling points.
  • Analysis of geometrical properties of the sampling scheme reveals that the mesh ratio and s-energy grow with the band-limit for the equiangular and Gauss-Legendre sampling schemes which is a consequence of the fact that these sampling schemes require dense sampling at the poles.
  • In contrast, the optimal dimensionality, spherical design and extremal systems sampling schemes exhibit desired geometrical properties.
  • The mesh ratio achieved by the optimal-dimensionality sampling is a little higher than that of the extremal system, yet it is very small compared to that of the equiangular schemes.
  • Summarizing their analysis, the authors propose the extremal system sampling scheme for applications [27], [28] where the signals have small band-limits (L = 10–50), due to its superior geometrical properties.

V. CONCLUSIONS

  • The authors have carried out a comparative analysis of the geometrical properties of the sampling schemes that support the accurate representation of band-limited signals on the sphere.
  • These schemes included equiangular sampling, Gauss-Legendre (GL) quadrature based sampling, optimal-dimensionality sampling, the sampling points of extremal systems and spherical designs.
  • The authors have illustrated that the optimal-dimensionality, extremal system and spherical design sampling schemes have uniform distributions and well separated points on the sphere.
  • Equiangular and GL sampling schemes exhibit poor geometrical properties due to the dense sampling near the poles.
  • The extremal system sampling scheme has superior geometrical properties, and the authors propose its use for the representation of band-limited signals at small band-limits.




Using a variety of methodologies, six extreme events of the previous year are explained from a climate perspective.
Every year, the Bulletin of the AMS publishes an
annual report on the State of the Climate [e.g.,
see the Blunden and Arndt (2012) supplement
to this issue]. That report does an excellent job of
documenting global weather and climate conditions
of the previous year and putting them into accurate
historical perspective. But it does not address the
causes. One of the reasons is that the scientists
working at understanding the causes of various
extreme events are generally not the same scientists
analyzing the magnitude of the events and writing
the State of the Climate. Another reason is that
explaining the causes of specific extreme events in
near-real time is severely stretching the current state
of the science.
Our report is a way to foster the growth of
this science. Other reports, such as those by the
Intergovernmental Panel on Climate Change (IPCC),
have focused on understanding changes over longer
time scales and larger geographic regions. For
example, assessing the state of the climate and science,
IPCC (Field et al. 2012) concluded that “it is likely
that anthropogenic influences have led to warming
of extreme daily minimum and maximum tempera-
tures at the global scale” and that “there is medium
confidence¹ that anthropogenic influences have
EXPLAINING EXTREME EVENTS OF 2011 FROM A CLIMATE PERSPECTIVE
Thomas C. Peterson, Peter A. Stott and Stephanie Herring, Editors

INTRODUCTION
Peter A. Stott, Met Office Hadley Centre, United Kingdom; Thomas C. Peterson, NOAA National Climatic Data Center, Asheville, North Carolina; Stephanie Herring, NOAA Office of Program Planning and Integration, Silver Spring, Maryland
AFFILIATIONS: Peterson, NOAA National Climatic Data Center, Asheville, North Carolina; Stott, Met Office Hadley Centre, United Kingdom; Herring, Office of Program Planning and Integration, NOAA, Silver Spring, Maryland
CORRESPONDING EDITOR: Thomas C. Peterson, NOAA
National Climatic Data Center, 151 Patton Avenue, Asheville, NC
28803
E-mail: thomas.c.peterson@noaa.gov
The abstract for this article can be found in this issue, following the
table of contents.
DOI: 10.1175/BAMS-D-12-00021.1
In final form 4 May 2012
©2012 American Meteorological Society
¹ Likely indicates probability greater than 66%; see IPCC guidance on uncertainty language (Mastrandrea et al. 2010), which also includes guidance on expression of levels of confidence.

contributed to intensification of extreme precipitation at the global scale.”
This first edition of what is intended to be an
annual report starts out with an assessment on causes
of historical changes in temperature and precipitation
extremes worldwide to provide a long-term perspec-
tive for the events discussed in 2011. That section also
considers the use of the term “extreme” in climate
science so as to provide a context for the extreme
events discussed in the rest of the report. The report
then goes on to examine only six extreme events
assessed by teams of experts from around the world.
We are not attempting to be comprehensive nor does
our selection of extreme events reflect any judgment
about the importance of the events discussed here
relative to the many other extreme events around
the world in 2011.
By choosing a few noteworthy events to analyze
there could be a risk of selection bias if the events
chosen are thought of as representative of the weather
observed in 2011, which they are not. However, our
purpose here is to provide some illustrations of a
range of possible methodological approaches rather
than to be comprehensive. We hope that the examples
we have chosen will serve to stimulate the develop-
ment of attribution science and lead to submissions
that, in future years, look at different regions and a
wider range of extreme events. Developing objective
criteria for defining extreme weather and climate
events ahead of time, and applying predetermined
methodologies, should minimize the risk of bias
resulting from selective choice of criteria based on
what actually occurred (e.g., Stott et al. 2004).
Currently, attribution of single extreme events to
anthropogenic climate change remains challenging
(Seneviratne et al. 2012). In the past it was often stated
that it simply was not possible to make an attribution
statement about an individual weather or climate
event. However, scientific thinking on this issue has
moved on and now it is widely accepted that attribu-
tion statements about individual weather or climate
events are possible, provided proper account is taken
of the probabilistic nature of attribution (Nature
Publishing Group 2011).
One analogy of the effects of climate change on
extreme weather is with a baseball player (or to choose
another sport, a cricketer) who starts taking steroids
and afterwards hits on average 20% more home runs
(or sixes) in a season than he did before (Meehl 2012).
For any one of his home runs (sixes) during the years
the player was taking steroids, you would not know
for sure whether it was caused by steroids or not. But
you might be able to attribute his increased number
to the steroids. And given that steroids have resulted
in a 20% increased chance that any particular swing
of the player’s bat results in a home run (or a six), you
would be able to make an attribution statement that,
all other things being equal, steroid use had increased
the probability of that particular occurrence by 20%.
The job of the attribution assessment is to distin-
guish the effects of anthropogenic climate change or
some other external factor (steroids in the sporting
analogy) from natural variability (e.g., in the baseball
analogy, the player’s natural ability to hit home runs
or the configuration of a particular stadium).
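The analogy's probabilistic statement can be written as a risk (probability) ratio. The sketch below uses illustrative numbers; only the "more than doubled" figure for the 2003 European heat wave comes from the report, and the attributable-fraction formula is the standard FAR relation, not a quantity the report computes here.

```python
def risk_ratio(p_with, p_without):
    """How much more probable the event is with the external factor
    (steroids in the analogy, anthropogenic forcing in attribution)."""
    return p_with / p_without

def attributable_fraction(p_with, p_without):
    """Fraction of attributable risk, FAR = 1 - p_without / p_with."""
    return 1.0 - p_without / p_with

# Steroid analogy: a 20% increased chance per swing -> risk ratio 1.2,
# so about 1/6 of each home run's probability is attributable to steroids.
print(risk_ratio(0.06, 0.05))             # ~1.2
print(attributable_fraction(0.06, 0.05))  # ~0.167

# 2003 European heat wave: odds "very likely more than doubled" -> RR >= 2,
# i.e. at least half the event's probability attributable to human influence.
print(attributable_fraction(0.10, 0.05))  # 0.5
```

The attribution assessment's job, in these terms, is to estimate p_with and p_without while holding natural variability fixed.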
There have been relatively few studies published in
the literature that attempt to explain specific extreme
events from a climate perspective and this report
covers some of the main methodological approaches
that have been published to date. A position paper
produced for the World Climate Research Program
(Stott et al. 2012) reviewed some of these studies
including attribution assessments of the 2000 UK
floods (Pall et al. 2011), the 2003 European heat wave
(Stott et al. 2004), the cool year of 2008 in the United
States (Perlwitz et al. 2009) and the 2010 Russian
heat wave (Dole et al. 2011). Such studies have dem-
onstrated how the changed odds of an individual
extreme weather or climate event can be calculated
and attributed—very likely more than doubled for
the 2003 European heat wave. In other cases, such
as the case of the cool year of 2008 in the United
States, conditions apparently inconsistent with the
expected effects of ongoing climate change can be
explained by the interplay of human influence on
climate decreasing the odds of such extremes and
natural variability, La Niña in the case of the U.S.
temperatures in 2008, increasing the odds.
This report also considers other approaches dis-
tinct from those that seek to apportion changed odds.
Analyzing how temperatures within particular flow
patterns have changed helps to illustrate how long-
term climate change is altering the typical weather
associated with a particular flow regime. Such a
regime-based approach (Cattiaux et al. 2010a) has
shown how the cold northwestern European winter
of 2009/10, associated largely with a very negative
North Atlantic Oscillation (NAO), would have been
even colder were it not for a long-term warming
associated with ongoing climate change. Other
related approaches involve using statistical models
or climate models to tease apart the effects of climate
variability and long-term warming on the observed
occurrence of particular extreme weather events. By
not quantifying the link to human emissions, such
analyses do not fully answer the attribution question,

but they do help to put extreme events into a climate
perspective.
While the report includes three examples of the
odds-based attribution analyses discussed earlier,
the challenges of running models and analyzing data
in time for this report have meant that only the final
analysis (of the cold UK winter of 2010/11, section 8)
has the climate model simulations available to explicitly calculate the changed odds attributable to human
influence. Therefore this new report is a step along
the road towards the development of the regular near-
real time attribution systems advocated by Stott et al.
(2011) rather than the final product. While there may
be an increasing focus on such near-real time attribu-
tion activities by operational centers around the world,
there remains much underpinning science to be done
in the development of such a service. An informal
group of scientists, the Attribution of Climate-Related
Events group (ACE; Schiermeier 2011), is meeting in
September 2012 to discuss how to take such activities
further (www.metoffice.gov.uk/research/climate/climate-monitoring/attribution/ace).
One important aspect we hope to help promote
through these reports is a focus on the questions
being asked in attribution studies. Often there is a
perception that some scientists have concluded that a
particular weather or climate event was due to climate
change whereas other scientists disagree. This can, at
times, be due to confusion over exactly what is being
attributed. For example, whereas Dole et al. (2011)
reported that the 2010 Russian heatwave was largely
natural in origin, Rahmstorf and Coumou (2011)
concluded it was largely anthropogenic. In fact, the
different conclusions largely reflect the different ques-
tions being asked, the focus on the magnitude of the
heatwave by Dole et al. (2011) and on its probability by
Rahmstorf and Coumou (2011), as has been demon-
strated by Otto et al. (2012). This can be particularly
confusing when communicated to the public.
We hope that this new venture will help develop
the means of communicating assessments of the
extent to which natural and anthropogenic factors
contribute to the extreme weather or climate events
of a particular year. As such we seek your reactions to
this report, which will be invaluable in determining
how we should continue in future years. It will also
help inform the dialog about how best to enable a
wider public to appreciate the links between the
weather they are experiencing and the effects of long-
term climate change.
The occurrence of high-impact extreme weather
and climate variations invariably leads to
questions about whether the frequency or intensity
of such events have changed, and whether human influ-
ence on the climate system has played a role. Research
on these questions has intensified in recent years, cul-
minating in two recent assessments (Karl et al. 2008;
Field et al. 2012), and in proposals to formalize “event
attribution” as a global climate service activity (Stott
et al. 2012). In order to provide historical context for
later sections, this section discusses the extent to which
human influence has caused long-term changes in the
frequency and intensity of some types of extremes.
The nature of extreme events. The term “extreme”
is used in a number of contexts in climate science.
It refers to events that may in fact not be all that
extreme, such as the occurrence of a daily maximum
temperature that exceeds the 90th percentile of daily
variability as estimated from a climatological base
period, or it may refer to rare events that lie in the far
tails of the distribution of the phenomenon of interest.
A characteristic of extremes is that they are under-
stood within a context—and thus seasonal or annual
means may be “extreme” just as an unusual short-term
event, such as a daily precipitation accumulation,
may be extreme. Certain phenomena, such as tropi-
cal cyclones that have been classified on the Saffir–
Simpson scale, or tornadoes that have been classified
on the Fujita scale, are considered extreme as a class.
The general definition of extremes that was adopted
by the IPCC for its Special Report on Extremes (Field
et al. 2012) applies to most extremes considered in this
report, and across the range of space and time scales
that are considered here. That definition describes an
extreme as the “occurrence of a value of a weather or
climate variable above (or below) a threshold value
near the upper (or lower) ends of the range of observed
HISTORICAL CONTEXT
Francis W. Zwiers, Pacific Climate Impacts Consortium, University of Victoria, Victoria, British Columbia, Canada; Gabriele C. Hegerl, School of Geosciences, University of Edinburgh, Edinburgh, United Kingdom; Seung-Ki Min, CSIRO Marine and Atmospheric Research, Aspendale, Victoria, Australia; Xuebin Zhang, Climate Research Division, Environment Canada, Toronto, Ontario, Canada

values of the variable.” A full discussion of the defini-
tion of an extreme can be found in Seneviratne et al.
(2012). In addition, Zwiers et al. (2012, unpublished
manuscript) provide a discussion of the language sur-
rounding extremes that is used in the climate sciences.
Challenges in detection and attribution of extremes.
The discussion in this section reflects the fact that
most detection and attribution research on long-term
changes in the probability and frequency of extremes
thus far has focused on short duration events that
can be monitored using long records of local daily
temperature and precipitation observations. These
changes are generally captured as indices that docu-
ment the frequency or intensity of extremes in the
observed record rather than focusing on individual
rare events. In contrast, many of the events consid-
ered in later sections of this report are individual
events, often of longer duration than the extremes
considered here, and are also usually events with
longer return periods. Nevertheless, the finding that
human influence is detectable in some types of short
duration events that can be conveniently monitored
from meteorological observations provides important
context for the interpretation of other types of events.
For example, feedbacks and physical processes that
influence individual large events (Fischer et al. 2007;
Seneviratne et al. 2010) will often also be at play in
events that are reflected in indices. Thus, index-
based studies are helpful for providing context for
the attribution of individual events, and for evaluating
the ability of models to realistically simulate events
that are affected by different feedbacks from those
affecting mean climate.
While not discussed in this section, the detection
and attribution of changes in the mean state of the
climate system often also provides important context
for the understanding of individual extreme events.
An example is the European 2003 heat wave, which
can be characterized both by very extreme warm
daily maximum and minimum temperatures, and by
an extremely warm summer season. The demonstra-
tion that human factors had influenced the climate
of southern Europe in a quantifiable way over the
latter part of the twentieth century was an important
element in establishing that human influence had
probably substantially increased the likelihood of an
extreme warm summer like that experienced in the
region in 2003 (Stott et al. 2004).
The frequency and intensity of extremes can be
affected by both the internal variability of the climate
system and external forcing, and the mechanisms
involved can be both direct (e.g., via a change in the
local energy balance) and indirect (e.g., via circula-
tion changes). This makes the attribution of events to
causes very challenging, since extreme events in any
location are rare by definition. However, global-scale
data make it possible to determine whether broadly
observed changes in the frequency and intensity of
extremes are consistent with changes expected from
human influences, and inconsistent with other pos-
sibilities such as climate variability. Results from
such detection and attribution studies provide the
scientific underpinning of work determining changes
in the likelihood of individual events.
Observed changes in extremes. We briefly consider
historical changes in frequency and intensity of
daily temperature and precipitation extremes. There
is a sizable literature on such events, in part because
reliable long-term monitoring data are gathered
operationally by meteorological services in many
countries. Many other areas remain understudied,
such as whether there have been changes in the
complex combinations of factors that trigger impacts
in humans and ecosystems (e.g., Hegerl et al. 2011),
or areas that are subject to greater observational
and/or process knowledge uncertainty, such as the
monitoring and understanding of changes in tropical
cyclone frequency and intensity (e.g., Knutson et al.
2010; Seneviratne et al. 2012).
Changes in extreme temperature and the
intensification of extreme precipitation events are
expected consequences of a warming climate. A
warmer climate would be expected to have more in-
tense warm temperature extremes, including longer
and more intense heat waves and more frequent
record-breaking high temperatures than expected
without warming. It would also be expected to
show less intense cold temperature extremes and
fewer record-breaking low temperatures than ex-
pected before. Both of these expected changes in the
occurrence of record-breaking temperatures have
indeed been observed (e.g., Alexander et al. 2006;
Meehl et al. 2009). Further, a warmer atmosphere
can, and does, contain more water vapor, as has
been observed and attributed to human influence
(Santer et al. 2007; Willett et al. 2007; Arndt et al.
2010). This implies that more moisture is available
to form precipitation in extreme events and to
provide additional energy to further intensify such
events. About two-thirds of locations globally with
long, climate-quality instrumental records [e.g.,
as compiled in the Hadley Centre Global Climate
Extremes dataset (HadEX); Alexander et al. 2006]
show intensification of extremes in the far tails of

the precipitation distribution during the latter half
of the twentieth century (Min et al. 2011).
Detection and attribution of changes in intensity and
frequency of extremes. A number of studies (e.g.,
Christidis et al. 2005, 2010; Zwiers et al. 2011; Morak
et al. 2011, 2012) have now used various types of
detection and attribution methods to determine
whether the changes in temperature extremes pre-
dicted by climate models in response to historical
greenhouse gas increases and other forcings are
detectable in observations. The accumulating body
of evidence on the human contribution to changes
in temperature extremes is robust, and leads to
the assessment that “it is likely that anthropogenic
influences have led to warming of extreme daily
minimum and maximum temperatures on the global
scale” (Seneviratne et al. 2012). Results tend to show
that the climate models used in studies simulate
somewhat more warming in daytime maximum
temperature extremes than observed, while under-
estimating the observed warming in cold extremes
in many locations on the globe. It remains to be
determined if this model-data difference occurs
consistently across all models, or whether it is spe-
cific to the small set of phase 3 of the Coupled Model
Intercomparison Project (CMIP3) climate models
used in the studies.
Heavy and extreme precipitation events have
also received a considerable amount of study.
Heavy precipitation has been found to contribute an
increasing fraction of total precipitation over many
of the regions for which good instrumental records
are available (Groisman et al. 2005; Alexander et al.
2006; Karl and Knight 1998; Kunkel et al. 2007;
Peterson et al. 2008; Gleason et al. 2008), indicating
an intensification of precipitation extremes. Direct
examination of precipitation extremes, such as the
largest annual 1-day accumulation, or the largest
annual 5-day accumulation, also shows that extreme
precipitation has been intensifying over large parts
of the global landmass for which suitable records
are available (Alexander et al. 2006; Min et al. 2011;
Figs. 1 and 2), with an increase in the likelihood of a
typical 2-yr event of about 7% over the 49-yr period
from 1951 to 1999 (Min et al. 2011). It should be
noted, however, that the spatial extent of regions for
which long records of daily and pentadal precipita-
tion accumulations are available is still severely
limited (e.g., Alexander et al. 2006; see also Fig. 1),
and that spatial patterns of change are still noisy.
The intensification of extreme precipitation is an
expected consequence of human influence on the cli-
mate system (e.g., Allen and Ingram 2002; Trenberth
et al. 2003) and is simulated by models over the latter
half of the twentieth century in response to anthro-
pogenic forcing, albeit with weaker amplitude than
observed, which is at least partly due to differences
in the spatial scales resolved by climate models and
station-based local records (Chen and Knutson 2008).
Nevertheless, Min et al. (2011) recently showed, using
an ensemble of models and an index of extreme pre-
cipitation that is more comparable between models
and data than records of intensity of events, that the
observed large-scale increase in heavy precipitation
cannot be explained by natural internal climate
variability, and that human influence on climate
provides a more plausible explanation. The body of
research available on precipitation extremes is in an
earlier stage of development than for temperature
extremes, and thus Seneviratne et al. (2012) did not
give a quantified likelihood assessment concerning
precipitation extremes, but rather stated that "there
is medium confidence that anthropogenic influences
Fig. 1. Geographical distribution of trends of probability-based indices (PI) of extreme precipitation during 1951–99 for 1-day precipitation accumulations. Annual extremes of 1-day accumulations were fitted to the Generalized Extreme Value distribution, which was then inverted to map the extremes onto a 0%–100% probability scale. Blue colors indicate intensification of extreme precipitation, which is observed at about two-thirds of locations. From Min et al. (2011).
JULY 2012 | AMERICAN METEOROLOGICAL SOCIETY

Citations
More filters
Journal ArticleDOI
TL;DR: In this article, model projections of the tropical cyclone activity response to anthropogenic warming are assessed; observations, theory, and models indicate, with increasing robustness, that tropical cyclone activity responds to anthropogenic warming.
Abstract: Model projections of tropical cyclone (TC) activity response to anthropogenic warming in climate models are assessed. Observations, theory, and models, with increasing robustness, indicate ...

536 citations

Journal ArticleDOI
TL;DR: The authors estimate that anthropogenic influence has caused a greater than 60-fold increase in the likelihood of extreme summer heat in Eastern China and project that hot summers will continue to increase in frequency.
Abstract: Mean summer temperature in Eastern China has increased by 0.82 °C since the 1950s, and five of the hottest summers have occurred since 2000. This study estimates anthropogenic influence to have caused a greater than 60-fold increase in the likelihood of extreme summer heat and projects that hot summers will continue to increase in frequency.

515 citations

Book
23 Nov 2015
TL;DR: In this article, the authors provide guidance on how to design climate policies so they contribute to poverty reduction, and on how to design poverty reduction policies so they contribute to climate change mitigation and resilience building, and explore how the two objectives can more easily be achieved if considered together.
Abstract: Climate change threatens the objective of eradicating poverty. Poor people and poor countries are already vulnerable to all types of climate-related shocks—natural disasters that destroy assets and livelihoods; water-borne diseases and pests that become more prevalent during heat waves, floods, or droughts; crop failure from reduced rainfall; and spikes in food prices that follow extreme weather events. Such shocks can erase decades of hard work and leave people with irreversible human and physical losses. Changes in climate conditions caused by increasing concentrations of greenhouse gases in the atmosphere will worsen these shocks and slow down poverty reduction. The good news is that, at least until 2030, "good development" can prevent most of these impacts. By "good development," we mean development that is rapid, inclusive, and climate informed; includes strong social safety nets and universal health coverage; and is complemented with targeted adaptation interventions such as heat-tolerant crops and early warning systems. Absent such good development, many people will still be living in or close to extreme poverty in 2030, with few resources to cope with climate shocks and adapt to long-term trends, and climate change could increase extreme poverty by more than 100 million people by 2030. In the longer run, beyond 2030, our ability to adapt to unabated climate change is limited. To keep the longer-term impacts on poverty in check, immediate emissions-reduction policies are needed that bring emissions to zero by the end of the 21st century. These policies need not threaten short-term progress on poverty reduction—provided they are well designed and international support is available for poor countries. Ending poverty and stabilizing climate change will be unprecedented global achievements. But neither can be attained without the other: they need to be designed and implemented as an integrated strategy.
Shock Waves: Managing the Impacts of Climate Change on Poverty brings together those two objectives and explores how they can more easily be achieved if considered together. The book provides guidance on how to design climate policies so they contribute to poverty reduction, and on how to design poverty reduction policies so they contribute to climate change mitigation and resilience building.

468 citations

01 Dec 2012
Abstract: United States. National Oceanic and Atmospheric Administration (Postdoctoral Fellowship Program)

458 citations

References
More filters
01 Jan 2007
TL;DR: The first volume of the IPCC's Fourth Assessment Report, published in 2007, covers the extensive range of observations now available for the atmosphere and surface, changes in sea level, the paleoclimatic perspective, the causes of climate change, both natural and anthropogenic, and climate models for projections of global climate.
Abstract: This report is the first volume of the IPCC's Fourth Assessment Report. It covers the extensive range of observations now available for the atmosphere and surface and changes in sea level, assesses the paleoclimatic perspective, examines the causes of climate change, both natural and anthropogenic, and presents climate models for projections of global climate.

32,826 citations

Journal ArticleDOI
TL;DR: The NCEP/NCAR 40-yr reanalysis uses a frozen state-of-the-art global data assimilation system, identical to the system implemented operationally at NCEP on 11 January 1995 except that the horizontal resolution is T62 (about 210 km), together with a database as complete as possible.
Abstract: The NCEP and NCAR are cooperating in a project (denoted “reanalysis”) to produce a 40-year record of global analyses of atmospheric fields in support of the needs of the research and climate monitoring communities. This effort involves the recovery of land surface, ship, rawinsonde, pibal, aircraft, satellite, and other data; quality controlling and assimilating these data with a data assimilation system that is kept unchanged over the reanalysis period 1957–96. This eliminates perceived climate jumps associated with changes in the data assimilation system. The NCEP/NCAR 40-yr reanalysis uses a frozen state-of-the-art global data assimilation system and a database as complete as possible. The data assimilation and the model used are identical to the global system implemented operationally at the NCEP on 11 January 1995, except that the horizontal resolution is T62 (about 210 km). The database has been enhanced with many sources of observations not available in real time for operations, provided b...

28,145 citations

Journal ArticleDOI
TL;DR: ERA-Interim as discussed by the authors is the latest global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF), which will extend back to the early part of the twentieth century.
Abstract: ERA-Interim is the latest global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The ERA-Interim project was conducted in part to prepare for a new atmospheric reanalysis to replace ERA-40, which will extend back to the early part of the twentieth century. This article describes the forecast model, data assimilation method, and input datasets used to produce ERA-Interim, and discusses the performance of the system. Special emphasis is placed on various difficulties encountered in the production of ERA-40, including the representation of the hydrological cycle, the quality of the stratospheric circulation, and the consistency in time of the reanalysed fields. We provide evidence for substantial improvements in each of these aspects. We also identify areas where further work is needed and describe opportunities and objectives for future reanalysis projects at ECMWF. Copyright © 2011 Royal Meteorological Society

22,055 citations

BookDOI
01 Jan 1986
TL;DR: A monograph on density estimation, covering the kernel method for univariate and multivariate data, other important methods, and density estimation in practice.
Abstract: Introduction. Survey of Existing Methods. The Kernel Method for Univariate Data. The Kernel Method for Multivariate Data. Three Important Methods. Density Estimation in Action.

15,499 citations

Journal ArticleDOI
TL;DR: The fifth phase of the Coupled Model Intercomparison Project (CMIP5) will produce a state-of-the-art multimodel dataset designed to advance our knowledge of climate variability and climate change.
Abstract: The fifth phase of the Coupled Model Intercomparison Project (CMIP5) will produce a state-of-the-art multimodel dataset designed to advance our knowledge of climate variability and climate change. Researchers worldwide are analyzing the model output and will produce results likely to underlie the forthcoming Fifth Assessment Report by the Intergovernmental Panel on Climate Change. Unprecedented in scale and attracting interest from all major climate modeling groups, CMIP5 includes "long term" simulations of twentieth-century climate and projections for the twenty-first century and beyond. Conventional atmosphere–ocean global climate models and Earth system models of intermediate complexity are for the first time being joined by more recently developed Earth system models under an experiment design that allows both types of models to be compared to observations on an equal footing. Besides the long-term experiments, CMIP5 calls for an entirely new suite of "near term" simulations focusing on recent decades...

12,384 citations


"Explaining Extreme Events of 2012 f..." refers methods in this paper

  • ...We use Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations (Taylor et al. 2012) to explore whether the observed summer 2012 ASIE anomaly and the 2001–12 trend can be explained as a response to anthropogenic and natural forcing and how they relate to the observed increase in global…...

    [...]

  • ...We use Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations (Taylor et al. 2012) to explore whether the observed summer 2012 ASIE anomaly and the 2001–12 trend can be explained as a response to anthropogenic and natural forcing and how they relate to the observed increase in global mean surface air temperature (SATgm)....

    [...]

  • ...We use the Coupled Model Intercomparison Project Phase 5 (CMIP5) GCM experiments (Taylor et al. 2012) to quantify the likelihood of a 2012-magnitude event in the current and preindustrial forcing regimes....

    [...]

  • ...To make inferences about potential anthropogenic impacts, historical runs from models in the Coupled Model Intercomparison Project phase 5 (CMIP5) archive (Taylor et al. 2012) were analyzed and compared with observations....

    [...]

  • ...To place this seasonal extreme warmth in the context of long-term climate change, we analyze the time series for this region, comparing observed trends with model simulations of internal climate variability and modeled responses to both anthropogenic and natural forcings using 23 Coupled Model Intercomparison Project phase 5 (CMIP5) models (Taylor et al. 2012)....

    [...]

Related Papers (5)
Frequently Asked Questions (11)
Q1. What have the authors contributed in "Explaining extreme events of 2011 from a climate perspective" ?

Every year, the Bulletin of the AMS publishes an annual report on the State of the Climate [e.g., see the Blunden and Arndt (2012) supplement to this issue].

In addition, high interannual correlations between observed and analog temperatures confirm that the North Atlantic dynamics remains the main driver of European temperature variability, especially in wintertime. 

The El Niño–Southern Oscillation (ENSO), for one, is considered to be a key driver of drought conditions in the central United States (Trenberth et al. 

Low-latitude regions generally have higher ratios between the signal of climate change in temperature and variability than other regions (Mahlstein et al. 2011), and there appears to be potential skill in seasonal forecasting of impact-relevant metrics such as the onset of seasonal rains in Africa (Graham and Biot 2012).

In summary, model results indicate that human influence has reduced the odds by at least 20%, and possibly by as much as 4 times, with a best estimate that the odds have been halved.

It has been questioned whether attribution studies might neglect many of the regions most vulnerable to extreme weather because of the greater difficulties of collecting climate observations and undertaking climate modeling in developing countries (Hulme 2011). 

More recent SST-driven climate simulations have emphasized the important role of post-1999 warming in the Pacific in driving the 2011 drought (Lyon and DeWitt 2012). 

After the bias correction, there is good agreement between the ensembles and observations, giving confidence that any change in return time is representative of the change in return time in the observations. 

One possibility is the assumption that the probability distribution function of monsoon rainfall does not change shape but is shifted to higher or lower values by the changing climate (van Oldenborgh 2007). 
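Under that assumption, a shift in the location of the distribution translates directly into a change in return period. A toy calculation, with a Gumbel annual-max law and parameters that are purely hypothetical (not taken from the study), makes this concrete:

```python
import numpy as np

def gumbel_return_period(threshold, mu, beta):
    """Return period (yr) of exceeding `threshold` under a Gumbel(mu, beta) annual-max law."""
    p_exceed = 1.0 - np.exp(-np.exp(-(threshold - mu) / beta))
    return 1.0 / p_exceed

mu, beta = 500.0, 60.0   # hypothetical seasonal-rainfall parameters (mm)
threshold = 700.0        # a fixed extreme-rainfall level (mm)
shift = 30.0             # hypothesized location shift from the changing climate

t_before = gumbel_return_period(threshold, mu, beta)
t_after = gumbel_return_period(threshold, mu + shift, beta)
print(f"return period: {t_before:.0f} yr -> {t_after:.0f} yr")
```

With these numbers the return period drops from roughly 29 yr to roughly 18 yr: the shape of the distribution is untouched, yet a modest shift in its location changes the odds of a fixed extreme substantially.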

Therefore the analysis of the East African drought of 2011 is particularly interesting because it demonstrates the potential for attribution in tropical regions that lack robust international exchange of climate observations. 

The back-to-back failures of these rains, which were linked to the dominant La Niña climate and warm SSTs in the central and southeastern Indian Ocean, were particularly problematic since they followed poor rainfall during the spring and summer of 2008 and 2009.