
A review of global ocean temperature observations: Implications for ocean heat content estimates and climate change

01 Sep 2013-Reviews of Geophysics (American Geophysical Union)-Vol. 51, Iss: 3, pp 450-483
Abstract: The evolution of ocean temperature measurement systems is presented with a focus on the development and accuracy of two critical devices in use today (expendable bathythermographs and conductivity-temperature-depth instruments used on Argo floats). A detailed discussion of the accuracy of these devices and a projection of the future of ocean temperature measurements are provided. The accuracy of ocean temperature measurements is discussed in detail in the context of ocean heat content, Earth's energy imbalance, and thermosteric sea level rise. Up-to-date estimates are provided for these three important quantities. The total energy imbalance at the top of atmosphere is best assessed by taking an inventory of changes in energy storage. The main storage is in the ocean, the latest values of which are presented. Furthermore, despite differences in measurement methods and analysis techniques, multiple studies show that there has been a multidecadal increase in the heat content of both the upper and deep ocean regions, which reflects the impact of anthropogenic warming. With respect to sea level rise, mutually reinforcing information from tide gauges and radar altimetry shows that presently, sea level is rising at approximately 3 mm yr-1 with contributions from both thermal expansion and mass accumulation from ice melt. The latest data for thermal expansion sea level rise are included here and analyzed.

Key Points:
  • Oceanographic techniques and analysis have improved over many decades
  • These improvements allow more accurate Earth-energy balance estimates
  • Understanding of ocean heat content and sea-level rise has also increased

©2013. American Geophysical Union. All Rights Reserved.

Summary

1. INTRODUCTION

  • The broad topic of climate science includes a multitude of subspecialties that are associated with various components of the climate system and climate processes.
  • Accurately measuring the thermal energy of the ocean remains a challenging problem for climate scientists.
  • Finally, changes in measurement techniques and instrumentation have resulted in biases, many of which have since been identified and at least partly corrected.
  • International, observational programs and projects are vital to the data used in these analyses.
  • The Global Climate Observing System, in partnership with WCRP, has formulated a global ocean observing system and encouraged contribution to it, particularly through the OceanObs workshops in 1999 and 2009.

2. THE EVOLVING SUBSURFACE TEMPERATURE OBSERVING SYSTEM: A HISTORICAL PERSPECTIVE

  • An understanding of ocean heat content changes is only as good as the subsurface ocean temperature observations upon which these calculated changes are based.
  • The subsurface temperature observing system is still relatively young when compared to atmospheric observing systems.
  • What follows is a look at the developments and ideas that enabled implementation and precipitated changes in the observing system.
  • As a guide, Figure 1 shows geographical coverage during the height of each iteration of the observing system.

2.1. Early Measurements (From 1772)

  • On Captain James Cook’s second voyage (1772–1775), water samples were obtained from the subsurface Southern Ocean and it was found that surface waters were colder than waters at 100 fathoms (~183m) [Cook, 1777].
  • Slightly more than 100 years later, the Challenger expedition (1873–1876) circumnavigated the globe, taking temperature profiles from the surface to the ocean bottom along the way, ushering in an increased interest in subsurface oceanography and new technology developments which facilitated measurement.
  • Pairs of protected and unprotected reversing thermometers were used to determine temperature and pressure, with pressure determined to an accuracy of ±5m depth in the upper 1000m.
  • The development of the Nansen bottle [Mill, 1900; Helland-Hansen and Nansen, 1909], which attached the thermometers to a sealed water sample bottle, completed the instrumentation package that constituted the subsurface upper ocean temperature observing system for the 1900–1939 time period.
  • Hence, the long-term mean seasonal variations, the year-to-year variance, and vertical structure of the ocean were not well described.

2.3. Mechanical Bathythermograph Observation System (From 1939)

  • Quickly and accurately mapping the temperature variation of the upper ocean became a military priority in the lead-up to World War II for the accurate interpretation of sonar readings to locate submarines and their potential hiding places.
  • Oceanographers now had the means with which to acquire detailed sets of measurements to map the mixed layer and shallow thermocline [Spilhaus, 1940].
  • Inside the cylinder is a Bourdon tube enclosing a capillary tube filled with xylene (a hydrocarbon obtained from wood or coal tar).
  • A stylus attached to the Bourdon tube records the temperature change as a horizontal scratch on a plate of smoked glass.

2.4. Ship-Based Conductivity-Temperature-Depth Instruments (From 1955)

  • The development of the salinity-temperature-depth (STD) and later the conductivity-temperature-depth (CTD) instruments augmented existing observations by eventually replacing the discrete reversing thermometer observations with continuous profiles of temperature.
  • The development of the CTD also laid the groundwork for the current observing system and for the backbone large-scale measurement cruises of the World Ocean Circulation Experiment (WOCE), among others.
  • The basic physical concept of a thermal resistor was known as early as 1833 when Faraday noted that the conductivity of certain elements was affected by changes in temperature [Faraday, 1833].
  • Brown later modified the CTD design to use both a fast-response thermistor and a platinum resistance thermometer as well as a wire strain gauge bridge transducer to measure pressure in order to correct transients in the conductivity signal [Brown, 1974].

2.5. The Expendable Bathythermograph Observing System (From 1967)

  • Technological advances in wire and wire insulation made it possible to create an instrument electrically connected to the ship and able to transmit information through a thin conducting wire.
  • U.S. Navy traces were sent to the Fleet Numerical Weather Center (FNWC) where they were digitized, used for weather prediction and other projects, and then passed to the U.S. National Oceanographic Data Center (NODC) for archive and public release [Magruder, 1970].
  • In 1990, a global system of distributing XBT data was implemented (see below discussion of the Global Temperature and Salinity Profile Program (GTSPP)).
  • A Canadian company, Sparton, also briefly manufactured XBTs of their own design.

2.6. Tropical Moored Arrays (From 1984)

  • The tropical moored arrays were set up to continuously monitor the tropical ocean.
  • The first tropical moored array, the Tropical Atmosphere Ocean (TAO) array (later TAO/ TRITON), was set up to help monitor and understand the El Niño phenomenon [McPhaden et al., 1998].
  • Later, the SOFAR (Sound Fixing and Ranging) float [Webb and Tucker, 1970; Rossby and Webb, 1970] improved on this system by enabling tracking of the float by underwater listening devices.
  • Each cycle, they dive to a nominal 2000 dbar target and typically measure pressure, temperature, and salinity from there to the surface where the information is transmitted to a satellite.
  • The Argo Program thus governs the floats from deployment planning through quality control and dissemination, a true end-to-end observing system.

2.8. Summary of Ocean Temperature Measurements

  • Interspersed within the main observing system data are high-quality bottle and CTD temperature measurements from projects such as WOCE (1990–1998).
  • Historic studies of ocean heat content and other related variables need to take into consideration the changes in the observing system and the limitations of the system during each time period to fully interpret their results.
  • Gliders, undulating CTDs, and sensor-outfitted animals are already starting to extend and expand the observing system, and full-depth Argo floats are under development with a goal of allowing an ever-improving understanding of ocean heat content variability and its place in the Earth’s climate system.

3.1.1. The XBT Instrument

  • As discussed earlier, an XBT is a probe that measures temperature as it free-falls through the water column.
  • There is a hint of temperature dependence in this bias (Figure 5), as also indicated by probe calibrations in the laboratory [Gouretski and Reseghetti, 2010].
  • Wijffels et al. [2008] contributed two sets of corrections, both of which attempt to remove bias by applying time-varying multiplicative factors to the measurement depths (effectively assuming that there are no pure temperature or depth offset adjustments).
  • Cowley et al. [2013] derive corrections for the common shallow and deep types of XBT; those with missing data are assigned to a type by their maximum depth and country of origin.
  • Many correction schemes have been proposed to adjust for these biases.

3.2. Dynamic Models for XBT Devices

  • While the descent of XBT probes through the ocean water is traditionally described by standardized fall-rate equations (FREs), it is also possible to use dynamic models, allowing independent predictions of probe depths.
  • For the dynamic modeling technique, it is possible to incorporate changes to the drop conditions that are not reflected in the experiments during which the FRE was obtained.
  • And with a dynamic model, it is possible for users to calculate the depth of XBT devices independent of FRE models.
  • Probe mass and drop height may have a significant impact on probe depths; however, a recent set of experiments suggest that the effect of drop height might be overpredicted [Abraham et al., 2012a].
  • Surface effects (such as sudden impact forces at entry, angle of impact with the ocean surface, ship motion, entrainment of air, etc.) may negate the larger impact velocities.
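The fall-rate equations discussed here have a simple quadratic form, z(t) = a*t - b*t^2, where t is the elapsed time since the probe entered the water: the linear term gives the initial descent rate and the quadratic term the gradual deceleration as wire pays out. A minimal sketch; the coefficients shown are the commonly cited Hanawa et al. [1995] values for standard Sippican probes, and the multiplicative correction factor is purely illustrative, not a published value:

```python
def xbt_depth(t_seconds, a=6.691, b=0.00225):
    """Estimated probe depth (m) from elapsed fall time (s).

    z = a*t - b*t**2: a is the initial fall rate (m/s) and the
    quadratic term models the probe slowing as its wire spool
    empties and its mass decreases.
    """
    return a * t_seconds - b * t_seconds ** 2

def corrected_depth(z_m, factor=0.99):
    """Time-varying multiplicative depth corrections of the kind
    described above simply rescale the inferred depths.
    The factor here is illustrative only."""
    return factor * z_m

z = xbt_depth(100.0)          # depth after 100 s of free fall (~646.6 m)
z_adj = corrected_depth(z)    # depth after a 1% multiplicative correction
```

A dynamic model replaces the fixed (a, b) pair with an integration of the probe's equation of motion, which is what allows drop height, probe mass, and water properties to enter the depth estimate.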

3.3. The Global XBT Measurement Network

  • XBT deployments are designated by their spatial and temporal sampling goals or modes of deployment (low density, frequently repeated, and high density) and sample along repeated, well-observed transects, on either large or small spatial scales, or at special locations such as boundary currents and chokepoints (Figure 11).
  • Frequently repeated transects typically target 12–18 realizations per year, with XBTs deployed at 100–150 km spacing, and are designed to obtain high spatial resolution observations in consecutive realizations in regions where temporal variability is strong and resolvable with an order of 20 day sampling.
  • Given the advances in the global observing system, the global XBT network is currently focused on the monitoring of boundary currents and heat transport and not exclusively on the upper ocean thermal field.
  • The scientific objectives of HD sampling and examples of research targeting these objectives are as follows [Goni et al., 2010]:
  • Measure the seasonal and interannual fluctuations in the transport of mass and heat across transects which define large enclosed ocean areas and investigate their links to climate indices.

3.4. Future of the XBT Network

  • The XBT network reflects the recommendations of OceanObs99 and OceanObs09 [Goni et al., 2010] and includes several transects that the scientific community has added during the last 12 years (Figure 11).
  • Ship recruitment is an ongoing issue in implementing the XBT network, resulting in gaps or shifts in sections.
  • Thirteen years after OceanObs99, the XBT HD transects continue to increase in value, not only through the growing length of decadal time series, but also due to integrative relationships with other elements of the ocean observing system, including the following:
  • HD transects together with Argo float data provide views of the large-scale ocean interior and small-scale features near the boundary, as well as of the relationship of the interior circulation to the boundary-to-boundary transport integrals.
  • Almost 20 years of continuous global satellite altimetric sea surface heights are matched by contemporaneous HD sampling on many transects.

4. ACCURACY/BIASES OF ARGO FLOATS

  • The Argo Program, an array of over 3000 autonomous floats designed to return materially important oceanic climate data, is a vitally important component of the present oceanic Earth observing system and a strong complement to satellite observations.
  • This program, designed to complement the Jason altimeter missions, provides observed climate signals that, when globally averaged, are sensitive to the presence of data bias.
  • The Argo Program has advanced the breadth, quality, and distribution of oceanographic data as compared to the broad-scale XBT network while continuing to supplement ship-based CTD programs (section 2).
  • The immediate distribution of data results in a two-tiered quality control system, each with distinct expectations of bias within the data.
  • A more careful analysis termed “delayed-mode quality control” (DMQC) is performed by the float provider 6–18 months after data acquisition [Wong et al., 2012].

4.1. Argo Float CTD Sensors

  • The majority of floats within the “Core Argo” array measure temperature, salinity, and pressure with SeaBird Electronics, Inc. (SBE) conductivity-temperature-depth (CTD) packages.
  • In recent years, the Argo array has become nearly homogeneous in the use of the SBE CTDs due to their high accuracy, modest conductivity sensor drift (both in numbers of floats with drift and, if present, the rate of drift), and the lack of a suitable alternative sensor provider.
  • The last FSI-equipped Argo float was deployed in December 2006.
  • The SBE41CP can also be used in spot sampling mode.
  • The SBE-41 has historically been installed in more Argo floats.

4.2. Sensor Drift: The Causes, Identification, and Correction

  • Sensor drift is a continuous concern for instruments that are designed to obtain extended duration measurements in the climate system.
  • The manufacturing process of the Druck pressure sensor was modified, and Argo floats with rigorously tested Druck pressure sensors were again being deployed by late 2009.
  • For pressure drifts larger than 10 dbar, a temperature component to the nonlinear correction is necessary, with cold water at depth requiring greater correction (N. Larson, Sea-Bird Electronics, personal communication, 2012).
  • Regardless of the method, the float is transmitting profile and trajectory data that use corrected pressures.

4.3. Temperature Sensor Bias

  • No example of significant temperature drift has been identified within the Argo array.
  • Small numbers of instruments recovered and recalibrated after 4–9 month missions have shown no appreciable drift within the manufacturer's stated temperature accuracy [Oka and Ando, 2004].
  • Argo float models report the Core Argo profile parameters—temperature, salinity, and pressure—as either a point measurement or vertical pressure average (bin averaged).
  • Both sampling methods are equally valid and each provides advantageous properties depending on the application, but the sampling mode should be known for most accurate use.
  • The globally averaged temperature gradient from 2000 up to approximately 200 dbar (Figure 12b, black lines) results in a warm bias for floats recording bin-averaged data.
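The warm bias from bin averaging follows from the curvature of the mean profile: where temperature decays toward depth along a convex curve, the average over a pressure bin exceeds the point value at the bin's reported pressure. A toy illustration with an idealized, hypothetical thermocline (not an Argo calibration curve); the bias it produces is tiny but systematically positive:

```python
import math

def temp(p_dbar):
    """Idealized thermocline: a warm surface layer decaying toward
    2 degC at depth. Illustrative only."""
    return 2.0 + 20.0 * math.exp(-p_dbar / 300.0)

def bin_average(p_center, half_width=1.0, n=2001):
    """Mean temperature over a pressure bin centred on p_center,
    approximated by uniform sampling across the bin."""
    ps = [p_center - half_width + 2.0 * half_width * i / (n - 1)
          for i in range(n)]
    return sum(temp(p) for p in ps) / n

p = 150.0                        # dbar
point_value = temp(p)            # spot sample at the reported pressure
binned_value = bin_average(p)    # 2 dbar bin average reported at the same pressure
warm_bias = binned_value - point_value   # positive: bin averaging reads warm
```

Because the profile is convex in pressure, the symmetric bin average is strictly greater than the value at the bin centre (Jensen's inequality), which is the sign of bias described above.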

4.4. Biases Introduced by Float Firmware

  • Two recent, unrelated issues affected a single (but different) model of Argo float and introduced pressure bias into the Argo data set.
  • Data users may make their own determination of TNPD (truncated negative pressure drift) status by referring to the SP variables included in a float technical parameter netCDF file available at the GDAC.
  • A subset of Argo floats (SOLO), manufactured prior to 2007 by the Woods Hole Oceanographic Institution (WHOI), had assigned temperature and salinity values to incorrect pressure levels [Willis et al., 2009].
  • The float profile data that were in error due to incorrect pressure level assignment either have been corrected to the proper pressure (all SOLO WHOI SBE models and a subset of SOLO WHOI FSI) or have been assigned bad quality control flags for those models that are uncorrectable (subset of SOLO WHOI FSI).

4.5. Discussion

  • The Argo Program float array is an important component of the present oceanic Earth observing system, extending broad-scale monitoring of ocean temperature, among other variables, from what was achieved by previous research programs.
  • The most apparent spatial bias in Argo float density in the pelagic ocean is found within seasonal ice zones.
  • Development of a Sea-Bird CTD for use in Deep Argo floats is ongoing.
  • Advantages include the recording of profiles at higher resolution (2 dbar and higher), reduced float mortality due to shorter surface periods, and the ability to modify the float sampling mid-mission, driven by scientific objectives.

5. GLOBAL OCEAN HEAT CONTENT, EARTH ENERGY BUDGET, AND THERMOSTERIC SEA LEVEL RISE

  • The amount of heat accumulating in the global ocean is vital for diagnosing Earth’s energy imbalance and sea level rise.
  • The uptake of heat by the ocean acts as a buffer to climate change, slowing the rate of surface warming [Raper et al., 2002], and so is an important element in the evolution of the climate over land and between the Northern and Southern Hemispheres.
  • This section will present an abbreviated update of ocean heat content estimates, the present Earth energy balance, and thermosteric sea level rise with a particular focus on the accuracy of the temperature measurements and the impact of accuracy on the certainty of these measurements.

5.1.1. Background

  • Changes to ocean heat content (OHC) can be calculated from measurements of the temperature evolution of the ocean.
  • It is now widely accepted that the large decadal variability in the 1970s–1980s in the earlier estimates was mostly an artifact caused by XBT biases.
  • These time-dependent biases, if left uncorrected and when integrated in depth and over the global ocean, lead to substantial errors in OHC estimates, in terms of both temporal variability and trends [e.g., Domingues et al., 2008; Wijffels et al., 2008; Levitus et al., 2009].
  • Domingues et al. [2008] and Church et al. [2011] use a reduced-space optimal interpolation in which a reduced set of near-global spatial functions (derived from satellite altimeter sea level measurements) is combined with thermal expansion observations to produce spatially complete fields from 1950 onward.
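For a single profile, the OHC calculation described above reduces to a depth integral of temperature anomaly weighted by density and specific heat. A minimal sketch using nominal constant seawater properties; the published analyses use a full equation of state, so treat these constants as illustrative:

```python
# Nominal seawater properties (illustrative, not equation-of-state values):
RHO = 1025.0   # density, kg m^-3
CP = 3990.0    # specific heat capacity, J kg^-1 K^-1

def ohc_anomaly_per_area(depths_m, dT_kelvin):
    """Column-integrated heat content anomaly (J m^-2), computed by
    the trapezoidal rule over a profile of temperature anomalies."""
    total = 0.0
    pairs = list(zip(depths_m, dT_kelvin))
    for (z0, t0), (z1, t1) in zip(pairs, pairs[1:]):
        total += 0.5 * (t0 + t1) * (z1 - z0)
    return RHO * CP * total

# A uniform 0.1 K warming of the top 700 m stores roughly 2.9e8 J m^-2:
q = ohc_anomaly_per_area([0.0, 700.0], [0.1, 0.1])
```

Global OHC estimates then amount to mapping such column integrals over the ocean and integrating in area, which is where the infilling and mapping choices discussed in this section enter.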

5.1.2. Current Observational Estimates

  • Updated and recent observational analyses of global upper OHC (Table 1) all show significant multidecadal warming, with a steady increase in OHC since the 1970s (Figures 14a1–14a3).
  • More generally, confidence in interannual variability in the global upper OHC has improved after 2005, following the dramatic improvement in open ocean coverage by the Argo floats, at least for the upper ~2000m.
  • Further systematic comparisons between OHC analyses are needed to understand the spread in multidecadal rates, to isolate the impact of individual structural uncertainties, and to develop best practices for analyses.

5.2. Deep Ocean Heating

  • Though variations in deep ocean temperatures are small compared to the upper ocean, the large volume of the deep ocean makes its contribution to the global energy balance significant [Purkey and Johnson, 2010].
  • Variability of the heat content of the deep ocean modulates both the energy budget of the climate system and global sea level [IPCC, 2007].
  • At present, only hydrographic observations provide data of the required accuracy and at a decadal frequency (the minimum needed to observe climate change).

5.3. Impact of Ocean Measurements on Earth Energy Balances

  • The key issues for the Earth from an overall energy standpoint are the energy imbalances at the top and bottom of the atmosphere and their changes over time.
  • Other observing systems in place can nominally measure the major storage and flux terms, but owing to errors and uncertainty, it remains a challenge to track anomalies with confidence.
  • Estimates of OHC trends above 700 m from 2005 to 2012 (Figure 14f) range from 0.2 to 0.4 W m-2, with large, overlapping uncertainties, highlighting the remaining issues of adequately dealing with missing data in space and time and how OHC is mapped, in addition to remediating instrumental biases, quality control, and other sensitivities.
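Trend figures expressed in W m-2 come from dividing a global ocean heating rate by Earth's total surface area. A back-of-envelope conversion; the heating rate and area below are approximate, illustrative values:

```python
SECONDS_PER_YEAR = 3.15576e7   # Julian year, s
EARTH_SURFACE_M2 = 5.1e14      # approximate total surface area of Earth, m^2

def ohc_trend_to_flux(joules_per_year):
    """Convert a global OHC trend (J yr^-1) into an equivalent
    planetary energy imbalance (W m^-2)."""
    return joules_per_year / SECONDS_PER_YEAR / EARTH_SURFACE_M2

# An upper-ocean heating rate of ~5e21 J yr^-1 corresponds to ~0.3 W m^-2,
# in the middle of the 0.2-0.4 W m^-2 range of estimates quoted above:
flux = ohc_trend_to_flux(5e21)
```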

5.4. Ocean Temperature Measurements and Thermosteric Sea Level Rise

  • Both the volume and mass of the global ocean, and thus sea level (global mean sea level), change across a variety of timescales, due to expansion and contraction of water as ocean temperatures and heat content change, and the growth of ice sheets and glaciers.
  • Because the ocean has the largest heat capacity of the climate system, and the ocean thermosteric rise is one of the largest contributors to the late 20th (and projected 21st) century sea level rise [Church et al., 2011; Meehl et al., 2007], the Earth’s energy and sea level budgets must be consistent.
  • To address the implicit assumption of zero anomaly where there were no data and XBT biases, Domingues et al. [2008] used a reduced-space optimal interpolation scheme in combination with the XBT corrections of Wijffels et al. [2008].
  • For waters deeper than 700 m, ocean heat content estimates remained dependent on deep ocean bottle and CTD casts until the implementation of the Argo Program [Gould et al., 2004], which dramatically improved global sampling to a depth of 2000 m, particularly in the Southern Ocean.
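The thermosteric contribution discussed in this section can be approximated, for a single layer, as the product of a thermal expansion coefficient, the temperature change, and the layer thickness. A simplified sketch; the constant expansion coefficient below is a nominal illustrative value, whereas in reality it varies strongly with temperature, salinity, and pressure:

```python
ALPHA = 2.0e-4  # nominal thermal expansion coefficient, K^-1 (illustrative)

def thermosteric_rise_m(layer_thickness_m, dT_kelvin):
    """Sea level rise (m) from thermal expansion of one ocean layer,
    assuming a constant expansion coefficient: d_eta = alpha * dT * h."""
    return ALPHA * dT_kelvin * layer_thickness_m

# Warming the top 700 m uniformly by 0.1 K raises sea level by about 14 mm:
eta = thermosteric_rise_m(700.0, 0.1)
```

Summing such terms over layers (with a depth- and temperature-dependent alpha) is what connects the OHC estimates of the previous subsections to the thermosteric component of the observed ~3 mm yr-1 sea level rise.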

6. CONCLUDING REMARKS AND FUTURE DIRECTIONS

  • This paper brings together a broad set of perspectives and information on oceanographic temperature measurements and their implications for climate change.
  • Included are discussions of the history of temperature measurements, the primary instrumentations which have been used to complete the measurements, and their associated accuracy.
  • With respect to (ii), even with the advances to the observing system culminating in the Argo array, more than 50% of the ocean is without routine observations.
  • Important areas such as boundary currents, which are responsible for large poleward heat transport, need higher-frequency observations than are currently provided by Argo.

6.1. Some Future Directions of Instrumentation

  • Fortunately, technological advances are being made on all fronts.
  • Arvor3500, a profiling float capable of diving to 3500m, has been constructed in prototype, and Deep NINJA Argo (4000m capable float) test floats have already been deployed (Argonautics newsletter 13, August 2012, http://www.argo.ucsd.edu/newsletter.html).
  • Another method for gathering under-ice temperature profiles is with the use of instrumented animals, most commonly pinnipeds [Fedak, 2012].
  • Gliders are under the control of the pilot, so they can be deployed to carry out a set geographic and depth sampling plan, with updated instructions when needed.

6.2. Improved Observational OHC Estimates and Analysis Methodologies

  • A high-quality subsurface ocean temperature database along with accurate and comprehensive metadata is an important prerequisite for advancing knowledge on instrumental biases (e.g., XBT/MBT) and devising more accurate corrections to help further reduce uncertainties in OHC estimates.
  • Improvements to the quality of the historical ocean temperature database and metadata information for climate research purposes are currently being planned through a global coordinated effort (http://www.clivar.org/organization/gsop/activities/clivar-gsop-coordinated-qualitycontrol-global-subsurface-ocean-climate).
  • To better understand and quantify the structural uncertainties arising from methods used in publications, a comprehensive project is underway [Boyer et al., 2013].
  • A series of systematic intercomparisons is being carried out for a number of sensitivity tests based on different parameter choices but using agreed temperature databases (e.g., same input data).
  • It is hoped that this project will provide helpful guidance on best practices, for instance on how best to infill observational gaps.

6.3. Conclusion

  • Two recent detection and attribution analyses [Gleckler et al., 2012; Pierce et al., 2012] have significantly increased confidence since the last IPCC AR4 report that the warming (thermal expansion) observed during the late twentieth century, in the upper 700m of the ocean, is largely due to anthropogenic factors.
  • T.B., J.M.L., and G.C.J. were supported by the NOAA Climate Program Office and NOAA Research.
  • The Editor on this paper was Eelco Rohling.


A REVIEW OF GLOBAL OCEAN TEMPERATURE OBSERVATIONS: IMPLICATIONS FOR OCEAN HEAT CONTENT ESTIMATES AND CLIMATE CHANGE
J. P. Abraham,1 M. Baringer,2 N. L. Bindoff,3,4,5 T. Boyer,6 L. J. Cheng,7 J. A. Church,4 J. L. Conroy,8 C. M. Domingues,5 J. T. Fasullo,9 J. Gilson,10 G. Goni,2 S. A. Good,11 J. M. Gorman,1 V. Gouretski,12 M. Ishii,13 G. C. Johnson,14 S. Kizu,15 J. M. Lyman,14,16 A. M. Macdonald,17 W. J. Minkowycz,18 S. E. Moffitt,19,20 M. D. Palmer,11 A. R. Piola,21 F. Reseghetti,22 K. Schuckmann,23 K. E. Trenberth,9 I. Velicogna,24,25 and J. K. Willis25
Received 24 April 2013; revised 9 August 2013; accepted 13 August 2013; published 23 September 2013.
1 School of Engineering, University of St. Thomas, St. Paul, Minnesota, USA.
2 Atlantic Oceanographic and Meteorological Laboratory, National Oceanic and Atmospheric Administration, Miami, Florida, USA.
3 IMAS, University of Tasmania, Hobart, Tasmania, Australia.
4 Centre for Australian Weather and Climate Research, CSIRO Marine and Atmospheric Research, Hobart, Tasmania, Australia.
5 Antarctic Climate and Ecosystems Cooperative Research Centre, University of Tasmania, Hobart, Tasmania, Australia.
6 National Oceanographic Data Center, NOAA, Silver Spring, Maryland, USA.
7 Institute of Atmospheric Physics, Chinese Academy of Science, Beijing, China.
8 School of Earth and Atmospheric Sciences, Georgia Institute of Technology, Atlanta, Georgia, USA.
9 National Center for Atmospheric Research, Boulder, Colorado, USA.
10 Scripps Institution of Oceanography, La Jolla, California, USA.
11 Met Office Hadley Centre, Exeter, UK.
12 Klima Campus, Hamburg University, Hamburg, Germany.
13 Climate Research Department, Meteorological Research Institute, Tsukuba, Japan.
14 Pacific Marine Environmental Laboratory, NOAA, Seattle, Washington, USA.
15 Department of Geophysics, Tohoku University, Sendai, Japan.
16 Joint Institute for Marine and Atmospheric Research, University of Hawaii at Manoa, Honolulu, Hawaii, USA.
17 Woods Hole Oceanographic Institution, Woods Hole, Massachusetts, USA.
18 Department of Mechanical and Industrial Engineering, University of Illinois at Chicago, Chicago, Illinois, USA.
19 Bodega Marine Laboratory, Bodega, California, USA.
20 Graduate Group in Ecology, University of California, Davis, California, USA.
21 Departamento Oceanografía, Servicio de Hidrografía Naval and Departamento de Ciencias de la Atmosfera y los Oceanos/UMI IFAECI, Universidad de Buenos Aires, Buenos Aires, Argentina.
22 ENEA, Italian National Agency for New Technologies, Energy and Sustainable Economic Development, UTMAR-OSS, La Spezia, Italy.
23 Ifremer, Toulon, France.
24 Department of Earth System Science, University of California, Irvine, California, USA.
25 Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California, USA.

Corresponding author: J. P. Abraham, School of Engineering, University of St. Thomas, 2115 Summit Ave., St. Paul, MN 55105-1079, USA. (jpabraham@stthomas.edu)

Citation: Abraham, J. P., et al. (2013), A review of global ocean temperature observations: Implications for ocean heat content estimates and climate change, Rev. Geophys., 51, 450-483, doi:10.1002/rog.20022.
1. INTRODUCTION
[
2] The broad topic of climate science includes a multitude
of subspecialties that are associated with various components
of the climate system and climate processes. Among these
components are Earths oceans, atmosphere, cryosphere, and
terrestrial regions. Processes include all forms of heat transfer
and uid mechanics within the climate system, changes to
thermal energy of various reservoirs, and the radiative balance
of the Earth. The incredible diversity of climate science makes
it nearly impossible to cover all aspects in a single manuscript,
except perhaps for within massive assessment reports [e.g.,
Intergovernmental Panel on Climate Change (IPCC), 2007].
Nevertheless, it is important to periodically provide detailed
surveys of the aforementioned topical areas to establish the
current state of the art and future directions of research.
[3] It is firmly established that changes to the Earth's atmospheric concentrations of greenhouse gases can and have caused a global change to the stored thermal energy in the Earth's climate system [Hansen et al., 2005; Levitus et al., 2001]. To assess the impact of human emissions on climate change and to evaluate the overall change to Earth's thermal energy (whether from natural or human causes), it is essential to comprehensively monitor the major thermal reservoirs. The largest thermal reservoirs are the Earth's oceans; their extensive total volume and large thermal capacity mean that a far larger injection of energy is required to change their temperature than is required for other reservoirs.
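Because a reservoir's change in stored thermal energy follows directly from its temperature change, density, and specific heat, an ocean heat content anomaly can be estimated from a discretized temperature-anomaly profile. The following minimal sketch uses nominal seawater constants and an invented uniform anomaly purely for illustration; none of these numbers come from this review.

```python
# Column-integrated ocean heat content anomaly per unit area,
# OHC = sum over layers of rho * cp * dT(z) * dz  [J m^-2].
# rho and cp are nominal seawater values (illustrative assumptions).

RHO_SEAWATER = 1025.0  # kg m^-3, nominal density
CP_SEAWATER = 3990.0   # J kg^-1 K^-1, nominal specific heat

def ohc_anomaly(temp_anomalies_c, layer_thicknesses_m):
    """Heat content anomaly per unit ocean area (J m^-2)."""
    return sum(RHO_SEAWATER * CP_SEAWATER * dt * dz
               for dt, dz in zip(temp_anomalies_c, layer_thicknesses_m))

# Invented example: a uniform 0.1 degC anomaly over the upper 700 m,
# discretized as seven 100 m layers -- roughly 2.9e8 J m^-2.
print(f"{ohc_anomaly([0.1] * 7, [100.0] * 7):.3e} J m^-2")
```

Summing such column integrals over the global ocean and dividing by Earth's surface area and the elapsed time converts the result to the W m⁻² units in which the planetary energy imbalance is usually expressed.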
[4] Despite the importance of accurately measuring the thermal energy of the ocean, doing so remains a challenging problem for climate scientists. Measurements covering extensive spatial and temporal scales are required to determine the energy changes over time. While there have been significant advancements in the quantity and quality of ocean temperature measurements, coverage is not yet truly global. Furthermore, past eras of ocean monitoring have provided extensive data but variable spatial coverage. Finally, changes in measurement techniques and instrumentation have introduced biases, many of which have since been identified and at least partially corrected for.
[5] This review focuses on the subsurface ocean temperature measurements that are required for climate assessment, with an emphasis on the status of oceanographic temperature measurements as obtained from two of the key historical and modern measurement instruments. Those instruments (the expendable bathythermograph (XBT) and the Argo floats) are among the most important for assessing ocean temperatures globally, and they provide up-to-date ocean subsurface temperature measurements. A historical discussion of other families of probes is also provided, along with discussions of the accuracy of those families.
[6] While most of the analyses reviewed here were carried out by individuals or small groups of investigators, they would not have been possible without strong international coordination and cooperation. International observational programs and projects are vital to the data used in these analyses. Early examples are the International Geophysical Year in 1957–1958, with its extensive Nansen bottle sections, and the 1971–1980 International Decade of Ocean Exploration, which endorsed the North Pacific Experiment (greatly increasing North Pacific shallow XBT use in the 1970s) and the Geochemical Ocean Sections Study (a global high-quality and full-depth, if sparse, baseline oceanographic survey).
[7] Since its inception, the World Climate Research Program (WCRP) has taken international leadership, with the Tropical Ocean-Global Atmosphere project, which focused on observation in the equatorial region in the 1980s, including initiating the Tropical Atmosphere Ocean/Triangle Trans-Ocean Buoy Network (TAO/TRITON) moored array, and the World Ocean Circulation Experiment (WOCE), which took a truly global set of oceanographic coast-to-coast full-depth sections and expanded the XBT network in the 1990s. The WOCE provides a global-scale benchmark against which change can be assessed. More recently, the WCRP formulated the Climate Variability and Predictability Project (CLIVAR), further fostering the Argo float array and the reoccupation of some of the full-depth WOCE hydrographic sections under the auspices of the Global Oceanographic Ship-Based Hydrographic Investigations Program. The Global Climate Observing System, in partnership with the WCRP, has formulated a global ocean observing system and encouraged contributions to it, particularly through the OceanObs workshops in 1999 and 2009.
[8] Oceanographic data centers, both national and international, are also vital to the studies reviewed here. These centers accept, collect, and actively seek out data (from large programs and small); archive and quality control them; and make the results readily and publicly available. The collection, assembly, and quality control of a comprehensive data set are invaluable for all sorts of global analyses, including those of ocean temperature, heat content, and thermal expansion.
2. THE EVOLVING SUBSURFACE TEMPERATURE
OBSERVING SYSTEM: A HISTORICAL PERSPECTIVE
[9] An understanding of ocean heat content changes is only as good as the subsurface ocean temperature observations upon which these calculated changes are based. The subsurface temperature observing system is still relatively young when compared to atmospheric observing systems. What follows is a look at the developments and ideas that enabled implementation of, and precipitated changes in, the observing system. As a guide, Figure 1 shows geographical coverage during the height of each iteration of the observing system.
2.1. Early Measurements (From 1772)
[10] On Captain James Cook's second voyage (1772–1775), water samples were obtained from the subsurface Southern Ocean, and it was found that surface waters were colder than waters at 100 fathoms (~183 m) [Cook, 1777]. These measurements, although not very accurate, are among the first instances of oceanographic profile data recorded and preserved.
ABRAHAM ET AL.: REVIEW OF OCEAN OBSERVATIONS
451

Slightly more than 100 years later, the Challenger expedition (1873–1876) circumnavigated the globe, taking temperature profiles from the surface to the ocean bottom along the way, ushering in increased interest in subsurface oceanography and new technological developments that facilitated measurement. The Challenger was equipped with a pressure-shielded thermometer [Anonymous, 1870; Wollaston, 1782; Roemmich et al., 2012] to partially counteract the effects of pressure on temperature readings at great depths.
2.2. The Nansen Bottle Observation System
(From 1900)
[11] Around the time of the Challenger expedition, the reversing thermometer [Negretti and Zambra, 1873] was introduced; it remained the standard instrument for subsurface temperature measurements until 1939 and is still in limited use today. A protected reversing thermometer was typically accurate to 0.01°C or better when properly calibrated. Pairs of protected and unprotected reversing thermometers were used to determine temperature and pressure, with pressure determined to an accuracy of ±5 m depth in the upper 1000 m. The development of the Nansen bottle [Mill, 1900; Helland-Hansen and Nansen, 1909], which attached the thermometers to a sealed water sample bottle, completed the instrumentation package that constituted the subsurface upper ocean temperature observing system for the 1900–1939 time period. The problems during this period with regard to a global ocean observing system were that Nansen bottle/reversing thermometer systems could only measure at a few discrete levels at each oceanographic station and that deploying the instrumentation and making the measurements was time consuming. It was also difficult to get properly equipped ships to most areas of the ocean. Many of the open ocean temperature profiles were measured during a small number of major research cruises [Wust, 1964]. Hence, the long-term mean seasonal variations, the year-to-year variance, and the vertical structure of the ocean were not well described.
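The protected/unprotected pairing works because the unprotected thermometer is compressed by ambient pressure and therefore reads high by an amount nearly proportional to depth. A hedged sketch of the resulting thermometric-depth calculation follows; the pressure-response coefficient is a nominal textbook-style value assumed purely for illustration, not a figure from this review.

```python
# Thermometric depth from a protected/unprotected reversing-thermometer
# pair: the unprotected instrument reads high by roughly K degC per
# meter of depth. K below is an assumed nominal value.

K_PRESSURE = 0.01  # degC of extra reading per meter of depth (assumed)

def thermometric_depth(t_unprotected_c, t_protected_c, k=K_PRESSURE):
    """Depth of thermometer reversal (m) from the paired readings."""
    return (t_unprotected_c - t_protected_c) / k

# A pair reading 12.3 degC (unprotected) and 4.3 degC (protected)
# implies the reversal happened near 800 m.
print(round(thermometric_depth(12.3, 4.3), 1))  # 800.0
```

The ±5 m depth accuracy quoted above reflects the combined reading uncertainties of both thermometers in this kind of difference calculation.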
2.3. Mechanical Bathythermograph Observation
System (From 1939)
[12] Quickly and accurately mapping the temperature variation of the upper ocean became a military priority in the lead-up to World War II, for the accurate interpretation of sonar readings used to locate submarines and their potential hiding places. As related in Couper and LaFond [1970], sonar operators were aware of an "afternoon effect," whereby sonar ranges were shorter in the afternoon than in the morning, but did not understand that the effect was due to diurnal warming. The wide vertical spacing of Nansen bottle casts did not capture the gradients at the bottom of the mixed layer, or indeed the vertical extent of the mixed layer itself.
[13] Early in the 1930s, Carl-Gustaf Rossby had experimented with an "oceanograph" that could draw a continuous pressure/temperature trace on a smoked brass foil [Rossby and Montgomery, 1935]. Rossby enlisted Athelstan Spilhaus to develop this idea into a cheap, reliable, reusable instrument. Spilhaus created the first version of the instrument that we now call the mechanical bathythermograph (MBT) [Spilhaus, 1938]. Oceanographers now had the means with which to acquire detailed sets of measurements to map the mixed layer and shallow thermocline [Spilhaus, 1940].
[14] The U.S. Navy funded research to improve the design and operation of the MBT, as Drs. Vine, Ewing, and Worzel modified Spilhaus's design to allow operational use of the instrument by the Navy and oceanographers [Spilhaus, 1987]. The U.S. Navy, in conjunction with the Scripps Institution of Oceanography and the Woods Hole Oceanographic Institution, facilitated the first coordinated worldwide subsurface temperature measurement system, which grew up during World War II and continued afterward. The MBT itself is a cylinder approximately 31.5 inches (~0.8 m) long and 2 inches (~0.05 m) in diameter with a nose weight, towing attachment, and tail. Inside the cylinder is a Bourdon tube enclosing a capillary tube filled with xylene (a hydrocarbon obtained from wood or coal tar). As temperature increases, the pressure of the xylene increases, causing the Bourdon tube to unwind. A stylus attached to the Bourdon tube records this movement as a horizontal, temperature-proportional scratch on a plate of smoked glass. A spring-and-piston pressure gauge simultaneously pulls the stylus vertically down the glass, completing the depth/temperature profile. The instrument free-falls from a winch that is then used to recover it, and it can be used at ship speeds up to 15 kt. Initially, MBTs were built to reach depths of 400 feet (~122 m). By 1946, MBTs could reach 900 feet (~275 m), although the shallower version was deployed more often in every year except 1964 (when 49% were the shallower version). The 900 foot MBTs had significant depth calibration issues if they were lowered the full 900 feet, and for this reason, most MBTs were not lowered deeper than 400–450 feet. The accuracy of the MBT instrument was ±5 dbar in pressure and ±0.3°C in temperature.

Figure 1. Geographic distribution of subsurface temperature profiles for (a) 1934, (b) 1960, (c) 1985, and (d) 2009. Red = Nansen bottle or conductivity-temperature-depth (CTD), light blue = mechanical bathythermograph (MBT), dark blue = expendable bathythermograph (XBT), orange = tropical moored buoy, green = profiling float.
[15] The Navy's interest in MBTs was in temperature gradient information, but a system of careful calibration was put in place to accurately preserve the full temperature information for future study. Later, more than 1.5 million MBT temperature traces from 1939 to 1967 were digitized at 5 m intervals and stored on index cards. These cards were, in turn, electronically digitized and archived at the U.S. National Oceanographic Data Center [Levitus, 2012]. It was reported that 73% of all 1939–1967 MBTs were U.S. devices, but other countries, notably Japan and the Soviet Union, also dropped MBTs; however, those traces were not distributed under the U.S. Navy system. MBTs continued to be used after 1967, with ~800,000 traces gathered in 1968–1990. Geographic coverage of MBTs was limited to areas of interest to navies, merchant ship routes, and research cruises. So, while a sketch of the upper ocean waters was being recorded by the MBT network, geographic distribution was uneven, and temperature measurements from depths greater than 250 m still relied on sparse Nansen bottle observations.
2.4. Ship-Based Conductivity-Temperature-Depth
Instruments (From 1955)
[16] The development of the salinity-temperature-depth (STD) and later the conductivity-temperature-depth (CTD) instruments augmented existing observations by eventually replacing the discrete reversing thermometer observations with continuous profiles of temperature. The development of the CTD also laid the groundwork for our current observing system and for the backbone large-scale measurement cruises of the World Ocean Circulation Experiment (WOCE), among others. But, since it was an instrument mainly deployed from research ships, the CTD could not replace the MBT observing network. The development of the CTD was precipitated by advances in temperature measurement before and during World War II. The basic physical concept of a thermal resistor was known as early as 1833, when Faraday noted that the conductivity of certain elements was affected by changes in temperature [Faraday, 1833]. However, it was not until 1946 that technological advances made commercial production of these thermal resistors (coined "thermistors") possible [Becker, 1946]. Similarly, platinum resistance thermometers, which had been understood for some time [Callendar, 1887], became practical for oceanographic applications owing to more recent technological advances [Barber, 1950].
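In modern practice, converting a thermistor's resistance to temperature is commonly done with the Steinhart–Hart equation, 1/T = a + b·ln R + c·(ln R)³. The sketch below uses typical published coefficients for a 10 kΩ device; the specific a, b, c values are illustrative assumptions, not taken from this review.

```python
import math

# Steinhart-Hart conversion from NTC thermistor resistance to
# temperature: 1/T = a + b*ln(R) + c*ln(R)**3, with T in kelvin.
# Coefficients are typical published values for a 10 kOhm device
# (illustrative assumptions).

A = 1.129148e-3
B = 2.34125e-4
C = 8.76741e-8

def thermistor_temp_c(resistance_ohm):
    """Temperature (degC) implied by a thermistor resistance (ohm)."""
    ln_r = math.log(resistance_ohm)
    inv_t = A + B * ln_r + C * ln_r ** 3  # 1/T, in 1/K
    return 1.0 / inv_t - 273.15

print(round(thermistor_temp_c(10_000.0), 2))  # ~25.0 degC at 10 kOhm
```

In an operational instrument these coefficients are set per sensor by laboratory calibration, which is what allows the millikelvin-level accuracies discussed below.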
[17] An early attempt to measure a continuous temperature profile [Jacobsen, 1948] inspired Hamon and Brown [Hamon, 1955; Hamon and Brown, 1958] to engineer a similar instrument. Hamon and Brown deployed their first STD in 1955 [Baker, 1981]. Their instrument, which was lowered by a winch, used a thermistor, as well as a conductivity sensor and a pressure sensor, connected by a sealed cable to an analog strip chart on deck. The pressure sensor was a Bourdon tube connected to a potentiometer. Commercial production of CTDs began in 1964. Brown later modified the CTD design to use both a fast-response thermistor and a platinum resistance thermometer, as well as a wire strain gauge bridge transducer to measure pressure, in order to correct transients in the conductivity signal [Brown, 1974]. Most modern CTDs now use thermistors, often in pairs, and strain gauge pressure sensors. While Hamon's original STD experiments had an accuracy of 0.1°C and 20 m in depth, the modern CTD is fully digital and accurate to 0.001°C and 0.15% of full scale for pressure (1.5 m at 1000 m depth). Modern shipboard CTD temperature sensors have a time response of 0.065 s (compared to 0.2–0.4 s for the MBT stylus), which allows the acquisition of accurate pressure/temperature profiles at a fairly rapid deployment rate from the surface to the deep ocean. When combined with the lowering speed (~1 m s⁻¹), a vertical resolution of 0.06 m is obtained, although in practice, data are often reported in 1 or 2 m averages, since ship-roll-induced motions alias the temperature data on finer vertical scales.
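The quoted vertical resolution is simply the sensor time response multiplied by the lowering speed, and the 1 or 2 m reporting intervals are plain depth-bin averages. A small sketch of both calculations follows; the four-point example profile is invented for illustration.

```python
# Vertical resolution of a lowered CTD is the sensor time response
# multiplied by the lowering speed; reported profiles are then averaged
# into coarse depth bins to suppress ship-roll aliasing.

def vertical_resolution_m(response_s, lowering_speed_m_s):
    return response_s * lowering_speed_m_s

def bin_average(depths_m, temps_c, bin_size_m=2.0):
    """Average a (depth, temperature) profile into fixed-width bins."""
    bins = {}
    for z, t in zip(depths_m, temps_c):
        bins.setdefault(int(z // bin_size_m), []).append(t)
    return {i * bin_size_m: sum(v) / len(v) for i, v in sorted(bins.items())}

print(vertical_resolution_m(0.065, 1.0))  # 0.065 (m), as quoted above
# Invented four-point profile averaged into 2 m bins:
print(bin_average([0.5, 1.5, 2.5, 3.5], [20.0, 19.8, 19.0, 18.8]))
```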
2.5. The Expendable Bathythermograph Observing
System (From 1967)
[18] As Snodgrass [1968] relates, by the early 1960s the search was on for a replacement for the MBT. The replacement needed to be cheaper and easier to deploy, calibrate, and retrieve data from, and had to be able to profile deeply from ships moving faster than 15 knots. Technological advances in wire and wire insulation made it possible to create an instrument electrically connected to the ship and able to transmit information through a thin conducting wire. Advances in thermistor manufacture made it practical to deploy these temperature sensors cheaply, with no need to retrieve the instruments after deployment. More than 12 companies attempted to create the expendable bathythermograph (XBT). Three succeeded, but only one, Sippican (now Lockheed Martin Sippican (LMS)), went on to dominate the XBT market, owing to its winning of a contract with the U.S. Navy [Kizu et al., 2011]. Their design was a torpedo-shaped probe, smaller than the MBT, containing a thermistor in a central hole through the zinc nose. A wire connects the probe to the ship's deck; part of the wire is wrapped around the XBT itself and part is in a canister shipboard.

[19] U.S. Navy traces were sent to the Fleet Numerical Weather Center (FNWC), where they were digitized, used for weather prediction and other projects, and then passed to the U.S. National Oceanographic Data Center (NODC) for archiving and public release [Magruder, 1970]. About 60% of all publicly available XBT data in 1967–1989 were U.S. drops. In 1990, a global system for distributing XBT data was implemented (see the discussion of the Global Temperature and Salinity Profile Program (GTSPP) below).
[20] The new probe almost immediately revolutionized subsurface ocean temperature observations through its low cost and easy deployment from Navy, merchant, and research ships. Estimates of the upper ocean global mean yearly heat content anomaly exhibit reduced sampling uncertainty starting from around 1967, the first year of widespread use of the XBT [Lyman and Johnson, 2008; Boyer et al., 1998]. The success of the XBT and the concurrent FNWC Ship-of-Opportunity Program (SOOP) led to more systematic designs of XBT observing networks for the Pacific [White and Bernstein, 1979] and the Atlantic [Bretherton et al., 1984; Festa and Molinari, 1992], which were implemented and continue still. The switch to digital recorders in the 1980s made the use and dissemination of XBT data even easier.
[21] With the advent of the ARGOS positioning and data transmission system, set up by the French and U.S. space agencies in 1978, XBT profiles began to be transmitted from ships in real time and distributed on the World Meteorological Organization's Global Telecommunications System (GTS). The Global Temperature and Salinity Profile Program (GTSPP) began in 1990 to systematically capture subsurface temperature data off the GTS, perform quality checking and control, and distribute XBT temperature profiles (and other subsurface data) to the scientific and operational communities in near-real time. The XBT response time, at 0.15 s, is slower than that of modern shipboard CTDs, and its accuracy likewise, at 0.15°C in temperature and 2% or 5 m in depth, whichever is greater. LMS is still the main manufacturer of XBTs. TSK, a Japanese company (Tsurumi-Seiki Co.), started manufacturing T6s in 1972 and T7s in 1978 [Kizu et al., 2011]; these designators follow a model-naming scheme that uses letter/number combinations to identify probe types. A Canadian company, Sparton, also briefly manufactured XBTs of their own design.
[22] Despite their widespread use, XBTs are not free of problems; section 3 of this review will discuss these problems in detail. From 1967 to 2001, the XBT was a major contributor to the subsurface temperature observing system and was responsible for much of its growth. However, it was still limited to major shipping routes and Navy and research cruise paths, leaving large parts of the ocean undersampled for many years. The XBT is also depth limited: while there are deep-falling XBTs, such as the T-5, that reach to nearly 2000 m, they are of limited use owing to cost and the lower ship speed necessary for the drops.
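A further consequence of the XBT's expendable design is that depth is not measured at all: it is inferred from elapsed time through a quadratic fall-rate equation of the form z(t) = at − bt². The sketch below pairs that equation with the depth accuracy quoted above (2% or 5 m, whichever is greater); the coefficients are the widely cited revised T-7 values of Hanawa et al. [1995], used here as an illustrative assumption.

```python
# XBT depth is inferred, not measured: z(t) = a*t - b*t**2, with t the
# elapsed seconds since the probe entered the water. Coefficients are
# commonly cited revised T-7 values (an illustrative assumption).

A_FALL = 6.691    # m s^-1, initial descent rate
B_FALL = 0.00225  # m s^-2, gradual deceleration as wire pays out

def xbt_depth_m(t_s):
    return A_FALL * t_s - B_FALL * t_s ** 2

def depth_uncertainty_m(depth_m):
    """Depth accuracy quoted in the text: 2% or 5 m, whichever is greater."""
    return max(0.02 * depth_m, 5.0)

z = xbt_depth_m(100.0)  # depth after 100 s of fall
print(round(z, 1), "+/-", round(depth_uncertainty_m(z), 1), "m")  # 646.6 +/- 12.9 m
```

Errors in these fall-rate coefficients are one source of the systematic XBT biases taken up in section 3.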
[23] There is another expendable probe, the XCTD, that contemporaneously measures conductivity and temperature. It is available from TSK; however, it has appeared in far fewer numbers than the XBT devices described here.
2.6. Tropical Moored Arrays (From 1984)
[24] The tropical moored arrays were set up to continuously monitor the tropical ocean. The first tropical moored array, the Tropical Atmosphere Ocean (TAO) array (later TAO/TRITON), was set up to help monitor and understand the El Niño phenomenon [McPhaden et al., 1998]. After initial experiments in 1979, an array of moored buoys, spaced at 2°–3° latitude and 10°–15° longitude, was set up across the equatorial Pacific. Work began on the array in 1984, and it was completed in 1994. The temperature sensor is often just a thermistor but is sometimes paired with a conductivity or pressure sensor, depending on geographic location and depth. Each buoyed sensor is attached to a mooring line and hung at depths from the surface to 500 m. The measurements are relayed to a satellite and then to the GTS at 12 min intervals. The TAO/TRITON array requires regular maintenance and calibration cruises.
[25] The PIRATA array (Pilot Research moored Array in the Tropical Atlantic) [Bourles et al., 2008] was set up in the Atlantic starting in the mid-1990s. The RAMA array (Research Moored Array for African-Asian-Australian Monsoon Analysis and Prediction) [McPhaden et al., 2009], begun in the Indian Ocean in the early 2000s, is still not complete. Both follow setup and data transmission patterns similar to those of TAO/TRITON. The arrays are important for local heat content calculations [e.g., Xue et al., 2012]; even the exclusion of one meridional set of buoys from the heat content calculation during the 1997–1998 El Niño led to a significant underestimate of the heat content anomaly.
2.7. Argo Profiling Float Observing System
(From 2001)
[26] By the 1990s, all the pieces were in place for a global ocean observing system: a scientifically based blueprint for systematic observations, a satellite network for real-time data delivery, technology for easy and accurate temperature and pressure (depth) measurements, and a reliable data distribution network. But the observing system was still limited by the need to take most measurements from ships, geographically limited, seasonally biased, and often costly to outfit and deploy. As with previous obstacles to the observing system, the answer to these limitations lay in a combination of older ideas and new technological applications. The Swallow float was a neutrally buoyant float developed in the 1950s [Swallow, 1955]. These floats sank to a neutrally buoyant level and were tracked by a nearby surface ship. Later, the SOFAR (Sound Fixing and Ranging) float [Webb and Tucker, 1970; Rossby and Webb, 1970] improved on this system by enabling tracking of the float by underwater listening devices. In the 1980s, the RAFOS float reversed this idea by having the float listen for stationary underwater sound sources [Rossby et al., 1986].
[27] The Autonomous Lagrangian Circulation Explorer removed entirely the need for a system of underwater sound sources by having the float surface periodically so that its position could be determined by ARGOS satellites [Davis et al., 1992]. The floats surface by increasing their buoyancy relative to the surrounding water, transferring mass and volume between the float's pressure case and an external bladder. The process

References
More filters
Book ChapterDOI
01 Jan 2013
TL;DR: The Intergovernmental Panel on Climate Change (IPCC) as mentioned in this paper has become a key framework for the exchange of scientific dialogue on climate change within the scientific community as well as across the science and policy arenas.
Abstract: The Intergovernmental Panel on Climate Change (IPCC) is perceived as the leading international body for the assessment of climate change. In the 23 years since its founding, it has become a key framework for the exchange of scientific dialogue on climate change within the scientific community as well as across the science and policy arenas. This article provides an introduction to the IPCC (its establishment, structure, procedures, and publications) and briefly discusses the solutions proposed by the IPCC in the face of recent criticism and media scrutiny. The philosophical framework of the science/policy interface in which the IPCC functions is presented. Finally, this article concludes with a presentation of the challenges facing the IPCC in the ongoing preparation of its 5th assessment report including exploration of the entire solutions space, ensuring a comparable set of scenarios across IPCC working groups and a consistent treatment of uncertainty.

4,080 citations

Journal ArticleDOI
24 Mar 2000-Science
TL;DR: In this article, the authors quantify the interannual-to-decadal variability of the heat content (mean temperature) of the world ocean from the surface through 3000-meter depth for the period 1948 to 1998, showing that the global volume mean temperature increase for the 0- to 300-meter layer was 0.31°C, corresponding to an increase in heat content for this layer of ∼10 23 joules between the mid-1950s and mid-1990s.
Abstract: We quantify the interannual-to-decadal variability of the heat content (mean temperature) of the world ocean from the surface through 3000-meter depth for the period 1948 to 1998. The heat content of the world ocean increased by ∼2 × 10 23 joules between the mid-1950s and mid-1990s, representing a volume mean warming of 0.06°C. This corresponds to a warming rate of 0.3 watt per meter squared (per unit area of Earth9s surface). Substantial changes in heat content occurred in the 300- to 1000-meter layers of each ocean and in depths greater than 1000 meters of the North Atlantic. The global volume mean temperature increase for the 0- to 300-meter layer was 0.31°C, corresponding to an increase in heat content for this layer of ∼10 23 joules between the mid-1950s and mid-1990s. The Atlantic and Pacific Oceans have undergone a net warming since the 1950s and the Indian Ocean has warmed since the mid-1960s, although the warming is not monotonic.

1,680 citations

Journal ArticleDOI
TL;DR: The Simple Ocean Data Assimilation (SODA) reanalysis of ocean climate variability is described in this article, where a model forecast produced by an ocean general circulation model with an average resolution of 0.25° 0.4° 40 levels is continuously corrected by contemporaneous observations with corrections estimated every 10 days.
Abstract: This paper describes the Simple Ocean Data Assimilation (SODA) reanalysis of ocean climate variability. In the assimilation, a model forecast produced by an ocean general circulation model with an average resolution of 0.25° 0.4° 40 levels is continuously corrected by contemporaneous observations with corrections estimated every 10 days. The basic reanalysis, SODA 1.4.2, spans the 44-yr period from 1958 to 2001, which complements the span of the 40-yr European Centre for Medium-Range Weather Forecasts (ECMWF) atmospheric reanalysis (ERA-40). The observation set for this experiment includes the historical archive of hydrographic profiles supplemented by ship intake measurements, moored hydrographic observations, and remotely sensed SST. A parallel run, SODA 1.4.0, is forced with identical surface boundary conditions, but without data assimilation. The new reanalysis represents a significant improvement over a previously published version of the SODA algorithm. In particular, eddy kinetic energy and sea level variability are much larger than in previous versions and are more similar to estimates from independent observations. One issue addressed in this paper is the relative importance of the model forecast versus the observations for the analysis. The results show that at near-annual frequencies the forecast model has a strong influence, whereas at decadal frequencies the observations become increasingly dominant in the analysis. As a consequence, interannual variability in SODA 1.4.2 closely resembles interannual variability in SODA 1.4.0. However, decadal anomalies of the 0–700-m heat content from SODA 1.4.2 more closely resemble heat content anomalies based on observations.

1,614 citations
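The correction step described in the abstract, in which a model forecast is continuously adjusted toward contemporaneous observations, can be sketched as an optimal-interpolation-style update. This is a minimal toy illustration of the general idea, not SODA's actual scheme or code; the grid size, observation operator, and covariance matrices below are invented for the example:

```python
import numpy as np

def analysis_update(forecast, obs, H, B, R):
    """One optimal-interpolation-style correction step:
    analysis = forecast + K (obs - H @ forecast),
    with gain K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return forecast + K @ (obs - H @ forecast)

# Toy example: 3 model grid points, observations at points 0 and 2.
forecast = np.array([10.0, 11.0, 12.0])           # model temperatures (deg C)
H = np.array([[1.0, 0.0, 0.0],                    # observation operator
              [0.0, 0.0, 1.0]])
B = 0.5 * np.eye(3)                               # background error covariance
R = 0.1 * np.eye(2)                               # observation error covariance
obs = np.array([10.4, 11.8])

analysis = analysis_update(forecast, obs, H, B, R)
# Observed points are pulled toward the observations; with a diagonal B,
# the unobserved point (index 1) is left unchanged.
```

With spatially correlated background errors (off-diagonal terms in B), the same update also spreads observational information to unobserved grid points, which is how sparse hydrographic profiles can correct a full model state.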

Journal ArticleDOI
TL;DR: In this paper, the authors used three statistically based methods: optimal smoothing (OS), the Kalman filter (KF), and optimal interpolation (OI), along with estimates of the error covariance of the analyzed fields.
Abstract: Global analyses of monthly sea surface temperature (SST) anomalies from 1856 to 1991 are produced using three statistically based methods: optimal smoothing (OS), the Kalman filter (KF), and optimal interpolation (OI). Each of these is accompanied by estimates of the error covariance of the analyzed fields. The spatial covariance function these methods require is estimated from the available data; the time-marching model is a first-order autoregressive model, again estimated from data. The data input for the analyses are monthly anomalies from the United Kingdom Meteorological Office historical sea surface temperature data set (MOHSST5) (Parker et al., 1994) of the Global Ocean Surface Temperature Atlas (GOSTA) (Bottomley et al., 1990). These analyses are compared with each other, with GOSTA, and with an analysis generated by projection (P) onto a set of empirical orthogonal functions (as in Smith et al. (1996)). In theory, the quality of the analyses should rank in the order OS, KF, OI, P, and GOSTA. It is found that the first four give comparable results in the data-rich period (1951-1991), but at times when data are sparse the first three differ significantly from P and GOSTA. At these times the latter two often have extreme and fluctuating values, prima facie evidence of error. The statistical schemes are also verified against data not used in any of the analyses (proxy records derived from corals and air temperature records from coastal and island stations). We also present evidence that the analysis error estimates are indeed indicative of the quality of the products. At most times the OS and KF products are close to the OI product, but at times of especially poor coverage their use of information from other times is advantageous. The methods appear to reconstruct the major features of the global SST field from very sparse data.
Comparison with other indications of the El Niño-Southern Oscillation cycle shows that the analyses provide usable information on interannual variability as far back as the 1860s.

1,561 citations
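The first-order autoregressive time-marching model that the abstract says is "estimated from data" can be illustrated with a short sketch. The series below is synthetic, with an invented persistence value and noise level, chosen only to show the lag-1 least-squares fit such schemes rely on:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly SST anomaly series with known persistence phi = 0.7:
# x[t] = phi * x[t-1] + white noise.
phi_true = 0.7
n = 5000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal(scale=0.3)

# Fit the AR(1) coefficient by lag-1 least squares, i.e. regress x[t]
# on x[t-1]; this is how a data-derived time-marching model is estimated.
phi_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
# phi_hat recovers roughly 0.7 from the synthetic record.
```

In an actual reconstruction the fitted AR(1) model propagates the analyzed anomaly field between months, so that information from data-rich times can constrain data-sparse ones.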

Frequently Asked Questions (17)
Q1. What have the authors contributed in "A review of global ocean temperature observations: implications for ocean heat content estimates and climate change" ?

The evolution of ocean temperature measurement systems is presented with a focus on the development and accuracy of two critical devices in use today ( expendable bathythermographs and conductivity-temperature-depth instruments used on Argo floats ). A detailed discussion of the accuracy of these devices and a projection of the future of ocean temperature measurements are provided. Furthermore, despite differences in measurement methods and analysis techniques, multiple studies show that there has been a multidecadal increase in the heat content of both the upper and deep ocean regions, which reflects the impact of anthropogenic warming. 

Despite these potential future improvements to ocean monitoring, past and present measurements show that the Earth is experiencing a net gain in heat, largely from anthropogenic factors [ Hansen et al., 2005 ; Levitus et al., 2001 ], although the magnitude differs among individual studies. 

Technological advances in wire and wire insulation made it possible to create an instrument electrically connected to the ship and able to transmit information through a thin conducting wire. 

The expected lifetime of an Argo float is 3–5 years, so the fleet must be continually renewed to maintain the 3000 float goal. [28] 

Modern shipboard CTD temperature sensors have a time response of 0.065 s (compared to 0.2–0.4 s for the MBT stylus), which allows the acquisition of accurate pressure/temperature profiles at a fairly rapid deployment rate from the surface to the deep ocean. 

The Challenger was equipped with a pressure-shielded thermometer [Anonymous, 1870; Wollaston, 1782; Roemmich et al., 2012] to partially counteract the effects of pressure on temperature at great depths. 

Since it was an instrument mainly deployed from research ships, the CTD could not replace the MBT observing network. 

With respect to sea level rise, mutually reinforcing information from tide gauges and radar altimetry shows that presently, sea level is rising at approximately 3 mm yr-1 with contributions from both thermal expansion and mass accumulation from ice melt. 

Pairs of protected and unprotected reversing thermometers were used to determine temperature and pressure, with pressure determined to an accuracy of ±5m depth in the upper 1000m. 

On Captain James Cook’s second voyage (1772–1775), water samples were obtained from the subsurface Southern Ocean and it was found that surface waters were colder than waters at 100 fathoms (~183m) [Cook, 1777]. 

From 1967 to 2001, the XBT was a major contributor to the subsurface temperature observing system and was responsible for the growth of this system. 

The array is important for local heat content calculations [e.g., Xue et al., 2012], and even the exclusion of one meridional set of buoys from the heat content calculation during the 1997–1998 El Niño led to a significant underestimate of heat content anomaly. 

Those instruments (the expendable bathythermograph (XBT) and the Argo floats) are among the most important instruments for assessing ocean temperatures globally, and they provide up-to-date ocean subsurface temperature measurements. 

By 1946, MBTs could reach 900 feet (~275m), although the shallower version was deployed more often in every year except 1964 (when it accounted for 49% of deployments). 

The XBT response time, at 0.15 s, is slower than that of modern shipboard CTDs, and its accuracy is likewise lower: 0.15°C in temperature, and 2% of depth or 5m, whichever is greater. 
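For context, XBT depth is not measured directly but inferred from elapsed fall time through a quadratic fall-rate equation, which is why depth carries its own error term. The sketch below uses the widely quoted original manufacturer coefficients for common Sippican probes (revised coefficients have since been published, e.g. by Hanawa and colleagues), together with the stated 2%-or-5m accuracy rule:

```python
def xbt_depth(t, a=6.472, b=0.00216):
    """Depth (m) inferred from elapsed fall time t (s) via the quadratic
    fall-rate equation z = a*t - b*t**2. The default coefficients are the
    often-quoted original manufacturer values for common Sippican probes;
    revised coefficients exist and are an assumption of this sketch."""
    return a * t - b * t * t

def depth_uncertainty(z):
    """Stated XBT depth accuracy: 2% of depth or 5 m, whichever is greater."""
    return max(0.02 * z, 5.0)

z100 = xbt_depth(100.0)        # depth after 100 s of free fall: 625.6 m
err = depth_uncertainty(z100)  # 2% of 625.6 m = 12.512 m (> 5 m floor)
```

Small systematic errors in the fall-rate coefficients therefore map directly into depth (and hence heat content) biases, which is why fall-rate corrections matter for the historical XBT record.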

The problems during this time period with regard to a global ocean observing system were that Nansen bottle/reversing thermometer systems could only measure at a few discrete levels at each oceanographic station and that it was time consuming to deploy the instrumentation and make the measurements. 

A stylus attached to the Bourdon tube records the temperature change as a horizontal scratch on a plate of smoked glass. 

Trending Questions (1)
What are the implications of ocean heat budget estimates for climate change?

Ocean heat budget estimates show a multidecadal increase in both upper- and deep-ocean heat content that reflects anthropogenic warming, and because the ocean is the main store of excess energy, these estimates provide the primary inventory for assessing Earth's energy imbalance and thermosteric sea level rise.
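As a back-of-envelope illustration of how ocean heat content ties to Earth's energy imbalance, the sketch below converts a hypothetical uniform warming of the 0-700m layer into an equivalent mean surface heat flux. The warming magnitude, layer choice, and seawater constants are illustrative assumptions, not values taken from the review:

```python
# Illustrative seawater constants (assumed round numbers).
RHO = 1025.0   # density, kg m^-3
CP = 3990.0    # specific heat, J kg^-1 K^-1

def column_heat_anomaly(delta_t, layer_thickness):
    """Heat content anomaly per unit area (J m^-2) for a water column,
    given layer temperature anomalies (K) and thicknesses (m)."""
    return sum(RHO * CP * dt for dt, dz in zip(delta_t, layer_thickness)
               if False) or sum(RHO * CP * dt * dz
                                for dt, dz in zip(delta_t, layer_thickness))

# Hypothetical example: a uniform 0.1 K warming of the 0-700 m layer...
q = column_heat_anomaly([0.1], [700.0])          # ~2.9e8 J m^-2

# ...accumulated over one decade implies a mean heating rate of
decade = 10 * 365.25 * 24 * 3600.0               # seconds in a decade
flux = q / decade                                # ~0.9 W m^-2 into the ocean
```

This is the sense in which changes in ocean heat storage serve as an inventory of the top-of-atmosphere energy imbalance: a sustained sub-kelvin warming of the upper ocean over a decade corresponds to a persistent flux of order 1 W m^-2.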