Journal ArticleDOI

Geodynamo model and error parameter estimation using geomagnetic data assimilation

01 Jan 2015-Geophysical Journal International (Oxford University Press)-Vol. 200, Iss: 1, pp 664-675
TL;DR: In this paper, a new geomagnetic data assimilation approach was developed which uses the minimum variance estimate for the analysis state, and which models both the forecast (or model output) and observation errors using an empirical approach and parameter tuning.
Abstract: We have developed a new geomagnetic data assimilation approach which uses the minimum variance estimate for the analysis state, and which models both the forecast (or model output) and observation errors using an empirical approach and parameter tuning. This system is used in a series of assimilation experiments using Gauss coefficients (hereafter referred to as observational data) from the GUFM1 and CM4 field models for the years 1590–1990. We show that this assimilation system could be used to improve our knowledge of model parameters, model errors and the dynamical consistency of observation errors, by comparing forecasts of the magnetic field with the observations every 20 yr. Statistics of differences between observation and forecast (O − F) are used to determine how forecast accuracy depends on the Rayleigh number, the forecast error correlation length scale and an observation error scale factor. Experiments have been carried out which demonstrate that a Rayleigh number of 30 times the critical Rayleigh number produces better geomagnetic forecasts than lower values, with an Ekman number of E = 1.25 × 10−6, which produces a modified magnetic Reynolds number within the parameter domain of an 'Earth-like' geodynamo. The optimal forecast error correlation length scale is found to be around 90 per cent of the thickness of the outer core, indicating a significant bias in the forecasts. Geomagnetic forecasts are also found to be highly sensitive to estimates of modelled observation errors: errors that are too small do not lead to the gradual reduction in forecast error with time that is generally expected in a data assimilation system, while observation errors that are too large lead to model divergence.
Finally, we show that assimilation of L ≤ 3 (or large-scale) Gauss coefficients can help to improve forecasts of the L > 5 (smaller-scale) coefficients, and that these improvements are the result of corrections to the velocity field in the geodynamo model.
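The minimum variance analysis step described in the abstract can be sketched as a standard optimal-interpolation update. This is a minimal toy illustration of the general technique, not the paper's actual error models: the matrices, dimensions and the helper `analysis_step` are all invented for the example.

```python
import numpy as np

def analysis_step(x_f, y, H, P, R):
    """Minimum-variance analysis: combine forecast x_f with observations y.

    x_f : forecast state, shape (n,)
    y   : observations, shape (m,)
    H   : observation operator, shape (m, n)
    P   : forecast error covariance, shape (n, n)
    R   : observation error covariance, shape (m, m)
    """
    # Gain K = P H^T (H P H^T + R)^{-1} minimizes the analysis error variance.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    # Analysis = forecast + gain * innovation (O - F).
    return x_f + K @ (y - H @ x_f)

# Toy example: two-component state, only the first component is observed.
x_f = np.array([1.0, 2.0])
H = np.array([[1.0, 0.0]])
P = np.array([[0.5, 0.1], [0.1, 0.5]])   # forecast error covariance (toy)
R = np.array([[0.5]])                     # observation error covariance (toy)
y = np.array([2.0])

x_a = analysis_step(x_f, y, H, P, R)
# The observed component moves toward the observation; the unobserved
# component is also corrected through the cross-covariance in P.
```

The off-diagonal term in P is what lets an observed quantity correct an unobserved one, which is the mechanism behind the abstract's result that large-scale observations improve small-scale forecasts via the velocity field.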


Citations
Journal ArticleDOI
TL;DR: In this article, the remarkable developments in numerical geodynamo simulations over the last few years are reviewed, where geomagnetic data and dynamo simulations are coupled to form a tool for interpreting the magnetostrophic force balance.
Abstract: This paper reviews the remarkable developments in numerical geodynamo simulations over the last few years. Simulations with Ekman numbers as low as E = 10−8 are now within reach and more and more details of the observed field are recovered by computer models. However, some newer experimental and ab initio results suggest a rather large thermal conductivity for the liquid iron alloy in Earth's core. More heat would then simply be conducted down the core adiabat and would not be available for driving the dynamo process. The current status of this topic is reported and alternative driving scenarios are discussed. The paper then addresses the question whether dynamo simulations obey the magnetostrophic force balance that characterises the geodynamo and proceeds with discussing related problems like scaling laws and torsional oscillations. Finally, recent developments in geomagnetic data assimilation are reviewed, where geomagnetic data and dynamo simulations are coupled to form a tool for interpre...

38 citations

Journal ArticleDOI
TL;DR: In this paper, the authors focus on the particular contribution of liquid sodium spherical Couette devices to the subject matter of Earth's core dynamics, focusing on their ability to model the interplay between the flow of the conducting liquid and the magnetic field this flow sustains via dynamo action.
Abstract: Our understanding of the dynamics of the Earth’s core can be advanced by a combination of observation, experiments, and simulations. A crucial aspect of the core is the interplay between the flow of the conducting liquid and the magnetic field this flow sustains via dynamo action. This non-linear interaction, and the presence of turbulence in the flow, precludes direct numerical simulation of the system with realistic turbulence. Thus, in addition to simulations and observations (both seismological and geomagnetic), experiments can contribute insight into the core dynamics. Liquid sodium laboratory experiments can serve as models of the Earth’s core with the key ingredients of conducting fluid, turbulent flow, and overall rotation, and can also approximate the geometry of the core. By accessing regions of parameter space inaccessible to numerical studies, experiments can benchmark simulations and reveal phenomena relevant to the Earth’s core and other planetary cores. This review focuses on the particular contribution of liquid sodium spherical Couette devices to this subject matter.

18 citations


Cites methods from "Geodynamo model and error parameter..."

  • ...Several groups have used these techniques to better understand and forecast the geodynamo (Fournier et al. 2010; Tangborn and Kuang 2015)....


Posted Content
TL;DR: In this article, the authors review the observational constraints on geomagnetic field changes from interannual to millennial periods and discuss the current resolution of field models (covering archeological to satellite eras).
Abstract: Observational constraints on geomagnetic field changes from interannual to millennial periods are reviewed, and the current resolution of field models (covering archeological to satellite eras) is discussed. With the perspective of data assimilation, emphasis is put on the uncertainties attached to Gauss coefficients, and on the statistical properties of ground-based records. The latter potentially call for leaving behind the notion of geomagnetic jerks. The accuracy at which we recover interannual changes also requires considering with caution the apparent periodicity seen in the secular acceleration from satellite data. I then address the interpretation of recorded magnetic fluctuations in terms of core dynamics, highlighting the need for models that allow (or pre-suppose) a magnetic energy orders of magnitude larger than the kinetic energy at large length-scales, a target for future numerical simulations of the geodynamo. I finally recall the first attempts at implementing geomagnetic data assimilation algorithms.

18 citations


Cites background from "Geodynamo model and error parameter..."

  • ...It is however known that space and time cross-correlations are important [Fournier et al., 2011, Tangborn and Kuang, 2015]....


Journal ArticleDOI
TL;DR: In this article, a sequential data assimilation framework is proposed to extract the correlations from an ensemble of dynamo models for long-term geomagnetic predictions, where the primary correlations couple variables of the same azimuthal wave number, reflecting the predominant axial symmetry of the magnetic field.
Abstract: High precision observations of the present day geomagnetic field by ground based observatories and satellites provide unprecedented conditions for unveiling the dynamics of the Earth’s core. Combining geomagnetic observations with dynamo simulations in a data assimilation framework allows the reconstruction of past and present states of the internal core dynamics. The essential information that couples the internal state to the observations is provided by the statistical correlations from a numerical dynamo model in the form of a model covariance matrix. Here we test a sequential data assimilation framework, working through a succession of forecast and analysis steps, that extracts the correlations from an ensemble of dynamo models. The primary correlations couple variables of the same azimuthal wave number, reflecting the predominant axial symmetry of the magnetic field. Synthetic tests show that the scheme becomes unstable when confronted with high precision geomagnetic observations. Our study has identified spurious secondary correlations as the origin of the problem. Keeping only the primary correlations, by localizing the covariance matrix with respect to the azimuthal wave number, suffices to stabilize the assimilation. While the first analysis step is fundamental in constraining the large scale interior state, further assimilation steps refine the smaller and more dynamical scales. This refinement turns out to be critical for long term geomagnetic predictions. Increasing the assimilation steps from one to 18 roughly doubles the prediction horizon for the dipole from about 500 years to a millennium, and from 50 to about 100 years for smaller observable scales. This improvement is also reflected on the predictability of surface intensity features such as the South Atlantic Anomaly. Intensity prediction errors are decreased roughly by a half when assimilating long observation sequences.
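The covariance localization described in this abstract — keeping only correlations between variables of the same azimuthal wave number — can be sketched with a simple mask on an ensemble sample covariance. The wave-number assignment, ensemble size and state dimension below are toy values chosen for illustration, not the study's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy state: six variables, each tagged with an azimuthal wave number m.
m_index = np.array([0, 0, 1, 1, 2, 2])

# Ensemble of 20 model states (rows = members, columns = state variables).
ensemble = rng.standard_normal((20, 6))

# Raw sample covariance contains spurious cross-wave-number correlations
# from the finite ensemble size.
P = np.cov(ensemble, rowvar=False)

# Localize: keep only entries that couple variables with the same m,
# zeroing everything else.
mask = (m_index[:, None] == m_index[None, :]).astype(float)
P_loc = P * mask
```

Because the mask is symmetric, the localized matrix stays symmetric; the same-wave-number diagonal blocks are untouched while every cross-wave-number block is set exactly to zero, which is the stabilization mechanism the abstract describes.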

15 citations

References
Book
01 Nov 2002
TL;DR: A comprehensive text and reference work on numerical weather prediction, first published in 2002, covers not only methods for numerical modeling, but also the important related areas of data assimilation and predictability.
Abstract: This comprehensive text and reference work on numerical weather prediction, first published in 2002, covers not only methods for numerical modeling, but also the important related areas of data assimilation and predictability. It incorporates all aspects of environmental computer modeling including an historical overview of the subject, equations of motion and their approximations, a modern and clear description of numerical methods, and the determination of initial conditions using weather observations (an important science known as data assimilation). Finally, this book provides a clear discussion of the problems of predictability and chaos in dynamical systems and how they can be applied to atmospheric and oceanic systems. Professors and students in meteorology, atmospheric science, oceanography, hydrology and environmental science will find much to interest them in this book, which can also form the basis of one or more graduate-level courses.

2,240 citations

Journal ArticleDOI
TL;DR: The system of objective weather map analysis used at the Joint Numerical Weather Prediction Unit is described in this article, which is an integral part of the automatic data processing system, and is designed to operate with a minimum of manual supervision.
Abstract: The system of objective weather map analysis used at the Joint Numerical Weather Prediction Unit is described. It is an integral part of the automatic data processing system, and is designed to operate with a minimum of manual supervision. The analysis method, based mainly on the method of Bergthórsson and Döös, is essentially a method of applying corrections to a first-guess field. The corrections are determined from a comparison of the data with the interpolated value of the guess field at the observation point. For the analysis of the heights of a pressure surface the reported wind is taken into account in determining the lateral gradient of the correction to be applied. A series of scans of the field is made, each scan consisting of the application of corrections on a smaller lateral scale than during the previous scan. The analysis system is very flexible, and has been used to analyze many different types of variables. An example of horizontal divergence computed from a direct wind analysis is ...

1,771 citations

Book
01 Jan 1922
TL;DR: This chapter discusses the arrangement of points and instants in sequence, and some remaining problems of computing forms.
Abstract: The idea of forecasting the weather by calculation was first dreamt of by Lewis Fry Richardson. He set out in this book a detailed algorithm for systematic numerical weather prediction. The method of computing atmospheric changes, which he mapped out in great detail in this book, is essentially the method used today. He was greatly ahead of his time because, before his ideas could bear fruit, advances in four critical areas were needed: better understanding of the dynamics of the atmosphere; stable computational algorithms to integrate the equations; regular observations of the free atmosphere; and powerful automatic computer equipment. Over the ensuing years, progress in numerical weather prediction has been dramatic. Weather prediction and climate modelling have now reached a high level of sophistication, and are witness to the influence of Richardson's ideas. This new edition contains a new foreword by Peter Lynch that sets the original book in context.

1,474 citations

Journal ArticleDOI
Andrew C. Lorenc1
TL;DR: Methods discussed include variational techniques, smoothing splines, Kriging, optimal interpolation, successive corrections, constrained initialization, the Kalman-Bucy filter, and adjoint model data assimilation, which are all shown to relate to the idealized analysis, and hence to each other.
Abstract: Bayesian probabilistic arguments are used to derive idealized equations for finding the best analysis for numerical weather prediction. These equations are compared with those from other published methods in the light of the physical characteristics of the NWP analysis problem; namely the predetermined nature of the basis for the analysis, the need for approximation because of large-order systems, the underdeterminacy of the problem when using observations alone, and the availability of prior relationships to resolve the underdeterminacy. Prior relationships result from (1) knowledge of the time evolution of the model (which together with the use of a time distribution of observations constitutes four-dimensional data assimilation); (2) knowledge that the atmosphere varies slowly (leading to balance relationships); (3) other nonlinear relationships coupling parameters and scales in the atmosphere. Methods discussed include variational techniques, smoothing splines, Kriging, optimal interpolation, successive corrections, constrained initialization, the Kalman-Bucy filter, and adjoint model data assimilation. They are all shown to relate to the idealized analysis, and hence to each other. Opinions are given on when particular methods might be more appropriate. By comparison with the idealized method some insight is gained into appropriate choices of parameters in the practical methods.

1,301 citations