Journal ArticleDOI

Objective quality control of observations using Bayesian methods. Theory, and a practical implementation

Andrew C. Lorenc, +1 more
01 Jan 1988 - Vol. 114, Iss. 480, pp. 515-543
TL;DR
In this paper, the authors provide a theoretical framework for the quality control of data from a large variety of types of observations, with different accuracies and reliabilities, and apply Bayes' theorem to derive the well-known formula for the combination of data with errors.
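For concreteness, the well-known combination result referred to here can be stated as follows (a standard textbook identity, written in generic symbols rather than the paper's notation): combining a background estimate x_b with Gaussian error variance \sigma_b^2 and an observation y with Gaussian error variance \sigma_o^2 gives the posterior (analysis) mean and variance

\[
x_a = \frac{x_b/\sigma_b^2 + y/\sigma_o^2}{1/\sigma_b^2 + 1/\sigma_o^2},
\qquad
\sigma_a^2 = \left(\frac{1}{\sigma_b^2} + \frac{1}{\sigma_o^2}\right)^{-1},
\]

i.e. a precision-weighted average whose variance is smaller than that of either source alone.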
Abstract
This work attempts to provide a theoretical framework for the quality control of data from a large variety of types of observations, with different accuracies and reliabilities. Bayes' theorem is introduced, and is used in a simple example with Gaussian error distributions to derive the well-known formula for the combination of data with errors. A simple model is proposed whereby the error in each datum is either from a known Gaussian distribution, or a gross error, in which case the observation gives no useful information. Bayes' theorem is applied to this, and it is shown that usual operational practice, which is to reject outlying data and to treat the rest as if their errors are Gaussian, is a reasonable approximation to the correct Bayesian analysis. Appropriate rejection criteria are derived in terms of the observational error and the prior probability of a gross error. These ideas have been implemented in a computer program to check pressure, wind, temperature and position data from ships, weather ships, buoys and coastal synoptic reports. Historical information on the accuracies and reliabilities of various classifications of observation is used to provide prior estimates of observational errors and the prior probabilities of gross error. The latter are then updated in the light of information from a current forecast, and from nearby observations (allowing for the inaccuracies and possible gross errors in these) to give new estimates. The final probabilities can be used to reject or accept the data in an objective analysis. Results from trials of this system are given. It is shown to be possible using an archive generated by the system to update the prior error statistics necessary to make the method truly objective. Some practical case studies are shown, and compared with careful human quality control.
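A minimal sketch of the kind of check described above (not the paper's operational code): the observation-minus-background difference is assumed Gaussian with variance sigma_o^2 + sigma_b^2 when the datum is good, and drawn from a flat, uninformative distribution of width gross_width when it contains a gross error; Bayes' theorem then updates the prior probability of a gross error. All variable names, the flat gross-error model, and the 0.5 rejection threshold below are illustrative assumptions.

import math

def prob_gross_error(y_obs, x_bg, sigma_o, sigma_b, p_gross, gross_width):
    """Posterior probability that an observation contains a gross error.

    The difference d = y_obs - x_bg is modelled either as a zero-mean
    Gaussian with variance sigma_o**2 + sigma_b**2 (good datum) or as a
    flat distribution of width gross_width (gross error, carrying no
    useful information).
    """
    d = y_obs - x_bg
    var = sigma_o ** 2 + sigma_b ** 2
    # Likelihood of the difference if the observation is good (Gaussian).
    like_good = math.exp(-0.5 * d * d / var) / math.sqrt(2.0 * math.pi * var)
    # Likelihood if the observation is a gross error (flat, uninformative).
    like_gross = 1.0 / gross_width
    # Bayes' theorem: update the prior probability of a gross error.
    num = p_gross * like_gross
    return num / (num + (1.0 - p_gross) * like_good)

# Example: a surface-pressure report (hPa) checked against a forecast background.
p = prob_gross_error(y_obs=1012.0, x_bg=1003.0,
                     sigma_o=1.0, sigma_b=1.5,
                     p_gross=0.01, gross_width=100.0)
reject = p > 0.5  # illustrative rejection threshold
print(f"P(gross error) = {p:.3f}, reject = {reject}")

With these illustrative numbers a 9 hPa departure from the background gives P(gross error) of roughly 0.99, so the datum would be rejected, while a 1 hPa departure drives the probability well below the 1% prior.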


Citations
Book

Atmospheric Modeling, Data Assimilation and Predictability

TL;DR: A comprehensive text and reference work on numerical weather prediction, first published in 2002, covers not only methods for numerical modeling, but also the important related areas of data assimilation and predictability.
Journal ArticleDOI

Atmospheric Modeling, Data Assimilation, and Predictability

Christopher K. Wikle
01 Nov 2005
TL;DR: This is an outstanding monograph on current research on skew-elliptical models and their generalizations; it does an excellent job of presenting the depth of methodological research as well as the breadth of application regimes.
Journal ArticleDOI

Quality Control and Flux Sampling Problems for Tower and Aircraft Data

TL;DR: A series of automated tests is developed for tower and aircraft time series to identify instrumentation problems, flux sampling problems, and physically plausible but unusual situations, and serves as a safety net for quality-controlling data.
Book ChapterDOI

Data assimilation in meteorology and oceanography

TL;DR: This article reviews current operational practice and the most advanced data-assimilation techniques for meteorological and oceanographic data.
Journal ArticleDOI

The potential of the ensemble Kalman filter for NWP—a comparison with 4D‐Var

TL;DR: The EnKF is attractive when building a new medium-range ensemble numerical weather prediction system; however, it is less suitable for NWP systems with uncertainty across a wide range of scales and may not use high-resolution satellite data as effectively as 4D-Var.
References
Journal ArticleDOI

Analysis methods for numerical weather prediction

TL;DR: Methods discussed include variational techniques, smoothing splines, Kriging, optimal interpolation, successive corrections, constrained initialization, the Kalman-Bucy filter, and adjoint model data assimilation, which are all shown to relate to the idealized analysis, and hence to each other.
Journal ArticleDOI

A Global Three-Dimensional Multivariate Statistical Interpolation Scheme

TL;DR: This article describes a three-dimensional statistical interpolation method, multivariate in geopotential height, thickness and wind, which has been implemented in the ECMWF operational global data-assimilation scheme and used for routine forecasting and for producing FGGE level III-b analyses.
Journal ArticleDOI

The statistical structure of short-range forecast errors as determined from radiosonde data. Part I: The wind field

TL;DR: In this paper, the authors analyzed the statistical structure of the short-range wind forecasts used in the global data assimilation system at ECMWF, by verifying the forecasts against radiosonde data over North America.
Journal ArticleDOI

The response of numerical weather prediction systems to FGGE level IIb data. Part I: Analyses

TL;DR: In this article, an intercomparison of analyses of the main FGGE level IIb dataset with three advanced analysis systems is presented, and the authors discuss objective evaluations of analysis quality, such as fit to observations, statistics of analysis differences, and mean fields.
Journal ArticleDOI

On approximations to geopotential and wind-field correlation structures

TL;DR: In this article, a correlation structure model is presented which, being rigorously derived from simple stochastic assumptions for geopotential anomalies, automatically satisfies the essential properties of correlation representations as they are used in multivariate optimal interpolation and diagnostic studies.