
Showing papers by "Eugenia Kalnay published in 2008"


Journal ArticleDOI
01 Jan 2008-Tellus A
TL;DR: The accuracy and computational efficiency of a parallel computer implementation of the Local Ensemble Transform Kalman Filter (LETKF) data assimilation scheme on the model component of the 2004 version of the Global Forecast System (GFS) of the National Centers for Environmental Prediction (NCEP) are investigated.
Abstract: The accuracy and computational efficiency of a parallel computer implementation of the Local Ensemble Transform Kalman Filter (LETKF) data assimilation scheme on the model component of the 2004 version of the Global Forecast System (GFS) of the National Centers for Environmental Prediction (NCEP) are investigated. Numerical experiments are carried out at model resolution T62L28. All atmospheric observations that were operationally assimilated by NCEP in 2004, except for satellite radiances, are assimilated with the LETKF. The accuracy of the LETKF analyses is evaluated by comparing it to that of the Spectral Statistical Interpolation (SSI), which was the operational global data assimilation scheme of NCEP in 2004. For the selected set of observations, the LETKF analyses are more accurate than the SSI analyses in the Southern Hemisphere extratropics and are comparably accurate in the Northern Hemisphere extratropics and in the Tropics. The computational wall-clock times achieved on a Beowulf cluster of 3.6 GHz Xeon processors make our implementation of the LETKF on the NCEP GFS a widely applicable analysis-forecast system, especially for research purposes. For instance, the generation of four daily analyses at the resolution of the NCEP-NCAR reanalysis (T62L28) for a full season (90 d), using 40 processors, takes less than 4 d of wall-clock time.
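
As a pointer to how the LETKF analysis step itself works, the sketch below implements the ensemble-space update of Hunt et al. (2007) for a single local region in Python. It is a minimal illustration under stated assumptions (diagonal observation-error covariance, a multiplicative inflation factor rho, no localization loop), not the parallel NCEP GFS implementation evaluated in the paper.

import numpy as np

def letkf_analysis(Xb, Yb, yo, obs_err_var, rho=1.1):
    """Minimal LETKF update for one local region (after Hunt et al. 2007).

    Xb : (n, K) background ensemble of model states
    Yb : (p, K) background ensemble mapped to observation space, H(x_k)
    yo : (p,)   observations assigned to this region
    obs_err_var : (p,) observation error variances (R assumed diagonal)
    rho : multiplicative covariance inflation factor (assumed value)
    """
    n, K = Xb.shape
    xb_mean = Xb.mean(axis=1)
    Xb_pert = Xb - xb_mean[:, None]              # background perturbations
    yb_mean = Yb.mean(axis=1)
    Yb_pert = Yb - yb_mean[:, None]

    C = Yb_pert.T @ np.diag(1.0 / obs_err_var)   # shape (K, p)
    Pa_tilde = np.linalg.inv((K - 1) * np.eye(K) / rho + C @ Yb_pert)

    # Mean-update weights and symmetric square-root perturbation weights
    w_mean = Pa_tilde @ C @ (yo - yb_mean)
    evals, evecs = np.linalg.eigh((K - 1) * Pa_tilde)
    W = evecs @ np.diag(np.sqrt(np.maximum(evals, 0.0))) @ evecs.T

    # Analysis ensemble: mean plus perturbations recombined with the weights
    return xb_mean[:, None] + Xb_pert @ (W + w_mean[:, None])

In the full system this update is repeated independently for each grid point's local region, which is what makes the scheme naturally parallel.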

202 citations


Journal ArticleDOI
TL;DR: In this paper, an ensemble sensitivity method is proposed to calculate observation impacts without the need for an adjoint model, which is not always available for numerical weather prediction models. The formulation is tested on the Lorenz 40-variable model, and the results show that the observation impact estimated from the ensemble sensitivity method is similar to that from the adjoint method.
Abstract: We propose an ensemble sensitivity method to calculate observation impacts similar to Langland and Baker (2004) but without the need for an adjoint model, which is not always available for numerical weather prediction models. The formulation is tested on the Lorenz 40-variable model, and the results show that the observation impact estimated from the ensemble sensitivity method is similar to that from the adjoint method. Like the adjoint method, the ensemble sensitivity method is able to detect observations that have large random errors or biases. This sensitivity could be routinely calculated in an ensemble Kalman filter, thus providing a powerful tool to monitor the quality of observations and give quantitative estimations of observation impact on the forecasts. Copyright © 2008 Royal Meteorological Society
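
The Lorenz 40-variable model used as the testbed above is the standard Lorenz (1996) system. The following Python sketch is a minimal fourth-order Runge-Kutta integrator for it, included only to make the experimental setting concrete; the forcing F = 8 and the step size are conventional choices assumed here, not taken from the paper.

import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt=0.05, forcing=8.0):
    """One RK4 step; dt = 0.05 corresponds to roughly 6 h in Lorenz's scaling."""
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

# Spin up a 40-variable "truth" trajectory for observation-impact experiments
x = 8.0 + 0.01 * np.random.randn(40)
for _ in range(1000):
    x = rk4_step(x)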

96 citations


Journal ArticleDOI
TL;DR: In this paper, the authors applied the observation minus reanalysis (OMR) method to surface stations in Argentina for the period 1961-2000 and found that over most of Argentina there has been net cooling, not warming (about −0.04°C/decade).
Abstract: The “observation minus reanalysis” (OMR) method has been used to estimate the impact of changes in land use (including urbanization and agricultural practices such as irrigation) by computing the difference between the trends of the surface observations (which reflect all the sources of climate forcing, including surface effects) and the NCEP/NCAR reanalysis (which only contains the forcings influencing the assimilated atmospheric trends). In this paper we apply the OMR method to surface stations in Argentina for the period 1961–2000. In contrast to most other land areas, over most of Argentina there has been net cooling, not warming (about −0.04°C/decade). Observations also show a very strong decrease in the diurnal temperature range north of 40°S. This is associated with an observed strong reduction in the maximum temperature (−0.12°C/decade) together with a weak warming trend in the minimum temperature (0.05°C/decade). The OMR trends show a warming contribution to the mean temperature (+0.07°C/decade) and a decrease in diurnal temperature range (−0.08°C/decade), especially strong in the areas where the observed precipitation has increased the most and where, as a consequence, there has been an exponential increase of soy production in the last decade. The increase in precipitation is apparently associated with an increase in the moisture transport from the Amazon to northern Argentina by the low-level jet.
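
To make the OMR calculation concrete, the sketch below estimates decadal trends of observed and reanalysis monthly temperature anomalies by least squares and differences them. The arrays are hypothetical placeholders, not the Argentine station data analyzed in the paper.

import numpy as np

def decadal_trend(anomalies, time_in_years):
    """Least-squares linear trend of a monthly anomaly series, in °C per decade."""
    slope_per_year = np.polyfit(time_in_years, anomalies, 1)[0]
    return 10.0 * slope_per_year

# Hypothetical monthly series for one station, 1961-2000 (480 months)
time_in_years = 1961.0 + np.arange(480) / 12.0
obs_anom = np.random.randn(480)    # placeholder for the station observations
rean_anom = np.random.randn(480)   # placeholder for the co-located NCEP/NCAR reanalysis

omr_trend = decadal_trend(obs_anom, time_in_years) - decadal_trend(rean_anom, time_in_years)
# A positive OMR trend indicates surface (e.g. land use) warming not captured by the reanalysis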

81 citations


Journal ArticleDOI
TL;DR: In this article, the impact of different surface vegetations on long-term surface temperature change is estimated by subtracting reanalysis trends in monthly surface temperature anomalies from observation trends over the last four decades.
Abstract: The impact of different surface vegetation types on long-term surface temperature change is estimated by subtracting reanalysis trends in monthly surface temperature anomalies from observation trends over the last four decades. This is done using two reanalyses, namely, the 40-yr ECMWF Re-Analysis (ERA-40) and NCEP–NCAR I (NNR), and two observation datasets, namely, the Climatic Research Unit (CRU) and the Global Historical Climate Network (GHCN). The basis of the observation minus reanalysis (OMR) approach is that the NNR reanalysis surface fields, and to a lesser extent the ERA-40, are insensitive to surface processes associated with different vegetation types and their changes, because the NNR does not use surface observations over land, whereas ERA-40 uses surface temperature observations only indirectly, in order to initialize soil temperature and moisture. As a result, the OMR trends can provide an estimate of surface effects on the observed temperature trends that are missing in the reanalyses. The OMR trends obtained from observation minus NNR show a strong and coherent sensitivity to the surface vegetation independently estimated from the normalized difference vegetation index (NDVI). The correlation between the OMR trend and the NDVI indicates that the OMR trend decreases with surface vegetation, with a correlation of −0.5, indicating that there is a stronger surface response to global warming in arid regions, whereas the OMR response is reduced in highly vegetated areas. The OMR trend averaged over the desert areas (0 < NDVI < 0.1) shows a much larger increase of temperature (0.4°C decade⁻¹) than over tropical forest areas (NDVI > 0.4), where the OMR trend is nearly zero. Areas of intermediate vegetation (0.1 < NDVI < 0.4), which are mostly found over midlatitudes, reveal moderate OMR trends (approximately 0.1°–0.3°C decade⁻¹). The OMR trends are also very sensitive to the seasonal vegetation change. While the OMR trends have little seasonal dependence over deserts and tropical forests, whose vegetation state remains rather constant throughout the year, the OMR trends over the midlatitudes, in particular Europe and North America, exhibit strong seasonal variation in response to the NDVI fluctuations associated with deciduous vegetation. The OMR trend rises to approximately 0.2°–0.3°C decade⁻¹ in winter and early spring when the vegetation cover is low, and is only 0.1°C decade⁻¹ in summer and early autumn with high vegetation. However, the Asian inlands (Russia, northern China with Tibet, and Mongolia) do not show this strong OMR variation despite their midlatitude location, because of the relatively permanent aridity of these regions.
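
A minimal sketch of the vegetation stratification described above: given co-located OMR trends and NDVI values (hypothetical arrays here, not the CRU/GHCN minus NNR fields), it averages the OMR trend within the NDVI classes quoted in the abstract and computes their overall correlation.

import numpy as np

# Hypothetical gridded fields flattened to 1-D
omr_trend = 0.2 * np.random.randn(10000)   # °C per decade (placeholder values)
ndvi = 0.8 * np.random.rand(10000)

ndvi_classes = {"desert (0-0.1)": (0.0, 0.1),
                "intermediate (0.1-0.4)": (0.1, 0.4),
                "forest (>0.4)": (0.4, 1.0)}

for name, (lo, hi) in ndvi_classes.items():
    in_class = (ndvi >= lo) & (ndvi < hi)
    print(name, "mean OMR trend:", round(omr_trend[in_class].mean(), 3))

# Pearson correlation between the OMR trend and NDVI over all grid points
print("correlation:", round(np.corrcoef(omr_trend, ndvi)[0, 1], 2))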

54 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used the singular value decomposition (SVD) to generate a basis of model errors and states and showed that the SVD method explains a significant component of the effect that the model's unresolved state has on the resolved state.
Abstract: The purpose of the present study is to use a new method of empirical model error correction, developed by Danforth et al. in 2007, based on estimating the systematic component of the nonperiodic errors linearly dependent on the anomalous state. The method uses singular value decomposition (SVD) to generate a basis of model errors and states. It requires only a time series of errors to estimate covariances and uses negligible additional computation during a forecast integration. As a result, it should be suitable for operational use at a relatively small computational expense. The method is tested with the Lorenz ’96 coupled system as the truth and an uncoupled version of the same system as a model. The authors demonstrate that the SVD method explains a significant component of the effect that the model’s unresolved state has on the resolved state and shows that the results are better than those obtained with Leith’s empirical correction operator. The improvement is attributed to the fact that the...
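
The state-dependent part of the correction can be pictured as a reduced-rank regression of archived forecast-error anomalies on the corresponding state anomalies. The sketch below builds such an operator with a truncated SVD; it is a simplified stand-in for the Danforth et al. (2007) formulation, with synthetic training data and an arbitrarily chosen rank.

import numpy as np

def error_correction_operator(state_anoms, error_anoms, rank=10):
    """Reduced-rank operator L such that predicted_error ≈ L @ state_anomaly.

    state_anoms : (n, T) state anomalies (columns are training times)
    error_anoms : (n, T) corresponding forecast-error anomalies
    rank        : number of SVD modes retained (assumed, tunable)
    """
    # Truncated SVD of the state anomalies provides a low-dimensional basis
    U, s, Vt = np.linalg.svd(state_anoms, full_matrices=False)
    U_r, s_r, Vt_r = U[:, :rank], s[:rank], Vt[:rank]
    # Leith-type regression restricted to the leading modes: L = E X^+ (truncated pseudoinverse)
    return error_anoms @ Vt_r.T @ np.diag(1.0 / s_r) @ U_r.T

# Hypothetical training archive: 40-variable state, 500 stored error/state pairs
x_anoms = np.random.randn(40, 500)
e_anoms = 0.1 * np.random.randn(40, 500)
L = error_correction_operator(x_anoms, e_anoms, rank=10)
state_dependent_correction = L @ x_anoms[:, 0]   # correction to apply during a forecast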

34 citations


Journal ArticleDOI
01 Jan 2008-Tellus A
TL;DR: In this article, an ensemble Kalman filter (EnKF) is used to assimilate data into a non-linear chaotic model coupling two kinds of variables: large-amplitude, slow, large-scale variables at eight equally spaced locations around a circle, and small-amplitude, fast, short-scale variables at 256 locations.
Abstract: An ensemble Kalman filter (EnKF) is used to assimilate data into a non-linear chaotic model coupling two kinds of variables. The first kind of variable is large in amplitude, slow, and large in scale, distributed at eight equally spaced locations around a circle. The second kind is small in amplitude, fast, and short in scale, distributed at 256 equally spaced locations. Synthetic observations are obtained from the model, and the observational error is proportional to the respective amplitudes of the variables. The performance of the EnKF is affected by differences in the spatial correlation scales of the variables being assimilated. The method allows the simultaneous assimilation of all the variables. The ensemble filter also allows assimilating only the large-scale variables, letting the small-scale variables evolve freely. Assimilation of the large-scale variables together with only a few small-scale variables, however, significantly degrades the filter. These results are explained by the spurious correlations that arise from the sampled ensemble covariances. An alternative approach is to combine two different initialization techniques for the slow and fast variables. Here, the fast variables are initialized by restraining the evolution of the ensemble members with a Newtonian relaxation toward the observed fast variables. Then, the usual ensemble analysis is used to assimilate the large-scale observations.
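
The Newtonian relaxation used to initialize the fast variables amounts to adding a restoring term toward the observed values during the ensemble integration. The sketch below shows that term for a generic fast-variable tendency; the tendency function, time step, and relaxation time scale tau are placeholders, not the coupled model used in the paper.

import numpy as np

def nudged_step(x_fast, obs_fast, fast_tendency, dt=0.001, tau=0.05):
    """Advance the fast variables one step while relaxing them toward observations.

    x_fast        : (m,) fast-variable state of one ensemble member
    obs_fast      : (m,) observed fast variables valid at this time
    fast_tendency : callable giving dx/dt of the fast variables (placeholder here)
    tau           : relaxation time scale (assumed); smaller tau means stronger nudging
    """
    relaxation = (obs_fast - x_fast) / tau
    return x_fast + dt * (fast_tendency(x_fast) + relaxation)

# Toy usage: a damped-oscillation tendency stands in for the real fast dynamics
fast_tendency = lambda x: -10.0 * x
x_fast = np.random.randn(256)
obs_fast = np.zeros(256)
for _ in range(100):
    x_fast = nudged_step(x_fast, obs_fast, fast_tendency)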

27 citations


Journal ArticleDOI
TL;DR: In this paper, a breeding method has been implemented in the NASA Global Modeling and Assimilation Office coupled general circulation model (CGCM) in its operational configuration in which ocean data assimilation is used to initialize the coupled forecasts.
Abstract: The breeding method has been implemented in the NASA Global Modeling and Assimilation Office coupled general circulation model (CGCM) in its operational configuration in which ocean data assimilation is used to initialize the coupled forecasts. Bred vectors (BVs), designed to capture the dominant growing errors in the atmosphere–ocean coupled system, are applied as initial ensemble perturbations. The potential improvement for ensemble prediction is investigated by comparing BVs with the oceanic growing errors, estimated by the one-month forecast error from the nonperturbed forecast. Results show that one-month forecast errors and BVs from the NASA CGCM share very similar features: BVs are clearly related to forecast errors in both SST and equatorial subsurface temperature—in particular, when the BV growth rate is large. Both the forecast errors and the BVs in the subsurface are dominated by large-scale structures near the thermocline. Results suggest that the forecast errors are dominated by dyna...
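
The breeding cycle itself is simple to state: run a control and a perturbed forecast side by side and periodically rescale their difference back to the initial perturbation size, keeping the rescaled difference (the bred vector) as the next perturbation. The sketch below does this with the Lorenz-96 model standing in for the coupled GCM; the rescaling interval and perturbation amplitude are assumptions for illustration only.

import numpy as np

def l96_step(x, dt=0.05, forcing=8.0):
    """One RK4 step of the Lorenz-96 model (a toy stand-in for the coupled GCM)."""
    f = lambda v: (np.roll(v, -1) - np.roll(v, 2)) * np.roll(v, 1) - v + forcing
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

def breeding_cycle(x_ctrl, bv_size=0.1, rescale_steps=4, n_cycles=50):
    """Return the bred vector and its per-step growth rate for each rescaling cycle."""
    bv = np.random.randn(x_ctrl.size)
    bv *= bv_size / np.linalg.norm(bv)               # initial perturbation of fixed size
    growth_rates = []
    for _ in range(n_cycles):
        x_pert = x_ctrl + bv
        for _ in range(rescale_steps):               # integrate control and perturbed runs
            x_ctrl = l96_step(x_ctrl)
            x_pert = l96_step(x_pert)
        diff = x_pert - x_ctrl
        growth_rates.append(np.log(np.linalg.norm(diff) / bv_size) / rescale_steps)
        bv = diff * (bv_size / np.linalg.norm(diff))  # rescale back to the original size
    return bv, growth_rates

x0 = 8.0 + 0.01 * np.random.randn(40)
for _ in range(500):                                  # move the control state onto the attractor
    x0 = l96_step(x0)
bred_vector, rates = breeding_cycle(x0)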

26 citations


Journal ArticleDOI
TL;DR: In this article, the authors compare two methods of correcting the bias of a GCM; namely statistical correction performed a posteriori (offline) as a function of forecast length, and correction done within the model integration (online).
Abstract: The purpose of this study is to compare two methods of correcting the bias of a GCM: namely, statistical correction performed a posteriori (offline) as a function of forecast length, and correction done within the model integration (online). The model errors of a low resolution GCM are estimated by the 6-hour forecast residual averaged over several years and used to correct the model. Both the offline and online corrections substantially reduce the model bias when applied to independent data. Their performance in correcting the model error is comparable at all lead times, but for lead times longer than 1 day the online corrected forecasts have smaller RMS forecast errors and larger anomaly correlations than offline corrected forecasts. These results indicate that the online correction reduces not only the growth of the bias but also the nonlinear growth of non-constant (state-dependent and random) forecast errors during the model integration.
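
The difference between the two corrections can be made concrete with a toy forecast model: offline, the time-mean 6-hour residual is scaled by lead time and subtracted from the finished forecast; online, a per-step fraction of it is subtracted inside the integration. The toy dynamics, time step, and bias estimate below are hypothetical placeholders, not the low-resolution GCM of the study.

import numpy as np

bias_6h = np.array([0.3, -0.1, 0.2])   # hypothetical mean 6-hour forecast residual
dt_hours = 1.0                          # toy model time step (assumed)

def toy_model_step(x):
    """Placeholder dynamics standing in for the low-resolution GCM."""
    return 0.99 * x + 0.05 * np.sin(x)

def forecast_offline(x0, lead_hours):
    """Run the uncorrected model, then subtract the estimated bias a posteriori."""
    x = x0.copy()
    for _ in range(int(lead_hours / dt_hours)):
        x = toy_model_step(x)
    return x - bias_6h * (lead_hours / 6.0)   # offline correction scaled by lead time

def forecast_online(x0, lead_hours):
    """Subtract a per-step fraction of the estimated bias inside the integration."""
    x = x0.copy()
    for _ in range(int(lead_hours / dt_hours)):
        x = toy_model_step(x) - bias_6h * (dt_hours / 6.0)
    return x

x0 = np.array([1.0, 0.5, -0.2])
print(forecast_offline(x0, 24.0))
print(forecast_online(x0, 24.0))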

18 citations


Journal ArticleDOI
TL;DR: In this paper, the performance of the Local Ensemble Transform Kalman Filter (LETKF) is compared with that of the Physical-Space Statistical Analysis System (PSAS) under a perfect model scenario, using simulated wind and geopotential height observations and the finite volume Global Circulation Model with 72 grid points zonally, 46 grid points meridionally and 55 vertical levels.
Abstract: This paper compares the performance of the Local Ensemble Transform Kalman Filter (LETKF) with the Physical-Space Statistical Analysis System (PSAS) under a perfect model scenario. PSAS is a 3D-Var assimilation system used operationally in the Goddard Earth Observing System Data Assimilation System (GEOS-4 DAS). The comparison is carried out using simulated winds and geopotential height observations and the finite volume Global Circulation Model with 72 grid points zonally, 46 grid points meridionally and 55 vertical levels. With forty ensemble members, the LETKF obtains analyses and forecasts with significantly lower RMS errors than those from PSAS, especially over the Southern Hemisphere and oceans. This observed advantage of the LETKF over PSAS is due to the ability of the 40-member ensemble LETKF to capture flow-dependent errors and thus create a good estimate of the evolving background uncertainty. An initial decrease of the forecast errors in the Northern Hemisphere observed in the PSAS but not in the LETKF suggests that the LETKF analysis is more balanced.

17 citations


01 Jan 2008
TL;DR: In this paper, the authors used the Observation Minus Reanalysis (OMR) surface temperature trends method suggested by Kalnay and Cai (Nature, 2003) to provide an estimate of the impact of surface effects on regional warming (or cooling).
Abstract: We use the Observation Minus Reanalysis (OMR) surface temperature trends method suggested by Kalnay and Cai (Nature, 2003) to provide an estimate of the impact of surface effects on regional warming (or cooling). It takes advantage of the insensitivity of the NCEP-NCAR Reanalysis (NNR) to land surface type, and eliminates the natural variability due to changes in circulation (since they are also included in the reanalysis), thus separating surface effects from greenhouse warming. Kalnay et al. (JGR, 2006) showed that over the US the OMR average is small, but it has different regional signs, in good agreement with the regions of “urban heating and cooling” obtained by Hansen et al. (JGR, 2001) when using nightlights to discriminate between urban and rural regions (Fig. 1). Kalnay et al. (2006) also showed that the results obtained with OMR were not qualitatively affected by the NCDC corrections of non-climatic effects (e.g., change of observation time and station location), which increase the overall warming trend in the US (compare Fig. 2c with Fig. 2d, obtained using the USHCN observations corrected for non-climatic effects).

12 citations


Journal ArticleDOI
TL;DR: In this paper, the authors explore the possibility of seasonal-interannual prediction of terrestrial ecosystems and the global carbon cycle using a 25-year hindcast experiment, using a prototype forecasting system in which the dynamic vegetation and terrestrial carbon cycle model VEGAS was forced with 15-member ensemble climate predictions generated by the NOAA/NCEP coupled climate forecasting system (CFS), with lead times up to 9 months.
Abstract: Using a 25-year hindcast experiment, we explore the possibility of seasonal-interannual prediction of terrestrial ecosystems and the global carbon cycle. This has been achieved using a prototype forecasting system in which the dynamic vegetation and terrestrial carbon cycle model VEGAS was forced with 15-member ensemble climate predictions generated by the NOAA/NCEP coupled climate forecasting system (CFS) for the period 1981–2005, with lead times up to 9 months. The results show that the predictability is dominated by the ENSO signal with its major influence on the tropical and subtropical regions, including South America, Indonesia, southern Africa, eastern Australia, the western United States, and central Asia. There is also important non-ENSO related predictability such as that associated with midlatitude drought. Comparison of the dynamical prediction results with benchmark statistical prediction methods such as anomaly persistence and damping shows that the dynamical method performs significantly better. The hindcasted ecosystem variables and carbon flux show significantly slower decrease in skill at longer lead time compared to the climate forcing variables, partly because of the memories in land and vegetation processes that filter out the higher-frequency noise and sustain the signal.
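
The statistical benchmarks mentioned above can be written in a few lines: persistence carries the current anomaly forward unchanged, while damped persistence scales it by the lag autocorrelation of the series. The sketch below uses a synthetic anomaly record as a placeholder; it is not the VEGAS or CFS output used in the hindcasts.

import numpy as np

def lag_autocorrelation(series, lag):
    """Sample autocorrelation of an anomaly series at the given lag (in time steps)."""
    return np.corrcoef(series[:-lag], series[lag:])[0, 1]

def persistence_forecast(current_anomaly):
    return current_anomaly                       # anomaly assumed to persist unchanged

def damped_persistence_forecast(current_anomaly, lead, series):
    return lag_autocorrelation(series, lead) * current_anomaly

# Synthetic monthly anomaly history standing in for an observed carbon-flux record
history = np.random.randn(300)
for lead_months in (1, 3, 6, 9):
    print(lead_months, persistence_forecast(history[-1]),
          damped_persistence_forecast(history[-1], lead_months, history))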

Journal ArticleDOI
TL;DR: In this article, a high resolution regional model was used to monitor and predict several days in advance the atmospheric transport of smoke in the La Plata River delta, to the northwest of Buenos Aires.
Abstract: The smoke that affected the city of Buenos Aires and its suburbs (approximate population, 13M) in mid-April 2008 was an extreme event without historical precedent. The episode resulted in an increase of health problems among the population (respiratory problems, eye irritation) and, due to poor visibility, led to hazardous driving conditions and accidents that forced the intermittent closure of major highways. The origin of the smoke was traced to pasture burning in the La Plata River delta, to the northwest of Buenos Aires. Unfortunately, the increased shifting of livestock to the La Plata River delta may result in more common smoke episodes due to associated biomass burning practices. We clarify the mechanisms that resulted in this extreme episode, including the contribution of the La Plata River local circulations to the intensity of the event. We further show its high predictability using high resolution regional model simulations and forecasts. Our results suggest that a high resolution regional model could be used to monitor and predict several days in advance the atmospheric transport of smoke. These results could have policy implications, as preventive measures on biomass burning could be put into effect when smoke from the fires is predicted to affect a densely populated area.

Posted Content
TL;DR: In this article, the authors proposed a scheme to improve the performance of the ensemble-based Kalman filter during the initial spin-up period by applying the no-cost ensemble Kalman smoother, which allows the model solutions for the ensemble to be running in place with the true dynamics, provided by a few observations.
Abstract: A scheme is proposed to improve the performance of ensemble-based Kalman filters during the initial spin-up period. By applying the no-cost ensemble Kalman smoother, this scheme allows the model solutions for the ensemble to be "running in place" with the true dynamics, provided by a few observations. The scheme is investigated with the Local Ensemble Transform Kalman Filter (LETKF) implemented in a quasi-geostrophic model, whose original framework requires a very long spin-up time when initialized from a cold start. Results show that it is possible to spin up the LETKF and achieve fast convergence to the optimal level of error. The extra computation is required only during the initial spin-up, since the scheme reverts to the original LETKF once "running in place" has been achieved.
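
A highly simplified sketch of the "running in place" idea is given below, using a generic ensemble transform Kalman filter in ensemble space and a toy model. In each spin-up cycle the same observations are used several times: the ensemble is forecast to the observation time, analysis weights are computed, and the no-cost smoother applies those same weights to the ensemble at the earlier time before re-forecasting. Localization, inflation, and the convergence test of the actual scheme are omitted, and the toy dynamics, names, and fixed iteration count are assumptions for illustration.

import numpy as np

def etkf_weights(Yb, yo, obs_err_var):
    """Ensemble-space mean weights and transform matrix of a basic ETKF update."""
    K = Yb.shape[1]
    Yp = Yb - Yb.mean(axis=1, keepdims=True)
    C = Yp.T @ np.diag(1.0 / obs_err_var)
    Pa = np.linalg.inv((K - 1) * np.eye(K) + C @ Yp)
    w_mean = Pa @ C @ (yo - Yb.mean(axis=1))
    evals, evecs = np.linalg.eigh((K - 1) * Pa)
    W = evecs @ np.diag(np.sqrt(np.maximum(evals, 0.0))) @ evecs.T
    return w_mean, W

def apply_weights(X, w_mean, W):
    """Recombine an ensemble (at any time) with the given analysis weights."""
    Xp = X - X.mean(axis=1, keepdims=True)
    return X.mean(axis=1, keepdims=True) + Xp @ (W + w_mean[:, None])

def model_step(X):
    """Placeholder forecast model applied to each ensemble member (column)."""
    return X + 0.05 * np.sin(X)

def running_in_place_cycle(X_prev, yo, obs_err_var, H, n_iter=3):
    """One spin-up cycle: iterate no-cost smoother, re-forecast, and re-analysis
    with the same observations; the fixed n_iter stands in for a convergence test."""
    for _ in range(n_iter):
        Xb = model_step(X_prev)                    # forecast to the observation time
        w_mean, W = etkf_weights(H @ Xb, yo, obs_err_var)
        Xa = apply_weights(Xb, w_mean, W)          # usual filter analysis
        X_prev = apply_weights(X_prev, w_mean, W)  # no-cost smoother at the earlier time
    return Xa

# Toy usage: 3-variable state, 5 members, all variables observed
X_prev = np.random.randn(3, 5)
Xa = running_in_place_cycle(X_prev, np.zeros(3), 0.1 * np.ones(3), np.eye(3))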

Proceedings ArticleDOI
05 Jun 2008
TL;DR: The application of the lagged average forecast method to an operational model requires the resolution of how to obtain a homogeneous sample large enough to calculate stable statistics, and it is suggested that this problem may be solved by carefully modeling the required statistics in terms of a small set of parameters.
Abstract: We have previously described the lagged average forecast (LAF) method as an alternative to the Monte Carlo forecast (MCF) method. The LAF differs from the MCF in the definition of the ensemble of initial states which are used to generate the ensemble of forecasts. The LAF initial states are the current analysis and the forecasts made from previous analyses verifying at the current time. Thus the LAF ensemble is composed of forecasts which are made by a regular operational system of numerical weather prediction, and the LAF method is therefore operationally attractive. The application of our previous ideas and results to an operational model requires the resolution of what might be called the degree of freedom problem, i.e., how to obtain a homogeneous sample large enough to calculate stable statistics. We suggest that this problem may be solved by carefully modeling the required statistics in terms of a small set of parameters and then estimating only these few parameters from the data. We also note that ther...
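
A minimal sketch of how a LAF ensemble of initial states would be assembled from an operational archive is shown below: the lag-zero member is the current analysis, and the remaining members are earlier forecasts verifying at the current time. The archive dictionary, the 6-hourly cycling, and the chosen lags are hypothetical placeholders.

import numpy as np

def build_laf_ensemble(archive, current_time, lags_hours=(0, 6, 12, 18, 24)):
    """Collect the current analysis and earlier forecasts verifying at current_time.

    archive : dict mapping (initial_time_hours, lead_hours) -> state array
              (a hypothetical layout for an operational forecast archive)
    """
    members = []
    for lag in lags_hours:
        init_time = current_time - lag            # lag 0 is the current analysis itself
        members.append(archive[(init_time, lag)])
    return np.stack(members)                      # (n_members, state_dim) initial states

# Toy archive: 6-hourly cycles, each storing states at the needed lead times
state_dim, current_time = 4, 48
archive = {(t0, lead): np.random.randn(state_dim)
           for t0 in range(0, current_time + 1, 6) for lead in (0, 6, 12, 18, 24)}
laf_initial_states = build_laf_ensemble(archive, current_time)
# Each member would then be integrated forward to produce the LAF forecast ensemble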

01 Jan 2008
TL;DR: The Multi-Meteorology Air Quality (MMAQ) Ensemble Project applies the ensemble forecasting techniques that have so successfully improved numerical weather prediction [Kalnay, 2003] to regional air quality forecasting.
Abstract: The Multi-Meteorology Air Quality (MMAQ) Ensemble Project applies the ensemble forecasting techniques that have so successfully improved numerical weather prediction [Kalnay, 2003] to regional air quality forecasting. The ensemble focuses on meteorological uncertainty, the largest source of error in air quality modeling [McKeen et al., 2005]. The models chosen for this project are WRF-ARW (Weather Research and Forecasting model, Advanced Research WRF), SMOKE (Sparse Matrix Operator Kernel Emission system), and CMAQ (Community Multiscale Air Quality model). The project uses varied-physics mesoscale meteorology ensembles to create emissions ensembles. Both sets of ensembles are used to initialize and generate air quality ensembles. The ozone forecasts for each ensemble and the ensemble average will then be evaluated against observations. The research questions that will be addressed are (1) How sensitive is the emissions model to meteorology physics options? (2) How sensitive is the air quality model to meteorology physics options? (3) Is the ensemble spread sufficient to represent forecast uncertainty? This paper will examine the preliminary results of MMAQ. It focuses on the spread generated by the WRF varied-physics ensemble in meteorology variables essential to air quality modeling.