
Showing papers in "Monthly Weather Review in 2008"


Journal ArticleDOI
TL;DR: In this article, a new bulk microphysical parameterization (BMP) was developed for use with the Weather Research and Forecasting (WRF) Model or other mesoscale models.
Abstract: A new bulk microphysical parameterization (BMP) has been developed for use with the Weather Research and Forecasting (WRF) Model or other mesoscale models. As compared with earlier single-moment BMPs, the new scheme incorporates a large number of improvements to both physical processes and computer coding, and it employs many techniques found in far more sophisticated spectral/bin schemes using lookup tables. Unlike any other BMP, the assumed snow size distribution depends on both ice water content and temperature and is represented as a sum of exponential and gamma distributions. Furthermore, snow assumes a nonspherical shape with a bulk density that varies inversely with diameter as found in observations and in contrast to nearly all other BMPs that assume spherical snow with constant density. The new scheme’s snow category was readily modified to match previous research in sensitivity experiments designed to test the sphericity and distribution shape characteristics. From analysis of four idea...

2,206 citations
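
A toy sketch of the snow category described above may help: the size distribution is the sum of an exponential and a gamma distribution, and a mass–diameter relation m(D) ∝ D² makes the bulk density vary inversely with diameter. The parameter values below are placeholders (in the scheme itself the distribution parameters are diagnosed from ice water content and temperature via lookup tables), and the mass coefficient is an assumption.

```python
import numpy as np

def snow_size_distribution(D, n0_exp, lam_exp, n0_gam, mu, lam_gam):
    """Number concentration N(D): sum of an exponential and a gamma
    distribution (parameter values here are illustrative placeholders)."""
    return n0_exp * np.exp(-lam_exp * D) + n0_gam * D**mu * np.exp(-lam_gam * D)

def snow_mass(D, a=0.069, b=2.0):
    """Nonspherical snow with m(D) = a*D**b and b = 2, so the implied bulk
    density m / ((pi/6) D^3) is inversely proportional to diameter."""
    return a * D**b

D = np.linspace(50e-6, 10e-3, 200)                        # diameters (m)
N = snow_size_distribution(D, 1e7, 1e3, 1e11, 0.6, 2e3)   # placeholder parameters
rho_bulk = snow_mass(D) / (np.pi / 6.0 * D**3)            # decreases as 1/D
```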


Journal ArticleDOI
TL;DR: The Simple Ocean Data Assimilation (SODA) reanalysis of ocean climate variability is described in this article, where a model forecast produced by an ocean general circulation model with an average resolution of 0.25° × 0.4° and 40 vertical levels is continuously corrected by contemporaneous observations with corrections estimated every 10 days.
Abstract: This paper describes the Simple Ocean Data Assimilation (SODA) reanalysis of ocean climate variability. In the assimilation, a model forecast produced by an ocean general circulation model with an average resolution of 0.25° × 0.4° and 40 vertical levels is continuously corrected by contemporaneous observations with corrections estimated every 10 days. The basic reanalysis, SODA 1.4.2, spans the 44-yr period from 1958 to 2001, which complements the span of the 40-yr European Centre for Medium-Range Weather Forecasts (ECMWF) atmospheric reanalysis (ERA-40). The observation set for this experiment includes the historical archive of hydrographic profiles supplemented by ship intake measurements, moored hydrographic observations, and remotely sensed SST. A parallel run, SODA 1.4.0, is forced with identical surface boundary conditions, but without data assimilation. The new reanalysis represents a significant improvement over a previously published version of the SODA algorithm. In particular, eddy kinetic energy and sea level variability are much larger than in previous versions and are more similar to estimates from independent observations. One issue addressed in this paper is the relative importance of the model forecast versus the observations for the analysis. The results show that at near-annual frequencies the forecast model has a strong influence, whereas at decadal frequencies the observations become increasingly dominant in the analysis. As a consequence, interannual variability in SODA 1.4.2 closely resembles interannual variability in SODA 1.4.0. However, decadal anomalies of the 0–700-m heat content from SODA 1.4.2 more closely resemble heat content anomalies based on observations.

1,614 citations


Journal ArticleDOI
Nigel Roberts, Humphrey Lean
TL;DR: In this article, the authors examine whether improved model resolution alone is able to produce more skillful precipitation forecasts on useful scales and how the skill varies with spatial scale; a verification method is described in which skill is determined from a comparison of rainfall forecasts with radar using fractional coverage over different sized areas.
Abstract: The development of NWP models with grid spacing down to ∼1 km should produce more realistic forecasts of convective storms. However, greater realism does not necessarily mean more accurate precipitation forecasts. The rapid growth of errors on small scales in conjunction with preexisting errors on larger scales may limit the usefulness of such models. The purpose of this paper is to examine whether improved model resolution alone is able to produce more skillful precipitation forecasts on useful scales, and how the skill varies with spatial scale. A verification method will be described in which skill is determined from a comparison of rainfall forecasts with radar using fractional coverage over different sized areas. The Met Office Unified Model was run with grid spacings of 12, 4, and 1 km for 10 days in which convection occurred during the summers of 2003 and 2004. All forecasts were run from 12-km initial states for a clean comparison. The results show that the 1-km model was the most skillfu...

925 citations
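
The fractional-coverage verification described above (now widely known as the fractions skill score) can be sketched as follows. This is a minimal illustration rather than the Met Office implementation; the neighbourhood edge handling via uniform_filter is a simplifying assumption.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fractions_skill_score(forecast, observed, threshold, n):
    """Fractions skill score over n x n neighbourhoods.

    forecast, observed : 2-D precipitation fields on the same grid
    threshold          : accumulation/rate threshold defining a rain event
    n                  : neighbourhood length in grid points
    """
    # fraction of rainy points within each neighbourhood
    f = uniform_filter((forecast >= threshold).astype(float), size=n, mode="constant")
    o = uniform_filter((observed >= threshold).astype(float), size=n, mode="constant")
    mse = np.mean((f - o) ** 2)
    mse_ref = np.mean(f ** 2) + np.mean(o ** 2)      # worst-case reference MSE
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan
```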


Journal ArticleDOI
TL;DR: In this article, evidence is provided that the ensemble size required for a successful particle filter scales exponentially with the problem size; simulations indicate that the required ensemble size grows exponentially with the state dimension, and asymptotic results follow the work of Bengtsson, Bickel, and collaborators.
Abstract: Particle filters are ensemble-based assimilation schemes that, unlike the ensemble Kalman filter, employ a fully nonlinear and non-Gaussian analysis step to compute the probability distribution function (pdf) of a system’s state conditioned on a set of observations. Evidence is provided that the ensemble size required for a successful particle filter scales exponentially with the problem size. For the simple example in which each component of the state vector is independent, Gaussian, and of unit variance and the observations are of each state component separately with independent, Gaussian errors, simulations indicate that the required ensemble size scales exponentially with the state dimension. In this example, the particle filter requires at least 10^11 members when applied to a 200-dimensional state. Asymptotic results, following the work of Bengtsson, Bickel, and collaborators, are provided for two cases: one in which each prior state component is independent and identically distributed, and ...

654 citations
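
A quick numerical illustration of the toy problem described above (independent, unit-variance Gaussian state components, each observed separately with Gaussian error): for a fixed ensemble size, the largest posterior weight approaches one as the state dimension grows, which is the degeneracy behind the exponential ensemble-size requirement quoted above. A rough sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def max_posterior_weight(n_particles, n_dims, obs_err=1.0):
    """Largest normalised particle weight for the toy problem above:
    prior N(0, I), each component observed with independent Gaussian error."""
    particles = rng.standard_normal((n_particles, n_dims))   # prior draws
    obs = rng.standard_normal(n_dims)                        # stand-in observations
    log_w = -0.5 * np.sum((particles - obs) ** 2, axis=1) / obs_err**2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return w.max()

for d in (1, 10, 50, 100):
    print(d, max_posterior_weight(1000, d))   # the largest weight approaches 1
```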


Journal ArticleDOI
TL;DR: Real-time forecasts of five landfalling Atlantic hurricanes during 2005, made with the Advanced Research WRF (ARW) Model at grid spacings of 12 and 4 km, revealed performance generally competitive with, and occasionally superior to, other operational forecasts for storm position and intensity; augmentations designed to address recurring errors form what is termed the Advanced Hurricane WRF (AHW) model.
Abstract: Real-time forecasts of five landfalling Atlantic hurricanes during 2005 using the Advanced Research Weather Research and Forecasting (WRF) (ARW) Model at grid spacings of 12 and 4 km revealed performance generally competitive with, and occasionally superior to, other operational forecasts for storm position and intensity. Recurring errors include 1) excessive intensification prior to landfall, 2) insufficient momentum exchange with the surface, and 3) inability to capture rapid intensification when observed. To address these errors several augmentations of the basic community model have been designed and tested as part of what is termed the Advanced Hurricane WRF (AHW) model. Based on sensitivity simulations of Katrina, the inner-core structure, particularly the size of the eye, was found to be sensitive to model resolution and surface momentum exchange. The forecast of rapid intensification and the structure of convective bands in Katrina were not significantly improved until the grid spacing ap...

465 citations


Journal ArticleDOI
TL;DR: A basin-to-channel-scale implementation of the Advanced Circulation (ADCIRC) unstructured grid hydrodynamic model has been developed that accurately simulates hurricane storm surge, tides, and river flow in this complex region as mentioned in this paper.
Abstract: Southern Louisiana is characterized by low-lying topography and an extensive network of sounds, bays, marshes, lakes, rivers, and inlets that permit widespread inundation during hurricanes. A basin- to channel-scale implementation of the Advanced Circulation (ADCIRC) unstructured grid hydrodynamic model has been developed that accurately simulates hurricane storm surge, tides, and river flow in this complex region. This is accomplished by defining a domain and computational resolution appropriate for the relevant processes, specifying realistic boundary conditions, and implementing accurate, robust, and highly parallel unstructured grid numerical algorithms. The model domain incorporates the western North Atlantic, the Gulf of Mexico, and the Caribbean Sea so that interactions between basins and the shelf are explicitly modeled and the boundary condition specification of tidal and hurricane processes can be readily defined at the deep water open boundary. The unstructured grid enables highly refi...

445 citations


Journal ArticleDOI
TL;DR: In this article, real-data experiments with an ensemble data assimilation system using the NCEP Global Forecast System model were performed and compared with the NCEP Global Data Assimilation System (GDAS).
Abstract: Real-data experiments with an ensemble data assimilation system using the NCEP Global Forecast System model were performed and compared with the NCEP Global Data Assimilation System (GDAS). All observations in the operational data stream were assimilated for the period 1 January–10 February 2004, except satellite radiances. Because of computational resource limitations, the comparison was done at lower resolution (triangular truncation at wavenumber 62 with 28 levels) than the GDAS real-time NCEP operational runs (triangular truncation at wavenumber 254 with 64 levels). The ensemble data assimilation system outperformed the reduced-resolution version of the NCEP three-dimensional variational data assimilation system (3DVAR), with the biggest improvement in data-sparse regions. Ensemble data assimilation analyses yielded a 24-h improvement in forecast skill in the Southern Hemisphere extratropics relative to the NCEP 3DVAR system (the 48-h forecast from the ensemble data assimilation system was as accurate as the 24-h forecast from the 3DVAR system). Improvements in the data-rich Northern Hemisphere, while still statistically significant, were more modest. It remains to be seen whether the improvements seen in the Southern Hemisphere will be retained when satellite radiances are assimilated. Three different parameterizations of background errors unaccounted for in the data assimilation system (including model error) were tested. Adding scaled random differences between adjacent 6-hourly analyses from the NCEP–NCAR reanalysis to each ensemble member (additive inflation) performed slightly better than the other two methods (multiplicative inflation and relaxation-to-prior).

426 citations
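
The best-performing "additive inflation" option described above can be sketched as follows. The 0.3 scaling factor, the random sampling of adjacent analysis pairs, and the re-centering of the perturbations are illustrative assumptions rather than the operational settings.

```python
import numpy as np

def additive_inflation(ensemble, analysis_archive, scale=0.3, rng=None):
    """Add scaled, centred differences between randomly chosen adjacent
    6-hourly archived analyses to each ensemble member.

    ensemble         : (n_members, n_state) array of analysis members
    analysis_archive : (n_times, n_state) array of archived analyses
    """
    rng = rng or np.random.default_rng()
    n_members = ensemble.shape[0]
    idx = rng.integers(0, analysis_archive.shape[0] - 1, size=n_members)
    diffs = analysis_archive[idx + 1] - analysis_archive[idx]
    diffs -= diffs.mean(axis=0)          # centre so the ensemble mean is unchanged
    return ensemble + scale * diffs
```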


Journal ArticleDOI
TL;DR: A systematic investigation of the properties of high-resolution versions of the Met Office Unified Model for short-range forecasting of convective rainfall events shows that the 4- and 1-km-gridlength models often give more realistic-looking precipitation fields because convection is represented explicitly rather than parameterized.
Abstract: With many operational centers moving toward order 1-km-gridlength models for routine weather forecasting, this paper presents a systematic investigation of the properties of high-resolution versions of the Met Office Unified Model for short-range forecasting of convective rainfall events. The authors describe a suite of configurations of the Met Office Unified Model running with grid lengths of 12, 4, and 1 km and analyze results from these models for a number of convective cases from the summers of 2003, 2004, and 2005. The analysis includes subjective evaluation of the rainfall fields and comparisons of rainfall amounts, initiation, cell statistics, and a scale-selective verification technique. It is shown that the 4- and 1-km-gridlength models often give more realistic-looking precipitation fields because convection is represented explicitly rather than parameterized. However, the 4-km model representation suffers from large convective cells and delayed initiation because the grid length is too long to correctly reproduce the convection explicitly. These problems are not as evident in the 1-km model, although it does suffer from too numerous small cells in some situations. Both the 4- and 1-km models suffer from poor representation at the start of the forecast in the period when the high-resolution detail is spinning up from the lower-resolution (12 km) starting data used. A scale-selective precipitation verification technique implies that for later times in the forecasts (after the spinup period) the 1-km model performs better than the 12- and 4-km models for lower rainfall thresholds. For higher thresholds the 4-km model scores almost as well as the 1-km model, and both do better than the 12-km model.

374 citations


Journal ArticleDOI
TL;DR: In this article, an object-based quality measure, referred to as SAL, is introduced for the verification of quantitative precipitation forecasts (QPF), which considers aspects of the structure (S), amplitude (A), and location (L) of the precipitation field in a prespecified domain.
Abstract: A novel object-based quality measure, which contains three distinct components that consider aspects of the structure (S), amplitude (A), and location (L) of the precipitation field in a prespecified domain (e.g., a river catchment) is introduced for the verification of quantitative precipitation forecasts (QPF). This quality measure is referred to as SAL. The amplitude component A measures the relative deviation of the domain-averaged QPF from observations. Positive values of A indicate an overestimation of total precipitation; negative values indicate an underestimation. For the components S and L, coherent precipitation objects are separately identified in the forecast and observations; however, no matching is performed of the objects in the two datasets. The location component L combines information about the displacement of the predicted (compared to the observed) precipitation field’s center of mass and about the error in the weighted-average distance of the precipitation objects from the total field’s center of mass. The structure component S is constructed in such a way that positive values occur if precipitation objects are too large and/or too flat, and negative values if the objects are too small and/or too peaked. Perfect QPFs are characterized by zero values for all components of SAL. Examples with both synthetic precipitation fields and real data are shown to illustrate the concept and characteristics of SAL. SAL is applied to 4 yr of daily accumulated QPFs from a global and finer-scale regional model for a German river catchment, and the SAL diagram is introduced as a compact means of visualizing the results. SAL reveals meaningful information about the systematic differences in the performance of the two models. While the median of the S component is close to zero for the regional model, it is strongly positive for the coarser-scale global model. Consideration is given to the strengths and limitations of the novel quality measure and to possible future applications, in particular, for the verification of QPFs from convection-resolving weather prediction models on short time scales.

316 citations
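
As an illustration, the amplitude component A has a closed form and is easy to compute; the S and L components require identifying coherent precipitation objects (e.g., by thresholding and connected-component labelling) and are not reproduced here. A minimal sketch:

```python
import numpy as np

def sal_amplitude(forecast, observed):
    """Amplitude component A of SAL: relative deviation of the domain-averaged
    forecast precipitation from the observed value. A = 0 is perfect, positive
    values indicate overestimation, negative underestimation, and A is bounded
    in [-2, 2]."""
    f_mean = np.mean(forecast)
    o_mean = np.mean(observed)
    return (f_mean - o_mean) / (0.5 * (f_mean + o_mean))
```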


Journal ArticleDOI
TL;DR: Based on 13 years of satellite altimetry data, in situ and climatological upper-ocean thermal structure data, best-track typhoon data of the U.S. Joint Typhoon Warning Center, together with an ocean mixed layer model, 30 western North Pacific category 5 typhoons that occurred during the typhoon season from 1993 to 2005 are systematically examined in this paper.
Abstract: Category 5 cyclones are the most intense and devastating cyclones on earth. With increasing observations of category 5 cyclones, such as Hurricane Katrina (2005), Rita (2005), Mitch (1998), and Supertyphoon Maemi (2003) found to intensify on warm ocean features (i.e., regions of positive sea surface height anomalies detected by satellite altimeters), there is great interest in investigating the role ocean features play in the intensification of category 5 cyclones. Based on 13 yr of satellite altimetry data, in situ and climatological upper-ocean thermal structure data, best-track typhoon data of the U.S. Joint Typhoon Warning Center, together with an ocean mixed layer model, 30 western North Pacific category 5 typhoons that occurred during the typhoon season from 1993 to 2005 are systematically examined in this study. Two different types of situations are found. The first type is the situation found in the western North Pacific south eddy zone (SEZ; 21°–26°N, 127°–170°E) and the Kuroshio (21°–30...

306 citations


Journal ArticleDOI
TL;DR: The 2005 Atlantic hurricane season was the most active of record, with 28 storms (27 tropical storms and one subtropical storm), 15 of which became hurricanes; five hurricanes and two tropical storms made landfall in the United States, including four major hurricanes.
Abstract: The 2005 Atlantic hurricane season was the most active of record. Twenty-eight storms occurred, including 27 tropical storms and one subtropical storm. Fifteen of the storms became hurricanes, and seven of these became major hurricanes. Additionally, there were two tropical depressions and one subtropical depression. Numerous records for single-season activity were set, including most storms, most hurricanes, and highest accumulated cyclone energy index. Five hurricanes and two tropical storms made landfall in the United States, including four major hurricanes. Eight other cyclones made landfall elsewhere in the basin, and five systems that did not make landfall nonetheless impacted land areas. The 2005 storms directly caused nearly 1700 deaths. This includes approximately 1500 in the United States from Hurricane Katrina—the deadliest U.S. hurricane since 1928. The storms also caused well over $100 billion in damages in the United States alone, making 2005 the costliest hurricane season of record.

Journal ArticleDOI
TL;DR: In this article, a new radiation package, McRad, has become operational with cycle 32R2 of the Integrated Forecasting System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF), which includes an improved description of the land surface albedo from MODIS observations, the Monte Carlo independent column approximation treatment of the radiative transfer in clouds, and the Rapid Radiative Transfer Model shortwave scheme.
Abstract: A new radiation package, “McRad,” has become operational with cycle 32R2 of the Integrated Forecasting System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF). McRad includes an improved description of the land surface albedo from Moderate Resolution Imaging Spectroradiometer (MODIS) observations, the Monte Carlo independent column approximation treatment of the radiative transfer in clouds, and the Rapid Radiative Transfer Model shortwave scheme. The impact of McRad on year-long simulations at TL159L91 and higher-resolution 10-day forecasts is then documented. McRad is shown to benefit the representation of most parameters over both shorter and longer time scales, relative to the previous operational version of the radiative transfer schemes. At all resolutions, McRad improves the representation of the cloud–radiation interactions, particularly in the tropical regions, with improved temperature and wind objective scores through a reduction of some systematic errors in the ...

Journal ArticleDOI
TL;DR: A hybrid ensemble transform Kalman filter–three-dimensional variational data assimilation (ETKF–3DVAR) system for the Weather Research and Forecasting (WRF) Model is introduced, based on the existing WRF 3DVar.
Abstract: A hybrid ensemble transform Kalman filter–three-dimensional variational data assimilation (ETKF–3DVAR) system for the Weather Research and Forecasting (WRF) Model is introduced. The system is based on the existing WRF 3DVAR. Unlike WRF 3DVAR, which utilizes a simple, static covariance model to estimate the forecast-error statistics, the hybrid system combines ensemble covariances with the static covariances to estimate the complex, flow-dependent forecast-error statistics. Ensemble covariances are incorporated by using the extended control variable method during the variational minimization. The ensemble perturbations are maintained by the computationally efficient ETKF. As an initial attempt to test and understand the newly developed system, both an observing system simulation experiment under the perfect model assumption (Part I) and the real observation experiment (Part II) were conducted. In these pilot studies, the WRF was run over the North America domain at a coarse grid spacing (200 km) t...
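
Conceptually, the hybrid scheme described above weights a static covariance against the ensemble sample covariance; in the actual ETKF–3DVAR system this enters the variational minimization through the extended control variable (with covariance localization) rather than by forming matrices explicitly. A schematic sketch of the implied covariance, under those simplifications:

```python
import numpy as np

def hybrid_background_covariance(B_static, ensemble, beta_static=0.5, beta_ens=0.5):
    """Weighted combination of a static covariance and the ensemble sample
    covariance (explicit-matrix view of the hybrid; localization omitted).

    B_static : (n, n) static background-error covariance
    ensemble : (n_members, n) array of ensemble states
    """
    perts = ensemble - ensemble.mean(axis=0)
    P_ens = perts.T @ perts / (ensemble.shape[0] - 1)
    return beta_static * B_static + beta_ens * P_ens
```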

Journal ArticleDOI
TL;DR: In this article, the calibration of probabilistic forecasts of 12-hourly precipitation amounts using a large ensemble reforecast dataset from the European Centre for Medium-Range Weather Forecasts (ECMWF) and the Global Forecast System (GFS) was discussed.
Abstract: As a companion to Part I, which discussed the calibration of probabilistic 2-m temperature forecasts using large training datasets, Part II discusses the calibration of probabilistic forecasts of 12-hourly precipitation amounts. Again, large ensemble reforecast datasets from the European Centre for Medium-Range Weather Forecasts (ECMWF) and the Global Forecast System (GFS) were used for testing and calibration. North American Regional Reanalysis (NARR) 12-hourly precipitation analysis data were used for verification and training. Logistic regression was used to perform the calibration, with power-transformed ensemble means and spreads as predictors. Forecasts were produced and validated for every NARR grid point in the conterminous United States (CONUS). Training sample sizes were increased by including data from 10 nearby grid points with similar analyzed climatologies. “Raw” probabilistic forecasts from each system were considered, in which probabilities were set according to ensemble relative ...
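
A minimal sketch of the calibration step described above, assuming scikit-learn is available: logistic regression with power-transformed ensemble mean and spread as predictors of the probability of exceeding a precipitation threshold. The 0.25 exponent and the two-predictor setup are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def calibrate_precip_probability(ens_mean, ens_spread, event, power=0.25):
    """Fit P(precip >= threshold) with power-transformed ensemble statistics.

    ens_mean, ens_spread : (n_cases,) training predictors
    event                : (n_cases,) 0/1 indicator of observed exceedance
    """
    X = np.column_stack([ens_mean**power, ens_spread**power])
    return LogisticRegression().fit(X, event)

# usage: model = calibrate_precip_probability(m, s, y)
#        probs = model.predict_proba(np.column_stack([m**0.25, s**0.25]))[:, 1]
```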

Journal ArticleDOI
TL;DR: In this paper, an ensemble-based four-dimensional variational (En4DVAR) algorithm is presented that uses a flow-dependent background error covariance matrix constructed from ensemble forecasts and performs 4DVAR optimization to produce a balanced analysis.
Abstract: Applying a flow-dependent background error covariance (𝗕 matrix) in variational data assimilation has been a topic of interest among researchers in recent years. In this paper, an ensemble-based four-dimensional variational (En4DVAR) algorithm, designed by the authors, is presented that uses a flow-dependent background error covariance matrix constructed by ensemble forecasts and performs 4DVAR optimization to produce a balanced analysis. A great advantage of this En4DVAR design over standard 4DVAR methods is that the tangent linear and adjoint models can be avoided in its formulation and implementation. In addition, it can be easily incorporated into variational data assimilation systems that are already in use at operational centers and among the research community. A one-dimensional shallow water model was used for preliminary tests of the En4DVAR scheme. Compared with standard 4DVAR, the En4DVAR converges well and can produce results that are as good as those with 4DVAR but with far less comp...
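
The key idea, that the 4DVAR increment can be sought in the space spanned by ensemble perturbation trajectories so that no tangent-linear or adjoint model is needed, can be sketched as below. This is a schematic (localization, preconditioning, and the exact cost-function scaling used by the authors are omitted); the cost function is quadratic in the weights, so the minimization reduces to a linear solve.

```python
import numpy as np

def en4dvar_weights(pert_obs_traj, innovations, obs_err_var):
    """Solve for the ensemble-space weights w minimising
    J(w) = 0.5*w'w + 0.5*sum_k (H_k X'_k w - d_k)' R^-1 (H_k X'_k w - d_k);
    the analysis increment is then X'_0 @ w (perturbations at analysis time).

    pert_obs_traj : list over obs times of (n_obs_k, n_members) arrays,
                    ensemble perturbations already mapped to observation space
    innovations   : list over obs times of (n_obs_k,) innovation vectors d_k
    obs_err_var   : scalar observation-error variance (diagonal R assumed)
    """
    n_members = pert_obs_traj[0].shape[1]
    A = np.eye(n_members)                # Hessian: background + observation terms
    b = np.zeros(n_members)
    for HXk, dk in zip(pert_obs_traj, innovations):
        A += HXk.T @ HXk / obs_err_var
        b += HXk.T @ dk / obs_err_var
    return np.linalg.solve(A, b)
```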

Journal ArticleDOI
TL;DR: In this article, an implicit Rayleigh damping term is applied only to the vertical velocity, as a final adjustment at the end of each small (acoustic) time step.
Abstract: Although the use of a damping layer near the top of a computational model domain has proven effective in absorbing upward-propagating gravity-wave energy in idealized simulations, this technique has been less successful in real atmospheric applications. Here, a new technique is proposed for nonhydrostatic model equations that are solved using split-explicit time-integration techniques. In this method, an implicit Rayleigh damping term is applied only to the vertical velocity, as a final adjustment at the end of each small (acoustic) time step. The adjustment is equivalent to including an implicit Rayleigh damping term in the vertical momentum equation together with an implicit vertical diffusion of w, and could be applied in this manner in other time-integration schemes. This implicit damping for the vertical velocity is unconditionally stable and remains effective even for hydrostatic gravity waves. The good absorption characteristics of this layer across a wide range of horizontal scales are confirmed through analysis of the linear wave equation and numerical mountain-wave simulations, and through simulations of an idealized squall line and of mountain waves over the Colorado Rocky Mountains.
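
A minimal sketch of the adjustment described above: at the end of each acoustic step, the vertical velocity in the absorbing layer is divided by (1 + Δτ·τ(z)), which is the implicit form of a Rayleigh damping term. The sin² coefficient profile and the tau_max value below are illustrative assumptions.

```python
import numpy as np

def damp_vertical_velocity(w, z, dt_acoustic, z_damp_bottom, z_top, tau_max=0.2):
    """Implicit Rayleigh damping of w at the end of an acoustic step:
    w <- w / (1 + dt * tau(z)), with tau increasing toward the model top."""
    tau = np.zeros_like(z, dtype=float)
    layer = z >= z_damp_bottom
    tau[layer] = tau_max * np.sin(
        0.5 * np.pi * (z[layer] - z_damp_bottom) / (z_top - z_damp_bottom)
    ) ** 2
    return w / (1.0 + dt_acoustic * tau)
```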

Journal ArticleDOI
TL;DR: In this article, the authors used the satellite-based Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) mission to retrieve tropospheric profiles of temperature and moisture over the data-sparse eastern Pacific Ocean.
Abstract: This study uses the new satellite-based Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) mission to retrieve tropospheric profiles of temperature and moisture over the data-sparse eastern Pacific Ocean. The COSMIC retrievals, which employ a global positioning system radio occultation technique combined with “first-guess” information from numerical weather prediction model analyses, are evaluated through the diagnosis of an intense atmospheric river (AR; i.e., a narrow plume of strong water vapor flux) that devastated the Pacific Northwest with flooding rains in early November 2006. A detailed analysis of this AR is presented first using conventional datasets and highlights the fact that ARs are critical contributors to West Coast extreme precipitation and flooding events. Then, the COSMIC evaluation is provided. Offshore composite COSMIC soundings north of, within, and south of this AR exhibited vertical structures that are meteorologically consistent with satellit...

Journal ArticleDOI
TL;DR: A polar-optimized version of the fifth-generation Pennsylvania State University–National Center for Atmospheric Research Mesoscale Model (MM5) was developed to fill climate and synoptic needs of the polar science community and to achieve an improved regional performance, as discussed by the authors.
Abstract: A polar-optimized version of the fifth-generation Pennsylvania State University–National Center for Atmospheric Research Mesoscale Model (MM5) was developed to fill climate and synoptic needs of the polar science community and to achieve an improved regional performance. To continue the goal of enhanced polar mesoscale modeling, polar optimization should now be applied toward the state-of-the-art Weather Research and Forecasting (WRF) Model. Evaluations and optimizations are especially needed for the boundary layer parameterization, cloud physics, snow surface physics, and sea ice treatment. Testing and development work for Polar WRF begins with simulations for ice sheet surface conditions using a Greenland-area domain with 24-km resolution. The winter month December 2002 and the summer month June 2001 are simulated with WRF, version 2.1.1, in a series of 48-h integrations initialized daily at 0000 UTC. The results motivated several improvements to Polar WRF, especially to the Noah land surface model (LSM) and the snowpack treatment. Different physics packages for WRF are evaluated with December 2002 simulations that show variable forecast skill when verified with the automatic weather station observations. The WRF simulation with the combination of the modified Noah LSM, the Mellor–Yamada–Janjić boundary layer parameterization, and the WRF single-moment microphysics produced results that reach or exceed the success standards of a Polar MM5 simulation for December 2002. For summer simulations of June 2001, WRF simulates an improved surface energy balance, and shows forecast skill nearly equal to that of Polar MM5.

Journal ArticleDOI
TL;DR: In this paper, the hybrid ensemble transform Kalman filter (ETKF)-3DVAR system was further tested with real observations, as a follow-up for the observation system simulation experiment (OSSE) conducted in Part I. A domain encompassing North America was considered.
Abstract: The hybrid ensemble transform Kalman filter–three-dimensional variational data assimilation (ETKF–3DVAR) system developed for the Weather Research and Forecasting (WRF) Model was further tested with real observations, as a follow-up for the observation system simulation experiment (OSSE) conducted in Part I. A domain encompassing North America was considered. Because of limited computational resources and the large number of experiments conducted, the forecasts and analyses employed relatively coarse grid spacing (200 km) to emphasize synoptic scales. As a first effort to explore the new system with real observations, relatively sparse observation datasets consisting of radiosonde wind and temperature during 4 weeks of January 2003 were assimilated. The 12-h forecasts produced by the hybrid analysis produced less root-mean-square error than the 3DVAR. The hybrid improved the forecast more in the western part of the domain than the eastern part. It also produced larger improvements in the upper tr...

Journal ArticleDOI
TL;DR: In this article, the applicability of three different cyclone detection and tracking schemes to reanalysis datasets is investigated; comparison of ERA-40 with the NCEP–NCAR reanalysis suggests that cyclone intensity is a more robust measure of variability than the number of cyclones.
Abstract: The applicability of three different cyclone detection and tracking schemes is investigated with reanalysis datasets. First, cyclone climatologies and cyclone characteristics of the 40-yr ECMWF Re-Analysis (ERA-40) are compared with the NCEP–NCAR dataset using one method. ERA-40 shows systematically more cyclones, and therefore a higher cyclone center density, than the NCEP–NCAR reanalysis dataset. Geostrophically adjusted geopotential height gradients around cyclone centers, a measure of cyclone intensity, are enhanced in ERA-40 compared with the NCEP–NCAR reanalysis dataset. The variability of the number of cyclones per season is significantly correlated between the two reanalysis datasets, but time series of the extreme cyclone intensity exhibit a higher correlation. This suggests that the cyclone intensity is a more robust measure of variability than the number of cyclones. Second, three cyclone detection and tracking schemes are compared, based on the ERA-40 dataset. In general the schemes s...

Journal ArticleDOI
TL;DR: In this article, an ensemble Kalman filter (EnKF) was used to assimilate real-data observations for a warm-season mesoscale convective vortex (MCV) event on 10-12 June 2003.
Abstract: The feasibility of using an ensemble Kalman filter (EnKF) for mesoscale and regional-scale data assimilation has been demonstrated in the authors’ recent studies via observing system simulation experiments (OSSEs) both under a perfect-model assumption and in the presence of significant model error. The current study extends the EnKF to assimilate real-data observations for a warm-season mesoscale convective vortex (MCV) event on 10–12 June 2003. Direct comparison between the EnKF and a three-dimensional variational data assimilation (3DVAR) system, both implemented in the Weather Research and Forecasting model (WRF), is carried out. It is found that the EnKF consistently performs better than the 3DVAR method by assimilating either individual or multiple data sources (i.e., sounding, surface, and wind profiler) for this MCV event. Background error covariance plays an important role in the performance of both the EnKF and the 3DVAR system. Proper covariance inflation and the use of different combin...

Journal ArticleDOI
TL;DR: In this paper, the effect of observations on a forecast metric is quantified by changes in the metric mean and variance; for a single observation, expressions for these changes involve a product of scalar quantities that can be rapidly evaluated for large numbers of observations.
Abstract: The sensitivity of forecasts to observations is evaluated using an ensemble approach with data drawn from a pseudo-operational ensemble Kalman filter. For Gaussian statistics and a forecast metric defined as a scalar function of the forecast variables, the effect of observations on the forecast metric is quantified by changes in the metric mean and variance. For a single observation, expressions for these changes involve a product of scalar quantities, which can be rapidly evaluated for large numbers of observations. This technique is applied to determining climatological forecast sensitivity and predicting the impact of observations on sea level pressure and precipitation forecast metrics. The climatological 24-h forecast sensitivity of the average pressure over western Washington State shows a region of maximum sensitivity to the west of the region, which tilts gently westward with height. The accuracy of ensemble sensitivity predictions is tested by withholding a single buoy pressure observation from this region and comparing this perturbed forecast with the control case where the buoy is assimilated. For 30 cases, there is excellent agreement between these forecast differences and the ensemble predictions, as measured by the forecast metric. This agreement decreases for increasing numbers of observations. Nevertheless, by using statistical confidence tests to address sampling error, the impact of thousands of observations on forecast-metric variance is shown to be well estimated by a subset of the O(100) most significant observations.
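
For a single observation, the predicted changes in the forecast-metric mean and variance reduce to products of scalar ensemble statistics, which is what makes the technique cheap for large numbers of observations. A sketch, assuming a scalar metric and a single scalar observation with error variance R:

```python
import numpy as np

def single_obs_impact(J_ens, hx_ens, y_obs, obs_err_var):
    """Predicted change in the mean and variance of a scalar forecast metric J
    from assimilating one observation, using ensemble statistics only.

    J_ens  : (n_members,) forecast-metric values
    hx_ens : (n_members,) model-simulated observation values
    """
    cov_J_hx = np.cov(J_ens, hx_ens)[0, 1]
    var_hx = np.var(hx_ens, ddof=1)
    denom = var_hx + obs_err_var
    d_mean = cov_J_hx / denom * (y_obs - hx_ens.mean())   # shift of the metric mean
    d_var = -(cov_J_hx ** 2) / denom                      # reduction of metric variance
    return d_mean, d_var
```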

Journal ArticleDOI
TL;DR: In this article, a new technique for relating central pressure and maximum winds in tropical cyclones is presented, together with a method of objectively determining a derivative of the Holland b parameter, bs, which relates directly to surface winds and varies with the pressure drop into the cyclone center, intensification rate, latitude, and translation speed.
Abstract: A new technique for relating central pressure and maximum winds in tropical cyclones is presented, together with a method of objectively determining a derivative of the Holland b parameter, bs, which relates directly to surface winds and varies with the pressure drop into the cyclone center, intensification rate, latitude, and translation speed. By allowing this bs parameter to vary, a realistic scatter in maximum winds for a given central pressure is obtained. This provides an improvement over traditional approaches that provide a unique wind for each central pressure. It is further recommended that application of the Dvorak satellite-interpretation technique be changed to enable a direct derivation of central pressure. The pressure–wind model derived here can then provide the maximum wind estimates. The recent North Atlantic data archive is shown to be largely derived from the use of the Dvorak technique, even when hurricane reconnaissance data are available and Dvorak overestimates maximum winds in this region for the more intense hurricanes. Application to the full North Atlantic hurricane archive confirms the findings by Landsea (1993) of a substantial overestimation of maximum winds between 1950 and 1980; the Landsea corrections do not completely remove this bias.
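
The underlying pressure–wind form can be sketched as follows; the regression that makes bs vary with pressure drop, intensification rate, latitude, and translation speed is not reproduced here, so bs is passed in directly and the surface air density is an assumed constant.

```python
import numpy as np

def max_wind_from_pressure(delta_p_hpa, bs, rho_air=1.15):
    """Surface maximum wind (m/s) from the central pressure deficit using the
    Holland-type relation v_max = sqrt(bs * dp / (rho * e)); bs is the surface
    b parameter, which in the scheme above is derived objectively."""
    dp = delta_p_hpa * 100.0               # hPa -> Pa
    return np.sqrt(bs * dp / (rho_air * np.e))

# e.g. a 60-hPa pressure deficit with bs = 1.3 gives roughly 50 m/s
print(max_wind_from_pressure(60.0, 1.3))
```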

Journal ArticleDOI
TL;DR: In this article, the authors consider the importance of using mean-preserving solutions for the ensemble transform matrix (ETM) in data assimilation systems and introduce a new variant of the ensemble square root filter (ESRF), referred to as the ESRF with mean-preserving random rotations.
Abstract: This paper considers implications of different forms of the ensemble transformation in the ensemble square root filters (ESRFs) for the performance of ESRF-based data assimilation systems. It highlights the importance of using mean-preserving solutions for the ensemble transform matrix (ETM). The paper shows that an arbitrary mean-preserving ETM can be represented as a product of the symmetric solution and an orthonormal mean-preserving matrix. The paper also introduces a new flavor of ESRF, referred to as ESRF with mean-preserving random rotations. To investigate the performance of different solutions for the ETM in ESRFs, experiments with two small models are conducted. In these experiments, the performances of two mean-preserving solutions, two non-mean-preserving solutions, and a traditional ensemble Kalman filter with perturbed observations are compared. The experiments show a significantly better performance of the mean-preserving solutions for the ETM in ESRFs compared to non-mean-preservi...
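
A sketch of the construction implied above: an orthonormal ensemble transform matrix that keeps the vector of ones as an eigenvector (so zero-mean perturbations remain zero-mean), built from a random rotation acting only in the subspace orthogonal to that vector.

```python
import numpy as np

def mean_preserving_random_rotation(m, rng=None):
    """Random orthonormal ensemble transform matrix T with T @ 1 = 1, so that
    right-multiplying a zero-mean perturbation matrix by T preserves the mean."""
    rng = rng or np.random.default_rng()
    ones = np.ones((m, 1)) / np.sqrt(m)
    # orthonormal basis of the subspace orthogonal to the ones vector
    Q = np.linalg.qr(np.eye(m) - ones @ ones.T)[0][:, : m - 1]
    # random rotation acting only inside that subspace
    rot = np.linalg.qr(rng.standard_normal((m - 1, m - 1)))[0]
    return ones @ ones.T + Q @ rot @ Q.T

T = mean_preserving_random_rotation(10)
print(np.allclose(T @ np.ones(10), np.ones(10)))   # True: the mean is preserved
print(np.allclose(T.T @ T, np.eye(10)))            # True: orthonormal
```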

Journal ArticleDOI
TL;DR: The output of two global atmospheric models participating in the second phase of the Canadian Historical Forecasting Project (HFP2) is utilized to assess the forecast skill of the Madden-Julian oscillation (MJO) as discussed by the authors.
Abstract: The output of two global atmospheric models participating in the second phase of the Canadian Historical Forecasting Project (HFP2) is utilized to assess the forecast skill of the Madden–Julian oscillation (MJO). The two models are the third generation of the general circulation model (GCM3) of the Canadian Centre for Climate Modeling and Analysis (CCCma) and the Global Environmental Multiscale (GEM) model of Recherche en Prevision Numerique (RPN). Space–time spectral analysis of the daily precipitation in near-equilibrium integrations reveals that GEM has a better representation of the convectively coupled equatorial waves including the MJO, Kelvin, equatorial Rossby (ER), and mixed Rossby–gravity (MRG) waves. An objective of this study is to examine how the MJO forecast skill is influenced by the model’s ability in representing the convectively coupled equatorial waves. The observed MJO signal is measured by a bivariate index that is obtained by projecting the combined fields of the 15°S–15°N meridionally averaged precipitation rate and the zonal winds at 850 and 200 hPa onto the two leading empirical orthogonal function (EOF) structures as derived using the same meridionally averaged variables following a similar approach used recently by Wheeler and Hendon. The forecast MJO index, on the other hand, is calculated by projecting the forecast variables onto the same two EOFs. With the HFP2 hindcast output spanning 35 yr, for the first time the MJO forecast skill of dynamical models is assessed over such a long time period with a significant and robust result. The result shows that the GEM model produces a significantly better level of forecast skill for the MJO in the first 2 weeks. The difference is larger in Northern Hemisphere winter than in summer, when the correlation skill score drops below 0.50 at a lead time of 10 days for GEM whereas it is at 6 days for GCM3. At lead times longer than about 15 days, GCM3 performs slightly better. There are some features that are common for the two models. The forecast skill is better in winter than in summer. Forecasts initialized with a large amplitude for the MJO are found to be more skillful than those with a weak MJO signal in the initial conditions. The forecast skill is dependent on the phase of the MJO at the initial conditions. Forecasts initialized with an MJO that has an active convection in tropical Africa and the Indian Ocean sector have a better level of forecast skill than those initialized with a different phase of the MJO.
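
A minimal sketch of the index and skill measure described above: combined, meridionally averaged anomaly fields are projected onto the two leading EOFs to give a bivariate index, and forecast skill is the bivariate correlation between forecast and observed indices. Normalisation of each component by its observed standard deviation is omitted here.

```python
import numpy as np

def mjo_bivariate_index(anomaly_fields, eofs):
    """Project combined anomaly fields (precipitation, u850, u200, meridionally
    averaged) onto the two leading EOFs to obtain the bivariate MJO index.

    anomaly_fields : (n_times, n_points) combined field
    eofs           : (2, n_points) leading EOF patterns
    """
    return anomaly_fields @ eofs.T          # (n_times, 2)

def bivariate_correlation(fcst_idx, obs_idx):
    """Bivariate correlation skill between forecast and observed MJO indices."""
    num = np.sum(fcst_idx[:, 0] * obs_idx[:, 0] + fcst_idx[:, 1] * obs_idx[:, 1])
    den = np.sqrt(np.sum(fcst_idx**2)) * np.sqrt(np.sum(obs_idx**2))
    return num / den
```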

Journal ArticleDOI
TL;DR: In this article, the inner core structure, atmospheric boundary layer, sea surface temperature, and outflow layer of a superintense tropical cyclone using high-resolution in situ flight-level, NCAR GPS dropwindsonde, Doppler radar, and satellite measurements were analyzed.
Abstract: Unprecedented observations of Hurricane Isabel (2003) at category 5 intensity were collected from 12 to 14 September. This study presents a detailed analysis of the inner-core structure, atmospheric boundary layer, sea surface temperature, and outflow layer of a superintense tropical cyclone using high-resolution in situ flight-level, NCAR GPS dropwindsonde, Doppler radar, and satellite measurements. The analysis of the dropwindsonde and in situ data includes a comprehensive discussion of the uncertainties associated with this observational dataset and provides an estimate of the storm-relative axisymmetric inner-core structure using Barnes objective analysis. An assessment of gradient and thermal wind balance in the inner core is also presented. The axisymmetric data composites presented in this study suggest that Isabel built a reservoir of high moist entropy air by sea-to-air latent heat flux inside the low-level eye that was utilized as an additional energy source to nearly maintain its extreme intensity even after crossing the cool wake of Hurricane Fabian. It is argued here that the combined mean and asymmetric eddy flux of high moist entropy air from the low-level eye into the eyewall represents an additional power source or “turbo boost” to the hurricane heat engine. Recent estimates of the ratio of sea-to-air enthalpy and momentum exchange at high wind speeds are used to suggest that Isabel utilized this extra power to exceed the previously assumed intensity upper bound for the given environmental conditions on all three days. This discrepancy between a priori potential intensity theory and observations may be as high as 35 m s⁻¹ on 13 September.

Journal ArticleDOI
TL;DR: In this article, 2-m temperature forecasts are calibrated using ECMWF and GFS reforecast datasets as well as samples of the last 30 days of training data; nonhomogeneous Gaussian regression was used to calibrate forecasts at stations distributed across much of North America.
Abstract: Recently, the European Centre for Medium-Range Weather Forecasts (ECMWF) produced a reforecast dataset for a 2005 version of their ensemble forecast system. The dataset consisted of 15-member reforecasts conducted for the 20-yr period 1982–2001, with reforecasts computed once weekly from 1 September to 1 December. This dataset was less robust than the daily reforecast dataset produced for the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), but it utilized a much higher-resolution, more recent model. This manuscript considers the calibration of 2-m temperature forecasts using these reforecast datasets as well as samples of the last 30 days of training data. Nonhomogeneous Gaussian regression was used to calibrate forecasts at stations distributed across much of North America. Significant observations included the following: (i) although the “raw” GFS forecasts (probabilities estimated from ensemble relative frequency) were commonly unskillful as measured in conti...
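
Nonhomogeneous Gaussian regression can be sketched as below: the predictive distribution is Gaussian with mean a + b·(ensemble mean) and variance c + d·(ensemble variance), with the coefficients chosen to minimize the average CRPS over the training sample. The optimizer choice and starting values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def crps_gaussian(mu, sigma, y):
    """CRPS of a Gaussian forecast N(mu, sigma^2) for observation y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

def fit_ngr(ens_mean, ens_var, obs):
    """Fit (a, b, c, d) of the nonhomogeneous Gaussian regression by
    minimising the mean CRPS over the training data."""
    def cost(p):
        a, b, c, d = p
        mu = a + b * ens_mean
        sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))   # keep variance positive
        return np.mean(crps_gaussian(mu, sigma, obs))
    return minimize(cost, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead").x
```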

Journal ArticleDOI
TL;DR: In this article, a radar simulator for polarimetric radar variables, including reflectivities at horizontal and vertical polarizations, the differential reflectivity, and the specific differential phase, has been developed.
Abstract: A radar simulator for polarimetric radar variables, including reflectivities at horizontal and vertical polarizations, the differential reflectivity, and the specific differential phase, has been developed. This simulator serves as a test bed for developing and testing forward observation operators of polarimetric radar variables that are needed when directly assimilating these variables into storm-scale numerical weather prediction (NWP) models, using either variational or ensemble-based assimilation methods. The simulator takes as input the results of high-resolution NWP model simulations with ice microphysics and produces simulated polarimetric radar data that may also contain simulated errors. It is developed based on calculations of electromagnetic wave propagation and scattering at the S band of wavelength 10.7 cm in a hydrometeor-containing atmosphere. The T-matrix method is used for the scattering calculation of raindrops and the Rayleigh scattering approximation is applied to snow and ha...
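
The simulator's outputs come from T-matrix (rain) and Rayleigh (snow, hail) scattering calculations; as a small conceptual sketch of one derived variable plus the optional simulated error, differential reflectivity can be formed from the horizontal- and vertical-polarization reflectivities as follows. The 0.2-dB error standard deviation is an assumed placeholder.

```python
import numpy as np

def differential_reflectivity(z_h_linear, z_v_linear, err_std_db=0.2, rng=None):
    """Differential reflectivity ZDR (dB) from reflectivities at horizontal and
    vertical polarization in linear units (mm^6 m^-3), with optional additive
    Gaussian observation error."""
    rng = rng or np.random.default_rng()
    zdr = 10.0 * np.log10(z_h_linear / z_v_linear)
    return zdr + rng.normal(0.0, err_std_db, size=np.shape(zdr))
```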

Journal ArticleDOI
TL;DR: In this paper, the ability of different combinations of bulk cloud microphysics and planetary boundary layer (PBL) parameterization schemes implemented in the Weather Research and Forecasting Model to realistically simulate the wide variety of cloud types associated with an extratropical cyclone is examined.
Abstract: In this study, the ability of different combinations of bulk cloud microphysics and planetary boundary layer (PBL) parameterization schemes implemented in the Weather Research and Forecasting Model to realistically simulate the wide variety of cloud types associated with an extratropical cyclone is examined. An ensemble of high-resolution model simulations was constructed for this case using four microphysics and two PBL schemes characterized by different levels of complexity. Simulated cloud properties, including cloud optical thickness, cloud water path, cloud-top pressure, and radiative cloud phase, were subsequently compared to cloud data from three Moderate Resolution Imaging Spectroradiometer (MODIS) overpasses across different portions of the domain. A detailed comparison of the simulated datasets revealed that the PBL and cloud microphysics schemes both exerted a strong influence on the spatial distribution and physical properties of the simulated cloud fields. In particular, the low-level cloud properties were found to be very sensitive to the PBL scheme while the upper-level clouds were sensitive to both the microphysics and PBL schemes. Overall, the simulated cloud properties were broadly similar to the MODIS observations, with the most realistic cloud fields produced by the more sophisticated parameterization schemes.

Journal ArticleDOI
TL;DR: In this paper, a statistical prediction scheme, employing logistic regression, is developed to predict the probability of tropical cyclone (TC) formation in zones of the Southern Hemisphere during forthcoming weeks.
Abstract: A statistical prediction scheme, employing logistic regression, is developed to predict the probability of tropical cyclone (TC) formation in zones of the Southern Hemisphere during forthcoming weeks. Through physical reasoning, examination of previous research, and some new analysis, five predictors were chosen for this purpose: one representing the climatological seasonal cycle of TC activity in each zone, two representing the eastward propagation of the Madden–Julian oscillation (MJO), and a further two representing the leading patterns of interannual sea surface temperature variability in the Indo-Pacific Oceans. Cross-validated hindcasts were generated, being careful to use the predictors at lags that replicate what can be performed in real time. All predictors contribute significantly to the skill of the hindcasts for at least some leads in the majority of zones. In particular, it is found that inclusion of indices of the MJO as predictors leads to increased skill out to about the third wee...