
Showing papers in "Stochastic Environmental Research and Risk Assessment in 2010"


Journal ArticleDOI
TL;DR: Several meta-elliptical and Archimedean copulas were compared for modelling the dependence of periodic hydrologic data; the meta-Gaussian copula was selected and its results were found satisfactory.
Abstract: This study aims to model the joint probability distribution of periodic hydrologic data using meta-elliptical copulas. Monthly precipitation data from a gauging station (410120) in Texas, US, were used to illustrate parameter estimation and goodness-of-fit for univariate drought distributions using the chi-square test, Kolmogorov–Smirnov test, Cramer-von Mises statistic, Anderson-Darling statistic, modified weighted Watson statistic, and Liao and Shimokawa statistic. Pearson's classical correlation coefficient r_n, Spearman's ρ_n, Kendall's τ, Chi-Plots, and K-Plots were employed to assess the dependence of drought variables. Several meta-elliptical copulas and the Gumbel-Hougaard, Ali-Mikhail-Haq, Frank and Clayton copulas were tested to determine the best-fit copula. Based on the root mean square error and the Akaike information criterion, the meta-Gaussian and t copulas gave a better fit. A bootstrap version of a test based on Rosenblatt's transformation was employed to assess the goodness-of-fit of the meta-Gaussian and t copulas. It was found that neither the meta-Gaussian nor the t copula could be rejected at the given significance level. The meta-Gaussian copula was employed to model the dependence, and the results were found satisfactory.

190 citations
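The dependence-modelling step described above can be sketched numerically: for a meta-Gaussian copula, the correlation parameter can be recovered from Kendall's τ via ρ = sin(πτ/2). The snippet below is a minimal illustration on synthetic data (not the station-410120 record):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical correlated positive hydrologic pair; stands in for the
# monthly precipitation data, which are not reproduced here.
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
x, y = np.exp(z[:, 0]), np.exp(z[:, 1])  # lognormal-like positive marginals

# Rank-based dependence measures used in the study
tau, _ = stats.kendalltau(x, y)
rho_s, _ = stats.spearmanr(x, y)

# Meta-Gaussian copula parameter from Kendall's tau: rho = sin(pi * tau / 2)
rho_gauss = np.sin(np.pi * tau / 2)
```

Because Kendall's τ is invariant under monotone transformations of the marginals, the recovered ρ should be close to the 0.6 used to generate the latent Gaussian pair.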


Journal ArticleDOI
TL;DR: Streamflow series from five hydrological stations and daily precipitation and temperature data from 23 meteorological stations covering 1960-2005 were analyzed to characterize the variability of water resources in the Tarim River basin and to investigate the impacts of climate change on water resources.
Abstract: Streamflow series of five hydrological stations were analyzed with the aim of characterizing the variability of water resources in the Tarim River basin. In addition, the impacts of climate change on water resources were investigated by analyzing daily precipitation and temperature data of 23 meteorological stations covering 1960-2005. Some interesting and important results were obtained: (1) the study region is characterized by increasing temperature; however, only the temperature in autumn shows a significant increasing trend; (2) precipitation changes present different properties. Generally, increasing precipitation can be detected; however, only the precipitation in the Tienshan mountain area shows a significant increasing trend. Annual streamflow of the major rivers of the Tarim River basin shows no significant trends, except that of the Akesu River, which shows a significantly increasing trend. Due to the geomorphologic properties of the Tienshan mountain area, precipitation there demonstrates a significant increasing trend, which in turn leads to increasing streamflow of the Akesu River. Because the sources of streamflow of the rivers in the Tarim River basin are precipitation and glacial melt, both increasing precipitation and accelerating ice melt have the potential to cause increasing streamflow. These results are of practical and scientific merit for basin-scale water resource management in the arid regions of China under the changing environment.

172 citations


Journal ArticleDOI
TL;DR: In this paper, a stationary stochastic ARMA/ARIMA [Autoregressive Moving (Integrated) Average] modeling approach has been adapted to forecast daily mean ambient air pollutants (O3, CO, NO and NO2) at an urban traffic site (ITO) of Delhi, India.
Abstract: In the present study, a stationary stochastic ARMA/ARIMA [Autoregressive Moving (Integrated) Average] modelling approach has been adapted to forecast daily mean ambient air pollutant (O3, CO, NO and NO2) concentrations at an urban traffic site (ITO) of Delhi, India. A suitable variance stabilizing transformation has been applied to each time series in order to make them covariance stationary in a consistent way. A combination of different information criteria, namely AIC (Akaike Information Criterion), HIC (Hannan–Quinn Information Criterion), BIC (Bayesian Information Criterion), and FPE (Final Prediction Error), in addition to ACF (autocorrelation function) and PACF (partial autocorrelation function) inspection, has been tried out to obtain suitable orders of the autoregressive (p) and moving average (q) parameters for the ARMA(p,q)/ARIMA(p,d,q) models. Forecasting performance of the selected ARMA(p,q)/ARIMA(p,d,q) models has been evaluated on the basis of MAPE (mean absolute percentage error), MAE (mean absolute error) and RMSE (root mean square error) indicators. For 20 out-of-sample forecasts, the one-step (i.e., one day) ahead MAPE for CO, NO2, NO and O3 has been found to be 13.6, 12.1, 21.8 and 24.1%, respectively. Given the stochastic nature of air pollutant data and in the light of earlier reported studies on air pollutant forecasts, the forecasting performance of the present approach is satisfactory, and the suggested forecasting procedure can be effectively utilized for short-term air quality forewarning purposes.

171 citations
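The one-step-ahead forecasting and MAPE evaluation described above can be sketched with a plain least-squares AR(p) fit (a simplification of the full ARMA/ARIMA workflow; the series below is synthetic, not the ITO pollutant data):

```python
import numpy as np

def fit_ar(series, p):
    """Fit AR(p) coefficients (plus intercept) by ordinary least squares."""
    X = np.column_stack([series[p - k - 1 : len(series) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X, series[p:], rcond=None)
    return coef  # [intercept, phi_1, ..., phi_p]

def one_step_forecast(series, coef, p):
    """One-step-ahead forecast from the last p observations."""
    return coef[0] + coef[1:] @ series[-1 : -p - 1 : -1]

# Hypothetical daily pollutant-like series: a synthetic AR(2) process
rng = np.random.default_rng(0)
n = 400
x = np.zeros(n)
for t in range(2, n):
    x[t] = 5 + 0.5 * x[t - 1] + 0.3 * x[t - 2] + rng.normal(0, 1)

coef = fit_ar(x[:-20], p=2)
# 20 out-of-sample one-step forecasts, mirroring the evaluation above
preds = np.array([one_step_forecast(x[: n - 20 + i], coef, 2) for i in range(20)])
actual = x[n - 20:]
mape = 100 * np.mean(np.abs((actual - preds) / actual))
```

For a well-specified AR(2) series like this one, the one-step MAPE stays in the low single digits; the double-digit MAPEs reported above reflect the much noisier real pollutant data.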


Journal ArticleDOI
TL;DR: A trivariate Plackett copula is used to model the joint probability distribution of drought duration, severity and inter-arrival time, tested with streamflow data from three gaging stations.
Abstract: This study aims to model the joint probability distribution of drought duration, severity and inter-arrival time using a trivariate Plackett copula. The drought duration and inter-arrival time each follow the Weibull distribution and the drought severity follows the gamma distribution. Parameters of these univariate distributions are estimated using the method of moments (MOM), maximum likelihood method (MLM), probability weighted moments (PWM), and a genetic algorithm (GA); whereas parameters of the bivariate and trivariate Plackett copulas are estimated using the log-pseudolikelihood function method (LPLF) and GA. Streamflow data from three gaging stations, Zhuangtou, Taian and Tianyang, located in the Wei River basin, China, are employed to test the trivariate Plackett copula. The results show that the Plackett copula is capable of yielding bivariate and trivariate probability distributions of correlated drought variables.

146 citations
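Estimating the univariate marginals by the method of moments (MOM), one of the four estimation methods listed above, is straightforward for the gamma-distributed drought severity: shape = mean²/variance, scale = variance/mean. A sketch on a synthetic severity sample (not the Wei River data):

```python
import numpy as np
from scipy import stats

def gamma_mom(sample):
    """Method-of-moments estimates for the gamma distribution:
    shape k = mean^2 / var, scale theta = var / mean."""
    m, v = np.mean(sample), np.var(sample)
    return m * m / v, v / m

rng = np.random.default_rng(1)
# Hypothetical drought-severity sample; true parameters shape=2, scale=3
sev = rng.gamma(shape=2.0, scale=3.0, size=2000)
k_hat, theta_hat = gamma_mom(sev)

# Maximum likelihood (also used in the study) for comparison, location fixed at 0
k_mle, _, theta_mle = stats.gamma.fit(sev, floc=0)
```

Both estimators should land near the true (2, 3) on a sample this size; MLM is generally the more efficient of the two.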


Journal ArticleDOI
TL;DR: It is shown how location-dependent covariates, e.g. a spatial trend, can be accounted for in spatial copula models, and geostatistical copula-based models are introduced that are able to deal with random fields having discrete marginal distributions.
Abstract: It is common in geostatistics to use the variogram to describe the spatial dependence structure and to use kriging as the spatial prediction methodology. Both methods are sensitive to outlying observations and are strongly influenced by the marginal distribution of the underlying random field. Hence, they lead to unreliable results when applied to extreme value or multimodal data. As an alternative to traditional spatial modeling and interpolation we consider the use of copula functions. This paper extends existing copula-based geostatistical models. We show how location-dependent covariates, e.g. a spatial trend, can be accounted for in spatial copula models. Furthermore, we introduce geostatistical copula-based models that are able to deal with random fields having discrete marginal distributions. We propose three different copula-based spatial interpolation methods. By exploiting the relationship between bivariate copulas and indicator covariances, we present indicator kriging and disjunctive kriging. As a second method we present simple kriging of the rank-transformed data. The third method is a plug-in prediction and generalizes the frequently applied trans-Gaussian kriging. Finally, we report on the results obtained for the so-called Helicopter data set, which contains extreme radioactivity measurements.

133 citations
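Simple kriging of rank-transformed data, the second of the three interpolation methods listed above, starts with a normal-score transform of the observations. A minimal sketch (synthetic heavy-tailed data standing in for the Helicopter measurements):

```python
import numpy as np
from scipy import stats

def normal_score(values):
    """Rank-transform data to standard-normal scores: the transform that
    precedes simple kriging on rank-transformed data."""
    ranks = stats.rankdata(values)      # 1..n, ties averaged
    u = ranks / (len(values) + 1)       # uniform plotting positions in (0, 1)
    return stats.norm.ppf(u)

# Hypothetical heavy-tailed measurements (radioactivity-like extremes)
rng = np.random.default_rng(7)
obs = rng.lognormal(mean=0.0, sigma=1.5, size=300)
scores = normal_score(obs)
```

The scores are approximately standard normal regardless of how skewed or extreme the original marginal is, which is exactly why the rank-based approach is robust to outlying observations.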


Journal ArticleDOI
TL;DR: The results convincingly demonstrate the advantages of GRA for hydrologic applications: it achieves performance similar to MMA and BMA, but is much simpler to implement and use, and computationally much less demanding.
Abstract: Multi-model averaging is currently receiving a surge of attention in the atmospheric, hydrologic, and statistical literature to explicitly handle conceptual model uncertainty in the analysis of environmental systems and derive predictive distributions of model output. Such density forecasts are necessary to help analyze which parts of the model are well resolved, and which parts are subject to considerable uncertainty. Yet, accurate point predictors are still desired in many practical applications. In this paper, we compare a suite of different model averaging techniques by their ability to improve forecast accuracy of environmental systems. We compare equal weights averaging (EWA), Bates-Granger model averaging (BGA), averaging using Akaike's information criterion (AICA) and Bayes' information criterion (BICA), Bayesian model averaging (BMA), Mallows model averaging (MMA), and Granger-Ramanathan averaging (GRA) for two different hydrologic systems involving water flow through a 1950 km2 watershed and a 5 m deep vadose zone. Averaging methods with weights restricted to the multi-dimensional simplex (positive weights summing up to one) are shown to have considerably larger forecast errors than approaches with unconstrained weights. Whereas various sophisticated model averaging approaches have recently emerged in the literature, our results convincingly demonstrate the advantages of GRA for hydrologic applications. This method achieves similar performance as MMA and BMA, but is much simpler to implement and use, and computationally much less demanding.

127 citations
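Granger-Ramanathan averaging, whose simplicity the abstract emphasizes, amounts to an unconstrained least-squares regression of the observations on the ensemble of model forecasts. A sketch with a synthetic three-member ensemble (not the watershed or vadose-zone data):

```python
import numpy as np

def gra_weights(forecasts, observed):
    """Granger-Ramanathan averaging: unconstrained least-squares weights
    (with intercept) from regressing observations on model forecasts."""
    X = np.column_stack([np.ones(len(observed)), forecasts])
    beta, *_ = np.linalg.lstsq(X, observed, rcond=None)
    return beta  # [intercept, w_1, ..., w_K]

# Hypothetical ensemble: three biased/noisy "models" of a truth series
rng = np.random.default_rng(3)
truth = rng.normal(10, 2, size=200)
models = np.column_stack([
    truth + rng.normal(1.0, 1.0, 200),     # biased member
    0.8 * truth + rng.normal(0, 1.5, 200), # damped member
    truth + rng.normal(0, 0.5, 200),       # accurate member
])

beta = gra_weights(models, truth)
combined = beta[0] + models @ beta[1:]

rmse_gra = np.sqrt(np.mean((combined - truth) ** 2))
rmse_ewa = np.sqrt(np.mean((models.mean(axis=1) - truth) ** 2))
```

Because the equal-weights combination lies inside the family of linear combinations that GRA searches, the in-sample GRA error can never exceed the EWA error; the unconstrained weights also absorb member biases, which simplex-constrained weights cannot.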


Journal ArticleDOI
TL;DR: In this paper, linear stochastic models, namely autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, were used to predict drought in the Buyuk Menderes river basin with the Standardized Precipitation Index (SPI) as the drought index.
Abstract: In the present study, seasonal and non-seasonal prediction of the Standardized Precipitation Index (SPI) time series is addressed by means of linear stochastic models. The methodology presented here is to develop adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, to predict drought in the Buyuk Menderes river basin using the SPI as a drought index. Temporal characteristics of droughts based on the SPI as an indicator of drought severity indicate that the basin was affected by severe and more or less prolonged periods of drought from 1975 to 2006. Therefore, drought prediction plays an important role in water resources management. The ARIMA modeling approach involves three steps: model identification, parameter estimation, and diagnostic checking. In the model identification step, considering the autocorrelation function (ACF) and partial autocorrelation function (PACF) results of the SPI series, different ARIMA models are identified. The model giving the minimum Akaike Information Criterion (AIC) and Schwarz Bayesian Criterion (SBC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models, and the results indicate that the residuals are independent, normally distributed and homoscedastic. For model validation purposes, the predicted results using the best ARIMA models are compared to the observed data, and the predictions show reasonably good agreement with the actual data. The ARIMA models developed to predict drought were found to give acceptable results up to 2 months ahead. The stochastic models developed for the Buyuk Menderes river basin can thus be employed to predict droughts up to 2 months of lead time with reasonable accuracy.

110 citations
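The SPI underlying the drought models above is computed by fitting a distribution (commonly a gamma) to precipitation totals and mapping the fitted CDF through the standard-normal quantile function. A simplified sketch on synthetic data (zero-precipitation handling, needed for real records, is omitted):

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index: fit a gamma distribution to the
    precipitation sample and map its CDF through the standard-normal
    quantile. (Sketch only: months with zero precipitation are ignored.)"""
    shape, _, scale = stats.gamma.fit(precip, floc=0)
    cdf = stats.gamma.cdf(precip, shape, loc=0, scale=scale)
    return stats.norm.ppf(cdf)

# Hypothetical monthly precipitation totals (mm), 30 years of months
rng = np.random.default_rng(5)
precip = rng.gamma(shape=2.0, scale=40.0, size=360)
index = spi(precip)
# By construction the index is ~N(0, 1); SPI <= -2 is commonly read as
# extreme drought, values near 0 as normal conditions.
```

An operational SPI additionally stratifies by calendar month and aggregation window (1-, 3-, 12-month totals and so on); this sketch shows only the distributional core of the index.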


Journal ArticleDOI
TL;DR: In this article, a regional flood frequency analysis and recognition of spatial patterns for flood-frequency variations in the Pearl River Delta (PRD) region using the well-known index flood L-moments approach together with some advanced statistical test and spatial analysis methods.
Abstract: The Pearl River Delta (PRD) has one of the most complicated deltaic drainage systems, with probably the highest density of crisscrossing river network in the world. This article presents a regional flood frequency analysis and recognition of spatial patterns of flood-frequency variation in the PRD region using the well-known index-flood L-moments approach together with advanced statistical tests and spatial analysis methods. Results indicate that: (1) the whole PRD region is definitely heterogeneous according to the heterogeneity test and can be divided into three homogeneous regions; (2) the spatial maps of annual maximum flood stage corresponding to different return periods in the PRD region suggest that the flood stage decreases gradually from the riverine system to the tide-dominated coastal areas; (3) from a regional perspective, the spatial patterns of flood-frequency variation demonstrate the most serious flood risk in the coastal region because it is extremely prone to emerging flood hazards, typhoons, storm surges and well-evidenced sea-level rise. Excessive rainfall in the upstream basins will lead to moderate floods in the upper and middle PRD region. The flood risks of the remaining parts are identified as the lowest in the entire PRD. In order to obtain more reliable estimates, stationarity and serial independence are tested prior to the frequency analysis. The characterization of the spatial patterns of flood-frequency variation is conducted to reveal the potential influences of climate change and intensified human activities. These findings will contribute to formulating regional development strategies for policymakers and stakeholders in water resource management against the menaces of frequently emerging floods and well-evidenced sea-level rise.

83 citations
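The index-flood L-moments approach above rests on sample L-moments, which are computed from probability-weighted moments b0, b1, b2 of the ordered sample. A sketch on a synthetic annual-maximum series (a Gumbel parent, for which l2 = β ln 2 and L-skewness t3 ≈ 0.17 are known):

```python
import numpy as np

def sample_lmoments(x):
    """First three sample L-moments (l1, l2, l3) and L-skewness t3,
    via the unbiased probability-weighted moments b0, b1, b2."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) * x) / (n * (n - 1))
    b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

# Hypothetical annual-maximum flood stages at one site: Gumbel(loc=5, scale=1)
rng = np.random.default_rng(11)
amax = rng.gumbel(loc=5.0, scale=1.0, size=5000)
l1, l2, t3 = sample_lmoments(amax)
```

In a regional analysis the L-moment ratios (t3, t4) of each site feed the heterogeneity test and the choice of regional distribution; here they simply recover the Gumbel's theoretical values.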


Journal ArticleDOI
TL;DR: In this article, the authors investigate the hypothesis that non-linearity matters in the spatial mapping of complex patterns of groundwater arsenic contamination and find that the use of a highly nonlinear pattern learning technique in the form of an artificial neural network (ANN) can yield more accurate results under the same set of constraints when compared to the ordinary kriging method.
Abstract: In this technical note, we investigate the hypothesis that 'non-linearity matters in the spatial mapping of complex patterns of groundwater arsenic contamination'. The spatial mapping pertains to data-driven techniques of spatial interpolation based on sampling data at finite locations. Using the well-known example of extensive groundwater contamination by arsenic in Bangladesh, we find that the use of a highly non-linear pattern learning technique in the form of an artificial neural network (ANN) can yield more accurate results under the same set of constraints when compared to the ordinary kriging method. One ANN and one variogram model were used to represent the spatial structure of arsenic contamination for the whole country. The probability of successful detection of a well as safe or unsafe was found to be at least 15% larger than that by kriging under the country-wide scenario. The probability of false hopes, which is a serious issue in public health monitoring, was found to be significantly lower (by more than 10%) than that by kriging.

69 citations


Journal ArticleDOI
TL;DR: A visually enhanced evaluation of the spatio-temporal patterns of dam-induced hydrologic alteration in the middle and upper East River, south China, over 1952-2002 is presented, using the range of variability approach (RVA) and the visualization package XmdvTool.
Abstract: This paper presents a visually enhanced evaluation of the spatio-temporal patterns of dam-induced hydrologic alteration in the middle and upper East River, south China, over 1952-2002, using the range of variability approach (RVA) and the visualization package XmdvTool. The impacts of climate variability on hydrological processes have been removed for wet and dry periods, respectively, so that the focus is on the impacts of human activities (i.e., dam construction). The results indicate that: (1) along the East River, dams have greatly altered the natural flow regime, range condition and spatial variability; (2) the six most remarkable indicators of hydrologic alteration induced by dam construction are rise rate (1.16), 3-day maximum (0.91), low pulse duration (0.88), and January (0.80), July (0.80) and February (0.79) mean flow of the East River during 1952-2002; and (3) spatio-temporal hydrologic alterations differ among the three stations along the East River. Under the influence of dam construction upstream, the degree of hydrologic change increases from Lingxia through Heyuan to Longchuan station. This study reveals that visualization techniques for high-dimensional hydrological datasets, together with RVA, are beneficial for detecting spatio-temporal hydrologic changes.

69 citations
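The range of variability approach (RVA) used above scores each flow indicator by how often post-impact values fall inside pre-impact percentile targets, commonly the 25th-75th percentile range. A minimal sketch on synthetic pre-/post-dam series (the alteration values quoted in the abstract come from the real East River record, not from this illustration):

```python
import numpy as np

def rva_alteration(pre, post, lo=25, hi=75):
    """Degree of hydrologic alteration for one flow indicator:
    |observed - expected| frequency of post-dam years falling inside
    the pre-dam lo-hi percentile targets, relative to expected."""
    low, high = np.percentile(pre, [lo, hi])
    expected = (hi - lo) / 100.0                 # 0.5 under no change
    observed = np.mean((post >= low) & (post <= high))
    return abs(observed - expected) / expected

# Hypothetical indicator (e.g. 3-day maximum flow), 30 years each side
rng = np.random.default_rng(9)
pre = rng.normal(100, 15, size=30)   # pre-dam regime
post = rng.normal(70, 5, size=30)    # dam reduces and flattens the indicator
alteration = rva_alteration(pre, post)
```

A value near 0 means the post-dam regime still falls in the natural range at the expected rate; a value near 1, as in this synthetic shifted-regime example, signals strong alteration.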


Journal ArticleDOI
TL;DR: In this article, climate projections for the Huaihe River Basin, China, for 2001-2100 are derived from the ECHAM5/MPI-OM model based on observed precipitation and temperature data covering 1964-2007, and streamflow is projected using artificial neural networks.
Abstract: Climate projections for the Huaihe River Basin, China, for the years 2001–2100 are derived from the ECHAM5/MPI-OM model based on observed precipitation and temperature data covering 1964–2007. Streamflow for the Huaihe River under three emission scenarios (SRES-A2, A1B, B1) from 2010 to 2100 is then projected by applying artificial neural networks (ANN). The results show that annual streamflow will change significantly under the three scenarios from 2010 to 2100. The interannual fluctuations cover a significant increasing streamflow trend under the SRES-A2 scenario (2051–2085). The streamflow trend declines gradually under the SRES-A1B scenario (2024–2037), and shows no obvious trend under the SRES-B1 scenario. From 2010 to 2100, the correlation coefficient between the observed and modeled streamflow under the SRES-A2 scenario is the best of the three scenarios. Combining the SRES-A2 scenario of the ECHAM5 model and ANN might therefore be the best approach for assessing and projecting future water resources in the Huaihe basin and other catchments. Compared to the observed period of streamflows, the projected periodicity of streamflows shows significant changes under the different emission scenarios. Under the A2 and A1B scenarios, the dominant period would lengthen to about 32–33 and 27–28 years, respectively, while under the B1 scenario it would remain about 5–6 years (the observed period is about 7–8 years). All this might affect drought/flood management, water supply and irrigation projects in the Huaihe River basin.

Journal ArticleDOI
TL;DR: A method integrating the Mann-Kendall trend test, wavelet transform analysis and spatial mapping techniques is presented to identify the temporal and spatial patterns of low-flow changes in the Yellow River (1955-2005).
Abstract: Low-flow is widely regarded as the primary flow conditions for the anthropogenic and aquatic communities in most rivers, particularly in such an arid and semi-arid area as the Yellow River. This study presents a method integrating Mann-Kendall trend test, wavelet transform analysis and spatial mapping techniques to identify the temporal and spatial patterns of low-flow changes in the Yellow River (1955-2005). The results indicate that: (1) no trend can be identified in the major low-flow conditions in the upper Yellow River, but downward trends can be found in the middle and lower Yellow River; (2) similar periodic patterns are detected in the 7-day minima (AM7Q) in the upper and middle Yellow River, while different patterns are found in the lower Yellow River; (3) the increasing coefficients of variance in the primary low-flow conditions suggest that the variability of the low-flow is increasing from the upper to lower stream; (4) climate change and uneven temporal-spatial patterns of precipitation, jointly with highly intensified water resource utilization, are recognized as the major factors that led to the decrease of low-flow in the lower Yellow River in recent decades. The current investigation should be helpful for regional water resources management in the Yellow River basin, which is characterized by serious water shortage.

Journal ArticleDOI
TL;DR: In this article, Bayesian analysis is applied to the estimation of intensity-duration-frequency (IDF) curves, which are used extensively to assess the return periods of rainfall events and often steer decisions on urban water structures such as sewers, pipes and retention basins.
Abstract: Intensity–duration–frequency (IDF) curves are used extensively in engineering to assess the return periods of rainfall events and often steer decisions in urban water structures such as sewers, pipes and retention basins. In the province of Quebec, precipitation time series are often short, leading to a considerable uncertainty on the parameters of the probabilistic distributions describing rainfall intensity. In this paper, we apply Bayesian analysis to the estimation of IDF curves. The results show the extent of uncertainties in IDF curves and the ensuing risk of their misinterpretation. This uncertainty is even more problematic when IDF curves are used to estimate the return period of a given event. Indeed, standard methods provide overly large return period estimates, leading to a false sense of security. Comparison of the Bayesian and classical approaches is made using different prior assumptions for the return period and different estimation methods. A new prior distribution is also proposed based on subjective appraisal by witnesses of the extreme character of the event.
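The return-period estimation whose uncertainty the Bayesian analysis above addresses reduces, in the classical setting, to T(x) = 1/(1 − F(x)) with F a fitted extreme-value distribution. The sketch below assumes a Gumbel fit for illustration on synthetic annual maxima (not the Quebec records):

```python
import numpy as np
from scipy import stats

def return_period(x, loc, scale):
    """Return period implied by a fitted Gumbel distribution:
    T(x) = 1 / (1 - F(x))."""
    return 1.0 / (1.0 - stats.gumbel_r.cdf(x, loc=loc, scale=scale))

# Hypothetical annual-maximum rainfall intensities for one duration (mm/h):
# a deliberately short record, as is typical in the province of Quebec
rng = np.random.default_rng(2)
annual_max = rng.gumbel(loc=40.0, scale=8.0, size=35)
loc_hat, scale_hat = stats.gumbel_r.fit(annual_max)

T_hat = return_period(70.0, loc_hat, scale_hat)
# With only ~35 years of data, the sampling spread of (loc, scale) makes
# T_hat itself highly uncertain: the point the Bayesian treatment targets.
```

A Bayesian analysis would replace the single (loc_hat, scale_hat) point by a posterior distribution and report the induced distribution of T, rather than this single, possibly overconfident, estimate.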

Journal ArticleDOI
TL;DR: In this article, the nonstationary variogram technique employed generalises a moving window kriging (MWK) model, replacing classic variogram (CV) estimators with information-rich, geographically weighted variogram (GWV) estimators.
Abstract: This study adds to our ability to predict the unknown by empirically assessing the performance of a novel geostatistical-nonparametric hybrid technique to provide accurate predictions of the value of an attribute together with locally-relevant measures of prediction confidence, at point locations for a single realisation spatial process. The nonstationary variogram technique employed generalises a moving window kriging (MWK) model where classic variogram (CV) estimators are replaced with information-rich, geographically weighted variogram (GWV) estimators. The GWVs are constructed using kernel smoothing. The resultant and novel MWK–GWV model is compared with a standard MWK model (MWK–CV), a standard nonlinear model (Box–Cox kriging, BCK) and a standard linear model (simple kriging, SK), using four example datasets. Exploratory local analyses suggest that each dataset may benefit from a MWK application. This expectation was broadly confirmed once the models were applied. Model performance results indicate much promise in the MWK–GWV model. Situations where a MWK model is preferred to a BCK model and where a MWK–GWV model is preferred to a MWK–CV model are discussed with respect to model performance, parameterisation and complexity; and with respect to sample scale, information and heterogeneity.

Journal ArticleDOI
TL;DR: This work uses space–time variability in historical data and projections of future population density to improve forecasting of residential water demand in the City of Phoenix, Arizona, and proposes a methodology to generate information-rich future estimates that benefit from the uncertain projections of the future.
Abstract: Managing environmental and social systems in the face of uncertainty requires the best possible forecasts of future conditions. We use space–time variability in historical data and projections of future population density to improve forecasting of residential water demand in the City of Phoenix, Arizona. Our future water estimates are derived using the first and second order statistical moments between a dependent variable, water use, and an independent variable, population density. The independent variable is projected at future points, and remains uncertain. We use adjusted statistical moments that cover projection errors in the independent variable, and propose a methodology to generate information-rich future estimates. These updated estimates are processed in Bayesian Maximum Entropy (BME), which produces maps of estimated water use to the year 2030. Integrating the uncertain estimates into the space–time forecasting process improves forecasting accuracy up to 43.9% over other space–time mapping methods that do not assimilate the uncertain estimates. Further validation studies reveal that BME is more accurate than co-kriging that integrates the error-free independent variable, but shows similar accuracy to kriging with measurement error that processes the uncertain estimates. Our proposed forecasting method benefits from the uncertain estimates of the future, provides up-to-date forecasts of water use, and can be adapted to other socio-economic and environmental applications.

Journal ArticleDOI
TL;DR: The results show that it is possible to put together concepts that appear to be incompatible: the deterministic estimate of PMF, taken as a theoretical limit for floods, and the frequency analysis of maximum flows, with the inclusion of non-systematic data.
Abstract: Some recent research on fluvial processes suggests the idea that some hydrological variables, such as flood flows, are upper-bounded. However, most probability distributions that are currently employed in flood frequency analysis are unbounded to the right. This paper describes an exploratory study on the joint use of an upper-bounded probability distribution and non-systematic flood information, within a Bayesian framework. Accordingly, the current PMF maximum discharge appears as a reference value and a reasonable estimate of the upper-bound for maximum flows, despite the fact that PMF determination is not unequivocal and depends strongly on the available data. In the Bayesian context, the uncertainty on the PMF can be included into the analysis by considering an appropriate prior distribution for the maximum flows. In the sequence, systematic flood records, historical floods, and paleofloods can be included into a compound likelihood function which is then used to update the prior information on the upper-bound. By combining a prior distribution describing the uncertainties of PMF estimates along with various sources of flood data into a unified Bayesian approach, the expectation is to obtain improved estimates of the upper-bound. The application example was conducted with flood data from the American river basin, near the Folsom reservoir, in California, USA. The results show that it is possible to put together concepts that appear to be incompatible: the deterministic estimate of PMF, taken as a theoretical limit for floods, and the frequency analysis of maximum flows, with the inclusion of non-systematic data. As compared to conventional analysis, the combination of these two concepts within the logical context of Bayesian theory, contributes an advance towards more reliable estimates of extreme floods.

Journal ArticleDOI
TL;DR: In this article, different sources of uncertainty in statistical modeling of extreme hydrological events are studied in a systematic way by focusing on several key uncertainty sources using three different case studies, and the chosen case studies highlight a number of projects where there have been questions regarding the uncertainty in extreme rainfall/flood estimation.
Abstract: With the increase in both magnitude and frequency of hydrological extreme events such as droughts and floods, the significance of adequately modeling hydrological extreme events is fully recognized. Estimation of extreme rainfall/flood for various return periods is of prime importance for hydrological design and risk assessment. However, due to knowledge and data limitations, the uncertainty involved in extrapolating beyond the available data is huge. In this paper, different sources of uncertainty in statistical modeling of extreme hydrological events are studied in a systematic way. This is done by focusing on several key uncertainty sources using three different case studies. The chosen case studies highlight a number of projects where there have been questions regarding the uncertainty in extreme rainfall/flood estimation. The results show that the uncertainty originating from the methodology is the largest and can exceed 40% for a return period of 200 years, while the uncertainty caused by ignoring the dependence among multiple hydrological variables seems the smallest. In the end, it is highly recommended that uncertainty in modeling extreme hydrological events be fully recognized and incorporated into a formal hydrological extreme analysis.

Journal ArticleDOI
TL;DR: This article introduces, characterizes and applies an extended version of the Birnbaum–Saunders model based on the Mudholkar–Hutson skew distribution, finding the density, distribution function, and moments of the new model.
Abstract: In this article, we introduce, characterize and apply an extended version of the Birnbaum–Saunders model based on the Mudholkar–Hutson skew distribution. This model is appropriate for describing phenomena involving accumulation of some type, as is the case of environmental contamination. Specifically, we find the density, distribution function, and moments of the new model. In addition, we derive several properties and transformations related to this distribution. Furthermore, we propose an estimation method for the parameters of the model. Moreover, we conduct a study of its hazard rate focused on environmental analysis. A computational implementation of the obtained results in the R language is discussed. Finally, we present two examples with real data on environmental quality in Chile that illustrate the proposed methodology.

Journal ArticleDOI
TL;DR: In this paper, an integrative fuzzy set pair model for assessing land ecological security was developed by integrating fuzzy assessment and set pair analysis (SPA), calculating the degree of approximation of land eco-security to the optimal standard set by combining multiple indices.
Abstract: Due to the increasingly serious ecological degradation of land systems, land ecological security issues have attracted more and more attention from policy makers, researchers and citizens. Aiming at overcoming the subjectivity and complexity of currently used assessment methods, an integrative fuzzy set pair model for assessing land ecological security was developed by integrating fuzzy assessment and set pair analysis (SPA). The degree of approximation of land ecological security to the optimal standard set was calculated to describe the security level by combining multiple indices. The indices and weights were determined by a pressure-state-response model and the fuzzy analytic hierarchy process (AHP), respectively. Aided by a geographic information system, this model was applied to comprehensively evaluate the status of land ecological security in the Xiaolangdi Reservoir Region, China, taking the administrative division as the assessment unit. The results showed that 20% of the total area maintained a slightly secure status, while 50% of the study area was of a middle or seriously low grade of land ecological security. The remaining portion (30%) was the most ecologically insecure. From the spatial perspective, obvious variations were observed: the land eco-security gradually decreased from the Xiaolangdi Dam to its surrounding regions. It was concluded that the integral land ecological security of the Xiaolangdi Reservoir Region was at a middle level, and that increasingly intense human activities have sped up the degradation of the regional land ecosystem in recent years and thus induced a crisis of land ecological security.

Journal ArticleDOI
TL;DR: An adaptive neuro-fuzzy inference system (ANFIS) approach is used to construct a monthly sediment forecasting system; the ANFIS is found preferable and can be applied successfully because it provides high accuracy and reliability for forecasting monthly total sediment.
Abstract: Accurate forecasting of sediment is an important issue for reservoir design and water pollution control in rivers and reservoirs. In this study, an adaptive neuro-fuzzy inference system (ANFIS) approach is used to construct a monthly sediment forecasting system. To illustrate the applicability of the ANFIS method, the Great Menderes basin is chosen as the study area. Models with various input structures are constructed in order to identify the best structure. The performance of the ANFIS models on training and testing sets is compared with the observed data. To evaluate the ANFIS results more rigorously, the best-fit model structures are also tested with artificial neural network (ANN) and multiple linear regression (MLR) methods. The results of the three methods are compared, and the ANFIS is found preferable and applicable because it provides high accuracy and reliability for forecasting monthly total sediment.
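Of the three methods compared, the MLR baseline is simple enough to sketch directly: monthly sediment is regressed on lagged predictors by ordinary least squares. The predictors, synthetic data, and lag structure below are illustrative assumptions, not the Great Menderes inputs.

```python
# MLR baseline for monthly sediment forecasting: ordinary least squares on
# current and previous-month discharge. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 120  # ten years of monthly records
flow = rng.gamma(shape=2.0, scale=50.0, size=n)           # hypothetical discharge
sediment = 0.4 * flow + 5.0 + rng.normal(0, 5.0, size=n)  # hypothetical target

# Design matrix: current flow, previous-month flow, intercept.
X = np.column_stack([flow[1:], flow[:-1], np.ones(n - 1)])
y = sediment[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ coef
rmse = np.sqrt(np.mean((y - pred) ** 2))
print(f"MLR coefficients: {coef.round(2)}, RMSE: {rmse:.2f}")
```

The RMSE computed this way is the same skill score one would then compare against the ANFIS and ANN models on a held-out test set.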

Journal ArticleDOI
TL;DR: In this article, the authors present new ideas on sampling design and minimax prediction in a geostatistical model setting, which are based on regression design ideas for linear regression models with uncorrelated errors.
Abstract: This paper presents new ideas on sampling design and minimax prediction in a geostatistical model setting. Both presented methodologies are based on regression design ideas. For this reason the appendix of this paper gives an introduction to optimum Bayesian experimental design theory for linear regression models with uncorrelated errors. The presented methodologies and algorithms are then applied to the spatial setting of correlated random fields. To be specific, in Sect. 1 we will approximate an isotropic random field by means of a regression model with a large number of regression functions with random amplitudes, similarly to Fedorov and Flanagan (J Combat Inf Syst Sci: 23, 1997). These authors make use of the Karhunen–Loève approximation of the isotropic random field. We use the so-called polar spectral approximation instead; i.e. we approximate the isotropic random field by means of a regression model with sine-cosine-Bessel surface harmonics with random amplitudes and then, in accordance with Fedorov and Flanagan (J Combat Inf Syst Sci: 23, 1997), apply standard Bayesian experimental design algorithms to the resulting Bayesian regression model. Section 2 deals with minimax prediction when the covariance function is known to vary in some set of a priori plausible covariance functions. Using a minimax theorem due to Sion (Pac J Math 8:171–176, 1958) we are able to formulate the minimax problem as being equivalent to an optimum experimental design problem, too. This makes the whole experimental design apparatus available for finding minimax kriging predictors. Furthermore some hints are given how the approach to spatial sampling design with one a priori fixed covariance function may be extended by means of minimax kriging to a whole set of a priori plausible covariance functions such that the resulting designs are robust. The theoretical developments are illustrated with two examples taken from radiological monitoring and soil science.
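The kriging predictor that the design machinery above optimizes can be sketched in a few lines. This is plain ordinary kriging under an assumed exponential covariance with illustrative monitoring locations, not the paper's polar spectral approximation or its data.

```python
# Ordinary kriging: solve the augmented covariance system so the weights
# sum to one, then predict as a weighted mean of observed values.
import numpy as np

def exp_cov(h, sill=1.0, length=2.0):
    """Isotropic exponential covariance C(h) = sill * exp(-h / length)."""
    return sill * np.exp(-h / length)

def ordinary_kriging(sites, values, target):
    n = len(sites)
    d = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
    # Augmented system: covariance block plus a Lagrange multiplier row/column
    # enforcing that the weights sum to one.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_cov(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_cov(np.linalg.norm(sites - target, axis=-1))
    sol = np.linalg.solve(A, b)
    weights = sol[:n]
    return weights @ values, weights

sites = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
vals = np.array([1.0, 2.0, 1.5, 3.0])
pred, w = ordinary_kriging(sites, vals, np.array([0.5, 0.5]))
print(f"prediction {pred:.3f}, weights sum {w.sum():.3f}")
```

A minimax version would repeat this solve over a set of plausible covariance functions and pick the predictor with the best worst-case error, which is where the equivalence to an experimental design problem becomes useful.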

Journal ArticleDOI
TL;DR: Through SI-IFSCCP, multiple uncertainties expressed as intervals, possibilistic and probabilistic distributions, as well as their combinations, can be directly incorporated into the optimization process, enhancing system robustness and tackling fuzziness and two-layer randomness.
Abstract: A superiority-inferiority-based inexact fuzzy-stochastic chance-constrained programming (SI-IFSCCP) approach is developed for supporting long-term municipal solid waste management under uncertainty. Through SI-IFSCCP, multiple uncertainties expressed as intervals, possibilistic and probabilistic distributions, as well as their combinations, can be directly incorporated into the optimization process, leading to enhanced system robustness. Through tackling fuzziness and two-layer randomness, various subjective judgments of many stakeholders with different interests and preferences can be extensively reflected, guaranteeing a lower degree of bias during data sampling and a higher degree of public acceptance for the generated plans. Two levels of system-violation risk can also be captured by SI-IFSCCP, reflecting the relationship between economic efficiency and system reliability. A two-step solution method with improved computational efficiency is proposed for SI-IFSCCP. To demonstrate its applicability, the developed methodology is then applied to a long-term municipal solid waste management problem, and useful solutions have been generated. Satisfactory waste flow plans can be identified according to system conditions and policy inclination, supporting in-depth tradeoff analyses between system optimality and reliability as well as between economic and environmental objectives.
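The chance-constraint ingredient of approaches like SI-IFSCCP has a standard deterministic equivalent worth sketching: if a constraint has a normally distributed right-hand side b ~ N(mu, sigma²) and must hold with probability at least alpha, it can be replaced by a·x ≤ mu + sigma·Φ⁻¹(1 − alpha). The waste-flow objective and all numbers below are illustrative, not the study's data.

```python
# Chance-constrained LP: tighten a random capacity to its deterministic
# equivalent, then solve the resulting LP. Numbers are illustrative.
from scipy.optimize import linprog
from scipy.stats import norm

mu, sigma, alpha = 100.0, 10.0, 0.95
rhs = mu + sigma * norm.ppf(1 - alpha)   # tightened deterministic capacity

# Maximize 3*x1 + 2*x2 (a stand-in waste-flow benefit)
# subject to x1 + x2 <= rhs and x >= 0.
res = linprog(c=[-3.0, -2.0], A_ub=[[1.0, 1.0]], b_ub=[rhs],
              bounds=[(0, None), (0, None)])
print(f"capacity at alpha={alpha}: {rhs:.1f}, optimum: {-res.fun:.1f}")
```

Raising alpha shrinks the usable capacity, which is exactly the efficiency-versus-reliability tradeoff the abstract describes between the two levels of system-violation risk.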

Journal ArticleDOI
TL;DR: This work compares the forecast skill of the average of the collection of science-based computational models to the skills of the individual members and gives general criteria for the average to perform more or less skillfully than the most skillful individual model, the “best” model.
Abstract: Given a collection of science-based computational models that all estimate states of the same environmental system, we compare the forecast skill of the average of the collection to the skills of the individual members. We illustrate our results through an analysis of regional climate model data and give general criteria for the average to perform more or less skillfully than the most skillful individual model, the “best” model. The average will only be more skillful than the best model if the individual models in the collection produce very different forecasts; if the individual forecasts generally agree, the average will not be as skillful as the best model.
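The paper's criterion is easy to reproduce numerically: average two toy ensembles forecasting the same truth, one whose members err independently (disagree) and one whose members share a common error (agree). The synthetic series below stand in for the regional climate model data.

```python
# When members disagree, independent errors cancel in the mean and the
# ensemble average beats the best member; when they agree, it does not.
import numpy as np

rng = np.random.default_rng(1)
truth = rng.normal(0.0, 1.0, size=1000)

def rmse(f):
    return np.sqrt(np.mean((f - truth) ** 2))

# Ensemble A: members with independent (disagreeing) errors.
disagree = [truth + rng.normal(0, 1.0, 1000) for _ in range(5)]
# Ensemble B: members sharing one common (agreeing) error.
shared = rng.normal(0, 1.0, 1000)
agree = [truth + shared + rng.normal(0, 0.1, 1000) for _ in range(5)]

for name, ens in [("disagreeing", disagree), ("agreeing", agree)]:
    avg = np.mean(ens, axis=0)
    best = min(rmse(f) for f in ens)
    print(f"{name}: best member RMSE {best:.2f}, mean RMSE {rmse(avg):.2f}")
```

In the disagreeing ensemble the mean's RMSE drops roughly by a factor of the square root of the ensemble size; in the agreeing ensemble the shared error survives averaging, so the mean is no better than the best member.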

Journal ArticleDOI
TL;DR: This study introduces Bayesian model averaging (BMA) to deal with model structure uncertainty in groundwater management decisions and implements the methodology to manage saltwater intrusion in the “1,500-foot” sand aquifer in the Baton Rouge area, Louisiana.
Abstract: This study introduces Bayesian model averaging (BMA) to deal with model structure uncertainty in groundwater management decisions. A robust optimized policy should take into account model parameter uncertainty as well as uncertainty in imprecise model structure. Due to a limited amount of groundwater head data and hydraulic conductivity data, multiple simulation models are developed based on different head boundary condition values and semivariogram models of hydraulic conductivity. Instead of selecting the best simulation model, a variance-window-based BMA method is introduced to the management model to utilize all simulation models to predict chloride concentration. Given different semivariogram models, the spatially correlated hydraulic conductivity distributions are estimated by the generalized parameterization (GP) method that combines the Voronoi zones and the ordinary kriging (OK) estimates. The model weights of BMA are estimated by the Bayesian information criterion (BIC) and the variance window in the maximum likelihood estimation. The simulation models are then weighted to predict chloride concentrations within the constraints of the management model. The methodology is implemented to manage saltwater intrusion in the “1,500-foot” sand aquifer in the Baton Rouge area, Louisiana. The management model aims to obtain optimal joint operations of the hydraulic barrier system and the saltwater extraction system to mitigate saltwater intrusion. A genetic algorithm (GA) is used to obtain the optimal injection and extraction policies. Using the BMA predictions, higher injection rates and pumping rates are needed to cover more constraint violations, which do not occur if a single best model is used.
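The BIC-based weighting step at the heart of the BMA prediction can be sketched directly: each simulation model's weight is proportional to exp(−ΔBIC/2), and the prediction is the weighted mix. The BIC values and chloride predictions below are invented for illustration, not the Baton Rouge models.

```python
# BMA weights from BIC differences, and the resulting weighted prediction.
import numpy as np

bic = np.array([210.0, 212.5, 218.0])   # hypothetical per-model BIC values
delta = bic - bic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()

# Hypothetical chloride predictions (mg/L) from the three simulation models.
chloride = np.array([140.0, 155.0, 170.0])
bma_pred = weights @ chloride
# Between-model spread, the extra variance term BMA adds on top of any
# within-model predictive variance.
between_var = weights @ (chloride - bma_pred) ** 2
print(f"weights {weights.round(3)}, BMA prediction {bma_pred:.1f} mg/L")
```

Because the BMA prediction carries this between-model spread, constraints in the management model must be satisfied over a wider concentration range, which is why the optimized policy needs higher injection and pumping rates than a single best model would suggest.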

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the effect of meteorological factors such as temperature, wind speed, wind direction and rainfall on TSP, PM10, PM2.5 and PM10 mass concentrations.
Abstract: In this study, particulate matters (TSP, PM10, PM2.5 and PM10–2.5) which are hazardous for environment and human health were investigated in Erzurum urban atmosphere at a sampling point from February 2005 to February 2006. During sampling, two low volume samplers were used and each sampling period lasted approximately 24 h. In order for detection of representative sampling region and point of Erzurum, Kriging method was applied to the black smoke concentration data for winter seasons. Mass concentrations of TSP, PM10 and PM2.5 of Erzurum urban atmosphere were measured on average, as 129, 31 and 13 μg/m3, respectively, in the sampling period. Meteorological factors, such as temperature, wind speed, wind direction and rainfall were typically found to be affecting PMs, especially PM2.5. Air temperature did not seem to be significantly affecting TSP and PM10 mass concentrations, but had a considerably negative induction on PM2.5 mass concentrations. However, combustion sourced PM2.5 was usually diluted from the urban atmosphere by the speed of wind, soil sourced coarse mode particle concentrations (TSP, PM10) were slightly affected by the speed of wind. Rainfall was found to be decreasing concentrations to 48% in all fractions (TSP, PM10, PM10–2.5, PM2.5) and played an important role on dilution of the atmosphere. Fine mode fraction of PM (PM2.5) showed significant daily and seasonal variations on mass concentrations. On the other hand, coarse mode fractions (TSP, PM10 and PM10–2.5) revealed more steady variations. It was observed that fine mode fraction variations were affected by the heating in residences during winter seasons.

Journal ArticleDOI
TL;DR: In this article, a segmented regression with constraints method is introduced that can model both trend analysis and abrupt change detection, overcoming the weakness of existing methods, which cannot detect shift trends and abrupt changes simultaneously and have limited ability to detect multiple change points.
Abstract: Hydrological time series are generally subject to shift trends and abrupt changes. However, most of the methods used in the literature cannot detect both shift trends and abrupt changes simultaneously and have weak ability to detect multiple change points together. In this study, the segmented regression with constraints method, which can model both trend analysis and abrupt change detection, is introduced. The modified Akaike’s information criterion is used for model selection. As an application, the method is employed to analyse the mean annual temperature, precipitation, runoff and runoff coefficient time series in the Shiyang River Basin for the period from 1958 to 2003. The segmented regression model shows that the trends of the mean annual precipitation, temperature and runoff change over time, with different join (turning) points for different stations. The runoff pattern can potentially be explained by the climate variables (precipitation and temperature). Runoff coefficients show slightly decreasing trends for the Xiying, Huangyang, Gulang and Zamu catchments, slightly increasing trends for the Dongda and Dajing catchments, and nearly no change for the Xida catchment. No change points are found in the runoff coefficient in any catchment.
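The core of the method above, a continuous two-piece fit with the join point chosen by an information criterion, can be sketched as follows. The synthetic series, the plain AIC (rather than the paper's modified version), and the single-breakpoint restriction are all simplifying assumptions.

```python
# Segmented (two-piece) regression: for each candidate join point, fit a
# continuous broken-line model by least squares and keep the AIC-best one.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(1958, 2004)
# Hypothetical annual runoff record: flat until 1980, then a declining trend.
y = np.where(t < 1980, 10.0, 10.0 - 0.1 * (t - 1980)) + rng.normal(0, 0.3, t.size)

def fit_aic(t, y, bp):
    """Continuous two-segment fit with join point bp; return AIC."""
    X = np.column_stack([np.ones_like(t), t - t[0], np.clip(t - bp, 0, None)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = 4  # three regression coefficients plus the error variance
    return t.size * np.log(rss / t.size) + 2 * k

candidates = t[5:-5]  # keep a few observations in each segment
best_bp = min(candidates, key=lambda bp: fit_aic(t, y, bp))
print(f"estimated join point: {best_bp}")
```

Extending this to multiple change points means adding one clipped column per candidate join point and letting the information criterion arbitrate the number of segments, which is what the constrained formulation in the paper handles.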

Journal ArticleDOI
TL;DR: In this article, the authors proposed a risk analysis model to evaluate the risk of underestimating the predicted peak discharge, i.e., the exceedance probability due to uncertainties in rainfall information (rainfall depth, duration, and storm pattern) and in the parameters of the rainfall-runoff model (SAC-SMA), during flood prevention and warning operations.
Abstract: This work proposes a risk analysis model to evaluate the risk of underestimating the predicted peak discharge, i.e. the probability of exceedance due to uncertainties in rainfall information (rainfall depth, duration, and storm pattern) and in the parameters of the rainfall-runoff model (Sacramento Soil Moisture Accounting model, SAC-SMA), during flood prevention and warning operations. The proposed risk analysis model combines the multivariate Monte Carlo simulation method with the Advanced First-Order Second-Moment (AFOSM) method. The observed rainfall and discharge measured at the Yu-feng Basin study area in the Shihmen reservoir watershed are used in the model development and application. The results of the model application indicate that the proposed risk analysis model can analyze the sensitivity of the uncertainty factors for the predicted peak discharge and evaluate the variation of the probability of exceeding the predicted peak discharge with respect to rainfall depth and storm duration. In addition, the risk analysis for a real rainstorm event, Typhoon Morakot, shows that the proposed model successfully explores the risk of underestimating the predicted peak discharge using SAC-SMA and forecasted rainfall information, and provides a probabilistic forecast of the peak discharge.
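The Monte Carlo half of such a risk analysis can be sketched in miniature: propagate uncertain rainfall and model-parameter samples through a rainfall-runoff relation and count how often the simulated peak exceeds the deterministic forecast. The power-law relation below is a stand-in for SAC-SMA, and every distribution is an illustrative assumption.

```python
# Monte Carlo exceedance probability for a predicted peak discharge under
# uncertain rainfall depth and model parameters. All inputs are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Uncertain inputs: rainfall depth (mm) and a model-parameter factor.
rain = rng.lognormal(mean=np.log(120.0), sigma=0.25, size=n)
param = rng.normal(1.0, 0.1, size=n)

# Hypothetical rainfall-runoff relation giving peak discharge (m^3/s).
peak = 2.5 * param * rain**0.9

# Take the median simulated peak as the deterministic forecast, then
# estimate the probability of underestimating it by more than 20%.
predicted_peak = np.quantile(peak, 0.5)
risk = np.mean(peak > predicted_peak * 1.2)
print(f"P(peak exceeds forecast by 20%): {risk:.3f}")
```

Repeating the calculation over a grid of rainfall depths and storm durations yields the exceedance-probability surface the paper uses for flood warning decisions.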

Journal ArticleDOI
TL;DR: In this paper, a hybrid urban expansion model (NNSCA) was proposed to simulate rapid urban growth in a typical industrial city, Dongying, China, by coupling an artificial-neural-network-based stochastic cellular automata model with several socioeconomic indicators, i.e., the per capita income of the rural population, the per capita income of the urban population, and the population and gross domestic product of the city.
Abstract: Urbanization is one of the most important anthropogenic activities, creating extensive environmental implications at both local and global scales. Dynamic urban expansion models are useful tools to understand the urbanization process, project its spatiotemporal dynamics and provide useful information for assessing the environmental implications of urbanization. A hybrid urban expansion model (NNSCA model) was proposed to simulate rapid urban growth in a typical industrial city, Dongying, China, by coupling an artificial-neural-network-based stochastic cellular automata model with several socioeconomic indicators, i.e., the per capita income of the rural population, the per capita income of the urban population, and the population and gross domestic product of the city. Good conformity between simulated and actual urban patterns suggested that the NNSCA model was able to effectively simulate historic urban growth and to generate realistic urban patterns. A series of scenario analyses suggested that the expanding urban area would threaten the ecosystem health of the city's coastal wetlands unless environmental protection actions are taken in the future. The NNSCA model provides the ability to assess future urban growth under various planning and management scenarios, and can be integrated into ecological or environmental process models to evaluate the environmental implications of urbanization.
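One stochastic cellular-automaton step of the kind the NNSCA model builds on can be sketched on a toy grid: each non-urban cell converts with a probability driven by its urban neighbourhood. The logistic-style rule below replaces the ANN scoring and socioeconomic drivers of the real model and is purely illustrative.

```python
# Stochastic CA urban growth: conversion probability rises with the number
# of urban cells in the Moore (3x3) neighbourhood. Toy rule, toy grid.
import numpy as np

rng = np.random.default_rng(5)
grid = (rng.random((50, 50)) < 0.05).astype(int)  # sparse initial urban seed

def step(grid, rng, beta=2.0):
    n, m = grid.shape
    padded = np.pad(grid, 1)
    # Count urban neighbours by summing the nine shifted views, minus self.
    nbrs = sum(padded[i:i + n, j:j + m] for i in range(3) for j in range(3)) - grid
    p_convert = 1.0 - np.exp(-beta * nbrs / 8.0)  # more neighbours -> higher p
    new = grid.copy()
    new[(rng.random(grid.shape) < p_convert) & (grid == 0)] = 1
    return new

for _ in range(5):
    grid = step(grid, rng)
print(f"urban fraction after 5 steps: {grid.mean():.2f}")
```

In a full model the conversion probability would additionally depend on cell-level suitability (roads, slope, land use) and be rescaled each year so total growth matches the socioeconomically projected urban demand.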

Journal ArticleDOI
TL;DR: A number of different methods are compared and improvements are presented using as case study a nonlinear DNAPL source dissolution and solute transport model, including the Metropolis–Hastings method and the adaptive direction sampling (ADS) method.
Abstract: Popular parameter estimation methods, including least squares, maximum likelihood, and maximum a posteriori (MAP), solve an optimization problem to obtain a central value (or best estimate) followed by an approximate evaluation of the spread (or covariance matrix). A different approach is the Monte Carlo (MC) method, and particularly Markov chain Monte Carlo (MCMC) methods, which allow sampling from the posterior distribution of the parameters. Though available for years, MC methods have only recently drawn wide attention as practical ways for solving challenging high-dimensional parameter estimation problems. They have a broader scope of applications than conventional methods and can be used to derive the full posterior pdf but can be computationally very intensive. This paper compares a number of different methods and presents improvements using as case study a nonlinear DNAPL source dissolution and solute transport model. This depth-integrated semi-analytical model approximates dissolution from the DNAPL source zone using nonlinear empirical equations with partially known parameters. It then calculates the DNAPL plume concentration in the aquifer by solving the advection-dispersion equation with a flux boundary. The comparison is among the classical MAP and some versions of computer-intensive Monte Carlo methods, including the Metropolis–Hastings (MH) method and the adaptive direction sampling (ADS) method.
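A minimal random-walk Metropolis–Hastings sampler, the baseline MCMC variant in the comparison above, looks like this for a one-parameter toy model y = θx with Gaussian noise. The DNAPL dissolution model itself is not reproduced; the data and proposal scale are illustrative.

```python
# Random-walk Metropolis-Hastings on the posterior of theta in y = theta*x
# with unit-variance Gaussian noise and a flat prior.
import numpy as np

rng = np.random.default_rng(11)
x = np.linspace(1.0, 10.0, 20)
y = 2.0 * x + rng.normal(0, 1.0, x.size)   # synthetic observations

def log_post(theta):
    # Flat prior, so the log-posterior is the Gaussian log-likelihood
    # up to an additive constant.
    return -0.5 * np.sum((y - theta * x) ** 2)

theta = 0.0
chain = []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.05)     # symmetric random-walk proposal
    # Accept with probability min(1, posterior ratio).
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)

burned = np.array(chain[5_000:])           # discard burn-in
print(f"posterior mean {burned.mean():.2f} +/- {burned.std():.2f}")
```

Unlike a MAP point estimate with a covariance approximation, the retained chain is a sample from the full posterior, so any credible interval or derived quantity can be read off directly; adaptive schemes such as ADS mainly improve how efficiently the proposal explores that posterior.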

Journal ArticleDOI
TL;DR: The study indicated that the proposed method was easy to apply and achieved high accuracy at a 95% confidence level; a Bayesian belief network was used to quantify the probability of NTDs occurring in villages with no births.
Abstract: Neural tube defects (NTDs) constitute the most common type of birth defect. How much NTD risk does an area carry? The answer to this question will help people understand the geographical distribution of NTDs and explore its environmental causes. Most existing methods take the spatial correlation of cases into account and rarely consider the effect of environmental factors. However, especially in rural areas, NTD cases have little effect on each other across space, whereas the role of environmental factors is significant. To demonstrate these points, Heshun, the county with the highest rate of NTDs in China, was selected as the region of interest. A Bayesian belief network was used to quantify the probability of NTDs occurring in villages with no births. The study indicated that the proposed method was easy to apply and achieved high accuracy at a 95% confidence level.
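The belief-network idea can be sketched with a two-parent toy network: the defect probability is read from a conditional table and marginalized over the environmental parents. The network structure and every probability below are invented for illustration; they are not the Heshun study's estimates.

```python
# Toy Bayesian belief network: P(NTD) for a village is obtained by
# marginalizing a conditional table over two environmental factors.

# Hypothetical conditional table P(NTD | fault_proximity, soil_type).
p_ntd = {
    (True, "loess"): 0.08, (True, "rock"): 0.05,
    (False, "loess"): 0.03, (False, "rock"): 0.01,
}
# Hypothetical marginals for a village whose factors are uncertain.
p_fault = {True: 0.3, False: 0.7}
p_soil = {"loess": 0.6, "rock": 0.4}

# Marginalize: P(NTD) = sum over parent states of P(NTD|parents)*P(parents).
risk = sum(p_ntd[(f, s)] * p_fault[f] * p_soil[s]
           for f in p_fault for s in p_soil)
print(f"village NTD risk: {risk:.4f}")
```

This is how a risk can be assigned even to a village with no recorded births: the environmental factors are observable there, and the conditional tables learned from villages with births supply the rest.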