
Showing papers in "Environmental Modelling and Software in 2015"


Journal ArticleDOI
TL;DR: A Matlab/Octave toolbox for the application of GSA, called SAFE (Sensitivity Analysis For Everybody), which implements several established GSA methods, allows for easy integration of others, and embeds good-practice guidelines through workflow scripts.
Abstract: Global Sensitivity Analysis (GSA) is increasingly used in the development and assessment of environmental models. Here we present a Matlab/Octave toolbox for the application of GSA, called SAFE (Sensitivity Analysis For Everybody). It implements several established GSA methods and allows for easily integrating others. All methods implemented in SAFE support the assessment of the robustness and convergence of sensitivity indices. Furthermore, SAFE includes numerous visualisation tools for the effective investigation and communication of GSA results. The toolbox is designed to make GSA accessible to non-specialist users, and to provide fully commented code for more experienced users to complement their own tools. The documentation includes a set of workflow scripts with practical guidelines on how to apply GSA and how to use the toolbox. SAFE is open source and freely available for academic and non-commercial purposes. Ultimately, SAFE aims at improving the diffusion and quality of GSA practice in the environmental modelling community.
Highlights:
- SAFE implements several GSA methods and can easily integrate new ones.
- SAFE facilitates assessment of robustness/convergence and effective visualization.
- SAFE embeds good practice guidelines through workflow scripts.
- SAFE is intended for both non-specialist users and SA developers.
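A flavour of the robustness-and-convergence checks SAFE automates can be given in a few lines. The sketch below is in Python rather than the toolbox's Matlab/Octave, uses a toy two-input model, and substitutes a crude correlation-based index for SAFE's implemented estimators; it bootstraps the index on growing sub-samples to show when its confidence bounds stabilise.

```python
# Illustrative convergence check for a sensitivity index (not SAFE code).
import numpy as np

rng = np.random.default_rng(7)
x = rng.uniform(0, 1, size=(5000, 3))
y = 3 * x[:, 0] + x[:, 1] + 0.1 * rng.normal(size=5000)  # toy model output

def s1(xs, ys):
    # crude correlation-based proxy standing in for SAFE's estimators
    return abs(np.corrcoef(xs, ys)[0, 1])

for n in (250, 500, 1000, 2000, 5000):
    boot = []
    for _ in range(200):
        idx = rng.integers(0, n, n)          # bootstrap resample of the first n runs
        boot.append(s1(x[idx, 0], y[idx]))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"n={n}: index={s1(x[:n, 0], y[:n]):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```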

472 citations


Journal ArticleDOI
TL;DR: This work reviews various methods that have been or could be applied to evaluate the uncertainty related to deterministic models' outputs, and covers expert judgement, model emulation, sensitivity analysis, temporal and spatial variability in the model outputs, use of multiple models, and statistical approaches.
Abstract: There is an increasing need for environmental management advice that is wide-scoped, covering various interlinked policies, and realistic about the uncertainties related to the possible management actions. To achieve this, efficient decision support integrates the results of pre-existing models. Many environmental models are deterministic, but the uncertainty of their outcomes needs to be estimated when they are utilized for decision support. We review various methods that have been or could be applied to evaluate the uncertainty related to deterministic models' outputs. We cover expert judgement, model emulation, sensitivity analysis, temporal and spatial variability in the model outputs, the use of multiple models, and statistical approaches, and evaluate when these methods are appropriate and what must be taken into account when utilizing them. The best way to evaluate the uncertainty depends on the definitions of the source models and the amount and quality of information available to the modeller.
Highlights:
- We review different types of uncertainty present in environmental modelling.
- We review methods to evaluate uncertainty related to model results.
- Best way to evaluate uncertainty depends on the models and available information.

443 citations


Journal ArticleDOI
TL;DR: A novel GSA method, called PAWN, efficiently computes density-based sensitivity indices; its key idea is to characterise output distributions by their Cumulative Distribution Functions (CDFs), which are easier to derive than PDFs.
Abstract: Variance-based approaches are widely used for Global Sensitivity Analysis (GSA) of environmental models. However, methods that consider the entire Probability Density Function (PDF) of the model output, rather than its variance only, are preferable in cases where variance is not an adequate proxy of uncertainty, e.g. when the output distribution is highly-skewed or when it is multi-modal. Still, the adoption of density-based methods has been limited so far, possibly because they are relatively more difficult to implement. Here we present a novel GSA method, called PAWN, to efficiently compute density-based sensitivity indices. The key idea is to characterise output distributions by their Cumulative Distribution Functions (CDF), which are easier to derive than PDFs. We discuss and demonstrate the advantages of PAWN through applications to numerical and environmental modelling examples. We expect PAWN to increase the application of density-based approaches and to be a complementary approach to variance-based GSA.
Highlights:
- We present a new density-based GSA method called PAWN to complement variance-based GSA.
- Differently from variance-based methods, PAWN can be applied to highly-skewed or multi-modal output distributions.
- Differently from other density-based methods, PAWN uses output CDFs, which simplifies numerical implementation.
- PAWN can be easily tailored to focus on output sub-ranges, for instance extreme values.
- Intermediate results generated in the application of PAWN can be visualized to gather insights about the model behaviour.
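The PAWN idea is compact enough to sketch. The toy example below (our illustration, not the authors' code) approximates a PAWN index as a statistic of Kolmogorov-Smirnov distances between the unconditional output CDF and CDFs obtained by fixing one input at a time; the model, sample sizes and the choice of the median as summary statistic are all illustrative.

```python
# Sketch of the PAWN approach: KS distance between unconditional and
# conditional output CDFs, summarised over conditioning values.
import numpy as np
from scipy.stats import ks_2samp

def model(x):
    # toy model standing in for an environmental simulator
    return x[:, 0] ** 2 + 0.5 * np.sin(np.pi * x[:, 1])

rng = np.random.default_rng(42)
n_unc, n_cond, n_c = 2000, 500, 10   # unconditional / conditional / conditioning points

x_unc = rng.uniform(0, 1, size=(n_unc, 2))
y_unc = model(x_unc)

pawn = []
for i in range(2):                               # loop over inputs
    ks_stats = []
    for xc in rng.uniform(0, 1, n_c):            # conditioning values of input i
        x_cond = rng.uniform(0, 1, size=(n_cond, 2))
        x_cond[:, i] = xc                        # fix input i
        ks, _ = ks_2samp(y_unc, model(x_cond))   # distance between the two CDFs
        ks_stats.append(ks)
    pawn.append(np.median(ks_stats))             # a statistic (e.g. median) over KS values
print("PAWN indices:", pawn)
```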

311 citations


Journal ArticleDOI
TL;DR: An overview of the present state of crop modelling to assess climate change risks to food production, and of the extent to which crop models comply with IAM demands, is provided.
Abstract: The complexity of risks posed by climate change and possible adaptations for crop production has called for integrated assessment and modelling (IAM) approaches linking biophysical and economic models. This paper attempts to provide an overview of the present state of crop modelling to assess climate change risks to food production and to which extent crop models comply with IAM demands. Considerable progress has been made in modelling effects of climate variables, where crop models best satisfy IAM demands. Demands are partly satisfied for simulating commonly required assessment variables. However, progress on the number of simulated crops, uncertainty propagation related to model parameters and structure, adaptations and scaling are less advanced and lagging behind IAM demands. The limitations are considered substantial and apply to a different extent to all crop models. Overcoming these limitations will require joint efforts, and consideration of novel modelling approaches.
Highlights:
- Extreme events and future climate uncertainty represent risk for food production.
- Crop models are largely able to simulate crop response to climate factors.
- Adaptations are best evaluated in integrated assessment models (IAM).
- Key limitations for crop models in IAM are low data availability and integration.
- Cross-scale nature of IAM suggests novel modelling approaches are needed.

255 citations


Journal ArticleDOI
TL;DR: This paper develops an integrated agent-based model of residential solar adoption based upon a theoretically-driven behavioral model and data collection that is validated using multiple (temporal, spatial, and demographic) criteria.
Abstract: Agent-based modeling (ABM) techniques for studying human-technical systems face two important challenges. First, agent behavioral rules are often ad hoc, making it difficult to assess the implications of these models within the larger theoretical context. Second, the lack of relevant empirical data precludes many models from being appropriately initialized and validated, limiting the value of such models for exploring emergent properties or for policy evaluation. To address these issues, in this paper we present a theoretically-based and empirically-driven agent-based model of technology adoption, with an application to residential solar photovoltaic (PV). Using household-level resolution for demographic, attitudinal, social network, and environmental variables, the integrated ABM framework we develop is applied to real-world data covering 2004-2013 for a residential solar PV program at the city scale. Two applications of the model focusing on rebate program design are also presented.
Highlights:
- We develop an integrated agent-based model of residential solar adoption.
- Model is based upon a theoretically-driven behavioral model and data collection.
- Multiple data-streams are merged to enable empirical initialization of agent states.
- We use a technology-specific social network, leveraging observed geographic patterns.
- Model is validated using multiple (temporal, spatial, and demographic) criteria.
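As an illustration of the kind of household-level rule such a model iterates, the sketch below encodes a hypothetical adoption condition mixing attitude, economics and peer influence. The weights, threshold and random network are invented placeholders, not the paper's calibrated, empirically initialized values.

```python
# Toy agent-based adoption dynamics (illustrative, not the paper's model).
import random

class Household:
    def __init__(self, attitude, payback_score):
        self.attitude = attitude            # survey-derived disposition, 0-1
        self.payback_score = payback_score  # perceived PV economics, 0-1
        self.adopted = False
        self.neighbors = []

    def step(self, w=(0.4, 0.3, 0.3), threshold=0.55):
        if self.adopted:
            return
        peer = sum(n.adopted for n in self.neighbors) / max(len(self.neighbors), 1)
        utility = w[0] * self.attitude + w[1] * self.payback_score + w[2] * peer
        if utility > threshold:
            self.adopted = True

agents = [Household(random.random(), random.random()) for _ in range(500)]
for a in agents:
    a.neighbors = random.sample(agents, 8)   # stand-in for the geographic network
for year in range(10):
    for a in agents:
        a.step()
print(sum(a.adopted for a in agents), "adopters after 10 steps")
```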

224 citations


Journal ArticleDOI
TL;DR: An overview of integrated assessment and modelling (IAM) for environmental problems that examines ten key dimensions of integration in IAM, including what is being integrated, why, and how, and discusses how these dimensions fit into the IAM process.
Abstract: Integrated assessment and its inherent platform, integrated modelling, present an opportunity to synthesize diverse knowledge, data, methods and perspectives into an overarching framework to address complex environmental problems. However to be successful for assessment or decision making purposes, all salient dimensions of integrated modelling must be addressed with respect to its purpose and context. The key dimensions include: issues of concern; management options and governance arrangements; stakeholders; natural systems; human systems; spatial scales; temporal scales; disciplines; methods, models, tools and data; and sources and types of uncertainty. This paper aims to shed light on these ten dimensions, and how integration of the dimensions fits in the four main phases in the integrated assessment process: scoping, problem framing and formulation, assessing options, and communicating findings. We provide examples of participatory processes and modelling tools that can be used to achieve integration.
Highlights:
- This is an overview on integrated assessment and modelling (IAM) for environmental problems.
- We examine the ten key dimensions of integration in IAM including what is being integrated, why and how.
- We discuss how the integration dimensions fit into the IAM process.

223 citations


Journal ArticleDOI
TL;DR: This manuscript explores the impact of Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on the Global Earth Observation System of Systems and particularly its common digital infrastructure (i.e. the GEOSS Common Infrastructure).
Abstract: There are many expectations and concerns about Big Data in the sector of Earth Observation. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This manuscript explores the impact of Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on the Global Earth Observation System of Systems (GEOSS) and particularly its common digital infrastructure (i.e. the GEOSS Common Infrastructure). GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the EO realm. The manuscript introduces and discusses the general GEOSS strategies to address Big Data challenges, focusing on the cloud-based discovery and access solutions. A final section reports the results of the scalability and flexibility performance tests.
Highlights:
- Big Data challenges for the Global Earth Observation System of Systems (GEOSS).
- GEOSS Common Infrastructure (GCI) solutions to address Big Data challenges.
- The role played by the GEO Brokering framework (GEO DAB).
- GEO DAB cloud configuration.
- Performance tests.

218 citations


Journal ArticleDOI
TL;DR: This manuscript is the first comprehensive review of the literature in this quickly evolving water research domain and contributes a general framework for the classification of residential water demand modeling studies, which allows revising consolidated approaches, describing emerging trends, and identifying potential future developments.
Abstract: Over the last two decades, water smart metering programs have been launched in a number of medium to large cities worldwide to nearly continuously monitor water consumption at the single household level. The availability of data at such very high spatial and temporal resolution has advanced the ability to characterize, model, and, ultimately, design user-oriented residential water demand management strategies. Research to date has focused on one or more of these aspects but with limited integration between the specialized methodologies developed so far. This manuscript is the first comprehensive review of the literature in this quickly evolving water research domain. The paper contributes a general framework for the classification of residential water demand modeling studies, which allows revising consolidated approaches, describing emerging trends, and identifying potential future developments. In particular, the future challenges posed by growing population demands, constrained sources of water supply and climate change impacts are expected to require increasingly integrated procedures for effectively supporting residential water demand modeling and management in several countries across the world.
Highlights:
- We review high resolution residential water demand modeling studies.
- We provide a classification of existing technologies and methodologies.
- We identify current trends, challenges and opportunities for future development.

205 citations


Journal ArticleDOI
TL;DR: The processing of the simple datasets used in the pilot proved to be relatively straightforward using a combination of R, RPy2, PyWPS and PostgreSQL, but the use of NoSQL databases and more versatile frameworks such as OGC standard based implementations may provide a wider and more flexible set of features that particularly facilitate working with larger volumes and more heterogeneous data sources.
Abstract: Recent evolutions in computing science and web technology provide the environmental community with continuously expanding resources for data collection and analysis that pose unprecedented challenges to the design of analysis methods, workflows, and interaction with data sets. In the light of the recent UK Research Council funded Environmental Virtual Observatory pilot project, this paper gives an overview of currently available implementations related to web-based technologies for processing large and heterogeneous datasets and discusses their relevance within the context of environmental data processing, simulation and prediction. We found that the processing of the simple datasets used in the pilot proved to be relatively straightforward using a combination of R, RPy2, PyWPS and PostgreSQL. However, the use of NoSQL databases and more versatile frameworks such as OGC standard based implementations may provide a wider and more flexible set of features that particularly facilitate working with larger volumes and more heterogeneous data sources.
Highlights:
- We review web service related technologies to manage, transfer and process Big Data.
- We examine international standards and related implementations.
- Many existing algorithms can be easily exposed as services and cloud-enabled.
- The adoption of standards facilitates the implementation of workflows.
- Use of web technologies to tackle environmental issues is acknowledged worldwide.
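To make the service-exposure point concrete, here is a hedged sketch of wrapping a trivial analysis as an OGC Web Processing Service process with PyWPS, one of the tools the pilot combined. The layout follows the common PyWPS 4 pattern; exact signatures should be checked against the PyWPS documentation, and the process itself is a made-up example.

```python
# Hedged sketch of exposing an analysis step as a WPS process with PyWPS 4.
from pywps import Process, LiteralInput, LiteralOutput

class MeanProcess(Process):
    def __init__(self):
        inputs = [LiteralInput('values', 'Comma-separated numbers',
                               data_type='string')]
        outputs = [LiteralOutput('mean', 'Arithmetic mean', data_type='float')]
        super().__init__(self._handler, identifier='mean',
                         title='Mean of a series', version='0.1',
                         inputs=inputs, outputs=outputs)

    def _handler(self, request, response):
        # parse the input string, compute, and write the output literal
        vals = [float(v) for v in request.inputs['values'][0].data.split(',')]
        response.outputs['mean'].data = sum(vals) / len(vals)
        return response
```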

203 citations


Journal ArticleDOI
TL;DR: The changing agricultural modelling landscape since 2002 is described, largely from a software perspective, and a case for a focussed effort on the software implementations of the major models is made.
Abstract: During the past decade, the application of agricultural production systems modelling has rapidly expanded while there has been less emphasis on model improvement. Cropping systems modelling has become agricultural modelling, incorporating new capabilities enabling analyses in the domains of greenhouse gas emissions, soil carbon changes, ecosystem services, environmental performance, food security, pests and disease losses, livestock and pasture production, and climate change mitigation and adaptation. New science has been added to the models to support this broadening application domain, and new consortia of modellers have been formed that span the multiple disciplines. There has not, however, been a significant and sustained focus on software platforms to increase efficiency in agricultural production systems research, or on the interaction between the software industry and the agricultural modelling community. This paper describes the changing agricultural modelling landscape since 2002, largely from a software perspective, and makes a case for a focussed effort on the software implementations of the major models.
Highlights:
- The agricultural modelling community has broadened its scientific focus over the last decade.
- The software implementations of the leading agricultural models haven't changed significantly in the last decade.
- A focussed effort on agricultural modelling software and process is needed.

174 citations


Journal ArticleDOI
TL;DR: Performance of the STICS model with its standard set of parameters over a dataset covering 15 crops and a wide range of agropedoclimatic conditions in France showed a good overall accuracy, with little bias.
Abstract: Soil-crop models are increasingly used as predictive tools to assess yield and environmental impacts of agriculture in a growing diversity of contexts. They are however seldom evaluated at a given time over a wide domain of use. We tested here the performance of the STICS model (v8.2.2) with its standard set of parameters over a dataset covering 15 crops and a wide range of agropedoclimatic conditions in France. Model results showed a good overall accuracy, with little bias. Relative RMSE was larger for soil nitrate (49%) than for plant biomass (35%) and nitrogen (33%), and smallest for soil water (10%). Trends induced by contrasted environmental conditions and management practices were well reproduced. Finally, the limited dependency of model errors on crops or environments indicated a satisfactory robustness. Such performances make STICS a valuable tool for studying the effects of changes in agro-ecosystems over the domain explored.
Highlights:
- STICS v8.2.2 soil-crop model was evaluated over a large and varied dataset using its standard set of parameters.
- Level of accuracy is 10-50% for plant, soil water and nitrate outputs.
- Model reproduces well trends arising from contrasted agro-environmental conditions.
- Errors are weakly dependent on the agro-environmental conditions tested.
- Model accuracy and robustness are considered good for scenario testing and large-scale use within the conditions tested here.
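For readers unfamiliar with the metric behind the quoted 10-49% figures, relative RMSE is simply the RMSE normalised by the observed mean; a one-liner with invented observed/simulated pairs:

```python
# Relative RMSE, the accuracy metric quoted above (data invented).
import numpy as np

obs = np.array([6.2, 7.8, 5.1, 9.0])   # e.g. observed biomass, t/ha
sim = np.array([5.6, 8.4, 4.4, 9.7])   # simulated counterparts
rrmse = np.sqrt(np.mean((sim - obs) ** 2)) / obs.mean() * 100
print(f"relative RMSE: {rrmse:.0f}%")
```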

Journal ArticleDOI
Yong Tian, Yi Zheng, Bin Wu, Xin Wu, Jie Liu, Chunmiao Zheng
TL;DR: This study demonstrated the applicability of the new model and its value for water resources management in arid and semi-arid areas, using integrated surface water-groundwater modeling coupled with hydraulic simulation.
Abstract: In semi-arid and arid areas with intensive agriculture, surface water-groundwater (SW-GW) interaction and agricultural water use are two critical and closely interrelated hydrological processes. However, the impact of agricultural water use on the hydrologic cycle has been rarely explored by integrated SW-GW modeling, especially in large basins. This study coupled the Storm Water Management Model (SWMM), which is able to simulate highly engineered flow systems, with the Coupled Ground-Water and Surface-Water Flow Model (GSFLOW). The new model was applied to study the hydrologic cycle of the Zhangye Basin, northwest China, a typical arid to semi-arid area with significant irrigation. After successful calibration, the model produced a holistic view of how agricultural water use impacts the hydrological cycle, and generated insights into the spatial and temporal patterns of the SW-GW interaction in the study area. Different water resources management scenarios were also evaluated via the modeling. The results showed that if the irrigation demand continues to increase, the current management strategy would accelerate groundwater depletion and therefore introduce ecological problems to this basin. Overall, this study demonstrated the applicability of the new model and its value for water resources management in arid and semi-arid areas.
Highlights:
- Integrated surface water-groundwater modeling coupled with hydraulic simulation.
- A systematic view on how agricultural water use would impact the water cycle.
- Insights from the modeling into data collection, model improvement and management.

Journal ArticleDOI
TL;DR: The analysis shows that, contrary to the Environmental Kuznets Curve (EKC), environmental quality cannot be maintained or improved via economic growth; improvement can only be achieved by an increase in the environmental self-renewal rate or the recycling ratio.
Abstract: The main purpose of this paper is to present a theoretical model incorporating the concept of circular economic activities. We construct a circular economy model with two types of economic resources, namely, a polluting input and a recyclable input. Overall, our results indicate that the factors affecting economic growth include the marginal product of the recyclable input, the recycling ratio, the cost of using the environmentally polluting input and the level of pollution arising from the employment of the polluting input. Our analysis also shows that, contrary to the Environmental Kuznets Curve (EKC), environmental quality cannot be maintained or improved via economic growth. Instead, the improvement in environmental quality, as measured by a reduction in pollution, can only be achieved by an increase in the environmental self-renewal rate or the recycling ratio.
Highlights:
- A closed circular economy model is presented.
- Environmental quality cannot be maintained or improved via economic growth.
- The improvement in environmental quality can be achieved by an increase in the recycling ratio.

Journal ArticleDOI
TL;DR: Estimations of ship emissions from regions within a 300 km radius of major capital cities suggest that a non-negligible percentage of air pollutants may come from ships.
Abstract: A model is developed to calculate and spatially allocate ship engine exhaust emissions in ports and extensive coastal waters using terrestrial Automatic Identification System data for ship movements and operating modes. The model is applied to the Australian region. The large geographical extent and number of included ports and vessels, and anomalies in the AIS data are challenging. Particular attention is paid to filtering of the movement data to remove anomalies and assign correct operating modes. Data gaps are filled by interpolation and extrapolation. Emissions and fuel consumption are calculated for each individual vessel at frequent intervals and categorised by ship type, ship size, operating mode and machinery type. Comparisons of calculated port emissions with conventional inventories and ship visit data are favourable. Estimations of ship emissions from regions within a 300 km radius of major capital cities suggest that a non-negligible percentage of air pollutants may come from ships.
Highlights:
- Model for estimating regional and in-port ship engine exhaust emissions.
- Ship movement data from Automatic Identification System.
- Movement data filtered for anomalies, interpolated and extrapolated.
- Emissions data categorised and spatially allocated.
- Comparisons with emissions from non-ship sources.
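The core activity-based calculation can be sketched in a few lines: per AIS interval, emissions follow installed power x load factor x emission factor x time, with the load factor from the propeller law. The engine parameters, mode cutoff and NOx emission factor below are illustrative assumptions, not the paper's values.

```python
# Sketch of an activity-based ship emission estimate per AIS interval.
from dataclasses import dataclass

@dataclass
class AisFix:
    speed_kn: float   # speed over ground from AIS
    hours: float      # time to the next fix

def interval_emission_kg(fix, mcr_kw, design_speed_kn, ef_g_per_kwh):
    load = min((fix.speed_kn / design_speed_kn) ** 3, 1.0)  # propeller law
    mode = "at berth" if fix.speed_kn < 1 else "underway"   # crude mode assignment
    energy_kwh = mcr_kw * load * fix.hours
    return mode, energy_kwh * ef_g_per_kwh / 1000.0

track = [AisFix(14.0, 0.5), AisFix(8.0, 0.5), AisFix(0.2, 2.0)]
total = sum(interval_emission_kg(f, mcr_kw=9000, design_speed_kn=16,
                                 ef_g_per_kwh=14)[1] for f in track)
print(f"NOx over track: {total:.1f} kg")
```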

Journal ArticleDOI
TL;DR: A concise introductory overview of sensitivity assessment methods for simulation models is given, covering methods based on derivatives, algebraic analysis, sparse sampling, variance decomposition, Fourier analysis and binary classification.
Abstract: In view of increasing application of sensitivity assessment (SA) to environmental simulation models, a relatively short, informal introduction to aims and methods of SA is given. Their variety, motivation and scope are illustrated by outlines of a broad selection of approaches. Methods based on derivatives, algebraic analysis, sparse sampling, variance decomposition, Fourier analysis and binary classification are included.
Highlights:
- A concise introductory overview of sensitivity assessment (SA) methods for simulation models is given.
- A broad selection of methods is introduced informally and with no more mathematics than necessary.
- The motivation of the methods, their scope and their limitations are discussed.
- Derivative-based SA, algebraic SA, SA of dynamical models, sampled SA, regional SA and SA by emulation are outlined.

Journal ArticleDOI
TL;DR: A new heuristic procedure called the Prescreened Heuristic Sampling Method (PHSM) is proposed and tested on seven WDS case studies of varying size; the results show that PHSM clearly performs best overall, both in terms of computational efficiency and the ability to find near-optimal solutions.
Abstract: Over the last two decades, evolutionary algorithms (EAs) have become a popular approach for solving water resources optimization problems. However, the issue of low computational efficiency limits their application to large, realistic problems. This paper uses the optimal design of water distribution systems (WDSs) as an example to illustrate how the efficiency of genetic algorithms (GAs) can be improved by using heuristic domain knowledge in the sampling of the initial population. A new heuristic procedure called the Prescreened Heuristic Sampling Method (PHSM) is proposed and tested on seven WDS case studies of varying size. The EPANet input files for these case studies are provided as supplementary material. The performance of the PHSM is compared with that of another heuristic sampling method and two non-heuristic sampling methods. The results show that PHSM clearly performs best overall, both in terms of computational efficiency and the ability to find near-optimal solutions. In addition, the relative advantage of using the PHSM increases with network size.
Highlights:
- A new heuristic sampling method is introduced for the optimization of WDSs using GAs.
- The proposed PHSM performs better than three other sampling methods.
- The advantages are both in efficiency and the ability to find near-optimal solutions.
- The relative advantage of using the PHSM increases with network size and complexity.
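The general idea behind heuristic sampling of the initial population can be sketched as follows. This is our illustration of the concept only, not the published PHSM rules: the pipe "flow scores", candidate diameter set and sampling spread are all invented.

```python
# Heuristically biased initial population for a pipe-sizing GA (illustrative).
import random

DIAMETERS = [100, 150, 200, 250, 300, 400, 500]   # candidate diameters (mm)

def heuristic_individual(flow_scores):
    # map each pipe's normalized flow score to a neighbourhood of diameters,
    # instead of sampling diameters uniformly at random
    ind = []
    for s in flow_scores:
        centre = s * (len(DIAMETERS) - 1)
        j = int(round(random.gauss(centre, 1.0)))  # sample near the heuristic choice
        ind.append(DIAMETERS[max(0, min(len(DIAMETERS) - 1, j))])
    return ind

flow_scores = [0.9, 0.7, 0.4, 0.2]   # e.g. pipes nearer the source carry more flow
population = [heuristic_individual(flow_scores) for _ in range(50)]
print(population[0])
```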

Journal ArticleDOI
TL;DR: The OpenMORDM software framework enables decision makers to identify policy-relevant scenarios, quantify the trade-offs between alternative strategies in different scenarios, flexibly explore alternative definitions of robustness, and identify key system factors that should be monitored as triggers for future actions or additional planning.
Abstract: This study introduces a new open source software framework to support bottom-up environmental systems planning under deep uncertainty with a focus on many-objective robust decision making (MORDM), called OpenMORDM. OpenMORDM contains two complementary components: (1) a software application programming interface (API) for connecting planning models to computational exploration tools for many-objective optimization and sensitivity-based discovery of critical deeply uncertain factors; and (2) a web-based visualization toolkit for exploring high-dimensional datasets to better understand system trade-offs, vulnerabilities, and dependencies. We demonstrate the OpenMORDM framework on a challenging environmental management test case termed the "lake problem". The lake problem has been used extensively in the prior environmental decision science literature and, in this study, captures the challenges posed by conflicting economic and environmental objectives, a water quality "tipping point" beyond which the lake may become irreversibly polluted, and multiple deeply uncertain factors that may undermine the robustness of pollution management policies. The OpenMORDM software framework enables decision makers to identify policy-relevant scenarios, quantify the trade-offs between alternative strategies in different scenarios, flexibly explore alternative definitions of robustness, and identify key system factors that should be monitored as triggers for future actions or additional planning. The web-based OpenMORDM visualization toolkit allows decision makers to easily share and visualize their datasets, with the option for analysts to extend the framework with customized scripts in the R programming language. OpenMORDM provides a platform for constructive decision support, allowing analysts and decision makers to interactively discover promising alternatives and potential vulnerabilities while balancing conflicting objectives.
Highlights:
- Many-objective robust decision making (MORDM) is an emerging approach for eliciting robust strategies under deep uncertainty.
- This study provides an open source software implementation of MORDM, called OpenMORDM.
- OpenMORDM aims to provide accessible visualization and analytic techniques to the environmental modeling community.
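One robustness definition an analyst might explore in such a framework is the domain criterion: the fraction of sampled deeply uncertain worlds in which a policy still meets its constraint. The sketch below reduces lake-style dynamics to a toy one-liner; all numbers are illustrative and the code is not part of OpenMORDM (which is an R package).

```python
# Domain-criterion robustness over sampled states of the world (toy example).
import numpy as np

rng = np.random.default_rng(1)
worlds = rng.uniform(0.2, 0.5, size=1000)    # uncertain natural removal rate

def pollution(policy_release, removal_rate):
    return policy_release / removal_rate      # toy steady-state lake P level

def robustness(policy_release, threshold=1.0):
    # fraction of worlds where the policy keeps pollution under the threshold
    return np.mean(pollution(policy_release, worlds) < threshold)

for release in (0.25, 0.30, 0.35):
    print(f"release={release}: robustness={robustness(release):.2f}")
```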

Journal ArticleDOI
TL;DR: A Fuzzy Decision Support System is demonstrated that improves irrigation given information on crop and site characteristics; it combines a predictive model of soil moisture with an inference system that computes the most appropriate irrigation action to keep moisture above a prescribed "safe" level.
Abstract: Since agriculture is the major water consumer, web services have been developed to provide farmers with irrigation suggestions. This study improves an existing irrigation web service, based on the IRRINET model, by describing a protocol for the field implementation of a fully automated irrigation system. We demonstrate a Fuzzy Decision Support System (FDSS) that improves irrigation given information on the crop and site characteristics. It combines a predictive model of soil moisture with an inference system computing the most appropriate irrigation action to keep moisture above a prescribed "safe" level. Three crops were used for testing the system: corn, kiwi, and potato. The FDSS compared favourably with an existing agricultural model and database (IRRINET). The sensitivity of the FDSS was tested with random rainfall, and the water saving was confirmed in this extended case as well.
Highlights:
- We describe a Fuzzy Decision Support System to decide the irrigation based on soil moisture and rain forecast.
- Its rules are easily editable and can be specialized for each crop and agricultural condition.
- The system provided improved irrigation suggestions in terms of timing and water saving.
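A minimal sketch of such a fuzzy rule base, assuming triangular membership functions and invented rule consequents (the paper's rules are crop-specific and editable):

```python
# Toy fuzzy irrigation rule base: dose depends on soil moisture and rain forecast.
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def irrigation_mm(soil_moisture, rain_forecast_mm):
    dry = tri(soil_moisture, 0.05, 0.15, 0.25)
    ok = tri(soil_moisture, 0.20, 0.30, 0.40)
    rain_low = tri(rain_forecast_mm, -1, 0, 5)
    rain_high = tri(rain_forecast_mm, 3, 15, 40)
    # rules: dry AND little rain -> irrigate a lot; dry AND rain -> some; ok -> none
    w = [min(dry, rain_low), min(dry, rain_high), ok]
    doses = [25.0, 8.0, 0.0]   # consequent singletons (mm), invented
    return sum(wi * d for wi, d in zip(w, doses)) / max(sum(w), 1e-9)

print(irrigation_mm(soil_moisture=0.12, rain_forecast_mm=1.0))  # dry, no rain
```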

Journal ArticleDOI
TL;DR: The simulations showed the presence of backwater effects and sudden, numerous changes in the flow regime induced by the detailed river model, which underline the importance of using 2-D fully dynamic unsteady flow equations for flood mapping.
Abstract: Flood hazard mapping is a topic of increasing interest involving several aspects in which a series of progress steps have occurred in recent years. Among these, a valuable advance has been made in solving 2-D shallow water equations in complex topographies and in the use of high resolution topographic data. However, reliable predictions of flood-prone areas are not simply related to these two important aspects. A key element is the accurate set up of the river model. This is primarily related to the representation of the topography but also requires particular attention to the insertion of man-made structures and hydrological data within the computational domain. There is the need to use procedures able to 1) obtain a reliable computational domain, characterized by a total number of elements feasible for a common computing machine, starting from the huge amount of data provided by a LIDAR survey, 2) deal with river reaches that receive significant lateral inflows, and 3) insert bridges, buildings, weirs and all the structures that can interact with the flow dynamics. All these issues have large effects on the modelled water levels and flow velocities, but there are very few papers in the literature on these topics in the framework of 2-D modelling. In this work, attention is therefore focused on the techniques to deal with the above-mentioned issues, showing their importance in flood mapping using two actual case studies in Southern Italy. In particular, the simulations shown in this paper highlight the presence of backwater effects and sudden, numerous changes in the flow regime induced by the detailed river model, which underline the importance of using 2-D fully dynamic unsteady flow equations for flood mapping.
Highlights:
- Correct representation of the flood-prone areas topography.
- Getting from LIDAR data a computational grid feasible for a common computing power.
- Interaction between hydrologic and hydraulic models.
- Insertion of structures that can interact with the flow dynamics.

Journal ArticleDOI
TL;DR: The WRTDS Bootstrap Test (WBT) is introduced, an extension of WRTDS that quantifies the uncertainty in WRTDS estimates of water quality trends and offers various ways to visualize and communicate these uncertainties.
Abstract: Estimation of the direction and magnitude of trends in surface water quality remains a problem of great scientific and practical interest. The Weighted Regressions on Time, Discharge, and Season (WRTDS) method was recently introduced as an exploratory data analysis tool to provide flexible and robust estimates of water quality trends. This paper enhances the WRTDS method through the introduction of the WRTDS Bootstrap Test (WBT), an extension of WRTDS that quantifies the uncertainty in WRTDS estimates of water quality trends and offers various ways to visualize and communicate these uncertainties. Monte Carlo experiments are applied to estimate the Type I error probabilities for this method. WBT is compared to other water-quality trend-testing methods appropriate for data sets of one to three decades in length with sampling frequencies of 6-24 observations per year. The software to conduct the test is in the EGRETci R-package.
Highlights:
- Block bootstrap approach for water quality trends is developed.
- Used in conjunction with a flexible statistical model for river water quality.
- Trends in concentration and trends in flux can be evaluated.
- Confidence intervals can be estimated for trend magnitude.
- Based on WRTDS: Weighted Regressions on Time, Discharge, and Season.
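The block-bootstrap idea behind the WBT can be illustrated with a simple stand-in: resample blocks of residuals to preserve autocorrelation, re-fit the trend, and read a confidence interval from the bootstrap slopes. A linear trend replaces the full WRTDS model here, and the data and block length are invented.

```python
# Block bootstrap confidence interval for a trend slope (illustrative stand-in).
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(120) / 12.0                      # 10 years of monthly values
y = 0.03 * t + 0.2 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size)

def fit_slope(tt, yy):
    return np.polyfit(tt, yy, 1)[0]

fitted = np.polyval(np.polyfit(t, y, 1), t)
resid = y - fitted

block = 12                                     # one-year blocks keep autocorrelation
starts = np.arange(t.size - block + 1)
boots = []
for _ in range(1000):
    pick = rng.choice(starts, size=t.size // block)
    idx = np.concatenate([np.arange(s, s + block) for s in pick])
    boots.append(fit_slope(t, fitted + resid[idx]))  # re-fit on resampled series

lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"trend {fit_slope(t, y):.3f} per yr, 95% CI [{lo:.3f}, {hi:.3f}]")
```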

Journal ArticleDOI
TL;DR: A methodological framework and software are presented to evaluate and classify data sets into four classes regarding their suitability for different modelling purposes; they give guidelines to experimentalists on experimental design and on choosing the most effective measurements to improve the usefulness of their data for modelling, statistical analysis and data assimilation.
Abstract: Experimental field data are used at different levels of complexity to calibrate, validate and improve agro-ecosystem models to enhance their reliability for regional impact assessment. A methodological framework and software are presented to evaluate and classify data sets into four classes regarding their suitability for different modelling purposes. Weighting of inputs and variables for testing was set from the perspective of crop modelling. The software allows users to adjust weights according to their specific requirements. Background information is given for the variables with respect to their relevance for modelling and possible uncertainties. Examples are given for data sets of the different classes. The framework helps to assemble high-quality databases and to select data from databases according to modellers' requirements, and gives guidelines to experimentalists on experimental design and on deciding on the most effective measurements to improve the usefulness of their data for modelling, statistical analysis and data assimilation.
Highlights:
- Software is presented to classify and label data suitability for modelling.
- Data requirements for modelling are specific and vary with model purpose.
- Quantitative classification of data sets facilitates their use for modelling.
- Test of model and data consistency improves data usability.
- Guidelines for experimentalists to improve data suitability for modelling.
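A toy sketch of the weighted-scoring step, with invented variable names, weights, check results, class labels and thresholds (the framework's actual criteria are given in the paper):

```python
# Weighted suitability score binned into classes (all values invented).
weights = {"yield": 0.30, "phenology": 0.20, "soil_water": 0.25, "management": 0.25}
passed  = {"yield": 1.0,  "phenology": 0.5,  "soil_water": 0.0,  "management": 1.0}

score = sum(weights[v] * passed[v] for v in weights)   # 0 (unusable) .. 1 (ideal)
classes = [("class 1", 0.80), ("class 2", 0.60), ("class 3", 0.40), ("class 4", 0.00)]
label = next(name for name, thr in classes if score >= thr)
print(f"score={score:.2f} -> {label}")
```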

Journal ArticleDOI
TL;DR: In this paper, the authors developed and tested an integrated methodology for assessing direct and indirect economic impacts of flooding, which combines a spatial analysis of the damage to the physical stock with a general economic equilibrium approach using a regionally-calibrated (to Italy) version of a Computable General Equilibrium (CGE) global model.
Abstract: In this paper we developed and tested an integrated methodology for assessing direct and indirect economic impacts of flooding. The methodology combines a spatial analysis of the damage to the physical stock with a general economic equilibrium approach using a regionally-calibrated (to Italy) version of a Computable General Equilibrium (CGE) global model. We applied the model to the 2000 Po river flood in Northern Italy. To account for the uncertainty in the induced effects on regional economies, we explored three disruption and two recovery scenarios. The results highlight that: i) the flood event produces indirect losses in the national economic system, which are a significant share of the direct losses, and ii) the methodology is able to capture both positive and negative economic effects of the disaster in different areas of the same country. The assessment of indirect impacts, in particular, is essential for a full understanding of the economic outcomes of natural disasters.
Highlights:
- The accounting of flood losses rarely includes indirect economic impacts.
- The proposed method integrates spatial and computable general equilibrium modelling for the estimation of indirect impacts.
- We analyse a flood event in Northern Italy, reporting indirect economic impacts of around 20 percent of direct impacts.
- Economic benefits arise in non-flooded sub-regions of Italy.

Journal ArticleDOI
TL;DR: A modelling framework to link and integrate the Soil Water Assessment Tool (SWAT), a widely used surface watershed model, with MODFLOW, a groundwater model, to facilitate dynamic interactions between surface and groundwater domains at the watershed scale, thus providing a platform for simulating surface and groundwater interactions.
Abstract: Assessment of long-term anthropogenic impacts on agro-ecosystems requires comprehensive modelling capabilities to simulate water interactions between the surface and groundwater domains. To address this need, a modelling framework, called "SWATmf", was developed to link and integrate the Soil Water Assessment Tool (SWAT), a widely used surface watershed model, with MODFLOW, a groundwater model. The SWATmf is designed to serve as a project manager, builder, and model performance evaluator, and to facilitate dynamic interactions between surface and groundwater domains at the watershed scale, thus providing a platform for simulating surface and groundwater interactions. Using datasets from the Fort Cobb Reservoir experimental watershed (located in Oklahoma, USA), the SWATmf was used to facilitate the linkage and dynamic simulation of the SWAT and MODFLOW models. Simulated streamflow and groundwater levels generally agreed with observed trends, showing that the SWATmf can be used for simulating surface and groundwater interactions.
Highlights:
- A modelling framework integrating SWAT and MODFLOW models based on structured grids and a unified spatio-temporal frame.
- Implementation of model capabilities for simulating multiple surface watersheds contributing to a single aquifer domain.
- Simulation of daily groundwater from irrigation needs and distributed percolation fluxes in the watershed.
- Assessment of linkages between groundwater levels and extracted volumes during the 2011 and 2012 dry period in Oklahoma.
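Conceptually, the loose coupling such a framework manages looks like the loop below: each time step, SWAT percolation is passed down as MODFLOW recharge, and simulated groundwater discharge is returned to the streams. All three functions are placeholders standing in for the real model executables and spatial mapping, not SWATmf's actual interface.

```python
# Conceptual surface-groundwater coupling loop (placeholders throughout).
def swat_step(day, gw_return_flow):
    """Placeholder: run SWAT for one day, return percolation per HRU (mm)."""
    return {"hru1": 1.2, "hru2": 0.8}

def hru_to_grid(percolation):
    """Placeholder mapping from SWAT HRUs to MODFLOW grid cells (m/day)."""
    return {(10, 12): 0.0012, (10, 13): 0.0008}

def modflow_step(recharge_by_cell):
    """Placeholder: run a MODFLOW stress period, return stream discharge (m3/s)."""
    return 0.35

gw_return = 0.0
for day in range(365):
    percolation = swat_step(day, gw_return)   # surface water balance
    recharge = hru_to_grid(percolation)       # spatial translation layer
    gw_return = modflow_step(recharge)        # aquifer response back to streams
```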

Journal ArticleDOI
TL;DR: This article illustrates how to compare the outputs from models that simulate transitions among categories through time, and how to compare output maps from pairs of model runs with respect to a reference map of transitions during the validation interval.
Abstract: Our article illustrates how to compare the outputs from models that simulate transitions among categories through time. We illustrate the concepts by comparing two land change models: Land Change Modeler and Cellular Automata Markov. We show how the modeling options influence the quantity and allocation of simulated transitions, and how to compare output maps from pairs of model runs with respect to a reference map of transitions during the validation interval. We recommend that the first step is to assess the quantity of each transition and to determine the cause of the variation in quantity among model runs. The second step is to assess the allocation of transitions and to determine the cause of the variation in allocation among model runs. The separation of quantity and allocation of the transitions is a helpful approach to communicate how models work and to describe pattern validation.
Highlights:
- We compare three runs of models that simulate transitions among land categories.
- Pattern validation compares a reference map of transition to maps from pairs of runs.
- Quantity and allocation are helpful concepts to describe models and to compare maps.
- Quantity refers to the size of each transition from one category to another category.
- Allocation refers to the spatial distribution of the transitions.
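The quantity/allocation separation echoes Pontius' disagreement decomposition, which is easy to compute from a crosstab of reference versus simulated categories: quantity disagreement is the mismatch in category totals, and allocation disagreement is the remaining, spatially misplaced error. The 3x3 proportions below are invented.

```python
# Quantity vs. allocation disagreement from a category crosstab (toy data).
import numpy as np

# rows = reference category, cols = simulated category (proportions of area)
crosstab = np.array([[0.40, 0.05, 0.05],
                     [0.02, 0.25, 0.03],
                     [0.03, 0.02, 0.15]])

overall_agreement = np.trace(crosstab)
quantity = 0.5 * np.abs(crosstab.sum(axis=0) - crosstab.sum(axis=1)).sum()
allocation = (1.0 - overall_agreement) - quantity
print(f"quantity={quantity:.3f}, allocation={allocation:.3f}")
```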

Journal ArticleDOI
TL;DR: A paradigm shift is identified in how the integration of models and sensors can contribute to harnessing 'Big Data' and, more importantly, make the vital step from 'Big Data' to 'Big Information'.
Abstract: Sensors are becoming ubiquitous in everyday life, generating data at an unprecedented rate and scale. However, models that assess impacts of human activities on environmental and human health have typically been developed in contexts where data scarcity is the norm. Models are essential tools to understand processes, identify relationships, associations and causality, formalize stakeholder mental models, and to quantify the effects of prevention and interventions. They can help to explain data, as well as inform the deployment and location of sensors by identifying hotspots and areas of interest where data collection may achieve the best results. We identify a paradigm shift in how the integration of models and sensors can contribute to harnessing 'Big Data' and, more importantly, make the vital step from 'Big Data' to 'Big Information'. In this paper, we illustrate current developments and identify key research needs using human and environmental health challenges as an example.
Highlights:
- Sensors and models play vital roles in harnessing 'Big Data' to extract information.
- Data analytics can help to diminish monitoring burden and support locating sensors.
- Exploring 'Big Data' is essential to detect universal associations across space and time.
- Ethical challenges and issues of standards and harmonisation need to be addressed.
- Citizen science needs robust sensors and models to crowd-source and interpret data.

Journal ArticleDOI
TL;DR: This study demonstrates a procedure to design a feasible novel configuration for maximizing energy and nutrient recovery in the Netherlands and provides a good starting point for the design of promising layouts that will improve sustainability of municipal wastewater management in the future.
Abstract: Activated sludge systems are commonly used for robust and efficient treatment of municipal wastewater. However, these systems cannot achieve their maximum potential to recover valuable resources from wastewater. This study demonstrates a procedure to design a feasible novel configuration for maximizing energy and nutrient recovery. A simulation model was developed based on literature data and recent experimental research using steady-state energy and mass balances with conversions. The analysis showed that, for the Netherlands, the proposed configuration consists of four technologies: bioflocculation, cold partial nitritation/Anammox, P recovery, and anaerobic digestion. Results indicate the possibility of increasing the net energy yield up to 0.24 kWh/m3 of wastewater, while reducing carbon emissions by 35%. Moreover, sensitivity analysis points out the dominant influence of wastewater organic matter on energy production and consumption. This study provides a good starting point for the design of promising layouts that will improve sustainability of municipal wastewater management in the future.
Highlights:
- We demonstrate a five-step procedure to develop future sewage treatment plants.
- Steady-state energy and mass balances with conversions help to select scenarios.
- A promising scenario to treat and recover resources is proposed for the Dutch case.
- Model shows recovery of an energy yield of 0.24 kWh/m3, or 39% of the organic carbon load.
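The steady-state bookkeeping behind such energy figures can be illustrated with back-of-envelope arithmetic: COD routed to anaerobic digestion yields methane energy, offset by the plant's electricity demand. All coefficients below are rough, generic placeholders rather than the study's calibrated values, so the resulting number differs from the paper's 0.24 kWh/m3.

```python
# Back-of-envelope steady-state energy balance for wastewater treatment.
cod_in = 0.5        # kg COD per m3 wastewater (typical municipal, assumed)
capture = 0.6       # COD fraction diverted to digestion via bioflocculation (assumed)
ch4_yield = 0.35    # m3 CH4 per kg COD removed (near the theoretical maximum)
ch4_energy = 10.0   # kWh per m3 CH4 (approximate lower heating value)
eta_chp = 0.38      # electrical efficiency of CHP (assumed)
demand = 0.30       # kWh electricity demand per m3 treated (assumed)

recovered = cod_in * capture * ch4_yield * ch4_energy * eta_chp
print(f"net energy: {recovered - demand:.2f} kWh/m3")
```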

Journal ArticleDOI
TL;DR: A tool is presented that can be used by a range of end-users for the assessment of the monetary loss from future landslide events, with a particular focus on torrential processes.
Abstract: Global environmental change includes changes in a wide range of global scale phenomena, which are expected to affect a number of physical processes, as well as the vulnerability of the communities that will experience their impact. Decision-makers are in need of tools that will enable them to assess the losses from such processes under different future scenarios and to design risk reduction strategies. In this paper, a tool is presented that can be used by a range of end-users (e.g. local authorities, decision makers, etc.) for the assessment of the monetary loss from future landslide events, with a particular focus on torrential processes. The toolbox includes three functions: a) enhancement of the post-event damage data collection process, b) assessment of monetary loss of future events and c) continuous updating and improvement of an existing vulnerability curve by adding data of recent events. All functions of the tool are demonstrated through examples of its application.
Highlights:
- We developed a tool that will support decision making for disaster risk reduction strategies in mountain areas.
- The tool incorporates three functions: damage documentation, loss estimation and updating of the vulnerability curve.
- The tool was applied and tested in South Tyrol, Italy.
- Future developments (more elements at risk and hazards, uncertainty analysis, mobile applications) have been pointed out.
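At the core of the loss-estimation function is monetary loss computed as element value times a vulnerability (degree-of-loss) curve evaluated at process intensity. The sketch below uses an invented S-shaped curve and invented building values, not the tool's fitted curve.

```python
# Monetary loss = value x vulnerability(intensity), with an invented curve.
import math

def vulnerability(intensity, a=1.2, b=2.0):
    """S-shaped degree of loss in [0, 1] vs. e.g. debris-flow depth (m)."""
    return 1.0 - math.exp(-((intensity / a) ** b))

buildings = [{"value_eur": 450_000, "depth_m": 0.8},
             {"value_eur": 300_000, "depth_m": 1.6}]
loss = sum(b["value_eur"] * vulnerability(b["depth_m"]) for b in buildings)
print(f"estimated loss: {loss:,.0f} EUR")
```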

Journal ArticleDOI
TL;DR: The newly developed algorithm (MOSPD) is applied to the OTC reservoir release problem during the snow-melt seasons of 1998, 2000 and 2001, where its more widely spread and better-converged non-dominated solutions provide decision makers with better operational alternatives.
Abstract: This study demonstrates the application of an improved Evolutionary optimization Algorithm (EA), titled Multi-Objective Complex Evolution Global Optimization Method with Principal Component Analysis and Crowding Distance Operator (MOSPD), for the hydropower reservoir operation of the Oroville-Thermalito Complex (OTC) - a crucial head-water resource for the California State Water Project (SWP). In the OTC's water-hydropower joint management study, the nonlinearity of hydropower generation and the reservoir's water elevation-storage relationship are explicitly formulated by polynomial functions in order to closely match realistic situations and reduce linearization approximation errors. Comparison among different curve-fitting methods is conducted to understand the impact of the simplification of reservoir topography. In the optimization algorithm development, techniques of crowding distance and principal component analysis are implemented to improve the diversity and convergence of the optimal solutions towards and along the Pareto optimal set in the objective space. A comparative evaluation among the new algorithm MOSPD, the original Multi-Objective Complex Evolution Global Optimization Method (MOCOM), the Multi-Objective Differential Evolution method (MODE), the Multi-Objective Genetic Algorithm (MOGA), the Multi-Objective Simulated Annealing approach (MOSA), and the Multi-Objective Particle Swarm Optimization scheme (MOPSO) is conducted using benchmark functions. The results show that the MOSPD algorithm demonstrated the best and most consistent performance when compared with the other algorithms on the test problems. The newly developed algorithm (MOSPD) is further applied to the OTC reservoir release problem during the snow melting season in 1998 (wet year), 2000 (normal year) and 2001 (dry year), in which the more widely spread and better-converged non-dominated solutions of MOSPD provide decision makers with better operational alternatives for effectively and efficiently managing the OTC reservoirs in response to different climates, especially drought, which has become more severe and frequent in California.
Highlights:
- A new multi-objective optimization algorithm, entitled MOSPD, is developed.
- A comparison study is carried out over eight complex test functions.
- MOSPD is effective and efficient in searching for the global Pareto optimum.
- A reservoir system model is built for the Oroville-Thermalito Complex in California.
- MOSPD provides flexible reservoir release strategies to support decision making.
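The crowding-distance operator MOSPD borrows to preserve diversity along the Pareto front (familiar from NSGA-II) is easy to state; the objective values below are invented.

```python
# Crowding distance over a set of non-dominated points (toy objective values).
import numpy as np

def crowding_distance(F):
    """F: (n_points, n_objectives) array of objective values."""
    n, m = F.shape
    d = np.zeros(n)
    for j in range(m):
        order = np.argsort(F[:, j])
        d[order[0]] = d[order[-1]] = np.inf          # keep boundary points
        span = F[order[-1], j] - F[order[0], j]
        if span > 0:
            # neighbour gap along objective j, normalised by its range
            d[order[1:-1]] += (F[order[2:], j] - F[order[:-2], j]) / span
    return d

front = np.array([[1.0, 9.0], [2.0, 6.0], [3.0, 4.0], [5.0, 1.0]])
print(crowding_distance(front))   # larger = less crowded, preferred for diversity
```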

Journal ArticleDOI
TL;DR: The NASA Unified-Weather Research and Forecasting model (NU-WRF) is an observation-driven integrated modeling system that represents aerosol, cloud, precipitation and land processes at satellite-resolved scales to help bridge the continuum between local, regional and global processes.
Abstract: With support from NASA's Modeling and Analysis Program, we have recently developed the NASA Unified-Weather Research and Forecasting model (NU-WRF). NU-WRF is an observation-driven integrated modeling system that represents aerosol, cloud, precipitation and land processes at satellite-resolved scales. "Satellite-resolved" scales (roughly 1-25 km) bridge the continuum between local (microscale), regional (mesoscale) and global (synoptic) processes. NU-WRF is a superset of the National Center for Atmospheric Research (NCAR) Advanced Research WRF (ARW) dynamical core model, achieved by fully integrating the GSFC Land Information System (LIS, already coupled to WRF), the WRF/Chem enabled version of the Goddard Chemistry Aerosols Radiation Transport (GOCART) model, the Goddard Satellite Data Simulation Unit (G-SDSU), and custom boundary/initial condition preprocessors into a single software release, with source code available by agreement with NASA/GSFC. Full coupling between aerosol, cloud, precipitation and land processes is critical for predicting local and regional water and energy cycles.
Highlights:
- NU-WRF is an observation-driven integrated land-atmosphere modeling system.
- The software is a NASA-oriented superset of the standard NCAR WRF software.
- Enhancements include a satellite simulator package, coupling and physics options.
- Maintained at NASA/GSFC in an SVN repository, software is available by agreement.
- Supports coupling studies for land, atmosphere, aerosols, clouds and precipitation.

Journal ArticleDOI
TL;DR: The development of a model for assessing TRAffic Noise EXposure (TRANEX) in an open-source geographic information system, so that the treatment of source geometry, traffic information and receptors matched as closely as possible that of the air pollution modelling being undertaken in the TRAFFIC project.
Abstract: This paper describes the development of a model for assessing TRAffic Noise EXposure (TRANEX) in an open-source geographic information system. Instead of using proprietary software we developed our own model for two main reasons: 1) so that the treatment of source geometry, traffic information (flows/speeds/spatially varying diurnal traffic profiles) and receptors matched as closely as possible that of the air pollution modelling being undertaken in the TRAFFIC project, and 2) to optimize model performance for practical reasons of needing to implement a noise model with detailed source geometry, over a large geographical area, to produce noise estimates at up to several million address locations, with limited computing resources. To evaluate TRANEX, noise estimates were compared with noise measurements made in the British cities of Leicester and Norwich. High correlation was seen between modelled and measured LAeq,1hr (Norwich: r = 0.85, p = 0.000; Leicester: r = 0.95, p = 0.000) with average model errors of 3.1 dB. TRANEX was used to estimate noise exposures (LAeq,1hr, LAeq,16hr, Lnight) for the resident population of London (2003-2010). Results suggest that 1.03 million (12%) people are exposed to daytime road traffic noise levels >= 65 dB(A) and 1.63 million (19%) people are exposed to night-time road traffic noise levels >= 55 dB(A). Differences in noise levels between 2010 and 2003 were on average relatively small: 0.25 dB (standard deviation: 0.89) and 0.26 dB (standard deviation: 0.87) for LAeq,16hr and Lnight.
Highlights:
- Adaptation of the Calculation of Road Traffic Noise method for exposure assessment.
- Freely available open-source software (R with PostgreSQL and GRASS GIS).
- Model estimates compared well to noise measurements (r: ~0.85-0.95).
- Noise level exposures modelled for 8.61 million London residents (2003-2010).
- Over 1 million residents exposed to high daytime and night-time noise levels.
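TRANEX adapts the UK Calculation of Road Traffic Noise (CoRTN) method; a hedged fragment of that style of calculation is sketched below, with an hourly basic level from traffic flow plus speed/heavy-vehicle and distance corrections. The coefficients follow commonly cited CoRTN formulas but should be verified against the published standard before any use.

```python
# Hedged CoRTN-style hourly L10 estimate (coefficients as commonly cited).
import math

def l10_1h(q_veh_per_h, v_kmh, pct_heavy, dist_m):
    basic = 42.2 + 10 * math.log10(q_veh_per_h)                  # basic level near 10 m
    speed = (33 * math.log10(v_kmh + 40 + 500 / v_kmh)
             + 10 * math.log10(1 + 5 * pct_heavy / v_kmh) - 68.8)  # speed + %HGV correction
    dist = -10 * math.log10(max(dist_m, 4.0) / 13.5)             # distance correction
    return basic + speed + dist

print(f"{l10_1h(1200, 50, 10, 20):.1f} dB(A)")
```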