
Showing papers in "Environmental Modelling and Software in 2013"


Journal ArticleDOI
TL;DR: General classes of direct value comparison, coupling real and modelled values, preserving data patterns, indirect metrics based on parameter values, and data transformations are discussed.
Abstract: In order to use environmental models effectively for management and decision-making, it is vital to establish an appropriate level of confidence in their performance. This paper reviews techniques available across various fields for characterising the performance of environmental models with focus on numerical, graphical and qualitative methods. General classes of direct value comparison, coupling real and modelled values, preserving data patterns, indirect metrics based on parameter values, and data transformations are discussed. In practice environmental modelling requires the use and implementation of workflows that combine several methods, tailored to the model purpose and dependent upon the data and information available. A five-step procedure for performance evaluation of models is suggested, with the key elements including: (i) (re)assessment of the model's aim, scale and scope; (ii) characterisation of the data for calibration and testing; (iii) visual and other analysis to detect under- or non-modelled behaviour and to gain an overview of overall performance; (iv) selection of basic performance criteria; and (v) consideration of more advanced methods to handle problems such as systematic divergence between modelled and observed values.
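
As a concrete illustration of the "basic performance criteria" in step (iv), the sketch below (not from the paper; data are illustrative) computes three widely used direct value comparison metrics in Python.

```python
import numpy as np

def rmse(obs, sim):
    """Root mean square error: a direct value comparison."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return np.sqrt(np.mean((sim - obs) ** 2))

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; below 0 is worse than the observed mean."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias: signed systematic divergence between modelled and observed values."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 100.0 * np.sum(sim - obs) / np.sum(obs)

observed = [2.1, 3.4, 5.0, 4.2, 3.3]   # hypothetical observations
modelled = [2.4, 3.1, 4.6, 4.8, 3.0]   # hypothetical model output
print(rmse(observed, modelled), nash_sutcliffe(observed, modelled), pbias(observed, modelled))
```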

1,207 citations


Journal ArticleDOI
TL;DR: A guiding framework is presented that aims to assist modellers and model users in the choice of an appropriate modelling approach for their integrated assessment applications and that enables more effective learning in interdisciplinary settings.
Abstract: The design and implementation of effective environmental policies need to be informed by a holistic understanding of the system processes (biophysical, social and economic), their complex interactions, and how they respond to various changes. Models, integrating different system processes into a unified framework, are seen as useful tools to help analyse alternatives with stakeholders, assess their outcomes, and communicate results in a transparent way. This paper reviews five common approaches or model types that have the capacity to integrate knowledge by developing models that can accommodate multiple issues, values, scales and uncertainty considerations, as well as facilitate stakeholder engagement. The approaches considered are: system dynamics, Bayesian networks, coupled component models, agent-based models and knowledge-based models (also referred to as expert systems). We start by discussing several considerations in model development, such as the purpose of model building, the availability of qualitative versus quantitative data for model specification, the level of spatio-temporal detail required, and treatment of uncertainty. These considerations and a review of applications are then used to develop a framework that aims to assist modellers and model users in the choice of an appropriate modelling approach for their integrated assessment applications and that enables more effective learning in interdisciplinary settings.

Highlights: We review five common integrated modelling approaches. Model choice considers purpose, data type, scale and uncertainty treatment. We present a guiding framework for selecting the most appropriate approach.

637 citations


Journal ArticleDOI
TL;DR: This paper organizes and presents the results of a number of workshops that brought IEM practitioners together to share experiences and discuss future needs and directions, and presents IEM as a landscape containing four interdependent elements: applications, science, technology, and community.
Abstract: Integrated environmental modeling (IEM) is inspired by modern environmental problems, decisions, and policies and enabled by transdisciplinary science and computer capabilities that allow the environment to be considered in a holistic way. The problems are characterized by the extent of the environmental system involved, dynamic and interdependent nature of stressors and their impacts, diversity of stakeholders, and integration of social, economic, and environmental considerations. IEM provides a science-based structure to develop and organize relevant knowledge and information and apply it to explain, explore, and predict the behavior of environmental systems in response to human and natural sources of stress. During the past several years a number of workshops were held that brought IEM practitioners together to share experiences and discuss future needs and directions. In this paper we organize and present the results of these discussions. IEM is presented as a landscape containing four interdependent elements: applications, science, technology, and community. The elements are described from the perspective of their role in the landscape, current practices, and challenges that must be addressed. Workshop participants envision a global scale IEM community that leverages modern technologies to streamline the movement of science-based knowledge from its sources in research, through its organization into databases and models, to its integration and application for problem solving purposes. Achieving this vision will require that the global community of IEM stakeholders transcend social and organizational boundaries and pursue greater levels of collaboration. Among the highest priorities for community action are the development of standards for publishing IEM data and models in forms suitable for automated discovery, access, and integration; education of the next generation of environmental stakeholders, with a focus on transdisciplinary research, development, and decision making; and providing a web-based platform for community interactions (e.g., continuous virtual workshops).

441 citations


Journal ArticleDOI
TL;DR: The ODD + D protocol may prove helpful for describing ABMs in general when human decisions are included and incorporates a section on 'Theoretical and Empirical Background' to encourage model designs and model assumptions that are more closely related to theory.
Abstract: Representing human decisions is of fundamental importance in agent-based models. However, the rationale for choosing a particular human decision model is often not sufficiently empirically or theoretically substantiated in the model documentation. Furthermore, it is difficult to compare models because the model descriptions are often incomplete, not transparent and difficult to understand. Therefore, we expand and refine the 'ODD' (Overview, Design Concepts and Details) protocol to establish a standard for describing ABMs that includes human decision-making (ODD + D). Because the ODD protocol originates mainly from an ecological perspective, some adaptations are necessary to better capture human decision-making. We extended and rearranged the design concepts and related guiding questions to differentiate and describe decision-making, adaptation and learning of the agents in a comprehensive and clearly structured way. The ODD + D protocol also incorporates a section on 'Theoretical and Empirical Background' to encourage model designs and model assumptions that are more closely related to theory. The application of the ODD + D protocol is illustrated with a description of a social-ecological ABM on water use. Although the ODD + D protocol was developed on the basis of example implementations within the socio-ecological scientific community, we believe that the ODD + D protocol may prove helpful for describing ABMs in general when human decisions are included.

386 citations


Journal ArticleDOI
TL;DR: This thematic issue reviews progress in spatial agent-based models along the lines of four methodological challenges: design and parameterization of agent decision models, verification, validation and sensitivity analysis, integration of socio-demographic, ecological, and biophysical models, and spatial representation.
Abstract: Departing from the comprehensive reviews carried out in the field, we identify the key challenges that agent-based methodology faces when modeling coupled socio-ecological systems. Focusing primarily on the papers presented in this thematic issue, we review progress in spatial agent-based models along the lines of four methodological challenges: (1) design and parameterization of agent decision models, (2) verification, validation and sensitivity analysis, (3) integration of socio-demographic, ecological, and biophysical models, and (4) spatial representation. Based on this we critically reflect on the future work that is required to make agent-based modeling widely accepted as a tool to support real-world policy.

Highlights: Progress of agent-based methodology in modeling coupled socio-ecological systems. Key methodological challenges for ABM. Societal issues and critical reflection on the prospects of ABM.

371 citations


Journal ArticleDOI
TL;DR: MORDM is introduced and results suggest that including robustness as a decision criterion can dramatically change the formulation of complex environmental management problems as well as the negotiated selection of candidate alternatives to implement.
Abstract: This paper introduces many objective robust decision making (MORDM). MORDM combines concepts and methods from many objective evolutionary optimization and robust decision making (RDM), along with extensive use of interactive visual analytics, to facilitate the management of complex environmental systems. Many objective evolutionary search is used to generate alternatives for complex planning problems, enabling the discovery of the key tradeoffs among planning objectives. RDM then determines the robustness of planning alternatives to deeply uncertain future conditions and facilitates decision makers' selection of promising candidate solutions. MORDM tests each solution under the ensemble of future extreme states of the world (SOW). Interactive visual analytics are used to explore whether solutions of interest are robust to a wide range of plausible future conditions (i.e., assessment of their Pareto satisficing behavior in alternative SOW). Scenario discovery methods that use statistical data mining algorithms are then used to identify what assumptions and system conditions strongly influence the cost-effectiveness, efficiency, and reliability of the robust alternatives. The framework is demonstrated using a case study that examines a single city's water supply in the Lower Rio Grande Valley (LRGV) in Texas, USA. Results suggest that including robustness as a decision criterion can dramatically change the formulation of complex environmental management problems as well as the negotiated selection of candidate alternatives to implement. MORDM also allows decision makers to characterize the most important vulnerabilities for their systems, which should be the focus of ex post monitoring and identification of triggers for adaptive management.
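
A minimal sketch (hypothetical data, not the LRGV study) of the satisficing robustness measure described above: each candidate solution is scored by the fraction of sampled states of the world (SOW) in which it meets all performance thresholds simultaneously.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical performance of 3 candidate solutions across 1000 sampled
# states of the world (SOW); rows = solutions, columns = SOW.
cost        = rng.uniform(0.5, 2.0, size=(3, 1000))   # e.g. normalised cost
reliability = rng.uniform(0.85, 1.0, size=(3, 1000))  # e.g. supply reliability

# Domain-criterion robustness: fraction of SOW in which a solution
# satisfies all performance thresholds at once (Pareto satisficing idea).
satisfices = (cost < 1.5) & (reliability > 0.95)
robustness = satisfices.mean(axis=1)
print(robustness)   # one satisficing score per candidate solution
```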

356 citations


Journal ArticleDOI
TL;DR: This review discusses a number of BBN-based ESS models developed in the last decade and highlights the advantages and disadvantages of BBNs in ESS modelling and pinpoints remaining challenges for future research.
Abstract: A wide range of quantitative and qualitative modelling research on ecosystem services (ESS) has recently been conducted. The available models range between elementary, indicator-based models and complex process-based systems. A semi-quantitative modelling approach that has recently gained importance in ecological modelling is Bayesian belief networks (BBNs). Due to their high transparency, the possibility to combine empirical data with expert knowledge and their explicit treatment of uncertainties, BBNs can make a considerable contribution to ESS modelling research. However, the number of applications of BBNs in ESS modelling is still limited. This review discusses a number of BBN-based ESS models developed in the last decade. A SWOT analysis highlights the advantages and disadvantages of BBNs in ESS modelling and pinpoints remaining challenges for future research. The existing BBN models are suited to describe, analyse, predict and value ESS. Nevertheless, some weaknesses have to be considered, including poor flexibility of frequently applied software packages, difficulties in eliciting expert knowledge and the inability to model feedback loops.

Highlights: BBNs are increasingly used to analyse, predict and value ecosystem services (ESS). Most BBN applications in ESS modelling target only a single service. Numerous advantages of BBNs in ESS modelling are demonstrated in current applications. Model drawbacks are the absence of feedback loops and obligatory variable discretization. Spatially explicit modelling and modelling of ESS bundles are future opportunities.
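
To make the BBN mechanics concrete, here is a minimal two-node network in plain Python with made-up conditional probability tables (real ESS applications use dedicated BBN software); it computes a predictive marginal and a diagnostic posterior by exact enumeration.

```python
# Nodes: Landcover -> Pollination (both discrete). All numbers are illustrative.
p_landcover = {"forest": 0.3, "mixed": 0.5, "crop": 0.2}
p_pollination_given = {            # P(pollination level | landcover)
    "forest": {"high": 0.7, "low": 0.3},
    "mixed":  {"high": 0.4, "low": 0.6},
    "crop":   {"high": 0.1, "low": 0.9},
}

# Predictive query: marginal P(pollination) by summing over parent states.
p_pollination = {"high": 0.0, "low": 0.0}
for lc, p_lc in p_landcover.items():
    for level, p in p_pollination_given[lc].items():
        p_pollination[level] += p_lc * p

# Diagnostic query via Bayes' rule: P(landcover | pollination = "high").
posterior = {lc: p_landcover[lc] * p_pollination_given[lc]["high"] / p_pollination["high"]
             for lc in p_landcover}
print(p_pollination, posterior)
```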

298 citations


Journal ArticleDOI
TL;DR: The Connectivity Modeling System is described, a probabilistic, multi-scale model that provides Lagrangian descriptions of oceanic phenomena and can be used in a broad range of oceanographic applications, from the fate of pollutants to the pathways of water masses in the global ocean.
Abstract: Pelagic organisms' movement and the motion of buoyant particles are driven by processes operating across multiple spatial and temporal scales. We developed a probabilistic, multi-scale model, the Connectivity Modeling System (CMS), to gain a mechanistic understanding of dispersion and migration processes in the ocean. The model couples offline a new nested-grid technique to a stochastic Lagrangian framework where individual variability is introduced by drawing particles' attributes at random from specified probability distributions of traits. This allows 1) seamless tracking of a large number of both actively swimming and inertial particles over multiple, independent ocean model domains and 2) generation of ensemble forecasts or hindcasts of the particles' three-dimensional trajectories, dispersal kernels, and transition probability matrices used for connectivity estimates. In addition, CMS provides Lagrangian descriptions of oceanic phenomena (advection, dispersion, retention) and can be used in a broad range of oceanographic applications, from the fate of pollutants to the pathways of water masses in the global ocean. Here we describe the CMS modular system, where particle behavior can be augmented with specific features, and a parallel module implementation simplifies data management and the CPU-intensive computations associated with tracking millions of active particles. Some novel features include on-the-fly data access to operational hydrodynamic models, individual particle variability and inertial motion, and multi-nesting capabilities to optimize resolution. We demonstrate the performance of the interpolation algorithm by testing its accuracy in tracing flow streamlines in both time and space, and the efficacy of probabilistic modeling in evaluating the bio-physical coupling against empirical data. Finally, following recommended practices for the development of community models, we provide an open source code with a series of coupled standalone, optional modules detailed in a user's guide.
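
A toy stochastic Lagrangian step in the spirit of the framework described (not CMS code): particles carry an individually drawn trait and are advected by a hypothetical velocity field plus a random-walk diffusion term.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000                                    # number of particles

def velocity(x, y):
    """Hypothetical stationary velocity field (m/s); a real run would
    interpolate nested ocean-model grids in space and time."""
    return -0.1 * y / 1e3, 0.1 * x / 1e3      # simple rotation

# Individual variability: draw a trait (e.g. swimming speed) per particle.
swim = rng.normal(loc=0.02, scale=0.005, size=n)

x = rng.uniform(-1e3, 1e3, n)                 # initial positions (m)
y = rng.uniform(-1e3, 1e3, n)
dt, kh = 3600.0, 10.0                         # time step (s), horizontal diffusivity (m^2/s)
for _ in range(240):                          # ten days of hourly steps
    u, v = velocity(x, y)
    # Euler step: advection + behaviour + random-walk diffusion (std = sqrt(2*K*dt))
    x += (u + swim) * dt + rng.normal(0, np.sqrt(2 * kh * dt), n)
    y += v * dt + rng.normal(0, np.sqrt(2 * kh * dt), n)
```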

281 citations


Journal ArticleDOI
TL;DR: This work facilitates well-informed design and application of Zonation analyses for the purpose of spatial conservation planning and focuses on common pre- and post-processing stages of analysis.
Abstract: Spatial conservation prioritization concerns the effective allocation of conservation action. Its stages include development of an ecologically based model of conservation value, data pre-processing, spatial prioritization analysis, and interpretation of results for conservation action. Here we investigate the details of each stage for analyses done using the Zonation prioritization framework. While there is much literature about analytical methods implemented in Zonation, there is only scattered information available about what happens before and after the computational analysis. Here we fill this information gap by summarizing the pre-analysis and post-analysis stages of the Zonation framework. Concerning the entire process, we summarize the full workflow and list examples of operational best-case, worst-case, and typical scenarios for each analysis stage. We discuss resources needed in different analysis stages. We also discuss benefits, disadvantages, and risks involved in the application of spatial prioritization from the perspective of different stakeholders. Concerning pre-analysis stages, we explain the development of the ecological model and discuss the setting of priority weights and connectivity responses. We also explain practical aspects of data pre-processing and the post-processing interpretation of results for different conservation objectives. This work facilitates well-informed design and application of Zonation analyses for the purpose of spatial conservation planning. It should be useful both for scientists working on conservation-related research and for practitioners looking for useful tools for conservation resource allocation.

Highlights: We summarize the spatial conservation prioritization analysis workflow using Zonation. We focus on common pre- and post-processing stages of analysis. The utility of several Zonation analysis types is discussed. We summarize operational best-case and worst-case scenarios for analysis stages. This study helps conservation practitioners implement prioritization analyses.

270 citations


Journal ArticleDOI
TL;DR: Three different approaches were employed to calibrate and validate the HEC-HMS 3.4 model for the Attanagalu Oya (River) catchment and generate long-term flow data for the Oya and its tributaries, in order to determine the most suitable simulation method for the study catchment.
Abstract: Hydrologic simulation employing computer models has advanced rapidly, and computerized models have become essential tools for understanding human influences on river flows and designing ecologically sustainable water management approaches. The HEC-HMS is a reliable model developed by the US Army Corps of Engineers that can be used for many hydrological simulations. This model has not been calibrated and validated for Sri Lankan watersheds and needs reliable data inputs to check the suitability of the model for the study location and purpose. Therefore, this study employed three different approaches to calibrate and validate the HEC-HMS 3.4 model for the Attanagalu Oya (River) catchment and to generate long-term flow data for the Oya and its tributaries. Twenty years of daily rainfall data from five rain gauging stations scattered within the Attanagalu Oya catchment and monthly evaporation data for the same years from the agro-meteorological station at Henarathgoda, together with daily flow data at Dunamale from 2005 to 2010, were used in the study. GIS layers needed as input data for the flow simulation were prepared using ArcGIS 9.2 and used in the HEC-HMS 3.4 calibration of the Dunamale sub-catchment using daily flow data from 2005 to 2007. The model parameters were adjusted and the calibration was performed separately for the three selected methods: the Soil Conservation Service Curve Number loss method, and the deficit and constant loss method combined with either the Snyder unit hydrograph method or the Clark unit hydrograph method, in order to determine the most suitable simulation method for the study catchment. The calibrated model was validated with a new set of rainfall and flow data (2008-2010). The flows simulated from each method were tested statistically employing the coefficient of performance, the relative error and the residual method. The Snyder unit hydrograph method simulates flows more reliably than the Clark unit hydrograph method. As the loss method, the SCS Curve Number method does not perform well.

Highlights: HEC-HMS can reliably be used to simulate flows in ungauged tropical watersheds. The Snyder unit hydrograph method is more reliable than the Clark unit hydrograph method. As the loss method, the SCS Curve Number method does not perform well. The deficit and constant method is more reliable than the SCS Curve Number method.
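
For reference, the SCS Curve Number loss method evaluated in the study rests on a simple published formula; a sketch with illustrative inputs:

```python
def scs_cn_runoff(p_mm, cn):
    """Direct runoff Q (mm) from event rainfall P (mm) via the SCS Curve
    Number method, using the standard initial abstraction Ia = 0.2 * S."""
    s = 25400.0 / cn - 254.0            # potential maximum retention (mm)
    ia = 0.2 * s                        # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0                      # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

print(scs_cn_runoff(75.0, 80))          # e.g. a 75 mm storm on a CN = 80 catchment
```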

244 citations


Journal ArticleDOI
TL;DR: It is argued that one possible remedy is to learn to use data sets as modules and integrate them into the models, although with growing overall complexity calibration can become an important limiting factor, giving more promise to the integral approach, in which the system is modeled and simplified as a whole.
Abstract: In many cases model integration treats models as software components only, ignoring the fluid relationship between models and reality, the evolving nature of models and their constant modification and recalibration. As a result, with integrated models we find increased complexity, where changes that used to impact only relatively contained models of subsystems now propagate throughout the whole integrated system. This makes it harder to keep the overall complexity under control and, in a way, defeats the purpose of modularity, when efficiency is supposed to be gained from independent development of modules. Treating models only as software in solving the integration challenge may give birth to 'integronsters' - constructs that are perfectly valid as software products but ugly or even useless as models. We argue that one possible remedy is to learn to use data sets as modules and integrate them into the models. Then the data that are available for module calibration can serve as an intermediate linkage tool, sitting between modules and providing a module-independent baseline dynamics, which is then incremented when scenarios are to be run. In this case it is not the model output that is directed into the next model input; rather, model output is presented as a variation around the baseline trajectory, and it is this variation that is then fed into the next module down the chain. However, with growing overall complexity, calibration can still become an important limiting factor, which gives more promise to the integral approach, in which the system is modeled and simplified as a whole.
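
A schematic sketch (hypothetical modules and baselines) of the data-as-linkage idea described above: module B receives its own data-derived baseline plus the scenario-induced variation of module A's output, rather than A's raw output.

```python
import numpy as np

t = np.arange(120)                                   # monthly time steps

# Data-derived baselines acting as the linkage between modules A and B:
# A's output and B's input were each calibrated against observations.
baseline_a_out = 10.0 + np.sin(2 * np.pi * t / 12)
baseline_b_in  = 9.6 + 0.9 * np.sin(2 * np.pi * t / 12)

def module_a(forcing):
    """Hypothetical upstream module run under a scenario."""
    return baseline_a_out + 0.02 * forcing * t       # scenario-driven drift

# Feed B the *variation around the baseline*, not A's raw output:
delta = module_a(forcing=1.0) - baseline_a_out
input_to_b = baseline_b_in + delta
```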

Journal ArticleDOI
TL;DR: It is concluded that the parametric approach, here the FVI, is the only one which evaluates vulnerability to floods, whilst the deterministic approach, although limited in its evaluation of vulnerability, has a better science base.
Abstract: Floods are one of the most common and widely distributed natural risks to life and property. There is a need to identify the risk in flood-prone areas to support decisions for risk management, from high-level planning proposals to detailed design. There are many methods available to undertake such studies, the most accepted, and therefore most commonly used, of which is computer-based inundation mapping. By contrast, the parametric approach of vulnerability assessment is increasingly accepted. Each of these approaches has advantages and disadvantages for decision makers, and this paper focuses on how the two approaches compare in use. It is concluded that the parametric approach, here the Flood Vulnerability Index (FVI), is the only one which evaluates vulnerability to floods, whilst the deterministic approach, although limited in its evaluation of vulnerability, has a better science base.

Journal ArticleDOI
TL;DR: Key EMF design goals/constraints are discussed and software engineering aspects that have made OMS3 framework development efficacious and its application practical are addressed, as demonstrated by leveraging software engineering efforts outside of the modeling community and lessons learned from over a decade of EMF development.
Abstract: The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to address this problem, but much work remains before EMFs are adopted as mainstream modeling tools. Environmental model development requires both scientific understanding of environmental phenomena and software developer proficiency. EMFs support the modeling process through streamlining model code development, allowing seamless access to data, and supporting data analysis and visualization. EMFs also support aggregation of model components into functional units, component interaction and communication, temporal-spatial stepping, scaling of spatial data, multi-threading/multi-processor support, and cross-language interoperability. Some EMFs additionally focus on high-performance computing and are tailored for particular modeling domains such as ecosystem, socio-economic, or climate change research. The Object Modeling System Version 3 (OMS3) EMF employs new advances in software framework design to better support the environmental model development process. This paper discusses key EMF design goals/constraints and addresses software engineering aspects that have made OMS3 framework development efficacious and its application practical, as demonstrated by leveraging software engineering efforts outside of the modeling community and lessons learned from over a decade of EMF development. Software engineering approaches employed in OMS3 are highlighted including a non-invasive lightweight framework design supporting component-based model development, use of implicit parallelism in system design, use of domain specific language design patterns, and cloud-based support for computational scalability. The key advancements in EMF design presented herein may be applicable and beneficial for other EMF developers seeking to better support environmental model development through improved framework design.

Journal ArticleDOI
TL;DR: One of the key features of the Delft-FEWS operational forecasting platform is its flexibility in integrating (third-party) models and data, and the available approaches to linking models and accessing data are highlighted.
Abstract: Since its introduction in 2002/2003, the current generation of the Delft-FEWS operational forecasting platform has found application in over forty operational centres. In these it is used to link data and models in real time, producing forecasts on a daily basis. In some cases it forms a building block of a country-wide national forecasting system using distributed client-server technology. In other cases it is applied at a much smaller scale on a simple desktop workstation, providing forecasts for a single basin. The flexibility of the software in open integration of models and data has additionally appealed to the research community. This paper discusses the principles on which the Delft-FEWS system has been developed, as well as a brief background of the architecture of the system and concepts used for storing and handling data. One of the key features of the system is its flexibility in integrating (third-party) models and data, and the available approaches to linking models and accessing data are highlighted. A brief overview of different applications of the system is given to illustrate how the software is used to support differing objectives in the domain of real time environmental modelling.

Highlights: A state-of-the-art real-time environmental decision support system is presented. Open model integration is shown to be key to sustainable application. Clear interfaces are shown to be effective in reducing complexity.

Journal ArticleDOI
TL;DR: The soft-sensors presented in the case studies have been found to be effective and inexpensive technologies for extracting and modelling relevant process information directly from the process and laboratory data routinely acquired in biological wastewater treatment facilities.
Abstract: This paper surveys and discusses the application of data-derived soft-sensing techniques in biological wastewater treatment plants. Emphasis is given to an extensive overview of the current status and to the specific challenges and potential that allow for an effective application of these soft-sensors in full-scale scenarios. The soft-sensors presented in the case studies have been found to be effective and inexpensive technologies for extracting and modelling relevant process information directly from the process and laboratory data routinely acquired in biological wastewater treatment facilities. The extracted information is in the form of timely analysis of hard-to-measure primary process variables and process diagnostics that characterize the operation of the plants and their instrumentation. The information is invaluable for an effective utilization of advanced control and optimization strategies.

Highlights: We review data-derived soft-sensors proposed for biological wastewater treatment. The increased amount of measured process data has made data-driven modelling attractive. A general guideline for data-derived soft-sensor development is presented. Artificial intelligence and multivariate statistical methods are popular in soft-sensor design. The popularity of on-line prediction applications has increased during recent years.
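
A minimal data-derived soft-sensor sketch on synthetic data (the paper reviews full-scale applications; partial least squares is one of the multivariate statistical methods commonly used): routine secondary measurements predict a hard-to-measure primary variable.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical routine measurements (easy-to-measure secondary variables):
# e.g. flow, dissolved oxygen, temperature, conductivity.
X = rng.normal(size=(2000, 4))
# Hard-to-measure primary variable (e.g. effluent ammonium) with lab noise;
# synthesised as a linear mix so the example is self-contained.
y = X @ np.array([0.8, -1.2, 0.3, 0.5]) + rng.normal(0, 0.2, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
soft_sensor = PLSRegression(n_components=3).fit(X_tr, y_tr)
print("R^2 on held-out data:", soft_sensor.score(X_te, y_te))
```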

Journal ArticleDOI
TL;DR: The Source IMS is an integrated modelling environment containing algorithms and approaches that allow defensible predictions of water flow and constituents from catchment sources to river outlets at the sea, designed and developed to underpin a wide range of water planning and management purposes.
Abstract: Management of regulated water systems has become increasingly complex due to rapid socio-economic growth and environmental changes in river basins over recent decades. This paper introduces the Source Integrated Modelling System (IMS), and describes the individual modelling components and how they are integrated within it. It also describes the methods employed for tracking and assessment of uncertainties, as well as presenting outcomes of two case study applications. Traditionally, the mathematical tools for water resources planning and management were generally designed for sectoral applications with, for example, groundwater being modelled separately from surface water. With the increasing complexity of water resources management in the 21st century those tools are becoming outmoded. Water management organisations are increasingly looking for new generation tools that allow integration across domains to assist their decision making processes for short-term operations and long-term planning; not only to meet current needs, but those of the future as well. In response to the need for an integrated tool in the water industry in Australia, the eWater Cooperative Research Centre (CRC) has developed a new generation software package called the Source IMS. The Source IMS is an integrated modelling environment containing algorithms and approaches that allow defensible predictions of water flow and constituents from catchment sources to river outlets at the sea. It is designed and developed to provide a transparent, robust and repeatable approach to underpin a wide range of water planning and management purposes. It can be used to develop water sharing plans and underpin daily river operations, as well as be used for assessments on water quantity and quality due to changes in: i) land-use and climate; ii) demands (irrigation, urban, ecological); iii) infrastructure, such as weirs and reservoirs; iv) management rules that might be associated with these; and v) the impacts of all of the above on various ecological indices. The Source IMS integrates the existing knowledge and modelling capabilities used by different state and federal water agencies across Australia and has additional functionality required for the river system models that will underpin the next round of water sharing plans in the country. It is built in a flexible modelling environment to allow stakeholders to incorporate new scientific knowledge and modelling methods as they evolve, and is designed as a generic tool suitable for use across different jurisdictions. Due to its structure, the platform can be extended/customised for use in other countries and basins, particularly where there are boundary issues.

Journal ArticleDOI
TL;DR: This study developed a unique methodology which extends the AHP-SA model proposed by Chen et al. (2010) to a more comprehensive framework to analyze weight sensitivity caused by both direct and indirect weight changes using the one-at-a-time (OAT) technique.
Abstract: Criteria weights determined from pairwise comparisons are often the greatest contributor to the uncertainties in AHP-based multi-criteria decision making (MCDM). During an MCDM process, the weights can be changed directly by adjusting the output from a pairwise comparison matrix, or indirectly by recalculating the matrix after varying its input. The corresponding weight sensitivity of multi-criteria evaluation results is generally difficult to assess quantitatively and visualize spatially. This study developed a unique methodology which extends the AHP-SA model proposed by Chen et al. (2010) to a more comprehensive framework that analyzes weight sensitivity caused by both direct and indirect weight changes using the one-at-a-time (OAT) technique. With increased efficiency, improved flexibility and enhanced visualization capability, the spatial framework was developed as AHP-SA2 within a GIS platform. A case study with in-depth discussion is provided to demonstrate the new toolset. It assists stakeholders and researchers with better understanding of weight sensitivity for characterising, reporting and minimising uncertainty in AHP-based spatial MCDM.
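
The core OAT weight perturbation is easy to sketch (illustrative scores, not the AHP-SA2 toolset): one weight is shifted, the remaining weights are rescaled proportionally so they still sum to one, and the effect on the evaluation ranking is inspected.

```python
import numpy as np

weights = np.array([0.4, 0.3, 0.2, 0.1])             # AHP-derived criteria weights
criteria = np.random.default_rng(2).random((5, 4))   # 5 alternatives x 4 criteria scores

def oat_adjust(w, i, delta):
    """Change weight i by delta and rescale the others proportionally
    so the weights still sum to one (one-at-a-time perturbation)."""
    w2 = w.copy()
    w2[i] = w[i] + delta
    others = [j for j in range(len(w)) if j != i]
    w2[others] = w[others] * (1 - w2[i]) / w[others].sum()
    return w2

base_rank = np.argsort(-criteria @ weights)           # ranking under base weights
for delta in (-0.1, 0.1):
    w2 = oat_adjust(weights, 0, delta)
    print(delta, np.argsort(-criteria @ w2), "base:", base_rank)
```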

Journal ArticleDOI
TL;DR: This study highlights the importance of considering multiple characteristics of the model parameters and the responses of the models in specific phenological stages, including the biomass, yield, leaf area index, and transpiration coefficient.
Abstract: Sensitivity analysis (SA) has become a basic tool for the understanding, application and development of models. However, in the past, little attention has been paid to the effects of the parameter sample size and parameter variation range on parameter SA and its temporal properties. In this paper, the corn crop planted in 2008 in the Yingke Oasis of northwest China is simulated based on meteorological observation data for the inputs and statistical data for the parameters. Furthermore, using the extended Fourier Amplitude Sensitivity Test (EFAST) algorithm, SA is performed on the 47 crop parameters of the WOrld FOod STudies (WOFOST) crop growth model. An in-depth analysis is conducted, covering the effects of the parameter sample size and variation range on the parameter SA, the temporal properties of SA and the multivariable output issues of SA. The results show that sample size strongly affects the convergence of the sensitivity indices. Two types of parameter variation ranges are used for the analysis, and the results show that the sensitive parameters of the two parameter spaces are distinctly different. In addition, taking the storage organ biomasses at the different growth stages as the objective output, the time-dependent characteristics of the parameter sensitivity are discussed. The results show that several parameters remain sensitive for the grain biomass throughout the entire development stage. In addition, analyzing the twelve sensitive parameters has proven that although certain parameters have no effect on the final yield, they play key roles in certain growth stages, and the importance of these parameters gradually increases. Finally, sensitivity analyses of different state variable outputs are performed, including the biomass, yield, leaf area index, and transpiration coefficient. The results suggest that the sensitive parameters differ across output variables. This study highlights the importance of considering multiple characteristics of the model parameters and the responses of the models in specific phenological stages.
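
A sketch of an EFAST analysis using the SALib Python library with a toy stand-in response (WOFOST itself is not called; parameter names and bounds are illustrative):

```python
import numpy as np
from SALib.sample import fast_sampler
from SALib.analyze import fast

# Three illustrative crop parameters (names echo WOFOST-style inputs;
# the response below is a stand-in, not the WOFOST model).
problem = {
    "num_vars": 3,
    "names": ["TSUM1", "SPAN", "TDWI"],
    "bounds": [[800, 1200], [25, 40], [50, 210]],
}

X = fast_sampler.sample(problem, 1000)          # extended-FAST sample
# Toy response standing in for, e.g., storage-organ biomass at one stage:
rng = np.random.default_rng(3)
Y = 0.002 * X[:, 0] + 0.1 * X[:, 1] ** 1.5 + 0.01 * X[:, 2] + rng.normal(0, 0.1, len(X))

Si = fast.analyze(problem, Y)                   # first-order (S1) and total (ST) indices
print(dict(zip(problem["names"], Si["S1"])))
```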

Journal ArticleDOI
TL;DR: This paper links identified clusters to known emission characteristics to confirm the inferences made in the analysis and should have wide application to the analysis of air pollution monitoring data.
Abstract: This paper develops the idea of bivariate polar plots as a method for source detection and characterisation. Bivariate polar plots provide a graphical method for showing the joint wind speed and wind direction dependence of air pollutant concentrations, and an effective means of discriminating different source types and characteristics. In the current work we apply k-means clustering techniques directly to bivariate polar plots to identify and group similar features. The technique is analogous to clustering applied to back trajectories at the regional scale. When applied to data from a monitoring site with high source complexity, it is shown that the technique is able to identify important clusters in ambient monitoring data that additional analysis shows to exhibit different source characteristics. Importantly, this paper links identified clusters to known emission characteristics to confirm the inferences made in the analysis. The approaches developed should have wide application to the analysis of air pollution monitoring data and have been made freely available as part of the openair R package.
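
The paper's method is implemented in the openair R package; a rough Python analogue on synthetic data bins concentrations on a wind-speed/wind-direction grid (the bivariate polar surface) and clusters the surface cells with k-means (direction is treated as linear here for brevity).

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
n = 20_000
ws = rng.gamma(2.0, 2.0, n)                    # wind speed (m/s)
wd = rng.uniform(0, 360, n)                    # wind direction (degrees)
# Synthetic plume: elevated concentrations near 6 m/s from ~220 degrees.
conc = 5 + 10 * np.exp(-(ws - 6) ** 2 / 4) * np.exp(-(wd - 220) ** 2 / 800) \
       + rng.normal(0, 1, n)

# Mean concentration on a 1 m/s x 10 degree grid.
ws_bin = np.clip((ws // 1).astype(int), 0, 19)
wd_bin = (wd // 10).astype(int) % 36
grid_sum = np.zeros((20, 36))
grid_n = np.zeros((20, 36))
np.add.at(grid_sum, (ws_bin, wd_bin), conc)
np.add.at(grid_n, (ws_bin, wd_bin), 1)

i, j = np.nonzero(grid_n > 0)
mean_conc = grid_sum[i, j] / grid_n[i, j]
features = np.column_stack([i, j, mean_conc])  # (ws bin, wd bin, mean conc)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
```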

Journal ArticleDOI
TL;DR: The Group on Earth Observation Model Web initiative utilizes a Model as a Service approach to increase model access and sharing, and a flexible architecture, capable of integrating different existing distributed computing infrastructures, is required to address the performance requirements.
Abstract: The Group on Earth Observation (GEO) Model Web initiative utilizes a Model as a Service approach to increase model access and sharing. It relies on gradual, organic growth leading towards dynamic webs of interacting models, analogous to the World Wide Web. The long term vision is for a consultative infrastructure that can help address "what if" and other questions that decision makers and other users have. Four basic principles underlie the Model Web: open access, minimal barriers to entry, service-driven, and scalability; any implementation approach meeting these principles will be a step towards the long term vision. Implementing a Model Web encounters a number of technical challenges, including information modelling, minimizing interoperability agreements, performance, and long term access, each of which has its own implications. For example, a clear information model is essential for accommodating the different resources published in the Model Web (model engines, model services, etc.), and a flexible architecture, capable of integrating different existing distributed computing infrastructures, is required to address the performance requirements. Architectural solutions, in keeping with the Model Web principles, exist for each of these technical challenges. There are also a variety of other key challenges, including difficulties in making models interoperable; calibration and validation; and social, cultural, and institutional constraints. Although the long term vision of a consultative infrastructure is clearly an ambitious goal, even small steps towards that vision provide immediate benefits. A variety of activities are now in progress that are beginning to take those steps.

Journal ArticleDOI
TL;DR: Examination of error statistics of operational wildland fire spread models found that empirical-based fire behaviour models developed from a solid foundation of field observations and well accepted functional forms adequately predicted rates of fire spread far outside of the bounds of the original dataset used in their development.
Abstract: The degree of accuracy in model predictions of rate of spread in wildland fires is dependent on the model's applicability to a given situation, the validity of the model's relationships, and the reliability of the model input data. On the basis of a compilation of 49 fire spread model evaluation datasets involving 1278 observations in seven different fuel type groups, the limits on the predictability of current operational models are examined. Only 3% of the predictions (i.e. 35 out of 1278) were considered to be exact predictions according to the criteria used in this study. Mean percent error varied between 20 and 310% and was homogeneous across fuel type groups. Slightly more than half of the evaluation datasets had mean errors between 51 and 75%. Under-prediction bias was prevalent in 75% of the 49 datasets analysed. A case is made for suggesting that a ±35% error interval (i.e. approximately one standard deviation) would constitute a reasonable standard for model performance in predicting a wildland fire's forward or heading rate of spread. We also found that empirical-based fire behaviour models developed from a solid foundation of field observations and well accepted functional forms adequately predicted rates of fire spread far outside of the bounds of the original dataset used in their development.

Highlights: We examined error statistics of operational wildland fire spread models. We compiled 49 fire spread model evaluation datasets involving 1278 observations. Mean percent error varied between 20 and 310% and was homogeneous across fuel type groups. The analysis suggests that a ±35% error interval is a reasonable standard for model adequacy.
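
The error statistics used in the evaluation are straightforward to compute; a sketch with hypothetical observed and predicted spread rates:

```python
import numpy as np

obs  = np.array([4.0, 10.5, 2.2, 7.8, 15.0, 1.1])   # observed spread rate (m/min)
pred = np.array([3.1, 12.0, 1.5, 6.9,  9.8, 1.2])   # model predictions

pct_err = 100.0 * (pred - obs) / obs
mape = np.mean(np.abs(pct_err))                # mean absolute percent error
under = np.mean(pred < obs)                    # under-prediction frequency
within_35 = np.mean(np.abs(pct_err) <= 35.0)   # fraction inside the +/-35% band
print(f"MAPE {mape:.0f}%, under-prediction {under:.0%}, within +/-35%: {within_35:.0%}")
```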

Journal ArticleDOI
TL;DR: An efficient integrated approach that combines a qualitative screening method with a quantitative, emulator-based analysis method is proposed to reduce the computational burden of GSA for time-consuming models; it reveals that the soil moisture parameter WM is the most sensitive for all the responses of interest.
Abstract: Efficient sensitivity analysis, particularly global sensitivity analysis (GSA) to identify the most important or sensitive parameters, is crucial for understanding complex hydrological models, e.g., distributed hydrological models. In this paper, we propose an efficient integrated approach that combines a qualitative screening method (the Morris method) with a quantitative analysis method based on a statistical emulator (a variance-based method with the response surface method, named the RSMSobol' method) to reduce the computational burden of GSA for time-consuming models. Using the Huaihe River Basin of China as a case study, the proposed approach is used to analyze the parameter sensitivity of the distributed time-variant gain model (DTVGM). First, the Morris screening method is used to qualitatively identify the parameter sensitivity. Subsequently, a statistical emulator using the multivariate adaptive regression spline (MARS) method is chosen as an appropriate surrogate model to quantify the sensitivity indices of the DTVGM. The results reveal that the soil moisture parameter WM is the most sensitive for all the responses of interest. The parameters Kaw and g1 are relatively important for the water balance coefficient (WB) and Nash-Sutcliffe coefficient (NS), while the routing parameter RoughRss is very sensitive for the Nash-Sutcliffe coefficient (NS) and correlation coefficient (RC) responses of interest. The results also demonstrate that the proposed approach is much faster than the brute-force approach and is an effective and efficient method due to its low CPU cost and adequate degree of accuracy.
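
The qualitative screening stage can be sketched with the Morris method in SALib (toy stand-in response; the parameter names echo the DTVGM but the bounds are illustrative):

```python
import numpy as np
from SALib.sample.morris import sample as morris_sample
from SALib.analyze import morris

problem = {
    "num_vars": 3,
    "names": ["WM", "Kaw", "RoughRss"],      # names echo the DTVGM parameters
    "bounds": [[80, 200], [0.1, 1.0], [0.01, 0.5]],
}

X = morris_sample(problem, N=100, num_levels=4)   # trajectories of elementary effects
# Stand-in for the DTVGM response of interest:
Y = 0.05 * X[:, 0] + 2.0 * X[:, 1] + np.sin(10 * X[:, 2])
res = morris.analyze(problem, X, Y, num_levels=4)
# Screen by mu* before the costly emulator-based (RSMSobol'-style) step:
print(dict(zip(problem["names"], res["mu_star"])))
```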

Journal ArticleDOI
TL;DR: This work advances the idea of service-oriented modeling by presenting a design for a modeling service that builds from the Open Geospatial Consortium Web Processing Service (WPS) protocol, and demonstrates how the WPS protocol can be used to create modeling services, and how these modeling services can be brought into workflow environments using generic client-side code.
Abstract: Environmental modeling often requires the use of multiple data sources, models, and analysis routines coupled into a workflow to answer a research question. Coupling these computational resources can be accomplished using various tools, each requiring the developer to follow a specific protocol to ensure that components are linkable. Despite these coupling tools, it is not always straightforward to create a modeling workflow due to platform dependencies, computer architecture requirements, and programming language incompatibilities. A service-oriented approach that enables individual models to operate and interact with others using web services is one method for overcoming these challenges. This work advances the idea of service-oriented modeling by presenting a design for a modeling service that builds from the Open Geospatial Consortium (OGC) Web Processing Service (WPS) protocol. We demonstrate how the WPS protocol can be used to create modeling services, and then demonstrate how these modeling services can be brought into workflow environments using generic client-side code. We implemented this approach within the HydroModeler environment, a model coupling tool built on the Open Modeling Interface standard (version 1.4), and show how a hydrology model can be hosted as a WPS web service and used within a client-side workflow. The primary advantage of this approach is that the server-side software follows an established standard that can be leveraged and reused within multiple workflow environments and decision support systems.
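
For flavour, a minimal client-side call against a hypothetical WPS 1.0.0 endpoint using the standard key-value Execute encoding (the endpoint, process identifier and inputs are invented):

```python
import requests

# Hypothetical WPS endpoint and process identifier; any OGC WPS 1.0.0
# server accepts the same key-value Execute encoding.
endpoint = "http://example.org/wps"
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "runoff_model",                # the hosted model's process id
    "DataInputs": "precip=42.5;duration_hr=6",   # semicolon-separated inputs
}
resp = requests.get(endpoint, params=params, timeout=60)
print(resp.status_code)
# The ExecuteResponse XML (resp.text) carries outputs or a status URL for
# asynchronous runs; a workflow client polls it and wires outputs onward.
```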

Journal ArticleDOI
TL;DR: The scope and architecture required to support uncertainty management as developed in UncertWeb, which includes tools which support elicitation, aggregation/disaggregation, visualisation and uncertainty/sensitivity analysis, is described.
Abstract: Web-based distributed modelling architectures are gaining increasing recognition as potentially useful tools to build holistic environmental models, combining individual components in complex workflows. However, existing web-based modelling frameworks currently offer no support for managing uncertainty. On the other hand, the rich array of modelling frameworks and simulation tools which support uncertainty propagation in complex and chained models typically lack the benefits of web based solutions such as ready publication, discoverability and easy access. In this article we describe the developments within the UncertWeb project which are designed to provide uncertainty support in the context of the proposed 'Model Web'. We give an overview of uncertainty in modelling, review uncertainty management in existing modelling frameworks and consider the semantic and interoperability issues raised by integrated modelling. We describe the scope and architecture required to support uncertainty management as developed in UncertWeb. This includes tools which support elicitation, aggregation/disaggregation, visualisation and uncertainty/sensitivity analysis. We conclude by highlighting areas that require further research and development in UncertWeb, such as model calibration and inference within complex environmental models.
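
A minimal Monte Carlo uncertainty propagation through a chained pair of hypothetical models illustrates the basic pattern such frameworks automate (the component functions and distributions here are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

# Uncertain input described by a distribution rather than a single number.
rainfall = rng.lognormal(mean=3.0, sigma=0.3, size=n)   # mm

def runoff_model(p):
    """First chained component (hypothetical)."""
    return 0.6 * p - 5.0

def pollutant_load(q):
    """Second chained component (hypothetical)."""
    return 0.02 * np.maximum(q, 0) ** 1.2

# Propagate the whole ensemble through the chain; the output sample
# characterises the uncertainty of the final result.
load = pollutant_load(runoff_model(rainfall))
print(np.percentile(load, [5, 50, 95]))
```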

Journal ArticleDOI
TL;DR: An integrated modeling framework for simulating land-use decision making under the influence of payments for ecosystem services that combines agent-based modeling (ABM) with Bayesian belief networks (BBNs) and opinion dynamics models (ODM).
Abstract: We present an integrated modeling framework for simulating land-use decision making under the influence of payments for ecosystem services. The model combines agent-based modeling (ABM) with Bayesian belief networks (BBNs) and opinion dynamics models (ODM). The model endows agents with the ability to make land-use decisions at the household and plot levels. The decision-making process is captured with the BBNs that were constructed and calibrated with both qualitative and quantitative information, i.e., knowledge gained from group discussions with stakeholders and empirical survey data. To represent interpersonal interactions within social networks, the decision process is further modulated by the opinion dynamics model. The goals of the model are to improve the ability of ABM to emulate land-use decision making and thus provide a better understanding of the potential impacts of payments for ecosystem services on land use and household livelihoods. Our approach provides three important innovations. First, decision making is represented in a causal directed graph. Second, the model provides a natural framework for combining knowledge from experts and stakeholders with quantitative data. Third, the modular architecture and the software implementation can be customized with modest efforts. The model is therefore a flexible, general platform that can be tailored to other studies by mounting the appropriate case-specific ''brain'' into the agents. The model was calibrated for the Sloping Land Conversion Program (SLCP) in Yunnan, China using data from participatory mapping, focus group interviews, and a survey of 509 farm households in 17 villages.
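
The abstract does not specify which opinion dynamics model is used; a common bounded-confidence (Deffuant-style) update, sketched on hypothetical opinions, illustrates the kind of interpersonal modulation described.

```python
import numpy as np

rng = np.random.default_rng(6)
opinions = rng.random(100)        # e.g. willingness to enrol land in the programme
eps, mu = 0.2, 0.3                # confidence bound and convergence rate

for _ in range(20_000):           # random pairwise encounters in the social network
    i, j = rng.integers(0, 100, 2)
    if i != j and abs(opinions[i] - opinions[j]) < eps:
        # Both agents move toward each other (bounded-confidence update).
        oi, oj = opinions[i], opinions[j]
        opinions[i] += mu * (oj - oi)
        opinions[j] += mu * (oi - oj)
```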

Journal ArticleDOI
TL;DR: The Genetic Algorithm tool was able to overcome overfitting and improve validation fitness scores with acceptable computational costs and is flexible enough to embrace a variety of models as well as their specific fitness functions, thus offering a practical way to optimize the performance of land-use change models.
Abstract: Spatially explicit land-use models simulate the patterns of change on the landscape in response to coupled human-ecological dynamics. As these models become more complex involving larger than ever data sets, the need to improve calibration techniques as well as methods that test model accuracy also increases. To this end, we developed a Genetic Algorithm tool and applied it to optimize probability maps of deforestation generated from the Weights of Evidence method for 12 case-study sites in the Brazilian Amazon. We show that the Genetic Algorithm tool, after being constrained during the reproduction process within a specified range and trend of variation of the Weights of Evidence coefficients, was able to overcome overfitting and improve validation fitness scores with acceptable computational costs. In addition to modeling deforestation, the Genetic Algorithm tool coupled with the Weights of Evidence method is flexible enough to embrace a variety of models as well as their specific fitness functions, thus offering a practical way to optimize the performance of land-use change models.
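
A skeletal genetic algorithm with reproduction constrained to stay within specified coefficient ranges, in the spirit described (the fitness function is a stand-in; a real run would score simulated deforestation maps, and crossover is omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(7)

def fitness(coeffs):
    """Stand-in for a map-comparison fitness score of a deforestation
    probability map built from Weights-of-Evidence coefficients."""
    target = np.array([0.8, -0.5, 1.2, 0.1])
    return -np.sum((coeffs - target) ** 2)

lo = np.array([0.0, -1.0, 0.5, -0.5])      # allowed range per coefficient,
hi = np.array([1.5,  0.0, 2.0,  0.5])      # constraining reproduction

pop = rng.uniform(lo, hi, size=(40, 4))
for gen in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]               # truncation selection
    kids = parents[rng.integers(0, 20, 40)] + rng.normal(0, 0.05, (40, 4))
    pop = np.clip(kids, lo, hi)                           # keep offspring in range
best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best)
```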

Journal ArticleDOI
TL;DR: The latest version of ANUCLIM, Version 6.1, incorporates substantial upgrades that allow each of its four component programs to systematically incorporate the impacts of projected climate change.
Abstract: ANUCLIM (Xu and Hutchinson, 2011) is a unique software package used to support the spatial modelling and mapping of environmental and natural resources. It has been extensively employed for scientific research, teaching and policy making across study areas at various spatial scales. The package enables users to readily interrogate estimated values, in point and grid form, of monthly, seasonal and annual mean climate variables from supplied elevation dependent monthly mean climate surfaces and an underlying digital elevation model (DEM). The climate surfaces have been derived by the ANUSPLIN package (Hutchinson, 2004) and support interrogation at sub-kilometre scale. A key strength of the ANUCLIM package is its ability to generate bioclimatic profiles from known species locations to predict and map species distributions, in current, projected future and past climates. It can also generate a comprehensive set of climate parameters and growth indices for modelling growth of crops and plants. The package currently has four programs, MTHCLIM, BIOCLIM, BIOMAP and GROCLIM. MTHCLIM is used to obtain estimates of monthly mean climate variables from supplied climate surfaces at specified points or grids. BIOCLIM, in conjunction with BIOMAP, is a bioclimatic prediction system based on the bioclimatic envelope method devised by Nix (1986). GROCLIM is used to generate plant growth indices based on a simplified model of plant growth response to light, thermal and water regimes (Nix, 1981). The latest version of ANUCLIM, Version 6.1, incorporates substantial upgrades. In particular, the package now allows each of its four component programs to systematically incorporate the impacts of projected climate change. These projected climate changes can be provided either as simple constants, or more commonly, as grids of broad scale changes as obtained from outputs of General Circulation Models (GCMs) under various emission scenarios. For Australia, such grids can be obtained from the OzClim website of CSIRO (2007). This enables the systematic investigation of the impacts of projected climate change on socio-environmental systems.
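
The bioclimatic envelope idea behind BIOCLIM can be sketched simply (synthetic climate values; ANUCLIM itself derives the underlying climate surfaces): a site is predicted suitable when every climate variable falls within a percentile envelope of the values at known species locations.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical bioclimatic values at 200 known species locations, e.g.
# annual mean temperature (deg C) and annual precipitation (mm).
train = np.column_stack([rng.normal(14, 2, 200), rng.normal(900, 150, 200)])

# Percentile envelope: core range between the 5th and 95th percentiles.
lo = np.percentile(train, 5, axis=0)
hi = np.percentile(train, 95, axis=0)

def suitable(sites):
    """1 where every climate variable falls inside the envelope, else 0."""
    return np.all((sites >= lo) & (sites <= hi), axis=1).astype(int)

candidate_sites = np.array([[13.5, 950.0],    # inside the envelope
                            [20.0, 400.0]])   # outside it
print(suitable(candidate_sites))              # -> [1 0]
```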

Journal ArticleDOI
TL;DR: An overview of the design and capabilities of the IFIS that was developed as a platform to provide one-stop access to flood-related information is provided.
Abstract: The Iowa Flood Information System (IFIS) is a web-based platform developed at the Iowa Flood Center (IFC) in order to provide access to flood inundation maps, real-time flood conditions, flood forecasts, flood-related data, information, applications, and interactive visualizations for communities in Iowa. The IFIS provides community-centric watershed and river characteristics, rainfall conditions, and stream-flow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values as well as to flooding scenarios with contributions from multiple rivers. Real-time and historical data of water levels, gauge heights, hourly and seasonal flood forecasts, and rainfall conditions are made available by integrating data from NEXRAD radars, IFC stream sensors, and USGS and National Weather Service (NWS) stream gauges. The IFIS provides customized flood-related data, information, and visualization for over 1000 communities in Iowa. To help reduce the damage from floods, the IFIS helps communities make better-informed decisions about the occurrence of floods and alerts communities in advance using NWS and IFC forecasts. The integrated and modular design and structure of the IFIS allows easy adaptation of the system in other regional and scientific domains. This paper provides an overview of the design and capabilities of the IFIS that was developed as a platform to provide one-stop access to flood-related information.

Journal ArticleDOI
TL;DR: Flexibility of the hydroPSO package suggests it can be implemented in a wider range of models requiring some form of parameter optimisation, and is effective and efficient compared to commonly used optimisation algorithms.
Abstract: This work presents and illustrates the application of hydroPSO, a novel multi-OS and model-independent R package for model calibration. hydroPSO allows the modeller to perform a standard modelling workflow, including sensitivity analysis, parameter calibration, and assessment of the calibration results, using a single piece of software. hydroPSO implements several state-of-the-art enhancements and fine-tuning options to the Particle Swarm Optimisation (PSO) algorithm to meet specific user needs. hydroPSO easily interfaces the calibration engine to different model codes through simple ASCII files and/or R wrapper functions for exchanging information on the calibration parameters. It then optimises a user-defined goodness-of-fit measure until a maximum number of iterations is reached or a convergence criterion is met. Finally, advanced plotting functionalities facilitate the interpretation and assessment of the calibration results. The current hydroPSO version allows easy parallelization and works with single-objective functions, with multi-objective functionalities being the subject of ongoing development. We compare hydroPSO against standard algorithms (SCE_UA, DE, DREAM, SPSO-2011, and GML) using a series of benchmark functions. We further illustrate the application of hydroPSO in two real-world case studies: we calibrate, first, a hydrological model for the Ega River Basin (Spain) and, second, a groundwater flow model for the Pampa del Tamarugal Aquifer (Chile). Results from the comparison exercise indicate that hydroPSO is: i) effective and efficient compared to commonly used optimisation algorithms, ii) 'scalable', i.e. it maintains a high performance for increased problem dimensionality, and iii) versatile in adapting to different response surfaces of the objective function. Case study results highlight the functionality and ease of use of hydroPSO in handling several issues that are commonly faced by the modelling community, such as working on different operating systems, single or batch model execution, transient- or steady-state modelling conditions, and the use of alternative goodness-of-fit measures to drive parameter optimisation. Although we limit the application of hydroPSO to hydrological models, the flexibility of the package suggests it can be implemented in a wider range of models requiring some form of parameter optimisation.
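
A bare-bones PSO loop (illustrative objective; hydroPSO adds the enhancements, model interfaces and diagnostics described above) shows the underlying algorithm:

```python
import numpy as np

rng = np.random.default_rng(9)

def objective(params):
    """Stand-in goodness-of-fit surface; hydroPSO would instead run the
    model and score simulated vs. observed data."""
    return np.sum((params - np.array([2.0, -1.0])) ** 2, axis=1)

n, d = 30, 2                                     # swarm size, parameter dimension
lo, hi = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
x = rng.uniform(lo, hi, (n, d))
v = np.zeros((n, d))
pbest, pbest_f = x.copy(), objective(x)
gbest = pbest[np.argmin(pbest_f)]

w, c1, c2 = 0.72, 1.49, 1.49                     # commonly used PSO constants
for _ in range(200):
    r1, r2 = rng.random((n, d)), rng.random((n, d))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)                   # keep particles in bounds
    f = objective(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]
print(gbest)                                     # should approach (2, -1)
```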

Journal ArticleDOI
TL;DR: Biophysical trade-offs among bioenergy crop production based on rape seed, food crop production, water quantity, and water quality in the Parthe catchment in Central Germany are analyzed and Pareto optimal frontiers among multiple objectives are estimated.
Abstract: Political agendas worldwide include increased production of biofuel, which multiplies the trade-offs among conflicting objectives, including food and fodder production, water quantity, water quality, biodiversity, and ecosystem services. Quantification of trade-offs among objectives in bioenergy crop production is most frequently accomplished by a comparison of a limited number of plausible scenarios. Here we analyze biophysical trade-offs among bioenergy crop production based on rape seed, food crop production, water quantity, and water quality in the Parthe catchment in Central Germany. Based on an integrated river basin model (SWAT) and a multi-objective genetic algorithm (NSGA-II), we estimated Pareto optimal frontiers among multiple objectives. Results indicate that the same level of bioenergy crop production can be achieved at different costs with respect to the other objectives. Intermediate rapeseed production does not lead to strong trade-offs with water quality and low flow if a reduction of food and fodder production can be accepted. Compared to solutions focused on maximizing food and fodder yield, solutions with intermediate rapeseed production even improve with respect to water quality and low flow. If rapeseed production is further increased, negative effects on low flow prevail. The major achievement of the optimization approach is the quantification of the functional trade-offs for the feasible range of all objectives. The application of the approach provides the results of what is in effect an infinite number of scenarios. We offer a general methodology that may be used to support recommendations for the best way to achieve certain goals, and to compare the optimal outcomes given different policy preferences. In addition, visualization options of the resulting non-dominated solutions are discussed.
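
The Pareto-optimality test underlying such frontiers is compact to express; a sketch on hypothetical objective scores (all oriented for maximisation) extracts the non-dominated set:

```python
import numpy as np

rng = np.random.default_rng(10)
# Hypothetical outcomes for 500 land-use allocations: e.g. rapeseed yield,
# food yield, and negated nitrate load, so that all objectives are maximised.
objs = rng.random((500, 3))

def pareto_mask(points):
    """True for points not dominated by any other point (all objectives maximised)."""
    mask = np.ones(len(points), dtype=bool)
    for k, p in enumerate(points):
        if mask[k]:
            # A point is dominated by p if it is no better on every objective
            # and strictly worse on at least one.
            dominated = np.all(points <= p, axis=1) & np.any(points < p, axis=1)
            mask &= ~dominated
    return mask

front = objs[pareto_mask(objs)]
print(len(front), "non-dominated solutions")
```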