
Showing papers in "Environmental Modelling and Software in 2012"


Journal ArticleDOI
TL;DR: It is demonstrated how air pollution data can be analysed quickly and efficiently and in an interactive way, freeing time to consider the problem at hand.
Abstract: openair is an R package primarily developed for the analysis of air pollution measurement data but which is also of more general use in the atmospheric sciences. The package consists of many tools for importing and manipulating data, and undertaking a wide range of analyses to enhance understanding of air pollution data. In this paper we consider the development of the package with the purpose of showing how air pollution data can be analysed in more insightful ways. Examples are provided of importing data from UK air pollution networks, source identification and characterisation using bivariate polar plots, quantitative trend estimates and the use of functions for model evaluation purposes. We demonstrate how air pollution data can be analysed quickly and efficiently and in an interactive way, freeing time to consider the problem at hand. One of the central themes of openair is the use of conditioning plots and analyses, which greatly enhance inference possibilities. Finally, some consideration is given to future developments.
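As an illustration of the interactive workflow the abstract describes, the sketch below uses two openair functions (importAURN and polarPlot); the site code and year are illustrative placeholders, not taken from the paper.

```r
# Minimal openair sketch (assumes an internet connection for the AURN import);
# the site code and year are illustrative placeholders.
library(openair)

# import hourly measurements from a UK AURN monitoring site
aurn <- importAURN(site = "kc1", year = 2010)

# bivariate polar plot: mean NOx concentration by wind speed and direction,
# a common first step in source identification
polarPlot(aurn, pollutant = "nox")

# conditioning: split the same analysis by season to sharpen inference
polarPlot(aurn, pollutant = "nox", type = "season")
```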

1,303 citations


Journal ArticleDOI
TL;DR: A functionality overview of the more than 400 modules available in the latest stable GRASS software release is provided; the release offers basic and advanced functionality to casual and expert users.
Abstract: The GIS software sector has developed rapidly over the last ten years. Open Source GIS applications are gaining relevant market shares in academia, business, and public administration. In this paper, we illustrate the history and features of a key Open Source GIS, the Geographical Resources Analysis Support System (GRASS). GRASS has been under development for more than 28 years, has strong ties into academia, and its review mechanisms led to the integration of well tested and documented algorithms into a joint GIS suite which has been used regularly for environmental modelling. The development is community-based with developers distributed globally. Through the use of an online source code repository, mailing lists and a Wiki, users and developers communicate in order to review existing code and develop new methods. In this paper, we provide a functionality overview of the more than 400 modules available in the latest stable GRASS software release. This new release runs natively on common operating systems (MS-Windows, GNU/Linux, Mac OSX), giving basic and advanced functionality to casual and expert users. In the second part, we review selected publications with a focus on environmental modelling to illustrate the wealth of use cases for this open and free GIS.

658 citations


Journal ArticleDOI
TL;DR: In this article, a case study habitat suitability model for juvenile Astacopsis gouldi, the giant freshwater crayfish of Tasmania, is presented, where the authors define the model objectives and scope and use a conceptual model of the system to form the structure of the BN.
Abstract: Bayesian networks (BNs) are increasingly being used to model environmental systems, in order to: integrate multiple issues and system components; utilise information from different sources; and handle missing data and uncertainty. BNs also have a modular architecture that facilitates iterative model development. For a model to be of value in generating and sharing knowledge or providing decision support, it must be built using good modelling practice. This paper provides guidelines to developing and evaluating Bayesian network models of environmental systems, and presents a case study habitat suitability model for juvenile Astacopsis gouldi, the giant freshwater crayfish of Tasmania. The guidelines entail clearly defining the model objectives and scope, and using a conceptual model of the system to form the structure of the BN, which should be parsimonious yet capture all key components and processes. After the states and conditional probabilities of all variables are defined, the BN should be assessed by a suite of quantitative and qualitative forms of model evaluation. All the assumptions, uncertainties, descriptions and reasoning for each node and linkage, data and information sources, and evaluation results must be clearly documented. Following these standards will enable the modelling process and the model itself to be transparent, credible and robust, within its given limitations.
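To make the parameterisation step concrete, here is a minimal sketch in R using the bnlearn package (not used in the paper itself); the two-node structure, its states and its conditional probabilities are invented purely for illustration.

```r
# Hypothetical two-node habitat-suitability BN built with bnlearn;
# structure, states and probabilities are illustrative, not from the paper.
library(bnlearn)

dag <- model2network("[Flow][Habitat|Flow]")

cpt_flow <- matrix(c(0.3, 0.7), ncol = 2,
                   dimnames = list(NULL, c("low", "adequate")))
cpt_hab  <- array(c(0.8, 0.2,    # P(Habitat | Flow = low)
                    0.1, 0.9),   # P(Habitat | Flow = adequate)
                  dim = c(2, 2),
                  dimnames = list(Habitat = c("poor", "good"),
                                  Flow    = c("low", "adequate")))

fit <- custom.fit(dag, dist = list(Flow = cpt_flow, Habitat = cpt_hab))

# simple evaluation query: probability of good habitat given adequate flow
cpquery(fit, event = (Habitat == "good"), evidence = (Flow == "adequate"))
```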

455 citations


Journal ArticleDOI
TL;DR: The framework and case study are intended to support and guide future studies of wind comfort and wind safety with CFD and, this way, to contribute to improved wind environmental quality in urban areas.
Abstract: Wind comfort and wind safety for pedestrians are important requirements in urban areas. Many city authorities request studies of pedestrian wind comfort and wind safety for new buildings and new urban areas. These studies involve combining statistical meteorological data, aerodynamic information and criteria for wind comfort and wind safety. Detailed aerodynamic information can be obtained using Computational Fluid Dynamics (CFD), which offers considerable advantages compared to wind tunnel testing. However, the accuracy and reliability of CFD simulations can easily be compromised. For this reason, several sets of best practice guidelines have been developed in the past decades. Based on these guidelines, this paper presents a general simulation and decision framework for the evaluation of pedestrian wind comfort and wind safety in urban areas with CFD. As a case study, pedestrian wind comfort and safety at the campus of Eindhoven University of Technology are analysed. The turbulent wind flow pattern over the campus terrain is obtained by solving the 3D steady Reynolds-averaged Navier-Stokes equations with the realisable k-ε model on an extensive high-resolution grid based on grid-convergence analysis. The simulation results are compared with long-term and short-term on-site wind speed measurements. Wind comfort and wind safety are assessed and potential design improvements are evaluated. The framework and the case study are intended to support and guide future studies of wind comfort and wind safety with CFD and, this way, to contribute to improved wind environmental quality in urban areas.
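The way meteorological statistics and CFD results are combined in such studies can be summarised in a generic form (notation here is generic, not quoted from the paper): the CFD simulation supplies a directional amplification factor between the pedestrian-level and reference wind speeds, and the weather statistics supply the frequency of each wind direction, giving an exceedance probability that is compared with the comfort or safety criterion.

\[
U_{\mathrm{loc}}(\theta) = \gamma(\theta)\, U_{\mathrm{ref}}, \qquad
P_{\mathrm{exceed}} = \sum_{\theta} f(\theta)\; P\!\left(U_{\mathrm{ref}} > \frac{U_{\mathrm{thr}}}{\gamma(\theta)} \,\middle|\, \theta\right),
\]

where γ(θ) is the CFD-computed amplification factor for wind direction θ, f(θ) is the frequency of that direction in the meteorological record, and U_thr is the comfort (or safety) threshold wind speed.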

390 citations


Journal ArticleDOI
TL;DR: It is noted that model scrutiny and use of expert opinion in modelling will benefit from formal, systematic and transparent procedures that include as wide a range of stakeholders as possible, and the role for science to maintain and enhance the rigour and formality of the information that informs decision making is emphasised.
Abstract: The inevitable though frequently informal use of expert opinion in modelling, the increasing number of models that incorporate formally expert opinion from a diverse range of experience and stakeholders, arguments for participatory modelling and analytic-deliberative-adaptive approaches to managing complex environmental problems, and an expanding but uneven literature prompt this critical review and analysis. Aims are to propose common definitions, identify and categorise existing concepts and practice, and provide a frame of reference and guidance for future environmental modelling. The extensive literature review and classification conducted demonstrate that a broad and inclusive definition of experts and expert opinion is both required and part of current practice. Thus an expert can be anyone with relevant and extensive or in-depth experience in relation to a topic of interest. The literature review also exposes informal model assumptions and modeller subjectivity, examines in detail the formal uses of expert opinion and expert systems, and critically analyses the main concepts of, and issues arising in, expert elicitation and the modelling of associated uncertainty. It is noted that model scrutiny and use of expert opinion in modelling will benefit from formal, systematic and transparent procedures that include as wide a range of stakeholders as possible. Enhanced awareness and utilisation of expert opinion is required for modelling that meets the informational needs of deliberative fora. These conclusions in no way diminish the importance of conventional science and scientific opinion but recognise the need for a paradigmatic shift from traditional ideals of unbiased and impartial experts towards unbiased processes of expert contestation and a plurality of expertise and eventually models. Priority must be given to the quality of the enquiry for those responsible for environmental management and policy formulation, and this review emphasises the role for science to maintain and enhance the rigour and formality of the information that informs decision making.

389 citations


Journal ArticleDOI
TL;DR: The components that comprise Geo-Wiki and how they are integrated in the architectural design are outlined, along with the lessons learned to date, in particular the need to add a mechanism for feedback and interaction as part of community building, and the need to address issues of data quality.
Abstract: Land cover derived from remotely sensed products is an important input to a number of different global, regional and national scale applications including resource assessments and economic land use models. During the last decade three global land cover datasets have been created, i.e. the GLC-2000, MODIS and GlobCover, but comparison studies have shown that there are large spatial discrepancies between these three products. One of the reasons for these discrepancies is the lack of sufficient in-situ data for the development of these products. To address this issue, a crowdsourcing tool called Geo-Wiki has been developed. Geo-Wiki has two main aims: to increase the amount of in-situ land cover data available for training, calibration and validation, and to create a hybrid global land cover map that provides more accurate land cover information than any current individual product. This paper outlines the components that comprise Geo-Wiki and how they are integrated in the architectural design. An overview of the main functionality of Geo-Wiki is then provided along with the current usage statistics and the lessons learned to date, in particular the need to add a mechanism for feedback and interaction as part of community building, and the need to address issues of data quality. The tool is located at geo-wiki.org.

290 citations


Journal ArticleDOI
TL;DR: Geographical detector is software based on spatial variation analysis of the geographical strata of variables to assess the environmental risks to human health.
Abstract: Human health is affected by many environmental factors. Geographical detector is software based on spatial variation analysis of the geographical strata of variables to assess the environmental risks to human health: the risk detector indicates where the risk areas are; the factor detector identifies which factors are responsible for the risk; the ecological detector discloses the relative importance of the factors; and the interaction detector reveals whether the risk factors interact or lead independently to disease.
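The factor detector in this family of methods is commonly written as a power-of-determinant statistic of the following form (stated here from the general geographical detector literature; notation may differ from the paper):

\[
q = 1 - \frac{\sum_{h=1}^{L} N_h \, \sigma_h^2}{N \, \sigma^2},
\]

where the study area is partitioned into L strata of a candidate factor, N_h and σ_h² are the number of units and the variance of the health outcome within stratum h, and N and σ² are the corresponding quantities for the whole area; q approaches 1 when the stratification of that factor explains most of the spatial variation of the outcome.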

283 citations


Journal ArticleDOI
TL;DR: The performance of the WRF model in wind simulation was evaluated under different numerical and physical options for an area of Portugal, located in complex terrain and characterized by its significant wind energy resource, and results suggest that error minimization in the wind simulation can be achieved by testing and choosing a suitable numerical and physical configuration for the region of interest, together with the use of high resolution terrain data, if available.
Abstract: The performance of the Weather Research and Forecast (WRF) model in wind simulation was evaluated under different numerical and physical options for an area of Portugal, located in complex terrain and characterized by its significant wind energy resource. The grid nudging and integration time of the simulations were the tested numerical options. Since the goal is to simulate the near-surface wind, the physical parameterization schemes regarding the boundary layer were the ones under evaluation. The influences of the local terrain complexity and simulation domain resolution on the model results were also studied. Data from three wind measuring stations located within the chosen area were compared with the model results, in terms of Root Mean Square Error, Standard Deviation Error and Bias. Wind speed histograms, occurrences and energy wind roses were also used for model evaluation. Globally, the model accurately reproduced the local wind regime, despite a significant underestimation of the wind speed. The wind direction is reasonably simulated by the model, especially in wind regimes where there is a clear dominant sector, but in the presence of low wind speeds the characterization of the wind direction (observed and simulated) is very subjective and led to higher deviations between simulations and observations. Within the tested options, results show that the use of grid nudging in simulations that should not exceed an integration time of 2 days is the best numerical configuration, and the parameterization set composed of the physical schemes MM5-Yonsei University-Noah is the most suitable for this site. Results were poorer in sites with higher terrain complexity, mainly due to limitations of the terrain data supplied to the model. The increase of the simulation domain resolution alone is not enough to significantly improve the model performance. Results suggest that error minimization in the wind simulation can be achieved by testing and choosing a suitable numerical and physical configuration for the region of interest, together with the use of high resolution terrain data, if available.
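The evaluation statistics named in the abstract are standard and can be reproduced in a few lines of R; the vectors below are hypothetical stand-ins for observed and simulated wind speeds at one station.

```r
# Hypothetical observed vs. simulated wind speed series at one station (m/s)
obs <- c(4.2, 5.1, 6.3, 3.8, 7.0, 5.5)
sim <- c(3.9, 4.6, 5.8, 4.1, 6.1, 5.0)

bias <- mean(sim - obs)                       # systematic over/underestimation
rmse <- sqrt(mean((sim - obs)^2))             # Root Mean Square Error
sde  <- sqrt(mean(((sim - obs) - bias)^2))    # Standard Deviation Error (bias removed)

c(BIAS = bias, RMSE = rmse, SDE = sde)
```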

255 citations


Journal ArticleDOI
TL;DR: SUSTAIN is a tool designed to provide critically needed support to watershed practitioners in evaluating stormwater management options based on effectiveness and cost to meet their existing program needs and is intended for users who have a fundamental understanding of watershed and BMP modeling processes.
Abstract: The U.S. Environmental Protection Agency developed a decision-support system, System for Urban Stormwater Treatment and Analysis Integration (SUSTAIN), to evaluate alternative plans for stormwater quality management and flow abatement techniques in urban and developing areas. SUSTAIN provides a public domain tool capable of evaluating the optimal location, type, and cost of stormwater best management practices (BMPs) needed to meet water quality and quantity goals. It is a tool designed to provide critically needed support to watershed practitioners in evaluating stormwater management options based on effectiveness and cost to meet their existing program needs. SUSTAIN is intended for users who have a fundamental understanding of watershed and BMP modeling processes. How SUSTAIN is set up is described here using a case study conducted with actual data from an existing urban watershed. The SUSTAIN model was calibrated with observed rainfall and flow data representing the existing conditions, and was then used to develop two BMP cost-effectiveness curves for flow volume and pollutant load reductions. A sensitivity analysis was also conducted by varying important BMP implementation specifications.

250 citations


Journal ArticleDOI
TL;DR: The retrospective analysis of the case studies indicates that the ten-steps approach is very well applicable to CFD for EFM and that it provides a comprehensive framework that encompasses and extends the existing best practice guidelines.
Abstract: Computational Fluid Dynamics (CFD) is increasingly used to study a wide variety of complex Environmental Fluid Mechanics (EFM) processes, such as water flow and turbulent mixing of contaminants in rivers and estuaries and wind flow and air pollution dispersion in urban areas. However, the accuracy and reliability of CFD modeling and the correct use of CFD results can easily be compromised. In 2006, Jakeman et al. set out ten iterative steps of good disciplined model practice to develop purposeful, credible models from data and a priori knowledge, in consort with end-users, with every stage open to critical review and revision (Jakeman et al., 2006). This paper discusses the application of the ten-steps approach to CFD for EFM in three parts. In the first part, the existing best practice guidelines for CFD applications in this area are reviewed and positioned in the ten-steps framework. The second and third part present a retrospective analysis of two case studies in the light of the ten-steps approach: (1) contaminant dispersion due to transverse turbulent mixing in a shallow water flow and (2) coupled urban wind flow and indoor natural ventilation of the Amsterdam ArenA football stadium. It is shown that the existing best practice guidelines for CFD mainly focus on the last steps in the ten-steps framework. The reasons for this focus are outlined and the value of the additional - preceding - steps is discussed. The retrospective analysis of the case studies indicates that the ten-steps approach is very well applicable to CFD for EFM and that it provides a comprehensive framework that encompasses and extends the existing best practice guidelines.

228 citations


Journal ArticleDOI
TL;DR: Graphab 1.0 provides a full set of coherent modelling functions for analysing and exploring landscape graphs with a single application, integrating a complete set of connectivity analysis functions.
Abstract: Since landscape connectivity reflects a basic form of interaction between species and their environment, the modelling of landscape networks is currently an important issue for researchers in ecology and practitioners of landscape management alike. Graph-based modelling has recently been shown to be a powerful way of representing and analysing landscape networks. Graphab 1.0 is designed as a package integrating a complete set of connectivity analysis functions. The application can build graphs from a given landscape map by exploring several possibilities for link topology, types of distances and graph definitions. A wide range of connectivity metrics can be computed from these graphs at the global, component or local levels. By extrapolating patch-based metrics outside of the graph using a distance-dependent function, the relationship between the graph and any set of point data can be established in order to compare the connectivity properties of the landscape network and field observations of a given species. In conclusion, Graphab 1.0 provides a full set of coherent modelling functions for analysing and exploring landscape graphs with a single application.
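Graphab itself is a standalone Java application; purely to illustrate the graph representation it builds, the R sketch below assembles a small patch network with the igraph package and computes a simple distance-weighted connectivity index. The patch data and the index are invented for illustration and do not reproduce Graphab's metrics.

```r
# Toy landscape graph: nodes are habitat patches, links carry inter-patch distances.
# This illustrates the graph representation only; it does not reproduce Graphab metrics.
library(igraph)

patches <- data.frame(id = 1:4, area = c(12, 5, 8, 20))        # hypothetical areas (ha)
links   <- data.frame(from = c(1, 1, 2, 3), to = c(2, 3, 3, 4),
                      dist = c(300, 550, 200, 700))            # hypothetical distances (m)

g <- graph_from_data_frame(links, directed = FALSE, vertices = patches)

# shortest-path distances between all patch pairs along the network
d <- distances(g, weights = E(g)$dist)

# a simple probability-of-connectivity style index with a negative-exponential kernel
alpha <- 1 / 500                                   # assumed dispersal coefficient (1/m)
p     <- exp(-alpha * d)
A     <- patches$area
sum(outer(A, A) * p) / sum(A)^2
```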

Journal ArticleDOI
TL;DR: An efficient three-dimensional method based on discrete cosine transforms is introduced, which explicitly utilizes information from both time and space to predict the missing values in the global soil moisture dataset with very high accuracy.
Abstract: The presence of data gaps is always a concern in geophysical records, creating not only difficulty in interpretation but also, more importantly, a large source of uncertainty in data analysis. Filling the data gaps is a necessity for use in statistical modeling. There are numerous approaches for this purpose. However, particularly challenging is the increasing number of very large spatio-temporal datasets, such as those from Earth observation satellites. Here we introduce an efficient three-dimensional method based on discrete cosine transforms, which explicitly utilizes information from both time and space to predict the missing values. To analyze its performance, the method was applied to a global soil moisture product derived from satellite images. We also executed a validation by introducing synthetic gaps. It is shown that this method is capable of filling data gaps in the global soil moisture dataset with very high accuracy.
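DCT-based gap filling belongs to the family of penalized least-squares smoothers; in generic form (notation may differ from the paper) the reconstruction ŷ of the incomplete field y solves

\[
\min_{\hat{y}} \; \left\| W^{1/2}\,(\hat{y} - y) \right\|^2 + s \left\| D \hat{y} \right\|^2 ,
\]

where W is a diagonal weight matrix that is zero at the gaps, D is a difference (roughness-penalty) operator and s controls the smoothness. Because D is diagonalised by the discrete cosine transform, the solution can be iterated efficiently as

\[
\hat{y} \leftarrow \mathrm{IDCT}\!\Big( \Gamma \odot \mathrm{DCT}\big( W \odot (y - \hat{y}) + \hat{y} \big) \Big), \qquad
\Gamma_k = \frac{1}{1 + s \lambda_k^2},
\]

with λ_k the eigenvalues of D in the DCT basis; in three dimensions the transform is applied along the two spatial axes and the time axis, which is how the spatio-temporal information enters the prediction of the missing values.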

Journal ArticleDOI
TL;DR: The design and architecture of HydroDesktop is described, its novel contributions in web services-based hydrologic data search and discovery, and its unique extensibility interface that enables developers to create custom data analysis and visualization plug-ins are described.
Abstract: Discovering and accessing hydrologic and climate data for use in research or water management can be a difficult task that consumes valuable time and personnel resources. Until recently, this task required discovering and navigating many different data repositories, each having its own website, query interface, data formats, and descriptive language. New advances in cyberinfrastructure and in semantic mediation technologies have provided the means for creating better tools supporting data discovery and access. In this paper we describe a freely available and open source software tool, called HydroDesktop, that can be used for discovering, downloading, managing, visualizing, and analyzing hydrologic data. HydroDesktop was created as a means for searching across and accessing hydrologic data services that have been published using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). We describe the design and architecture of HydroDesktop, its novel contributions in web services-based hydrologic data search and discovery, and its unique extensibility interface that enables developers to create custom data analysis and visualization plug-ins. The functionality of HydroDesktop and some of its existing plug-ins are introduced in the context of a case study for discovering, downloading, and visualizing data within the Bear River Watershed in Idaho, USA.

Journal ArticleDOI
TL;DR: This paper provides the initial results of a joint SD-CA model and an ABM that both operationalize social science knowledge regarding urban shrinkage and discusses the combination of system dynamics, cellular automata and agent-based model approaches to cover the main characteristics, processes and patterns of urban shrinkage.
Abstract: Both modelers and social scientists attempt to find better explanations of complex urban systems. These include development paths, underlying driving forces and their expected impacts. So far, land-use research has predominantly focused on urban growth. However, new challenges have arisen since urban shrinkage entered the research agenda of the social and land-use sciences. Therefore, the focus of this paper is twofold: using the example of urban shrinkage, we first discuss the capacity of existing land-use modeling approaches to integrate new social science knowledge in terms of land-use, demography and governance because social science models are indispensable for accurately explaining the processes behind shrinkage. Second, we discuss the combination of system dynamics (SD), cellular automata (CA) and agent-based model (ABM) approaches to cover the main characteristics, processes and patterns of urban shrinkage. Using Leipzig, Germany, as a case study, we provide the initial results of a joint SD-CA model and an ABM that both operationalize social science knowledge regarding urban shrinkage.

Journal ArticleDOI
TL;DR: Development and assessment of maps of change potential produced by two spatially explicit models and applied to a Tropical Deciduous Forest in western Mexico showed that the prospective LUCC maps tended to identify locations with higher biodiversity levels as the most threatened areas as opposed to areas that had actually undergone deforestation.
Abstract: Land use/cover change (LUCC) modeling is an important approach to evaluating global biodiversity loss and is the topic of a wide range of research in ecology, geography and environmental social science. This paper reports on development and assessment of maps of change potential produced by two spatially explicit models and applied to a Tropical Deciduous Forest in western Mexico. The first model, DINAMICA EGO, uses the weights of evidence method which generates a map of change potential based on a set of explanatory variables and past trends involving some degree of expert knowledge. The second model, Land Change Modeler (LCM), is based upon neural networks. Both models were assessed through Relative Operating Characteristic and Difference in Potential. At the per transition level, we obtained better results using DINAMICA. However, when the per transition susceptibilities are combined to compose an overall change potential map, the map generated using LCM is more accurate because neural networks outputs are able to express the simultaneous change potential to various land cover types more adequately than individual probabilities obtained through the weights of evidence method. An analysis of the change potential obtained from both models, compared with observed deforestation and selected biodiversity indices (species richness, rarity, and biological value) showed that the prospective LUCC maps tended to identify locations with higher biodiversity levels as the most threatened areas as opposed to areas that had actually undergone deforestation. Overall however, the approximate assessment of biodiversity given by both models was more accurate than a random model.

Journal ArticleDOI
TL;DR: A new interactive framework for sensitivity-informed de Novo planning is proposed to confront the deep uncertainty within water management problems; it illustrates how to adaptively improve the value and robustness of the problem formulations by evolving the definition of optimality while discovering key tradeoffs.
Abstract: This paper proposes and demonstrates a new interactive framework for sensitivity-informed de Novo planning to confront the deep uncertainty within water management problems. The framework couples global sensitivity analysis using Sobol' variance decomposition with multiobjective evolutionary algorithms (MOEAs) to generate planning alternatives and test their robustness to new modeling assumptions and scenarios. We explore these issues within the context of a risk-based water supply management problem, where a city seeks the most efficient use of a water market. The case study examines a single city's water supply in the Lower Rio Grande Valley (LRGV) in Texas, using a suite of 6-objective problem formulations that have increasing decision complexity for both a 10-year planning horizon and an extreme single-year drought scenario. The de Novo planning framework demonstrated here illustrates how to adaptively improve the value and robustness of our problem formulations by evolving our definition of optimality while discovering key tradeoffs.

Journal ArticleDOI
TL;DR: The main aim of the paper is to provide an introduction to emulation modelling together with a unified strategy for its application, so that modellers from different disciplines can better appreciate how it may be applied in their area of expertise.
Abstract: Emulation modelling is an effective way of overcoming the large computational burden associated with the process-based models traditionally adopted by the environmental modelling community. An emulator is a low-order, computationally efficient model identified from the original large model and then used to replace it for computationally intensive applications. As the number and forms of problems that benefit from the identification and subsequent use of an emulator are very large, emulation modelling has emerged in different sectors of science, engineering and social science. For this reason, a variety of different strategies and techniques have been proposed in the last few years. The main aim of the paper is to provide an introduction to emulation modelling, together with a unified strategy for its application, so that modellers from different disciplines can better appreciate how it may be applied in their area of expertise. Particular emphasis is devoted to Dynamic Emulation Modelling (DEMo), a methodological approach that preserves the dynamic nature of the original process-based model, with consequent advantages in a wide variety of problem areas. The different techniques and approaches to DEMo are considered in two macro categories: structure-based methods, where the mathematical structure of the original model is manipulated to a simpler, more computationally efficient form; and data-based approaches, where the emulator is identified and estimated from a data-set generated from planned experiments conducted on the large simulation model. The main contribution of the paper is a unified, six-step procedure that can be applied to most kinds of dynamic emulation problem.
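As a minimal data-based illustration of the idea (not the DEMo procedure itself), the R sketch below treats a deliberately slow function as the "large" model, samples it over a small design, fits a cheap statistical emulator and checks the emulator's fidelity on new inputs. The toy model, the design and the emulator form are all assumptions made for illustration.

```r
# Data-based emulation in miniature: the "large" model is a deliberately slow
# toy function; the emulator is an ordinary polynomial regression surface.
set.seed(1)

large_model <- function(x1, x2) {            # stand-in for an expensive simulator
  Sys.sleep(0.01)                            # pretend this takes a long time
  sin(pi * x1) * exp(-x2) + 0.5 * x2
}

# design of experiments over the input space (simple random design here)
n      <- 60
design <- data.frame(x1 = runif(n), x2 = runif(n))
design$y <- mapply(large_model, design$x1, design$x2)

# identify the emulator from the design data
emulator <- lm(y ~ poly(x1, 3) + poly(x2, 3), data = design)

# use the cheap emulator instead of the large model on new inputs
new  <- data.frame(x1 = runif(200), x2 = runif(200))
pred <- predict(emulator, newdata = new)
true <- mapply(large_model, new$x1, new$x2)
cor(pred, true)^2                            # emulator fidelity on the test set
```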

Journal ArticleDOI
TL;DR: A methodology is presented in which a parallel processing scheme is constructed to work on the Windows platform, parallelizing the calibration of the SWAT (Soil and Water Assessment Tool) hydrological model so that one can submit many simultaneous jobs and take advantage of the capabilities of modern PCs and laptops.
Abstract: Large-scale hydrologic models are being used more and more in watershed management and decision making. Sometimes rapid modeling and analysis is needed to deal with emergency environmental disasters. However, time is often a major impediment in the calibration and application of these models. To overcome this, most projects are run with fewer simulations, resulting in less-than-optimum solutions. In recent years, running time-consuming projects on gridded networks or clouds in Linux systems has become more and more prevalent. But this technology, aside from being tedious to use, has not yet become fully available for common usage in research, teaching, and small to medium-size applications. In this paper we explain a methodology where a parallel processing scheme is constructed to work on the Windows platform. We have parallelized the calibration of the SWAT (Soil and Water Assessment Tool) hydrological model, where one can submit many simultaneous jobs, taking advantage of the capabilities of modern PCs and laptops. This offers a powerful alternative to the use of grid or cloud computing. Parallel processing is implemented in SWAT-CUP (SWAT Calibration and Uncertainty Procedures) using the optimization program SUFI2 (Sequential Uncertainty FItting ver. 2). We tested the program with large, medium, and small-size hydrologic models on several computer systems, including PCs, laptops, and servers with up to 24 CPUs. The performance was judged by calculating speedup, efficiency, and CPU usage. In each case, the parallelized version performed much faster than the non-parallelized version, resulting in substantial time saving in model calibration.
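The general pattern of running many independent calibration simulations concurrently on a Windows PC can be sketched with R's built-in parallel package; the "model" and its parameters below are hypothetical placeholders, not SWAT, SWAT-CUP or SUFI2.

```r
# Generic parallel parameter-sampling sketch (works on Windows via socket clusters);
# run_model is a hypothetical stand-in for launching one simulation.
library(parallel)

run_model <- function(pars) {                 # placeholder for a single model run
  Sys.sleep(0.1)                              # pretend the simulation is expensive
  sum((pars - c(2, 5))^2)                     # pretend objective function value
}

# 200 candidate parameter sets (e.g. a Latin hypercube in a real calibration)
par_sets <- lapply(1:200, function(i) runif(2, min = 0, max = 10))

cl  <- makeCluster(max(1, detectCores() - 1)) # one worker per available core
obj <- parLapply(cl, par_sets, run_model)     # simulations run concurrently
stopCluster(cl)

par_sets[[which.min(unlist(obj))]]            # best parameter set found
```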

Journal ArticleDOI
TL;DR: It is argued that replacing the original model with a metamodel can lower the computational burden and allow an efficient estimation of moment-independent sensitivity measures.
Abstract: Moment-independent sensitivity methods are attracting increasing attention among practitioners, since they provide a thorough way of investigating the sensitivity of model output under uncertainty. However, their estimation is challenging, especially in the presence of computationally intensive models. We argue that replacement of the original model by a metamodel can contribute to lowering the computational burden. A numerical estimation procedure is set forth. The procedure is first tested on analytical cases with increased structural complexity. We utilize the emulator proposed in Ratto and Pagano (2010). Results show that the emulator allows an accurate estimation of density-based sensitivity measures, when the main structural features of the original model are captured. However, performance deteriorates for a model with interactions of order higher than 2. For this test case, a kriging emulator is also investigated, but no gain in performance is registered. However, an accurate estimation is obtained by applying a logarithmic transformation of the model output for both the kriging and Ratto and Pagano (2010) emulators. These findings are then applied to the investigation of a benchmark environmental case study, the LevelE model. Results show that use of the metamodel allows an efficient estimation of moment-independent sensitivity measures while leading to a notable reduction in computational burden.
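The density-based measure referred to here is usually Borgonovo's delta; in generic notation (possibly differing from the paper) it is

\[
\delta_i = \tfrac{1}{2}\, \mathbb{E}_{X_i}\!\left[ \int \big| f_Y(y) - f_{Y \mid X_i}(y) \big| \, dy \right],
\]

i.e. half the expected area between the unconditional output density and the density conditional on fixing input X_i. Each evaluation of the inner integral requires a density estimate from many model runs, which is why an inexpensive emulator of the model output is so valuable here.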

Journal ArticleDOI
TL;DR: A comparative assessment framework is developed which presents a clear computational budget dependent definition for the success/failure of the metamodelling strategies, and a robust numerical assessment is conducted over four test functions commonly used in optimization as well as two real-world computationally intensive optimization problems in environmental and water resources systems.
Abstract: Metamodelling is an increasingly more popular approach for alleviating the computational burden associated with computationally intensive optimization/management problems in environmental and water resources systems. Some studies refer to the metamodelling approach as function approximation, surrogate modelling, response surface methodology or model emulation. A metamodel-enabled optimizer approximates the objective (or constraint) function in a way that eliminates the need to always evaluate this function via a computationally expensive simulation model. There is a sizeable body of literature developing and applying a variety of metamodelling strategies to various environmental and water resources related problems including environmental model calibration, water resources systems analysis and management, and water distribution network design and optimization. Overall, this literature generally implies metamodelling yields enhanced solution efficiency and (almost always) effectiveness of computationally intensive optimization problems. This paper initially develops a comparative assessment framework which presents a clear computational budget dependent definition for the success/failure of the metamodelling strategies, and then critically evaluates metamodelling strategies, through numerical experiments, against other common optimization strategies not involving metamodels. Three different metamodel-enabled optimizers involving radial basis functions, kriging, and neural networks are employed. A robust numerical assessment within different computational budget availability scenarios is conducted over four test functions commonly used in optimization as well as two real-world computationally intensive optimization problems in environmental and water resources systems. Numerical results show that metamodelling is not always an efficient and reliable approach to optimizing computationally intensive problems. For simpler response surfaces, metamodelling can be very efficient and effective. However, in some cases, and in particular for complex response surfaces when computational budget is not very limited, metamodelling can be misleading and a hindrance, and better solutions are achieved with optimizers not involving metamodels. Results also demonstrate that neural networks are not appropriate metamodelling tools for limited computational budgets while metamodels employing kriging and radial basis functions show comparable overall performance when the available computational budget is very limited.

Journal ArticleDOI
TL;DR: It is shown that the fitted MS-AR models are interpretable and provide a good description of important properties of the data such as the marginal distributions, the second-order structure or the length of the stormy and calm periods.
Abstract: In this paper, non-homogeneous Markov-Switching Autoregressive (MS-AR) models are proposed to describe wind time series. In these models, several autoregressive models are used to describe the time evolution of the wind speed and the switching between these different models is controlled by a hidden Markov chain which represents the weather types. We first block the data by month in order to remove seasonal components and propose a MS-AR model with non-homogeneous autoregressive models to describe daily components. Then we discuss extensions where the hidden Markov chain is also non-stationary to handle seasonal and interannual fluctuations. The different models are fitted using the EM algorithm to a long time series of wind speed measurement on the Island of Ouessant (France). It is shown that the fitted models are interpretable and provide a good description of important properties of the data such as the marginal distributions, the second-order structure or the length of the stormy and calm periods.
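In generic notation, an MS-AR model of the kind described couples an observed autoregression with a hidden regime process:

\[
Y_t = a_0^{(S_t)} + \sum_{j=1}^{p} a_j^{(S_t)} Y_{t-j} + \sigma^{(S_t)} \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0,1),
\]

where the regime S_t (the "weather type") evolves as a Markov chain with transition probabilities P(S_t = k | S_{t-1} = l); "non-homogeneous" means that these transition probabilities, or the autoregressive coefficients, are allowed to depend on covariates such as the time of year. The EM algorithm then alternates between inferring the hidden regimes and re-estimating the regime-specific coefficients.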

Journal ArticleDOI
TL;DR: An approach to generate river cross-sections from the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM) is discussed and results show good potential for using the suggested method in the areas of topographic data scarcity.
Abstract: An approach to generate river cross-sections from the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM) is discussed. The low resolution and the inadequate vertical accuracy of such global data present difficulties in differentiating features of hydraulic importance, which necessitate pre-processing of the DEMs before they are used. A vertical bias correction carried out by comparison of elevation points with a high accuracy terrain model produces a considerable improvement to the cross-sections obtained. In a situation where there are some flow/stage measurements at either end of the river reach, an optimization routine combined with a conceptual flow routing method can provide an additional tool to identify the parameters of an equivalent river section. The extracted cross-sections were used in a 1D river modeling tool HEC-RAS/GeoRAS to simulate flooding on a part of the Tisza River, Hungary. Model results are encouraging and show good potential for using the suggested method in the areas of topographic data scarcity.

Journal ArticleDOI
TL;DR: This paper describes calibration methods for models of agricultural production and water use in which economic variables can directly interact with hydrologic network models or other biophysical system models and demonstrates the use of systematic calibration checks at different stages for efficient debugging of models.
Abstract: This paper describes calibration methods for models of agricultural production and water use in which economic variables can directly interact with hydrologic network models or other biophysical system models. We also describe and demonstrate the use of systematic calibration checks at different stages for efficient debugging of models. The central model is the California Statewide Agricultural Production Model (SWAP), a Positive Mathematical Programming (PMP) model of California irrigated agriculture. We outline the six step calibration procedure and demonstrate the model with an empirical policy analysis. Two new techniques are included compared with most previous PMP-based models: exponential PMP cost functions and Constant Elasticity of Substitution (CES) regional production functions. We then demonstrate the use of this type of disaggregated production model for policy analysis by evaluating potential water transfers under drought conditions. The analysis links regional production functions with a water supply network. The results show that a more flexible water market allocation can reduce revenue losses from drought up to 30%. These results highlight the potential of self-calibrated models in policy analysis. While the empirical application is for a California agricultural and environmental water system, the approach is general and applicable to many other situations and locations.
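For readers unfamiliar with the second of the two new techniques mentioned, a Constant Elasticity of Substitution regional production function has the generic form (notation here is generic, not copied from SWAP):

\[
y_{gi} = \tau_{gi} \left( \sum_{j} \beta_{gij}\, x_{gij}^{\rho} \right)^{1/\rho}, \qquad \sigma = \frac{1}{1-\rho},
\]

where y_{gi} is the output of crop i in region g, the x_{gij} are inputs such as land, water, labour and supplies, the β's are calibrated share parameters, τ is a scale parameter and σ is the elasticity of substitution between inputs; PMP calibration then recovers cost-function parameters so that the model's optimal solution reproduces observed input use.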

Journal ArticleDOI
TL;DR: The model presented in this paper focuses on household dynamics according to the concept of the second demographic transition (SDT), applying aging and population shrinkage to simulate respective effects on residential choice and the resulting LUC.
Abstract: This paper introduces an enhancement of a cellular automata (CA) model by integrating system dynamics (SD) to incorporate household dynamics and housing decisions as driving forces of residential development. CA macro-models used to simulate the quantitative land-use change (LUC) for urban areas are, thus far, lacking profound dynamics driven by demographic change. The model presented in this paper focuses on household dynamics according to the concept of the second demographic transition (SDT), applying aging and population shrinkage to simulate respective effects on residential choice and the resulting LUC. Such a perspective becomes especially important for urban areas exhibiting growth and shrinkage simultaneously, as currently seen in cities in Europe and in the U.S. To analyze this simultaneity in detail, we implement the residential land use CA model Metronamica and apply it to Berlin's metropolitan region, which was selected as a typical example that displays contrasting growth and shrinkage processes. The pre-implemented macro-model has been replaced with our new SD model H2D-CA (Household Decision Dynamics for Cellular Automata). For the simulation, we used empirical census data, economic data, data on residential satisfaction and numerous types of geo-information representing land-use zoning, accessibility and suitability for the time span 1990-2008. A comparison of a null model (NM) and H2D-CA provides a very satisfying reproduction of land-use patterns reflected by kappa coefficient values. The causal relation of LUC drivers is considerably improved by sophisticated housing choice-feedback mechanisms. Although specific residential land-use classes exhibit shrinkage, others expand. Due to the detailed residential land-use classification, current re-urbanization processes could also be simulated.

Journal ArticleDOI
TL;DR: This case study is based on the Sheffield Dynamic Global Vegetation Model, which is used to estimate the combined carbon flux from vegetation in England and Wales in a given year, and shows how different approaches were used to characterise uncertainty in vegetation model parameters, soil conditions and land cover.
Abstract: It is widely recognised that the appropriate representation for expert judgements of uncertainty is as a probability distribution for the unknown quantity of interest. However, formal elicitation of probability distributions is a non-trivial task. We provide an overview of this field, including an outline of the process of eliciting knowledge from experts in probabilistic form. We explore approaches to probabilistic uncertainty specification including direct elicitation and Bayesian analysis. In particular, we introduce the generic technique of elaboration and present a variety of forms of elaboration, illustrated with a series of examples. The methods are applied to the expression of uncertainty in a case study. Mechanistic models are built in just about every area of science and technology, to represent complex physical processes. They are used to predict, understand and control those processes, and increasingly play a role in national and international policy making. As such models gain higher prominence, recipients of their forecasts are increasingly demanding to know how accurate they are. There is therefore a growing interest in quantifying the uncertainties in model predictions. Uncertainty in model outputs, as representations of reality, arises from uncertainty about model inputs (such as initial conditions, external forcing variables and parameters in model equations) and from uncertainty about model structure. Our case study is based on the Sheffield Dynamic Global Vegetation Model (SDGVM), which is used to estimate the combined carbon flux from vegetation in England and Wales in a given year. The extent to which vegetation acts as a carbon sink is an important component of the debate about climate change. We show how different approaches were used to characterise uncertainty in vegetation model parameters, soil conditions and land cover.

Journal ArticleDOI
TL;DR: A new methodology and associated decision support tool is demonstrated that suggests the optimal location for placing BMPs to minimise diffuse surface water pollution at the catchment scale, by determining the trade-off among economic and multiple environmental objectives.
Abstract: The effort to manage diffuse pollution at the catchment scale is an ongoing challenge that needs to take into account trade-offs between environmental and economic objectives. Best Management Practices (BMPs) are gaining ground as a means to address the problem, but their application (and impact) is highly dependent on the characteristics of the crops and of the land in which they are to be applied. In this paper, we demonstrate a new methodology and associated decision support tool that suggests the optimal location for placing BMPs to minimise diffuse surface water pollution at the catchment scale, by determining the trade-off among economic and multiple environmental objectives. The decision support tool consists of a non-point source (NPS) pollution estimator, the SWAT (Soil and Water Assessment Tool) model, a genetic algorithm (GA), which serves as the optimisation engine for the selection and placement of BMPs across the agricultural land of the catchment, and of an empirical economic function for the estimation of the mean annual cost of BMP implementation. In the proposed decision support tool, SWAT was run a number of times equal to the number of tested BMPs, to predict nitrate nitrogen (N-NO3) and total phosphorus (TP) losses from all the agricultural Hydrologic Response Units (HRUs) and possible BMPs implemented on them. The results were then saved in a database which was subsequently used for the optimisation process. Fifty different BMPs, including sole or combined changes in livestock, crop, soil and nutrient application management in alfalfa, corn and pastureland fields, were evaluated in the reported application of the tool in a catchment in Greece, by solving a three-objective optimisation process (cost, TP and N-NO3). The relevant two-dimensional trade-off curves of cost-TP, cost-N-NO3 and N-NO3-TP are presented and discussed. The strictest environmental target, expressed as a 45% reduction of TP at the catchment outlet, which also resulted in a 25% reduction of the annual N-NO3 yield, was met at an affordable annual cost of 25 €/person by establishing an optimal combination of BMPs. The methodology could be used to assist in a more cost-effective implementation of environmental legislation.

Journal ArticleDOI
TL;DR: A decision support system (DSS) for assessing management strategies is proposed; a "business-as-usual" strategy and alternative site closure strategies are assessed using the DSS, and some results from a sensitivity analysis focussing on changes in preferences are presented.
Abstract: The management of recreational fishing requires resolving conflicting interests and is thus among the most controversial natural resource related issues. Decision making is difficult because of two main factors: first, there is a lack of prediction tools that help managers and other stakeholders assess the potential impacts of management changes; second, decisions or management strategies affect multiple social and ecological outcomes and picking the best among sets of multiple outcomes is a complex task. Resource management and stakeholder dialogue can be greatly improved by addressing these problems. In this paper, we propose a decision support system (DSS) for assessing management strategies. The DSS incorporates an integrated agent-based simulation model for tackling the first obstacle and an analytical hierarchy process (AHP)-fuzzy comprehensive evaluation approach to facilitate multi-criteria decision making. The agent-based simulation model incorporates recreational fishing behaviour within a reef ecosystem. Angler behaviour is driven by empirically estimated site choice models which link recreational choices to site attributes and angler characteristics. Coral reef ecosystem dynamics is modelled using a trophic-dynamic model describing the relationship among fish populations, fishing activities as well as algal and coral growth. The second component of the DSS, the AHP-fuzzy comprehensive evaluation part, allows one to combine resource managers' preferences with simulated economic and ecosystem outcomes in the assessment of alternative strategies. A fuzzy multi-criteria, multi-layer evaluation method is used to obtain final ranking. As a case study for this paper, we focus on the management of recreational fishing sites from the Ningaloo Marine Park, an iconic coral reef system in Western Australia. A set of management strategies, including a "business-as-usual" strategy and alternative site closure strategies, are assessed using the proposed DSS. The site closure strategies evaluated vary in length and timing. Further, these evaluations are undertaken for two fishing pressure scenarios (high and low). We illustrate the usefulness of the DSS by evaluating these strategies. We also present some results from a sensitivity analysis focussing on changes in preferences.
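The AHP component can be illustrated in a few lines of R: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix. The three criteria and the judgements below are invented for illustration only.

```r
# Hypothetical AHP pairwise comparison of three criteria
# (e.g. catch, coral cover, visitor satisfaction) on Saaty's 1-9 scale.
A <- matrix(c(1,   3,   5,
              1/3, 1,   2,
              1/5, 1/2, 1),
            nrow = 3, byrow = TRUE)

ev <- eigen(A)
w  <- Re(ev$vectors[, 1])
w  <- w / sum(w)                       # normalised criterion weights
w

# consistency ratio (random index 0.58 for a 3x3 matrix)
lambda_max <- Re(ev$values[1])
CI <- (lambda_max - 3) / (3 - 1)
CR <- CI / 0.58
CR                                     # values below ~0.1 are conventionally acceptable
```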

Journal ArticleDOI
TL;DR: New notation is introduced that describes the Sobol indices in terms of the Pearson correlation of outputs from pairs of runs, and introduces correction terms to remove some of the spurious correlation.
Abstract: Sensitivity analysis is a crucial tool in the development and evaluation of complex mathematical models. Sobol's method is a variance-based global sensitivity analysis technique that has been applied to computational models to assess the relative importance of input parameters on the output. This paper introduces new notation that describes the Sobol indices in terms of the Pearson correlation of outputs from pairs of runs, and introduces correction terms to remove some of the spurious correlation. A variety of estimation techniques are compared for accuracy and precision using the G function as a test case.
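The idea of reading a first-order Sobol index as a Pearson correlation between paired runs can be sketched in base R, with the G function used in the paper as the test case; the estimator below is the generic pick-freeze form, not necessarily the paper's corrected version.

```r
# First-order Sobol indices for the G function via the pick-freeze correlation idea.
# Two independent input samples are paired so that they share only input i;
# the Pearson correlation of the two output vectors then estimates S_i.
set.seed(42)

a <- c(0, 1, 4.5, 9)                               # G-function coefficients
g <- function(X) apply(X, 1, function(x) prod((abs(4 * x - 2) + a) / (1 + a)))

n <- 1e4
k <- length(a)
A <- matrix(runif(n * k), n, k)
B <- matrix(runif(n * k), n, k)

yA <- g(A)
S  <- numeric(k)
for (i in 1:k) {
  ABi <- B
  ABi[, i] <- A[, i]                               # pair of runs sharing only column i
  S[i] <- cor(yA, g(ABi))                          # naive correlation estimator of S_i
}
round(S, 2)                                        # small a[i] -> influential input
```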

Journal ArticleDOI
TL;DR: This thematic issue aims at providing a guide and reference for modellers in choosing appropriate emulation modelling approaches and understanding their features, and tools and applications of sensitivity analysis in the context of environmental modelling are addressed.
Abstract: Emulation (also denoted as metamodelling in the literature) is an important and expanding area of research and represents one of the major advances in the study of complex mathematical models, with applications ranging from model reduction to sensitivity analysis. Despite the stunning increase in computing power over recent decades, computational limitations remain a major barrier to the effective and systematic use of large-scale, process-based simulation models in rational environmental decision-making. Whereas complex models may provide clear advantages when the goal of the modelling exercise is to enhance our understanding of the natural processes, they introduce problems of model identifiability caused by over-parameterization and suffer from high computational burden when used in management and planning problems, i.e. when they are combined with optimization routines. Therefore, a combination of techniques for complex model reduction with procedures for data assimilation and learning-based control could help to bridge the gap between science and the operational use of models for decision-making. Furthermore sensitivity analysis is a well known and established tool for evaluating robustness of model based results in management and planning, and is often performed in tandem with emulation. Indeed, emulators provide an efficient means for doing a sensitivity analysis for large and expensive models. This thematic issue aims at providing a guide and reference for modellers in choosing appropriate emulation modelling approaches and understanding their features. Tools and applications of sensitivity analysis in the context of environmental modelling are also addressed, which is a typical complement of emulation in most applications. We hope that this thematic issue provides a useful benchmark in the academic literature for this important and expanding area of research, and will create an opportunity for dialogue between methodological and user-focused research.

Journal ArticleDOI
TL;DR: The use of R for reactive transport modeling is illustrated by three applications spanning several orders of magnitude with respect to spatial and temporal scales, which makes R extremely well suited for rapid model prototyping.
Abstract: The concentrations of many natural compounds are altered by chemical and biological transformations, and by physical processes such as adsorption and transport. Their fate can be predicted using reactive transport models that describe reaction and advective and dispersive movement of these components in their natural environment. Recently a number of software packages have been implemented in the open source software R that allow one to implement reactive transport models. Central to this is the ReacTran R-package, a comprehensive collection of functions for modeling reactive components that may be distributed over multiple phases, whose dynamics are coupled through biological and geochemical reactions, and that are transported in one-, two- or three-dimensional domains with simple geometries. Dedicated solution methods are provided in the R-packages deSolve and rootSolve. The modeling packages facilitate the simulation of reaction and transport of components for spatial scales ranging from micrometers to kilometers and spanning multiple time-scales. As they are influenced in similar ways, the same functions can solve biogeochemical models of the sediment, groundwater, rivers, estuaries, lakes or water columns, experimental setups, or even describe reaction and transport within flat, cylindrical or spherical bodies, such as organisms, aggregates, or the dispersion of individuals on flat surfaces, and so on. We illustrate the use of R for reactive transport modeling by three applications spanning several orders of magnitude with respect to spatial and temporal scales. They comprise (1) a model of an experimental flow-through sediment reactor, where fitting of so-called breakthrough curves is used to derive sulfate reduction rates in an estuarine sediment, (2) a conservative and reactive tracer addition experiment in a small stream, which implements the concept of river spiraling, and (3) a 2-D and 3-D model that describes oxygen dynamics in the upper layers of the sediment, interspersed with several hotspots of increased reaction intensities. The packages ReacTran, deSolve and rootSolve are implemented in the software R and thus available for all popular platforms (Linux, Windows, Mac). Models implemented using this software are short and easily readable, yet they are efficiently solved. This makes R extremely well suited for rapid model prototyping.
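A minimal 1-D example in the spirit of the paper, using ReacTran and rootSolve; the grid dimensions, diffusion coefficient and consumption rate are illustrative values, and argument usage should be checked against the package documentation.

```r
# Minimal 1-D steady-state oxygen model: diffusive transport with first-order
# consumption, in the style of ReacTran examples. Parameter values are illustrative.
library(ReacTran)   # also loads deSolve and rootSolve

grid <- setup.grid.1D(x.up = 0, L = 10, N = 100)   # 10 cm of sediment, 100 layers

O2model <- function(t, O2, parms) {
  tran <- tran.1D(C = O2, C.up = 300,              # fixed concentration at the interface
                  D = 1, v = 0, dx = grid)         # molecular diffusion only
  consumption <- 0.1 * O2                          # first-order O2 consumption
  list(dC = tran$dC - consumption)
}

# steady-state O2 depth profile solved with rootSolve
std <- steady.1D(y = rep(100, 100), func = O2model, parms = NULL,
                 nspec = 1, names = "O2")
head(std$y)
```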