
Showing papers in "Environmental Modelling and Software" in 2018


Journal ArticleDOI
TL;DR: Putting more thought into the method selection process and choosing the most appropriate method for the project can produce better results, according to expert opinion and a survey of modelers engaged in participatory processes.
Abstract: Various tools and methods are used in participatory modelling, at different stages of the process and for different purposes. This diversity can create challenges for stakeholders and modelers when selecting the ones most appropriate for their projects. We offer a systematic overview, assessment, and categorization of methods to assist modelers and stakeholders with their choices and decisions. Most available literature provides little justification or information on the reasons for the use of particular methods or tools in a given study. In most cases, it seems that the prior experience and skills of the modelers had a dominant effect on the selection of the methods used. While we have not found any real evidence of this approach being wrong, we do think that putting more thought into the method selection process and choosing the most appropriate method for the project can produce better results. Based on expert opinion and a survey of modelers engaged in participatory processes, we offer practical guidelines to improve decisions about method selection at different stages of the participatory modeling process.

236 citations


Journal ArticleDOI
TL;DR: Differences between random k-fold and target-oriented CV indicate spatial over-fitting caused by misleading variables, and a forward feature selection in conjunction with target-oriented CV is proposed to decrease over-fitting.
Abstract: The importance of target-oriented validation strategies for spatio-temporal prediction models is illustrated using two case studies: (1) modelling of air temperature (Tair) in Antarctica, and (2) modelling of volumetric water content (VW) for the R.J. Cook Agronomy Farm, USA. The performance of a random k-fold cross-validation (CV) was compared to three target-oriented strategies: Leave-Location-Out (LLO), Leave-Time-Out (LTO), and Leave-Location-and-Time-Out (LLTO) CV. Results indicate considerable differences between random k-fold (R² = 0.9 for Tair and 0.92 for VW) and target-oriented CV (LLO R² = 0.24 for Tair and 0.49 for VW), highlighting the need for target-oriented validation to avoid an overoptimistic view of model performance. The differences between random k-fold and target-oriented CV indicate spatial over-fitting caused by misleading variables. To decrease over-fitting, a forward feature selection in conjunction with target-oriented CV is proposed; it decreased over-fitting and simultaneously improved target-oriented performance (LLO CV R² = 0.47 for Tair and 0.55 for VW).
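
The contrast between random and location-aware cross-validation is easy to reproduce with standard tooling. Below is a minimal, hedged sketch using scikit-learn's GroupKFold as a Leave-Location-Out splitter; the synthetic data, the "station" grouping and the random-forest learner are illustrative assumptions, not the paper's setup.

```python
# Hedged sketch: random k-fold vs. Leave-Location-Out (LLO) cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n = 500
station = rng.integers(0, 10, n)                 # location each sample came from
offsets = rng.normal(scale=1.0, size=10)         # station-specific effects
# Station id enters as a "misleading" predictor: informative within known
# stations, useless for unsampled ones.
X = np.column_stack([rng.normal(size=(n, 4)), station])
y = X[:, 0] + offsets[station] + rng.normal(scale=0.1, size=n)

model = RandomForestRegressor(n_estimators=100, random_state=0)

# Random k-fold: samples from one station land in both train and test folds,
# so location-specific signal leaks and the score looks overoptimistic.
r2_kfold = cross_val_score(model, X, y, scoring="r2",
                           cv=KFold(5, shuffle=True, random_state=0)).mean()

# Leave-Location-Out: whole stations are withheld, mimicking prediction
# at unsampled locations.
r2_llo = cross_val_score(model, X, y, scoring="r2",
                         cv=GroupKFold(5), groups=station).mean()

print(f"random k-fold R2 = {r2_kfold:.2f}, LLO R2 = {r2_llo:.2f}")
```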

217 citations


Journal ArticleDOI
TL;DR: Applications are presented to illustrate UMEP's potential in the identification of heat waves and cold waves; the effect of green infrastructure on runoff; the effects of buildings on human thermal stress; solar energy production; and the impact of human activities on heat emissions.
Abstract: UMEP (Urban Multi-scale Environmental Predictor), a city-based climate service tool, combines models and tools essential for climate simulations. Applications are presented to illustrate UMEP's potential in the identification of heat waves and cold waves; the effect of green infrastructure on runoff; the effects of buildings on human thermal stress; solar energy production; and the impact of human activities on heat emissions. UMEP has broad utility for applications related to outdoor thermal comfort, wind, urban energy consumption and climate change mitigation. It includes tools that enable users to input atmospheric and surface data from multiple sources, characterise the urban environment, prepare meteorological data for use in cities, undertake simulations and consider scenarios, and compare and visualise different combinations of climate indicators. An open-source tool, UMEP is designed to be easily updated as new data and tools are developed, and to be accessible to researchers, decision-makers and practitioners.

160 citations


Journal ArticleDOI
TL;DR: This manuscript outlines the approach taken and lessons learnt in the new, modern rewrite of APSIM, an agricultural modelling framework used extensively worldwide.
Abstract: Since 1990, the Agricultural Production Systems sIMulator (APSIM) has grown from a field-focused farming systems framework used by a small number of people into a large collection of models used by thousands of modellers internationally. The software grew to several hundred thousand lines of code in multiple programming languages, creating a large, complex software ecosystem that is difficult to maintain. In addition, systems modellers increasingly require software that integrates multiple disciplines, can represent ever more complex farming systems, can run on multiple platforms (desktop, web, mobile), can operate at or be adjusted to multiple temporal and spatial scales (field, farm, region, continent, global), and runs faster for larger simulation analyses. This is difficult to achieve in an ageing framework. For these reasons, the APSIM Initiative is building the next generation of APSIM. This manuscript outlines the approach taken and the lessons learnt, chief among them that a good software process is important in all model development activities.

159 citations


Journal ArticleDOI
TL;DR: A new holistic framework for using information collected from multiple sources to set the parameters of a 2D flood model is developed; the results indicate that the representation of urban micro-features is critical to the accuracy of modelling results.
Abstract: High-accuracy models are required for informed decision making in urban flood management. This paper develops a new holistic framework for using information collected from multiple sources to set the parameters of a 2D flood model, illustrating the importance of identifying key urban features in the terrain data for capturing high-resolution flood processes. A Cellular Automata-based model, CADDIES, was used to simulate surface-water flood inundation. Existing reports and flood photos obtained via social media were used to set model parameters and to investigate different approaches to representing infiltration and drainage-system capacity in urban flood modelling. The results of different approaches to processing terrain datasets indicate that the representation of urban micro-features is critical to the accuracy of modelling results. The constant-infiltration approach represents soil infiltration and drainage capacity better than the rainfall-reduction approach, as it better describes the flood recession process. This study provides in-depth insight into high-resolution flood modelling.
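
For intuition, the sketch below implements a toy cellular-automata water-spreading step of the kind such models build on. This is not the CADDIES rule set: the transfer rule, rates and periodic (np.roll) boundaries are illustrative assumptions.

```python
# Toy cellular-automata surface-water spreading step (illustrative only).
import numpy as np

def ca_step(z, d, frac=0.5):
    """Move water from each cell towards lower-head neighbours.
    z: ground elevation grid (m); d: water depth grid (m)."""
    h = z + d                                      # water-surface head
    new_d = d.copy()
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = np.roll(h, (di, dj), axis=(0, 1))     # neighbour head
        drop = np.clip(h - nb, 0.0, None)          # positive head difference
        flux = np.minimum(frac * drop / 4, d / 4)  # outflow, capped by storage
        new_d -= flux                              # water leaves this cell...
        new_d += np.roll(flux, (-di, -dj), axis=(0, 1))  # ...enters neighbour
    return new_d

z = np.add.outer(np.linspace(5.0, 0.0, 50), np.zeros(50))  # planar slope
d = np.zeros((50, 50)); d[5, 25] = 2.0                     # point inflow
for _ in range(100):
    d = ca_step(z, d)                              # water spreads downslope
```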

132 citations


Journal ArticleDOI
TL;DR: SAGA wetness index showed promising ability to distinguish wetland environments, and in combination with Sentinel-1 and 2 synergies can successfully produce a land use and land cover classification in a location where both wetland and non-wetland classes exist.
Abstract: In this work, the synergistic use of Sentinel-1 and 2 combined with the System for Automated Geoscientific Analyses (SAGA) Wetness Index is evaluated in the context of land use/cover (LULC) mapping, with emphasis on wetlands. A further objective has been to develop a new Object-Based Image Analysis (OBIA) approach for mapping wetland areas using Sentinel-1 and 2 data, where the latter is also tested against two popular machine learning algorithms (Support Vector Machines - SVMs and Random Forests - RFs). The highly vulnerable iSimangaliso Wetland Park was used as the study site. Results showed that two-part image segmentation could efficiently create object features across the study area. For both classification algorithms, an increase in overall accuracy was observed when the full synergistic combination of available datasets was used. A statistically significant difference in classification accuracy at all levels between SVMs and RFs was also reported, with the latter being up to 2.4% higher. The SAGA Wetness Index showed promising ability to distinguish wetland environments and, in combination with Sentinel-1 and 2 synergies, can successfully produce a land use and land cover classification in a location where both wetland and non-wetland classes exist.
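
The classifier comparison itself follows a standard cross-validation pattern. A hedged sketch with scikit-learn, using synthetic stand-ins for the segmentation-derived object features (the dataset, feature counts and hyperparameters are assumptions, not the paper's configuration):

```python
# Comparing Random Forests and SVMs on (synthetic) object features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for per-object spectral/textural features and LULC class labels.
X, y = make_classification(n_samples=600, n_features=12, n_informative=6,
                           n_classes=4, random_state=0)

rf = RandomForestClassifier(n_estimators=300, random_state=0)
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))

for name, clf in [("RF", rf), ("SVM", svm)]:
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: overall accuracy = {acc:.3f}")
```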

129 citations


Journal ArticleDOI
TL;DR: The aim of this paper is to describe state-of-the-art computer-based techniques for data analysis to improve the operation of wastewater treatment plants; several limitations that currently prevent the application of computer-based techniques in practice are highlighted.
Abstract: The aim of this paper is to describe state-of-the-art computer-based techniques for data analysis to improve the operation of wastewater treatment plants. A comprehensive review of peer-reviewed papers shows that European researchers have led academic computer-based method development over the last two decades. The most cited techniques are artificial neural networks, principal component analysis, fuzzy logic, clustering, independent component analysis and partial least squares regression. Even though there has been progress on techniques related to the development of environmental decision support systems, knowledge discovery and management, the research sector is still far from delivering systems that smoothly integrate several types of knowledge and different methods of reasoning. Several limitations that currently prevent the application of computer-based techniques in practice are highlighted.

121 citations


Journal ArticleDOI
TL;DR: BenMAP-CE is a publicly available, PC-based open source software program that can be configured to conduct health impact assessments to inform air quality policies anywhere in the world.
Abstract: A number of software tools exist to estimate the health and economic impacts associated with air quality changes. Over the past 15 years, the U.S. Environmental Protection Agency and its partners invested substantial time and resources in developing the Environmental Benefits Mapping and Analysis Program - Community Edition (BenMAP-CE). BenMAP-CE is a publicly available, PC-based open source software program that can be configured to conduct health impact assessments to inform air quality policies anywhere in the world. The developers coded the platform in C# and made the source code available in GitHub, with the goal of building a collaborative relationship with programmers with expertise in other environmental modeling programs. The team recently improved the BenMAP-CE user experience and incorporated new features, while also building a cadre of analysts and BenMAP-CE training instructors in Latin America and Southeast Asia.

114 citations


Journal ArticleDOI
TL;DR: This study provides an overview of optimization methods used for targeting land use decisions in agricultural areas, exploring their relative abilities for the integration of stakeholders and the identification of ecosystem service trade-offs since these are especially pertinent to land use planners.
Abstract: Optimal land use allocation with the intention of ecosystem services provision and biodiversity conservation is one of the key challenges in agricultural management. Optimization techniques have been especially prevalent for solving land use problems; however, there is no guideline supporting the selection of an appropriate method. To enhance the applicability of optimization techniques for real-world case studies, this study provides an overview of optimization methods used for targeting land use decisions in agricultural areas. We explore their relative abilities for the integration of stakeholders and the identification of ecosystem service trade-offs since these are especially pertinent to land use planners. Finally, we provide recommendations for the use of the different optimization methods. For example, scalarization methods (e.g., reference point methods, tabu search) are particularly useful for a priori or interactive stakeholder integration; whereas Pareto-based approaches (e.g., evolutionary algorithms) are appropriate for trade-off analyses and a posteriori stakeholder involvement.
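
The distinction between scalarization and Pareto-based approaches can be made concrete with a toy allocation problem. In the sketch below (all parcel scores and weights are made-up numbers), a weighted sum returns the single solution implied by a priori preferences, while a Pareto filter keeps the whole trade-off set for a posteriori exploration:

```python
# Toy two-objective land-use allocation: scalarisation vs. Pareto filtering.
import itertools
import numpy as np

parcels = 8
yield_score = np.array([5, 3, 4, 2, 6, 1, 3, 4])    # food production if farmed
habitat_score = np.array([2, 5, 3, 6, 1, 6, 4, 3])  # biodiversity if conserved

# Enumerate every allocation: 1 = farm the parcel, 0 = conserve it.
solutions = []
for mask in itertools.product([0, 1], repeat=parcels):
    m = np.array(mask)
    solutions.append((yield_score @ m, habitat_score @ (1 - m)))
solutions = np.array(solutions)

# Scalarisation: a decision maker weights the objectives up front.
w = 0.7
best = solutions[np.argmax(w * solutions[:, 0] + (1 - w) * solutions[:, 1])]

# Pareto filter: keep every solution not dominated in both objectives.
pareto = [s for s in solutions
          if not any(t[0] >= s[0] and t[1] >= s[1] and tuple(t) != tuple(s)
                     for t in solutions)]
print("weighted-sum pick:", best, "| Pareto set size:", len(pareto))
```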

98 citations


Journal ArticleDOI
TL;DR: Two stochastic approaches for wildfire susceptibility mapping are compared against a well-established deterministic method; the results closely match those of the deterministic method, with the advantage of not depending on a priori knowledge of the phenomenon.
Abstract: Wildfire susceptibility is a measure of land propensity for the occurrence of wildfires based on the terrain's intrinsic characteristics. In the present study, two stochastic approaches (i.e., extreme learning machine and random forest) for wildfire susceptibility mapping are compared against a well-established deterministic method. The same predisposing variables were combined and used as predictors in all models. The Portuguese region of Dão-Lafões was selected as a pilot site since it presents national-average values of fire incidence and high heterogeneity in land cover and slope. Maps representing the susceptibility of the study area to wildfires were then elaborated. Two measures were used to compare the different methods, namely the location of pixels with similar standardized susceptibility and the total validation burnt area. Results obtained with the stochastic methods closely match those of the deterministic one, with the advantage of not depending on a priori knowledge of the phenomenon.

89 citations


Journal ArticleDOI
TL;DR: In this paper, the authors developed a comprehensive tool, CTRL-T (Calculation of Thresholds for Rainfall-induced Landslides Tool), that automatically and objectively reconstructs rainfall events and the triggering conditions responsible for failures, and calculates rainfall thresholds at different exceedance probabilities.
Abstract: Empirical rainfall thresholds are commonly used to forecast landslide occurrence over wide areas. Thresholds are affected by several uncertainties related to the accuracy of the rainfall and landslide information, the reconstruction of the rainfall responsible for the failure, and the method used to calculate the thresholds. This limits the use of thresholds in landslide early warning systems. To address this problem, we developed a comprehensive tool, CTRL-T (Calculation of Thresholds for Rainfall-induced Landslides Tool), that automatically and objectively reconstructs rainfall events and the triggering conditions responsible for the failure, and calculates rainfall thresholds at different exceedance probabilities. CTRL-T uses a set of adjustable parameters to account for different morphological and climatic settings. We tested CTRL-T in the Liguria region (Italy), which is highly prone to landslides. We expect CTRL-T to have an impact on the definition of rainfall thresholds in Italy and elsewhere, and on the reduction of the risk posed by rainfall-induced landslides.
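
A core step of such tools is fitting a power-law threshold E = α·D^γ (cumulated rainfall vs. duration) at a chosen exceedance probability. A minimal sketch of the frequentist idea follows; the synthetic (duration, rainfall) pairs and the 5% level are illustrative assumptions, not CTRL-T's implementation.

```python
# Fitting a power-law rainfall threshold at a given exceedance probability.
import numpy as np

rng = np.random.default_rng(1)
D = rng.uniform(1, 100, 200)                    # event duration (h), synthetic
E = 5 * D**0.5 * rng.lognormal(0, 0.3, 200)     # cumulated rainfall (mm)

# Linear fit in log-log space: log E = log a + g * log D
g, log_a = np.polyfit(np.log(D), np.log(E), 1)
resid = np.log(E) - (log_a + g * np.log(D))

# Shift the intercept down so that, e.g., 5% of triggering events fall below
# the curve (a "T5" threshold).
log_a5 = log_a + np.quantile(resid, 0.05)
print(f"T5 threshold: E = {np.exp(log_a5):.2f} * D^{g:.2f}")
```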

Journal ArticleDOI
TL;DR: This study develops and assesses a one-dimensional lake model, applies it to 32 lakes from a global observatory network, and provides guidance on where the general model approach and associated assumptions work and where adjustments to model parameterisations and/or structure are required.
Abstract: The modelling community has identified challenges for the integration and assessment of lake models due to the diversity of modelling approaches and lakes. In this study, we develop and assess a one-dimensional lake model and apply it to 32 lakes from a global observatory network. The data set included lakes over broad ranges in latitude, climatic zone, size, residence time, mixing regime and trophic level. Model performance was evaluated using several error assessment metrics, and a sensitivity analysis was conducted for nine parameters that governed the surface heat exchange and mixing efficiency. There was low correlation between input data uncertainty and model performance, and predictions of temperature were less sensitive to model parameters than predictions of thermocline depth and Schmidt stability. The study provides guidance on where the general model approach and associated assumptions work, and on cases where adjustments to model parameterisations and/or structure are required.

Journal ArticleDOI
TL;DR: An open-source schematized 2D area model (Delft3D) that couples intertidal flow, wave action, sediment transport and geomorphological development with a population dynamics approach, including temporal and spatial growth of vegetation and bio-accumulation, is applied to fundamentally assess the resilience of salt marsh-mudflat systems under sea level rise.
Abstract: This paper aims to fundamentally assess the resilience of salt marsh-mudflat systems under sea level rise. We applied an open-source schematized 2D area model (Delft3D) that couples intertidal flow, wave action, sediment transport and geomorphological development with a population dynamics approach including temporal and spatial growth of vegetation and bio-accumulation. Wave action maintains a high sediment concentration on the mudflat, while the tidal motion transports the sediments into the vegetated marsh areas during flood. The marsh-mudflat system attained dynamic equilibrium within 120 years. Sediment deposition and bio-accumulation within the marsh make the system initially resilient to sea level rise scenarios. However, after 50–60 years the marsh system starts to drown, with vegetated levees being the last surviving features. Biomass accumulation and sediment supply are critical determinants of the marsh drowning rate and survival. Our model methodology can be applied to assess the resilience of vegetated coastlines and combined engineering solutions for long-term sustainability.

Journal ArticleDOI
TL;DR: Results show that the model can recreate the impact of shallow landslides, debris flow runout and debris floods with acceptable accuracy; general patterns in slope failure and runout are well predicted, leading to a fully physically based prediction of rainfall-induced debris flood behavior in the downstream areas.
Abstract: An integrated modeling method for shallow landslides, debris flows and catchment hydrology is developed and presented in this paper. Existing two-phase debris flow equations and an adaptation of the infinite slope method are coupled with a full hydrological catchment model. We test the approach on the 4 km² Scaletta catchment, North-Eastern Sicily, where the convective storm of 1 October 2009 caused debris flooding after 395 shallow landslides. Validation is based on the landslide inventory and photographic evidence from the days after the event. Results show that the model can recreate the impact of shallow landslides, debris flow runout and debris floods with acceptable accuracy (91 percent inventory overlap with a Cohen's Kappa of 0.22). General patterns in slope failure and runout are well predicted, leading to a fully physically based prediction of rainfall-induced debris flood behavior in the downstream areas, such as the creation of a debris fan at the coastal outlet.

Journal ArticleDOI
TL;DR: An alternative approximation procedure is presented that makes PAWN applicable to a generic sample of inputs and outputs while requiring only one tuning parameter and allows the user to estimate PAWN indices as complementary metrics in multi-method GSA applications without additional computational cost.
Abstract: In a previous paper we introduced a distribution-based method for Global Sensitivity Analysis (GSA), called PAWN, which uses cumulative distribution functions of model outputs to assess their sensitivity to the model's uncertain input factors. Over the last three years, PAWN has been employed in the environmental modelling field as a useful alternative or complement to more established variance-based methods. However, a major limitation of PAWN up to now was the need for a tailored sampling strategy to approximate the sensitivity indices. Furthermore, this strategy required three tuning parameters whose optimal choice was rather unclear. In this paper, we present an alternative approximation procedure that tackles both issues and makes PAWN applicable to a generic sample of inputs and outputs while requiring only one tuning parameter. The new implementation therefore allows the user to estimate PAWN indices as complementary metrics in multi-method GSA applications without additional computational cost.
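
The generic-sample approximation is straightforward to prototype. Below is a minimal sketch, assuming equal-frequency conditioning intervals and SciPy's two-sample Kolmogorov-Smirnov statistic; the test model, sample sizes and median aggregation are illustrative choices, not the paper's exact procedure.

```python
# Generic-sample PAWN approximation: split each input's range into n intervals
# (the single tuning parameter), compare conditional vs. unconditional output
# CDFs with the KS statistic, and aggregate over intervals.
import numpy as np
from scipy.stats import ks_2samp

def pawn_indices(X, y, n=10, stat=np.median):
    idx = []
    for i in range(X.shape[1]):
        edges = np.quantile(X[:, i], np.linspace(0, 1, n + 1))
        ks = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (X[:, i] >= lo) & (X[:, i] <= hi)
            if mask.sum() > 1:
                # KS distance between conditional and unconditional outputs.
                ks.append(ks_2samp(y[mask], y).statistic)
        idx.append(stat(ks))
    return np.array(idx)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2000, 3))
y = 2 * X[:, 0] + 0.5 * X[:, 1] ** 2      # x3 is non-influential
print(pawn_indices(X, y))                 # largest index for x1, smallest for x3
```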

Journal ArticleDOI
TL;DR: The origins and a brief history of Data Science are presented, revisit prior efforts to define Data Science and provide a more modern, working definition, and the new professional profile of a data scientist is described.
Abstract: Environmental data are growing in complexity, size, and resolution. Addressing the types of large, multidisciplinary problems faced by today's environmental scientists requires the ability to leverage available data and information to inform decision making. Successfully synthesizing heterogeneous data from multiple sources to support holistic analyses and extraction of new knowledge requires application of Data Science. In this paper, we present the origins and a brief history of Data Science. We revisit prior efforts to define Data Science and provide a more modern, working definition. We describe the new professional profile of a data scientist and new and emerging applications of Data Science within Environmental Sciences. We conclude with a discussion of current challenges for Environmental Data Science and suggest a path forward.

Journal ArticleDOI
TL;DR: An open-source, scalable and model-independent (non-intrusive) implementation of an iterative ensemble smoother has been developed to alleviate the computational burden associated with history-matching and uncertainty quantification of real-world-scale environmental models that have very high dimensional parameter spaces.
Abstract: An open-source, scalable and model-independent (non-intrusive) implementation of an iterative ensemble smoother has been developed to alleviate the computational burden associated with history-matching and uncertainty quantification of real-world-scale environmental models that have very high dimensional parameter spaces. The tool, named pestpp-ies, implements the ensemble-smoother form of the popular Gauss-Levenberg-Marquardt algorithm, uses the PEST model-interface protocols and includes a built-in parallel run manager, multiple-lambda testing and model run failure tolerance. As a demonstration of its capabilities, pestpp-ies is applied to a synthetic groundwater model with thousands of parameters and to a real-world groundwater flow and transport model with tens of thousands of parameters. pestpp-ies is shown to efficiently and effectively condition parameters in both cases and can provide a means to estimate posterior forecast uncertainty when the forecasts depend on large numbers of parameters.
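
For intuition, a single damped ensemble-smoother update can be sketched in a few lines. This is a simplified reading of the Gauss-Levenberg-Marquardt idea (lambda inflating the output covariance), not pestpp-ies itself, which adds multiple-lambda testing, a parallel run manager and failure tolerance; every name and shape below is an illustrative assumption.

```python
# One damped ensemble-smoother update step (sketch only).
import numpy as np

def es_update(M, D_sim, d_obs, obs_err_sd, lam=1.0, seed=0):
    """M: (n_par, Ne) parameter ensemble; D_sim: (n_obs, Ne) simulated
    observations; d_obs: (n_obs,) field observations."""
    n_obs, Ne = D_sim.shape
    sd = np.asarray(obs_err_sd, dtype=float)
    dM = M - M.mean(axis=1, keepdims=True)        # parameter anomalies
    dD = D_sim - D_sim.mean(axis=1, keepdims=True)
    C_md = dM @ dD.T / (Ne - 1)                   # parameter-output covariance
    C_dd = dD @ dD.T / (Ne - 1)                   # output covariance
    R = np.diag(sd ** 2)                          # observation-noise covariance
    rng = np.random.default_rng(seed)
    # Perturbed observations, one realisation per ensemble member.
    D_obs = d_obs[:, None] + sd[:, None] * rng.normal(size=(n_obs, Ne))
    # Inflating C_dd by (1 + lambda) damps the step -- the Marquardt idea.
    return M + C_md @ np.linalg.solve((1 + lam) * C_dd + R, D_obs - D_sim)
```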

Journal ArticleDOI
TL;DR: This paper provides a vision of the required transformative process and features of an integrated multi-utility service provider covering the system architecture, opportunities and benefits, impediments and strategies, and business opportunities.
Abstract: Advanced metering technologies coupled with informatics create an opportunity to form digital multi-utility service providers. These providers will be able to concurrently collect customers' medium-to-high-resolution water, electricity and gas demand data and provide user-friendly platforms to feed this information back to customers and supply/distribution utility organisations. Providers that can install low-cost integrative systems will reap the benefits of derived operational synergies and access to mass markets not bounded by historical city, state or country limits. This paper provides a vision of the required transformative process and the features of an integrated multi-utility service provider, covering the system architecture, opportunities and benefits, impediments and strategies, and business opportunities. The heart of the paper is focused on demonstrating data modelling processes and informatics opportunities for contemporaneously collected demand data, through illustrative examples and four informative water-energy nexus case studies. Finally, the paper provides an overview of the transformative R&D priorities needed to realise the vision.

Journal ArticleDOI
TL;DR: The aim of this research is to provide infrastructure planners with a detailed understanding of how granular data generated by an intelligent water management system (Autoflow©) can be utilised to obtain significant efficiencies throughout the urban water cycle, from supply and distribution to customer engagement and even wastewater treatment.
Abstract: Current practice for the design of an urban water system usually relies on various models that are often founded on a number of assumptions about how bulk water consumption is attributed to customer connections, and on outdated demand information that does not reflect present consumption trends, meaning infrastructure is often unnecessarily overdesigned. The recent advent of high-resolution smart water meters and advanced data analytics allows for a new era of using the continuous 'big data' generated by these meter fleets to create an intelligent system for urban water management that overcomes this problem. The aim of this research is to provide infrastructure planners with a detailed understanding of how granular data generated by an intelligent water management system (Autoflow©) can be utilised to obtain significant efficiencies throughout the urban water cycle, from supply and distribution to customer engagement and even wastewater treatment.

Journal ArticleDOI
TL;DR: ‘meteoland’ is presented, an R package that integrates several tools to facilitate the estimation of daily weather over landscapes, both under current and future conditions, and contains functions to interpolate daily weather including topographic effects.
Abstract: High-resolution meteorological data are necessary to understand and predict climate-driven impacts on the structure and function of terrestrial ecosystems. However, the spatial resolution of climate reanalysis data and climate model outputs is often too coarse for studies at local/landscape scales. Additionally, climate model projections usually contain important biases, requiring the application of statistical corrections. Here we present ‘meteoland’, an R package that integrates several tools to facilitate the estimation of daily weather over landscapes, both under current and future conditions. The package contains functions: (1) to interpolate daily weather including topographic effects; and (2) to correct the biases of a given weather series (e.g., climate model outputs). We illustrate and validate the functions of the package using weather station data from Catalonia (NE Spain), re-analysis data and climate model outputs for a specific county. We conclude with a discussion of current limitations and potential improvements of the package.

Journal ArticleDOI
TL;DR: A reduced-complexity numerical model developed to simulate shoreline evolution along wave-dominated sandy coasts, LX-Shore, opens new perspectives in terms of knowledge on the primary mechanisms locally driving shoreline change and for ensemble-based simulations of future shoreline evolution.
Abstract: A reduced-complexity numerical model, LX-Shore, is developed to simulate shoreline evolution along wave-dominated sandy coasts. The model can handle any sandy shoreline geometries (e.g. sand spits, islands), including non-erodible areas such as coastal defenses and headlands, and is coupled with a spectral wave model to cope with complex nearshore wave fields. Shoreline change is primarily driven by the gradients in total longshore sediment transport and by the cross-shore transport owing to variability in incident wave energy. Application to academic cases and a real coast highlights the potential of LX-Shore to simulate shoreline change on timescales from hours (storm) to decades with low computational cost. LX-Shore opens new perspectives in terms of knowledge on the primary mechanisms locally driving shoreline change and for ensemble-based simulations of future shoreline evolution.
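
The driving principle, shoreline change from alongshore gradients in transport, is captured by the classic one-line equation dy/dt = -(1/Dc)·dQ/dx. The sketch below is a bare-bones explicit step for that equation only; LX-Shore itself couples a spectral wave model and handles arbitrary geometry, and the constants and CERC-type transport formula here are illustrative assumptions.

```python
# Bare-bones "one-line" shoreline evolution step.
import numpy as np

def one_line_step(y, wave_angle, dx=100.0, dt=3600.0, K=0.5, Dc=10.0):
    """y: shoreline position (m) at alongshore nodes; wave_angle: wave
    direction relative to shore-normal (rad); Dc: closure depth (m)."""
    shore_angle = np.arctan(np.gradient(y, dx))   # local shoreline orientation
    alpha = wave_angle - shore_angle              # relative wave angle
    Q = K * np.sin(2 * alpha)                     # CERC-type transport (m3/s)
    dQdx = np.gradient(Q, dx)                     # alongshore gradient
    return y - dt * dQdx / Dc                     # sediment mass conservation

y = np.zeros(200)
y[90:110] = 20.0                                  # a sandy bump in the shoreline
for _ in range(24):
    y = one_line_step(y, wave_angle=0.2)          # bump diffuses and migrates
```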

Journal ArticleDOI
TL;DR: The FREEWAT platform couples the power of GIS geo-processing and post-processing tools in spatial data analysis with that of process-based simulation models, and provides a database framework and visualization capabilities for hydrochemical analysis.
Abstract: Integrating advanced simulation techniques and data analysis tools in a freeware Geographic Information System (GIS) provides a valuable contribution to the management of conjunctive use of groundwater (the world's largest freshwater resource) and surface-water. To this aim, we describe here the FREEWAT (FREE and open source software tools for WATer resource management) platform. FREEWAT is a free and open source, QGIS-integrated interface for planning and management of water resources, with specific attention to groundwater. The FREEWAT platform couples the power of GIS geo-processing and post-processing tools in spatial data analysis with that of process-based simulation models. The FREEWAT environment allows storage of large spatial datasets, data management and visualization, and running of several distributed modelling codes (mainly belonging to the MODFLOW family). It simulates hydrologic and transport processes, and provides a database framework and visualization capabilities for hydrochemical analysis. Examples of real case study applications are provided.

Journal ArticleDOI
TL;DR: The City Catchment Analysis Tool is a novel software system for rapid assessment of combined pluvial and fluvial flood risk using a unique combination of efficient software architecture throughout and especially in the numerical part, use of standard, readily available data sets, and robust and accurate solutions of the flow equations.
Abstract: The City Catchment Analysis Tool (CityCAT) is a novel software system for rapid assessment of combined pluvial and fluvial flood risk using a unique combination of: efficient software architecture throughout, and especially in the numerical part; use of standard, readily available data sets; efficient algorithms for grid generation; and robust and accurate solutions of the flow equations. It is based on advanced software architecture and accurate solutions for complex free-surface flow over the terrain, distinguishing between permeable and impermeable surfaces and taking into account the effects of man-made features such as buildings as obstacles to flow. The software is first rigorously validated with demanding test cases based on analytical solutions and laboratory studies. Then the unique capability for assessment of the effectiveness of specific flood alleviation interventions across large urban domains, such as roof storage on buildings or the introduction of permeable surfaces, is demonstrated.

Journal ArticleDOI
TL;DR: A hybrid parallel code, H12, is developed for 1D-2D coupled urban flood modelling that enables street-resolving hyper-resolution simulation over a large area by combining Open Multi-Processing (OpenMP) and Message Passing Interface (MPI) parallelization.
Abstract: Coupled 1D-2D modelling is a widely used approach to predict water movement in complicated surface and subsurface drainage systems in urban or peri-urban areas. In this study, a hybrid parallel code, H12, is developed for 1D-2D coupled urban flood modelling. Hybrid-1D-2D, or H12, enables street-resolving hyper-resolution simulation over a large area by combining Open Multi-Processing (OpenMP) and Message Passing Interface (MPI) parallelization. Variable grid sizing is adopted for detailed geometric representation of urban surfaces as well as efficient computation. To assess the capability of H12, simulation experiments were carried out for the Johnson Creek Catchment (∼40 km²) in Arlington, Texas. A LiDAR-derived digital elevation model (DEM) and a detailed land cover map at 1-m resolution are used to represent the terrain and urban features in flood modelling. Hybrid parallelization achieves up to a 79-fold reduction in simulation time compared to the serial run and is more efficient than either OpenMP or MPI alone, especially in hyper-resolution simulations.

Journal ArticleDOI
TL;DR: It is argued that no existing game allows for preference elicitation, one of the most challenging steps of Multi-Criteria Decision Analysis (MCDA), and many research opportunities for behavioral operational research are proposed.
Abstract: Serious games and gamification are nowadays pervasive. They are used to communicate about science and sometimes to involve citizens in science (e.g., citizen science). Concurrently, environmental decision analysis is challenged by the high cognitive load of the decision-making process and by possible biases threatening its rationality assumptions. Difficult decision-making processes can result in incomplete preference construction and are generally limited to few participants. We reviewed 43 serious games and gamified applications related to water. They cover a broad diversity of serious games, a diversity that may be explained by the still unsettled terminology in the research area of gamification and serious gaming. We discuss how existing games could benefit the early steps of Multi-Criteria Decision Analysis (MCDA), including problem structuring, stakeholder analysis, defining objectives, and exploring alternatives. We argue that no existing game allows for preference elicitation, one of the most challenging steps of MCDA. We propose many research opportunities for behavioral operational research.

Journal ArticleDOI
TL;DR: A real-time immersive prototype MAR app for on-site content authoring and flood visualisation is developed, combining available technologies to reduce implementation complexity and to understand how the app is judged by water experts.
Abstract: Mobile Augmented Reality (MAR) for environmental planning and design has hardly been touched upon, yet mobile smart devices are now capable of complex, interactive, and immersive real-time visualisations. We present a real-time immersive prototype MAR app for on-site content authoring and flood visualisation, combining available technologies to reduce implementation complexity. Networked access to live sensor readings provides rich real-time annotations. Our main goal was to develop a novel MAR app to complement existing flood risk management (FRM) tools and to understand how it is judged by water experts. We present the app's development in the context of the literature and conduct a small user study. Going beyond the presented work, the flexibility of the app permits a broad range of applications in planning, design and environmental management.

Journal ArticleDOI
TL;DR: STREaM, a STochastic Residential water End-use Model that generates synthetic water end-use time series at 10-s and progressively coarser sampling resolutions, is presented; results show that increased sampling resolution allows more accurate end-use disaggregation, prompt water leakage detection, and accurate and timely estimates of peak demand.
Abstract: Understanding the trade-off between the information content of high-resolution water use data and the cost of the smart meters needed to collect data at sub-minute resolution is crucial to inform smart meter networks. To explore this trade-off, we first present STREaM, a STochastic Residential water End-use Model that generates synthetic water end-use time series at 10-s and progressively coarser sampling resolutions. Second, we apply a comparative framework to STREaM output and assess the impact of data sampling resolution on end-use disaggregation, post-meter leak detection, peak demand estimation, data storage, and meter availability. Our findings show that increased sampling resolution allows more accurate end-use disaggregation, prompt water leakage detection, and accurate and timely estimates of peak demand. At the same time, data storage requirements and limited product availability mean most large-scale commercial smart metering deployments sense data at hourly, daily, or coarser sampling frequencies. Overall, this work provides insights for further research and for commercial deployment of smart water meters.
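
The resolution trade-off is easy to illustrate with a downsampling experiment. The sketch below stands in for STREaM output with a random 10-second demand trace (the gamma draw and all numbers are assumptions) and shows how the apparent peak demand flattens as the sampling window widens:

```python
# Effect of sampling resolution on apparent peak demand.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
t = pd.date_range("2018-01-01", periods=8640, freq="10s")   # one day at 10 s
demand = pd.Series(rng.gamma(0.3, 2.0, len(t)), index=t)    # litres per 10 s

for res in ["10s", "1min", "15min", "1h"]:
    # Average flow rate within each window, in L/s.
    rate = demand.resample(res).sum() / pd.Timedelta(res).total_seconds()
    print(f"{res:>5}: peak = {rate.max():.2f} L/s")   # peak shrinks as res coarsens
```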

Journal ArticleDOI
TL;DR: A general method for deriving coefficients from detailed, bottom-up LCA suitable for application in IA models, thus allowing IA analysts to explore the life cycle impacts of technology and scenario alternatives and facilitating attribution of life cycle effects to appropriate years.
Abstract: The fields of life cycle assessment (LCA) and integrated assessment (IA) modelling today have similar interests in assessing macro-level transformation pathways with a broad view of environmental concerns. Prevailing IA models lack a life cycle perspective, while LCA has traditionally been static- and micro-oriented. We develop a general method for deriving coefficients from detailed, bottom-up LCA suitable for application in IA models, thus allowing IA analysts to explore the life cycle impacts of technology and scenario alternatives. The method decomposes LCA coefficients into life cycle phases and energy carrier use by industries, thus facilitating attribution of life cycle effects to appropriate years, and consistent and comprehensive use of IA model-specific scenario data when the LCA coefficients are applied in IA scenario modelling. We demonstrate the application of the method for global electricity supply to 2050 and provide numerical results (as supplementary material) for future use by IA analysts.

Journal ArticleDOI
TL;DR: In this paper, the authors compare the ability of eight machine-learning models (elastic net, gradient boosting, kernel-k-nearest neighbors, two variants of support vector machines, M5-cubist, random forest, and a meta-learning ensemble M5-cubist model) and four baseline models (ordinary kriging, a unit area discharge model, and two variants of censored regression) to generate estimates of the annual minimum 7-day mean streamflow with an annual exceedance probability of 90% (7Q10) at 224 unregulated sites in South Carolina, Georgia, and Alabama, USA.
Abstract: We compare the ability of eight machine-learning models (elastic net, gradient boosting, kernel-k-nearest neighbors, two variants of support vector machines, M5-cubist, random forest, and a meta-learning ensemble M5-cubist model) and four baseline models (ordinary kriging, a unit area discharge model, and two variants of censored regression) to generate estimates of the annual minimum 7-day mean streamflow with an annual exceedance probability of 90% (7Q10) at 224 unregulated sites in South Carolina, Georgia, and Alabama, USA. The machine-learning models produced substantially lower cross-validation errors compared to the baseline models. The meta-learning M5-cubist model had the lowest root-mean-squared error of 26.72 cubic feet per second. Partial dependence plots show that 7Q10s are likely moderated by late summer and early fall precipitation and the infiltration capacity of basin soils.
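
The comparison protocol is a routine cross-validation loop. A hedged sketch for three of the named learners, with make_regression standing in for the 224 basins' predictors (the synthetic data and all hyperparameters are assumptions):

```python
# Cross-validated RMSE comparison across several regression learners.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for basin characteristics and 7Q10 targets at 224 sites.
X, y = make_regression(n_samples=224, n_features=15, noise=10, random_state=0)

models = {"elastic net": ElasticNet(alpha=1.0),
          "gradient boosting": GradientBoostingRegressor(random_state=0),
          "random forest": RandomForestRegressor(n_estimators=300,
                                                 random_state=0)}

for name, m in models.items():
    rmse = -cross_val_score(m, X, y, cv=10,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name:>17}: CV RMSE = {rmse:.1f}")
```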

Journal ArticleDOI
TL;DR: This paper provides a general overview of and guidelines for DM techniques for the non-expert user, who can decide with this support which technique is the most suitable to solve the problem at hand.
Abstract: Data Mining (DM) is a fundamental component of the Data Science process. Over recent years a huge library of DM algorithms has been developed to tackle a variety of problems in fields such as medical imaging and traffic analysis. Many DM techniques are far more flexible than classical numerical simulation or statistical modelling approaches, and could be usefully applied to data-rich environmental problems. Certain techniques, such as artificial neural networks, clustering, case-based reasoning or Bayesian networks, have been applied in environmental modelling, while other methods, like support vector machines among others, have yet to be taken up on a wide scale. There is greater scope for many lesser-known techniques to be applied in environmental research, with the potential to contribute to addressing some of the current open environmental challenges. However, selecting the best DM technique for a given environmental problem is not a simple decision, and there is a lack of guidelines and criteria to help data scientists and environmental scientists ensure effective knowledge extraction from data. This paper provides a broad introduction to the use of DM in Data Science processes for environmental researchers. Data Science comprises three main steps (pre-processing, data mining and post-processing). This paper provides a conceptualization of environmental systems and a conceptualization of DM methods, which form the core step of the Data Science process. These two elements define a conceptual framework that is the basis of a new methodology proposed for relating the characteristics of a given environmental problem to a family of Data Mining methods. The paper provides a general overview of and guidelines for DM techniques for the non-expert user, who can decide with this support which technique is the most suitable to solve the problem at hand. The decision rests on the bidimensional relationship between the type of environmental system and the type of DM method. An illustrative two-way table containing references for each Environmental System-Data Mining method pair is presented and discussed. Some examples of how the proposed methodology is used to support DM method selection are also presented, and challenges and future trends are identified.