
Showing papers in "Environmental Modelling and Software in 2009"


Journal ArticleDOI
TL;DR: Using these methods, the geographic information system (GIS) based software, TrajStat, was developed to view, query, and cluster the trajectories and compute the potential source contribution function (PSCF) and concentration weighted trajectory (CWT) analyses when measurement data are included.
Abstract: Statistical analysis of air mass back trajectories combined with long-term ambient air pollution measurements is a useful tool for source identification. Using these methods, the geographic information system (GIS) based software, TrajStat, was developed to view, query, and cluster the trajectories and to compute the potential source contribution function (PSCF) and concentration weighted trajectory (CWT) analyses when measurement data are included.
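The PSCF idea can be sketched in a few lines: for each grid cell, count the trajectory endpoints associated with receptor concentrations above a criterion value and divide by all endpoints falling in that cell. This is an illustration of the statistic only, not TrajStat's implementation; the function and parameter names are invented for the example.

```python
from collections import defaultdict

def pscf(trajectories, concentrations, threshold, cell=1.0):
    """Potential source contribution function on a lat/lon grid.

    trajectories: one back trajectory per measurement, each a list of (lat, lon)
    concentrations: pollutant level measured at the receptor for each trajectory
    threshold: criterion value (e.g. the 75th percentile of the measurements)
    Returns {(i, j): m_ij / n_ij}, where n_ij counts all endpoints in grid
    cell (i, j) and m_ij counts endpoints of "polluted" trajectories.
    """
    n = defaultdict(int)  # all endpoints per cell
    m = defaultdict(int)  # endpoints of trajectories exceeding the threshold
    for traj, conc in zip(trajectories, concentrations):
        for lat, lon in traj:
            key = (int(lat // cell), int(lon // cell))
            n[key] += 1
            if conc > threshold:
                m[key] += 1
    return {key: m[key] / n[key] for key in n}
```

Cells crossed often by polluted trajectories score near 1 and flag potential source regions; in practice a weighting function is also applied to cells with few endpoints.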

741 citations


Journal ArticleDOI
TL;DR: The new Conefor Sensinode 2.2 (CS22) software is described, which quantifies the importance of habitat patches for maintaining or improving functional landscape connectivity and is conceived as a tool for decision-making support in landscape planning and habitat conservation.
Abstract: Maintaining and restoring landscape connectivity is currently a central concern in ecology and biodiversity conservation, and there is an increasing demand for user-driven tools for integrating connectivity in landscape planning. Here we describe the new Conefor Sensinode 2.2 (CS22) software, which quantifies the importance of habitat patches for maintaining or improving functional landscape connectivity and is conceived as a tool for decision-making support in landscape planning and habitat conservation. CS22 is based on graph structures, which have been suggested to possess the greatest benefit-to-effort ratio for conservation problems regarding landscape connectivity. CS22 includes new connectivity metrics based on the habitat availability concept, which considers a patch itself as a space where connectivity occurs, integrating in a single measure the connected habitat area existing within the patches with the area made available by the connections between different habitat patches. These new metrics have been shown to exhibit improved properties compared to other existing metrics and are particularly suited to the identification of critical landscape elements for connectivity. CS22 is distributed together with GIS extensions that allow for directly generating the required input files from a GIS layer. CS22 and related documentation can be freely downloaded from the World Wide Web.
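A minimal sketch of one graph-based habitat-availability metric of this family, the integral index of connectivity (IIC), illustrates the idea: a patch paired with itself contributes its own area squared (intra-patch connectivity), and patch pairs contribute less the more links separate them. This is an illustrative implementation, not Conefor's code.

```python
from itertools import product

def iic(areas, links, landscape_area):
    """Integral Index of Connectivity for a patch graph.

    areas: patch areas indexed 0..n-1
    links: set of undirected (i, j) pairs of directly connected patches
    IIC = sum over all patch pairs of a_i * a_j / (1 + nl_ij), normalised by
    the squared landscape area, where nl_ij is the number of links on the
    shortest topological path (pairs in different components contribute 0).
    """
    n = len(areas)
    INF = float("inf")
    # all-pairs shortest paths by Floyd-Warshall on the patch graph
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j in links:
        d[i][j] = d[j][i] = 1
    for k, i, j in product(range(n), repeat=3):
        if d[i][k] + d[k][j] < d[i][j]:
            d[i][j] = d[i][k] + d[k][j]
    num = sum(areas[i] * areas[j] / (1 + d[i][j])
              for i in range(n) for j in range(n) if d[i][j] < INF)
    return num / landscape_area ** 2
```

Removing a patch and recomputing the index gives that patch's importance for connectivity, which is how critical landscape elements are ranked.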

641 citations


Journal ArticleDOI
TL;DR: Marxan with Zones, described in this paper, is a decision support tool that provides land-use zoning options in geographical regions for biodiversity conservation, allowing any parcel of land or sea to be allocated to a specific zone, not just reserved or unreserved.
Abstract: Marxan is the most widely used conservation planning software in the world and is designed for solving complex conservation planning problems in landscapes and seascapes. In this paper we describe a substantial extension of Marxan called Marxan with Zones, a decision support tool that provides land-use zoning options in geographical regions for biodiversity conservation. We describe new functions designed to enhance the original Marxan software and expand on its utility as a decision support tool. The major new element in the decision problem is allowing any parcel of land or sea to be allocated to a specific zone, not just reserved or unreserved. Each zone then has the option of its own actions, objectives and constraints, with the flexibility to define the contribution of each zone to achieve targets for pre-specified features (e.g. species or habitats). The objective is to minimize the total cost of implementing the zoning plan while ensuring a variety of conservation and land-use objectives are achieved. We outline the capabilities, limitations and additional data requirements of this new software and perform a comparison with the original version of Marxan. We feature a number of case studies to demonstrate the functionality of the software and highlight its flexibility to address a range of complex spatial planning problems. These studies demonstrate the design of multiple-use marine parks in both Western Australia and California, and the zoning of forest use in East Kalimantan.
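The structure of the zoning decision problem can be sketched as a penalised objective over zone assignments. This is a toy illustration of the problem shape only, not Marxan's code (which minimises such an objective with simulated annealing); all names and the penalty scheme are invented for the example.

```python
def zoning_objective(assignment, costs, contrib, targets, penalty=1000.0):
    """Score a zoning plan: total cost of the zone chosen for each planning
    unit, plus a penalty for each conservation feature whose target is not
    met by the summed zone contributions.

    assignment: zone index chosen for each planning unit
    costs[u][z]: cost of allocating unit u to zone z
    contrib[u][z][f]: contribution of unit u in zone z towards feature f
    targets[f]: required total amount of feature f
    """
    total = sum(costs[u][z] for u, z in enumerate(assignment))
    for f, target in enumerate(targets):
        held = sum(contrib[u][z][f] for u, z in enumerate(assignment))
        if held < target:
            total += penalty * (target - held)  # shortfall penalty
    return total
```

The key generalisation over the original Marxan is visible in the data structures: costs and feature contributions are indexed by zone, not by a binary reserved/unreserved status.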

441 citations


Journal ArticleDOI
TL;DR: Major recommendations for future research in this area include proper consideration of uncertainty in scenario studies, particularly in relation to stakeholder-relevant information, construction of scenarios that are more diverse in nature, and sharing of information and resources among the scenario development research community.
Abstract: Scenarios are possible future states of the world that represent alternative plausible conditions under different assumptions. Often, scenarios are developed in a context relevant to stakeholders involved in their applications, since the evaluation of scenario outcomes and implications can enhance decision-making activities. This paper reviews the state of the art of scenario development and proposes a formal approach to scenario development in environmental decision-making. The discussion of current issues in scenario studies includes advantages and obstacles in utilizing a formal scenario development framework, and the different forms of uncertainty inherent in scenario development, as well as how they should be treated. An appendix of common scenario terminology is included for clarity. Major recommendations for future research in this area include proper consideration of uncertainty in scenario studies, particularly in relation to stakeholder-relevant information, construction of scenarios that are more diverse in nature, and sharing of information and resources among the scenario development research community.

357 citations


Journal ArticleDOI
TL;DR: The PREVAH components introduced here support a modelling task from data pre-processing through model calibration and validation to visualising and interpreting the results (post-processing).
Abstract: Spatially distributed modelling is an important instrument for studying the hydrological cycle, both concerning its present state as well as possible future changes in climate and land use. Results of such simulations are particularly relevant for the fields of water resources, natural hazards and hydropower. The semi-distributed hydrological modelling system PREVAH (PREcipitation-Runoff-EVApotranspiration HRU Model) implements a conceptual process-oriented approach and has been developed especially to suit conditions in mountainous environments with their highly variable environmental and climatic conditions. This article presents an overview of the actual model core of PREVAH and introduces the various tools which have been developed for obtaining a comprehensive, user-friendly modelling system: DATAWIZARD for importing and managing hydrometeorological data, WINMET for pre-processing meteorological data, GRIDMATH for carrying out elementary raster data operations, FAOSOIL for processing FAO World Soil Map information, WINHRU for pre-processing spatial data and aggregating hydrological response units (HRU), WINPREVAH for operating the model, HYDROGRAPH for visualising hydrograph data and VIEWOPTIM for visualising the calibration procedure. The PREVAH components introduced here support a modelling task from data pre-processing through model calibration and validation to visualising and interpreting the results (post-processing). A brief overview of current PREVAH applications demonstrates the flexibility of the modelling system with examples that range from water balance modelling through flood estimation and flood forecasting to drought analysis in Switzerland, Austria, China, Russia and Sweden.

245 citations


Journal ArticleDOI
TL;DR: A multi-criteria model evaluation protocol is presented to check the performance of rainfall-runoff models during model calibration and validation phases, based on a high-frequency river flow series; a Microsoft Excel-based tool built on the assessment of graphical displays has been developed to support it.
Abstract: A multi-criteria model evaluation protocol is presented to check the performance of rainfall-runoff models during model calibration and validation phases based on a high frequency (e.g. hourly, daily) river flow series. The multiple criteria or objectives are based on multiple and non-commensurable measures of information derived from river flow series by means of a number of sequential time series processing tasks. These include separation of the river flow series into subflows, splitting of the series into nearly independent quick- and slow-flow hydrograph periods, and extraction of nearly independent peak and low flows. The protocol accounts for the statistical assumptions of independence and homoscedasticity of the model residuals, which are much better satisfied through the use of nearly independent flow values extracted from the flow series. In addition to the separate evaluation of the subflow recessions, the quick and slow runoff peak and low values and event volumes, the performance of the model in predicting extreme high and low flow statistics is also validated. To support the time series processing tasks as well as the application of the multi-criteria model evaluation protocol, a Microsoft Excel-based tool (WETSPRO: Water Engineering Time Series PROcessing tool) has been developed. It is based on the assessment of graphical displays, which complement traditional goodness-of-fit statistics.
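The core of the protocol, scoring the model separately on nearly independent high- and low-flow values rather than with a single lumped statistic over the full series, can be sketched as follows. This is an illustration of the multi-criteria idea only, not WETSPRO itself (which works through graphical displays in Excel), and the index lists standing in for the extracted events are hypothetical.

```python
def nash_sutcliffe(obs, sim):
    """Classical Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the mean."""
    mean = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean) ** 2 for o in obs)
    return 1 - num / den

def multi_criteria(obs, sim, peak_idx, low_idx):
    """Evaluate a rainfall-runoff model on nearly independent peak and low
    flows separately, so errors on floods do not mask errors on recessions.

    peak_idx / low_idx: indices of extracted quick-flow peaks and low flows.
    """
    pick = lambda series, idx: [series[i] for i in idx]
    return {
        "peaks": nash_sutcliffe(pick(obs, peak_idx), pick(sim, peak_idx)),
        "lows": nash_sutcliffe(pick(obs, low_idx), pick(sim, low_idx)),
    }
```

A model can score well on one criterion and poorly on another; the protocol treats these as non-commensurable objectives instead of averaging them away.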

218 citations


Journal ArticleDOI
TL;DR: The developed GUI-HDMR software copes very well with the test cases: sensitivity indices of first and second order can be calculated accurately with low computational effort, and the software is shown to be competitive with other recently developed approaches.
Abstract: The high dimensional model representation (HDMR) method is a set of tools which can be used to construct a fully functional metamodel and to calculate variance based sensitivity indices very efficiently. Extensions to the existing set of random sampling (RS)-HDMR tools have been developed in order to make the method more applicable for complex models with a large number of input parameters as often appear in environmental modelling. The HDMR software described here combines the RS-HDMR tools and its extensions in one Matlab package equipped with a graphical user interface (GUI). This makes the HDMR method easily available for all interested users. The performance of the GUI-HDMR software has been tested in this paper using two analytical test models, the Ishigami function and the Sobol' g-function. In both cases the model is highly non-linear, non-monotonic and has significant parameter interactions. The developed GUI-HDMR software copes very well with the test cases and sensitivity indices of first and second order could be calculated accurately with only low computational effort. The efficiency of the software has also been compared against other recently developed approaches and is shown to be competitive. GUI-HDMR can be applied to a wide range of applications in all fields, because in principle only one random or quasi-random set of input and output values is required to estimate all sensitivity indices up to second order. The size of the set of samples is however dependent on the problem and can be successively increased if additional accuracy is required. A brief description of its application within a range of modelling environments is given.

209 citations


Journal ArticleDOI
TL;DR: It is concluded that different levels of expertise represent an opportunity for stimulating cross-fertilisation in the vast field of water research rather than simply yielding a collection of case studies to be re-examined.
Abstract: We provide some additional input and perspectives on Kalteh et al.'s review of the Self-Organizing Map (SOM) approach (Environ. Model. Softw. (2008), 23, 835-845). Map size selection is a key issue in SOM applications: although there is no theoretical principle to determine the optimum map size, quantitative indicators such as quantization error, topographic error and eigenvalues have proven to be relevant tools to determine the optimal number of map units. Second, one of the most innovative applications of the SOM is the possibility of introducing a set of variables (e.g., biological) into a SOM previously trained with other variables (e.g. environmental). This can be achieved by calculating the mean value of each environmental variable in each output neuron of a SOM trained with biological variables, or by using a mask function that gives a null weight to the biological variables and a weight of 1 to the environmental variables, so that the values for biological variables are visualized on a SOM previously trained with environmental variables only. We conclude that our different levels of expertise represent an opportunity for stimulating cross-fertilisation in the vast field of water research rather than simply yielding a collection of case studies to be re-examined.
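The first option described above, projecting a variable not used in training onto a trained SOM, amounts to averaging it over the samples mapped to each output neuron. This is a minimal sketch assuming the best-matching units have already been computed by a trained SOM; the function name is illustrative.

```python
def overlay_on_som(bmu_of_sample, env_values, n_units):
    """Visualise a variable that was NOT used in SOM training: for each
    output neuron, average the environmental variable over all samples
    whose best-matching unit (BMU) is that neuron.

    bmu_of_sample: BMU index for each sample (from the trained SOM)
    env_values: value of the environmental variable for each sample
    Returns one mean per neuron, or None for neurons with no samples.
    """
    sums = [0.0] * n_units
    counts = [0] * n_units
    for unit, value in zip(bmu_of_sample, env_values):
        sums[unit] += value
        counts[unit] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]
```

The resulting per-neuron means can be plotted on the map grid, showing how an environmental gradient is organised across a SOM trained on biological variables only.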

195 citations


Journal ArticleDOI
TL;DR: In this paper, the authors compared three different interpolation methods based on three selected years (1981, 1990, 2002) as starting points: inverse distance weighting (IDW), radial basis function (RBF), and kriging.
Abstract: Severe water shortages and dramatic declines in groundwater levels have resulted in environmental deterioration in the Minqin oasis, an arid region of northwest China. Understanding temporal and spatial variations in the depth to groundwater in the region is important for developing management strategies. Depth to groundwater records for 48 observation wells in the Minqin oasis were available for 22 years from 1981 to 2003, allowing us to compare three different interpolation methods based on three selected years (1981, 1990, 2002) as starting points. The three methods were inverse distance weighting (IDW), radial basis function (RBF), and kriging (including ordinary kriging (OK), simple kriging (SK), and universal kriging (UK)). Cross-validation was applied to evaluate the accuracy of the various methods, and two indices - the correlation coefficient (R^2) and the root mean squared error (RMSE) - were used to compare the interpolation methods. Another two indices - deviation of estimation errors (σ) and 95% prediction interval (95PPI) - were used to assess prediction errors. Comparison of interpolated values with observed values indicates that simple kriging is the optimal method for interpolating depth to groundwater in this region: it had the lowest standard deviation of estimation errors and smallest 95% prediction interval (95PPI). By using the simple kriging method and an autoregressive model for depth to groundwater based on the data from 1981 to 2003, this work revealed systematic temporal and spatial variations in the depth to groundwater in the Minqin oasis. The water table has declined rapidly over the past 22 years, with the average depth to groundwater increasing from 4.95 m in 1981 to 14.07 m in 2002. We attribute the decline in the water table to excessive extraction and to decreases in irrigation channel leakage.
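The simplest of the compared methods, IDW, and the cross-validation procedure can be sketched together: each well is removed in turn, re-estimated from the remaining wells, and the errors summarised as an RMSE. The well coordinates below are hypothetical, and kriging would additionally require fitting a variogram model, which is beyond this sketch.

```python
import math

def idw(x, y, wells, power=2):
    """Inverse distance weighted estimate of depth to groundwater at (x, y)
    from observation wells given as (wx, wy, depth) triples."""
    num = den = 0.0
    for wx, wy, depth in wells:
        d = math.hypot(x - wx, y - wy)
        if d == 0:
            return depth  # exact at an observation point
        w = d ** -power
        num += w * depth
        den += w
    return num / den

def loo_rmse(wells):
    """Leave-one-out cross-validation RMSE: re-estimate each well from all
    the others and compare against its observed depth."""
    errs = []
    for i, (wx, wy, depth) in enumerate(wells):
        others = wells[:i] + wells[i + 1:]
        errs.append(idw(wx, wy, others) - depth)
    return math.sqrt(sum(e * e for e in errs) / len(errs))
```

Running the same cross-validation for each candidate interpolator (IDW, RBF, and the kriging variants) and comparing the error statistics is how the study selects simple kriging as optimal.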

180 citations


Journal ArticleDOI
TL;DR: A multistage fuzzy-stochastic programming (MFSP) model is developed for tackling uncertainties presented as fuzzy sets and probability distributions and a vertex analysis approach is proposed for solving multiple fuzzy sets in the MFSP model.
Abstract: In this study, a multistage fuzzy-stochastic programming (MFSP) model is developed for tackling uncertainties presented as fuzzy sets and probability distributions. A vertex analysis approach is proposed for solving multiple fuzzy sets in the MFSP model. Solutions under a set of α-cut levels can be generated by solving a series of deterministic submodels. The developed method is applied to the planning of a case study for water-resources management. Dynamics and uncertainties of water availability (and thus water allocation and shortage) could be taken into account through generation of a set of representative scenarios within a multistage context. Moreover, penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised water-allocation targets are violated. The modeling results can help to generate a range of alternatives under various system conditions, and thus help decision makers to identify desired water-resources management policies under uncertainty.
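The α-cut/vertex idea can be sketched as follows: each fuzzy parameter is reduced to an interval at a chosen membership level α, and a deterministic submodel is solved at every corner of the resulting hyper-box to bound the outcome. This is an illustration of the mechanism on an arbitrary objective, not the MFSP model itself, where each vertex evaluation is a full multistage stochastic program.

```python
from itertools import product

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership alpha:
    alpha = 0 gives the full support [a, b], alpha = 1 collapses to m."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def vertex_analysis(fuzzy_params, alpha, objective):
    """Evaluate a deterministic submodel at every vertex (corner) of the
    hyper-box formed by the alpha-cut intervals of all fuzzy parameters,
    returning bounds on the objective at that alpha level."""
    intervals = [alpha_cut(p, alpha) for p in fuzzy_params]
    values = [objective(v) for v in product(*intervals)]
    return min(values), max(values)
```

Sweeping α from 0 to 1 produces nested solution intervals, which is how the MFSP model reports solutions "under a set of α-cut levels". Note the vertex count grows as 2^n in the number of fuzzy parameters.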

174 citations


Journal ArticleDOI
TL;DR: This work presents a new method for publishing research datasets consisting of point observations that employs a standard observations data model populated using controlled vocabularies for environmental and water resources data along with web services for transmitting data to consumers.
Abstract: Over the next decade, it is likely that science and engineering research will produce more scientific data than has been created over the whole of human history. The successful use of these data to achieve new scientific breakthroughs will depend on the ability to access, integrate, and analyze these large datasets. Robust data organization and publication methods are needed within the research community to enable data discovery and scientific analysis by researchers other than those that collected the data. We present a new method for publishing research datasets consisting of point observations that employs a standard observations data model populated using controlled vocabularies for environmental and water resources data along with web services for transmitting data to consumers. We describe how these components have reduced the syntactic and semantic heterogeneity in the data assembled within a national network of environmental observatory test beds and how this data publication system has been used to create a federated network of consistent research data out of a set of geographically decentralized and autonomous test bed databases.

Journal ArticleDOI
TL;DR: The present and future of semantic modelling in environmental science are reviewed: from the mediation approach, where formal knowledge is the key to automatic integration of datasets, models and analytical pipelines, to the knowledge-driven approach, where the knowledge is the key not only to integration, but also to overcoming scale and paradigm differences and to novel potentials for model design and automated knowledge discovery.
Abstract: Models, and to a lesser extent datasets, embody sophisticated statements of environmental knowledge. Yet, the knowledge they incorporate is rarely self-contained enough for them to be understood and used - by humans or machines - without the modeller's mediation. This severely limits the options in reusing environmental models and connecting them to datasets or other models. The notion of "declarative modelling" has been suggested as a remedy to help design, communicate, share and integrate models. Yet, not all these objectives have been achieved by declarative modelling in its current implementations. Semantically aware environmental modelling is a way of designing, implementing and deploying environmental datasets and models based on the independent, standardized formalization of the underlying environmental science. It can be seen as the result of merging the rationale of declarative modelling with modern knowledge representation theory, through the mediation of the integrative vision of a Semantic Web. In this paper, we review the present and preview the future of semantic modelling in environmental science: from the mediation approach, where formal knowledge is the key to automatic integration of datasets, models and analytical pipelines, to the knowledge-driven approach, where the knowledge is the key not only to integration, but also to overcoming scale and paradigm differences and to novel potentials for model design and automated knowledge discovery.

Journal ArticleDOI
TL;DR: A GIS-based tool, the GEPIC model, is presented to estimate crop water productivity on the land surface with a spatial resolution of 30 arc-min; the results show a non-linear relationship between virtual water content (the inverse of CWP) and crop yield.
Abstract: Recent research on crop-water relations has increasingly been directed towards the application of locally acquired knowledge to answering the questions raised on larger scales. However, the application of the local results to larger scales is often questionable. This paper presents a GIS-based tool, the GEPIC model, to estimate crop water productivity (CWP) on the land surface with a spatial resolution of 30 arc-min. The GEPIC model can estimate CWP on a large scale by considering the local variations in climate, soil and management conditions. The results show a non-linear relationship between virtual water content (the inverse of CWP) and crop yield. The simulated CWP values are generally more sensitive to three parameters, i.e. potential harvest index for a crop under ideal growing conditions (HI), biomass-energy ratio indicating the energy conversion to biomass (WA), and potential heat unit accumulation from emergence to maturity (PHU), than to other parameters. The GEPIC model is a useful tool to study crop-water relations on large scales with high spatial resolution; hence, it can be used to support large-scale decision making in water management and crop production.
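The two central quantities can be written down directly from their standard definitions (the unit conversion assumes yield in kg/ha and seasonal evapotranspiration in mm; 1 mm of ET over 1 ha equals 10 m^3 of water):

```python
def crop_water_productivity(yield_kg_per_ha, et_mm):
    """CWP in kg per m^3: crop yield divided by the volume of water
    consumed as seasonal evapotranspiration (1 mm over 1 ha = 10 m^3/ha)."""
    return yield_kg_per_ha / (et_mm * 10.0)

def virtual_water_content(yield_kg_per_ha, et_mm):
    """Virtual water content in m^3 per kg: the inverse of CWP."""
    return 1.0 / crop_water_productivity(yield_kg_per_ha, et_mm)
```

Because ET does not grow proportionally with yield, plotting virtual water content against yield over a grid of cells produces the non-linear relationship the abstract reports.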

Journal ArticleDOI
TL;DR: Results show that the use of representative training data can help the classifier to produce more accurate and reliable results, and confirm the value of visualization tools for the assessment of training pixels through decision boundary analysis.
Abstract: Image classification is a complex process affected by uncertainties and by decisions made by the researchers. The accuracy achieved by a supervised classification is largely dependent upon the training data provided by the analyst. The use of representative training data sets is of significant importance for the performance of all classification methods. However, this issue is more important for neural network classifiers, since they take each sample into consideration in the training stage. The representativeness is related to the size and quality of the training data, which are highly important in assessing the accuracy of the thematic maps derived from remotely sensed data. Quality analysis of training data helps to identify outlier and mixed pixels that can undermine the reliability and accuracy of a classification resulting from an incorrect class boundary definition. Training data selection can be thought of as an iterative process conducted to form a representative data set after some refinements. Unfortunately, in many applications the quality of the training data is not questioned, and the data set is directly employed in the training stage. In order to increase the representativeness of the training data, a two-stage approach is presented, and performance tests are conducted for a selected region. A multi-layer perceptron model trained with the backpropagation learning algorithm is employed to classify the major land cover/land use classes present in the study area, the city of Trabzon in Turkey. Results show that the use of representative training data can help the classifier to produce more accurate and reliable results. An improvement of several percent in classification accuracy can have a significant effect on the quality of the classified image. Results also confirm the value of visualization tools for the assessment of training pixels through decision boundary analysis.

Journal ArticleDOI
TL;DR: This paper discusses and compares two types of control signals to use the thermal storage of electrical household appliances as balancing power, and develops a model of the synergetic behaviour of an ensemble of refrigerators reacting to control signals.
Abstract: Load balancing in electricity grids is made more challenging by the increasing availability of time-varying stochastic supply of electricity from conversion of renewable resources like wind or sunlight. Because large quantities of electrical energy cannot be stored easily, demand-side management by shifting electrical loads is one attempt to cope with this problem. In this paper we discuss and compare two types of control signals to use the thermal storage of electrical household appliances as balancing power. As the system of our research consists of a high number of controllable refrigerators with independent parameters and behaviour, we investigate the synergetic behaviour by a simulation model. For this objective we analyze a simulation model of controllable refrigerators with respect to their ability to shift their energy demand depending on parameterized external signals. We show that both types of control signals can be used for short-term reserves with delivery within 15 min, but they differ in possible shapes of the resulting load curves and in the reaction time of the controlled system. In addition to the simulation model we develop a model of the synergetic behaviour of an ensemble of refrigerators reacting to control signals. This mathematical model predicts the electricity demand of ensembles of controlled appliances. As it reduces the simulation model's complexity it could be used in a sophisticated control strategy, e.g. in a model predictive control approach. The general attempt to integrate the load shift potential of cooling devices into the control of an electricity grid can probably be transferred to other electrical appliances with thermal storage capacities.
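A toy version of one such controllable appliance illustrates the mechanism: a control signal modelled as a temporary shift of the thermostat band lets the cabinet warm up and postpones compressor runs, which is the load-shifting effect exploited as balancing power. All parameters here are invented for the example; the paper's simulation model is far more detailed.

```python
def simulate_fridge(temp, on, steps, setpoint=5.0, deadband=1.0,
                    warm_rate=0.25, cool_rate=0.5, shift=0.0):
    """Minimal hysteresis model of one refrigerator.

    The compressor switches on above the band and off below it; `shift`
    moves the whole thermostat band (a positive shift postpones cooling).
    Returns final temperature, compressor state, and the number of steps
    the compressor ran (a proxy for electricity demand).
    """
    demand = 0
    for _ in range(steps):
        lo, hi = setpoint - deadband + shift, setpoint + deadband + shift
        if temp >= hi:
            on = True
        elif temp <= lo:
            on = False
        temp += -cool_rate if on else warm_rate
        demand += on
    return temp, on, demand
```

Summing the demand of many such units with heterogeneous parameters gives the ensemble load curve; the paper's mathematical model predicts that aggregate directly without simulating each appliance.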

Journal ArticleDOI
TL;DR: The spatially distributed sediment budget model is described that assesses the primary sources (hillslope soil erosion, gully and riverbank erosion) and sinks (floodplain and reservoir deposition) of fine sediment for each link in a river network and is suitable for guiding the targeting of remediation measures within river basins to reduce downstream sediment yields.
Abstract: Identifying the erosion processes contributing to increased basin fine sediment yield is important for reducing downstream impacts on aquatic ecosystems. However, erosion rates are spatially variable, and much eroded sediment is stored within river basins and not delivered downstream. A spatially distributed sediment budget model is described that assesses the primary sources (hillslope soil erosion, gully and riverbank erosion) and sinks (floodplain and reservoir deposition) of fine sediment for each link in a river network. The model performance is evaluated in a 17,000-km^2 basin in south-east Australia using measured suspended sediment yields from eight catchments within the basin, each 100-700 km^2 in area. Spatial variations within the basin in yield and area-specific yield were reliably predicted. Observed yields and area-specific yields varied by 17-fold and 15-fold respectively between the catchments, while predictions were generally within a factor of 2 of observations. Model efficiency at predicting variations in area-specific yield was good outside forested areas (0.58), and performance was weakly sensitive to parameter values. Yields from forested areas were under-predicted, and reducing the predicted influence of riparian vegetation on bank erosion improved model performance in those areas. The model provided more accurate and higher resolution predictions than catchment area interpolation of measured yields from neighbouring river basins. The model is suitable for guiding the targeting of remediation measures within river basins to reduce downstream sediment yields.
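The bookkeeping behind a link-based sediment budget can be sketched as routing down the river network: the load leaving each link is its local supply plus the load arriving from upstream, reduced by the fraction deposited in that link. This is an illustration of the budget structure only, not the paper's model; the per-link trapping fraction is a simplification of its floodplain and reservoir deposition terms.

```python
def route_sediment(network, sources, trap_frac):
    """Route fine sediment through a tree-shaped river network.

    network: {link: [upstream links]} (headwater links map to [])
    sources: local supply per link from hillslope, gully and bank erosion (t/yr)
    trap_frac: fraction of each link's throughput deposited within the link
    Returns the load (t/yr) leaving each link.
    """
    loads = {}
    def load(link):
        if link not in loads:
            inflow = sum(load(u) for u in network[link])
            loads[link] = (sources[link] + inflow) * (1 - trap_frac[link])
        return loads[link]
    for link in network:
        load(link)
    return loads
```

Because storage is accounted for link by link, the predicted downstream yield can be far smaller than the summed erosion sources, which is exactly the point made in the abstract.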

Journal ArticleDOI
TL;DR: The software implementation of the OECD P_OV & LRTP Screening Tool (The Tool) is described; The Tool is used to assess the environmental hazard of organic chemicals using metrics of overall persistence (P_OV) and long-range transport potential (LRTP).
Abstract: We present the software implementation of The OECD P_OV & LRTP Screening Tool (The Tool) that is used to assess the environmental hazard of organic chemicals using metrics of overall persistence (P_OV) and long-range transport potential (LRTP). The Tool is designed to support decision making for chemical management and includes features that are recommended by the Organization for Economic Cooperation and Development (OECD) expert group on multimedia modeling. The Tool is useful for screening the environmental hazard potential of non-ionizing organic chemicals whose environmental partitioning can be described by absorptive capacities of environmental media estimated from partitioning between air, water and octanol in the laboratory. The software includes data storage functionality, and a user interface that is designed to facilitate simple data input and straightforward interpretation of the model results. The effect of uncertainties in input properties describing chemicals can be assessed with a Monte Carlo analysis. The software is evaluated and illustrated by comparing results from The Tool with those from other models and by evaluating four substances that are candidates for regulation or ban under the Stockholm Convention on Persistent Organic Pollutants.

Journal ArticleDOI
TL;DR: The best predicting factors for IS usefulness across the life cycle were found to be user participation, user perceptions and intentions, user computer experience, top management support, support and training, external pressure, IS unit professionalism and the availability of external information sources.
Abstract: The potential usefulness of different kinds of Information System (IS) for environmental management is well recognised. However, concerns have been raised about the translation of this potential into actual use and benefit to policy and planning organisations and outcomes. The aims of this paper are to identify those factors which have been found to influence the use and usefulness of IS and in doing so to provide advice for managing development and implementation processes. There is no body of empirical work on the topic for environmental application. However a substantial literature on non-environmental IS has been developed and is used as source material. A classification of IS life cycle processes is developed and the best, worst and possible predicting factors for each process identified. The best predicting factors for IS usefulness across the life cycle were found to be user participation, user perceptions and intentions, user computer experience, top management support, support and training, external pressure, IS unit professionalism and the availability of external information sources. The state of knowledge about the determinants of IS usefulness is discussed and priorities for future research are identified. The factors identified are then discussed in terms of what they mean for managing IS development and for overcoming concerns about environmental IS development and use.

Journal ArticleDOI
TL;DR: The simulation results demonstrated that the rate and patterns of bioclogging development are sensitive to the initial biomass distribution, and the common assumption of an initially uniform biomass distribution may not be appropriate and may introduce a significant error in the modeling results.
Abstract: This work presents a numerical model able to simulate the effect of biomass growth on the hydraulic properties of saturated porous media, i.e., bioclogging. A new module for an existing coupled flow and reactive-transport code, PHWAT, was implemented. Laboratory experiments were used to validate the model. Good agreement with the experimental data was found. Model behavior was satisfactory in terms of numerical discretization errors and parameter calibration, although grid-independent results were difficult to achieve. The new code was applied to investigate the effect of the initial conditions on clogging development. A set of simulations was conducted considering 1D and 2D flow conditions, for both uniform and heterogeneous initial biomass concentrations. The simulation results demonstrated that the rate and patterns of bioclogging development are sensitive to the initial biomass distribution. Thus, the common assumption of an initially uniform biomass distribution may not be appropriate and may introduce a significant error in the modeling results.

Journal ArticleDOI
TL;DR: A prototype coupled modelling system to simulate land-use change is presented by bringing together three simple process models: an agent-based model of subsistence farming; an individual-based model of forest dynamics; and a spatially explicit hydrological model which predicts distributed soil moisture and basin scale water fluxes.
Abstract: Subsistence farming communities are dependent on the landscape to provide the resource base upon which their societies can be built. A key component of this is the role of climate and the feedback between rainfall, crop growth, land clearance and their coupling to the hydrological cycle. Temporal fluctuations in rainfall alter the spatial distribution of water availability, which in turn is mediated by soil-type, slope and landcover. This pattern ultimately determines the locations within the landscape that can support agriculture and controls sustainability of farming practices. The representation of such a system requires us to couple together the dynamics of human and ecological systems and landscape change, each of which constitutes a significant modelling challenge on its own. Here we present a prototype coupled modelling system to simulate land-use change by bringing together three simple process models: (a) an agent-based model of subsistence farming; (b) an individual-based model of forest dynamics; and (c) a spatially explicit hydrological model which predicts distributed soil moisture and basin scale water fluxes. Using this modelling system we investigate how demographic changes influence deforestation and assess its impact on forest ecology, stream hydrology and changes in water availability.

Journal ArticleDOI
TL;DR: This paper reports on development and deployment of an EDSS that encompasses a new approach to DSS tools, generators and specific DSS applications, built upon a conceptualisation of terrestrial and aquatic environmental systems that has resulted in a robust and flexible system architecture.
Abstract: The concepts and technology of environmental decision support systems (EDSS) have developed considerably over recent decades, although core concepts such as flexibility and adaptability within a changing decision environment remain paramount. Much recent EDSS theory has focussed on model integration and re-use in decision support system (DSS) tools and for design and construction of 'DSS generators'. Many current specific DSS have architectures, tools, models and operational characteristics that are either fixed or difficult to change in the face of changing management needs. This paper reports on development and deployment of an EDSS that encompasses a new approach to DSS tools, generators and specific DSS applications. The system, named E2, is built upon a conceptualisation of terrestrial and aquatic environmental systems that has resulted in a robust and flexible system architecture. The architecture provides a set of base classes to represent fundamental concepts, and which can be instantiated and combined to form DSS generators of varying complexity. A DSS generator is described within which system users are able to select and link models, data, analysis tools and reporting tools to create specific DSS for particular problems, and for which new models and tools can be created and, through software reflection (introspection), discovered to provide expanded capability where required. This system offers a new approach within which environmental systems can be described in the form of specific DSS at a scale and level of complexity suited to the problems and needs of decision makers.

Journal ArticleDOI
TL;DR: A novel multi-objective genetic algorithm (MOGA) based on the NSGA-II algorithm, which uses metamodels to determine optimal sampling locations for installing pressure loggers in a water distribution system (WDS) when parameter uncertainty is considered.
Abstract: This paper presents a novel multi-objective genetic algorithm (MOGA) based on the NSGA-II algorithm, which uses metamodels to determine optimal sampling locations for installing pressure loggers in a water distribution system (WDS) when parameter uncertainty is considered. The new algorithm combines the multi-objective genetic algorithm with adaptive neural networks (MOGA-ANN) to locate pressure loggers. The purpose of pressure logger installation is to collect data for hydraulic model calibration. Sampling design is formulated as a two-objective optimization problem in this study. The objectives are to maximize the calibrated model accuracy and to minimize the number of sampling devices as a surrogate of sampling design cost. Calibrated model accuracy is defined as the average of normalized traces of model prediction covariance matrices, each of which is constructed from a randomly generated sampling set of calibration parameter values. This method of calculating model accuracy is called the 'full' fitness model. Within the genetic algorithm search process, the full fitness model is progressively replaced with the periodically (re)trained adaptive neural network metamodel where (re)training is done using the data collected by calling the full model. The methodology was first tested on a hypothetical (benchmark) problem to configure the required settings. Then the model was applied to a real case study. The results show that significant computational savings can be achieved by using the MOGA-ANN when compared to the approach where MOGA is linked to the full fitness model. When applied to the real case study, optimal solutions identified by MOGA-ANN are obtained 25 times faster than those identified by the full model without significant decrease in the accuracy of the final solution.
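The 'full' fitness model described above reduces to a short computation once the covariance matrices are available. The sketch below assumes each trace is normalized by the matrix dimension, which the abstract does not specify:

```python
import numpy as np

def calibration_accuracy(cov_matrices):
    """Average of normalized traces of model prediction covariance
    matrices, as in the 'full' fitness model (sketch; normalization
    by matrix dimension is an assumption, not the paper's definition).

    cov_matrices -- list of square covariance matrices, one per
                    randomly generated set of calibration parameters
    """
    traces = [np.trace(C) / C.shape[0] for C in cov_matrices]
    return float(np.mean(traces))
```

Lower values indicate smaller prediction variance, i.e. a more accurately calibrated model; a sampling design is then scored by this quantity together with the number of loggers it requires.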

Journal ArticleDOI
TL;DR: It is suggested that data-driven models can complement expert knowledge-based approaches and hence improve model reliability and have important implications on the application of expert knowledge in ecological studies, especially if this knowledge is extrapolated to other areas.
Abstract: Aquatic habitat suitability models have increasingly received attention due to their wide management applications. Ecological expert knowledge has been frequently incorporated in such models to link environmental conditions to the quantitative habitat suitability of aquatic species. Since the formalisation of problem-specific human expert knowledge is often difficult and tedious, data-driven machine learning techniques may be helpful to extract knowledge from ecological datasets. In this paper, both expert knowledge-based and data-driven fuzzy habitat suitability models were developed and the performance of these models was compared. For the data-driven models, a hill-climbing optimisation algorithm was applied to derive ecological knowledge from the available data. Based on the available ecological expert knowledge and on biological samples from the Zwalm river basin (Belgium), habitat suitability models were generated for the mayfly Baetis rhodani (Pictet 1843). Data-driven models appeared to outperform expert knowledge-based models substantially, while a step-forward model selection procedure indicated that physical habitat variables adequately described the mayfly habitat suitability in the studied area. This study has important implications on the application of expert knowledge in ecological studies, especially if this knowledge is extrapolated to other areas. The results suggest that data-driven models can complement expert knowledge-based approaches and hence improve model reliability.
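The abstract names a hill-climbing optimisation algorithm without detailing it. A generic higher-is-better hill-climbing loop of the kind used to tune fuzzy model parameters from data might look as follows (the function and parameter names are hypothetical, not the authors' implementation):

```python
import random

def hill_climb(score, params, step=0.1, iters=200, seed=0):
    """Simple hill climbing: perturb the current parameter vector and
    keep only perturbations that improve the score (sketch).

    score  -- callable returning a higher-is-better fitness
    params -- initial parameter vector (list of floats)
    """
    rng = random.Random(seed)
    best, best_s = list(params), score(params)
    for _ in range(iters):
        cand = [p + rng.uniform(-step, step) for p in best]
        s = score(cand)
        if s > best_s:                 # accept only improvements
            best, best_s = cand, s
    return best, best_s
```

In the habitat suitability setting, `score` would measure the agreement between the fuzzy model's predicted suitability and the biological samples, and `params` would encode the tunable parts of the rule base.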

Journal ArticleDOI
TL;DR: A comparison of uncertainty analysis approaches showed that, regarding the models evaluated here, the classical Bayesian method is more effective at discriminating models according to their uncertainty, but the GLUE approach performs similarly when it is based on the same founding assumptions as theBayesian method.
Abstract: Urban stormwater quality modelling plays a central role in evaluation of the quality of the receiving water body. However, the complexity of the physical processes that must be simulated and the limited amount of data available for calibration may lead to high uncertainty in the model results. This study was conducted to assess modelling uncertainty associated with catchment surface pollution evaluation. Eight models were compared based on the results of a case study in which there was limited data available for calibration. Uncertainty analysis was then conducted using three different methods: the Bayesian Monte Carlo method, the GLUE pseudo-Bayesian method and the GLUE method revised by means of a formal distribution of residuals between the model and measured data (GLUE_f). The uncertainty assessment of the models enabled evaluation of the advantages and limitations of the three methodologies adopted. The models were then tested using the quantity-quality data gathered for the Fossolo catchment in Bologna, Italy. The results revealed that all of the models evaluated here provided good calibration results, even if the model reliability (in terms of related uncertainty) varied, which may favour the adoption of one modelling approach over the others. Additionally, a comparison of uncertainty analysis approaches showed that, regarding the models evaluated here, the classical Bayesian method is more effective at discriminating models according to their uncertainty, but the GLUE approach performs similarly when it is based on the same founding assumptions as the Bayesian method.
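A minimal GLUE-style sampler can illustrate the pseudo-Bayesian method compared above. The uniform priors, the Nash-Sutcliffe-like informal likelihood and the behavioural threshold below are assumptions for the sketch, not the paper's exact choices:

```python
import random

def glue(simulate, priors, observed, n=2000, threshold=0.3, seed=1):
    """Minimal GLUE-style sampler (sketch): draw parameter sets from
    uniform priors, score each with an informal Nash-Sutcliffe-like
    likelihood, and keep the 'behavioural' sets above a threshold.

    simulate -- callable mapping a parameter list to simulated values
    priors   -- list of (low, high) bounds, one per parameter
    observed -- measured series the simulation is compared against
    """
    rng = random.Random(seed)
    mean_obs = sum(observed) / len(observed)
    var_obs = sum((o - mean_obs) ** 2 for o in observed)
    behavioural = []
    for _ in range(n):
        theta = [rng.uniform(lo, hi) for lo, hi in priors]
        sim = simulate(theta)
        sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
        ns = 1.0 - sse / var_obs       # informal likelihood measure
        if ns >= threshold:
            behavioural.append((theta, ns))
    return behavioural
```

The retained behavioural sets, weighted by their likelihoods, define the prediction bounds; the GLUE_f variant described in the abstract instead builds a formal likelihood from the residual distribution.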

Journal ArticleDOI
TL;DR: A simple fire danger index F that is intuitive and easy to calculate is introduced and compared to a number of fire danger indices pertaining to different fuel types that are used in an operational setting in Australia and the United States and suggest that F provides a plausible measure of fire Danger rating.
Abstract: Fire danger rating systems are used to assess the potential for bushfire occurrence, fire spread and difficulty of fire suppression. Typically, fire danger rating systems combine meteorological information with estimates of the moisture content of the fuel to produce a fire danger index. Fire danger indices are used to declare fire bans and to schedule prescribed burns, among other applications. In this paper a simple fire danger index F that is intuitive and easy to calculate is introduced and compared to a number of fire danger indices pertaining to different fuel types that are used in an operational setting in Australia and the United States. The comparisons suggest that F provides a plausible measure of fire danger rating and that it may be a useful pedagogical tool in the context of fire danger and fire weather.

Journal ArticleDOI
TL;DR: Two recently introduced multi-objective, hybrid algorithms, ParEGO and LEMMO, are tested on the design problem of a real medium-size network in Southern Italy, and a real large- size network in the UK under a scenario of a severely restricted number of function evaluations, suggesting that the use of both algorithms could be successfully extended to the efficient design of large-scale water distribution networks.
Abstract: The design of water distribution networks is a large-scale combinatorial, non-linear optimisation problem, involving many complex implicit constraint sets, such as nodal mass balance and energy conservation, which are commonly satisfied through the use of hydraulic network solvers. These problem properties have motivated several prior studies to use stochastic search optimisation, because these derivative-free global search algorithms have been shown to obtain higher quality solutions for large network design problems. Global stochastic search methods, however, require many iterations to be performed in order to achieve a satisfactory solution, and each iteration may involve running computationally expensive simulations. Recently, this problem has been compounded by the evident need to embrace more than a single measure of performance into the design process, since by nature multi-objective optimisation methods require even more iterations. The use of metamodels as surrogates for the expensive simulation functions has been investigated as a possible remedy to this problem. However, the identification of reliable surrogates is not always a viable alternative. Under these circumstances, methods that are capable of achieving a satisfactory level of performance with a limited number of function evaluations represent a valuable alternative. This paper represents a first step towards filling this gap. Two recently introduced multi-objective, hybrid algorithms, ParEGO and LEMMO, are tested on the design problem of a real medium-size network in Southern Italy, and a real large-size network in the UK under a scenario of a severely restricted number of function evaluations. The results obtained suggest that the use of both algorithms, in particular LEMMO, could be successfully extended to the efficient design of large-scale water distribution networks.

Journal ArticleDOI
TL;DR: An SDM based on an analogue approach has been developed within the Australian Bureau of Meteorology and applied to six regions covering the southern half of Australia, then driven by a selection of global climate models which contributed to the Intergovernmental Panel on Climate Change 4th assessment report released in 2007.
Abstract: Climate change information required for impact studies is of a much finer spatial scale than climate models can directly provide. Statistical downscaling models (SDMs) are commonly used to fill this scale gap. SDMs are based on the view that the regional climate is conditioned by two factors: (1) the large-scale climatic state and (2) local physiographic features. An SDM based on an analogue approach has been developed within the Australian Bureau of Meteorology and applied to six regions covering the southern half of Australia. Six surface predictands (daily minimum and maximum temperature and dew-point temperature, daily total rainfall and pan evaporation) were modelled. The skill of the SDMs is evaluated by comparing reconstructed and observed series using a range of metrics: first two moments of the series, the ability to reproduce day-to-day and inter-annual variability, and long-term trends. Once optimised, the SDMs are applied to a selection of global climate models which contributed to the Intergovernmental Panel on Climate Change 4th assessment report released in 2007. A user-friendly graphical interface has been developed to facilitate dissemination of the SDM results and provides a range of options for users to obtain tailored information. Once the projections are calculated for the places of interest, graphical outputs are displayed and can be downloaded jointly with the underlying data, allowing the user to use the data in their own application.
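The core of an analogue-based SDM is a nearest-neighbour search over a historical archive of large-scale states. The sketch below uses plain Euclidean distance over a generic state vector, whereas an operational model like the one described will use its own predictor selection and weighting:

```python
import math

def analogue_downscale(target_state, archive):
    """Analogue method sketch: find the historical day whose
    large-scale state is closest to the (climate-model) target state,
    and return the local observation recorded on that day.

    target_state -- tuple of large-scale predictor values
    archive      -- list of (state_tuple, local_observation) pairs
    """
    best_obs, best_d = None, math.inf
    for state, local_obs in archive:
        d = math.dist(state, target_state)   # Euclidean distance
        if d < best_d:
            best_d, best_obs = d, local_obs
    return best_obs
```

Because the method returns an actually observed local value, the downscaled series automatically respects local physiographic features, which is the second conditioning factor mentioned in the abstract.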

Journal ArticleDOI
TL;DR: Artificial neural networks were used to predict stormwater quality at urbanized catchments located throughout the United States and it is inferred that ANN models are not more applicable than regression models when predicting urbanStormwater quality.
Abstract: There are a vast number of complex, interrelated processes influencing urban stormwater quality. However, the lack of measured fundamental variables prevents the construction of process-based models. Furthermore, hybrid models such as the buildup-washoff models are generally crude simplifications of reality. This has created the need for statistical models, capable of making use of the readily accessible data. In this paper, artificial neural networks (ANN) were used to predict stormwater quality at urbanized catchments located throughout the United States. Five constituents were analysed: chemical oxygen demand (COD), lead (Pb), suspended solids (SS), total Kjeldahl nitrogen (TKN) and total phosphorus (TP). Multiple linear regression equations were initially constructed upon logarithmically transformed data. Input variables were primarily selected using a stepwise regression approach, combined with process knowledge. Variables found significant in the regression models were then used to construct ANN models. Other important network parameters such as learning rate, momentum and the number of hidden nodes were optimized using a trial and error approach. The final ANN models were then compared with the multiple linear regression models. In summary, ANN models were generally less accurate than the regression models and more time consuming to construct. This implies that ANN models are not more applicable than regression models when predicting urban stormwater quality.
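The baseline approach described above, multiple linear regression on logarithmically transformed data, can be sketched with ordinary least squares. The helper functions are illustrative, not the authors' code:

```python
import numpy as np

def fit_log_regression(X, y):
    """Fit log(y) = b0 + b . log(X) by ordinary least squares
    (sketch of regression on log-transformed stormwater data).

    X -- array-like of shape (n_samples, n_predictors), all positive
    y -- array-like of n_samples positive constituent concentrations
    """
    A = np.column_stack([np.ones(len(X)), np.log(X)])
    coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
    return coef

def predict_log_regression(coef, X):
    """Back-transform predictions to concentration units."""
    A = np.column_stack([np.ones(len(X)), np.log(X)])
    return np.exp(A @ coef)
```

In the study, predictors surviving the stepwise selection for each constituent (COD, Pb, SS, TKN, TP) would fill the columns of `X`; the same inputs then feed the ANN models being compared.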

Journal ArticleDOI
TL;DR: A new nestedness estimator that takes into account the weight of the interactions, that is, it runs over frequency matrices, is proposed, the first methodological approach that allows for the characterization of weighted nestedness.
Abstract: We propose a new nestedness estimator that takes into account the weight of the interactions, that is, it runs over frequency matrices. A nestedness measurement is calculated through the average distance from each matrix cell containing a link to the cell with the lowest marginal totals, in the packed matrix, using a weighted Manhattan distance. The significance of this nestedness measure is tested against a null model that constrains matrix fill to observed values and retains the distribution of number of events. This is the first methodological approach that allows for the characterization of weighted nestedness. We have developed a graphical user interface (GUI) running in Matlab to compute all these parameters. The software is also available as a script for R-package and in C++ version.
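The distance component of the estimator can be sketched as follows, assuming the packed matrix places the cell with the lowest marginal totals in the bottom-right corner and that each cell's contribution is weighted by its interaction frequency; the published metric's exact weighting scheme may differ:

```python
def weighted_nestedness_distance(M):
    """Average Manhattan distance from every non-zero cell of a
    packed frequency matrix to the bottom-right corner cell,
    weighting each cell by its interaction frequency (sketch).

    M -- packed frequency matrix as a list of equal-length rows
    """
    n_rows, n_cols = len(M), len(M[0])
    corner = (n_rows - 1, n_cols - 1)    # lowest marginal totals
    num = den = 0.0
    for i, row in enumerate(M):
        for j, w in enumerate(row):
            if w > 0:
                d = abs(i - corner[0]) + abs(j - corner[1])
                num += w * d
                den += w
    return num / den if den else 0.0
```

Smaller average distances mean the interaction weight is concentrated towards the well-connected corner of the packed matrix, i.e. a more nested structure; significance is then assessed against the null model described in the abstract.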

Journal ArticleDOI
TL;DR: The discussed applications of the SD approach conclude that it helps to conceptualize and simulate complex and dynamic water system processes deterministically which are otherwise partly simulated by conventional hydrologic and stochastic modelling approaches.
Abstract: The interaction among various water cycle components consists of complex, non-linear, and bidirectional (interdependent) biophysical processes which can be interpreted using feedback loops in a system dynamics (SD) environment. This paper demonstrates application of an SD approach with two case studies using a specialised software tool, Vensim. The first case study simulates water balance in a rice field system on a daily basis under aerobic conditions with provision of supplemental irrigation on demand. A physically based conceptual water balance model was developed and then implemented using Vensim to simulate the processes that occur in the field water balance system including percolation, surface runoff, actual evapotranspiration, and capillary rise. The second case study simulates surface-groundwater dynamic interactions in an irrigation area where river water and groundwater are two key sources of irrigation. The modelled system encompasses dynamically linked processes including seepage from the river, evaporation from a shallow watertable, groundwater storage, and lateral flow from upland to lowland areas. The model can be applied to simulate responses of different irrigation management scenarios, to develop strategies to improve water use efficiency and control the watertable, to prevent salinization in upland areas, and to reduce the cost of groundwater abstraction in lowland areas. The discussed applications of the SD approach conclude that it helps to conceptualize and simulate complex and dynamic water system processes deterministically which are otherwise partly simulated by conventional hydrologic and stochastic modelling approaches. It is recognised that conceptualization and implementation phases of this approach are challenging, however, the latter is greatly assisted by modern computer software.
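One daily step of the rice-field water balance can be sketched as a simple bucket model covering the processes listed for the first case study (percolation, surface runoff, actual evapotranspiration, capillary rise). The linear percolation rate and the spill-above-capacity runoff rule below are assumed simplifications, not the paper's Vensim formulation:

```python
def daily_water_balance(storage, rain, irrigation, et, capillary_rise,
                        capacity, perc_rate=0.1):
    """One daily time step of a field water-balance bucket (sketch).

    All fluxes in consistent depth units (e.g. mm/day).
    perc_rate -- assumed linear daily percolation fraction
    capacity  -- field storage above which surface runoff occurs
    Returns (new_storage, percolation, runoff).
    """
    # Add inflows, subtract actual evapotranspiration
    storage += rain + irrigation + capillary_rise - et
    # Percolation as a fixed fraction of stored water (assumption)
    percolation = perc_rate * max(storage, 0.0)
    storage -= percolation
    # Surface runoff: spill everything above field capacity
    runoff = max(storage - capacity, 0.0)
    storage -= runoff
    return max(storage, 0.0), percolation, runoff
```

Iterating this step over a daily climate series, and triggering supplemental irrigation when storage drops below a demand threshold, reproduces the feedback-loop structure that the SD environment expresses graphically.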