
Showing papers in "Environmental Modelling and Software in 2007"


Journal ArticleDOI
TL;DR: A revised version of the elementary effects method is proposed, improved in terms of both the definition of the measure and the sampling strategy, having the advantage of a lower computational cost.
Abstract: In 1991 Morris proposed an effective screening sensitivity measure to identify the few important factors in models with many factors. The method is based on computing for each input a number of incremental ratios, namely elementary effects, which are then averaged to assess the overall importance of the input. Despite its value, the method is still rarely used and instead local analyses varying one factor at a time around a baseline point are usually employed. In this piece of work we propose a revised version of the elementary effects method, improved in terms of both the definition of the measure and the sampling strategy. In the present form the method shares many of the positive qualities of the variance-based techniques, having the advantage of a lower computational cost, as demonstrated by the analytical examples. The method is employed to assess the sensitivity of a chemical reaction model for dimethylsulphide (DMS), a gas involved in climate change. Results of the sensitivity analysis open up the ground for model reconsideration: some model components may need a more thorough modelling effort while some others may need to be simplified.

1,528 citations
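
The revised elementary effects measure referred to above is commonly implemented as μ*, the mean of the absolute elementary effects over a set of random one-at-a-time trajectories. The following Python sketch illustrates that screening idea on a hypothetical toy model; it is not the authors' code, and the trajectory settings (r, p) and the toy model are illustrative assumptions only.

# Illustrative sketch (not the authors' code) of elementary effects screening
# with the revised mu* measure: the mean of |EE| over r random trajectories.
import numpy as np

def elementary_effects(model, k, r=20, p=4, seed=0):
    """Estimate mu* (mean |EE|) and sigma (std of EE) for k inputs on [0, 1]^k."""
    rng = np.random.default_rng(seed)
    delta = p / (2 * (p - 1))               # standard step for a p-level grid
    grid = np.arange(p // 2) / (p - 1)      # admissible base-point levels
    ee = np.zeros((r, k))
    for t in range(r):
        x = rng.choice(grid, size=k)        # random base point
        order = rng.permutation(k)          # random order of one-at-a-time moves
        y_old = model(x)
        for i in order:
            x_new = x.copy()
            x_new[i] = x[i] + delta         # perturb one factor by delta
            y_new = model(x_new)
            ee[t, i] = (y_new - y_old) / delta
            x, y_old = x_new, y_new         # continue along the trajectory
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

# toy model: factor 0 matters a lot, factor 2 not at all
mu_star, sigma = elementary_effects(lambda x: 10 * x[0] + x[1] ** 2 + 0 * x[2], k=3)
print(mu_star)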


Journal ArticleDOI
TL;DR: This study illustrates the usefulness of multivariate statistical techniques for analysis and interpretation of complex data sets, and in water quality assessment, identification of pollution sources/factors and understanding temporal/spatial variations in water quality for effective river water quality management.
Abstract: Multivariate statistical techniques, such as cluster analysis (CA), principal component analysis (PCA), factor analysis (FA) and discriminant analysis (DA), were applied for the evaluation of temporal/spatial variations and the interpretation of a large complex water quality data set of the Fuji river basin, generated during 8 years (1995–2002) of monitoring of 12 parameters at 13 different sites (14 976 observations). Hierarchical cluster analysis grouped 13 sampling sites into three clusters, i.e., relatively less polluted (LP), medium polluted (MP) and highly polluted (HP) sites, based on the similarity of water quality characteristics. Factor analysis/principal component analysis, applied to the data sets of the three different groups obtained from cluster analysis, resulted in five, five and three latent factors explaining 73.18, 77.61 and 65.39% of the total variance in water quality data sets of LP, MP and HP areas, respectively. The varifactors obtained from factor analysis indicate that the parameters responsible for water quality variations are mainly related to discharge and temperature (natural), organic pollution (point source: domestic wastewater) in relatively less polluted areas; organic pollution (point source: domestic wastewater) and nutrients (non-point sources: agriculture and orchard plantations) in medium polluted areas; and organic pollution and nutrients (point sources: domestic wastewater, wastewater treatment plants and industries) in highly polluted areas in the basin. Discriminant analysis gave the best results for both spatial and temporal analysis. It provided an important data reduction as it uses only six parameters (discharge, temperature, dissolved oxygen, biochemical oxygen demand, electrical conductivity and nitrate nitrogen), affording more than 85% correct assignations in temporal analysis, and seven parameters (discharge, temperature, biochemical oxygen demand, pH, electrical conductivity, nitrate nitrogen and ammoniacal nitrogen), affording more than 81% correct assignations in spatial analysis, of three different sampling sites of the basin. Therefore, DA allowed a reduction in the dimensionality of the large data set, delineating a few indicator parameters responsible for large variations in water quality. Thus, this study illustrates the usefulness of multivariate statistical techniques for analysis and interpretation of complex data sets, and in water quality assessment, identification of pollution sources/factors and understanding temporal/spatial variations in water quality for effective river water quality management.

1,481 citations
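
A minimal sketch of the two core steps described in this abstract, hierarchical clustering of monitoring sites followed by PCA on standardised parameters, is given below. The synthetic data, the use of per-site values rather than the full record, and the scipy/scikit-learn workflow are assumptions for illustration, not the study's actual procedure.

# Hedged sketch: cluster sites, then run PCA on standardised water quality data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# rows = monitoring sites, columns = water quality parameters (BOD, DO, EC, ...)
site_data = rng.normal(size=(13, 12))

z = StandardScaler().fit_transform(site_data)          # z-scale each parameter

# Ward hierarchical clustering of the 13 sites into three pollution groups
groups = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
print("site groups:", groups)

# PCA on the standardised data: leading components and their explained variance
pca = PCA(n_components=5).fit(z)
print("explained variance (%):", (100 * pca.explained_variance_ratio_).round(1))
print("PC1 loadings:", pca.components_[0].round(2))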


Journal ArticleDOI
TL;DR: It is concluded that uncertainty assessment is not just something to be added after the completion of the modelling work, but should be seen as a red thread throughout the modelling study starting from the very beginning.
Abstract: A terminology and typology of uncertainty is presented together with a framework for the modelling process, its interaction with the broader water management process and the role of uncertainty at different stages in the modelling processes. Brief reviews have been made of 14 different (partly complementary) methods commonly used in uncertainty assessment and characterisation: data uncertainty engine (DUE), error propagation equations, expert elicitation, extended peer review, inverse modelling (parameter estimation), inverse modelling (predictive uncertainty), Monte Carlo analysis, multiple model simulation, NUSAP, quality assurance, scenario analysis, sensitivity analysis, stakeholder involvement and uncertainty matrix. The applicability of these methods has been mapped according to purpose of application, stage of the modelling process and source and type of uncertainty addressed. It is concluded that uncertainty assessment is not just something to be added after the completion of the modelling work. Instead uncertainty should be seen as a red thread throughout the modelling study starting from the very beginning, where the identification and characterisation of all uncertainty sources should be performed jointly by the modeller, the water manager and the stakeholders.

1,112 citations


Journal ArticleDOI
TL;DR: Ten existing stormwater models are compared in relation to attributes relevant to modelling low-impact development urban stormwater drainage systems, and there are many areas for further model development, including broadening the range of contaminants.
Abstract: Low-impact development urban stormwater drainage systems (LID) are an increasingly popular method to reduce the adverse hydrologic and water quality effects of urbanisation. In this review, ten existing stormwater models are compared in relation to attributes relevant to modelling LID. The models are all based on conventional methods for runoff generation and routing, but half of the models add a groundwater/baseflow component and several include infiltration from LID devices. The models also use conventional methods for contaminant generation and treatment such as buildup-washoff conceptual models and first order decay processes, although some models add treatment mechanisms specific to particular types of LID device. Several models are capable of modelling distributed on-site devices with a fine temporal resolution and continuous simulation, yet the need for such temporal and spatial detail needs to be established. There is a trend towards incorporation of more types of LID into stormwater models, and some recent models incorporate a wide range of LID devices or measures. Despite this progress, there are many areas for further model development, many of which relate to stormwater models in general, including: broadening the range of contaminants; improving the representation of contaminant transport in streams and within treatment devices; treating baseflow components and runoff from pervious surfaces more thoroughly; linkage to habitat and toxicity models; linkage to automated calibration and prediction uncertainty models; investigating up-scaling for representation of on-site devices at a catchment level; and catchment scale testing of model predictions.

602 citations
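
The buildup-washoff concept mentioned in this review can be written in a common exponential form; the sketch below is a hedged illustration with invented parameter values, not taken from any of the ten reviewed models.

# Sketch of an exponential buildup-washoff pollutant model (illustrative values).
import numpy as np

def buildup_washoff(rain, b0=0.0, b_max=50.0, k_b=0.08, k_w=0.2, dt=1.0):
    """Simulate surface pollutant mass B (kg/ha) and the washoff load per step.

    rain : array of rainfall intensity (mm per time step).
    Dry steps: mass builds up exponentially towards b_max.
    Wet steps: mass is washed off at a rate proportional to intensity and B.
    """
    b, loads = b0, []
    for r in rain:
        if r > 0:                                   # washoff (first-order in B)
            washed = b * (1.0 - np.exp(-k_w * r * dt))
            b -= washed
        else:                                       # exponential buildup toward b_max
            washed = 0.0
            b = b_max - (b_max - b) * np.exp(-k_b * dt)
        loads.append(washed)
    return np.array(loads)

rain = np.array([0, 0, 0, 0, 5, 12, 3, 0, 0, 0], dtype=float)   # mm/h, synthetic
print(buildup_washoff(rain).round(2))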


Journal ArticleDOI
TL;DR: The paper discusses the importance of focusing on the transition to new management paradigms based on the insight that the systems to be managed are complex adaptive systems, and provides arguments for the role of social learning processes and the need to develop methods combining approaches from hard and soft systems analysis.
Abstract: Integrated environmental resources management is a purposeful activity with the goal of maintaining and improving the state of an environmental resource affected by human activities. In many cases different goals are in conflict and the notion 'integrated' clearly indicates that resources management should be approached from a broad perspective taking all potential trade-offs and different scales in space and time into account. However, we are still far from putting into practice an integrated resources management that fully takes into account the complexity of human-technology-environment systems. The tradition of resources management and of dealing with environmental problems is characterized by a command and control approach. The increasing awareness of the complexity of environmental problems and of human-technology-environment systems has triggered the development of new management approaches. The paper discusses the importance of focusing on the transition to new management paradigms based on the insight that the systems to be managed are complex adaptive systems. It provides arguments for the role of social learning processes and the need to develop methods combining approaches from hard and soft systems analysis. Soft systems analysis focuses on the importance of subjective perceptions and socially constructed reality. Soft systems methods and group model building techniques are quite common in management science where the prime target of management has always been the social system. Resources management is still quite slow to take up such innovations that should follow as a logical consequence of adopting an integrated management approach. Integrated water resources management is used as an example to provide evidence for the need to implement participatory and adaptive management approaches that are able to cope with increasing uncertainties arising from fast changing socio-economic conditions and global and climate change. Promising developments and future research directions are discussed. The paper concludes by pointing out the need for changes in the scientific community to improve the conditions for interdisciplinary, system-oriented and trans-disciplinary research.

495 citations


Journal ArticleDOI
TL;DR: A new methodology for predicting next-day hourly ozone concentrations, based on feedforward artificial neural networks using principal components as inputs, improved both models' predictions by reducing their complexity and eliminating data collinearity.
Abstract: The prediction of tropospheric ozone concentrations is very important due to the negative impacts of ozone on human health, climate and vegetation. The development of models to predict ozone concentrations is thus very useful because it can provide early warnings to the population and also reduce the number of measuring sites. The aim of this study was to predict next-day hourly ozone concentrations through a new methodology based on feedforward artificial neural networks using principal components as inputs. The developed model was compared with multiple linear regression, feedforward artificial neural networks based on the original data and also with principal component regression. Results showed that the use of principal components as inputs improved both models' predictions by reducing their complexity and eliminating data collinearity.

472 citations
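
The modelling idea described above, principal components feeding a feedforward neural network, can be sketched as follows; the synthetic data, the scikit-learn pipeline, the network size and the number of retained components are assumptions, not the study's configuration.

# Sketch: PCA-transformed inputs to a feedforward ANN, compared with plain MLR.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                       # e.g. meteorological predictors
y = X[:, 0] - 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=1000)   # synthetic "ozone"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# principal components decorrelate the inputs before they enter the network
ann_pca = make_pipeline(StandardScaler(), PCA(n_components=5),
                        MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000,
                                     random_state=0))
mlr = LinearRegression()

for name, model in [("ANN on principal components", ann_pca),
                    ("multiple linear regression", mlr)]:
    model.fit(X_tr, y_tr)
    print(name, "test R^2 =", round(model.score(X_te, y_te), 3))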


Journal ArticleDOI
TL;DR: The development of a weather generator for use in climate impact assessments of agricultural and water system management, which produces internally consistent series of meteorological variables including: rainfall, temperature, humidity, wind, sunshine, as well as derivation of potential evapotranspiration.
Abstract: This paper describes the development of a weather generator for use in climate impact assessments of agricultural and water system management. The generator produces internally consistent series of meteorological variables including: rainfall, temperature, humidity, wind, sunshine, as well as derivation of potential evapotranspiration. The system produces series at a daily time resolution, using two stochastic models in series: first, for rainfall which produces an output series which is then used for a second model generating the other variables dependent on rainfall. The series are intended for single sites defined nationally across the UK at a 5 km resolution, but can be generated to be representative across small catchments (<1000 km²). Scenarios can be generated for the control period (1961-1990) based on observed data, as well as for the UK Climate Impacts Programme (UKCIP02) scenarios for three time slices (2020s, 2050s and 2080s). Future scenarios are generated by fitting the models to observations which have been perturbed by application of change factors derived from the UKCIP02 mean projected changes in that variable. These change factors are readily updated, as new scenarios become available, and with suitable calibration data the approach could be extended to any geographical region.

469 citations
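
The two-stage structure described above, a rainfall model whose output then drives a second model for the remaining variables, might look roughly like the sketch below. The Markov-chain occurrence model, exponential rainfall amounts, conditional temperatures and the change-factor perturbation are all illustrative assumptions, not the UKCIP weather generator itself.

# Assumed-structure sketch of a two-stage daily stochastic weather generator.
import numpy as np

rng = np.random.default_rng(42)

def generate_daily_series(n_days, p_wet_given_dry=0.3, p_wet_given_wet=0.6,
                          mean_rain=5.0, t_dry=12.0, t_wet=9.0, t_sd=2.5,
                          rain_change_factor=1.0, temp_change=0.0):
    """Return (rain, temperature) series; change factors perturb the climatology."""
    rain = np.zeros(n_days)
    temp = np.zeros(n_days)
    wet = False
    for d in range(n_days):
        p_wet = p_wet_given_wet if wet else p_wet_given_dry   # first-order Markov chain
        wet = rng.random() < p_wet
        if wet:
            rain[d] = rng.exponential(mean_rain) * rain_change_factor
        # second model: temperature is generated conditional on the rainfall state
        temp[d] = rng.normal(t_wet if wet else t_dry, t_sd) + temp_change
    return rain, temp

# control run and a perturbed "future scenario" run
rain_ctl, temp_ctl = generate_daily_series(365)
rain_fut, temp_fut = generate_daily_series(365, rain_change_factor=1.1, temp_change=2.0)
print(round(rain_ctl.sum(), 1), round(rain_fut.sum(), 1))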


Journal ArticleDOI
TL;DR: This paper develops a detailed methodology for parameterising and evaluating Bayesian networks using a risk assessment case study, with the focus being on native fish communities in the Goulburn Catchment (Victoria, Australia).
Abstract: Catchment managers face considerable challenges in managing ecological assets. This task is made difficult by the variable and complex nature of ecological assets, and the considerable uncertainty involved in quantifying how various threats and hazards impact upon them. Bayesian approaches have the potential to address the modelling needs of environmental management. However, to date many Bayesian networks (Bn) developed for environmental management have been parameterised using knowledge elicitation only. Not only are these models highly qualitative, but the time and effort involved in elicitation of a complex Bn can often be overwhelming. Unfortunately in environmental applications, data alone are often too limited for parameterising a Bn. Consequently, there is growing interest in how to parameterise Bns using both data and elicited information. At present, there is little formal guidance on how to combine what can be learned from the data with what can be elicited. In a previous publication we proposed a detailed methodology for this process, focussing on parameterising and evaluating a Bn. In this paper, we further develop this methodology using a risk assessment case study, with the focus being on native fish communities in the Goulburn Catchment (Victoria, Australia).

437 citations
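
One common way to combine elicited knowledge with limited data when filling a Bayesian network's conditional probability tables is to treat the elicited probabilities as a Dirichlet prior with an equivalent sample size and update it with observed counts. The sketch below illustrates that idea only; it is not necessarily the methodology of the paper, and the numbers are invented.

# Hedged sketch: blend an elicited CPT row with observed counts (Dirichlet update).
import numpy as np

def combine_elicited_and_data(elicited_probs, equivalent_n, data_counts):
    """Posterior-mean CPT row for one parent configuration.

    elicited_probs : expert's probabilities for each child state (sums to 1)
    equivalent_n   : how many observations the expert opinion is "worth"
    data_counts    : observed counts of each child state for this configuration
    """
    prior = np.asarray(elicited_probs) * equivalent_n      # Dirichlet pseudo-counts
    posterior = prior + np.asarray(data_counts)
    return posterior / posterior.sum()

# Hypothetical example: the expert favours 70/20/10, but 25 field records
# suggest otherwise; the resulting CPT row sits in between.
print(combine_elicited_and_data([0.7, 0.2, 0.1], equivalent_n=10,
                                data_counts=[8, 10, 7]).round(3))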


Journal ArticleDOI
TL;DR: An open access web site that can be used by hydrologists and other scientists to evaluate time series models and includes an open forum that is intended to encourage further discussion and debate on the topic of hydrological performance evaluation metrics.
Abstract: This paper presents details of an open access web site that can be used by hydrologists and other scientists to evaluate time series models. There is at present a general lack of consistency in the way in which hydrological models are assessed that handicaps the comparison of reported studies and hinders the development of superior models. The HydroTest web site provides a wide range of objective metrics and consistent tests of model performance to assess forecasting skill. This resource is designed to promote future transparency and consistency between reported models and includes an open forum that is intended to encourage further discussion and debate on the topic of hydrological performance evaluation metrics. It is envisaged that the provision of such facilities will lead to the creation of superior forecasting metrics and the development of international benchmark time series datasets.

436 citations
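
The kind of objective metrics such a site standardises can be illustrated with two of the most widely used ones, root mean squared error and the Nash-Sutcliffe efficiency; the sketch below does not reproduce the actual HydroTest metric set.

# Minimal sketch of two common hydrological forecasting metrics (synthetic data).
import numpy as np

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def nash_sutcliffe(obs, sim):
    """1 - SSE / variance of observations: 1 is perfect, 0 is 'no better than the mean'."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2))

obs = [3.1, 4.5, 6.2, 9.8, 7.4, 5.0]          # observed flows (synthetic)
sim = [2.9, 4.9, 5.8, 10.4, 7.0, 5.3]         # simulated flows (synthetic)
print("RMSE:", round(rmse(obs, sim), 3), "NSE:", round(nash_sutcliffe(obs, sim), 3))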


Journal ArticleDOI
TL;DR: The background of recent developments in EDSS is given and a selected set of papers that were presented at the 2nd Biennial Conference of the International Society of Environmental Modelling and Software are summarised.
Abstract: Development of environmental decision support systems (EDSS) is rapidly progressing. The sustainable management of natural resources has a growing research focus as the complexity of interactions between socio-cultural, economic and biophysical system components is increasingly acknowledged. As better data and methods become available, the complexity of the system representation is increasing. At the same time realism and relevance are increasing, allowing direct support for management and policy development. This article gives the background of recent developments in EDSS and summarises a selected set of papers that were presented at the 2nd Biennial Conference of the International Society of Environmental Modelling and Software (IEMSS 2004). Recent developments show a continuum between integrated assessment modelling and EDSS with varying levels of stakeholder participation in both EDSS development and application. There is a general tendency towards better utilisation of interdisciplinary data, integration and visualisation of temporal and spatial results. Future developments appear directed towards better representation of reality in models, improving user-friendliness and use in a negotiation or group discussion context.

292 citations


Journal ArticleDOI
TL;DR: Some pros and cons of adopting Bns for water resource planning and management are analyzed by framing their use within the context of a participatory and integrated planning procedure, and exploring how they can be integrated with other types of models.
Abstract: Bayesian Networks (Bns) are emerging as a valid approach for modelling and supporting decision making in the field of water resource management. Based on the coupling of an interaction graph to a probabilistic model, they have the potential to improve participation and allow integration with other models. The wide availability of ready-to-use software with which Bn models can be easily designed and implemented on a PC is further contributing to their spread. Although a number of papers are available in which the application of Bns to water-related problems is investigated, the majority of these works use the Bn semantics to model the whole water system, and thus do not discuss their integration with other types of model. In this paper some pros and cons of adopting Bns for water resource planning and management are analyzed by framing their use within the context of a participatory and integrated planning procedure, and exploring how they can be integrated with other types of models.

Journal ArticleDOI
TL;DR: The proposed tool, named iCity - Irregular City, extends the traditional formalization of cellular automata to include an irregular spatial structure, asynchronous urban growth, and a high spatio-temporal resolution to aid in spatial decision making for urban planning.
Abstract: The objective of this study is to present a novel tool for predictive modelling of urban growth. The proposed tool, named iCity - Irregular City, extends the traditional formalization of cellular automata (CA) to include an irregular spatial structure, asynchronous urban growth, and a high spatio-temporal resolution to aid in spatial decision making for urban planning. The iCity software tool was developed as an embedded model within a common desktop geographic information system (GIS) with a user-friendly interface to control modelling operations for urban land-use change. This approach allows the model developer to focus on implementing model logic rather than developing an entire stand-alone modelling application. It also provides the model user with a familiar environment in which to run the model to simulate urban growth.

Journal ArticleDOI
TL;DR: Captain extends Matlab® to allow, in the most general case, for the identification and estimation of a wide range of unobserved components models, and focuses on models with both time variable and state dependent parameters.
Abstract: The Data-Based Mechanistic (DBM) modelling philosophy emphasises the importance of parametrically efficient, low order, 'dominant mode' models, as well as the development of stochastic methods and the associated statistical analysis required for their identification and estimation. Furthermore, it stresses the importance of explicitly acknowledging the basic uncertainty in the process, which is particularly important for the characterisation and forecasting of environmental and other poorly defined systems. The paper focuses on a Matlab®-compatible toolbox that has evolved from this DBM modelling research. Based around a state space and transfer function estimation framework, Captain extends Matlab® to allow, in the most general case, for the identification and estimation of a wide range of unobserved components models. Uniquely, however, Captain focuses on models with both time variable and state dependent parameters and has recently been implemented with the latest methodological developments in this regard. Here, the main innovations are: the automatic optimisation of the hyper-parameters, which define the statistical properties of the time variable parameters; the provision of smoothed as well as filtered parameter estimates; the robust and statistically efficient identification and estimation of both discrete and continuous time transfer function models; and the availability of various special model structures that have wide application potential in the environmental sciences.
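
Captain itself is a Matlab toolbox, but the core idea of a time-variable parameter model can be illustrated with a scalar random-walk state estimated by a Kalman filter, as in the Python sketch below; the noise variances, the synthetic data and the single-gain model are assumptions for illustration only.

# Illustrative sketch of a time-varying parameter estimated with a scalar Kalman filter.
import numpy as np

rng = np.random.default_rng(3)
n = 200
u = rng.normal(size=n)                               # input series
true_gain = np.linspace(0.5, 2.0, n)                 # slowly drifting "parameter"
y = true_gain * u + 0.1 * rng.normal(size=n)         # observed output

def tvp_kalman(y, u, q=1e-3, r=0.01):
    """Filtered estimate of a time-varying gain b_t in y_t = b_t * u_t + noise,
    with b_t modelled as a random walk (state noise q, observation noise r)."""
    b, p = 0.0, 1.0                                  # initial state and variance
    est = np.zeros(len(y))
    for t in range(len(y)):
        p += q                                       # predict: random-walk step
        k = p * u[t] / (u[t] ** 2 * p + r)           # Kalman gain
        b += k * (y[t] - b * u[t])                   # update with the new observation
        p *= (1.0 - k * u[t])
        est[t] = b
    return est

b_hat = tvp_kalman(y, u)
print(b_hat[:5].round(2), b_hat[-5:].round(2))       # should drift from ~0.5 towards ~2.0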

Journal ArticleDOI
TL;DR: Artificial neural networks are employed to estimate the daily total suspended sediment load on rivers, and the simulated sediment load hydrographs obtained by the two ANN methods are found to be closer to the observed ones than those obtained by multi-linear regression.
Abstract: Estimates of sediment load are required in a wide spectrum of water resources engineering problems. The nonlinear nature of suspended sediment load series necessitates the utilization of nonlinear methods for simulating the suspended sediment load. In this study artificial neural networks (ANNs) are employed to estimate the daily total suspended sediment load on rivers. Two different ANN algorithms, the feed-forward back-propagation (FFBP) method and the radial basis functions (RBF), were used for this purpose. The neural networks are trained using rainfall, flow and suspended sediment load data from the Juniata Catchment, USA. The ANNs provided satisfactory simulations in terms of the selected performance criteria, comparing well with conventional multi-linear regression. Similarly, the simulated sediment load hydrographs obtained by the two ANN methods were found to be closer to the observed ones than those obtained by multi-linear regression.

Journal ArticleDOI
TL;DR: The MULINO approach is presented, focusing on its potential for the current implementation process of the WFD, according to the recently released guidance documents and the experience gained in several case studies carried out during the research project.
Abstract: The EU Water Framework Directive, WFD (Dir. 2000/60/EC) introduces an innovative, integrated and holistic approach to the protection and management of water resources. New methodologies and tools are required to support implementation of the new policy. To fulfil these requirements, tools such as Decision Support Systems (DSSs) that integrate environmental, social and economic concerns and that facilitate the involvement of interested parties in the formulation of strategies may be useful. The MULINO project has developed a methodology and a DSS tool to tackle such problems. Focus is on connecting environmental tools and decision support methods by combining the DPSIR (Driving force, Pressure, State, Impact and Response) approach with multi-criteria analysis methods in a Decision Support System called mDSS. The proposed approach can be applied in decision processes in which a group of people (i.e. decision makers and stakeholders), share a common conceptual framework and procedure, to structure the problem, discuss the decision and communicate the proposed solution. In this paper, the MULINO approach is presented, focusing on its potential for the current implementation process of the WFD, according to the recently released guidance documents and the experience gained in several case studies carried out during the research project. The evaluation of the potential of the tool for applications in real-world management problems is carried out by taking into account the feedback from project partners and from end users, within and outside the research consortium.
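
The multi-criteria step that a DSS of this kind supports can be illustrated with a simple weighted summation over normalised criterion scores; the options, criteria and weights below are invented, and weighted summation is only one of several decision rules such tools typically offer.

# Hedged sketch of a weighted-sum multi-criteria ranking of response options.
import numpy as np

options = ["upgrade treatment plant", "buffer strips", "irrigation quotas"]
criteria = ["water quality gain", "cost", "social acceptance"]
# raw performance scores (rows = options); cost is a "less is better" criterion
scores = np.array([[0.9, 8.0, 0.6],
                   [0.5, 2.0, 0.8],
                   [0.6, 1.0, 0.3]])
benefit = np.array([True, False, True])              # direction of each criterion
weights = np.array([0.5, 0.3, 0.2])                  # e.g. elicited from stakeholders

# min-max normalisation, flipping cost-type criteria so that higher is better
lo, hi = scores.min(axis=0), scores.max(axis=0)
norm = (scores - lo) / (hi - lo)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

ranking = norm @ weights
for opt, val in sorted(zip(options, ranking), key=lambda p: -p[1]):
    print(f"{opt:28s} {val:.3f}")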

Journal ArticleDOI
TL;DR: This paper explores the use of three methods of parameter and predictive uncertainty analysis, and compares their performance when used in conjunction with a lumped parameter model for surface water flow (HSPF) in a large watershed.
Abstract: Where numerical models are employed as an aid to environmental management, the uncertainty associated with predictions made by such models must be assessed. A number of different methods are available to make such an assessment. This paper explores the use of three such methods, and compares their performance when used in conjunction with a lumped parameter model for surface water flow (HSPF) in a large watershed. Linear (or first-order) uncertainty analysis has the advantage that it can be implemented with virtually no computational burden. While the results of such an analysis can be extremely useful for assessing parameter uncertainty in a relative sense, and ascertaining the degree of correlation between model parameters, its use in analyzing predictive uncertainty is often limited. Markov Chain Monte Carlo (MCMC) methods are far more robust, and can produce reliable estimates of parameter and predictive uncertainty. As well as this, they can provide the modeler with valuable qualitative information on the shape of parameter and predictive probability distributions; these shapes can be quite complex, especially where local objective function optima lie within those parts of parameter space that are considered probable after calibration has been undertaken. Nonlinear calibration-constrained optimization can also provide good estimates of parameter and predictive uncertainty, even in situations where the objective function surface is complex. Furthermore, they can achieve these estimates using far fewer model runs than MCMC methods. However, they do not provide the same amount of qualitative information on the probability structure of parameter space as do MCMC methods, a situation that can be partially rectified by combining their use with an efficient gradient-based search method that is specifically designed to locate different local optima. All methods of parameter and predictive uncertainty analysis discussed herein are implemented using freely-available software. Hence similar studies, or extensions of the present study, can be easily undertaken in other modeling contexts by other modelers.
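
The linear (first-order) uncertainty analysis discussed above propagates a parameter covariance through the model via a sensitivity Jacobian; the sketch below shows that calculation for an invented two-parameter toy model rather than HSPF.

# Sketch of first-order uncertainty propagation via a finite-difference Jacobian.
import numpy as np

def model(theta):
    """Toy 'watershed' model: two predictions depending nonlinearly on two parameters."""
    k, s = theta
    return np.array([10.0 * np.exp(-k) + s, 5.0 * k * s])

def jacobian(f, theta, eps=1e-6):
    theta = np.asarray(theta, float)
    f0 = f(theta)
    J = np.zeros((len(f0), len(theta)))
    for j in range(len(theta)):
        step = np.zeros_like(theta)
        step[j] = eps
        J[:, j] = (f(theta + step) - f0) / eps        # forward differences
    return J

theta_hat = np.array([0.8, 2.0])                      # calibrated parameter values (invented)
C_theta = np.array([[0.01, 0.002],                    # posterior parameter covariance (invented)
                    [0.002, 0.04]])

J = jacobian(model, theta_hat)
C_pred = J @ C_theta @ J.T                            # first-order predictive covariance
print("prediction:", model(theta_hat).round(2))
print("prediction std devs:", np.sqrt(np.diag(C_pred)).round(3))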

Journal ArticleDOI
TL;DR: The VIDEO framework allows users to visually navigate large multi-objective solution sets while aiding decision makers in identifying one or more optimal designs, and is intended to provide an innovative exploration tool for high-order Pareto-optimal solution sets.
Abstract: This study presents a framework for Visually Interactive Decision-making and Design using Evolutionary Multi-objective Optimization (VIDEO). The VIDEO framework allows users to visually navigate large multi-objective solution sets while aiding decision makers in identifying one or more optimal designs. Specifically, the interactive visualization framework is intended to provide an innovative exploration tool for high-order Pareto-optimal solution sets (i.e., solution sets for three or more objectives). The framework is demonstrated for a long-term groundwater monitoring (LTM) application in which users can explore and visualize tradeoffs for up to four design objectives, simultaneously. Interactive functionality within the framework allows the user to select solutions within the objective space and visualize the corresponding monitoring plan's performance in the design space. This functionality provides the user with a holistic picture of the information provided by a particular solution, ultimately allowing them to make a more informed decision. In addition, the ease with which the framework allows users to navigate and compare solutions as well as design tradeoffs leads to a time efficient analysis, even when there are thousands of potential solutions.
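
The underlying object the VIDEO framework visualises, a high-order Pareto-optimal set, can be extracted from a set of candidate solutions with a simple dominance filter; the sketch below uses random placeholder objective values.

# Small sketch: keep only the non-dominated designs among minimisation objectives.
import numpy as np

def pareto_mask(objs):
    """Boolean mask of non-dominated rows; all objectives are to be minimised."""
    n = objs.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        if not keep[i]:
            continue
        # i is dominated if some other point is <= on all objectives and < on one
        dominated = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
        if dominated.any():
            keep[i] = False
    return keep

rng = np.random.default_rng(7)
objectives = rng.random((500, 4))        # e.g. cost, error, risk, sampling effort
front = objectives[pareto_mask(objectives)]
print(f"{front.shape[0]} non-dominated designs out of {objectives.shape[0]}")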

Journal ArticleDOI
TL;DR: Bns allow stakeholders' divergent values, interests and beliefs to be surfaced and negotiated in participatory processes for areas where conventional physically based groundwater models are insufficient due to lack of data, physical understanding, flexibility or lack of integration capability.
Abstract: Negotiation and active involvement with participation of water managers, experts, stakeholders and representatives of the general public requires decision support tools (Environmental Decision Support Systems; EDSS) that build on transparency and flexibility in order to reach sound action plans and management instruments. One possible EDSS for active involvement of stakeholders is application of Bayesian networks (Bns). The paper gives an example of a case study (The Danish case) where farmers and hydrologists disputed the degree to which pesticide application affected the quality of deep groundwater. Instead of selecting one opinion or another, the decision was made to include both in the Bns. By adopting this approach, it was possible to view the results from either point of view, accepting the reality of the situation, not becoming mired in an insoluble conflict, and in this way laying the foundation for future compromises. The paper explores Bns as a tool for acting on and dealing with management of groundwater protection. Bns allow stakeholders' divergent values, interests and beliefs to be surfaced and negotiated in participatory processes for areas where conventional physically based groundwater models are insufficient due to lack of data, physical understanding, flexibility or lack of integration capability. In this way, the agency will be able to address the institutional arrangement influencing groundwater protection in all its complexity.

Journal ArticleDOI
TL;DR: The Monte Carlo Analysis Toolbox (MCAT) is a Matlab library of visual and numerical analysis tools for the evaluation of hydrological and environmental models and also allows for the testing of hypotheses with respect to the model structure used.
Abstract: The detailed evaluation of mathematical models and the consideration of uncertainty in the modeling of hydrological and environmental systems are of increasing importance, and are sometimes even demanded by decision makers. At the same time, the growing complexity of models to represent real-world systems makes it more and more difficult to understand model behavior, sensitivities and uncertainties. The Monte Carlo Analysis Toolbox (MCAT) is a Matlab library of visual and numerical analysis tools for the evaluation of hydrological and environmental models. Input to the MCAT is the result of a Monte Carlo or population evolution based sampling of the parameter space of the model structure under investigation. The MCAT can be used off-line, i.e. it does not have to be connected to the evaluated model, and can thus be used for any model for which an appropriate sampling can be performed. The MCAT contains tools for the evaluation of performance, identifiability, sensitivity, predictive uncertainty and also allows for the testing of hypotheses with respect to the model structure used. In addition to research applications, the MCAT can be used as a teaching tool in courses that include the use of mathematical models.
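
The kind of off-line analysis MCAT performs can be illustrated with a Monte Carlo parameter sample split into behavioural and non-behavioural runs (a simple regional sensitivity analysis). The toy model, sampling ranges and behavioural threshold below are assumptions, and the real toolbox offers many more plots and measures.

# Sketch: Monte Carlo sampling plus a behavioural/non-behavioural identifiability check.
import numpy as np

rng = np.random.default_rng(11)

def toy_model_error(params):
    """Pretend calibration error: sensitive to p1, almost insensitive to p2."""
    p1, p2 = params[:, 0], params[:, 1]
    return (p1 - 0.3) ** 2 + 0.01 * (p2 - 0.5) ** 2 + 0.01 * rng.random(len(p1))

samples = rng.uniform(0.0, 1.0, size=(5000, 2))       # uniform parameter sampling
error = toy_model_error(samples)

behavioural = error < np.quantile(error, 0.1)         # best 10% of runs
for j, name in enumerate(["p1", "p2"]):
    spread_all = np.percentile(samples[:, j], [5, 95])
    spread_beh = np.percentile(samples[behavioural, j], [5, 95])
    print(f"{name}: 5-95% range overall {spread_all.round(2)}, "
          f"behavioural {spread_beh.round(2)}")
# p1's behavioural range collapses around 0.3 (identifiable); p2's barely changes.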

Journal ArticleDOI
TL;DR: An integrated model framework based on a Bayesian network (Bn) is presented, used to assess the sustainability of eight coastal lake-catchment systems, located on the coast of New South Wales (NSW), Australia.
Abstract: Coastal lakes are ecosystems of significant value generating many ecological, social and economic benefits. Increasing demands resulting from urban development and other human activities within coastal lake catchments have the potential to result in their degradation and can lead to conflicts, for example between lake users and upstream communities. There are many techniques that can be used to integrate the variables involved in such conflicts including system dynamics, meta-modelling, and coupled component models, but many of these techniques are too complex for catchment managers to employ on a routine basis. The overall result is the potential to compromise the sustainability of these important ecosystems. This paper describes research to address this problem. It presents the development of an integrated model framework based on a Bayesian network (Bn). Bns are used to assess the sustainability of eight coastal lake-catchment systems, located on the coast of New South Wales (NSW), Australia. The paper describes the potential advantages in the use of Bns and the methods used to develop their frameworks. A case study application for the Cudgen Lake of northern NSW is presented to illustrate the techniques. The case study includes a description of the relevant management issues being considered, the model framework and the techniques used to derive input data. Results for the case study application and their implications for management are presented and discussed. Finally, the directions for future research and a discussion of the applicability of Bn techniques to support management in similar situations are proffered.

Journal ArticleDOI
TL;DR: A toolkit for distributed hydrologic modeling at multiple scales using two independent models within a geographic information system is presented and examples of the use of AGWA for watershed modeling and assessment at a range of scales are described.
Abstract: A toolkit for distributed hydrologic modeling at multiple scales using two independent models within a geographic information system is presented. This open-source, freely available software was developed through a collaborative endeavor involving two universities and two government agencies. Called the Automated Geospatial Watershed Assessment tool (AGWA), this software is written for the ArcView GIS platform and is distributed as an extension via the Internet. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both the Soil and Water Assessment Tool (SWAT) and the Kinematic Runoff and Erosion model (KINEROS2). These two distributed hydrologic models operate at different time scales and are suitable for application across a range of spatial scales. Descriptions of the GIS framework, hydrologic models, spatial analyses and algorithms that control the modeling process are given. Model requirements, limitations on the model applications and calibration techniques are described with examples of the use of AGWA for watershed modeling and assessment at a range of scales.

Journal ArticleDOI
TL;DR: This paper shows that the urban forms in historical areas with narrower roads, complex road networks and a higher density of intersections lead to lower traffic volumes and thus lower noise pollution, but the greater street canyon effects in these historical urban areas lead to higher carbon monoxide (CO) concentrations.
Abstract: Based on the work of Jensen [Jensen, S.S., 1998. Mapping human exposure to traffic air pollution using GIS. Journal of Hazardous Materials 61(1-3), 385-392; Jensen, S.S., 1999. A geographic approach to modelling human exposure to traffic air pollution using GIS. Ph.D. Thesis. National Environmental Research Institute, Roskilde], a prototype system for modelling noise and air pollution is developed for the Macao Peninsula. The system integrates a road traffic noise model, an operational air pollution model, digital maps, an urban landscape model and a Geographic Information System (GIS). Compared with mesoscale model systems with input/output resolution in kilometres, the present one has a higher spatial resolution down to individual buildings along both sides of the street. Applying the developed model system, a preliminary study investigates the ways that four urban forms existing nowadays on the Macao Peninsula influence vehicle transport and street environment. This paper shows that the urban forms in historical areas with narrower roads, complex road networks and a higher density of intersections lead to lower traffic volumes and thus lower noise pollution. However, the greater street canyon effects in these historical urban areas lead to higher carbon monoxide (CO) concentrations.

Journal ArticleDOI
TL;DR: Results are presented on the dynamics of land-use change under different growth management strategies based on an area of the Dallas-Fort Worth (Texas, U.S.A.) region facing intense residential development.
Abstract: A major force affecting many forest ecosystems is the encroachment of residential, commercial and industrial development. Analysis of the complex interactions between development decisions and ecosystems, and how the environmental consequences of these decisions influence human values and subsequent decisions will lead to a better understanding of the environmental consequences of private choices and public policies. Determining conditions of the interactions between human decisions and natural systems that lead to long-term sustainability of forest ecosystems is one goal of this work. Interactions between human stakeholders are represented using multi-agent models that act on forest landscape models in the form of land-use change. Feedback on the effects of these actions is received through ecological habitat metrics and hydrological responses. Results are presented on the dynamics of land-use change under different growth management strategies based on an area of the Dallas-Fort Worth (Texas, U.S.A.) region facing intense residential development.

Journal ArticleDOI
TL;DR: A concept is outlined for the use of techniques of decision analysis to structure scientist and stakeholder involvement in river rehabilitation decisions to support consensus-building among stakeholders and stimulate the creation of alternatives with a greater degree of consensus.
Abstract: River rehabilitation decisions, like other decisions in environmental management, are often taken by authorities without sufficient transparency about how different goals, predictions, and concerns were considered during the decision making process. This can lead to lack of acceptance or even opposition by stakeholders. In this paper, a concept is outlined for the use of techniques of decision analysis to structure scientist and stakeholder involvement in river rehabilitation decisions. The main elements of this structure are (i) an objectives hierarchy that facilitates and stimulates explicit discussion of goals, (ii) an integrative probability network model for the prediction of the consequences of rehabilitation alternatives, and (iii) a mathematical representation of preferences for possible outcomes elicited from important stakeholders. This structure leads to transparency about expectations of outcomes by scientists and valuations of these outcomes by stakeholders and decision makers. It can be used (i) to analyze synergies and conflict potential between stakeholders, (ii) to analyze the sensitivity of alternative-rankings to uncertainty in prediction and valuation, and (iii) as a basis for communicating the reasons for the decision. These analyses can be expected to support consensus-building among stakeholders and stimulate the creation of alternatives with a greater degree of consensus. Because most decisions in environmental management are characterized by similarly complex scientific problems and diverse stakeholders, the outlined methodology will be easily transferable to other settings.

Journal ArticleDOI
TL;DR: It is argued that technical and scientific aspects of Policy Support Systems are not the sole elements determining their use in practice, and the paper concludes with some lessons learned during the development and use of the MedAction PSS and similar systems.
Abstract: Planners, policy-makers and their technicians have the difficult task of intervening in complex human-natural systems. It is not enough for them to focus on individual processes; rather it is necessary to address the system as a complex integral whole. In the given circumstances, integrated models as part of Policy Support Systems (PSS) can provide support. The MedAction PSS incorporates socio-economic and physical processes in a strongly coupled manner. It is implemented with the GEONAMICA application framework and is intended to support planning and policy making in the fields of land degradation, desertification, water management and sustainable farming. The objective of this paper is to provide some insight into the individual models, the model integration achieved, as well as the actual use of the MedAction PSS. For the latter an application example is developed. The paper also argues that technical and scientific aspects of Policy Support Systems are not the sole elements determining their use in practice, and concludes with some lessons learned during the development and use of the MedAction PSS and similar systems.

Journal ArticleDOI
TL;DR: A neural network model is presented for predicting the methane fraction in landfill gas originating from field-scale landfill bioreactors, and the anaerobic conversion efficiencies are evaluated based on leachate characteristics during different time periods.
Abstract: In this study we present a neural network model for predicting the methane fraction in landfill gas originating from field-scale landfill bioreactors. Landfill bioreactors were constructed at the Odayeri Sanitary Landfill, Istanbul, Turkey, and operated with (C2) and without (C1) leachate recirculation. The refuse height of the test cell was 5 m, with a placement area of 1250 m² (25 m × 50 m). We monitored the leachate and landfill gas components for 34 months, after which we modeled the methane fraction in landfill gas from the bioreactors (C1 and C2) using artificial neural networks; leachate components were used as input parameters. To predict the methane fraction in landfill gas as a final product of anaerobic digestion, we used input parameters such as pH, alkalinity, Chemical Oxygen Demand, sulfate, conductivity, chloride and waste temperature. We evaluated the anaerobic conversion efficiencies based on leachate characteristics during different time periods. We determined the optimal architecture of the neural network, and advantages, disadvantages and further developments of the network are discussed.

Journal ArticleDOI
TL;DR: An enhancement to the basic ASM3 model is proposed, introducing a two-step model for the nitrification process and, consequently, considering denitrification on both nitrite and nitrate.
Abstract: A common limitation of the Activated Sludge Models (ASM) [Henze, M., Gujer, W., Mino, T., van Loosdrecht, M.C.M., 2000. Activated Sludge Models ASM1, ASM2, ASM2d, and ASM3. IWA Scientific and Technical Report No. 9. IWA Publishing, London, UK] is the representation of the nitrification dynamics as a single-step process and the consequent denitrification on nitrate alone. This generally acknowledged simplification may represent a serious limitation in specific applications where nitrites become important, either as a target final product or an unwanted intermediate. This paper proposes an enhancement to the basic ASM3 model, introducing a two-step model for the nitrification process and, consequently, considering denitrification on both nitrite and nitrate. After introducing the relevant process kinetics and adapting the stoichiometric matrix accordingly, the model implementation in the Matlab/Simulink™ platform is described with reference to the benchmark setting. To obtain a fast implementation, the process units (reaction tanks and secondary settler) have been implemented as DLLs linked to the Simulink blocks, whereas the model parameters and stoichiometric matrix remain accessible to the user. The new model is compared with the standard ASM3 and checked for consistency and mass conservation. It is also shown that with the default kinetic parameters nitrite may represent a considerable fraction of the nitrified effluent, thus revealing a design limitation in the benchmark sizing. In the last part, an optimization of the benchmark plant volumes has been attempted in order to minimize such violations, resulting in a moderate increase of the overall reaction volume. The pertinent software is freely available for research purposes.
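
The two-step nitrification structure (ammonium to nitrite to nitrate) can be sketched with simple Monod kinetics as below; this is emphatically not the extended ASM3 model of the paper (no biomass growth, oxygen limitation, settler or denitrification), and the rate constants and half-saturation values are illustrative.

# Greatly simplified two-step nitrification sketch integrated with scipy.
import numpy as np
from scipy.integrate import solve_ivp

def two_step_nitrification(t, y, r_nh=2.0, k_nh=1.0, r_no2=3.0, k_no2=0.5):
    nh4, no2, no3 = y
    rate1 = r_nh * nh4 / (k_nh + nh4)          # ammonium oxidation to nitrite
    rate2 = r_no2 * no2 / (k_no2 + no2)        # nitrite oxidation to nitrate
    return [-rate1, rate1 - rate2, rate2]      # nitrogen is conserved across the steps

sol = solve_ivp(two_step_nitrification, (0.0, 10.0), y0=[25.0, 0.0, 0.0],
                t_eval=np.linspace(0.0, 10.0, 6))
for t, (nh4, no2, no3) in zip(sol.t, sol.y.T):
    print(f"t={t:4.1f}  NH4={nh4:5.2f}  NO2={no2:5.2f}  NO3={no3:5.2f}")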

Journal ArticleDOI
TL;DR: EvoLand, which uses an actor-based approach to conduct alternative futures analyses in the Willamette Basin, Oregon, is examined; the complexity of these coupled human and natural systems challenges the modeling community to provide tools that sufficiently capture the richness of human and ecosystem processes and interactions in ways that are computationally tractable and understandable.
Abstract: Increasingly, models (and modelers) are being asked to address the interactions between human influences, ecological processes, and landscape dynamics that impact many diverse aspects of managing complex coupled human and natural systems. These systems may be profoundly influenced by human decisions at multiple spatial and temporal scales, and the limitations of traditional process-level ecosystems modeling approaches for representing the richness of factors shaping landscape dynamics in these coupled systems have resulted in the need for new analysis approaches. New tools in the areas of spatial data management and analysis, multicriteria decision-making, individual-based modeling, and complexity science have all begun to impact how we approach modeling these systems. The term 'biocomplexity' has emerged as a descriptor of the rich patterns of interactions and behaviors in human and natural systems, and the challenges of analyzing biocomplex behavior are resulting in a convergence of approaches leading to new ways of understanding these systems. Important questions related to system vulnerability and resilience, adaptation, feedback processing, cycling, non-linearities and other complex behaviors are being addressed using models employing new representational approaches to analysis. The complexity inherent in these systems challenges the modeling community to provide tools that sufficiently capture the richness of human and ecosystem processes and interactions in ways that are computationally tractable and understandable. We examine one such tool, EvoLand, which uses an actor-based approach to conduct alternative futures analyses in the Willamette Basin, Oregon.

Journal ArticleDOI
TL;DR: A generalised conceptual framework is provided for considering these types of interactions and their representation in integrated water allocation models, and applications to three very different case studies are outlined.
Abstract: Nodal network approaches are a common framework for considering water allocation in river basins. In this type of model framework, a river basin is represented as a series of nodes, where nodes generally represent key points of extraction or instream use. When considering water allocation, agricultural production and other water use decisions generally interact with the stream system in two ways: they can affect the generation of runoff and thus the volume of water reaching the stream; or, they may involve direct extraction or use of water once it has reached the stream. Models are generally required to consider the influence of these decisions on flows and downstream water availability, as well as the influence of flows on the productive, passive use and environmental values of water. This paper provides a generalised conceptual framework for considering these types of interactions and their representation in integrated water allocation models. Applications of this framework to three very different case studies are outlined.
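
The nodal network idea described above can be sketched as a simple upstream-to-downstream routing with extractions capped by availability; the network, inflows and demands below are invented, and real allocation models add storages, losses, priorities and environmental flow rules.

# Simple sketch of nodal water allocation along a single river reach.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    local_inflow: float = 0.0      # runoff entering the stream at this node (GL)
    demand: float = 0.0            # requested extraction at this node (GL)
    extracted: float = 0.0
    outflow: float = 0.0

def route(nodes):
    """Nodes are ordered upstream to downstream along a single river reach."""
    upstream = 0.0
    for n in nodes:
        available = upstream + n.local_inflow
        n.extracted = min(n.demand, available)        # cannot take more than is there
        n.outflow = available - n.extracted
        upstream = n.outflow
    return nodes

river = [Node("headwater", local_inflow=120.0),
         Node("irrigation district", local_inflow=15.0, demand=80.0),
         Node("town supply", local_inflow=5.0, demand=30.0),
         Node("estuary", local_inflow=0.0)]

for n in route(river):
    print(f"{n.name:20s} extracted={n.extracted:6.1f}  outflow={n.outflow:6.1f}")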

Journal ArticleDOI
TL;DR: This paper investigates the application of a numerical modelling system for large area hazard analysis in snow avalanche modelling and shows how the avalanche modelling system was applied over the mountainous region of Switzerland to delineate forests with protective function against avalanches.
Abstract: Snow avalanches threaten settlements and roads in steep mountainous areas. Hazard mitigation strategies apply numerical models in combination with GIS-based methods to determine runout distances and pressure maps of snow avalanches in three-dimensional terrain. The snow avalanche modelling system is usually applied to study single avalanche tracks. In this paper we investigate the application of a numerical modelling system for large-area hazard analysis. We begin by briefly presenting the depth-averaged equations governing avalanche flow. Then, we describe the statistical and GIS-based methods that are applied to define the initial fracture depths and release areas for snow avalanche modelling. We discuss the calibration of the avalanche model friction coefficients for extreme avalanches as a function of altitude, avalanche size and topography. Seven test sites with areas between 100 and 350 km², which are well distributed over the different snow climates and elevation ranges of Switzerland, were used to calibrate the model by comparing the simulation results with historic avalanche events and existing avalanche hazard maps. We then show how the avalanche modelling system was applied over the mountainous region of Switzerland (25,000 km²) to delineate forests with a protective function against avalanches.
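
The friction calibration discussed above concerns depth-averaged flow models solved over 3-D terrain; a much simpler point-mass analogue with Voellmy-type friction (dry-friction coefficient mu and turbulent coefficient xi) can still illustrate how runout distance responds to mu, as in the sketch below, whose slope profile and parameter values are invented.

# Heavily simplified point-mass runout sketch with Voellmy-type friction.
import numpy as np

def runout_distance(mu=0.2, xi=2000.0, flow_depth=1.5, g=9.81, dt=0.1):
    """Integrate dv/dt = g*(sin(a) - mu*cos(a)) - g*v^2/(xi*h) until the mass stops."""
    x, v = 0.0, 0.1                                   # position along path (m), speed (m/s)
    while v > 0.0:
        angle = np.radians(35.0 * np.exp(-x / 800.0)) # slope angle decays from 35 degrees
        a = g * (np.sin(angle) - mu * np.cos(angle)) - g * v ** 2 / (xi * flow_depth)
        v = max(v + a * dt, 0.0)
        x += v * dt
    return x

for mu in (0.15, 0.25, 0.35):                         # calibration parameter of interest
    print(f"mu={mu:.2f}  runout ~ {runout_distance(mu=mu):7.0f} m")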