Other affiliations: University of New Brunswick, Hunan University, University of Regina
Bio: Yuefei Huang is an academic researcher from Tsinghua University. The author has contributed to research in topics: Environmental science & Vapour Pressure Deficit. The author has an h-index of 33 and has co-authored 90 publications receiving 2,966 citations. Previous affiliations of Yuefei Huang include University of New Brunswick & Hunan University.
Papers published on a yearly basis
TL;DR: Two popular variants of Recurrent Neural Network, Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks, were employed to develop new data-driven flood forecasting models, showing that GRU models perform as well as LSTM models and that GRU may be the preferred method for short-term runoff prediction.
Abstract: Runoff forecasting is an important approach for flood mitigation. Many machine learning models have been proposed for runoff forecasting in recent years. To reconstruct the time series of runoff data into a standard machine learning dataset, a sliding window method is usually used to pre-process the data, with the size of the window as a variable parameter commonly referred to as the time step. Conventional machine learning methods, such as artificial neural network models (ANN), require optimization of the time step because both too small and too large time steps reduce prediction accuracy. In this work, two popular variants of Recurrent Neural Network (RNN), named Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks, were employed to develop new data-driven flood forecasting models. GRU and LSTM models are in theory able to filter redundant information automatically, and therefore a large time step is expected not to reduce prediction accuracy. The three models (LSTM, GRU, and ANN) were applied to simulate runoff in the Yutan station control catchment, Fujian Province, Southeast China, using hourly discharge measurements of one runoff station and hourly rainfall of four rainfall stations from 2000 to 2014. Results show that the prediction accuracy of LSTM and GRU models increases with increasing time step and eventually stabilizes. This allows selection of a relatively large time step in practical runoff prediction without first evaluating and optimizing the time step, as required by conventional machine learning models. We also show that LSTM and GRU models perform better than ANN models when the time step is optimized. GRU models have fewer parameters and less complicated structures than LSTM models, and our results show that GRU models perform as well as LSTM models. GRU may be the preferred method in short-term runoff prediction since it requires less time for model training.
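The sliding-window pre-processing described in the abstract can be sketched as follows (a minimal illustration with made-up hourly values; the function name and time step are assumptions, not the authors' code):

```python
import numpy as np

def sliding_window(series, time_step):
    """Reshape a 1-D runoff series into (samples, time_step) inputs
    and next-step targets, as in the usual pre-processing step."""
    X, y = [], []
    for i in range(len(series) - time_step):
        X.append(series[i:i + time_step])
        y.append(series[i + time_step])
    return np.array(X), np.array(y)

# illustrative hourly discharge readings (hypothetical values)
q = np.arange(10.0, 20.0)      # 10 hourly readings
X, y = sliding_window(q, time_step=3)
print(X.shape, y.shape)        # (7, 3) (7,)
```

Each row of `X` holds `time_step` consecutive hours and `y` holds the hour that follows; the paper's point is that for LSTM/GRU this `time_step` can be set generously without hurting accuracy.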
TL;DR: A multistage fuzzy-stochastic programming (MFSP) model is developed for tackling uncertainties presented as fuzzy sets and probability distributions and a vertex analysis approach is proposed for solving multiple fuzzy sets in the MFSP model.
Abstract: In this study, a multistage fuzzy-stochastic programming (MFSP) model is developed for tackling uncertainties presented as fuzzy sets and probability distributions. A vertex analysis approach is proposed for solving multiple fuzzy sets in the MFSP model. Solutions under a set of α-cut levels can be generated by solving a series of deterministic submodels. The developed method is applied to the planning of a case study for water-resources management. Dynamics and uncertainties of water availability (and thus water allocation and shortage) could be taken into account through generation of a set of representative scenarios within a multistage context. Moreover, penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised water-allocation targets are violated. The modeling results can help to generate a range of alternatives under various system conditions, and thus help decision makers to identify desired water-resources management policies under uncertainty.
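The α-cut step behind the vertex analysis can be illustrated with a toy triangular fuzzy number (the coefficient values are hypothetical; in the actual MFSP model a deterministic submodel would be solved at each interval endpoint):

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at level alpha in [0, 1]."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

# hypothetical fuzzy water-availability coefficient (a, m, b)
fuzzy_coeff = (2.0, 3.0, 5.0)
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(fuzzy_coeff, alpha)
    # vertex analysis: one deterministic submodel per interval endpoint
    print(alpha, lo, hi)
```

At α = 1 the interval collapses to the most-likely value, and lower α-cut levels widen the interval, which is why solutions are reported per α-cut level.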
TL;DR: The IFSRA integrated fuzzy logic, expert involvement, and stochastic simulation within a general framework was applied to a petroleum-contaminated groundwater system in western Canada and provided more realistic support for remediation-related decisions.
Abstract: An integrated fuzzy-stochastic risk assessment (IFSRA) approach was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with site conditions, environmental guidelines, and health impact criteria. The contaminant concentrations in groundwater predicted from a numerical model were associated with probabilistic uncertainties due to the randomness in modeling input parameters, while the consequences of contaminant concentrations violating relevant environmental quality guidelines and health evaluation criteria were linked with fuzzy uncertainties. The contaminant of interest in this study was xylene. The environmental quality guideline was divided into three different strictness categories: "loose", "medium" and "strict". The environmental-guideline-based risk (ER) and health risk (HR) due to xylene ingestion were systematically examined to obtain the general risk levels through a fuzzy rule base. The ER and HR risk levels were divided into five categories of "low", "low-to-medium", "medium", "medium-to-high" and "high", respectively. The general risk levels included six categories ranging from "low" to "very high". The fuzzy membership functions of the related fuzzy events and the fuzzy rule base were established based on a questionnaire survey. Thus the IFSRA integrated fuzzy logic, expert involvement, and stochastic simulation within a general framework. The robustness of the modeling processes was enhanced through the effective reflection of the two types of uncertainties as compared with the conventional risk assessment approaches. The developed IFSRA was applied to a petroleum-contaminated groundwater system in western Canada. Three scenarios with different environmental quality guidelines were analyzed, and reasonable results were obtained. The risk assessment approach developed in this study offers a unique tool for systematically quantifying various uncertainties in contaminated site management, and it also provides more realistic support for remediation-related decisions.
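The mapping from ER and HR levels to a general risk level can be sketched with a crude numeric stand-in for the rule base (the real rule base was derived from a questionnaire survey; the aggregation rule below is purely hypothetical):

```python
# Five-category ER/HR scales and the six-category general risk scale from the paper.
ER_HR_LEVELS = ["low", "low-to-medium", "medium", "medium-to-high", "high"]
GENERAL_LEVELS = ["low", "low-to-medium", "medium", "medium-to-high",
                  "high", "very high"]

def general_risk(er, hr):
    """Hypothetical rule: average the ER and HR ranks (0-4 each) and
    rescale onto the 0-5 general risk scale."""
    i, j = ER_HR_LEVELS.index(er), ER_HR_LEVELS.index(hr)
    return GENERAL_LEVELS[int((i + j) * 5 / 8 + 0.5)]

print(general_risk("medium", "high"))
```

A real fuzzy rule base would attach membership functions to each category and fire weighted rules rather than a single formula; this sketch only shows the shape of the lookup.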
TL;DR: In this paper, the concept of underlying water use efficiency (uWUE) was used to develop a new method for ET partitioning by assuming that the maximum, or potential, uWUE is related to transpiration while the averaged, or apparent, uWUE is related to evapotranspiration.
Abstract: Evapotranspiration (ET) is dominated by transpiration (T) in the terrestrial water cycle. However, continuous measurement of transpiration is still difficult, and the effect of vegetation on ET partitioning is unclear. The concept of underlying water use efficiency (uWUE) was used to develop a new method for ET partitioning by assuming that the maximum, or the potential uWUE is related to T while the averaged or apparent uWUE is related to ET. T/ET was thus estimated as the ratio of the apparent over the potential uWUE using half-hourly flux data from 17 AmeriFlux sites. The estimated potential uWUE was shown to be essentially constant for 14 of the 17 sites, and was broadly consistent with the uWUE evaluated at the leaf scale. The annual T/ET was the highest for croplands, i.e., 0.69 for corn and 0.62 for soybean, followed by grasslands (0.60) and evergreen needle leaf forests (0.56), and was the lowest for deciduous broadleaf forests (0.52). The enhanced vegetation index (EVI) was shown to be significantly correlated with T/ET and could explain about 75% of the variation in T/ET among the 71 site-years. The coefficients of determination between EVI and T/ET were 0.84 and 0.82 for corn and soybean, respectively, and 0.77 for deciduous broadleaf forests and grasslands, but only 0.37 for evergreen needle leaf forests. This ET partitioning method is sound in principle and simple to apply in practice, and would enhance the value and role of global FLUXNET in estimating T/ET variations and monitoring ecosystem dynamics.
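The partitioning ratio described above, T/ET = apparent uWUE / potential uWUE with uWUE = GPP·√VPD/ET, can be sketched on toy flux values (the numbers are made up; the paper estimates the potential uWUE from the upper envelope of half-hourly data, which the simple `max` below only crudely imitates):

```python
import numpy as np

def uwue(gpp, vpd, et):
    """Underlying water use efficiency: GPP * sqrt(VPD) / ET."""
    return gpp * np.sqrt(vpd) / et

# illustrative half-hourly flux values (hypothetical)
gpp = np.array([8.0, 10.0, 12.0])   # gross primary production
vpd = np.array([1.0, 1.44, 2.25])   # vapour pressure deficit, kPa
et  = np.array([0.20, 0.25, 0.30])  # evapotranspiration

apparent = uwue(gpp, vpd, et).mean()   # averaged (apparent) uWUE
potential = uwue(gpp, vpd, et).max()   # upper-envelope (potential) uWUE
t_over_et = apparent / potential
print(round(float(t_over_et), 2))
```

In practice the potential uWUE is fitted per site (e.g. by quantile regression) rather than taken as a per-sample maximum, but the ratio itself is computed exactly as above.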
TL;DR: It was indicated that the proposed linearization method is effective in dealing with IFNP problems; uncertainties can be communicated into the optimization process to generate reliable solutions for decision variables and objectives; and decision alternatives can be obtained by adjusting different combinations of the decision variables within their solution intervals.
Abstract: Planning for water quality management systems is complicated by a variety of uncertainties and nonlinearities, which create difficulties in formulating and solving the resulting inexact nonlinear optimization problems. To tackle such difficulties, this paper presents the development of an interval-fuzzy nonlinear programming (IFNP) model for water quality management under uncertainty. Methods of interval and fuzzy programming were integrated within a general framework to address uncertainties in the left- and right-hand sides of the nonlinear constraints. Uncertainties in water quality, pollutant loading, and the system objective were reflected through the developed IFNP model. A piecewise linearization method was developed for dealing with the nonlinearity of the objective function. A case study of water quality management planning in the Changsha section of the Xiangjiang River was then conducted to demonstrate the applicability of the developed IFNP model. The results demonstrated that the accuracy of solutions obtained through the linearized method generally rises with the number of linearization levels. They also indicated that the proposed linearization method is effective in dealing with IFNP problems; uncertainties can be communicated into the optimization process to generate reliable solutions for decision variables and objectives; and decision alternatives can be obtained by adjusting different combinations of the decision variables within their solution intervals. The results also suggested that the linearized method should be used with detailed error analysis when tackling IFNP problems.
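Piecewise linearization of a nonlinear objective term can be sketched as follows (the quadratic cost function and breakpoint counts are hypothetical, chosen only to show why accuracy rises with the number of linearization levels):

```python
import numpy as np

def piecewise_linearize(f, lo, hi, segments):
    """Breakpoints, slopes and intercepts approximating f on [lo, hi]."""
    xs = np.linspace(lo, hi, segments + 1)
    ys = f(xs)
    slopes = np.diff(ys) / np.diff(xs)
    intercepts = ys[:-1] - slopes * xs[:-1]
    return xs, slopes, intercepts

def approx(x, xs, slopes, intercepts):
    """Evaluate the piecewise-linear approximation at x."""
    k = np.searchsorted(xs, x, side="right") - 1
    k = min(k, len(slopes) - 1)
    return slopes[k] * x + intercepts[k]

f = lambda x: x ** 2          # hypothetical nonlinear cost term
xs, m, b = piecewise_linearize(f, 0.0, 10.0, 4)
print(approx(5.0, xs, m, b))  # exact at a breakpoint
print(approx(3.0, xs, m, b))  # approximate between breakpoints
```

Doubling `segments` shrinks the between-breakpoint error, which mirrors the paper's observation that solution accuracy rises with the number of linearization levels.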
TL;DR: In this paper, the document drafted, voted on, and published by the IPCC (the Intergovernmental Panel on Climate Change) is presented, summarizing the research carried out on this important topic.
Abstract: Causes, consequences, and mitigation strategies. We present the first of a series of articles in which we address the current problem of climate change. We present the document drafted, voted on, and published by the IPCC (the Intergovernmental Panel on Climate Change), which summarizes the research carried out on this important topic.
01 Dec 2010
TL;DR: In this article, the authors suggest a reduction in the global NPP of 0.55 petagrams of carbon, which would not only weaken the terrestrial carbon sink, but would also intensify future competition between food demand and biofuel production.
Abstract: Terrestrial net primary production (NPP) quantifies the amount of atmospheric carbon fixed by plants and accumulated as biomass. Previous studies have shown that climate constraints were relaxing with increasing temperature and solar radiation, allowing an upward trend in NPP from 1982 through 1999. The past decade (2000 to 2009) has been the warmest since instrumental measurements began, which could imply continued increases in NPP; however, our estimates suggest a reduction in the global NPP of 0.55 petagrams of carbon. Large-scale droughts have reduced regional NPP, and a drying trend in the Southern Hemisphere has decreased NPP in that area, counteracting the increased NPP over the Northern Hemisphere. A continued decline in NPP would not only weaken the terrestrial carbon sink, but it would also intensify future competition between food demand and proposed biofuel production.
TL;DR: Place the animal in an induction chamber and anesthetize the mouse to ensure sedation, move it to a nose cone for hair removal using cream, and reduce anesthesia during imaging to maintain a proper heart rate.
Abstract: 1. Place animal in induction chamber and anesthetize the mouse and ensure sedation. 2. Once the animal is sedated, move it to a nose cone for hair removal using cream. Only apply cream to the area of the chest that will be utilized for imaging. Once the hair is removed, wipe area with wet gauze to ensure all hair is removed. 3. Move the animal to the imaging platform and tape its paws to the ECG lead plates and insert rectal probe. Body temperature should be maintained at 36-37°C. During imaging, reduce anesthesia to maintain proper heart rate. If the animal shows signs of being awake, use a higher concentration of anesthetic.
TL;DR: The assessment was completed by the Intergovernmental Panel on Climate Change (IPCC) with a primary aim of reviewing the current state of knowledge concerning the impacts of climate change on physical and ecological systems, human health, and socioeconomic factors.
Abstract: Climate Change 1995 is a scientific assessment that was generated by more than 1,000 contributors from over 50 nations. It was jointly co-ordinated through two international agencies: the World Meteorological Organization and the United Nations Environment Programme. The assessment was completed by the Intergovernmental Panel on Climate Change (IPCC) with a primary aim of reviewing the current state of knowledge concerning the impacts of climate change on physical and ecological systems, human health, and socioeconomic factors. The second aim was to review the available information on the technical and economic feasibility of potential mitigation and adaptation strategies.
TL;DR: This paper reviews theory and methodology that have been developed to cope with the complexity of optimization problems under uncertainty, discussing and contrasting classical recourse-based stochastic programming, robust stochastic programming, probabilistic (chance-constraint) programming, fuzzy programming, and stochastic dynamic programming.
Abstract: A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Uncertainty, for instance, governs the prices of fuels, the availability of electricity, and the demand for chemicals. A key difficulty in optimization under uncertainty is in dealing with an uncertainty space that is huge and frequently leads to very large-scale optimization models. Decision-making under uncertainty is often further complicated by the presence of integer decision variables to model logical and other discrete decisions in a multi-period or multi-stage setting. This paper reviews theory and methodology that have been developed to cope with the complexity of optimization problems under uncertainty. We discuss and contrast the classical recourse-based stochastic programming, robust stochastic programming, probabilistic (chance-constraint) programming, fuzzy programming, and stochastic dynamic programming. The advantages and shortcomings of these models are reviewed and illustrated through examples. Applications and the state-of-the-art in computations are also reviewed. Finally, we discuss several main areas for future development in this field. These include development of polynomial-time approximation schemes for multi-stage stochastic programs and the application of global optimization algorithms to two-stage and chance-constraint formulations.
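The recourse idea behind classical two-stage stochastic programming can be sketched in a few lines (a toy model with hypothetical scenario probabilities, demands, and penalty costs, solved by brute-force enumeration rather than a real solver):

```python
# Two-stage recourse sketch: choose a first-stage allocation x, then pay a
# recourse penalty for any shortage realized under each demand scenario.
scenarios = [(0.3, 60.0), (0.5, 80.0), (0.2, 110.0)]  # (probability, demand), hypothetical
cost, penalty = 1.0, 4.0  # unit first-stage cost and unit shortage penalty

def expected_cost(x):
    """First-stage cost plus expected second-stage (recourse) cost."""
    return cost * x + sum(p * penalty * max(d - x, 0.0) for p, d in scenarios)

# enumerate integer first-stage decisions to find the minimizer
best_x = min(range(0, 121), key=expected_cost)
print(best_x, expected_cost(best_x))
```

The minimizer balances the certain first-stage cost against the probability-weighted shortage penalty, which is exactly the trade-off that recourse formulations encode; real instances replace the enumeration with linear or mixed-integer programming.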