Journal ArticleDOI

Post-blackening approach for modeling periodic streamflows

31 Jan 2001-Journal of Hydrology (Elsevier)-Vol. 241, Iss: 3, pp 221-269
TL;DR: In this paper, a semi-parametric post-blackening (PB) approach was proposed for modeling periodic streamflows; the model was, however, applied only to the Beaver and Weber rivers in the US.
About: This article was published in the Journal of Hydrology (Elsevier) on 2001-01-31 and has received 30 citations to date. The article focuses on the topics: Parametric model & Semiparametric model.
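The TL;DR above is terse, but the general post-blackening idea from the bootstrap literature can be sketched: "pre-whiten" the series with a parametric model, resample the residuals with a moving-block bootstrap (which retains dependence the model missed), then "post-blacken" by feeding the resampled residuals back through the fitted model. The AR(1) pre-whitening below is an illustrative stand-in, not the paper's periodic formulation, and all names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

def post_blacken(series, block_len=12, n_sim=1):
    """Sketch of a post-blackening simulation for a single site.

    Illustrative AR(1) pre-whitening only; the paper's model for
    periodic streamflows is richer than this.
    """
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    # Fit AR(1) by least squares: x[t] = phi * x[t-1] + e[t]
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    resid = x[1:] - phi * x[:-1]

    sims = []
    n = len(resid)
    n_blocks = int(np.ceil(n / block_len))
    for _ in range(n_sim):
        # Moving-block bootstrap of the residuals
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        e_star = np.concatenate([resid[s:s + block_len] for s in starts])[:n]
        # "Post-blacken": run resampled residuals back through the model
        y = np.empty(n + 1)
        y[0] = x[0]
        for t in range(n):
            y[t + 1] = phi * y[t] + e_star[t]
        sims.append(y)
    return phi, sims
```

Block resampling of residuals, rather than i.i.d. resampling, is what makes the scheme semi-parametric: the parametric model need not capture all of the serial dependence.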
Citations
Journal ArticleDOI
TL;DR: In this paper, a rigorous methodology is described for quantifying some of the uncertainties of climate change impact studies, excluding those due to downscaling techniques, and is applied to a set of five catchments in Great Britain.

342 citations

Journal ArticleDOI
TL;DR: Evaluating six published data fusion strategies for hydrological forecasting on two contrasting catchments reveals unequal aptitudes for fixing different categories of problematic catchment behaviour; in such cases, the best method(s) were a good deal better than their closest rival(s).
Abstract: This paper evaluates six published data fusion strategies for hydrological forecasting based on two contrasting catchments: the River Ouse and the Upper River Wye. The input level and discharge estimates for each river comprised a mixed set of single-model forecasts. Data fusion was performed using: arithmetic averaging, a probabilistic method in which the best model from the last time step is used to generate the current forecast, two different neural network operations, and two different soft computing methodologies. The results from this investigation are compared and contrasted using statistical and graphical evaluation. Each location demonstrated several options and potential advantages for using data fusion tools to construct superior hydrological forecasts. Fusion operations were better overall than their individual modelling counterparts, and two clear winners emerged. Indeed, the six mechanisms on test revealed unequal aptitudes for fixing different categories of problematic catchment behaviour and, in such cases, the best method(s) were a good deal better than their closest rival(s). Neural network fusion of differenced data provided the best solution for a stable regime (with neural network fusion of original data being somewhat similar), whereas a fuzzified probabilistic mechanism produced a superior output in a more volatile environment. The need for a data fusion research agenda within the hydrological sciences is discussed and some initial suggestions are presented. Keywords: data fusion, fuzzy logic, neural network, hydrological modelling
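Of the six fusion strategies, arithmetic averaging is the simplest and makes the idea concrete: combine several single-model forecasts into one estimate. A minimal sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def fuse_arithmetic(forecasts):
    """Arithmetic-averaging fusion of single-model forecasts.

    forecasts: (n_models, n_steps) array-like of model outputs.
    Returns the element-wise mean across models.
    """
    return np.asarray(forecasts, dtype=float).mean(axis=0)

def rmse(pred, obs):
    """Root-mean-square error of a forecast against observations."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))
```

When member models have offsetting biases, the fused series can beat every individual member; when they share the same bias, averaging cannot fix it, which is one reason the paper finds "unequal aptitudes" across fusion methods.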

131 citations


Cites methods from "Post-blackening approach for modeli..."

  • ...…manner of: (i) estimation and addition of residuals; (ii) building modular assemblies of expert subunits (Zhang and Govindaraju, 2000); (iii) performing weighted combination of individual forecasters (Shamseldin et al., 1997); or (iv) bootstrapping operations (Srinivas and Srinivasan, 2000, 2001)....


Journal ArticleDOI
TL;DR: In this article, a K-nearest-neighbor approach is proposed to resample monthly flows conditioned on an annual value in a temporal disaggregation or multiple upstream locations conditioned on a downstream location for a spatial disaggregation.
Abstract: [1] Stochastic disaggregation models are used to simulate streamflows at multiple sites preserving their temporal and spatial dependencies. Traditional approaches to this problem involve transforming the streamflow data of each month and at every location to a Gaussian structure and subsequently fitting a linear model in the transformed space. The simulations are then back transformed to the original space. The main drawbacks of this approach are (1) transforming marginals to Gaussian need not lead to the correct multivariate distribution particularly if the dependence across variables is nonlinear, and (2) the number of parameters to be estimated for a traditional disaggregation model grows rapidly with an increase in space or time components. We present a K-nearest-neighbor approach to resample monthly flows conditioned on an annual value in a temporal disaggregation or multiple upstream locations conditioned on a downstream location for a spatial disaggregation. The method is parsimonious, as the only parameter to estimate is K (the number of nearest neighbors to be used in resampling). Simulating space-time flow scenarios conditioned upon large-scale climate information (e.g., El Nino–Southern Oscillation, etc.) can be easily achieved. We demonstrate the utility of this methodology by applying it for space-time disaggregation of streamflows in the Upper Colorado River basin. The method appropriately captures the distributional and spatial dependency properties at all the locations.
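The K-nearest-neighbor resampling described above can be sketched for the temporal case: one annual value disaggregated to 12 monthly flows. The 1/rank neighbour weights and the proportional rescaling step are common choices with this family of methods, assumed here rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_disaggregate(annual_value, hist_annual, hist_monthly, K=3):
    """Resample a monthly flow pattern conditioned on an annual total.

    hist_annual:  (n_years,) historical annual totals.
    hist_monthly: (n_years, 12) historical monthly flows.
    A neighbour is drawn from the K closest years with the 1/rank
    weighting kernel, then its monthly proportions are rescaled so
    the months sum exactly to the target annual value.
    """
    d = np.abs(np.asarray(hist_annual, float) - annual_value)
    order = np.argsort(d)[:K]          # K nearest historical years
    w = 1.0 / np.arange(1, K + 1)      # 1/rank resampling kernel
    w /= w.sum()
    pick = order[rng.choice(K, p=w)]
    pattern = hist_monthly[pick] / hist_monthly[pick].sum()
    return annual_value * pattern
```

The parsimony the abstract highlights is visible here: K is the only tuning parameter, and no Gaussian transformation or linear model needs to be fitted.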

101 citations

Journal ArticleDOI
TL;DR: In this article, the authors employ the R/S and V/S methodologies to test for long-range dependence in equity returns and volatility in emerging markets and find that although emerging markets possess stronger long-term dependence in stock returns than developed economies, this is not true for volatility.
Abstract: In this paper, we show a novel approach to rank stock market indices in terms of weak-form efficiency using state-of-the-art methodology in statistical physics. We employ the R/S and V/S methodologies to test for long-range dependence in equity returns and volatility. Empirical results suggest that although emerging markets possess stronger long-range dependence in equity returns than developed economies, this is not true for volatility. In the case of volatility, Hurst exponents are substantially high for both classes of countries, which indicates that traditional option pricing models such as the Black and Scholes model are misspecified. These findings have important implications for both portfolio and risk management.
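A minimal rescaled-range (R/S) estimator of the Hurst exponent, the quantity behind the abstract's long-range-dependence tests; dyadic window sizes and a log-log slope fit are conventional choices assumed here, not details from the paper:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent.

    For each window size, compute R (range of the cumulative mean-adjusted
    series) over S (standard deviation), average across windows, then fit
    log(R/S) against log(size); the slope estimates H.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())
            r = dev.max() - dev.min()
            s = chunk.std()
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(size)
            rs_vals.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return float(slope)
```

H near 0.5 indicates no long-range dependence, while H substantially above 0.5, as the abstract reports for volatility series, indicates persistent long memory.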

76 citations

Journal ArticleDOI
TL;DR: In this article, the authors extend the hybrid approach they introduced in earlier works for at-site modeling of annual and periodic streamflows to the simulation of multi-site multi-season streamflows.

75 citations

References
Book
01 Jan 1993
TL;DR: This book presents bootstrap methods for estimation using simple arguments, together with Minitab macros for implementing them.
Abstract: This article presents bootstrap methods for estimation, using simple arguments. Minitab macros for implementing these methods are given.

37,183 citations

Book
01 Jan 1970
TL;DR: This book is a complete revision of a classic, seminal, and authoritative work that has been the model for most books on the topic written since 1970; it focuses on practical techniques throughout rather than a rigorous mathematical treatment of the subject.
Abstract: From the Publisher: This is a complete revision of a classic, seminal, and authoritative book that has been the model for most books on the topic written since 1970. It focuses on practical techniques throughout, rather than a rigorous mathematical treatment of the subject. It explores the building of stochastic (statistical) models for time series and their use in important areas of application —forecasting, model specification, estimation, and checking, transfer function modeling of dynamic relationships, modeling the effects of intervention events, and process control. Features sections on: recently developed methods for model specification, such as canonical correlation analysis and the use of model selection criteria; results on testing for unit root nonstationarity in ARIMA processes; the state space representation of ARMA models and its use for likelihood estimation and forecasting; score test for model checking; and deterministic components and structural components in time series models and their estimation based on regression-time series model methods.

19,748 citations

BookDOI
01 Jan 1986
TL;DR: A survey of density estimation methods covering the kernel method for univariate and multivariate data, three other important methods, and density estimation in action.
Abstract: Introduction. Survey of Existing Methods. The Kernel Method for Univariate Data. The Kernel Method for Multivariate Data. Three Important Methods. Density Estimation in Action.

15,499 citations

Journal ArticleDOI
TL;DR: In this article, the authors discuss the problem of estimating the sampling distribution of a pre-specified random variable R(X, F) on the basis of the observed data x.
Abstract: We discuss the following problem: given a random sample X = (X_1, X_2, ..., X_n) from an unknown probability distribution F, estimate the sampling distribution of some prespecified random variable R(X, F) on the basis of the observed data x. (Standard jackknife theory gives an approximate mean and variance in the case R(X, F) = θ(F̂) - θ(F), θ some parameter of interest.) A general method, called the "bootstrap", is introduced and shown to work satisfactorily on a variety of estimation problems. The jackknife is shown to be a linear approximation method for the bootstrap. The exposition proceeds by a series of examples: variance of the sample median, error rates in a linear discriminant analysis, ratio estimation, estimating regression parameters, etc.
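The abstract's first worked example, the variance of the sample median, is easy to reproduce: resample the data with replacement, recompute the statistic each time, and read off the spread of the replicates. The function name and defaults below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_se(x, stat=np.median, B=2000):
    """Bootstrap standard error of a statistic.

    Approximates the sampling distribution of stat(X) by drawing B
    resamples with replacement from the observed data and taking the
    standard deviation of the replicated statistic values.
    """
    x = np.asarray(x, dtype=float)
    reps = np.array([stat(rng.choice(x, size=len(x), replace=True))
                     for _ in range(B)])
    return float(reps.std(ddof=1))
```

The same loop works for any of the abstract's other examples (error rates, ratio estimators, regression parameters) by swapping in a different `stat`, which is the generality that made the method influential.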

14,483 citations