Journal ArticleDOI
Prediction of groundwater quality using efficient machine learning technique.
TL;DR: A deep learning (DL) based model is proposed for predicting groundwater quality and compared with three other machine learning (ML) models, namely random forest, eXtreme gradient boosting (XGBoost), and artificial neural network; the comparison showed that the DL model is the best prediction model, with the highest accuracy.
About: This article was published in Chemosphere on 2021-08-01. It has received 104 citations to date.
Citations
Journal ArticleDOI
Machine learning algorithms for efficient water quality prediction
TL;DR: In this article, the authors take advantage of machine learning algorithms to develop a model capable of predicting the water quality index, and then the water quality class, based on four water parameters: temperature, pH, turbidity, and coliforms.
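The pipeline this summary describes (four parameters in, an index out, then a class label) can be sketched as follows. This is an illustration only: the weights, the assumption that each parameter has already been normalized to a 0–100 sub-index, and the class boundaries are all hypothetical, not the authors' model.

```python
# Hedged sketch of a weighted water quality index (WQI) and class lookup.
# Assumes each reading is a pre-normalized sub-index on a 0-100 scale;
# weights and class thresholds are invented for illustration.

def water_quality_index(readings, weights):
    """Weighted average of per-parameter sub-indices."""
    total_weight = sum(weights.values())
    return sum(weights[name] * readings[name] for name in weights) / total_weight

def wqi_class(wqi):
    """Map a WQI score to a class label (hypothetical boundaries)."""
    if wqi >= 90:
        return "excellent"
    if wqi >= 70:
        return "good"
    if wqi >= 50:
        return "poor"
    return "unsuitable"
```

In the cited work the mapping from raw parameters to the index is learned by an ML model rather than fixed by hand; the sketch only shows the shape of the inputs and outputs.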
Journal ArticleDOI
Applications of various data-driven models for the prediction of groundwater quality index in the Akot basin, Maharashtra, India.
TL;DR: In this paper, four standalone methods, namely additive regression (AR), the M5P tree model (M5P), random subspace (RSS), and support vector machine (SVM), were employed to predict the WQI based on a variable elimination technique.
Journal ArticleDOI
Using Machine Learning Models for Predicting the Water Quality Index in the La Buong River, Vietnam
TL;DR: The results indicated that all twelve ML models perform well in predicting the WQI, but that eXtreme gradient boosting (XGBoost) has the best performance with the highest accuracy. This strengthens the argument that ML models, especially XGBoost, may be employed for WQI prediction with a high level of accuracy, which will further improve water quality management.
Journal ArticleDOI
An Integrated Statistical-Machine Learning Approach for Runoff Prediction
Abhinav Kumar Singh, Pankaj Kumar, Rawshan Ali, Nadhir Al-Ansari, Dinesh Kumar Vishwakarma, Kuldeep Kushwaha, Kanhu Charan Panda, Atish Sagar, Ehsan Mirzania, Ahmed Elbeltagi, Alban Kuriqi, Salim Heddam, and 11 more
TL;DR: In this article, several data-driven models, namely multiple linear regression (MLR), multivariate adaptive regression splines (MARS), support vector machine (SVM), and random forest (RF), were used for rainfall-runoff prediction of the Gola watershed, located in the south-eastern part of Uttarakhand.
Journal ArticleDOI
Pre- and post-dam river water temperature alteration prediction using advanced machine learning models
Dinesh Kumar Vishwakarma, Rawshan Ali, Shakeel Ahmad Bhat, Ahmed Elbeltagi, N. L. Kushwaha, Rohitashw Kumar, Jitendra Rajput, Salim Heddam, Alban Kuriqi, and 8 more
TL;DR: In this article, water temperature of the Yangtze River at Cuntan was predicted using machine learning models, namely M5P, random forest (RF), random subspace (RSS), and reduced error pruning tree (REPTree); the outputs of the various models were compared with recorded daily water temperature data using goodness-of-fit criteria and graphical analysis to arrive at a final comparison.
References
Journal Article
Dropout: a simple way to prevent neural networks from overfitting
TL;DR: It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
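The mechanism this summary refers to can be sketched in a few lines. This uses the common "inverted dropout" formulation (an assumption on my part; the `dropout` helper and its signature are illustrative, not code from the cited paper): during training each activation is zeroed with probability p, and the survivors are rescaled by 1/(1 − p) so the expected activation matches test-time behaviour, when the layer is a no-op.

```python
import random

def dropout(activations, p, training=True, rng=random):
    """Inverted dropout on a list of activations.

    Each activation is dropped (set to 0.0) with probability p during
    training; kept activations are scaled by 1/(1 - p). At test time
    (training=False) the inputs pass through unchanged.
    """
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]
```

Because of the rescaling, no separate weight scaling is needed at inference, which is why dropout layers are simply disabled when a trained network is deployed.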
Journal ArticleDOI
Learning representations by back-propagating errors
TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
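The update rule described above, adjusting weights to minimize the output error, can be shown in its simplest form: stochastic gradient descent on squared error for a single linear unit. This toy stand-in (names and hyperparameters are mine, not the paper's) omits the hidden layers and chain-rule bookkeeping that make full back-propagation interesting, but the weight update has the same shape: each weight moves opposite the gradient of the error with respect to it.

```python
def train_linear_unit(samples, lr=0.1, epochs=200):
    """Fit y = w*x + b by stochastic gradient descent on squared error."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = w * x + b        # forward pass
            err = y - target     # output error (dE/dy for E = err^2 / 2)
            w -= lr * err * x    # dE/dw = err * x
            b -= lr * err        # dE/db = err
    return w, b
```

In a multilayer network the same errors are propagated backwards through each layer via the chain rule, so every hidden weight receives an analogous update.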
Journal ArticleDOI
Multilayer feedforward networks are universal approximators
TL;DR: It is rigorously established that standard multilayer feedforward networks with as few as one hidden layer using arbitrary squashing functions are capable of approximating any Borel measurable function from one finite dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available.
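The result being summarized can be stated compactly in the common Hornik–Stinchcombe–White form (a sketch of the standard statement, not the paper's exact wording): for any continuous target on a compact set and any tolerance, some single-hidden-layer network gets within that tolerance uniformly.

```latex
\forall f \in C(K),\ \forall \varepsilon > 0:\quad
\exists N \in \mathbb{N},\ v_i, b_i \in \mathbb{R},\ w_i \in \mathbb{R}^d
\ \text{such that}\quad
\sup_{x \in K}\, \Bigl|\, f(x) - \sum_{i=1}^{N} v_i\, \sigma\!\bigl(w_i^{\top} x + b_i\bigr) \Bigr| < \varepsilon,
```

where \(K \subset \mathbb{R}^d\) is compact and \(\sigma\) is an arbitrary squashing function. Note the theorem guarantees existence of such a network, not that any training procedure will find it.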
Proceedings ArticleDOI
XGBoost: A Scalable Tree Boosting System
Tianqi Chen, Carlos Guestrin
TL;DR: XGBoost, a scalable tree boosting system, introduces a sparsity-aware algorithm for sparse data and a weighted quantile sketch for approximate tree learning, achieving state-of-the-art results on many machine learning challenges.
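The core boosting loop that XGBoost scales up can be sketched in miniature: repeatedly fit a weak learner (here a depth-1 "stump" on one feature, under squared error) to the current residuals and add a shrunken copy to the ensemble. This toy (function names and hyperparameters are mine) implements none of the paper's actual contributions, no regularized objective, sparsity handling, quantile sketch, or system optimizations, but shows the additive-residual-fitting idea they build on.

```python
def fit_stump(xs, residuals):
    """Fit a one-split regression stump minimizing squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def boost(xs, ys, rounds=20, lr=0.5):
    """Gradient boosting under squared error: each stump fits residuals."""
    preds = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)
```

The shrinkage factor `lr` plays the same role as XGBoost's learning rate: each round corrects only part of the remaining error, which trades more rounds for better generalization.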