Site Characterization Using GP, MARS and GPR
01 Jan 2015, pp. 345–357
TL;DR: The developed GP, MARS and GPR models capture the spatial variability of Nc values at Bangalore, so that the N value at any half-space point in Bangalore can be determined.
Abstract: This article examines the capability of Genetic Programming (GP), Multivariate Adaptive Regression Spline (MARS) and Gaussian Process Regression (GPR) for developing a site characterization model of Bangalore (India) based on the corrected Standard Penetration Test (SPT) value (Nc). GP, MARS and GPR have been used as regression techniques. GP is developed based on the genetic algorithm. MARS does not assume any functional relationship between input and output variables. GPR is a probabilistic, non-parametric model; in GPR, different kinds of prior knowledge can be applied. In the three-dimensional analysis, the function \( N_c = f(X, Y, Z) \), where X, Y and Z are the coordinates of a point corresponding to an N value, is approximated so that the N value at any half-space point in Bangalore can be determined. A comparative study between the developed GP, MARS and GPR models has been carried out in this book chapter. The developed GP, MARS and GPR models give the spatial variability of Nc values at Bangalore.
Citations
TL;DR: The main conclusions are that the number of studies in this field increases almost exponentially; the most used AI technique is Artificial Neural Networks and its enhancements, which account for about half of the studies; and correlating soil and rock properties is the most addressed subject, at about 30% of the studies.
Abstract: It has been 35 years since the first use of Artificial Intelligence (AI) techniques in geotechnical engineering. During those years many AI techniques were developed based on mathematical, statistical and logical concepts, but the breakthrough occurred by mimicking natural searching and optimization algorithms. This huge development in AI techniques is reflected in geotechnical engineering problems. In this research, 626 papers and theses published in the period from 1984 to 2019 concerned with applying AI techniques in geotechnical engineering were collected, filtered, arranged, classified with respect to subject, AI technique, publisher and publishing date, and stored in a database. The information extracted from the database was tabulated, presented graphically and commented on. The main conclusions are that the number of studies in this field increases almost exponentially; the most used AI technique is Artificial Neural Networks and its enhancements, which account for about half of the studies; and correlating soil and rock properties is the most addressed subject, at about 30% of the studies.
56 citations
TL;DR: Novel hybrid models based on a combination of a modified version of the equilibrium optimizer (EO) and two conventional machine learning algorithms, namely the extreme learning machine (ELM) and artificial neural network (ANN), are constructed to predict the permeability of tight carbonates.
Abstract: It is a problematic task to perform petro-physical property prediction of carbonate reservoir rocks in most cases, specifically permeability prediction, since a carbonate rock most commonly contains grains of heterogeneous size distributions. Consequently, the permeability measurement of tight rocks in laboratories is costly and very time-consuming. Therefore, this study aims to tackle this issue by developing novel hybrid models based on a combination of a modified version of the equilibrium optimizer (EO), i.e., MEO, and two conventional machine learning algorithms, namely the extreme learning machine (ELM) and artificial neural network (ANN). The MEO employs a mutation mechanism in order to avoid trapping in local optima of EO by increasing the search capabilities. In this study, ELM-MEO and ANN-MEO, novel metaheuristic ELM-based and ANN-based algorithms, were constructed to predict the permeability of tight carbonates. In addition, ANN, ELM, RF, RVM and MARS combined with particle swarm optimization and genetic programming were evaluated to give better insight into their relative performance in predicting carbonate permeability. The results illustrate that the proposed ELM-MEO model, with R2 = 0.9323, RMSE = 0.0612 and MAE = 0.0442 in the training stage and R2 = 0.8743, RMSE = 0.0806 and MAE = 0.0660 in the testing stage, outperformed the other ELM-based and ANN-based metaheuristic models in predicting the permeability of tight carbonates at all levels.
47 citations
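The R2, RMSE and MAE figures quoted above are standard regression metrics; as a reminder of their definitions, a minimal sketch with made-up numbers (not the paper's data):

```python
import numpy as np

y_true = np.array([1.0, 2.0, 3.0, 4.0])   # hypothetical measured values
y_pred = np.array([1.1, 1.9, 3.2, 3.9])   # hypothetical model predictions

errors = y_pred - y_true
mae = np.mean(np.abs(errors))                    # mean absolute error
rmse = np.sqrt(np.mean(errors ** 2))             # root-mean-square error
ss_res = np.sum(errors ** 2)                     # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
r2 = 1.0 - ss_res / ss_tot                       # coefficient of determination
```

R2 compares the residual sum of squares against the total variability of the targets, so a value near 1 means the model explains almost all of that variability.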
References
01 Jan 1992
TL;DR: This book discusses the evolution of architecture, primitive functions, terminals, sufficiency, and closure, and the role of representation and the lens effect in genetic programming.
Abstract: Background on genetic algorithms, LISP, and genetic programming; hierarchical problem-solving; introduction to automatically defined functions (the two-boxes problem); problems that straddle the breakeven point for computational effort; Boolean parity functions; determining the architecture of the program; the lawnmower problem; the bumblebee problem; the increasing benefits of ADFs as problems are scaled up; finding an impulse response function; artificial ant on the San Mateo trail; obstacle-avoiding robot; the minesweeper problem; automatic discovery of detectors for letter recognition; flushes and four-of-a-kinds in a pinochle deck; introduction to biochemistry and molecular biology; prediction of transmembrane domains in proteins; prediction of omega loops in proteins; lookahead version of the transmembrane problem; evolutionary selection of the architecture of the program; evolution of primitives and sufficiency; evolutionary selection of terminals; evolution of closure; simultaneous evolution of architecture, primitive functions, terminals, sufficiency, and closure; the role of representation and the lens effect. Appendices: list of special symbols; list of special functions; list of type fonts; default parameters; computer implementation; annotated bibliography of genetic programming; electronic mailing list and public repository.
13,487 citations
TL;DR: In this article, a new method is presented for flexible regression modeling of high-dimensional data. The model takes the form of an expansion in product spline basis functions, where the number of basis functions as well as the parameters associated with each one (product degree and knot locations) are automatically determined by the data.
Abstract: A new method is presented for flexible regression modeling of high dimensional data. The model takes the form of an expansion in product spline basis functions, where the number of basis functions as well as the parameters associated with each one (product degree and knot locations) are automatically determined by the data. This procedure is motivated by the recursive partitioning approach to regression and shares its attractive properties. Unlike recursive partitioning, however, this method produces continuous models with continuous derivatives. It has more power and flexibility to model relationships that are nearly additive or involve interactions in at most a few variables. In addition, the model can be represented in a form that separately identifies the additive contributions and those associated with the different multivariable interactions.
6,651 citations
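The product spline basis functions in question are built from hinge functions max(0, x − t). The sketch below fits a one-dimensional piecewise-linear model by least squares over hinges at fixed candidate knots; real MARS chooses knots, products and model size adaptively from the data, and the data and knots here are made up:

```python
import numpy as np

def hinge(x, knot):
    # MARS-style hinge basis function: zero below the knot, linear above it
    return np.maximum(0.0, x - knot)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 200)
# target with a kink at x = 5: slope 1 below, flat above, plus noise
y = np.where(x < 5.0, x, 5.0) + rng.normal(0.0, 0.05, x.size)

# basis: intercept, x itself, and hinges at the candidate knots
knots = [2.5, 5.0, 7.5]
B = np.column_stack([np.ones_like(x), x] + [hinge(x, k) for k in knots])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
fit = B @ coef
```

Because each hinge contributes nothing below its knot, the fit stays continuous while the coefficient on the hinge at 5.0 (roughly −1 here) absorbs the change of slope, which is exactly the continuity-with-kinks property the abstract contrasts with recursive partitioning.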
TL;DR: It is demonstrated that by exploiting a probabilistic Bayesian learning framework, the 'relevance vector machine' (RVM) can derive accurate prediction models which typically utilise dramatically fewer basis functions than a comparable SVM while offering a number of additional advantages.
Abstract: This paper introduces a general Bayesian framework for obtaining sparse solutions to regression and classification tasks utilising models linear in the parameters. Although this framework is fully general, we illustrate our approach with a particular specialisation that we denote the 'relevance vector machine' (RVM), a model of identical functional form to the popular and state-of-the-art 'support vector machine' (SVM). We demonstrate that by exploiting a probabilistic Bayesian learning framework, we can derive accurate prediction models which typically utilise dramatically fewer basis functions than a comparable SVM while offering a number of additional advantages. These include the benefits of probabilistic predictions, automatic estimation of 'nuisance' parameters, and the facility to utilise arbitrary basis functions (e.g. non-'Mercer' kernels). We detail the Bayesian framework and associated learning algorithm for the RVM, and give some illustrative examples of its application along with some comparative benchmarks. We offer some explanation for the exceptional degree of sparsity obtained, and discuss and demonstrate some of the advantageous features, and potential extensions, of Bayesian relevance learning.
5,116 citations
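The sparsity described above comes from giving each weight its own prior precision α_i and re-estimating those precisions from the data: precisions of irrelevant weights are driven large, collapsing their posterior means toward zero. The sketch below runs these standard sparse-Bayesian re-estimation updates in NumPy on synthetic data; the design matrix, planted weights and iteration count are made-up assumptions, not the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(0)
N, D = 50, 10
Phi = rng.normal(size=(N, D))               # design matrix of basis outputs
w_true = np.zeros(D)
w_true[2], w_true[7] = 2.0, -1.0            # only two relevant basis functions
t = Phi @ w_true + rng.normal(0.0, 0.05, N) # noisy targets

alpha = np.ones(D)                          # per-weight prior precisions
beta = 1.0                                  # noise precision
for _ in range(100):
    Sigma = np.linalg.inv(np.diag(alpha) + beta * Phi.T @ Phi)  # posterior covariance
    mu = beta * Sigma @ Phi.T @ t                               # posterior mean
    gamma = 1.0 - alpha * np.diag(Sigma)    # how well-determined each weight is
    alpha = np.minimum(gamma / (mu ** 2 + 1e-12), 1e8)          # re-estimate precisions
    resid = t - Phi @ mu
    beta = (N - gamma.sum()) / (resid @ resid)                  # re-estimate noise precision
```

After convergence only the planted weights keep non-negligible posterior means; the rest are pruned, mirroring the "dramatically fewer basis functions" behaviour the abstract describes.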
TL;DR: The three primary sources of geotechnical uncertainties are inherent variability, measurement error, and transformation uncertainty.
Abstract: Geotechnical variability is a complex attribute that results from many disparate sources of uncertainties. The three primary sources of geotechnical uncertainties are inherent variability, measurement error, and transformation uncertainty.
1,663 citations
TL;DR: The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to those diverse areas where "a deterministic treatment is inefficient and conventional statistics insufficient."
1,639 citations