
Showing papers by "Pijush Samui" published in 2020


Journal ArticleDOI
TL;DR: It could be concluded that the proposed hybridization of GIS and deep learning can be a promising tool to assist government authorities and involved parties in flash flood mitigation and land-use planning.

228 citations


Journal ArticleDOI
01 May 2020-Catena
TL;DR: It is concluded that the Keras deep learning model is a new tool for shallow landslide susceptibility mapping in landslide-prone areas, with performance better than that of the benchmark approaches of random forest, J48 decision tree, classification tree, and logistic model tree.
Abstract: This research investigates the capability of Keras deep learning models with three robust optimization algorithms (stochastic gradient descent, root mean square propagation, and adaptive moment optimization) and two loss functions for spatial modeling of landslide hazard at a regional scale. Shallow landslides in the Ha Long area (Vietnam) were selected as a case study. To this end, a set of ten influencing factors (slope, aspect, curvature, topographic wetness index, land use, distance to road, distance to river, soil type, distance to fault, and lithology) and 193 landslide polygons were prepared to construct a Geographic Information System (GIS) database for the study area. Using the collected database, a deep neural network (DNN), with its potential for realizing the complex functional mapping hidden in the data, is used to generalize a decision boundary that separates the learning space into two distinct categories: landslide (a positive class) and non-landslide (a negative class). Experimental results point out that the Keras deep learning model with the Adam optimizer and the mean squared error loss function is the best, with a prediction performance of 84.0%. This performance is better than that of the employed benchmark approaches of random forest, J48 decision tree, classification tree, and logistic model tree. We conclude that the Keras deep learning model is a new tool for shallow landslide susceptibility mapping in landslide-prone areas.
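As a rough illustration of the modeling setup this abstract describes, the sketch below builds a small Keras dense network over ten conditioning factors and compiles it with the Adam optimizer and an MSE loss; the layer sizes and stand-in data are assumptions, not the paper's configuration.

```python
# Minimal sketch (not the authors' code): a Keras dense network over ten
# landslide conditioning factors, compiled with the Adam optimizer and a
# mean-squared-error loss, as the abstract describes. Layer sizes and the
# random stand-in data are illustrative assumptions.
import numpy as np
from tensorflow import keras

n_factors = 10  # slope, aspect, curvature, TWI, land use, distances, ...

model = keras.Sequential([
    keras.layers.Input(shape=(n_factors,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),  # P(landslide)
])
model.compile(optimizer=keras.optimizers.Adam(), loss="mse",
              metrics=["accuracy"])

# Dummy samples standing in for the GIS-derived database.
X = np.random.rand(200, n_factors)
y = np.random.randint(0, 2, size=(200, 1))
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```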

98 citations


Journal ArticleDOI
TL;DR: In this article, a multivariate adaptive regression splines (MARS) model was used as a feature extraction method to extract the optimum inputs used to design high-performance concrete (HPC) structures.

67 citations


Journal ArticleDOI
TL;DR: The proposed model is a deep neural network (DNN), which represents a category of learning algorithms that extract information nonlinearly in several steps within a hierarchical framework, primarily applied to learning and pattern classification.
Abstract: Heating load and cooling load forecasting are crucial for estimating energy consumption and improving energy performance during the design phase of buildings. Since the capacity of the cooling, ventilation, and air-conditioning system of a building contributes to the operating cost, it is ideal to develop accurate models for the heating and cooling load forecasting of buildings. This paper proposes a machine-learning technique for the prediction of the heating load and cooling load of residential buildings. The proposed model is a deep neural network (DNN), which represents a category of learning algorithms that extract information nonlinearly in several steps within a hierarchical framework, primarily applied to learning and pattern classification. The output of the DNN has been compared with that of other methods such as the gradient boosted machine (GBM), Gaussian process regression (GPR) and minimax probability machine regression (MPMR). To develop the DNN model, the energy data set was divided into training (70%) and testing (30%) sets. The performance of the proposed model was benchmarked by statistical performance metrics such as variance accounted for (VAF), relative average absolute error (RAAE), root mean absolute error (RMAE), coefficient of determination (R2), standard deviation ratio (RSR), mean absolute percentage error (MAPE), Nash–Sutcliffe coefficient (NS), root mean squared error (RMSE) and weighted mean absolute percent error (WMAPE). The DNN and GPR produced the best VAF values for cooling load and heating load, 99.76% and 99.84% respectively.
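Several of the metrics listed above have simple closed forms; as a hedged aid to the reader, here are NumPy versions of three of them under their standard definitions (the paper's exact conventions are not stated here).

```python
# Three of the metrics named in the abstract, computed with NumPy under
# their standard definitions (an assumption; the paper's conventions may
# differ in details such as percentage scaling).
import numpy as np

def vaf(y, yhat):
    # Variance accounted for, in percent.
    return (1 - np.var(y - yhat) / np.var(y)) * 100.0

def rmse(y, yhat):
    # Root mean squared error.
    return np.sqrt(np.mean((y - yhat) ** 2))

def nash_sutcliffe(y, yhat):
    # Nash-Sutcliffe efficiency coefficient.
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - np.mean(y)) ** 2)

y = np.array([10.2, 11.5, 9.8, 12.1])     # dummy observed loads
yhat = np.array([10.0, 11.9, 9.5, 12.3])  # dummy predicted loads
print(vaf(y, yhat), rmse(y, yhat), nash_sutcliffe(y, yhat))
```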

55 citations


Journal ArticleDOI
TL;DR: The results clearly advocate the ENN as a promising artificial intelligence technique for accurate real-time forecasting of hourly river flow.
Abstract: Monitoring hourly river flows is indispensable for flood forecasting and disaster risk management. The objective of the present study is to develop a suite of hourly river flow forecasting models for the Albert River, located in Queensland, Australia, using various machine learning (ML) based models, including a relatively new and novel artificial intelligence modeling technique known as the emotional neural network (ENN). Hourly river flow data for the period 2011–2014 are employed for the development and evaluation of the predictive models. The performance of the ENN model in forecasting hourly river flow is compared with that of other well-established ML-based models using a number of statistical metrics and graphical evaluation methods. The ENN showed outstanding forecasting accuracy in comparison with the other ML models. In general, the results clearly advocate the ENN as a promising artificial intelligence technique for accurate real-time forecasting of hourly river flow.

53 citations


Journal ArticleDOI
TL;DR: This work adopted a genetic-algorithm (GA)-optimized long short-term memory (LSTM) technique to predict river water temperature (WT) as a key indicator of the health state of the aquatic habitat, where its modeling is crucial for effective urban water quality management.
Abstract: Advances in establishing real-time river water quality monitoring networks combined with novel artificial intelligence techniques for more accurate forecasting is at the forefront of urban water management. The preservation and improvement of the quality of our impaired urban streams are at the core of the global challenge of ensuring water sustainability. This work adopted a genetic-algorithm (GA)-optimized long short-term memory (LSTM) technique to predict river water temperature (WT) as a key indicator of the health state of the aquatic habitat, where its modeling is crucial for effective urban water quality management. To our knowledge, this is the first attempt to adopt a GA-LSTM to predict the WT in urban rivers. In recent research trends, large volumes of real-time water quality data, including water temperature, conductivity, pH, and turbidity, are constantly being collected. Specifically, in the field of water quality management, this provides countless opportunities for understanding water quality impairment and forecasting, and to develop models for aquatic habitat assessment purposes. The main objective of this research was to develop a reliable and simple urban river water temperature forecasting tool using advanced machine learning methods that can be used in conjunction with a real-time network of water quality monitoring stations for proactive water quality management. We proposed a hybrid time series regression model for WT forecasting. This hybrid approach was applied to solve problems regarding the time window size and architectural factors (number of units) of the LSTM network. We have chosen an hourly water temperature record collected over 5 years as the input. Furthermore, to check its robustness, a recurrent neural network (RNN) was also tested as a benchmark model and the performances were compared. The experimental results revealed that the hybrid model of the GA-LSTM network outperformed the RNN and the basic problem of determining the optimal time window and number of units of the memory cell was solved. This research concluded that the GA-LSTM can be used as an advanced deep learning technique for time series analysis.
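To make the GA-over-LSTM idea concrete, here is a deliberately tiny sketch in which a genetic algorithm searches over the time-window size and the number of LSTM units; the GA operators, population size, and the synthetic series are all assumptions for illustration, not the study's configuration.

```python
# Toy GA-LSTM sketch: a minimal genetic algorithm (random init, truncation
# selection, mutation only) searches over (window size, LSTM units), the two
# quantities the abstract says the GA optimizes. All settings are assumed.
import random
import numpy as np
from tensorflow import keras

series = np.sin(np.linspace(0, 50, 500))  # stand-in for the hourly WT record

def make_windows(series, window):
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., None], y  # LSTM expects (samples, timesteps, features)

def fitness(window, units):
    # Fitness = final training loss of a small LSTM built from this genome.
    X, y = make_windows(series, window)
    model = keras.Sequential([
        keras.layers.Input(shape=(window, 1)),
        keras.layers.LSTM(units),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    hist = model.fit(X, y, epochs=2, batch_size=32, verbose=0)
    return hist.history["loss"][-1]

pop = [(random.randint(4, 48), random.randint(4, 64)) for _ in range(6)]
for gen in range(3):
    parents = sorted(pop, key=lambda g: fitness(*g))[:3]
    children = [(max(2, w + random.randint(-4, 4)),
                 max(2, u + random.randint(-8, 8))) for (w, u) in parents]
    pop = parents + children
print("best (window, units):", min(pop, key=lambda g: fitness(*g)))
```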

46 citations


Journal ArticleDOI
19 Jan 2020-Forests
TL;DR: The newly developed RFM is a promising tool to help local authorities in shallow landslide hazard mitigation; its performance was better than that of benchmark approaches, including the SVM, RFC, and logistic regression.
Abstract: This study developed and verified a new hybrid machine learning model, named the random forest machine (RFM), for the spatial prediction of shallow landslides. RFM is a hybridization of two state-of-the-art machine learning algorithms, the random forest classifier (RFC) and support vector machine (SVM), in which the RFC is used to generate subsets from the training data and the SVM is used to build decision functions for these subsets. To construct and verify the hybrid RFM model, a shallow landslide database of the Lang Son area (northern Vietnam) was prepared. The database consisted of 101 shallow landslide polygons and 14 conditioning factors. The relevance of these factors for shallow landslide susceptibility modeling was assessed using the ReliefF method. Experimental results pointed out that the proposed RFM can achieve the desired prediction with an F1 score of roughly 0.96. The performance of the RFM was better than that of the benchmark approaches, including the SVM, RFC, and logistic regression. Thus, the newly developed RFM is a promising tool to help local authorities in shallow landslide hazard mitigation.
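The subset-then-SVM pattern can be loosely approximated with scikit-learn's bagging machinery, which hands bootstrap subsets of the training data to individual SVM decision functions; this is only a conceptual stand-in for the RFM (and assumes scikit-learn >= 1.2 for the `estimator` argument), not the authors' algorithm.

```python
# Loose approximation of the RFM idea: random bootstrap subsets of the
# training data (the role the abstract assigns to the random forest stage)
# each receive their own SVM decision function, and votes are aggregated.
# A conceptual sketch only, not the paper's exact hybridization.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Dummy data with 14 features, echoing the 14 conditioning factors.
X, y = make_classification(n_samples=300, n_features=14, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

rfm_like = BaggingClassifier(estimator=SVC(kernel="rbf"),
                             n_estimators=25, random_state=0)
rfm_like.fit(Xtr, ytr)
print("F1:", f1_score(yte, rfm_like.predict(Xte)))
```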

44 citations


Journal ArticleDOI
TL;DR: In this paper, the authors presented new models to predict chloride penetration into self-compacting concrete (SCC) using the rapid chloride penetration test (RCPT), focusing mainly on the effect of supplementary cementitious materials (i.e., fly ash and silica fume) and elevated-temperature curing of SCC on the RCPT results.
Abstract: This paper presents new models to predict chloride penetration into self-compacting concrete (SCC) using the rapid chloride penetration test (RCPT). The research mainly focuses on the effect of supplementary cementitious material (i.e., fly ash and silica fume) and elevated temperature curing of SCC on results of the RCPT. Models are developed to predict the value of RCPT using two statistical algorithms, namely Multivariate Adaptive Regression Spline (MARS) and Minimax Probability Machine Regression (MPMR). Both models incorporate the combined effect of fly ash, silica fume and elevated temperature curing on the RCPT, and a comparative study between the models is also discussed. The analysis confirms that both MARS and MPMR are promising models for the prediction of RCPT results.
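For readers who want to try a MARS fit, the sketch below uses the third-party py-earth package (an assumption: the paper does not state its software), with dummy mix-design inputs standing in for the real RCPT data.

```python
# Hedged MARS sketch using the py-earth package (assumed implementation,
# not necessarily the authors' tool). Inputs and the response are dummy
# stand-ins for fly ash content, silica fume content, curing temperature,
# and RCPT charge passed.
import numpy as np
from pyearth import Earth

X = np.random.rand(120, 3)
y = 4000 - 2500 * X[:, 0] - 1800 * X[:, 1] + 300 * X[:, 2]

mars = Earth(max_degree=2)  # allow pairwise interactions of the factors
mars.fit(X, y)
print(mars.summary())       # basis functions of the fitted spline model
```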

30 citations


Journal ArticleDOI
TL;DR: In this article, the applicability and capability of the Extreme Learning Machine (ELM), Minimax Probability Machine Regression (MPMR) and Least Square Support Vector Machine (LS-SVM) for predicting the uniaxial compressive strength (UCS) of volcanic rocks were examined.
Abstract: The uniaxial compressive strength (UCS) of rock material is a very important parameter for rock engineering applications such as rock mass classification, numerical modelling, bearing capacity, mechanical excavation, slope stability and support design with respect to the engineering behaviour of rock. UCS is obtained directly or can be predicted by different methods, including existing tables and diagrams, regression, the Bayesian approach and soft computing methods. The main purpose of this study is to examine the applicability and capability of the Extreme Learning Machine (ELM) and Minimax Probability Machine Regression (MPMR) for the prediction of the UCS of volcanic rocks and to compare their performance with that of the Least Square Support Vector Machine (LS-SVM). The samples tested were taken from the volcanic rock masses exposed at the eastern Pontides (NE Turkey). In the soft computing models built to estimate the UCS of the samples investigated, porosity and slake durability index were used as input parameters. In this study, the root mean square error (RMSE), variance account factor (VAF), maximum determination coefficient value (R2), adjusted determination coefficient (Adj. R2), performance index (PI), regression error characteristic (REC) curve and Taylor diagram were used to determine the accuracy of the ELM, MPMR and LS-SVM models developed.
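An ELM of the kind compared here is short enough to sketch directly: a single hidden layer with random, untrained weights and output weights solved by least squares; the two dummy inputs stand in for porosity and slake durability index, and all sizes are assumptions.

```python
# Minimal NumPy sketch of an extreme learning machine (ELM) regressor:
# random, untrained hidden weights plus a least-squares output layer.
# Hidden size and the dummy UCS-like data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

class ELMRegressor:
    def __init__(self, n_hidden=50):
        self.n_hidden = n_hidden

    def fit(self, X, y):
        self.W = rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)          # random feature map
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

X = rng.random((100, 2))  # porosity, slake durability index (dummy)
y = 80 - 60 * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 1, 100)  # UCS-like
print(ELMRegressor().fit(X, y).predict(X[:3]))
```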

25 citations


Journal ArticleDOI
TL;DR: The present results of the proposed ENN model reveal a promising modeling strategy for the hourly simulation of river flow, and such a model can be explored further for its ability to contribute to the state of the art of river engineering and water resources monitoring and prediction at near real-time forecast horizons.
Abstract: Hourly river flow pattern monitoring and simulation are indispensable precautionary tasks for river engineering sustainability, water resource management, flood risk mitigation, and impact reduction. Reliable river flow forecasting is highly emphasized to support major decision-makers. This research paper adopts a new implementation approach for a river flow prediction model for hourly prediction of the flow of the Mary River in Australia; a novel data-intelligent model called the emotional neural network (ENN) was used for this purpose. A historical dataset measured over a 4-year period (2011–2014) at an hourly timescale was used in building the ENN-based predictive model. The results of the ENN model were validated against existing approaches such as the minimax probability machine regression (MPMR), relevance vector machine (RVM), and multivariate adaptive regression splines (MARS) models. The developed models were evaluated against each other for validation purposes. Various numerical and graphical performance evaluations were conducted to assess the predictability of the proposed ENN and the competitive benchmark models. The ENN model, used as an objective simulation tool, revealed outstanding performance when applied to hourly river flow prediction in comparison with the other benchmark models; performance-wise, the models rank ENN > MARS > RVM > MPMR. In general, the present results of the proposed ENN model reveal a promising modeling strategy for the hourly simulation of river flow, and such a model can be explored further for its ability to contribute to the state of the art of river engineering and water resources monitoring and prediction at near real-time forecast horizons.

22 citations


Journal ArticleDOI
TL;DR: The proposed Deep Learning model is an intelligent tool for predicting groundwater depths and can save the resources and labor conventionally employed to estimate various features of complex groundwater systems.
Abstract: Groundwater depth has complex non-linear relationships with climate, groundwater extraction, and surface water flows. To understand the importance of each predictor and the predictand (groundwater depth), different artificial intelligence (AI) techniques have been used. In this research, we propose a Deep Learning (DL) model to predict groundwater depths. The DL model is an extension of the conventional neural network with multiple layers having non-linear activation functions. The feasibility of the DL model is assessed against well-established framework models [Extreme Learning Machine (ELM) and Gaussian Process Regression (GPR)]. The area selected for this study is the Konan basin, located in the Kochi Prefecture of Japan. The hydro-meteorological and groundwater data used are precipitation, river stage, temperature, recharge and groundwater depth. Identical sets of inputs and outputs of all the selected stations were used to train and validate the models. The predictive accuracy of the DL, ELM and GPR models has been assessed using suitable goodness-of-fit criteria. During the training period, the DL model has very good agreement with the observed data (RMSE = 0.04, r = 0.99 and NSE = 0.98), and during the validation period its performance is satisfactory (RMSE = 0.08, r = 0.95 and NSE = 0.87). To check the practicality and generalization ability of the DL model, it was re-validated at three different stations (E2, E3 and E6) of the same unconfined aquifer. The significant prediction capability and generalization ability make the proposed DL model reliable and robust. Based on the findings of this research, the DL model is an intelligent tool for predicting groundwater depths. Such an advanced AI technique can save the resources and labor conventionally employed to estimate various features of complex groundwater systems.
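As a hedged illustration of the GPR benchmark mentioned above, scikit-learn's GaussianProcessRegressor can be fit on hydro-meteorological predictors; the kernel choice and the dummy data are assumptions, not the study's setup.

```python
# Sketch of a GPR benchmark like the one in the abstract, via scikit-learn.
# The RBF + white-noise kernel and the dummy data are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

X = np.random.rand(50, 4)  # precipitation, river stage, temperature, recharge
y = 3 + X @ np.array([1.0, -0.5, 0.2, 0.8])  # dummy groundwater depth

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y)
mean, std = gpr.predict(X[:3], return_std=True)
print(mean, std)  # predictive mean and uncertainty for three samples
```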

Journal ArticleDOI
TL;DR: The paper proposes least square support vector machine (LSSVM), Group Method of Data Handling (GMDH) and Gaussian process regression (GPR) based reliability analysis of a pile group resting on cohesive soil.
Abstract: Robust and reliable design at certain levels of safety has earned a lot of attention in recent years. To overcome the limitations of FOSM-based reliability analysis, the paper proposes least square support vector machine (LSSVM), Group Method of Data Handling (GMDH) and Gaussian process regression (GPR) based reliability analysis of a pile group resting on cohesive soil. LSSVM is an improvement over support vector machines (SVM) that uses linear systems instead of complex quadratic programs. GMDH is a self-organized neural network capable of solving complex non-linear problems. GPR is an effective Bayesian machine learning tool. The performance of the developed models is ascertained using various statistical parameters and Taylor curves. The reliability indices of the simulated values are compared to those of the actual values obtained from FOSM. The results show that all the models are applicable for reliability analysis of the settlement of a pile group.
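The "linear systems instead of quadratic programs" point is easy to see in code: the LSSVM dual reduces to a single (n+1)-dimensional linear solve. The sketch below is a minimal NumPy version with an RBF kernel; the hyperparameters and dummy pile-group data are assumptions.

```python
# Minimal LSSVM regression sketch: the dual problem is the linear system
# [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y], solved in one step.
# RBF kernel, gamma/sigma values, and the dummy data are assumptions.
import numpy as np

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma ** 2))            # RBF kernel matrix
    n = len(y)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                        # bias b, coefficients alpha

def lssvm_predict(X, Xtr, b, alpha, sigma=1.0):
    sq = np.sum((X[:, None, :] - Xtr[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2 * sigma ** 2)) @ alpha + b

Xtr = np.random.rand(60, 4)        # dummy pile-group inputs
ytr = np.sin(Xtr @ np.ones(4))     # dummy settlement response
b, alpha = lssvm_fit(Xtr, ytr)
print(lssvm_predict(Xtr[:3], Xtr, b, alpha))
```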

Journal ArticleDOI
TL;DR: According to the results, the BBO-ANN provided better generalization capability than the other predictive models; BBO, as a robust evolutionary algorithm, can be successfully linked to the ANN for better performance.
Abstract: Ground vibration induced by blasting operations is an important undesirable effect in surface mines and has significant environmental impacts on surrounding areas. Therefore, the precise prediction of blast-induced ground vibration is a challenging task for engineers and managers. This study explores and evaluates the use of two stochastic metaheuristic algorithms, namely biogeography-based optimization (BBO) and particle swarm optimization (PSO), as well as one deterministic optimization algorithm, namely the DIRECT method, to improve the performance of an artificial neural network (ANN) for predicting ground vibration. It is worth mentioning that this is the first time BBO-ANN and DIRECT-ANN models have been applied to predict ground vibration. To demonstrate model reliability and effectiveness, a minimax probability machine regression (MPMR), an extreme learning machine (ELM), and three well-known empirical methods were also tested. To collect the required datasets, two quarry mines in the Shur river dam region, located in the southwest of Iran, were monitored, and the values of input and output parameters were measured. Five statistical indicators, namely the percentage root mean square error (%RMSE), coefficient of determination (R2), ratio of RMSE to the standard deviation of the observations (RSR), mean absolute error (MAE), and degree of agreement (d), were taken into account for the model assessment. According to the results, BBO-ANN provided a better generalization capability than the other predictive models. In conclusion, BBO, as a robust evolutionary algorithm, can be successfully linked to the ANN for better performance.
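To show the shape of the metaheuristic-plus-model coupling, here is a minimal particle swarm optimization loop in which the objective is a stand-in for the ANN's validation error; the coefficients, bounds, and objective are illustrative assumptions.

```python
# Minimal PSO sketch of the metaheuristic-wrapped-around-a-model idea:
# particles search a 2-D space whose "fitness" stands in for the ANN's
# validation error. Inertia/acceleration constants are assumed values.
import numpy as np

rng = np.random.default_rng(1)

def fitness(p):  # stand-in for the ANN validation RMSE
    return np.sum((p - np.array([3.0, -1.5])) ** 2)

n, dim = 20, 2
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_f = np.apply_along_axis(fitness, 1, pos)
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(100):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.apply_along_axis(fitness, 1, pos)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()
print("best parameters found:", gbest)
```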

Journal ArticleDOI
TL;DR: This research concludes that the PCA-based MARS model can be used as a new and reliable data-driven approach for the estimation of soil parameters; this new tool can help save the time and capital spent on estimating different soil parameters.
Abstract: The heterogeneous nature of soil, with its various chemical and physical attributes, makes the prediction of soil parameters very tedious and challenging; it becomes more difficult as the number of variables grows. This study investigates the feasibility of principal component analysis as a dimensionality reduction technique to select the input variables in terms of principal components (PCs), which helps in reducing complexity and the multicollinearity problem. The soil attributes, namely depth of the sample, sand percentage, silt percentage, clay percentage, moisture content, dry density, wet density, void ratio, liquid limit, plastic limit, liquidity index, and plasticity index, have been employed as influencing factors to estimate the coefficient of compression of soil. Furthermore, the extracted variance-based PCs were used as predictors to build the minimax probability machine regression (MPMR), multivariate adaptive regression splines (MARS), and genetic programming regression (GPR) models. The predictive accuracy of the models has been assessed via five statistical fitness parameters. In the training phase, the PCA-MARS model has shown good outcomes in terms of fitness measurement parameters (RMSE = 0.004, r = 0.981 and NSE = 0.963). During the testing phase, PCA-MARS outperformed the other models (RMSE = 0.006, r = 0.963 and NSE = 0.912), followed by PCA-GPR and PCA-MPMR. This research concludes that the PCA-based MARS model can be used as a new and reliable data-driven approach for the estimation of soil parameters. Furthermore, this new tool can help save the time and capital spent on estimating different soil parameters.
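The PCA-as-preprocessor step is a short pipeline in scikit-learn; since scikit-learn ships no MARS, a generic linear regressor stands in for the downstream model, and the data are dummies for the twelve soil attributes.

```python
# Sketch of the PCA input-extraction step described above: twelve soil
# attributes are reduced to a few principal components before regression.
# The downstream regressor and component count are stand-in assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

X = np.random.rand(80, 12)           # 12 soil attributes (dummy)
y = X @ np.random.rand(12) * 0.05    # dummy compression coefficient

model = make_pipeline(StandardScaler(),
                      PCA(n_components=5),   # variance-based PCs
                      LinearRegression()).fit(X, y)
print(model.predict(X[:3]))
```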

Journal ArticleDOI
TL;DR: The buckling analysis of a laminated composite skew plate using the C0 finite element (FE) model based on higher-order shear deformation theory (HSDT) in conjunction with minimax probability machine regression (MPMR) and multivariate adaptive regression spline (MARS) is attempted.
Abstract: The purpose of this paper is to attempt the buckling analysis of a laminated composite skew plate using the C0 finite element (FE) model based on higher-order shear deformation theory (HSDT) in conjunction with minimax probability machine regression (MPMR) and multivariate adaptive regression spline (MARS). HSDT considers the third-order variation of in-plane displacements, which eliminates the use of a shear correction factor owing to realistic parabolic transverse shear stresses across the thickness coordinate. At the top and bottom of the plate, a zero transverse shear stress condition is imposed. The C0 FE model based on HSDT is developed and coded in formula translation (FORTRAN). The FE model is validated and found efficient for creating new results. The MPMR and MARS models are coded in MATLAB. Using skew angle (α) and stacking sequence (Ai) as inputs and buckling strength (Y) as the output, a regression problem is formulated using MPMR and MARS to predict the buckling strength of laminated composite skew plates. The results of the MPMR and MARS models are in good agreement with the FE model results, and MPMR is a better tool than MARS for analyzing the buckling problem. The present work considers the linear behavior of the laminated composite skew plate. To the best of the authors' knowledge, there is no work in the literature on the buckling analysis of a laminated composite skew plate using a C0 FE formulation based on third-order shear deformation theory in conjunction with MPMR and MARS. These machine-learning techniques increase efficiency, reduce the computational time and reduce the cost of analysis. Further, an equation is generated with the MARS model via which the buckling strength of the laminated composite skew plate can be predicted with ease and simplicity.

Book
29 Jul 2020
TL;DR: In this book, the authors highlight cutting-edge applications of machine learning techniques for disaster management by monitoring, analyzing, and forecasting hydro-meteorological variables; predictive modelling is a consolidated discipline used to forewarn of the possibility of natural hazards.
Abstract: This book highlights cutting-edge applications of machine learning techniques for disaster management by monitoring, analyzing, and forecasting hydro-meteorological variables. Predictive modelling is a consolidated discipline used to forewarn of the possibility of natural hazards. In this book, experts from numerical weather forecasting, meteorology, hydrology, engineering, agriculture, economics, and disaster policy-making contribute to an interdisciplinary framework to construct potent models for hazard risk mitigation. The book will help advance the state of knowledge of artificial intelligence in decision systems to aid disaster management and policy-making. It can be a useful reference for graduate students, academics, practicing scientists and professionals of disaster management, artificial intelligence, and environmental sciences.

Journal ArticleDOI
TL;DR: The efficiency of MARS, LSSVM and GP is measured by a comparative study of statistical parameters; it can be concluded that all the models performed very well, as the outputs are very close to the desired values, while MARS slightly outperformed the other two models.
Abstract: The estimation of concrete compressive strength is of utmost importance for the construction of a building. Organizations have a limited budget for mix design; therefore, proper estimation of concrete data has a significant impact on site operations and the construction of the building. In this paper, the prediction of concrete compressive strength is done by Multivariate Adaptive Regression Spline (MARS), Least Squares Support Vector Machine (LSSVM) and genetic programming (GP), which is a very new approach in the field of concrete technology. MARS is a supervised technique that performs well for high-dimensional data and interacts little with the input variables, LSSVM is based on a statistical learning algorithm, and GP evolves equations for modeling. All the developed LSSVM, MARS and GP models give equations for the prediction of compressive strength, which makes it easy to predict the compressive strength of concrete. The efficiency of MARS, LSSVM and GP is measured by a comparative study of statistical parameters; it can be concluded that all the models performed very well, as the outputs are very close to the desired values, while MARS slightly outperformed the other two models.
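The GP strand above evolves an explicit strength equation; gplearn's SymbolicRegressor (an assumed, commonly used library, not necessarily the authors' tool) shows the idea on made-up mix-design data.

```python
# Hedged symbolic-regression sketch with gplearn (assumed library): GP
# evolves an explicit equation for compressive strength, as the abstract
# describes. The mix-design features and response are made-up data.
import numpy as np
from gplearn.genetic import SymbolicRegressor

X = np.random.rand(100, 3)  # dummy: cement content, w/c ratio, age factor
y = 50 * X[:, 0] - 30 * X[:, 1] + 5 * np.log1p(X[:, 2] * 28)

gp = SymbolicRegressor(population_size=500, generations=5, random_state=0)
gp.fit(X, y)
print(gp._program)  # the evolved strength equation
```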

Journal ArticleDOI
TL;DR: In this paper, four soft-computing approaches were developed and compared, including the group method of data handling (GMDH), Minimax Probability Machine Regression (MPMR), emotional neural network (ENN), and hybrid artificial neural network-particle swarm optimization (ANN-PSO), to estimate the slump-flow (S) and compressive strength (CS) as fresh and hardened properties of SCC, respectively.
Abstract: The fresh and hardened characteristics of self-compacting concrete (SCC) are an essential requirement for construction projects, and the admixture contents of SCC that govern these properties strongly affect its cost. The current study estimates the slump-flow (S) and compressive strength (CS) as fresh and hardened properties of SCC, respectively. Four soft-computing approaches were developed and compared, including the group method of data handling (GMDH), Minimax Probability Machine Regression (MPMR), emotional neural network (ENN), and hybrid artificial neural network-particle swarm optimization (ANN-PSO), to estimate the S and 28-day CS of SCC, which comprises fly ash (FA), silica fume (SF), and limestone powder (LP) as part of cement by mass in the total powder content. In addition, the impact of eight admixture components is investigated and evaluated to assess the sensitivity of admixture contents for the modelling of S and CS of SCC. The results demonstrate that the prediction performance of the ENN model is better than that of the other models in estimating the S and CS characteristics of SCC. The overall Pearson correlation coefficient (r) and root mean square error (RMSE) of the ENN model are 97.80% and 20.16 mm, respectively, for S, and 96.07% and 2.59 MPa, respectively, for CS. Furthermore, the sensitivity analysis shows that the fly ash content has a high impact on the estimated S and CS values of SCC.

Journal ArticleDOI
TL;DR: A new integrated approach based on the iterative super-resolution algorithm and expectation-maximization for face hallucination, which is a process of converting a low-resolution face image to a high-resolution image, is proposed and verified.
Abstract: This paper proposes and verifies a new integrated approach based on an iterative super-resolution algorithm and expectation-maximization for face hallucination, which is the process of converting a low-resolution face image into a high-resolution image. The current sparse representation for super-resolving generic image patches is not suitable for global face images due to its lower accuracy and high time consumption. To address this, in the new method, a global face sparse representation was trained to reconstruct images with misalignment variations after applying the local geometric co-occurrence matrix. In the testing phase, we propose a hybrid method that combines the sparse global representation and local linear regression using the Expectation Maximization (EM) algorithm. In this way, the method recovers the high-resolution image corresponding to a low-resolution input. Experimental validation suggested that the proposed method improves overall accuracy and quickly identifies high-resolution face images without misalignment.

01 Jan 2020
TL;DR: This book provides an interdisciplinary approach that creates advanced probabilistic models for engineering fields, ranging from conventional fields of mechanical engineering and civil engineering, to electronics, electrical, earth sciences, climate, agriculture, water resource, mathematical sciences and computer sciences.
Abstract: Handbook of Probabilistic Models carefully examines the application of advanced probabilistic models in conventional engineering fields. In this comprehensive handbook, practitioners, researchers and scientists will find detailed explanations of technical concepts, applications of the proposed methods, and the respective scientific approaches needed to solve the problem. This book provides an interdisciplinary approach that creates advanced probabilistic models for engineering fields, ranging from conventional fields of mechanical engineering and civil engineering, to electronics, electrical, earth sciences, climate, agriculture, water resource, mathematical sciences and computer sciences. Specific topics covered include minimax probability machine regression, stochastic finite element method, relevance vector machine, logistic regression, Monte Carlo simulations, random matrix, Gaussian process regression, Kalman filter, stochastic optimization, maximum likelihood, Bayesian inference, Bayesian update, kriging, copula-statistical models, and more.

DOI
01 Jun 2020
TL;DR: In this paper, the design of a retaining wall is modelled using Functional Networks (FN), Genetic Programming (GP) and the Group Method of Data Handling (GMDH), which eliminate the drawbacks of several other soft computing methods previously applied to reliability problems.
Abstract: Reliability analysis of geo-structures has contributed a lot to the field of Geotechnical Engineering. This area of study gives an overview of the probability of failure of different structures. The first-order second-moment (FOSM) method, incorporated in this study, determines the reliability index of geo-structures (and other structures as well). In this paper, the design of a retaining wall is modelled using Functional Networks (FN), Genetic Programming (GP) and the Group Method of Data Handling (GMDH). These soft computing techniques have removed the cumbersome nature of the problem and have increased the precision of the results, and the uncertainties involved in this problem are reduced. As these methodologies are well evolved and are hot topics in the artificial intelligence field, they eliminate the drawbacks of several other soft computing methods previously applied to reliability problems. These methodologies employ a genetic algorithm (GMDH) and make use of domain knowledge along with data knowledge (FN). These techniques make the problem tractable and produce precise results. The performance of these methods has been assessed using different performance analyses, criteria and parameters. This paper is a comparative study of FOSM, FN-based FOSM, GP-based FOSM and GMDH-based FOSM.
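For reference, the FOSM reliability index used throughout this comparison is beta = mu_g / sigma_g, with the performance function g linearized at the variable means; the sketch below computes it numerically for a made-up retaining-wall performance function.

```python
# FOSM sketch: beta = g(mu) / sigma_g, with g linearized at the means via
# central finite differences. The retaining-wall performance function g
# below (resisting term minus driving load) is a made-up illustration.
import numpy as np

def fosm_beta(g, mu, sigma, h=1e-6):
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    grad = np.empty_like(mu)
    for i in range(len(mu)):
        d = np.zeros_like(mu)
        d[i] = h
        grad[i] = (g(mu + d) - g(mu - d)) / (2 * h)  # dg/dx_i at the means
    return g(mu) / np.sqrt(np.sum((grad * sigma) ** 2))

g = lambda x: x[0] * x[1] - x[2]  # friction coeff * weight term - load
print("beta =", fosm_beta(g, mu=[0.6, 500.0, 200.0],
                          sigma=[0.05, 50.0, 30.0]))
```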