
Showing papers on "Soft computing published in 2018"


Patent
12 Mar 2018
TL;DR: The specification covers new algorithms, methods, and systems for Artificial Intelligence, presented as the first application of General-AI (versus Specific, Vertical, or Narrow-AI), including Explainable-AI (XAI); the addition of reasoning, inference, and cognitive layers/engines to the learning module/engine/layer; soft computing; the Information Principle; Stratification; the Incremental Enlargement Principle; and deep-level/detailed recognition.
Abstract: Specification covers new algorithms, methods, and systems for: Artificial Intelligence; the first application of General-AI. (versus Specific, Vertical, or Narrow-AI) (as humans can do) (which also includes Explainable-AI or XAI); addition of reasoning, inference, and cognitive layers/engines to learning module/engine/layer; soft computing; Information Principle; Stratification; Incremental Enlargement Principle; deep-level/detailed recognition, e.g., image recognition (e.g., for action, gesture, emotion, expression, biometrics, fingerprint, tilted or partial-face, OCR, relationship, position, pattern, and object); Big Data analytics; machine learning; crowd-sourcing; classification; clustering; SVM; similarity measures; Enhanced Boltzmann Machines; Enhanced Convolutional Neural Networks; optimization; search engine; ranking; semantic web; context analysis; question-answering system; soft, fuzzy, or un-sharp boundaries/impreciseness/ambiguities/fuzziness in class or set, e.g., for language analysis; Natural Language Processing (NLP); Computing-with-Words (CWW); parsing; machine translation; music, sound, speech, or speaker recognition; video search and analysis (e.g., “intelligent tracking”, with detailed recognition); image annotation; image or color correction; data reliability; Z-Number; Z-Web; Z-Factor; rules engine; playing games; control system; autonomous vehicles or drones; self-diagnosis and self-repair robots; system diagnosis; medical diagnosis/images; genetics; drug discovery; biomedicine; data mining; event prediction; financial forecasting (e.g., for stocks); economics; risk assessment; fraud detection (e.g., for cryptocurrency); e-mail management; database management; indexing and join operation; memory management; data compression; event-centric social network; social behavior; drone/satellite vision/navigation; smart city/home/appliances/IoT; and Image Ad and Referral Networks, for e-commerce, e.g., 3D shoe recognition, from any view angle.

216 citations


Journal ArticleDOI
TL;DR: The soft computing model is presented as a simple formula and excellent agreement is obtained representing a high degree of reliability for the proposed model.

131 citations


Journal ArticleDOI
TL;DR: Four types of soft computing techniques were employed to predict the 28-day elastic modulus of RAC (ERAC), and the results showed that the proposed models based on the SVR and ANN techniques outperform the models proposed using other techniques.

122 citations


Journal ArticleDOI
TL;DR: A comprehensive review of the soft computing techniques for multiphase flow metering with a particular focus on the measurement of individual phase flowrates and phase fractions is presented.

107 citations


Journal ArticleDOI
TL;DR: The wind driven optimization (WDO) algorithm, a newly developed evolutionary algorithm, is used for tuning load frequency controllers. Based on simulation studies, the impact of different objective functions on the performance of the evolutionary algorithms in tuning the controllers is investigated.
Abstract: Due to the great importance of the performance of load frequency controllers in power systems, a lot of effort has been devoted to improving the performance of these controllers by fine tuning the...

102 citations


Journal ArticleDOI
TL;DR: The findings of the current research provide a reliable soft computing model to determine WQI that can be used instead of the conventional procedure, which consumes time, cost and effort and is sometimes subject to computation errors.
Abstract: Soft computing models are known as an efficient tool for modelling the temporal and spatial variation of surface water quality variables, particularly in rivers. These models' performance relies on how effectively their simulation processes are accomplished. The fuzzy logic approach is one of the authoritative intelligent models for solving complex problems that deal with uncertain and vague data. River water quality involves high stochasticity and redundancy due to its correlation with several hydrological and environmental aspects. Yet, fuzzy logic theory can give a robust solution for modelling river water quality problems. In addition, this approach can likewise be coordinated with an expert system framework to give reliable and trustworthy information to decision makers in enhancing river system sustainability and factual strategies. In this research, different hybrid intelligence models based on an adaptive neuro-fuzzy inference system (ANFIS) integrated with fuzzy c-means data clustering (FCM), grid partition (GP) and subtractive clustering (SC) models are used in modelling the river water quality index (WQI). Monthly measurement records belonging to the Selangor River in Malaysia were selected to build the predictive models. The modelling process included several water quality parameters covering physical, chemical and biological variables, with WQI as the target variable. In the first stage of the research, a statistical analysis of each water quality parameter was carried out with respect to the WQI, and in the second stage the predictive models were established. The findings of the current research provide a reliable soft computing model to determine WQI that can be used instead of the conventional procedure, which consumes time, cost and effort and is sometimes subject to computation errors.
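The fuzzy c-means clustering step used to partition the data before ANFIS training can be sketched as follows. This is a minimal 1-D implementation with illustrative data, not the authors' code:

```python
def fcm(data, c=2, m=2.0, iters=50):
    """Minimal fuzzy c-means for 1-D data: returns cluster centers and the
    fuzzy membership matrix u[i][j] of point j in cluster i."""
    centers = [min(data), max(data)][:c]          # simple deterministic init
    u = [[0.0] * len(data) for _ in range(c)]
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
        for j, x in enumerate(data):
            d = [abs(x - v) or 1e-12 for v in centers]
            for i in range(c):
                u[i][j] = 1.0 / sum((d[i] / dk) ** (2.0 / (m - 1.0)) for dk in d)
        # center update: membership-weighted mean of the data
        for i in range(c):
            w = [u[i][j] ** m for j in range(len(data))]
            centers[i] = sum(wj * xj for wj, xj in zip(w, data)) / sum(w)
    return centers, u

# two well-separated 1-D groups standing in for a water quality variable
data = [1.0, 1.2, 0.9, 1.1, 8.0, 8.2, 7.9, 8.1]
centers, u = fcm(data)
```

Each point receives a graded membership in every cluster (rows of `u` sum to one per point), which is what makes the partition "fuzzy" rather than crisp.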

98 citations


Journal ArticleDOI
TL;DR: This paper overcomes DBSCAN's difficulty in selecting input parameters by exploiting the correlation coefficient, and improves detection accuracy through simultaneous analysis of the three features of temperature, humidity, and voltage.
Abstract: Anomalies are an important and influential element in Wireless Sensor Networks, affecting the integrity of data. On account of the fact that these networks cannot be supervised, this paper deals with the problem of anomaly detection. First, the three features of temperature, humidity, and voltage are extracted from the network traffic. Then, network data are clustered using the density-based spatial clustering of applications with noise (DBSCAN) algorithm. The method also analyzes the accuracy of the DBSCAN algorithm's input data with the help of density-based detection techniques. The algorithm detects points in regions with low density as anomalies. Using normal data, it trains a support vector machine, and finally it removes anomalies from the network data. The proposed algorithm is evaluated on the standard and general data set of the Intel Berkeley Research Lab (IBRL). In this paper, we overcome DBSCAN's difficulty in selecting input parameters by exploiting the correlation coefficient. The advantage of the proposed algorithm over previous ones lies in its use of soft computing methods, simple implementation, and improved detection accuracy through simultaneous analysis of those three features.
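The density-based clustering step described above can be sketched with a minimal DBSCAN on toy sensor readings. `eps` and `min_pts` are the two input parameters the paper selects via the correlation coefficient; the data here are invented for illustration:

```python
def dbscan(points, eps=1.0, min_pts=3):
    """Minimal DBSCAN: returns a label per point
    (cluster id >= 0, or -1 for noise, i.e. a potential anomaly)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def neighbours(i):
        return [j for j in range(len(points)) if dist(points[i], points[j]) <= eps]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbs = neighbours(i)
        if len(nbs) < min_pts:
            labels[i] = -1           # low-density region -> anomaly
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(nbs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point, previously noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbours(j)
            if len(jn) >= min_pts:   # core point: keep expanding the cluster
                seeds.extend(jn)
    return labels

# (temperature, humidity) readings: one dense cluster plus an outlier
readings = [(25.0, 40.0), (25.2, 40.5), (24.8, 39.8), (25.1, 40.2), (60.0, 5.0)]
labels = dbscan(readings, eps=1.5, min_pts=3)
```

The isolated reading ends up labelled `-1`, which is exactly the "point in a low-density region" the paper flags as an anomaly.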

93 citations


Journal ArticleDOI
TL;DR: The idea is to encapsulate various aspects like emerging topics, methods, evaluation parameters, the problems associated with different types of images, databases, segmentation applications, and other resources, so that it could be advantageous for researchers making an effort to develop new methods for segmentation.
Abstract: Image segmentation is the method of partitioning an image into groups of pixels that are homogeneous in some manner. The homogeneity depends on some attributes like intensity, color etc. Segmentation, being a pre-processing step in image processing, has been used in a number of applications, from identification of objects to medical images, satellite images and much more. Image segmentation methods can collectively be divided into two categories: Traditional methods and Soft Computing (SC) methods. Unlike Traditional methods, SC methods have the ability to simulate human thinking, are flexible in working with their membership functions, and have been predominantly applied to the task of image segmentation. SC techniques are tolerant of partial truth, imprecision, uncertainty, and approximations. SC approaches also have the advantage of providing cost-effective, high-performance and dependable solutions. In this survey paper, our emphasis is on core SC approaches like Fuzzy Logic, Artificial Neural Networks, and Genetic Algorithms used for image segmentation. The contribution lies in presenting researchers with a state-of-the-art elaboration of almost all dimensions associated with image segmentation. The idea is to encapsulate various aspects like emerging topics, methods, evaluation parameters, the problems associated with different types of images, databases, segmentation applications, and other resources, so that it could be advantageous for researchers making an effort to develop new methods for segmentation. The paper closes with findings and concluding remarks.

81 citations


Journal ArticleDOI
TL;DR: This study investigates and proposes a method for improving a traditional range-free-based localization method (centroid) that uses soft computing approaches in a hybrid model that integrates a fuzzy logic system into centroid and uses an extreme learning machine (ELM) optimization technique to achieve a robust location estimation scheme.

77 citations


Journal ArticleDOI
27 Jul 2018-Energies
TL;DR: Compared with other soft computing models, the results indicate the feasibility and superiority of the proposed model in both point and probabilistic wind speed forecasting.
Abstract: Wind energy is a commonly utilized renewable energy source, due to its merits of extensive distribution and rich reserves. However, as wind speed fluctuates violently and uncertainly at all times, wind power integration may affect the security and stability of the power system. In this study, we propose an ensemble model for probabilistic wind speed forecasting. It consists of wavelet threshold denoising (WTD), recurrent neural networks (RNNs) and an adaptive neuro-fuzzy inference system (ANFIS). Firstly, WTD smooths the wind speed series in order to better capture its variation trend. Secondly, RNNs with different architectures are trained on the denoised datasets, operating as sub-models for point wind speed forecasting. Thirdly, ANFIS is innovatively established as the top layer of the entire ensemble model to compute the final point prediction result, in order to take full advantage of a limited number of deep-learning-based sub-models. Lastly, variances are obtained from the sub-models, and prediction intervals for probabilistic forecasting can then be calculated, where the variances inventively consist of modeling and forecasting uncertainties. The proposed ensemble model is established and verified on less-than-one-hour-ahead ultra-short-term wind speed forecasting and compared with other soft computing models. The results indicate the feasibility and superiority of the proposed model in both point and probabilistic wind speed forecasting.
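The wavelet threshold denoising (WTD) stage can be illustrated with a one-level Haar transform and soft thresholding. The authors' wavelet basis, decomposition depth and threshold rule may well differ; this is just the generic idea on an invented series:

```python
def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-thresholding of an even-length series."""
    # forward Haar transform: pairwise averages (approximation) and differences (detail)
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]

    # soft-threshold the high-frequency detail coefficients
    def soft(d):
        return (abs(d) - threshold) * (1 if d > 0 else -1) if abs(d) > threshold else 0.0

    detail = [soft(d) for d in detail]
    # inverse transform: rebuild the series from approximation + shrunken detail
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

# toy wind speed series (m/s) with one noisy spike
noisy = [5.0, 5.2, 5.1, 4.9, 5.0, 5.3, 9.0, 5.1]
smoothed = haar_denoise(noisy, threshold=0.15)
```

Small fluctuations are flattened while the series length and overall trend are preserved, which is what lets the downstream RNN sub-models "better capture the variation trend".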

65 citations


Journal ArticleDOI
TL;DR: The proposed approach is able to solve both the position and orientation of the inverse kinematic problem and avoids singular configurations, since it is based on the forward kinematics equations.

Journal ArticleDOI
TL;DR: A new approach for time series prediction based on using different soft computing techniques, such as neural networks, type-1 and type-2 fuzzy logic systems and bio-inspired algorithms, where each of these intelligent techniques can provide a variety of features for solving real and complex problems is described.
Abstract: This paper describes a new approach for time series prediction based on using different soft computing techniques, such as neural networks (NNs), type-1 and type-2 fuzzy logic systems and bio-inspired algorithms, where each of these intelligent techniques can provide a variety of features for solving real and complex problems. Therefore, this paper describes the application of ensembles of interval type-2 fuzzy neural network (IT2FNN) models. The IT2FNN uses hybrid learning algorithm techniques from NNs models and fuzzy logic systems. The output of the Ensemble of IT2FNN models needs the integration process to forecast the time series, and we are required to design the fuzzy integrator (FI) to solve this real problem. Genetic algorithms and particle swarm optimization are used for the optimization of the parameter values in the membership functions of the FI. We consider different time series to measure the performance of the proposed model, and these time series are: Mackey–Glass, Mexican Stock Exchange (MSE or BMV), Dow Jones and NASDAQ. The forecasting errors are calculated as follows: mean absolute error, mean square error (MSE), root-mean-square error, mean percentage error and mean absolute percentage error. The best prediction errors are illustrated as follows: 0.00025 for the Mackey–Glass, 0.01012 for the MSE, 0.01307 for the Dow Jones and 0.01171 for the NASDAQ time series. Simulation results are compared using a statistical test and provide evidence of the potential advantages of the proposed approach.
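The five forecasting error measures listed in the abstract can be computed as follows. This is a generic implementation on invented numbers, not the authors' code:

```python
def forecast_errors(actual, predicted):
    """MAE, MSE, RMSE, MPE and MAPE for a point-forecast series."""
    n = len(actual)
    errs = [a - p for a, p in zip(actual, predicted)]
    mae = sum(abs(e) for e in errs) / n                                 # mean absolute error
    mse = sum(e * e for e in errs) / n                                  # mean square error
    rmse = mse ** 0.5                                                   # root-mean-square error
    mpe = 100.0 * sum(e / a for e, a in zip(errs, actual)) / n          # mean percentage error
    mape = 100.0 * sum(abs(e) / abs(a) for e, a in zip(errs, actual)) / n  # mean abs. % error
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "MPE": mpe, "MAPE": mape}

actual = [100.0, 102.0, 101.0, 98.0]
predicted = [99.0, 103.0, 100.0, 99.0]
metrics = forecast_errors(actual, predicted)
```

Note that MPE lets positive and negative errors cancel (useful for spotting bias), whereas MAPE does not, which is why papers typically report both.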

Journal ArticleDOI
TL;DR: This paper uses a supervised machine learning approach to predict the component reliability in 19 industrial components obtained from real industries, and shows how machine learning models obtain better prediction results with respect to traditional methods when increasing the size of the time-to-failure datasets.
Abstract: The reliability estimation of engineered components is fundamental for many optimization policies in a production process. The main goal of this paper is to study how machine learning models can fit this reliability estimation function in comparison with traditional approaches (e.g., Weibull distribution). We use a supervised machine learning approach to predict this reliability in 19 industrial components obtained from real industries. Particularly, four diverse machine learning approaches are implemented: artificial neural networks, support vector machines, random forest, and soft computing methods. We evaluate if there is one approach that outperforms the others when predicting the reliability of all the components, analyze if machine learning models improve their performance in the presence of censored data, and finally, understand the performance impact when the number of available inputs changes. Our experimental results show the high ability of machine learning to predict the component reliability and particularly, random forest, which generally obtains high accuracy and the best results for all the cases. Experimentation confirms that all the models improve their performance when considering censored data. Finally, we show how machine learning models obtain better prediction results with respect to traditional methods when increasing the size of the time-to-failure datasets.
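For reference, the traditional baseline mentioned above is the two-parameter Weibull model, whose reliability function can be written directly. The parameters below are illustrative, not fitted to the paper's 19 components:

```python
import math

def weibull_reliability(t, beta, eta):
    """Two-parameter Weibull reliability R(t) = exp(-(t/eta)^beta):
    beta is the shape parameter, eta the characteristic life."""
    return math.exp(-((t / eta) ** beta))

# illustrative parameters: wear-out behaviour (beta > 1), 1000 h characteristic life
beta, eta = 1.5, 1000.0
r_at_eta = weibull_reliability(1000.0, beta, eta)
```

At t = eta the reliability is always exp(-1), about 0.368, regardless of beta; machine learning models like those in the paper replace this fixed parametric shape with a function learned from time-to-failure data.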

Journal ArticleDOI
TL;DR: This research paper presents an innovative system that can effectively offer SEG cybersecurity, which employs soft computing approaches, fuzzy cognitive maps, and a Mamdani fuzzy inference system in order to model overall security level.
Abstract: The upgrade of energy infrastructures by the incorporation of communication and Internet technologies might introduce new risks for the security and for the smooth operation of electricity networks. Exploitation of the potential vulnerabilities of the heterogeneous systems used in smart energy grids (SEGs) may lead to the loss of control of critical electronic devices and, moreover, to the interception of confidential information. This may result in the disruption of essential services or even in total power failures. Addressing security issues that can ensure the confidentiality, the integrity, and availability of energy information is the primary objective for a transition to a new energy shape. This research paper presents an innovative system that can effectively offer SEG cybersecurity. It employs soft computing approaches, fuzzy cognitive maps, and a Mamdani fuzzy inference system in order to model the overall security level. Three of the 27 scenarios considered herein have low overall security.

Journal ArticleDOI
TL;DR: A hybrid soft computing approach based on clustering, rule extraction, and decision tree methodology to predict the segment of new customers in customer-centric companies is applied in two case studies in the fields of insurance and telecommunications in order to predict potentially profitable leads.

Journal ArticleDOI
TL;DR: It is conspicuous from the review that artificial neural network based hybrids turned out to be more prevalent, more pervasive and more powerful than other soft computing hybrids found in the literature.

Journal ArticleDOI
TL;DR: The study is novel as it traces the rise of soft computing methods in the field of object detection and tracking in videos, which has been neglected over the years, and provides a number of analyses to guide future directions of research.

Journal ArticleDOI
TL;DR: A case study of five meteorological stations located in Kurdistan province in the west of Iran shows soft computing models were superior to the empirical methods in modelling ET0, and the ANN was found to be better than the ANFIS and GEP.
Abstract: Evapotranspiration assessment is one of the most substantial issues in hydrology. The methods used in modelling reference evapotranspiration (ET0) consist of empirical equations or complex methods based on physical processes. In arid and semi-arid climates, determining the amount of evapotranspiration has a major role in the design of irrigation systems, irrigation network management, planning and management of water resources and water management issues in the agricultural sector. This paper presents a case study of five meteorological stations located in Kurdistan province in the west of Iran. The ability of three different soft computing methods, an artificial neural network (ANN), an adaptive neuro-fuzzy inference system (ANFIS) and gene expression programming (GEP), were compared for modelling ET0 in this study. The FAO56 Penman−Monteith model was considered as a reference model and soft computing models were compared using the Priestley−Taylor, Hargreaves, Hargreaves−Samani, Makkink and Makkink−Hansen empirical methods, with respect to the determination co-efficient, the root mean square error, the mean absolute error and the Nash–Sutcliffe model efficiency co-efficient. Soft computing models were superior to the empirical methods in modelling ET0. Among the soft computing methods, the ANN was found to be better than the ANFIS and GEP.
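As an example of the empirical baselines listed, the Hargreaves-Samani equation estimates ET0 from temperature and extraterrestrial radiation alone. This is a commonly stated form of the formula with `ra` expressed in mm/day of evaporation equivalent; the input values are illustrative, not the paper's Kurdistan data:

```python
def hargreaves_samani(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani reference evapotranspiration ET0 (mm/day).
    t_* are air temperatures (deg C); ra is extraterrestrial radiation
    expressed in mm/day of evaporation equivalent."""
    return 0.0023 * ra * (t_mean + 17.8) * (t_max - t_min) ** 0.5

# illustrative summer day in a semi-arid climate
et0 = hargreaves_samani(t_mean=25.0, t_max=33.0, t_min=17.0, ra=16.0)
```

The appeal of such equations is their minimal data demand; the paper's point is that ANN/ANFIS/GEP models approximate the data-hungry FAO56 Penman-Monteith reference more closely than these shortcuts do.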

Journal Article
TL;DR: The main techniques in soft computing are evolutionary computing, artificial neural networks, and fuzzy logic and Bayesian statistics, which can produce solutions to problems that are too complex or inherently noisy to tackle with conventional mathematical methods.
Abstract: Soft Computing refers to the science of reasoning, thinking and deduction that recognizes and uses the real-world phenomena of grouping, memberships, and classification of various quantities under study. As such, it is an extension of natural heuristics and capable of dealing with complex systems because it does not require strict mathematical definitions and distinctions for the system components. It differs from hard computing in that, unlike hard computing, it is tolerant of imprecision, uncertainty and partial truth. In effect, the role model for soft computing is the human mind. The guiding principle of soft computing is: exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness and low solution cost. The main techniques in soft computing are evolutionary computing, artificial neural networks, fuzzy logic and Bayesian statistics. Each technique can be used separately, but a powerful advantage of soft computing is the complementary nature of the techniques. Used together they can produce solutions to problems that are too complex or inherently noisy to tackle with conventional mathematical methods. The applications of soft computing have demonstrated two main advantages. First, it makes it possible to solve nonlinear problems for which mathematical models are not available. Second, it introduces human knowledge such as cognition, recognition, understanding and learning into the fields of computing. This has resulted in the possibility of constructing intelligent systems such as autonomous self-tuning systems and automatically designed systems. This paper highlights various areas of soft computing techniques.
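The tolerance for imprecision described above can be made concrete with the simplest fuzzy-logic building block, a triangular membership function that assigns graded rather than crisp set membership. The set and its breakpoints below are illustrative:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 outside [a, c], rising linearly
    to 1 at the peak b, then falling linearly back to 0."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# "comfortable room temperature" as a fuzzy set peaking at 22 deg C
mu = triangular(20.0, a=18.0, b=22.0, c=26.0)
```

A classical (hard) set would declare 20 degrees either comfortable or not; the fuzzy set says it is comfortable to degree 0.5, and rule-based systems reason with these degrees.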

Journal ArticleDOI
TL;DR: The results suggest that most of the soft computing techniques used can work with good accuracy for the problem of effort estimation based on UCP, and the general regression neural network is the superior one, with stable ranking across different accuracy measures.
Abstract: The size of a software project is a key measure for predicting software effort at the requirements and analysis phase. Use case points (UCP) is among the software size metrics that have achieved a good reputation because of the increasing popularity of use-case-driven development methodologies in the software industry. Nevertheless, there is no consistent method that can effectively translate the UCP into its corresponding effort. Previous estimation models were built using a very limited number of projects, and they were not well examined. Soft computing techniques have rarely been applied to this problem, and their performance has not been well investigated using a systematic procedure. This study looks into the accuracy and stability of some soft computing methods for the problem of effort estimation based on UCP. Four neural network methods, an adaptive neuro-fuzzy inference system and support vector regression have been used in this comparative study. The results suggest that most of the soft computing techniques used can work with good accuracy for this problem. Among them, the general regression neural network is the superior one, with stable ranking across different accuracy measures. Also, it has been found that using adjustment variables with basic UCP variables, solely or together, has a positive impact on accuracy and stability.
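The basic UCP-to-effort translation that the paper seeks to improve on is usually stated as a product of size counts and adjustment factors, converted to effort with a fixed productivity rate. The sketch below uses Karner's commonly cited 20 person-hours per UCP; all input values are illustrative:

```python
def ucp_effort(uucw, uaw, tcf, ecf, hours_per_ucp=20.0):
    """Basic use case points model:
    UCP = (UUCW + UAW) * TCF * ECF, effort = UCP * productivity rate,
    where UUCW/UAW are the unadjusted use case / actor weights and
    TCF/ECF the technical and environmental complexity factors."""
    ucp = (uucw + uaw) * tcf * ecf
    return ucp, ucp * hours_per_ucp

# illustrative project: 150 use case points' worth of use cases, 12 of actors
ucp, hours = ucp_effort(uucw=150.0, uaw=12.0, tcf=1.0, ecf=0.95)
```

The soft computing models in the study essentially learn a data-driven replacement for the fixed `hours_per_ucp` translation, which is the step the abstract calls inconsistent.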

Journal ArticleDOI
TL;DR: The researchers studied the existing literature on assembly sequence generation methods and their limitations, and came up with an efficient automated optimal sequence generation method that eliminates those assembly sets that have more directional changes and require more energy.
Abstract: In recent days, many products with interacting shapes have been developed by manufacturing industries for different applications in various fields such as defense, aerospace, and space centers. In manufacturing, 30% of the time consumed is due to assembly operations, compared with the remaining manufacturing processes. It is very difficult to obtain an optimal sequence because assembly sequence planning is a multimodal optimization problem. As the number of parts in the assembly increases, the possible number of sequences increases exponentially; therefore, obtaining the optimal assembly sequence becomes more difficult and time consuming. Many mathematical algorithms exist for obtaining optimal assembly sequences, but recent studies state that they perform poorly when it comes to multiobjective optimal assembly sequencing. In recent years, researchers have developed several soft-computing-based algorithms for solving assembly sequence problems. In this paper, an assembly subset detection method is introduced and applied for the first time to solve assembly sequence problems. This method eliminates those assembly sets that have more directional changes and require more energy. The method is compared with other algorithms, namely, the genetic algorithm (GA), enhanced GA, ant colony optimization (ACO), the memetic algorithm, the imperialistic harmonic search algorithm, and the flower pollination algorithm (FPA), and is found to be successful in achieving the optimal assembly sequence for an industrial product with a smaller number of iterations. Note to Practitioners: This paper is motivated by the redesign of the helicopter cowling of a Canadian aircraft company using concepts of design for assembly. Though we could reduce the number of parts using advanced composite materials and manufacturing processes, obtaining a feasible assembly for the new assembly structure required a lot of computation time. Hence, the researchers studied the existing literature on assembly sequence generation methods and their limitations, and came up with an efficient automated optimal sequence generation method.

BookDOI
31 Aug 2018
TL;DR: This discussion covers the design and use of LSP criteria for evaluation and comparison in diverse areas, such as search engines, medical conditions, real estate, space management, habitat mitigation projects in ecology, and land use and residential development suitability maps.
Abstract: Soft Computing Evaluation Logic provides an in-depth examination of evaluation decision problems and presents comprehensive guidance toward the use of the Logic Scoring of Preference (LSP) method in modeling complex decision criteria. Fully aligned with current developments in computational intelligence, the discussion covers the design and use of LSP criteria for evaluation and comparison in diverse areas, such as search engines, medical conditions, real estate, space management, habitat mitigation projects in ecology, and land use and residential development suitability maps, with versatile transfer to other similar decision-modeling contexts.
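At the core of LSP criteria is the weighted power mean, whose exponent tunes the aggregator between conjunctive (all inputs must be good) and disjunctive (any good input suffices) behaviour. This is a minimal sketch of that one building block; full LSP criteria compose many such aggregators:

```python
def wpm(scores, weights, r):
    """Weighted power mean of normalized scores in [0, 1]:
    r = 1 is the neutral weighted average, large negative r behaves like
    conjunction (dominated by the worst score), large positive r like
    disjunction (dominated by the best score)."""
    if r == 0:  # geometric mean limit
        p = 1.0
        for s, w in zip(scores, weights):
            p *= s ** w
        return p
    return sum(w * s ** r for s, w in zip(scores, weights)) ** (1.0 / r)

# one strong and one weak attribute score, equally weighted
scores, weights = [0.9, 0.4], [0.5, 0.5]
neutral = wpm(scores, weights, 1.0)   # plain weighted average
conj = wpm(scores, weights, -5.0)     # pulled toward the worst score
disj = wpm(scores, weights, 5.0)      # pulled toward the best score
```

Choosing the exponent per aggregation node is how an LSP criterion expresses that some requirements are mandatory while others are merely desirable.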

Journal ArticleDOI
TL;DR: In this study a soft computing approach, namely an adaptive neuro-fuzzy inference system (ANFIS), was used, and the results confirm that the application of education software could produce the best results in mathematics lectures.
Abstract: Mathematics lectures can be a very challenging task for both teachers and pupils, since there is a large amount of educational material to be acquired throughout the year. Therefore there is a need to improve the lectures in order to make them more interesting and attractive, especially for pupils. In order to find out how to improve the lectures, statistical analysis is needed to detect which factors are the most dominant for mathematics lecture performance. For this purpose, a soft computing approach, namely an adaptive neuro-fuzzy inference system (ANFIS), was used in this study. The ANFIS should determine the qualitative influence of several factors on mathematics performance. The results confirm that the application of education software could produce the best results in mathematics lectures.

Journal ArticleDOI
TL;DR: A new range-free localization algorithm is proposed which uses neural networks and utilizes the particle swarm optimization (PSO) algorithm to optimize the number of neurons in the hidden layers of the neural networks.
Abstract: The Wireless Sensor Network is one of the new technologies that have received more attention in the past few years. The localization problem is one of the most important topics in these types of networks. Traditional positioning techniques cannot be used in these networks due to the hardware restrictions of the sensor nodes. Lately, some positioning methods which use soft computing approaches, such as neural networks, have been proposed for solving the localization problem. In this paper, we propose a new range-free localization algorithm which uses neural networks for this purpose. This method utilizes the particle swarm optimization (PSO) algorithm to optimize the number of neurons in the hidden layers of the neural networks. The objective function considers both localization accuracy and storage overhead simultaneously. The proposed algorithm is implemented and simulated in isotropic networks with and without coverage holes, and in anisotropic networks. The obtained results show that, under different environmental conditions, the proposed algorithm has a lower localization error rate and lower storage requirements than analogous methods.
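The PSO search described above can be sketched on a toy one-dimensional objective. The paper's actual objective mixes localization error and storage overhead over discrete hidden-layer sizes; here a smooth function with a known minimum stands in for it:

```python
import random

def pso(f, lo, hi, n_particles=10, iters=60, seed=1):
    """Minimal particle swarm optimization minimizing f on the box [lo, hi].
    Standard velocity update: inertia + pull toward the particle's own best
    position (pbest) + pull toward the swarm's best position (gbest)."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = list(pos)
    gbest = min(pos, key=f)
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            vel[i] = (w * vel[i]
                      + c1 * rng.random() * (pbest[i] - pos[i])
                      + c2 * rng.random() * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))  # clamp to the box
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i]
            if f(pos[i]) < f(gbest):
                gbest = pos[i]
    return gbest

# toy objective with a known minimum at x = 3
best = pso(lambda x: (x - 3.0) ** 2 + 1.0, lo=0.0, hi=10.0)
```

In the paper's setting, `f` would train a network with a candidate neuron count and return the combined accuracy/storage cost, and the position would be rounded to an integer.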

Journal ArticleDOI
TL;DR: Wavelet analysis and artificial intelligence machine learning will be combined to improve the self-learning ability and prediction accuracy of SVM and NN models to achieve ultimate load forecasting.

Journal ArticleDOI
TL;DR: A methodology based on multi-objective evolutionary algorithms is proposed for the SR problem that overcomes such limitations, together with a mathematical formulation of the problem.
Abstract: Distribution system (DS) service restoration (SR) in contingency situations is one of the most complex and challenging problems in DS operation. It is usually formulated as a multi-objective and multi-constraint optimization problem that must be quickly solved. Several methods have been proposed for its solution, however, most of them still have limitations. Some demand long running time when applied to large-scale DSs modeled with no simplification, whereas others disregard some important aspects of the SR problem. This paper proposes a methodology based on multi-objective evolutionary algorithms for the SR problem that overcomes such limitations. In contrast to methods reported in the literature, the methodology: 1) deals with large-scale DSs with relatively soft computing time and requires no network topology simplification; 2) prioritizes the operation of remotely controlled switches; 3) prioritizes supply to three levels of priority customers; and 4) provides switching sequences. A mathematical formulation of the problem is also proposed. Several tests were conducted for the evaluation of the methodology and single and multiple fault cases in large-scale DSs (from 631 to 5158 switches) were considered.

Journal ArticleDOI
TL;DR: A hybrid model is proposed which is a combination of computational intelligence tools and soft computing techniques to identify nonlinear patterns with probabilistic classifiers to obtain narrower intervals than would be otherwise possible under the traditional FARIMA models.

Journal ArticleDOI
TL;DR: An extended ranking approach for generalized fuzzy numbers integrating the concepts of centroid point, rank index value, height of a fuzzy number, and the degree of the decision maker’s optimism is proposed, which provides a consistent ranking order for decision makers.

Journal ArticleDOI
TL;DR: This work is a review of the state of the art on the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0.
Abstract: Computational intelligence (CI) involves using a computer algorithm to capture hidden knowledge from data and to use it for training an “intelligent machine” to make complex decisions without human intervention. As simulation is becoming more prevalent from design and planning to manufacturing and operations, laser material processing can also benefit from computer-generated knowledge through soft computing. This work is a review of the state of the art on the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on the methods that have proven effective and robust in solving several problems in welding, cutting, drilling, surface treating and additive manufacturing using the laser beam. After a basic description of the most common computational intelligence techniques employed in manufacturing, four sections, namely laser joining, machining, surface treatment, and additive manufacturing, cover the most recent applications in the already extensive literature regarding CI in LMP. Finally, emerging trends and future challenges are identified and discussed.

Journal ArticleDOI
TL;DR: In this paper, an extreme learning machine (ELM) was used to predict the future output of beam strength and ductility based on relative inputs using a soft computing scheme, and the experimental results indicated that, on the whole, the newly developed algorithm achieves good generalization performance.
Abstract: Evaluation of the parameters affecting the shear strength and ductility of steel–concrete composite beams is the goal of this study. This study focuses on predicting the future output of a beam's strength and ductility based on relative inputs using a soft computing scheme, the extreme learning machine (ELM). Estimation and prediction results of the ELM models were compared with genetic programming (GP) and artificial neural network (ANN) models. Referring to the experimental results, as opposed to the GP and ANN methods, the ELM approach enhanced generalization ability and predictive accuracy. Moreover, the achieved results indicated that the developed ELM models can be used with confidence for further work on formulating a novel model predictive strategy for the shear strength and ductility of steel–concrete composites. Furthermore, the experimental results indicate that, on the whole, the newly developed algorithm achieves good generalization performance. In comparison to other widely used conventional learning algorithms, the ELM has a much faster learning ability.
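The reason ELM learns much faster than iteratively trained networks is visible in a minimal implementation: the hidden layer is random and fixed, so only the output weights are computed, in one least-squares solve. The sketch below fits a toy 1-D target standing in for the beam data; it is an illustration of the ELM idea, not the authors' model:

```python
import math, random

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def elm_fit(xs, ys, hidden=10, ridge=1e-6, seed=3):
    """Extreme learning machine for 1-D regression: random sigmoid hidden
    layer, output weights by ridge-regularised least squares."""
    rng = random.Random(seed)
    w = [rng.uniform(-4, 4) for _ in range(hidden)]   # fixed random input weights
    b = [rng.uniform(-4, 4) for _ in range(hidden)]   # fixed random biases
    h = lambda x: [1.0 / (1.0 + math.exp(-(wi * x + bi))) for wi, bi in zip(w, b)]
    H = [h(x) for x in xs]
    # normal equations: (H^T H + ridge * I) beta = H^T y
    A = [[sum(Hk[i] * Hk[j] for Hk in H) + (ridge if i == j else 0.0)
          for j in range(hidden)] for i in range(hidden)]
    rhs = [sum(Hk[i] * y for Hk, y in zip(H, ys)) for i in range(hidden)]
    beta = gauss_solve(A, rhs)
    return lambda x: sum(bi * hi for bi, hi in zip(beta, h(x)))

# toy regression target standing in for the beam strength data
xs = [i / 10.0 for i in range(21)]        # 0.0 .. 2.0
ys = [math.sin(x) for x in xs]
predict = elm_fit(xs, ys)
rmse = (sum((predict(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)) ** 0.5
```

Because no backpropagation iterations are needed, training cost is essentially one linear solve, which is the speed advantage the abstract refers to.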