
Showing papers on "Soft computing" published in 2020


Journal ArticleDOI
TL;DR: An overview of some soft computing techniques and their applications in underground excavations is presented, and a case study is adopted to compare the predictive performances of soft computing techniques, including eXtreme Gradient Boosting, Multivariate Adaptive Regression Splines, and Support Vector Machine, in estimating the maximum lateral wall deflection induced by braced excavation.
Abstract: Soft computing techniques are becoming ever more popular and are particularly amenable to modeling the complex behavior of most geotechnical engineering systems, since they have demonstrated superior predictive capacity compared to traditional methods. This paper presents an overview of some soft computing techniques as well as their applications in underground excavations. A case study is adopted to compare the predictive performances of soft computing techniques including eXtreme Gradient Boosting (XGBoost), Multivariate Adaptive Regression Splines (MARS), Artificial Neural Networks (ANN), and Support Vector Machine (SVM) in estimating the maximum lateral wall deflection induced by braced excavation. This study also discusses the merits and the limitations of some soft computing techniques, compared with the conventional approaches available.
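As a hedged illustration of this kind of model comparison (not the paper's excavation dataset or tuned models), the sketch below benchmarks three of the named model families on synthetic placeholder data, using scikit-learn's gradient boosting as a stand-in for XGBoost; the feature names and values are assumptions, and MARS is omitted.

```python
# Hypothetical comparison of soft computing regressors on synthetic stand-in data
# for braced-excavation wall deflection; not the paper's dataset or settings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
from sklearn.ensemble import GradientBoostingRegressor  # stand-in for XGBoost
from sklearn.neural_network import MLPRegressor          # ANN
from sklearn.svm import SVR                               # SVM

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 5))                 # placeholder geotechnical features
y = X @ np.array([2.0, -1.0, 0.5, 0.0, 1.5]) + 0.1 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
models = {
    "Gradient boosting": GradientBoostingRegressor(random_state=0),
    "ANN (MLP)": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
    "SVM (SVR)": SVR(C=10.0, epsilon=0.01),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name:20s} test RMSE = {rmse:.3f}")
```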

287 citations


Journal ArticleDOI
TL;DR: A comparative analysis of machine learning and soft computing models to predict the COVID-19 outbreak as an alternative to susceptible–infected–recovered (SIR) and susceptible-exposed-infectious-removed (SEIR) models suggests machine learning as an effective tool to model the outbreak.
Abstract: Several outbreak prediction models for COVID-19 are being used by officials around the world to make informed decisions and enforce relevant control measures. Among the standard models for COVID-19 global pandemic prediction, simple epidemiological and statistical models have received more attention by authorities, and these models are popular in the media. Due to a high level of uncertainty and lack of essential data, standard models have shown low accuracy for long-term prediction. Although the literature includes several attempts to address this issue, the essential generalization and robustness abilities of existing models need to be improved. This paper presents a comparative analysis of machine learning and soft computing models to predict the COVID-19 outbreak as an alternative to susceptible–infected–recovered (SIR) and susceptible-exposed-infectious-removed (SEIR) models. Among a wide range of machine learning models investigated, two models showed promising results (i.e., multi-layered perceptron, MLP; and adaptive network-based fuzzy inference system, ANFIS). Based on the results reported here, and due to the highly complex nature of the COVID-19 outbreak and variation in its behavior across nations, this study suggests machine learning as an effective tool to model the outbreak. This paper provides an initial benchmarking to demonstrate the potential of machine learning for future research. This paper further suggests that a genuine novelty in outbreak prediction can be realized by integrating machine learning and SEIR models.
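To illustrate only the MLP side of such a comparison, here is a minimal sketch that fits a multi-layer perceptron on lagged values of a synthetic outbreak-like curve; the series, lag count and network size are assumptions, and the ANFIS counterpart is not reproduced.

```python
# Hedged sketch: an MLP forecasting next values from a window of lagged case
# counts; the logistic-shaped series below is synthetic, not COVID-19 data.
import numpy as np
from sklearn.neural_network import MLPRegressor

days = np.arange(120)
cases = 1000.0 / (1.0 + np.exp(-(days - 60) / 8.0))   # synthetic cumulative curve

def make_lagged(series, n_lags=7):
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    return X, series[n_lags:]

X, y = make_lagged(cases)
mlp = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=1)
mlp.fit(X[:-14], y[:-14])                              # hold out the last 14 days
pred = mlp.predict(X[-14:])
print("hold-out mean absolute error:", np.abs(pred - y[-14:]).mean().round(2))
```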

256 citations


Journal ArticleDOI
TL;DR: A novel framework based on computer-aided diagnosis and IoT is suggested to detect and observe type-2 diabetes patients, and the recommended healthcare system aims to obtain better diagnostic accuracy from uncertain data.
Abstract: The Internet of Things (IoT) has gained importance with its growing applications in the fields of ubiquitous and context-aware computing. In IoT, anything, whether unintelligent objects or sensor nodes, can become part of the network; thus, widely differing kinds of services can be developed. In this regard, data storage, resource management, service creation and discovery, and resource and power management would facilitate advanced mechanisms and much better infrastructure. Cloud computing and fog computing play an important role when the quantity of IoT data and information is critical and would not be feasible for standalone, resource-constrained IoT devices to handle. The cloud of things, an integration of IoT with cloud computing or fog computing, can help realize the objectives of the evolving IoT and the future Internet. Fog computing extends the notion of cloud computing to the network edge, making it suitable for IoT and other applications that need real-time, latency-critical interactions. Despite the virtually unlimited resources and services offered by the cloud, such as intelligent building monitoring, it still faces various difficulties when many smart things enter people's lives. Mobility, response time, and location awareness are the most prominent problems. Fog and mobile edge computing have been established to overcome these difficulties of cloud computing. In this article, we suggest a novel framework based on computer-aided diagnosis and IoT to detect and observe type-2 diabetes patients. The recommended healthcare system aims to obtain better diagnostic accuracy from uncertain data. The overall experimental results indicate the validity and robustness of our proposed algorithms.

181 citations


Journal ArticleDOI
TL;DR: A novel GMDH method, called GMDH network based on extreme learning machine (GMDH-ELM), is proposed, in which the weighting coefficients of the quadratic polynomials used in conventional GMDH no longer need to be updated during the training stage, either with the back-propagation technique or with other evolutionary algorithms.
Abstract: The longitudinal dispersion coefficient (LDC) is one of the most important environmental variables and plays a key role in evaluating pollution profiles in water pipelines. Even though there is a wide range of numerical models for estimating the coefficient of longitudinal dispersion, these mathematical techniques often suffer from inaccuracies due to the complex mechanism of the convection-diffusion processes governing pollutant transport in water pipelines. In this research work, to obtain more accurate prediction of the LDC, the general structure of the group method of data handling (GMDH) is modified by means of extreme learning machine (ELM) concepts. Inspired by ELM, a novel GMDH method, called GMDH network based on extreme learning machine (GMDH-ELM), is proposed, in which the weighting coefficients of the quadratic polynomials used in conventional GMDH no longer need to be updated during the training stage, either with the back-propagation technique or with other evolutionary algorithms. Instead, an intermediate parameter is employed to establish a relationship between the input and output in each neuron of the GMDH model. A well-known and reliable dataset (233 experimental data) related to the LDC in water network pipelines is applied as the output vector to conduct the training and testing phases. Across the datasets, the Reynolds number, the average longitudinal flow velocity, the pipeline friction factor and the pipe diameter are considered as inputs of the proposed approach. The results of the GMDH-ELM model indicate a highly satisfying level of precision in both the training and testing phases. Furthermore, the feed-forward structure of the GMDH model was improved by particle swarm optimization (PSO) and the gravitational search algorithm (GSA) to predict the LDC. A comparison is drawn between the performance of GMDH-ELM and the other developed GMDH models, and several empirical equations from the literature are also applied for comparison. Overall, the results of GMDH-ELM show a clear superiority over the other soft computing tools and conventional predictive models.
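The ELM idea borrowed by GMDH-ELM, fixing random hidden-layer weights and solving only the output weights in closed form rather than by back-propagation, can be sketched as follows; the inputs, targets and network size are placeholders, not the 233-sample LDC dataset.

```python
# Minimal extreme learning machine (ELM) sketch: random hidden weights are fixed
# and output weights are obtained by least squares, with no iterative training.
import numpy as np

class ELMRegressor:
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))  # fixed random weights
        self.b = self.rng.normal(size=self.n_hidden)                # fixed random biases
        H = np.tanh(X @ self.W + self.b)                            # hidden activations
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)           # closed-form output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Hypothetical usage with four inputs standing in for Re, velocity, friction factor, diameter:
X = np.random.rand(200, 4)
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + np.random.normal(0, 0.01, 200)
model = ELMRegressor(n_hidden=40).fit(X, y)
print(model.predict(X[:5]))
```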

115 citations


Journal ArticleDOI
TL;DR: This article achieves MVS by integrating deep neural network based soft computing techniques in a two-tier framework that extracts deep features from each frame of a sequence in the lookup table and passes them to deep bidirectional long short-term memory (DB-LSTM) to acquire probabilities of informativeness and generates a summary.
Abstract: The massive amount of video data produced by surveillance networks in industries poses various challenges in exploring these videos for many applications, such as video summarization (VS), analysis, indexing, and retrieval. The task of multiview video summarization (MVS) is very challenging due to the gigantic size of the data, redundancy, overlapping views, lighting variations, and inter-view correlations. To address these challenges, various low-level features and clustering-based soft computing techniques have been proposed, but they cannot fully exploit MVS. In this article, we achieve MVS by integrating deep neural network based soft computing techniques in a two-tier framework. The first online tier performs target-appearance-based shot segmentation and stores the shots in a lookup table that is transmitted to the cloud for further processing. The second tier extracts deep features from each frame of a sequence in the lookup table and passes them to a deep bidirectional long short-term memory (DB-LSTM) network to acquire probabilities of informativeness and generate a summary. Experimental evaluation on a benchmark dataset and industrial surveillance data from YouTube confirms the better performance of our system compared to the state-of-the-art MVS methods.
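A hedged sketch of the second tier's core idea, scoring per-frame informativeness with a bidirectional LSTM over pre-extracted deep features, is given below; the feature dimension, hidden size and threshold are illustrative assumptions rather than the paper's configuration.

```python
# Hedged sketch: per-frame informativeness scoring with a bidirectional LSTM.
import torch
import torch.nn as nn

class FrameScorer(nn.Module):
    def __init__(self, feat_dim=2048, hidden=256):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)   # one informativeness logit per frame

    def forward(self, feats):                  # feats: (batch, n_frames, feat_dim)
        out, _ = self.lstm(feats)
        return torch.sigmoid(self.head(out)).squeeze(-1)  # probabilities in [0, 1]

# Hypothetical usage: one sequence of 300 frames with 2048-d CNN features.
scorer = FrameScorer()
probs = scorer(torch.randn(1, 300, 2048))
summary_idx = (probs[0] > 0.5).nonzero(as_tuple=True)[0]  # frames kept in the summary
print(summary_idx.shape)
```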

106 citations


Journal ArticleDOI
TL;DR: This study proposes four ensemble soft computing models based on logistic models for groundwater potential maps, which would help in the management of groundwater storage resources and provide real-time information about groundwater quality.
Abstract: Groundwater potential maps are one of the most important tools for the management of groundwater storage resources. In this study, we proposed four ensemble soft computing models based on logistic ...

106 citations


Journal ArticleDOI
TL;DR: This work presents a systematic literature review to collate, explore, understand and analyze the efforts and trends in a well-structured manner and to identify research gaps defining the future prospects of this coupling of soft computing techniques with sentiment analysis on Twitter.
Abstract: Sentiment detection and classification is the latest fad for social analytics on the Web. With an array of practical applications in healthcare, finance, media, consumer markets, and government, distilling the voice of the public to gain insight into target information and reviews is non-trivial. With a marked increase in the size, subjectivity, and diversity of social web data, the vagueness, uncertainty and imprecision within the information have increased manifold. Soft computing techniques have been used to handle this fuzziness in practical applications. This work is a study to understand the feasibility, scope and relevance of this alliance of using soft computing techniques for sentiment analysis on Twitter. We present a systematic literature review to collate, explore, understand and analyze the efforts and trends in a well-structured manner to identify research gaps defining the future prospects of this coupling. The contribution of this paper is significant because, firstly, the primary focus is to study and evaluate the use of soft computing techniques for sentiment analysis on Twitter and, secondly, compared to previous reviews, we adopt a systematic approach to identify, gather empirical evidence, interpret results, critically analyze, and integrate the findings of all relevant high-quality studies to address specific research questions pertaining to the defined research domain.

100 citations


Journal ArticleDOI
TL;DR: The proposed ANN-PSO-IPS is implemented for four variants of TONMS-EFEs, and comparison with exact solutions revealed its robustness, correctness and effectiveness, which is further authenticated through statistical explorations.
Abstract: In this study, a novel neuro-swarming computing solver is developed for the numerical treatment of the third-order nonlinear multi-singular Emden–Fowler equation (TONMS-EFE) by using the function approximation ability of artificial neural network (ANN) modeling and the global optimization mechanism of particle swarm optimization (PSO) integrated with the local search of an interior-point scheme (IPS), i.e., ANN-PSO-IPS. The inspiration for the design of the ANN-PSO-IPS-based numerical solver comes from the objective of presenting a reliable, accurate and viable structure that combines the strength of ANNs optimized within integrated soft computing frameworks to deal with such challenging systems based on the TONMS-EFE. The proposed ANN-PSO-IPS is implemented for four variants of TONMS-EFEs, and comparison with exact solutions revealed its robustness, correctness and effectiveness, which is further authenticated through statistical explorations.
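The swarm component of such solvers can be illustrated with a standard global-best PSO; in the paper the decision variables would be the ANN weights and the fitness the mean-squared residual of the trial solution at collocation points (later refined by the interior-point step), whereas the hedged sketch below simply minimizes a test function.

```python
# Minimal global-best PSO, illustrating only the swarm-optimization component.
import numpy as np

def pso(fitness, dim, n_particles=30, iters=300, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest, pbest_val = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([fitness(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Stand-in fitness; in the neuro-swarming setting this would be the ODE residual.
rastrigin = lambda z: 10 * z.size + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))
best, best_val = pso(rastrigin, dim=5)
print(best_val)   # the swarm's best would then be refined by a local search step
```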

97 citations


Journal ArticleDOI
TL;DR: Different strategies are investigated for the synthesis of a FIS by means of a hierarchical Genetic Algorithm, with the aim of maximizing the profit generated by the energy exchange with the grid, assuming a Time Of Use (TOU) energy price policy, while at the same time reducing the complexity of the EMS rule-base system.

97 citations


Journal ArticleDOI
TL;DR: A state-of-the-art review of the main applications of soft computing techniques to relevant structural and earthquake engineering problems is proposed, including the applications of fuzzy computing, evolutionary computing, swarm intelligence, and neural networks.

88 citations


Journal ArticleDOI
TL;DR: A review is provided of the different controllers utilized for LFM in traditional as well as renewable energy-based power systems, such as classical controllers, fractional-order controllers, cascaded controllers, sliding mode controllers (SMC), tilt-integral-derivative controllers, H-infinity controllers and other recently developed controllers.

Journal ArticleDOI
12 Jan 2020-Energies
TL;DR: Clear insight is presented supporting the suitability of MPPT techniques for different types of converter configurations.
Abstract: Solar photovoltaic (PV) systems are attracting a huge focus in the current energy scenario. Various maximum power point tracking (MPPT) methods are used in solar PV systems in order to achieve maximum power. In this article, a clear analysis of conventional MPPT techniques such as variable step size perturb and observe (VSS-P&O), modified incremental conductance (MIC), fractional open circuit voltage (FOCV) has been carried out. In addition, the soft computing MPPT techniques such as fixed step size radial basis functional algorithm (FSS-RBFA), variable step size radial basis functional algorithm (VSS-RBFA), adaptive fuzzy logic controller (AFLC), particle swarm optimization (PSO), and cuckoo search (CS) MPPT techniques are presented along with their comparative analysis. The comparative analysis is done under static and dynamic irradiation conditions by considering algorithm complexity, tracking speed, oscillations at MPP, and sensing parameters. The single-diode model PV panel and double-diode model PV panel are also compared in terms of fill factor (FF) and maximum power extraction. Clear insight is presented supporting the suitability of MPPT techniques for different types of converter configurations.
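For reference, the fixed-step perturb-and-observe update that the variable-step and soft computing MPPT methods build upon can be sketched as follows; the step size and variable names are illustrative assumptions.

```python
# Hedged sketch of one iteration of classic fixed-step perturb-and-observe MPPT.
def perturb_and_observe(v, i, v_prev, p_prev, step=0.5):
    """Return the new voltage reference and the updated (v, p) memory."""
    p = v * i
    dp, dv = p - p_prev, v - v_prev
    if dp == 0:
        v_ref = v                      # at (or oscillating around) the MPP
    elif (dp > 0) == (dv > 0):
        v_ref = v + step               # power rose in this direction: keep perturbing
    else:
        v_ref = v - step               # power fell: reverse the perturbation
    return v_ref, v, p

# Usage inside a converter control loop (v, i measured at the PV panel terminals):
# v_ref, v_prev, p_prev = perturb_and_observe(v, i, v_prev, p_prev)
```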

Journal ArticleDOI
TL;DR: From the survey, it is apparent that the application of ANNs is vital in this field and will continue in the future for the intelligent interpretation of oil reservoirs.

Journal ArticleDOI
TL;DR: The aim is to present the state of the art of the concepts, applications, and theories associated with the digital image processing and soft computing methods for the identification and classification of diseases from the leaf of the plant.
Abstract: A real-time decision support system can enhance crop or plant growth, thereby increasing productivity, quality, and economic value. It also helps serve nature by supervising plant growth and maintaining environmental balance. Computer vision techniques have proven to play an important role in a number of applications such as medicine, defense, agriculture, remote sensing, business analysis, etc. The use of digital image processing methods to simulate the visual capability of the human being has proven to be a dynamic feature of smart or precision agriculture. This concept has enabled the automatic monitoring and protection of plants, cultivation, disease management, water management, etc., to increase crop productivity and quality. In this paper, we have surveyed a number of articles that adopt the concepts of computer vision and soft computing methods for the identification and classification of diseases from the leaf of the plant. Our aim is to present the state of the art of the concepts, applications, and theories associated with digital image processing and soft computing methodologies. The various outcomes are discussed separately.

Journal ArticleDOI
TL;DR: A novel neuro-swarming based heuristic solver is established for the numerical solutions of fourth-order multi-singular nonlinear Emden–Fowler (FO-MS-NEF) model using the function estimate capability of artificial neural networks (ANNs) modelling together with the global application of particle swarm optimization (PSO) enhanced by local search active set (AS) approach.
Abstract: In the present work, a novel neuro-swarming based heuristic solver is established for the numerical solution of the fourth-order multi-singular nonlinear Emden–Fowler (FO-MS-NEF) model, using the function estimation capability of artificial neural network (ANN) modelling together with the global search of particle swarm optimization (PSO) enhanced by the local search of an active set (AS) approach, i.e., the ANN-PSO-AS solver. The design motivation for the ANN-PSO-AS scheme as a numerical solver originates from the intention to present a viable, consistent and precise configuration that combines the strength of ANNs under the optimization of unified soft computing backgrounds to tackle such challenging models based on the FO-MS-NEF equation. The proposed ANN-PSO-AS solver is applied to three different variants of FO-MS-NEF equations. The comparison of the obtained results with the true solutions confirmed its correctness, effectiveness, and robustness, which is further validated with in-depth statistical investigations.

Journal ArticleDOI
05 Mar 2020
TL;DR: The paper presents the performance analysis of the evolutionary algorithms such as Genetic Algorithm, Ant lion optimization, Particle swarm optimization and Ant colony optimization and evaluates the routes obtained using the fuzzy Petri Net model to find out the optimal route for the wireless sensor networks.
Abstract: Routing in wireless sensor networks plays a significant part in enhancing the functioning of the network, as improper routing methodologies deplete the energy of the sensor nodes, affecting the lifespan of the network and leading to link failures and connectivity problems. The necessity to improve network performance is the main focus of this paper, so it presents a performance analysis of evolutionary algorithms such as the Genetic Algorithm, Ant Lion Optimization, Particle Swarm Optimization and Ant Colony Optimization, and evaluates the routes obtained using the fuzzy Petri net model to find the optimal route for wireless sensor networks. Simulations in Python validate the optimal path in terms of energy, network lifetime and packet delivery ratio.

Journal ArticleDOI
01 Aug 2020
TL;DR: In this survey, the literature related to the use of multi-objective meta-heuristics in intelligent control focused on the controller tuning problem is reviewed and discussed.
Abstract: Multi-objective optimization has been adopted in many engineering problems where a set of requirements must be met to generate successful applications. Among them, there are the tuning problems from control engineering, which are focused on the correct setting of the controller parameters to properly govern complex dynamic systems to satisfy desired behaviors such as high accuracy, efficient energy consumption, low cost, among others. These requirements are stated in a multi-objective optimization problem to find the most suitable controller parameters. Nevertheless, these parameters are tough to find because of the conflicting control performance requirements (i.e., a requirement cannot be met without harming the others). Hence, the use of techniques from computational intelligence and soft computing is necessary to solve multi-objective problems and handle the trade-offs among control performance objectives. Meta-heuristics have shown to obtain outstanding results when solving complex multi-objective problems at a reasonable computational cost. In this survey, the literature related to the use of multi-objective meta-heuristics in intelligent control focused on the controller tuning problem is reviewed and discussed.

Journal ArticleDOI
TL;DR: Numerical tests demonstrate that the point forecasts obtained from the proposed hybrid intelligent model can be effectively used to quantify PV power uncertainty and the performance of these two uncertainty quantification methods is assessed through reliability.
Abstract: This paper presents two probabilistic approaches based on bootstrap method and quantile regression (QR) method to estimate the uncertainty associated with solar photovoltaic (PV) power point forecasts. Solar PV output power forecasts are obtained using a hybrid intelligent model, which is composed of a data filtering technique based on wavelet transform (WT) and a soft computing model (SCM) based on radial basis function neural network (RBFNN) that is optimized by particle swarm optimization (PSO) algorithm. The point forecast capability of the proposed hybrid WT+RBFNN+PSO intelligent model is examined and compared with other hybrid models as well as individual SCM. The performance of the proposed bootstrap method in the form of probabilistic forecasts is compared with the QR method by generating different prediction intervals (PIs). Numerical tests using real data demonstrate that the point forecasts obtained from the proposed hybrid intelligent model can be effectively used to quantify PV power uncertainty. The performance of these two uncertainty quantification methods is assessed through reliability.
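A minimal sketch of the quantile-regression route to prediction intervals, assuming synthetic data and gradient-boosted quantile models in place of the paper's WT+RBFNN+PSO forecaster, might look like this:

```python
# Hedged sketch: prediction intervals from two quantile regressors; the data below
# is synthetic and stands in for PV power with input-dependent uncertainty.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(1000, 3))                   # e.g. irradiance, temperature, hour
y = 5 * X[:, 0] + rng.normal(0, 0.5 + X[:, 0], 1000)    # heteroscedastic "PV power"

lower = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X, y)
upper = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, y)

X_new = X[:5]
pi_low, pi_high = lower.predict(X_new), upper.predict(X_new)
coverage = np.mean((lower.predict(X) <= y) & (y <= upper.predict(X)))
print("90% PI for first samples:", list(zip(pi_low.round(2), pi_high.round(2))))
print("empirical coverage (reliability):", round(coverage, 3))
```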

Posted ContentDOI
22 Apr 2020-medRxiv
TL;DR: A comparative analysis of machine learning and soft computing models to predict the COVID-19 outbreak suggests machine learning as an effective tool to model the outbreak.
Abstract: Several outbreak prediction models for COVID-19 are being used by officials around the world to make informed decisions and enforce relevant control measures. Among the standard models for COVID-19 global pandemic prediction, simple epidemiological and statistical models have received more attention by authorities, and they are popular in the media. Due to a high level of uncertainty and lack of essential data, standard models have shown low accuracy for long-term prediction. Although the literature includes several attempts to address this issue, the essential generalization and robustness abilities of existing models need to be improved. This paper presents a comparative analysis of machine learning and soft computing models to predict the COVID-19 outbreak. Among a wide range of machine learning models investigated, two models showed promising results (i.e., multi-layered perceptron, MLP, and adaptive network-based fuzzy inference system, ANFIS). Based on the results reported here, and due to the highly complex nature of the COVID-19 outbreak and variation in its behavior from nation to nation, this study suggests machine learning as an effective tool to model the outbreak.

Journal ArticleDOI
05 Jan 2020-Sensors
TL;DR: A neuron-based Kalman filter is proposed, in which neuro units are introduced, and it is shown that the filter is effective for noise elimination within a soft computing solution.
Abstract: The control effect of various intelligent terminals is affected by data sensing precision, and filtering has been the typical soft computing method used to improve the sensing level. Because the practical system is difficult to identify and the traditional Kalman filter relies on empirically estimated parameters, a neuron-based Kalman filter is proposed in this paper. Firstly, the framework of the improved Kalman filter was designed, in which neuro units were introduced. Secondly, the functions of the neuro units were derived with a nonlinear autoregressive model; the neuro units optimize the filtering process to reduce the effect of the impractical system model and hypothetical parameters. Thirdly, an adaptive filtering algorithm was proposed based on the new Kalman filter. Finally, the filter was verified with simulated signals and practical measurements. The results proved that the filter is effective for noise elimination within a soft computing solution.
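For context, the scalar Kalman filter baseline that such a neuro-augmented variant builds on can be sketched as below; the process and measurement noise variances are assumed values, and the paper's neuro units are not reproduced.

```python
# Minimal scalar Kalman filter for sensor smoothing (constant-state model).
import numpy as np

def kalman_1d(measurements, q=1e-2, r=0.09, x0=0.0, p0=1.0):
    """Model: x_k = x_{k-1} + w (var q), z_k = x_k + v (var r)."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: state unchanged, uncertainty grows
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the innovation
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

true_signal = np.sin(np.linspace(0, 2 * np.pi, 200))
noisy = true_signal + np.random.normal(0, 0.3, 200)
filtered = kalman_1d(noisy)
print("RMS error before/after filtering:",
      np.std(noisy - true_signal).round(3), np.std(filtered - true_signal).round(3))
```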

Journal ArticleDOI
TL;DR: In this paper, a review of global maximum power point tracking (GMPPT) methods for photovoltaic (PV) systems under partial shading conditions is presented, focusing on the improvement achieved by the conventional MPPT (perturb and observe, hill climbing, and incremental conductance).
Abstract: This review covers global maximum power point tracking (GMPPT) methods for photovoltaic (PV) systems under partial shading conditions. Unlike the previous review works that primarily focused on soft computing and hybrid GMPPT, this study gives exclusive attention to the improvement achieved by the conventional MPPT (perturb and observe, hill climbing, and incremental conductance). The improved methods include the popular 0.8 × Voc model and, more recently, the skipping algorithms. In addition to providing qualitative descriptions of the available techniques, this work also attempts to provide a fair evaluation of GMPPT to determine their comparative performances. The competing algorithms, which are selected to represent every category (conventional, soft computing and hybrid MPPT), are benchmarked under carefully selected operating conditions and shading scenarios. The evaluation is focused on four main criteria: tracking accuracy, convergence time, length of voltage fluctuations, and transient efficiency during the search for the global maximum power point. The results obtained from this study can become a basis for researchers and designers to select the best MPPT technique for their respective applications.

Journal ArticleDOI
11 Sep 2020
TL;DR: The aim of this paper is to introduce the notion of soft multi-set topology (SMS-topology) defined on a soft multi-set (SMS), and multi-criteria decision-making (MCDM) algorithms with aggregation operators based on SMS-topology are established.
Abstract: The aim of this paper is to introduce the notion of soft multi-set topology (SMS-topology) defined on a soft multi-set (SMS). Soft multi-sets and soft multi-set topology are fundamental tools in computational intelligence, which have a large number of applications in soft computing, fuzzy modeling and decision-making under uncertainty. The idea of power whole multi-subsets of an SMS is defined to explore various rudimentary properties of SMS-topology. Certain properties of SMS-topology, such as the SMS-basis, the MS-subspace, the SMS-interior, the SMS-closure and the boundary of an SMS, are explored. Furthermore, multi-criteria decision-making (MCDM) algorithms with aggregation operators based on SMS-topology are established. Algorithms i (i = 1, 2, 3) are developed for the selection of the best alternative for biopesticides, for the selection of the best textile company, and for the award of performance, respectively. Some real-life applications of the proposed algorithms to MCDM problems are illustrated by numerical examples. The reliability and feasibility of the proposed MCDM techniques are shown by a comparative analysis with some existing techniques.

Journal ArticleDOI
TL;DR: To design and manufacture a portable stress monitoring system, based on photoplethysmography and galvanic skin response physiological signals, acquired by wearable sensors, a novel algorithm for continuous measurement of the stress index (SI) as well as the classification of stress levels is proposed.
Abstract: Stress is an issue that everyone experiences in today's modern life. Prolonged exposure to stress can cause many mental and physical diseases. Accordingly, the stress management issue has become popular, and the need for personal healthcare devices has increased in recent years. Therefore, the aim of this research is to design and manufacture a portable stress monitoring system based on photoplethysmography (PPG) and galvanic skin response (GSR) physiological signals acquired by wearable sensors. To do so, we proposed a novel algorithm for continuous measurement of the stress index (SI) as well as the classification of stress levels. In order to estimate an accurate value for the SI, various soft computing algorithms such as support vector regression, artificial neural networks (ANN), and the adaptive neuro-fuzzy inference system (ANFIS) were adopted for modeling the stress based on features extracted from normalized and non-normalized types of PPG and GSR signals and their combinations. Furthermore, K-nearest neighbors (KNN), ANNs, Naive Bayes, and support vector machines (SVM) were utilized to discriminate different levels of stress in subjects. The obtained results indicate that the ANFIS algorithm can estimate the SI training output with a correlation coefficient (CC) of 0.9281 and an average relative error of 0.23 on a subset of the combined features of the PPG and GSR signals. Also, the best classification performance was obtained with the KNN (K = 3) algorithm, with 85.3% accuracy. To evaluate the developed system, data of 16 subjects outside the training dataset, who participated in the experiment in the presence of experts and psychologists, were used. An average CC of 0.81 and a classification accuracy of 75% were obtained using the implemented ANFIS model and KNN classifier.
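A hedged sketch of the classification stage alone, a K-nearest-neighbour (K = 3) classifier over stress-related features, is shown below; the feature matrix and labels are random placeholders, and the feature extraction and ANFIS regression steps are not reproduced.

```python
# Hedged sketch: KNN (K=3) classification of stress levels from placeholder features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
X = rng.normal(size=(300, 10))          # stand-in for PPG/GSR-derived features
y = rng.integers(0, 3, size=300)        # three stress levels (low/medium/high)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
clf.fit(X_tr, y_tr)
print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))  # ~chance on random data
```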

Journal ArticleDOI
TL;DR: Experimental results and comparison with other state-of-the-art approaches highlight the superiority and efficacy of the proposed fuzzy RL technique for transformer fault classification.
Abstract: In this work, a fuzzy reinforcement learning (RL) based intelligent classifier for power transformer incipient faults is proposed. Fault classifiers proposed to date have low identification accuracy and do not identify all types of transformer faults. Herein, an attempt has been made to design an adaptive, intelligent transformer fault classifier that progressively learns to identify faults on-line with high accuracy for all fault types. In the proposed approach, dissolved gas analysis (DGA) data of oil samples collected from real power transformers (and from credible sources) has been used, which serves as input to a fuzzy RL based classifier. Typically, classification accuracy is heavily dependent on the number of input variables chosen. This has been resolved by using the J48 algorithm to select the 8 most appropriate input variables from the 24 variables obtained using DGA. The proposed fuzzy RL approach achieves a fault identification accuracy of 99.7%, which is significantly higher than other contemporary soft computing based identifiers. Experimental results and comparison with other state-of-the-art approaches highlight the superiority and efficacy of the proposed fuzzy RL technique for transformer fault classification.

Journal ArticleDOI
TL;DR: This study has investigated rainfall-induced landslide susceptibility of the Uttarkashi district of India through the development of different novel GIS-based soft computing approaches namel...
Abstract: In this study, we have investigated rainfall-induced landslide susceptibility of the Uttarkashi district of India through the development of different novel GIS-based soft computing approaches namel...

Journal ArticleDOI
TL;DR: A survey of recent research publications that use Soft Computing methods to answer education-related problems based on the analysis of educational data ‘mined’ mainly from interactive/e-learning systems finds that top research questions in education today seeking answers through soft computing methods refer directly to the issue of quality.
Abstract: The aim of this paper is to survey recent research publications that use Soft Computing methods to answer education-related problems based on the analysis of educational data ‘mined’ mainly from interactive/e-learning systems. Such systems are known to generate and store large volumes of data that can be exploited to assess the learner, the system and the quality of the interaction between them. Educational Data Mining (EDM) and Learning Analytics (LA) are two distinct and yet closely related research areas that focus on this data aiming to address open education-related questions or issues. Besides ‘classic’ data analysis methods such as clustering, classification, identification or regression/analysis of variances, soft computing methods are often employed by EDM and LA researchers to achieve their various tasks. Their very nature as iterative optimization algorithms that avoid the exhaustive search of the solutions space and go for possibly suboptimal solutions yet at realistic time and effort, along with their heavy reliance on rich data sets for training, make soft computing methods ideal tools for the EDM or LA type of problems. Decision trees, random forests, artificial neural networks, fuzzy logic, support vector machines and genetic/evolutionary algorithms are a few examples of soft computing approaches that, given enough data, can successfully deal with uncertainty, qualitatively stated problems and incomplete, imprecise or even contradictory data sets – features that the field of education shares with all humanities/social sciences fields. The present review focuses, therefore, on recent EDM and LA research that employs at least one soft computing method, and aims to identify (i) the major education problems/issues addressed and, consequently, research goals/objectives set, (ii) the learning contexts/settings within which relevant research and educational interventions take place, (iii) the relation between classic and soft computing methods employed to solve specific problems/issues, and (iv) the means of dissemination (publication journals) of the relevant research results. Selection and analysis of a body of 300 journal publications reveals that top research questions in education today seeking answers through soft computing methods refer directly to the issue of quality – a critical issue given the currently dominant educational/pedagogical models that favor e-learning or computer- or technology-mediated learning contexts. Moreover, results identify the most frequently used methods and tools within EDM/LA research and, comparatively, within their soft computing subsets, along with the major journals relevant research is being published worldwide. Weaknesses and issues that need further attention in order to fully exploit the benefits of research results to improve both the learning experience and the learning outcomes are discussed in the conclusions.

Journal ArticleDOI
TL;DR: The experimental results show that the HSCRKM method efficiently segments the nucleus, and it is also inferred that logistic regression and neural network perform better than other prediction algorithms.
Abstract: Segmenting an image of a nucleus is one of the most essential tasks in a leukemia diagnostic system. Accurate and rapid segmentation methods help the physicians identify the diseases and provide better treatment at the appropriate time. Recently, hybrid clustering algorithms have started being widely used for image segmentation in medical image processing. In this article, a novel hybrid histogram-based soft covering rough k-means clustering (HSCRKM) algorithm for leukemia nucleus image segmentation is discussed. This algorithm combines the strengths of a soft covering rough set and rough k-means clustering. The histogram method was utilized to identify the number of clusters to avoid random initialization. Different types of features such as gray level co-occurrence matrix (GLCM), color, and shape-based features were extracted from the segmented image of the nucleus. Machine learning prediction algorithms were applied to classify the cancerous and non-cancerous cells. The proposed strategy is compared with an existing clustering algorithm, and the efficiency is evaluated based on the prediction metrics. The experimental results show that the HSCRKM method efficiently segments the nucleus, and it is also inferred that logistic regression and neural network perform better than other prediction algorithms.
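The plain k-means portion of the segmentation idea can be sketched as below, clustering pixel colours and keeping the darkest cluster as a nucleus mask; the image path is a placeholder, and the histogram-based cluster-count selection and soft covering rough set extension of HSCRKM are not reproduced.

```python
# Hedged sketch: colour-based k-means segmentation of a stained-smear image,
# keeping the darkest cluster as a rough nucleus mask (a simplifying heuristic).
import numpy as np
from sklearn.cluster import KMeans
from PIL import Image

img = np.asarray(Image.open("blood_smear.png").convert("RGB"), dtype=float) / 255.0
h, w, _ = img.shape
pixels = img.reshape(-1, 3)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
labels = kmeans.labels_.reshape(h, w)

nucleus_cluster = kmeans.cluster_centers_.mean(axis=1).argmin()  # darkest cluster
nucleus_mask = labels == nucleus_cluster
print("nucleus pixels:", int(nucleus_mask.sum()), "of", h * w)
```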

Journal ArticleDOI
TL;DR: The study emphasizes the most general methods used by researchers in the literature for developing statistical and mathematical models using soft computing approaches, including genetic algorithms, response surface methodology, fuzzy logic, artificial neural networks, the Taguchi method and particle swarm optimization.
Abstract: In this paper, a wide literature review of soft computing methods in conventional machining processes of metal matrix composites is carried out. Tool wear, cutting force and surface quality are presented for the different types of machining processes and examined thoroughly. Summaries of the particular soft computing approaches in machining operations such as turning, milling, drilling and grinding are thoroughly discussed. Furthermore, this work puts emphasis on the optimization and modeling of the machining process. The study emphasizes the most general methods used by researchers in the literature for developing statistical and mathematical models using soft computing approaches, including genetic algorithms, response surface methodology, fuzzy logic, artificial neural networks, the Taguchi method and particle swarm optimization. In the last section, comprehensive open issues and conclusions are presented for the application of soft computing techniques to performance prediction and optimization in the machining of metal matrix composites.

Posted ContentDOI
19 Apr 2020
TL;DR: A comparative analysis of machine learning and soft computing models to predict the COVID-19 outbreak as an alternative to susceptible–infected–recovered (SIR) and susceptible-exposed-infectious-removed (SEIR) models suggests machine learning as an effective tool to model the outbreak.
Abstract: Several outbreak prediction models for COVID-19 are being used by officials around the world to make informed decisions and enforce relevant control measures. Among the standard models for COVID-19 global pandemic prediction, simple epidemiological and statistical models have received more attention by authorities, and these models are popular in the media. Due to a high level of uncertainty and lack of essential data, standard models have shown low accuracy for long-term prediction. Although the literature includes several attempts to address this issue, the essential generalization and robustness abilities of existing models need to be improved. This paper presents a comparative analysis of machine learning and soft computing models to predict the COVID-19 outbreak as an alternative to susceptible–infected–recovered (SIR) and susceptible-exposed-infectious-removed (SEIR) models. Among a wide range of machine learning models investigated, two models showed promising results (i.e., multi-layered perceptron, MLP; and adaptive network-based fuzzy inference system, ANFIS). Based on the results reported here, and due to the highly complex nature of the COVID-19 outbreak and variation in its behavior across nations, this study suggests machine learning as an effective tool to model the outbreak. This paper provides an initial benchmarking to demonstrate the potential of machine learning for future research. This paper further suggests that a genuine novelty in outbreak prediction can be realized by integrating machine learning and SEIR models.

Journal ArticleDOI
TL;DR: The proposed GA-BSTSM model yielded a robust performance with high accuracy for AOp prediction, and it was found that meteorological factors, especially RH and WS, have a strong influence on the accuracy of AOp predictive models.