Author

Aboul Ella Hassanien

Bio: Aboul Ella Hassanien is an academic researcher from Cairo University. The author has contributed to research topics including particle swarm optimization and feature extraction. The author has an h-index of 60 and has co-authored 930 publications receiving 16,382 citations. Previous affiliations of Aboul Ella Hassanien include Mansoura University and Beni-Suef University.


Papers
Journal ArticleDOI
TL;DR: Results prove the capability of the proposed binary version of grey wolf optimization (bGWO) to search the feature space for optimal feature combinations regardless of initialization and of the stochastic operators used.
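The sketch below illustrates the kind of wrapper feature selection a binary grey wolf optimizer performs: positions are binary feature masks, the continuous GWO update is squashed through a sigmoid transfer function, and the fitness mixes classification error with the fraction of selected features. The KNN wrapper, the sigmoid transfer step, and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical bGWO feature-selection sketch (not the authors' implementation).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(mask, X, y, alpha=0.99):
    # Weighted sum of classification error and fraction of selected features.
    if mask.sum() == 0:
        return 1.0
    clf = KNeighborsClassifier(n_neighbors=5)
    err = 1.0 - cross_val_score(clf, X[:, mask == 1], y, cv=3).mean()
    return alpha * err + (1.0 - alpha) * mask.sum() / mask.size

def bgwo(X, y, n_wolves=8, n_iter=30, seed=0):
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    wolves = rng.integers(0, 2, size=(n_wolves, dim))
    scores = np.array([fitness(w, X, y) for w in wolves])
    for t in range(n_iter):
        alpha_w, beta_w, delta_w = wolves[np.argsort(scores)[:3]]
        a = 2.0 - 2.0 * t / n_iter               # linearly decreasing coefficient
        for i in range(n_wolves):
            step = np.zeros(dim)
            for leader in (alpha_w, beta_w, delta_w):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2.0 * a * r1 - a, 2.0 * r2
                step += leader - A * np.abs(C * leader - wolves[i])
            prob = 1.0 / (1.0 + np.exp(-step / 3.0))   # sigmoid transfer to [0, 1]
            wolves[i] = (rng.random(dim) < prob).astype(int)
            scores[i] = fitness(wolves[i], X, y)
    best = np.argmin(scores)
    return wolves[best], scores[best]
```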

958 citations

Journal ArticleDOI
TL;DR: A solid intuition is built for what LDA is and how it works, enabling readers of all levels to gain a better understanding of LDA and to know how to apply the technique in different applications.
Abstract: Linear Discriminant Analysis (LDA) is a very common technique for dimensionality reduction problems, used as a preprocessing step for machine learning and pattern classification applications. At the same time, it is usually treated as a black box and is sometimes not well understood. The aim of this paper is to build a solid intuition for what LDA is and how it works, enabling readers of all levels to gain a better understanding of LDA and to know how to apply the technique in different applications. The paper first gives the basic definitions and steps of the LDA technique, supported with visual explanations of these steps. Moreover, the two methods of computing the LDA space, i.e. the class-dependent and class-independent methods, are explained in detail. Then, in a step-by-step approach, two numerical examples demonstrate how the LDA space can be calculated in the class-dependent and class-independent cases. Furthermore, two of the most common LDA problems (the Small Sample Size (SSS) problem and non-linearity) are highlighted and illustrated, and state-of-the-art solutions to these problems are investigated and explained. Finally, a number of experiments are conducted with different datasets to (1) investigate the effect of the eigenvectors used to build the LDA space on the robustness of the extracted features and the classification accuracy, and (2) show when the SSS problem occurs and how it can be addressed.
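As a rough illustration of the class-independent computation described above, the sketch below builds the within-class and between-class scatter matrices and takes the leading eigenvectors of S_W^{-1} S_B as the LDA projection. The names and the pseudo-inverse guard against a singular S_W (the SSS case) are assumptions, not one of the paper's proposed solutions.

```python
# Illustrative class-independent LDA sketch (names and pinv guard are assumptions).
import numpy as np

def lda_fit(X, y, n_components):
    classes = np.unique(y)
    mean_total = X.mean(axis=0)
    d = X.shape[1]
    S_W = np.zeros((d, d))                       # within-class scatter
    S_B = np.zeros((d, d))                       # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_W += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        S_B += Xc.shape[0] * (diff @ diff.T)
    # Eigen-decomposition of S_W^{-1} S_B; pinv avoids failure when S_W is singular.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_W) @ S_B)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:n_components]].real  # projection matrix W

# Usage: project data onto at most C-1 discriminant axes.
# W = lda_fit(X, y, n_components=len(np.unique(y)) - 1)
# X_lda = X @ W
```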

518 citations

Journal ArticleDOI
TL;DR: The Chaotic Whale Optimization Algorithm (CWOA) is proposed, using chaotic maps to compute and automatically adapt the internal parameters of the optimization algorithm for the parameter estimation of solar cells.
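The sketch below illustrates the basic idea of a chaotic WOA: a logistic map replaces one of the uniform random numbers that drive the coefficient update. The particular map, the coefficient it replaces, the simplified (spiral-free) update, and the placeholder objective are all assumptions, not the paper's exact CWOA.

```python
# Illustrative chaotic-WOA sketch (simplified update; not the paper's exact CWOA).
import numpy as np

def logistic_map(x):
    return 4.0 * x * (1.0 - x)                   # chaotic sequence in (0, 1)

def cwoa_minimize(obj, dim, bounds, n_whales=20, n_iter=200, seed=1):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_whales, dim))
    best = pos[np.argmin([obj(p) for p in pos])].copy()
    chaos = 0.7                                  # initial chaotic value
    for t in range(n_iter):
        a = 2.0 - 2.0 * t / n_iter
        for i in range(n_whales):
            chaos = logistic_map(chaos)          # chaotic value instead of rng.random()
            A = 2.0 * a * chaos - a
            C = 2.0 * rng.random()
            if abs(A) < 1.0:                     # exploit: encircle the best solution
                pos[i] = best - A * np.abs(C * best - pos[i])
            else:                                # explore: move relative to a random whale
                rand = pos[rng.integers(n_whales)]
                pos[i] = rand - A * np.abs(C * rand - pos[i])
            pos[i] = np.clip(pos[i], lo, hi)
            if obj(pos[i]) < obj(best):
                best = pos[i].copy()
    return best, obj(best)

# Placeholder usage (a sphere function stands in for a solar-cell error model):
# best, val = cwoa_minimize(lambda p: float(np.sum(p ** 2)), dim=5, bounds=(-10, 10))
```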

465 citations

Journal ArticleDOI
26 May 2011 - Sensors
TL;DR: The important role of body sensor networks in medicine is explained: minimizing the need for caregivers and helping chronically ill and elderly people live independent lives while still receiving quality care.
Abstract: Wireless sensor network (WSN) technologies are considered one of the key research areas in computer science and the healthcare application industries for improving the quality of life. The purpose of this paper is to provide a snapshot of current developments and future directions of research on wearable and implantable body area network systems for continuous monitoring of patients. The paper explains the important role of body sensor networks in medicine in minimizing the need for caregivers and helping chronically ill and elderly people live independent lives, while also providing quality care. It gives several examples of state-of-the-art technology together with design considerations such as unobtrusiveness, scalability, energy efficiency, and security, and provides a comprehensive analysis of the benefits and drawbacks of these systems. Although they offer significant benefits, wearable and implantable body sensor networks still face major challenges and open research problems, which are investigated and covered, along with some proposed solutions, in this paper.

461 citations

Journal ArticleDOI
TL;DR: The experimental results showed that the proposed methods outperformed the other swarm algorithms; in addition, MFO showed better results than WOA and provided a good balance between exploration and exploitation in all images at both small and large numbers of thresholds.
Abstract: Two metaheuristic algorithms (WOA and MFO) are used. These algorithms are applied to multilevel thresholding image segmentation. MFO and WOA are better than the compared algorithms. MFO is better than WOA for higher numbers of thresholds. Determining the optimal thresholds for image segmentation has received more attention in recent years since it has many applications. There are several methods used to find the optimal threshold values, such as Otsu- and Kapur-based methods. These methods are suitable for the bi-level thresholding case and can easily be extended to the multilevel case; however, the process of determining the optimal thresholds in the multilevel case is time-consuming. To avoid this problem, this paper examines the ability of two nature-inspired algorithms, namely the Whale Optimization Algorithm (WOA) and Moth-Flame Optimization (MFO), to determine the optimal multilevel thresholds for image segmentation. The MFO algorithm is inspired by the natural behavior of moths, which have a special navigation style at night since they fly using the moonlight, whereas the WOA algorithm emulates the natural cooperative behavior of whales. The candidate solutions in the adapted algorithms were created using the image histogram and then updated based on the characteristics of each algorithm. The solutions are assessed using Otsu's fitness function during the optimization operation. The performance of the proposed algorithms has been evaluated on several benchmark images and compared with five different swarm algorithms. The results have been analyzed based on the best fitness values, PSNR, and SSIM measures, as well as time complexity and the ANOVA test. The experimental results showed that the proposed methods outperformed the other swarm algorithms; in addition, MFO showed better results than WOA and provided a good balance between exploration and exploitation in all images at both small and large numbers of thresholds.
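As a concrete illustration of the objective being optimized, the sketch below evaluates Otsu's between-class variance for a candidate threshold vector directly on a grey-level histogram; the function and variable names are illustrative, and any metaheuristic (WOA, MFO, or another swarm algorithm) could maximize this score.

```python
# Illustrative Otsu fitness for multilevel thresholding (names are assumptions).
import numpy as np

def otsu_between_class_variance(hist, thresholds):
    # hist: 256-bin grey-level histogram; thresholds: candidate grey levels.
    p = hist / hist.sum()
    levels = np.arange(len(p))
    mu_total = (p * levels).sum()
    edges = [0] + sorted(int(t) for t in thresholds) + [len(p)]
    variance = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()                           # class probability
        if w == 0:
            continue
        mu = (p[lo:hi] * levels[lo:hi]).sum() / w    # class mean grey level
        variance += w * (mu - mu_total) ** 2         # between-class contribution
    return variance                                  # the optimizer maximizes this value

# Usage with an image loaded elsewhere:
# hist, _ = np.histogram(image, bins=256, range=(0, 256))
# score = otsu_between_class_variance(hist, [60, 120, 180])
```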

431 citations


Cited by
Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).

13,246 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: Probability distributions and linear models for regression and classification are covered, along with neural networks, kernel methods, graphical models, and a discussion of combining models in the context of machine learning.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations

01 Jan 2002

9,314 citations