Author

Amira S. Ashour

Other affiliations: Taif University
Bio: Amira S. Ashour is an academic researcher from Tanta University. The author has contributed to research in topics: Image segmentation & Cuckoo search. The author has an h-index of 36 and has co-authored 241 publications receiving 4,631 citations. Previous affiliations of Amira S. Ashour include Taif University.

Papers published on a yearly basis

Papers
Journal ArticleDOI
TL;DR: A particle swarm optimization-based approach to training a neural network (NN-PSO) that tackles the problem of predicting structural failure of multistory reinforced concrete buildings by detecting the possibility of future failure of the building structure.
Abstract: Faulty structural design may cause multistory reinforced concrete (RC) buildings to collapse suddenly. Every effort is made to avoid structural failure, as it endangers human life and wastes time and property. Using traditional methods to predict structural failure of RC buildings is time-consuming and complex. Recent research has proved the potential of artificial neural networks (ANNs) for solving various real-life problems. However, traditional learning algorithms suffer from being trapped in local optima with premature convergence, so it is challenging to achieve the expected accuracy when using them to train an ANN. To solve this problem, the present work proposes a particle swarm optimization-based approach to training the NN (NN-PSO). PSO is employed to find a weight vector with minimum root-mean-square error (RMSE) for the NN. The proposed NN-PSO classifier is capable of tackling the problem of predicting structural failure of multistory RC buildings by detecting the possibility of future failure. A database of 150 multistory RC building structures was employed in the experiments. The PSO algorithm was used to select the optimal weights for the NN classifier. Fifteen features were extracted from the structural design, of which nine were selected to perform the classification. Moreover, the NN-PSO model was compared with NN and MLP-FFN (multilayer perceptron feed-forward network) classifiers to assess its effectiveness. The experimental results established the superiority of the proposed NN-PSO over the NN and MLP-FFN classifiers: NN-PSO achieved 90 % accuracy with 90 % precision, 94.74 % recall and 92.31 % F-measure.
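The core idea of the abstract above, using PSO to search for a neural network's weight vector with minimum RMSE instead of gradient-based training, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the network size, swarm parameters, and synthetic data (a stand-in for the 150-building structural-feature database) are all assumptions.

```python
# Hypothetical sketch: PSO searching for neural-network weights that
# minimize RMSE, in the spirit of the NN-PSO approach described above.
# Network size, swarm parameters, and the synthetic data are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (stand-in for the structural features).
X = rng.normal(size=(60, 4))
y = (X.sum(axis=1) > 0).astype(float)

N_IN, N_HID = 4, 5
DIM = N_IN * N_HID + N_HID  # weights of a 1-hidden-layer net (no biases)

def rmse(w):
    """Root-mean-square error of the network encoded by weight vector w."""
    W1 = w[:N_IN * N_HID].reshape(N_IN, N_HID)
    W2 = w[N_IN * N_HID:]
    hidden = np.tanh(X @ W1)
    out = 1.0 / (1.0 + np.exp(-(hidden @ W2)))  # sigmoid output
    return np.sqrt(np.mean((out - y) ** 2))

# Standard PSO update: velocity pulled toward personal and global bests.
SWARM, ITERS, W, C1, C2 = 30, 100, 0.7, 1.5, 1.5
pos = rng.normal(size=(SWARM, DIM))
vel = np.zeros((SWARM, DIM))
pbest = pos.copy()
pbest_err = np.array([rmse(p) for p in pos])
gbest = pbest[pbest_err.argmin()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((SWARM, DIM)), rng.random((SWARM, DIM))
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel
    err = np.array([rmse(p) for p in pos])
    improved = err < pbest_err
    pbest[improved], pbest_err[improved] = pos[improved], err[improved]
    gbest = pbest[pbest_err.argmin()].copy()

print(f"best RMSE found: {pbest_err.min():.3f}")
```

Because PSO evaluates the objective directly, it avoids the gradient-based local-optima trap the abstract mentions, at the cost of many more error evaluations per update.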

252 citations

Journal ArticleDOI
TL;DR: The paper concludes that such mass-market health monitoring systems will only be prevalent when implemented together with home environmental monitoring and control systems.
Abstract: Wireless technology has developed rapidly due to its convenience and cost effectiveness compared to wired applications, particularly considering the advantages offered by Wireless Sensor Network (WSN) based applications. Such applications exist in several domains including healthcare, medical, industrial and home automation. In the present study, a home-based wireless ECG monitoring system using Zigbee technology is considered. Such systems can be useful for monitoring people in their own home as well as for periodic monitoring by physicians for appropriate healthcare, allowing people to live in their home for longer. Health monitoring systems can continuously monitor many physiological signals and offer further analysis and interpretation. The characteristics and drawbacks of these systems may affect the wearer's mobility during vital-sign monitoring. Real-time monitoring systems record, measure, and monitor the heart's electrical activity while maintaining the consumer's comfort. Zigbee devices can offer a low-power, small-size, and low-cost solution suitable for monitoring the ECG signal in the home, but such systems are often designed in isolation, with no consideration of existing home control networks and smart home solutions. The present study offers a state-of-the-art review and then introduces the main concepts and contents of wireless ECG monitoring systems. In addition, models of the ECG signal and the power consumption formulas are highlighted. Challenges and future perspectives are also reported. The paper concludes that such mass-market health monitoring systems will only be prevalent when implemented together with home environmental monitoring and control systems.

209 citations

BookDOI
02 Jan 2017
TL;DR: This comprehensive book focuses on better big-data security for healthcare organizations and offers an in-depth analysis of medical body area networks with the 5th generation of IoT communication technology along with its nanotechnology.
Abstract: This comprehensive book focuses on better big-data security for healthcare organizations. Following an extensive introduction to the Internet of Things (IoT) in healthcare including challenging topics and scenarios, it offers an in-depth analysis of medical body area networks with the 5th generation of IoT communication technology along with its nanotechnology. It also describes a novel strategic framework and computationally intelligent model to measure possible security vulnerabilities in the context of e-health. Moreover, the book addresses healthcare systems that handle large volumes of data driven by patients records and health/personal information, including big-data-based knowledge management systems to support clinical decisions. Several of the issues faced in storing/processing big data are presented along with the available tools, technologies and algorithms to deal with those problems as well as a case study in healthcare analytics. Addressing trust, privacy, and security issues as well as the IoT and big-data challenges, the book highlights the advances in the field to guide engineers developing different IoT devices and evaluating the performance of different IoT techniques. Additionally, it explores the impact of such technologies on public, private, community, and hybrid scenarios in healthcare. This book offers professionals, scientists and engineers the latest technologies, techniques, and strategies for IoT and big data.

178 citations

Journal ArticleDOI
TL;DR: In this paper, a novel chaotic bat algorithm (CBA) was proposed for multi-level thresholding in grayscale images using Otsu's between-class variance function.
Abstract: Multi-level thresholding is a helpful tool for several image segmentation applications. The optimal thresholds can be evaluated using a widely adopted scheme called Otsu's thresholding. In the current work, bi-level and multi-level threshold procedures are proposed based on the image histogram using Otsu's between-class variance and a novel chaotic bat algorithm (CBA). Maximization of the between-class variance function in the Otsu technique is used as the objective function to obtain the optimum thresholds for the considered grayscale images. The proposed procedure is applied to a standard set of test images of sizes 512 × 512 and 481 × 321. Further, the performance of the proposed approach is compared with heuristic procedures such as particle swarm optimization, bacterial foraging optimization, the firefly algorithm and the bat algorithm. The evaluation of the proposed and existing algorithms is conducted using metrics, namely root-mean-square error, peak signal-to-noise ratio, structural similarity index, objective function value, and CPU time/iteration number of the optimization-based search. The results established that the proposed CBA provided better outcomes than its alternatives in the majority of cases. Therefore, it can be applied in complex image processing tasks such as automatic target recognition.
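The objective maximized in the work above, Otsu's between-class variance over a grayscale histogram, can be sketched as follows. This is an illustrative sketch only: the chaotic bat algorithm itself is not reproduced (a simple exhaustive bi-level search stands in for it), and the toy bimodal histogram is an assumption in place of a real 256-level image.

```python
# Illustrative sketch of the objective used in the paper: Otsu's
# between-class variance over a grayscale histogram. An exhaustive
# bi-level search stands in for the chaotic bat algorithm (not shown).
import numpy as np

def between_class_variance(hist, thresholds):
    """Otsu objective: sum over classes of w_k * (mu_k - mu_total)^2."""
    p = hist / hist.sum()              # normalized histogram
    levels = np.arange(len(hist))
    mu_total = (p * levels).sum()      # overall mean gray level
    bounds = [0, *sorted(thresholds), len(hist)]
    var = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()             # class probability
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w  # class mean
            var += w * (mu - mu_total) ** 2
    return var

# Toy bimodal histogram (stand-in for a real image's 256-bin histogram).
rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(60, 10, 4000), rng.normal(180, 15, 6000)])
hist, _ = np.histogram(np.clip(pixels, 0, 255), bins=256, range=(0, 256))

# Bi-level case: exhaustive search over the single threshold.
best_t = max(range(1, 256), key=lambda t: between_class_variance(hist, [t]))
print("optimal bi-level threshold:", best_t)
```

Exhaustive search is feasible for one threshold but grows combinatorially with the number of levels, which is exactly why metaheuristics such as the CBA are applied in the multi-level case.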

178 citations


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Christopher M. Bishop1
01 Jan 2006
TL;DR: This textbook covers probability distributions, linear models for regression and classification, neural networks, kernel methods, graphical models, mixture models and EM, approximate inference, sampling methods, latent variables, sequential data, and combining models.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations

01 Jan 2002

9,314 citations

01 Mar 2001
TL;DR: Using singular value decomposition in transforming genome-wide expression data from genes x arrays space to reduced diagonalized "eigengenes" x "eigenarrays" space gives a global picture of the dynamics of gene expression, in which individual genes and arrays appear to be classified into groups of similar regulation and function, or similar cellular state and biological phenotype.
Abstract: We describe the use of singular value decomposition in transforming genome-wide expression data from genes × arrays space to reduced diagonalized "eigengenes" × "eigenarrays" space, where the eigengenes (or eigenarrays) are unique orthonormal superpositions of the genes (or arrays). Normalizing the data by filtering out the eigengenes (and eigenarrays) that are inferred to represent noise or experimental artifacts enables meaningful comparison of the expression of different genes across different arrays in different experiments. Sorting the data according to the eigengenes and eigenarrays gives a global picture of the dynamics of gene expression, in which individual genes and arrays appear to be classified into groups of similar regulation and function, or similar cellular state and biological phenotype, respectively. After normalization and sorting, the significant eigengenes and eigenarrays can be associated with observed genome-wide effects of regulators, or with measured samples, in which these regulators are overactive or underactive, respectively.
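The eigengene/eigenarray transform described above is a direct application of SVD to a genes × arrays matrix, followed by filtering of modes inferred to be artifacts. The sketch below assumes synthetic data with a deliberately planted array-wide artifact; the matrix sizes, the noise pattern, and the choice of which mode to drop are all illustrative assumptions, not the paper's procedure.

```python
# Minimal sketch of the eigengene/eigenarray decomposition described
# above: SVD of a genes-by-arrays matrix, then filtering one component
# treated as an artifact. Sizes and noise pattern are illustrative.
import numpy as np

rng = np.random.default_rng(2)
genes, arrays = 100, 8

# Synthetic expression data: one biological pattern plus a strong
# array-wide additive artifact and measurement noise.
signal = np.outer(rng.normal(size=genes),
                  np.sin(np.linspace(0, 2 * np.pi, arrays)))
artifact = 3.0 * np.outer(np.ones(genes), rng.normal(size=arrays))
E = signal + artifact + 0.1 * rng.normal(size=(genes, arrays))

# SVD: rows of Vt are the eigengenes, columns of U the eigenarrays;
# S**2 gives each mode's fraction of the overall expression.
U, S, Vt = np.linalg.svd(E, full_matrices=False)
fractions = S**2 / (S**2).sum()
print("expression fraction per eigengene:", np.round(fractions, 3))

# Normalize: drop the leading mode if it is inferred to be the
# artifact (an assumption here), then reconstruct from the rest.
keep = slice(1, None)
E_norm = U[:, keep] @ np.diag(S[keep]) @ Vt[keep]
```

In practice the decision of which eigengenes represent noise or artifacts is made by inspecting their expression fractions and patterns, not fixed in advance as in this toy example.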

1,815 citations