Author

Mohamed Elhoseny

Bio: Mohamed Elhoseny is an academic researcher from Mansoura University. The author has contributed to research in topics: Computer science & Encryption. The author has an h-index of 49, co-authored 240 publications receiving 7044 citations. Previous affiliations of Mohamed Elhoseny include Maharaja Agrasen Institute of Technology & Cairo University.

Papers published on a yearly basis

Papers
Journal ArticleDOI
TL;DR: A new robust general N-user authentication protocol based on N-particle Greenberger–Horne–Zeilinger (GHZ) states is presented, which makes eavesdropping detection more effective and secure than some current authentication protocols.
Abstract: Quantum communication provides an enormous advantage over its classical counterpart: security of communications based on the very principles of quantum mechanics. Researchers have proposed several approaches for user identity authentication via entanglement. Unfortunately, these protocols fail because an attacker can capture some of the particles in a transmitted sequence and send what is left to the receiver through a quantum channel. The attacker can then recover some of the confidential messages, giving rise to the possibility of information leakage. Here we present a new robust general N-user authentication protocol based on N-particle Greenberger–Horne–Zeilinger (GHZ) states, which makes eavesdropping detection more effective and secure than some current authentication protocols. The security analysis of our protocol against various kinds of attacks verifies that it is unconditionally secure and that an attacker will not obtain any information about the transmitted key. Moreover, as the number of transferred key bits N becomes larger and the number of users transmitting the information increases, the probability of an attacker effectively obtaining the transmitted authentication keys is reduced to zero.
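As an illustrative aside (not taken from the paper), the scaling claim above can be sketched numerically: assuming an intercept-resend attacker is caught on each checked particle independently with some detection probability p, the chance of escaping detection decays exponentially with N. The function and the placeholder value p = 0.25 below are assumptions for illustration, not parameters of the proposed GHZ protocol.

```python
# Illustrative sketch (not the paper's protocol): exponential decay of an
# eavesdropper's escape probability in an entanglement-based check phase.
# Assumption: each intercepted particle is independently detected with
# probability p_detect during verification; p_detect = 0.25 is a placeholder.

def escape_probability(n_particles: int, p_detect: float = 0.25) -> float:
    """Probability that an intercept-resend attack on n particles goes unnoticed."""
    return (1.0 - p_detect) ** n_particles

if __name__ == "__main__":
    for n in (4, 16, 64, 256):
        print(f"N = {n:4d}: escape probability = {escape_probability(n):.3e}")
```

Under this assumption the escape probability drops from roughly 0.32 at N = 4 to effectively zero by N = 256, which is the qualitative behavior the abstract describes.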

62 citations

Proceedings ArticleDOI
01 Jun 2017
TL;DR: An adaptive neuro-fuzzy inference system is used and trained with a particle swarm optimization algorithm to improve the prediction performance for biochar.
Abstract: This paper proposes an intelligent approach to predict the biochar yield. Biochar is an important renewable energy source produced from biomass thermochemical processes, with yields that depend on different operating conditions. Some approaches, such as the least-squares support vector machine, have been used to predict biochar production. However, this approach suffers from drawbacks such as getting stuck in local optima and high time complexity. To avoid these drawbacks, an adaptive neuro-fuzzy inference system (ANFIS) is used and trained with a particle swarm optimization algorithm to improve the prediction performance for biochar. Heating rate, pyrolysis temperature, moisture content, holding time, and sample mass were used as the input parameters, and the outputs are biochar mass and biochar yield. The results show that the proposed approach outperforms other approaches on three measures: root mean square error, coefficient of determination, and average absolute percent relative error (0.2673, 0.9842, and 3.4529, respectively).
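For reference, the three reported measures can be computed for any predictor in a few lines. The sketch below is a generic evaluation helper, not the paper's ANFIS-PSO model; the sample arrays are placeholder values rather than the paper's biochar data.

```python
# Sketch of the three evaluation measures cited above (RMSE, R^2, AAPRE).
# The sample arrays are placeholders, not the paper's biochar measurements.
import numpy as np

def evaluate(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    residuals = y_true - y_pred
    rmse = float(np.sqrt(np.mean(residuals ** 2)))
    ss_res = float(np.sum(residuals ** 2))
    ss_tot = float(np.sum((y_true - y_true.mean()) ** 2))
    r2 = 1.0 - ss_res / ss_tot                                   # coefficient of determination
    aapre = float(np.mean(np.abs(residuals / y_true)) * 100.0)   # average absolute percent relative error
    return {"RMSE": rmse, "R2": r2, "AAPRE_%": aapre}

if __name__ == "__main__":
    y_true = np.array([32.1, 28.4, 35.0, 30.2])   # hypothetical measured biochar yields (%)
    y_pred = np.array([31.8, 28.9, 34.6, 30.7])   # hypothetical model predictions
    print(evaluate(y_true, y_pred))
```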

61 citations

Book ChapterDOI
22 Feb 2018
TL;DR: This paper proposes a framework for smoothly adapting traditional e-learning systems to smart-city applications.
Abstract: Due to the rapid change in technologies, new data forms exist, which leads to a huge volume of data on the internet. As a result, learning platforms such as e-learning systems must change their data-processing methodologies to become smarter. This paper proposes a framework for smoothly adapting traditional e-learning systems to smart-city applications. Learning Analytics (LA) has recently become a prominent paradigm in education, one that embraces current technological advances such as cloud computing, big data processing, and the Internet of Things. LA also requires a concentrated amount of processing resources to produce relevant analytical outcomes. However, traditional approaches have been inefficient at handling LA challenges.

60 citations

Journal ArticleDOI
TL;DR: This article introduces a highly reliable, low-complexity image compression scheme using a neighborhood correlation sequence (NCS) algorithm that increases compression performance and decreases the energy utilization of sensor nodes while maintaining high fidelity.
Abstract: Recently, advancements in wireless technologies and micro-electro-mechanical systems have led to the development of potential applications in wireless sensor networks (WSNs). The visual sensors in a WSN create a significant impact on computer vision based applications such as pattern recognition and image restoration, and they generate a massive quantity of multimedia data. Since the transmission of images consumes more computational resources, various image compression techniques have been proposed. However, most existing image compression techniques are not applicable to sensor nodes due to their limitations on energy, bandwidth, memory, and processing capabilities. In this article, we introduce a highly reliable and low-complexity image compression scheme using a neighborhood correlation sequence (NCS) algorithm. The NCS algorithm performs a bit reduction operation, and the result is then encoded by a codec (such as PPM, Deflate, or the Lempel-Ziv-Markov chain algorithm) to further compress the image. The proposed NCS algorithm increases compression performance and decreases the energy utilization of the sensor nodes with high fidelity. Moreover, it achieved a minimum end-to-end delay of 1074.46 ms at an average bit rate of 4.40 bpp and a peak signal-to-noise ratio of 48.06 on the applied test images. Compared with state-of-the-art methods, the proposed method maintains a better tradeoff between compression efficiency and reconstructed image quality.
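To illustrate the general idea of reducing bits before entropy coding, a minimal quantize-then-Deflate pipeline is sketched below. This is not the paper's NCS algorithm; the keep_bits parameter and the synthetic correlated image are assumptions for demonstration only.

```python
# Illustrative pipeline only: coarse bit reduction followed by Deflate (zlib).
# This is NOT the paper's NCS algorithm; it merely shows how dropping
# low-order bits before entropy coding shrinks the payload a node must send.
import zlib
import numpy as np

def reduce_bits(image: np.ndarray, keep_bits: int = 4) -> np.ndarray:
    """Keep only the top `keep_bits` of each 8-bit pixel (lossy reduction)."""
    shift = 8 - keep_bits
    return (image >> shift).astype(np.uint8)

def compress(image: np.ndarray, keep_bits: int = 4) -> bytes:
    reduced = reduce_bits(image, keep_bits)
    return zlib.compress(reduced.tobytes(), level=9)   # Deflate back end

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Smooth synthetic "image": neighbouring pixels are correlated, as in camera imagery.
    img = np.cumsum(rng.integers(-2, 3, size=(64, 64)), axis=1).astype(np.uint8)
    raw = zlib.compress(img.tobytes(), level=9)
    cut = compress(img)
    print(f"Deflate only: {len(raw)} bytes, bit-reduced + Deflate: {len(cut)} bytes")
```

The trade-off shown here (smaller payloads at the cost of fidelity) is the same one the article quantifies with bit rate, PSNR, and end-to-end delay.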

60 citations


Cited by
01 Jan 2004
TL;DR: Comprehensive and up-to-date, this book includes essential topics that either reflect practical significance or are of theoretical importance and describes numerous important application areas such as image based rendering and digital libraries.
Abstract: From the Publisher: The accessible presentation of this book gives both a general view of the entire computer vision enterprise and sufficient detail to build useful applications. Readers learn techniques that have proven useful through first-hand experience and a wide range of mathematical methods. A CD-ROM included with every copy of the text contains source code for programming practice, color images, and illustrative movies. Comprehensive and up-to-date, the book includes essential topics that either reflect practical significance or are of theoretical importance. Topics are discussed in substantial and increasing depth. Application surveys describe numerous important application areas such as image-based rendering and digital libraries. Many important algorithms are broken down and illustrated in pseudocode. Appropriate for use by engineers as a comprehensive reference to the computer vision enterprise.

3,627 citations

01 Jun 2005

3,154 citations

01 Sep 2008
TL;DR: The guideline summarizes the epidemiology, pathogenesis, modifiable risk factors, diagnostic strategies, and antibiotic treatment of hospital-acquired pneumonia, together with recommendations for assessing response to therapy and suggested performance indicators.
Abstract: Guideline contents: Executive Summary; Introduction; Methodology Used to Prepare the Guideline; Epidemiology; Incidence; Etiology; Major Epidemiologic Points; Pathogenesis; Major Points for Pathogenesis; Modifiable Risk Factors; Intubation and Mechanical Ventilation; Aspiration, Body Position, and Enteral Feeding; Modulation of Colonization: Oral Antiseptics and Antibiotics; Stress Bleeding Prophylaxis, Transfusion, and Glucose Control; Major Points and Recommendations for Modifiable Risk Factors; Diagnostic Testing; Major Points and Recommendations for Diagnosis; Diagnostic Strategies and Approaches; Clinical Strategy; Bacteriologic Strategy; Recommended Diagnostic Strategy; Major Points and Recommendations for Comparing Diagnostic Strategies; Antibiotic Treatment of Hospital-acquired Pneumonia; General Approach; Initial Empiric Antibiotic Therapy; Appropriate Antibiotic Selection and Adequate Dosing; Local Instillation and Aerosolized Antibiotics; Combination versus Monotherapy; Duration of Therapy; Major Points and Recommendations for Optimal Antibiotic Therapy; Specific Antibiotic Regimens; Antibiotic Heterogeneity and Antibiotic Cycling; Response to Therapy; Modification of Empiric Antibiotic Regimens; Defining the Normal Pattern of Resolution; Reasons for Deterioration or Nonresolution; Evaluation of the Nonresponding Patient; Major Points and Recommendations for Assessing Response to Therapy; Suggested Performance Indicators.

2,961 citations