
Showing papers by "Mohamed Elhoseny published in 2022"


Journal ArticleDOI
TL;DR: A comprehensive review of state-of-the-art advances in quantum machine learning can be found in this paper, where two methods for improving the performance of classical machine learning are presented.
Abstract: Machine learning has become a ubiquitous and effective technique for data processing and classification. Furthermore, owing to the superiority and progress of quantum computing in many areas (e.g., cryptography, machine learning, healthcare), the combination of classical machine learning and quantum information processing has established a new field, called quantum machine learning. Machine learning is one of the most frequently used applications of quantum computing. This paper aims to present a comprehensive review of state-of-the-art advances in quantum machine learning. In addition, it outlines recent works on different architectures of quantum deep learning, and illustrates classification tasks in the quantum domain as well as encoding methods and quantum subroutines. Furthermore, this paper examines how the concept of quantum computing enhances classical machine learning, and presents two methods for improving the performance of classical machine learning. Finally, this work provides a general review of challenges and the future vision of quantum machine learning.
• Organize the most recent research works to pave the way for QML researchers.
• Demonstrate the commonly used methods in the classification of real problems.
• Provide readers with various quantum methods to enhance classical ML.
• Present some of the challenges and future directions of QML.
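As a small, hedged illustration of the encoding methods this review surveys (not code from the paper itself), the sketch below angle-encodes a classical feature vector into single-qubit states and tensors them into a multi-qubit statevector with NumPy; the feature values, the rescaling to [0, π], and the map are all illustrative assumptions.

```python
import numpy as np

def angle_encode(features):
    """Angle-encode each feature x_i as a single-qubit state
    cos(x_i/2)|0> + sin(x_i/2)|1>, then tensor the qubits together."""
    state = np.array([1.0])
    for x in features:
        qubit = np.array([np.cos(x / 2), np.sin(x / 2)])
        state = np.kron(state, qubit)  # compose qubits into one statevector
    return state

# Illustrative 3-feature sample, rescaled to [0, pi] (an assumption).
raw = np.array([0.2, 0.7, 1.0])
psi = angle_encode(raw * np.pi)
print(psi.shape)          # (8,): 2**3 amplitudes for 3 qubits
print(np.sum(psi ** 2))   # ~1.0: a valid normalized quantum state
```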

33 citations


Journal ArticleDOI
TL;DR: In this article, an enhanced multimodal biometric technique for a smart city based on score-level fusion is proposed, where a fuzzy strategy with soft computing techniques, known as an optimized fuzzy genetic algorithm, is used.
Abstract: Biometric security is a major emerging concern in the field of data security. In recent years, research initiatives in the field of biometrics have grown at an exponential rate. A multimodal biometric technique with enhanced accuracy and recognition rate for smart cities is still a challenging issue. This paper proposes an enhanced multimodal biometric technique for a smart city that is based on score-level fusion. Specifically, the proposed approach addresses the existing challenges by providing a multimodal fusion technique with an optimized fuzzy genetic algorithm that delivers enhanced performance. Experiments with different biometric environments reveal significant improvements over existing strategies. The result analysis shows that the proposed approach performs better in terms of the false acceptance rate, false rejection rate, equal error rate, precision, recall, and accuracy. The proposed scheme achieves a higher accuracy rate of 99.88% and a lower equal error rate of 0.18%. The vital part of this approach is the inclusion of a fuzzy strategy with soft computing techniques known as an optimized fuzzy genetic algorithm.
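The abstract does not spell out the optimized fuzzy genetic algorithm, so the sketch below only shows the general shape of the idea: score-level fusion as a weighted sum of per-modality match scores, with a toy genetic algorithm searching fusion weights that minimize the equal error rate. The score distributions, GA operators, and parameters are all illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical match scores for two modalities (genuine vs. impostor).
genuine = rng.normal([0.75, 0.70], 0.10, size=(200, 2))
impostor = rng.normal([0.40, 0.45], 0.12, size=(200, 2))

def eer(weights):
    """Approximate equal error rate of the weighted score-level fusion."""
    w = np.abs(weights) / (np.abs(weights).sum() + 1e-12)
    g, i = genuine @ w, impostor @ w
    ts = np.linspace(0, 1, 400)
    frr = np.array([np.mean(g < t) for t in ts])   # false rejection rate
    far = np.array([np.mean(i >= t) for t in ts])  # false acceptance rate
    k = np.argmin(np.abs(far - frr))               # threshold where FAR ~ FRR
    return (far[k] + frr[k]) / 2

# Toy genetic algorithm over fusion weights (a stand-in for the paper's
# optimized fuzzy GA, whose details are not given in the abstract).
pop = rng.random((30, 2))
for _ in range(40):
    fitness = np.array([eer(p) for p in pop])
    parents = pop[np.argsort(fitness)[:10]]            # selection
    children = (parents[rng.integers(0, 10, 20)] +
                parents[rng.integers(0, 10, 20)]) / 2  # crossover
    children += rng.normal(0, 0.05, children.shape)    # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([eer(p) for p in pop])]
print("best fusion weights:", np.abs(best) / np.abs(best).sum(), "EER:", eer(best))
```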

26 citations




Journal ArticleDOI
TL;DR: In this article, an adaptive whale optimization algorithm with deep learning (AWOA-DL) technique is used to create a new financial distress prediction model that determines whether a company is experiencing financial distress.
Abstract: Predicting bankruptcies and assessing credit risk are two of the most pressing issues in finance. Therefore, financial distress prediction and credit scoring remain hot research topics in the finance sector. Earlier studies have focused on the design of statistical approaches and machine learning models to predict a company's financial distress. In this study, an adaptive whale optimization algorithm with deep learning (AWOA-DL) technique is used to create a new financial distress prediction model. The goal of the AWOA-DL approach is to determine whether a company is experiencing financial distress. The method uses a multilayer perceptron, a deep neural network (DNN), as the predictive model, together with an AWOA-based hyperparameter tuning process. Primarily, the DNN model receives the financial data as input and predicts financial distress. In addition, the AWOA is applied to tune the DNN model's hyperparameters, thereby improving the predictive outcome. The proposed model is applied in three stages: preprocessing, hyperparameter tuning using AWOA, and the prediction phase. A comprehensive simulation took place on four datasets, and the results pointed out the superiority of the AWOA-DL method over the other compared techniques, achieving an average accuracy of 95.8%, whereas the compared models achieved average accuracies of 93.8%, 89.6%, 84.5%, and 78.2%.
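The abstract does not give the adaptive variant's update rules, so the sketch below implements the standard whale optimization equations over a two-dimensional hyperparameter space (log learning rate, hidden units), with a stub validation-loss function standing in for actually training the DNN; the search bounds and objective are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def validation_loss(params):
    """Stub objective over (log10 learning rate, hidden units).
    In the paper this would be the DNN's validation error."""
    log_lr, hidden = params
    return (log_lr + 3.0) ** 2 + 0.001 * (hidden - 64) ** 2

lo, hi = np.array([-5.0, 8.0]), np.array([-1.0, 256.0])
whales = rng.uniform(lo, hi, size=(15, 2))
best = min(whales, key=validation_loss).copy()

for t in range(50):
    a = 2 - 2 * t / 50                       # a decreases linearly from 2 to 0
    for i, X in enumerate(whales):
        r, l = rng.random(2), rng.uniform(-1, 1)
        A, C = 2 * a * r[0] - a, 2 * r[1]
        if rng.random() < 0.5:
            if abs(A) < 1:                   # exploit: encircle the best whale
                X_new = best - A * abs(C * best - X)
            else:                            # explore around a random whale
                rand = whales[rng.integers(len(whales))]
                X_new = rand - A * abs(C * rand - X)
        else:                                # spiral bubble-net move
            D = abs(best - X)
            X_new = D * np.exp(l) * np.cos(2 * np.pi * l) + best
        whales[i] = np.clip(X_new, lo, hi)
        if validation_loss(whales[i]) < validation_loss(best):
            best = whales[i].copy()

print("tuned (log10 lr, hidden units):", best)
```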

7 citations


Journal ArticleDOI
TL;DR: A novel business approach for risk management is provided, which deploys two of the most prominent techniques of this era, social media and big data analysis; the challenges of the new framework are also described.
Abstract: This paper provides a quick review of business intelligence approaches and techniques in risk management. Important research articles from 2007 to 2021 are covered in this review. We found few contributions from researchers in this research direction, despite the vital role of business intelligence in risk management. Moreover, we provide a novel business approach for risk management. This approach deploys two of the most prominent techniques of this era: social media and big data analysis. Social media serves as the source for identifying risks through the discussions of its users, while big data analysis techniques serve as the control tool for potential risks. The new approach will help firms and organizations in many sectors to manage risks efficiently and make the best decisions. Further, we discuss the challenges of the new framework and points for further research.

7 citations


Journal ArticleDOI
TL;DR: A thorough study of the Hardware/Software Co-Design approach is introduced to choose the most suitable system for the proposed algorithm that responds to the different temporal and architectural constraints of vegetation monitoring in agricultural areas using embedded systems.
Abstract: The development of embedded systems in sustainable precision agriculture has provided an important benefit in terms of processing time and accuracy of results, which has driven the revolution in this field of research. This paper presents a study on vegetation monitoring algorithms based on the Normalized Green-Red Difference Index (NGRDI) and the Visible Atmospherically Resistant Index (VARI) in agricultural areas using embedded systems. These algorithms include processing and pre-processing to increase the accuracy of sustainability monitoring. The proposed algorithm was evaluated on a real database from the Souss Massa region in Morocco. Data collection was based on unmanned aerial vehicle images and hand-collected data covering four different agricultural products. The processing-time results were obtained on several architectures: desktop, Odroid XU4, Jetson Nano, and Raspberry Pi. Moreover, this paper introduces a thorough study of the Hardware/Software Co-Design approach to choose the most suitable system for our proposed algorithm, one that responds to the different temporal and architectural constraints. The evaluation proved that we could process 311 frames/s in the case of low resolution, which gives real-time processing for agricultural field monitoring applications. The evaluation of the proposed algorithm on several architectures showed that the low-cost XU4 card gives the best results in terms of processing time, power consumption, and computation flexibility.
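The two vegetation indices named above have standard closed forms, NGRDI = (G − R)/(G + R) and VARI = (G − R)/(G + R − B); the NumPy sketch below computes them per pixel for an RGB frame. The random image and the zero-threshold vegetation mask are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

def vegetation_indices(rgb):
    """Per-pixel NGRDI and VARI from a float RGB image in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-6                      # avoid division by zero on dark pixels
    ngrdi = (g - r) / (g + r + eps)
    vari = (g - r) / (g + r - b + eps)
    return ngrdi, vari

# Random frame standing in for a UAV image of the field.
frame = np.random.default_rng(2).random((480, 640, 3))
ngrdi, vari = vegetation_indices(frame)
vegetation_mask = ngrdi > 0.0       # simple thresholding (an assumption)
print("vegetation fraction:", vegetation_mask.mean())
```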

7 citations


Journal ArticleDOI
TL;DR: The study formulates the workload assignment problem for IoV applications based on linear integer programming and devises fault-tolerant and security-enabled delay-optimal workload assignment (SFDWA) schemes that determine optimal workload assignments in edge computing.
Abstract: The number of automobiles has rapidly increased in recent years, and broadening inhabitants' travel options pushes transportation infrastructures to their limits. With the rapid expansion of vehicles, traffic congestion and car accidents have become common occurrences in the city. The Internet of drone vehicle things (IoDV) has developed a new paradigm for improving traffic conditions in urban areas. However, edge computing faces issues such as fault-tolerant and security-enabled delay-optimal workload assignment. The study formulates the workload assignment problem for IoV applications based on linear integer programming. The study devises fault-tolerant and security-enabled delay-optimal workload assignment (SFDWA) schemes that determine optimal workload assignment in edge computing. The goal is to minimize average response time, which combines network, computation, security, and fault-tolerance delay. Simulation results show that the proposed schemes achieve 15% better workload assignment for IoV applications compared to existing studies.
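The full integer program is not given in the abstract, so the sketch below brute-forces a tiny instance of the stated objective: each task-to-edge-server assignment is scored by the sum of network, computation, security, and fault-tolerance delays, and the assignment minimizing average response time wins. All delay numbers are illustrative assumptions; a real instance would go to an ILP solver rather than enumeration.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
tasks, servers = 4, 3

# Illustrative per-(task, server) delay components in milliseconds.
network = rng.uniform(5, 20, (tasks, servers))
compute = rng.uniform(10, 40, (tasks, servers))
security = rng.uniform(1, 5, (tasks, servers))
fault_tol = rng.uniform(2, 8, (tasks, servers))
total = network + compute + security + fault_tol

best_assignment, best_cost = None, float("inf")
for assign in itertools.product(range(servers), repeat=tasks):
    cost = sum(total[t, s] for t, s in enumerate(assign))
    if cost < best_cost:
        best_assignment, best_cost = assign, cost

print("task -> server:", best_assignment)
print("average response time: %.1f ms" % (best_cost / tasks))
```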

5 citations



Journal ArticleDOI
10 Jan 2022
TL;DR: In this paper, the authors review information-processing tools based on embedded systems for precision agriculture algorithms with different applications: weed detection, numerical counting, monitoring of plant indexes, and disease detection.
Abstract: Precision agriculture (PA) research aims to design decision systems based on agricultural site control and management. These systems consist of observing fields and measuring metrics to optimize yields and investments while preserving resources. The corresponding applications can be found on large agricultural areas based on satellites, unmanned aerial vehicles (UAVs), and ground robots. All these applications are based on various algorithms that are complex in terms of processing time. While such algorithms can be evaluated offline on workstations or desktops, this is not the case for algorithms that need to be embedded and should operate and help make decisions in real time. We therefore need an advanced study using a hardware-software co-design approach to design decision systems that embed different algorithms, including sensor data acquisition and processing units. In this work, we propose a review of information-processing tools based on embedded systems for PA algorithms with different applications: weed detection, numerical counting, monitoring of plant indexes, and disease detection. This review draws on more than 100 papers to extract useful information on the different techniques used and the information-processing systems. The elaborated study presents the various tools, databases, and systems in order to extract the advantages and disadvantages of each system and application.

5 citations


Journal ArticleDOI
TL;DR: A hybrid model of an artificial neural network (ANN) with parameter optimization by the butterfly optimization algorithm is introduced and compared with the pretrained AlexNet, GoogLeNet, and an SVM to identify publicly accessible COVID-19 chest X-ray and CT images.
Abstract: Automated disease prediction has now become a key concern in medical research due to exponential population growth. An automated disease identification framework aids physicians in diagnosing disease, delivering accurate disease prediction that provides rapid outcomes and decreases the mortality rate. The spread of Coronavirus disease 2019 (COVID-19) has a significant effect on public health and the everyday lives of individuals currently residing in more than 100 nations. Despite effective attempts to reach an appropriate trend to forecast COVID-19, the origin and mutation of the virus remain a crucial obstacle in the diagnosis of the detected cases. Even so, the development of a model to forecast COVID-19 from chest X-ray (CXR) and computerized tomography (CT) images with the correct decision is critical to assist with intelligent detection. In this paper, a hybrid model of the artificial neural network (ANN) with parameter optimization by the butterfly optimization algorithm is introduced. The proposed model was compared with the pretrained AlexNet, GoogLeNet, and the SVM to identify the publicly accessible COVID-19 chest X-ray and CT images. There were six datasets for the examinations: three with X-ray pictures and three with CT images. The experimental results confirmed the superiority of the proposed model for cognitive COVID-19 pattern recognition, with average accuracies of 90.48%, 81.09%, 86.76%, and 84.97% for the proposed model, support vector machine (SVM), AlexNet, and GoogLeNet, respectively.

Journal ArticleDOI
TL;DR: In this article , an autonomous robot is designed to operate autonomously to extract useful information from the plants based on precise GPS localization using an RGB camera for plant detection and a multispectral camera for extracting the different special bands for processing, and an embedded architecture integrating a Nvidia Jetson Nano, which allows to perform the required processing.
Abstract: Our work is focused on developing an autonomous robot to monitor greenhouses and large fields. This system is designed to operate autonomously to extract useful information from the plants based on precise GPS localization. The proposed robot is based on an RGB camera for plant detection, a multispectral camera for extracting the different spectral bands for processing, and an embedded architecture integrating an Nvidia Jetson Nano, which allows us to perform the required processing. Our system uses multi-sensor fusion to manage two parts of the algorithm, and the proposed algorithm was therefore partitioned on the CPU-GPU embedded architecture. This allows us to process each image in 1.94 s in a sequential implementation on the embedded architecture. The approach followed in our implementation is based on a Hardware/Software Co-Design study to propose an optimal implementation. The experiments were conducted on a tomato farm, and the system showed that we can process different images in real time. The parallel implementation allows us to process each image in 36 ms, satisfying the real-time constraint of 5 images/s. On a laptop, the total processing time is 604 ms for the sequential implementation and 9 ms for the parallel one. In this context, we obtained an acceleration factor of 66 for the laptop and 54 for the embedded architecture. The energy consumption evaluation showed that the prototyped system consumes between 4 W and 8 W. For this reason, in our case, we opted for a low-cost embedded architecture based on the Nvidia Jetson Nano.
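The acceleration factors and the real-time claim follow directly from the timings the abstract reports; the short sketch below reproduces that arithmetic and checks the 5 images/s budget, using only numbers stated above.

```python
# Timings reported in the abstract (seconds).
embedded_seq, embedded_par = 1.94, 0.036
laptop_seq, laptop_par = 0.604, 0.009

# 0.604 / 0.009 ~ 67; the abstract reports 66, presumably from unrounded timings.
print("laptop speedup:   %.0f" % (laptop_seq / laptop_par))
print("embedded speedup: %.0f" % (embedded_seq / embedded_par))   # ~54

budget = 1.0 / 5   # 5 images/s -> 200 ms per frame
print("meets real-time budget:", embedded_par <= budget)
```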

Journal ArticleDOI
TL;DR: A new access control scheme that employs blockchain for the key-revocation process; the scheme is secure against some well-known attacks on open banking systems and against the chosen-text attack, by employing the challenge-response authentication mechanism.
Abstract: Open banking allows banks and financial sectors to easily access customers' financial data, which is revolutionizing the industry. It also provides customers with excellent cloud access to a wide range of financial services from various providers. The storage of such sensitive services and data on cloud servers is a double-edged sword. It can ease and support fine-grained access to such services/data anywhere and anytime, supporting the open banking system; on the other hand, data privacy and secrecy are a challenge. Thus, efficient access control should exist for open banking's services and data to protect cloud-hosted sensitive financial data from unauthorized customers. This paper proposes a new access control scheme that employs blockchain for the key-revocation process. We implement the smart contract's functions on the Ethereum platform and test the contract's code on the Kovan Testnet before deploying it to the Mainnet. Although a customer is authenticated to open banking, his keys can be revoked according to the status response of the bank branch, denying his access to financial services and data. We conducted comprehensive experiments on the revocation status response time, the data exchanged until receiving the revocation status, and the time spent updating the policy. We also compared the results of our proposed scheme with two well-known methods, the Certificate Revocation List (CRL) and the Online Certificate Status Protocol (OCSP). The experimental results show that our proposed scheme (BKR-AC) has a faster response time than CRL and OCSP in the case of nonrevoked keys/certificates and a slower response time in the case of revoked keys, to avoid accepting a revoked key. The amount of data exchanged by BKR-AC falls between that of CRL and OCSP, which is still a tiny and acceptable amount. The security analysis proved that our scheme is secure against some well-known attacks on open banking systems. In addition, it is also secured against the chosen-text attack by employing the challenge-response authentication mechanism.
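The contract's actual Solidity code is not reproduced in the abstract, so the Python sketch below only mimics the flow it describes: an on-chain-style revocation registry consulted before a challenge-response check. The HMAC-based challenge-response, the `enroll` helper, and all identifiers are assumptions for illustration, not the paper's protocol.

```python
import hashlib
import hmac
import os
import secrets

registered = {}   # key_id -> shared secret held by the bank (assumption)
revoked = set()   # stand-in for the on-chain revocation registry

def enroll(key_id):
    """Hypothetical enrollment: bank stores a shared secret per key id."""
    registered[key_id] = os.urandom(32)
    return registered[key_id]

def access_service(key_id, customer_key):
    """Challenge-response gate in front of a financial service."""
    if key_id in revoked:
        return "denied: key revoked"
    challenge = secrets.token_bytes(16)
    response = hmac.new(customer_key, challenge, hashlib.sha256).digest()
    expected = hmac.new(registered[key_id], challenge, hashlib.sha256).digest()
    return "granted" if hmac.compare_digest(response, expected) else "denied"

key = enroll("customer-42")
print(access_service("customer-42", key))   # granted
revoked.add("customer-42")                  # bank branch revokes the key
print(access_service("customer-42", key))   # denied: key revoked
```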

Journal ArticleDOI
TL;DR: An advanced business intelligence framework for firms in a post-pandemic phase to increase their performance and productivity is proposed and open challenges based on this framework are described.
Abstract: In this paper, we propose an advanced business intelligence framework for firms in a post-pandemic phase to increase their performance and productivity. The proposed framework utilizes some of the most significant tools of this era, such as social media and big data analysis, for business intelligence systems. In addition, we survey the most notable papers related to this study. Open challenges based on this framework are described as well, and a methodology to minimize these challenges is proposed. Finally, the conclusion and further research points that are worth studying are discussed.

Journal ArticleDOI
24 Feb 2022
TL;DR: This work introduces a new hybrid quantum-kernel support vector machine (QKSVM) combined with Binary Harris hawk optimization (BHHO)-based gene selection for cancer classification on a quantum simulator, aiming to improve microarray cancer prediction performance with quantum kernel estimation based on the informative genes selected by BHHO.
Abstract: Cancer classification based on gene expression improves early diagnosis and recovery, but high-dimensional gene data with a small number of samples are a major challenge. This work introduces a new hybrid quantum-kernel support vector machine (QKSVM) combined with Binary Harris hawk optimization (BHHO)-based gene selection for cancer classification on a quantum simulator. This study aims to improve microarray cancer prediction performance with quantum kernel estimation based on the informative genes selected by BHHO. Feature selection is a critical step with large-dimensional features, and BHHO is used to select the important ones. BHHO mimics the cooperative hunting behavior of Harris hawks in nature. Principal component analysis (PCA) is applied to reduce the selected genes to match the number of qubits. Then, the quantum computer is used to estimate the kernel with the training data of the reduced genes and generate the quantum kernel matrix, while the classical computer determines the support vectors based on that matrix. The prediction stage is also performed on the classical device. Finally, the proposed approach is applied to colon and breast microarray datasets and evaluated both with all genes and with the genes selected by BHHO. The proposed approach (QKSVM + PCA + BHHO) is found to enhance the overall performance on both datasets. The proposed approach is also evaluated with different quantum feature maps (kernels) and a classical kernel (RBF).
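The sketch below classically simulates the final steps of this pipeline under stated assumptions: synthetic data stands in for the BHHO-selected genes, PCA reduces them to a qubit-sized dimension, a simple angle feature map stands in for the paper's quantum feature maps, and scikit-learn's SVC consumes the precomputed kernel matrix the way the classical SVM stage would.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(4)

# Synthetic "selected genes" (e.g., after BHHO): 40 samples, 2 classes.
X = rng.normal(size=(40, 12))
y = rng.integers(0, 2, 40)
X[y == 1] += 0.8

n_qubits = 4
Xr = PCA(n_components=n_qubits).fit_transform(X)   # match the qubit count

def feature_state(x):
    """Angle feature map: a classical stand-in for the quantum circuit."""
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, [np.cos(xi), np.sin(xi)])
    return state

def kernel_matrix(A, B):
    """K[i, j] = |<phi(a_i)|phi(b_j)>|^2, as a quantum device would estimate."""
    SA = np.array([feature_state(a) for a in A])
    SB = np.array([feature_state(b) for b in B])
    return (SA @ SB.T) ** 2

K_train = kernel_matrix(Xr, Xr)
clf = SVC(kernel="precomputed").fit(K_train, y)    # classical SVM stage
print("train accuracy:", clf.score(K_train, y))
```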

Journal ArticleDOI
TL;DR: In this article, a hybrid Computational Intelligence (CI) algorithm combining the Moth-Flame Optimization and Marine Predators Algorithms (MOMPA) is proposed for planning the COVID-19 pandemic medical robot's path without collisions.

Journal ArticleDOI
TL;DR: In this article, the authors presented a comprehensive and reproducible methodology that addresses their successful efforts in aligning the Computer Science program with ABET-CAC requirements, emphasizing criterion 4.
Abstract: ABET accreditation is sought globally for engineering and technology academic programs due to the quality, added value, and competitiveness it adds to students, the program, and the university locally, regionally, and globally. Aligning with its mission to prepare students as global citizens for future career aspirations and lifelong learning through quality teaching, the American University in the Emirates (AUE) focuses on outcome-based education to ensure the employability of graduates, and hence soon realized the significance of the Accreditation Board for Engineering and Technology-Computing Accreditation Commission (ABET-CAC) standard for the Computer Science (CS) program. While pursuing ABET accreditation was challenging, the outcome was positive, and currently the Computer Science program, with its two specializations in Network Security and Digital Forensics, is ABET-accredited. The process required support from all units within the institution and was a great learning experience for all stakeholders. ABET sets out generic requirements to be fulfilled by a program seeking accreditation without a detailed procedure for achieving them. There is little information about achieving these requirements, especially criterion 4 (continuous improvement), which most programs fail to comply with according to ABET. This study presented a comprehensive and reproducible methodology that addresses our successful efforts in aligning the CS program with ABET-CAC requirements by emphasizing criterion 4. This article reported the evaluation of Student Outcomes one and two for the academic year 2020-2021 through a comprehensive framework. The framework covered data collection, data reporting and analysis, actions, and recommendations for the next academic cycle, and included a mathematical model for calculating Student Outcomes (SOs) attainment based on the mapped Course Learning Outcomes (CLOs). Finally, the recommendations were reported. We believe this article established a solid foundation that would be beneficial for institutions pursuing ABET accreditation.
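The article's exact attainment formula is not reproduced in the abstract, so the sketch below shows one common form such a model can take (an assumption): each Student Outcome's attainment is the weighted average of the attainments of the Course Learning Outcomes mapped to it, compared against a target threshold. All CLO values, weights, and the 70% target are invented placeholders.

```python
# Hypothetical CLO attainment percentages collected for one academic cycle.
clo_attainment = {"CLO1": 82.0, "CLO2": 74.5, "CLO3": 90.0, "CLO4": 68.0}

# Hypothetical mapping: SO -> [(CLO, weight)], weights summing to 1 per SO.
so_map = {
    "SO1": [("CLO1", 0.5), ("CLO2", 0.5)],
    "SO2": [("CLO3", 0.6), ("CLO4", 0.4)],
}

TARGET = 70.0  # illustrative attainment threshold

for so, clos in so_map.items():
    attainment = sum(clo_attainment[c] * w for c, w in clos)
    status = "met" if attainment >= TARGET else "action needed"
    print(f"{so}: {attainment:.1f}% ({status})")
```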

Journal ArticleDOI
TL;DR: This study proposes a mobile agent-based, energy-efficient resource management solution that also protects IoT appliances and, by exploring rule-based conditions, offers an energy-efficient recommendation system.
Abstract: The Internet of Things (IoT) and sensor technologies are combined with various communication networks in smart appliances and perform a significant role. Connected devices sense, analyze, and send environmental data, as well as support applications' connections. Mobile agents can be explored to provide sensing intelligence in IoT-based systems. Many strategies have been proposed to address the issue of energy efficiency while maintaining the sensor load at a low cost; however, advancements are still desired. Furthermore, without fully trustworthy relationships, sensitive data are at risk, and the solution must provide privacy protection against unexpected events. Through the development of two algorithms, this study proposes a mobile agent-based, efficient energy resource management solution that also protects IoT appliances. First, the software agents make decisions using past and present percepts and, by exploring rule-based conditions, offer an energy-efficient recommendation system. Second, data from IoT appliances are securely evaluated on edge interfaces before being transferred to end-centers for verification. Simulation-based tests were conducted and verified the significance of the proposed protocol against other studies in terms of network metrics.

Journal ArticleDOI
TL;DR: In this paper, an outlier detection model for financial crisis prediction using a political optimizer-based deep neural network (OD-PODNN) is presented, which makes use of the isolation forest (iForest)-based approach.
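Only the TL;DR is shown for this entry, so the sketch below illustrates just the iForest component it names, using scikit-learn's IsolationForest on synthetic financial ratios; the political-optimizer-tuned DNN stage is omitted, and the features and contamination rate are assumptions.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(5)

# Synthetic financial ratios: mostly healthy firms plus a few anomalies.
healthy = rng.normal([1.5, 0.4], [0.3, 0.1], size=(300, 2))
distressed = rng.normal([0.3, 1.2], [0.2, 0.3], size=(12, 2))
X = np.vstack([healthy, distressed])

iforest = IsolationForest(contamination=0.05, random_state=0).fit(X)
labels = iforest.predict(X)          # -1 = outlier, 1 = inlier
print("flagged outliers:", int((labels == -1).sum()))
```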


Journal ArticleDOI
TL;DR: This study introduces a novel Artificial Fish Swarm Algorithm with Weighted Extreme Learning Machine (AFSA-WELM) model for cybersecurity on social media and achieves maximum precision–recall performance with various datasets.
Abstract: The progress of data technology and wireless networks has generated open online communication channels. Unfortunately, trolls are abusing the technology to execute cyberattacks and threats. An automated cybersecurity solution is vital for avoiding threats and security issues on social media, and it is a requirement for tackling cyberbullying in various aspects, including the prevention of such incidents and automated detection. This study introduces a novel Artificial Fish Swarm Algorithm with Weighted Extreme Learning Machine (AFSA-WELM) model for cybersecurity on social media. The proposed model is mostly intended to detect the existence of cyberbullying on social media. It starts by processing the dataset and making it ready for the next stages of the model. It then uses the TF-IDF vectorizer for word embedding, after which it uses the WELM model for the identification and classification of cyberbullying. Finally, the optimal tuning parameters of the WELM model are derived with the AFSA. The experimental analysis has shown that the proposed model achieves maximum accuracy compared with existing algorithms. Moreover, our proposed model achieves maximum precision-recall performance with various datasets.
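A weighted extreme learning machine is not available in scikit-learn, so the sketch below keeps the TF-IDF stage the abstract names and substitutes a plain class-weighted linear classifier for the WELM (a stand-in, not the paper's model); the posts and labels are invented placeholders.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented placeholder posts; 1 = cyberbullying, 0 = benign.
posts = [
    "you are so dumb nobody likes you",
    "great game last night, congrats team",
    "shut up loser, go away",
    "thanks for the helpful answer!",
]
labels = [1, 0, 1, 0]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),          # word-embedding stage
    LogisticRegression(class_weight="balanced"),  # stand-in for the WELM
)
model.fit(posts, labels)
print(model.predict(["nobody likes you, loser"]))  # expected: [1]
```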


Journal ArticleDOI
TL;DR: The study suggests a novel content-efficient decision-aware task scheduling (CATSA) method that defines and adapts to complicated environmental changes, outperforms current studies regarding workflow execution quality of service, and improves makespan and deadline adherence.
Abstract: Sensor-aware distributed workflow applications are becoming increasingly popular underwater. These applications are marine operations that generate data and process it based on its characteristics. Mobile-fog-cloud computing paradigms, together with computing nodes such as sensor nodes, have emerged, and these nodes can be combined into a single system to achieve several goals. Many factors are considered, including network contents, workload fluctuation, variable execution durations, deadlines, and bandwidth. As a result, scheduling mobile workflow systems with multiple parameters can be challenging. The study suggests a novel content-efficient decision-aware task scheduling (CATSA) method for defining and adapting to complicated environmental changes. CATSA consists of several components that work together to handle various benchmarks in the system, including a decision planner, sequencing, and scheduling. As evidenced by the test findings, the suggested architecture outperforms current studies regarding workflow execution quality of service, improving the makespan by 30% and deadline adherence by 40%.
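CATSA's internals are not detailed in the abstract, so the sketch below shows a minimal decision-planner idea under stated assumptions: independent tasks sorted by deadline (earliest-deadline-first) are placed on whichever of the sensor/fog/cloud tiers finishes them soonest, and makespan plus deadline misses are reported. The tasks, node speeds, and greedy rule are all illustrative.

```python
# Illustrative workflow tasks: (name, runtime_s, deadline_s).
tasks = [("sense", 4, 10), ("filter", 6, 14), ("fuse", 3, 9), ("upload", 5, 20)]
node_free_at = {"sensor": 0.0, "fog": 0.0, "cloud": 0.0}
# Relative node speeds (assumption): cloud fastest, sensor slowest.
speed = {"sensor": 0.5, "fog": 1.0, "cloud": 2.0}

missed, finish_times = 0, []
for name, runtime, deadline in sorted(tasks, key=lambda t: t[2]):  # EDF order
    # Pick the node where this task would finish earliest.
    node = min(node_free_at, key=lambda n: node_free_at[n] + runtime / speed[n])
    node_free_at[node] += runtime / speed[node]
    finish = node_free_at[node]
    finish_times.append(finish)
    missed += finish > deadline
    print(f"{name:>6} -> {node:<6} finishes at {finish:.1f}s (deadline {deadline}s)")

print("makespan:", max(finish_times), "s; deadlines missed:", missed)
```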

Journal ArticleDOI
TL;DR: In this paper, the authors presented how a patient may send his vital signs to the physician through the Internet without meeting the latter in person, and how blockchain is used with a smart contract to secure private patient records.
Abstract: Nowadays, Internet of Medical Things (IoMT) technology is growing and leading the revolution in the global healthcare field. Information exchanged through IoMT permits attackers to hack or modify a patient's data, so it is of critical importance to ensure the security and privacy of this information. Standard privacy techniques are not secure enough, so this paper introduces blockchain technology for securing the data. Blockchain is used with a smart contract to secure private patient records. This paper presents how a patient may send his vital signs to the physician through the Internet without meeting the latter in person. These vital signs are collected from the IoMT system that we developed previously. In the proposed method, each medical record is stored in a block and connected to the previous block by a hash function. To secure each new block, the SHA256 algorithm is used; we modified the SHA256 pipeline by using run-length coding to compress the data. If any hacker attempts to attack a medical record, he must change all previous blocks. To preserve the rights of the doctor and patient, a smart contract is built into the blockchain system. When the transaction begins, the smart contract withdraws the money from the patient's wallet and holds it; when the physician sends the treatment to the patient, the smart contract transfers the money to the physician. This paper shows that all recent work implements Blockchain 2.0 in the security system, and that our security system can create a new block with O(n + d) time complexity. As a result, our system can create one hundred blocks in two minutes. Additionally, our system can promptly transfer the money from the patient's wallet to the physician's wallet. This paper also shows that our method performs better than subsequent versions of the original blockchain.
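The modified SHA256-with-run-length-coding step is described only at a high level, so the sketch below shows the general shape under assumptions about the details: each block run-length-encodes its record, hashes it together with the previous block's hash, and verification walks the chain so that tampering with any record breaks every later link.

```python
import hashlib

def rle(data: bytes) -> bytes:
    """Simple run-length encoding, standing in for the paper's compressor."""
    out, i = bytearray(), 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i] and j - i < 255:
            j += 1
        out += bytes([j - i, data[i]])   # (run length, byte value) pairs
        i = j
    return bytes(out)

def make_block(prev_hash: str, record: bytes) -> dict:
    """Hash the compressed record together with the previous block's hash."""
    payload = rle(record) + bytes.fromhex(prev_hash)
    return {"record": record, "prev": prev_hash,
            "hash": hashlib.sha256(payload).hexdigest()}

chain = [make_block("00" * 32, b"genesis")]
for vitals in [b"hr=72,spo2=98", b"hr=75,spo2=97"]:
    chain.append(make_block(chain[-1]["hash"], vitals))

# Recompute every link: altering any record invalidates the rest of the chain.
ok = all(make_block(chain[k - 1]["hash"], chain[k]["record"])["hash"]
         == chain[k]["hash"] for k in range(1, len(chain)))
print("chain valid:", ok)
```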

Journal ArticleDOI
TL;DR: In this article, the authors propose a joint resource allocation and offloading decision optimization problem to minimize network latency and total energy usage for connected autonomous vehicular networks, where vehicles are assembled into clusters in which a vehicle can transmit tasks to another vehicle, while the VEC server is used for processing the data.
Abstract: Vehicle Edge Computing (VEC)-assisted computational offloading brings cloud computing closer to user equipment (UEs) at the edge of the access network by delivering various services to UEs with limited processing power and battery. However, in fifth-generation and beyond-5G (B5G) networks, where UEs' service requests and locations change dynamically, static edge-server deployments may lead to an increase in latency and total energy consumption. This paper presents a latency- and energy-aware, efficient task offloading scheme for connected autonomous vehicular networks. Firstly, vehicles are assembled into clusters, in which a vehicle can transmit tasks to another vehicle, while the VEC server is used for processing the data. We developed a joint resource allocation and offloading decision optimization problem to minimize network latency and total energy usage. Due to the non-convex character of the optimization problem, we employed a Markov decision process (MDP) to convert it to a reinforcement learning (RL) problem. Then, we used a soft actor-critic-based scheme to obtain the optimal policy for resource allocation and task offloading, reducing the total latency and energy consumption for connected autonomous vehicles. Simulation analysis reveals that the proposed scheme attains 46.6% and 17.2% lower delay, and consumes 28.8% and 20.0% less energy, than the Hybrid DRL with Genetic Algorithm (HDRL-GA) and DRL-based collaborative Data Scheduling (DRL-CDSS) state-of-the-art schemes.
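A full soft actor-critic agent is beyond a short sketch, so the code below only sets up the quantities the abstract says are optimized: per-task latency and energy for local, neighbor-vehicle, and VEC-server execution, combined into the reward an RL agent would maximize. All cost numbers and the latency/energy weighting are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative per-option costs for one task (latency in ms, energy in mJ).
options = {
    "local":    {"latency": 120.0, "energy": 90.0},
    "neighbor": {"latency": 80.0 + rng.uniform(0, 30), "energy": 55.0},
    "vec":      {"latency": 40.0 + rng.uniform(0, 60), "energy": 35.0},
}
w_latency, w_energy = 0.6, 0.4   # weighting is an assumption

def reward(opt):
    """Negative weighted cost: what a soft actor-critic agent would maximize."""
    return -(w_latency * opt["latency"] + w_energy * opt["energy"])

best = max(options, key=lambda k: reward(options[k]))
for name, opt in options.items():
    print(f"{name:>8}: reward {reward(opt):7.1f}")
print("chosen offloading action:", best)
```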



Journal ArticleDOI
TL;DR: In this article, the authors introduce a special section on the security and privacy of medical data and reports in smart healthcare applications, focusing on how to ensure the integrity of electronic medical records and medical reports.
Abstract: The thirteen papers in this special section focus on the development of smart health monitoring systems. The rapid growth of smart healthcare applications is made possible by advancements in the Internet of Things (IoT). IoT collects data related to healthcare, public safety, energy, and behavior monitoring that provides useful information to a variety of applications. For example, in a smart health monitoring and improvement (SHAMI) system, healthcare professionals share the medical data of patients for a better and quicker diagnosis of a disease. The integrity of electronic medical records and medical reports is necessary because modified, altered, or corrupted medical data can lead to wrong diagnoses and create serious health issues for any individual. In addition, privacy leakage and medical identity theft are growing concerns and have contributed to a large number of e-fraud cases across the world. Therefore, this special issue focuses on the security and privacy of medical data and reports in smart healthcare applications.

Journal ArticleDOI
TL;DR: In this article, the authors propose a tunicate swarm algorithm that combines a long short-term memory recurrent neural network (LSTM-RNN) model with a novel pre-processing step to transform the input data into a usable format.
Abstract: The Internet of Things (IoT) paradigm has matured and expanded rapidly across many disciplines. Despite these advancements, IoT networks continue to face an increasing security threat as a result of the constant and rapid changes in the network environment. To address these vulnerabilities, the fog system is equipped with a robust environment that provides additional tools to strengthen data security. However, numerous attacks persistently evolve in IoT and fog environments as new breaches develop. To improve the efficiency of intrusion detection in the Internet of Things (IoT), this research introduces a novel tunicate swarm algorithm combined with a long short-term memory recurrent neural network (LSTM-RNN). The presented model first applies data pre-processing to transform the input data into a usable format; attacks in the IoT ecosystem are then identified using the LSTM-RNN model. In ANN models, there is a strong correlation between the number of parameters and the model's capability and complexity, so it is critical to track the number of parameters in each layer to avoid over- or under-fitting; one way to prevent this is to adjust the number of layers in the network. The tunicate swarm algorithm (TSA) is used to fine-tune the hyperparameter values of the LSTM-RNN model to improve its detection performance. TSA solves several problems that could not be solved with traditional optimization methods, improves performance, and shortens the time the algorithm takes to converge. A series of tests were performed on benchmark datasets. Compared to related models, the proposed TSA-LSTMRNN model achieved 92.67%, 87.11%, and 98.73% for accuracy, recall, and precision, respectively, which indicates the superiority of the proposed model.
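The abstract names the LSTM-RNN classifier but not its architecture, so the Keras sketch below builds a plausible stand-in on synthetic traffic windows; the input shape, layer sizes, and training settings are the kind of hyperparameters the TSA would tune, and all are assumptions rather than the paper's configuration.

```python
# Requires TensorFlow (pip install tensorflow); shapes and units are assumptions.
import numpy as np
import tensorflow as tf

# Synthetic stand-in for pre-processed traffic records:
# 1000 windows of 10 time steps x 20 features, with binary attack labels.
X = np.random.rand(1000, 10, 20).astype("float32")
y = np.random.randint(0, 2, 1000)

def build_model(units=64, layers=1, lr=1e-3):
    """LSTM-RNN classifier; units/layers/lr are what the TSA would tune."""
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=(10, 20)))
    for i in range(layers):
        model.add(tf.keras.layers.LSTM(units, return_sequences=i < layers - 1))
    model.add(tf.keras.layers.Dense(1, activation="sigmoid"))
    model.compile(optimizer=tf.keras.optimizers.Adam(lr),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model

model = build_model()
model.fit(X, y, epochs=2, batch_size=64, validation_split=0.2, verbose=0)
print("accuracy on the synthetic data:", model.evaluate(X, y, verbose=0)[1])
```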