
Showing papers in "Algorithms for intelligent systems in 2022"


Book ChapterDOI
TL;DR: In this paper, a novel hybrid algorithm developed by merging atom search optimization (ASO) and Nelder-Mead (NM) simplex search algorithms is presented, which is the first reported work on combining ASO and NM methods for optimization problems.
Abstract: A novel hybrid algorithm developed by merging atom search optimization (ASO) and Nelder-Mead (NM) simplex search algorithms is presented. The proposed improved algorithm (ASO-NM) is the first reported work on combining ASO and NM methods for optimization problems. The combination of ASO and NM leads to the construction of the desired metaheuristic approach that has balanced exploration and exploitation. The proposed hybrid ASO-NM was used for optimizing a proportional-integral-derivative controller design for automobile cruise control systems, as well as for testing four well-known classical benchmark functions for the first time. The obtained statistical and transient-response analyses and comparisons show the superior capability of the proposed hybrid ASO-NM algorithm, which can serve as an effective approach for further optimization problems.
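As a hedged illustration (not the paper's actual ASO-NM update rules, which combine atom-dynamics moves with Nelder-Mead simplex steps), the exploration/exploitation split that such hybrids balance can be sketched on a classical benchmark function:

```python
import random

def sphere(x):
    # Classical benchmark: global minimum 0 at the origin.
    return sum(v * v for v in x)

def hybrid_optimize(f, dim=4, pop=30, iters=200, seed=1):
    """Global random exploration followed by greedy local exploitation.

    This only illustrates the exploration/exploitation pattern that
    hybrids such as ASO-NM balance; the real algorithm uses atom-dynamics
    updates plus Nelder-Mead simplex reflections and shrinks.
    """
    rng = random.Random(seed)
    # Exploration: sample candidate "atoms" uniformly in [-5, 5]^dim.
    best = min(([rng.uniform(-5, 5) for _ in range(dim)]
                for _ in range(pop)), key=f)
    # Exploitation: coordinate-wise refinement with a shrinking step.
    step = 1.0
    for _ in range(iters):
        improved = False
        for i in range(dim):
            for delta in (step, -step):
                cand = list(best)
                cand[i] += delta
                if f(cand) < f(best):
                    best, improved = cand, True
        if not improved:
            step *= 0.5  # shrink the step, loosely mimicking simplex shrinkage
    return best, f(best)

x_best, f_best = hybrid_optimize(sphere)
```

The point of the hybridization is visible in the two phases: the population sampling avoids bad basins, while the local phase drives the candidate to high precision.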

17 citations


Journal ArticleDOI
TL;DR: In this paper, a preliminary study of the present status of blockchain adoption in the educational sector is presented, which aims to increase public awareness of blockchain technology whilst emphasising the advantages, risks, difficulties, and hazards associated with its use in educational contexts.
Abstract: This is a preliminary study of the present status of blockchain adoption in the educational sector. The aim of this study is to increase public awareness of blockchain technology whilst emphasising the advantages, risks, difficulties, and hazards associated with its use in educational contexts. This article further analyses the fundamental technological principles and application characteristics of blockchain technology and proposes a blockchain-based solution to the issues associated with online education. Through the use of blockchain technology, a student’s accomplishments may be monitored more precisely whilst remaining anonymous. Additionally, this ensures that companies have trustworthy digital certificates that safeguard their intellectual property whilst allowing for learning via the use of intelligent contracts. According to the findings, blockchain technology has tremendous potential for improving education in general, and online education in particular. The study’s findings shall inspire the creation of a decentralised online education system.
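The tamper-evidence property that makes blockchain attractive for credential records can be sketched with a toy hash-linked ledger (the student identifier and course records below are hypothetical, and a real deployment would add consensus, signatures, and smart contracts):

```python
import hashlib
import json

def make_block(record, prev_hash):
    """Append-only record linked to its predecessor by a SHA-256 hash."""
    body = {"record": record, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {"body": body, "hash": digest}

def verify_chain(chain):
    """A chain is valid if every block's hash matches its body and links back."""
    prev = "0" * 64
    for block in chain:
        if block["body"]["prev_hash"] != prev:
            return False
        recomputed = hashlib.sha256(
            json.dumps(block["body"], sort_keys=True).encode()).hexdigest()
        if recomputed != block["hash"]:
            return False
        prev = block["hash"]
    return True

# Record two (hypothetical) course achievements for an anonymised student.
chain = []
prev = "0" * 64
for record in ({"student": "anon-42", "course": "Maths", "grade": "A"},
               {"student": "anon-42", "course": "Physics", "grade": "B"}):
    block = make_block(record, prev)
    chain.append(block)
    prev = block["hash"]
```

Altering any past grade changes that block's hash and breaks every later link, which is why achievements can be "monitored more precisely whilst remaining anonymous".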

13 citations


Journal ArticleDOI
TL;DR: In this article, a hybrid approach between the Harris hawks optimization metaheuristic and a deep neural network machine learning model is proposed for intrusion detection and is tested against the well-known NSL-KDD and KDD Cup 99 Kaggle datasets.
Abstract: Intrusion detection systems attempt to identify assaults while they occur or after they have occurred; they detect abnormal behavior in a network of computer systems in order to identify whether the activity is hostile or unlawful, allowing a response to the violation. Intrusion detection systems gather network traffic data from a specific location on the network or computer system and utilize it to safeguard hardware and software assets against malicious attacks. These systems employ high-dimensional datasets with a large number of redundant and irrelevant features and a large number of samples. One of the most significant challenges in this domain is the analysis and classification of such a vast amount of heterogeneous data, which makes the utilization of machine learning models necessary. The method proposed in this paper represents a hybrid approach between the recently devised yet well-known Harris hawks optimization metaheuristic and a deep neural network machine learning model. Since the basic Harris hawks optimization exhibits some deficiencies, its improved version is used for dimensionality reduction, followed by classification executed by the deep neural network model. The proposed approach is tested against the well-known NSL-KDD and KDD Cup 99 Kaggle datasets. Comparative analysis with other similar methods proved the robustness of the presented technique when metrics such as accuracy, precision, recall, and F1-score are taken into account.
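A minimal sketch of the wrapper-style dimensionality reduction the paper describes, with plain random search standing in for the improved Harris hawks optimizer and a nearest-centroid classifier standing in for the deep neural network (all data here is synthetic):

```python
import random

def nearest_centroid_accuracy(X, y, mask):
    """Accuracy of a nearest-centroid classifier using only masked features."""
    feats = [i for i, keep in enumerate(mask) if keep]
    if not feats:
        return 0.0
    def centroid(label):
        rows = [x for x, lab in zip(X, y) if lab == label]
        return [sum(r[i] for r in rows) / len(rows) for i in feats]
    cents = {lab: centroid(lab) for lab in set(y)}
    def predict(x):
        return min(cents, key=lambda lab: sum((x[i] - c) ** 2
                   for i, c in zip(feats, cents[lab])))
    return sum(predict(x) == lab for x, lab in zip(X, y)) / len(y)

def select_features(X, y, n_feats, trials=200, alpha=0.01, seed=0):
    """Random search over binary feature masks; a stand-in for the
    improved Harris hawks optimizer used in the paper."""
    rng = random.Random(seed)
    def fitness(mask):
        # Reward accuracy, lightly penalise the number of kept features.
        return nearest_centroid_accuracy(X, y, mask) - alpha * sum(mask)
    best = [1] * n_feats
    for _ in range(trials):
        cand = [rng.randint(0, 1) for _ in range(n_feats)]
        if fitness(cand) > fitness(best):
            best = cand
    return best

# Toy data: feature 0 separates the classes; features 1-3 are pure noise.
rng = random.Random(7)
X = [[lab * 10 + rng.gauss(0, 1)] + [rng.gauss(0, 5) for _ in range(3)]
     for lab in (0, 1) for _ in range(20)]
y = [lab for lab in (0, 1) for _ in range(20)]
mask = select_features(X, y, n_feats=4)
```

The fitness penalty on mask size is what makes the search discard the redundant and irrelevant features the abstract mentions.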

13 citations


Journal ArticleDOI
TL;DR: In this paper, a support vector machine (SVM) model tuned with the sine cosine algorithm is used to predict the future behaviour of cryptocurrency market prices; the basic sine cosine algorithm is enhanced with a simple exploration mechanism and compared with other techniques run on identical sets of data.
Abstract: For crypto investors, anticipating market behaviour is critical: they make judgements that result in profit or loss based on the prediction. The prediction often involves the use of previous data to estimate the future behaviour of market prices. A machine learning methodology is used for prediction. In recent years, nature-inspired algorithms have been effectively employed in the optimization of several machine learning models. Swarm metaheuristic algorithms, a group of nature-inspired algorithms, have shown to be outstanding optimization algorithms in the field of machine learning and a variety of other practical applications. This work provides one such methodology, namely improving the support vector machine (SVM) model by using an improved version of the sine cosine algorithm (SCA) to anticipate cryptocurrency values. The basic SCA was enhanced with a simple exploration mechanism and then evaluated by comparing it to other techniques run on identical sets of data. The findings of the conducted experiment show that the suggested strategy outperformed the other alternatives considered in the study.
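The core SCA position update is simple enough to sketch; this toy version minimises a sphere function and omits the paper's added exploration mechanism (in the actual study, the optimized variables would be SVM hyperparameters):

```python
import math
import random

def sine_cosine_optimize(f, dim=2, pop=20, iters=300, seed=3):
    """Basic sine cosine algorithm (SCA).

    Each agent moves toward (or around) the best-known position P via
    X_j := X_j + r1 * sin(r2) * |r3 * P_j - X_j|  (or cos, chosen by r4),
    where r1 decays over iterations to shift from exploration to
    exploitation.
    """
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    best = list(min(X, key=f))
    a = 2.0
    for t in range(iters):
        r1 = a - t * (a / iters)  # linearly decaying step scale
        for x in X:
            for j in range(dim):
                r2 = rng.uniform(0, 2 * math.pi)
                r3 = rng.uniform(0, 2)
                move = r1 * (math.sin(r2) if rng.random() < 0.5 else math.cos(r2))
                x[j] = x[j] + move * abs(r3 * best[j] - x[j])
            if f(x) < f(best):
                best = list(x)
    return best, f(best)

best, value = sine_cosine_optimize(lambda x: sum(v * v for v in x))
```

The decaying r1 is the standard SCA mechanism; the paper's improvement adds extra exploration on top of this, which is not reproduced here.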

10 citations


Book ChapterDOI
TL;DR: The CycleGAN deep learning framework has been successfully used for image style transfer in important domains such as medical diagnosis; this paper describes attempts, first of their kind, at using the framework for converting Indian Classical music from one melodic framework, called raga or raag, to another.
Abstract: The CycleGAN deep learning framework has been successfully used for image style transfer in important domains such as medical diagnosis. This paper describes attempts, first of their kind, at using the framework for converting Indian Classical music from one melodic framework, called raga or raag, to another. From the audio samples generated and their visualizations, it is evident that the experiments were reasonably successful in converting music in Hindustani Classical raga to music in Indian Carnatic raga and vice versa. The insights presented in the paper are hoped to inspire further work to revolutionize the use of technology to improvise Indian Classical music.
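The cycle-consistency idea behind CycleGAN can be illustrated with a toy 1-D example (real generators are convolutional networks over spectrogram-like representations of the audio):

```python
def cycle_consistency_loss(G, F, xs, ys):
    """L1 cycle loss: F(G(x)) should reconstruct x and G(F(y)) should
    reconstruct y, so unpaired style transfer preserves content."""
    loss_x = sum(abs(F(G(x)) - x) for x in xs) / len(xs)
    loss_y = sum(abs(G(F(y)) - y) for y in ys) / len(ys)
    return loss_x + loss_y

# Toy 1-D "domains": a well-matched generator pair (scale by 2 and back)
# yields zero cycle loss; a mismatched pair does not.
xs = [0.5, 1.0, 2.0]
ys = [1.0, 3.0, 5.0]
good = cycle_consistency_loss(lambda x: 2 * x, lambda y: y / 2, xs, ys)
bad = cycle_consistency_loss(lambda x: 2 * x, lambda y: y / 3, xs, ys)
```

Minimising this loss alongside the adversarial losses is what lets CycleGAN map between Hindustani and Carnatic renditions without paired training examples.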

9 citations


Book ChapterDOI
TL;DR: In this article, an overview of IoT security is explained in plain terms, and then an experimental DoS attack over IoT is analyzed.
Abstract: The IoT industry is growing rapidly worldwide. The expansion of this industry has to be taken seriously, as IoT devices will take a huge part in our daily life. Presently, IoT devices are being used to create smart homes, automobiles, wearables, the industrial Internet, and others. IoT devices are built to enhance user experience, inventive services, and people’s lifestyles. The continuing rise of the IoT world makes it a desirable target for cyber-criminals: the weak security levels applied in IoT devices and the vulnerabilities that exist in them invite attackers to exploit. The attack most used by cyber-criminals is the DoS attack, which causes IoT systems to shut down. Since IoT systems consist of two or more devices, they must be dealt with carefully. In this paper, an overview of IoT security is explained in plain terms, and an experimental DoS attack over IoT is analyzed.
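A minimal sketch of how a volumetric DoS flood against an IoT device can be spotted from request rates; the addresses, window, and threshold below are hypothetical, and the paper's actual experiment is not reproduced here:

```python
from collections import Counter

def flag_dos_sources(events, window, threshold):
    """Flag source addresses whose request rate within a sliding time
    window exceeds a threshold: a minimal volumetric-DoS heuristic.

    `events` is a list of (timestamp, source_ip) pairs; a real IDS would
    stream packets and use adaptive baselines instead of a fixed cutoff.
    """
    flagged = set()
    for t, src in events:
        in_window = Counter(s for ts, s in events if t - window < ts <= t)
        if in_window[src] > threshold:
            flagged.add(src)
    return flagged

# Hypothetical traffic: one chatty attacker, two normal IoT devices.
events = ([(t * 0.01, "10.0.0.66") for t in range(500)] +   # flood
          [(float(t), "10.0.0.2") for t in range(5)] +
          [(float(t), "10.0.0.3") for t in range(5)])
attackers = flag_dos_sources(events, window=1.0, threshold=50)
```

The point the abstract makes, that a flood simply overwhelms the device, is why rate-based detection is the usual first line of defence.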

9 citations


Book ChapterDOI
TL;DR: In this paper, a transformer model comprising encoders and decoders is adapted by tuning the different parameter sets to identify the best performing model for Bangla-English translation, which outperformed some prominent existing MT methods.
Abstract: Bangla is a widely spoken language, but unfortunately, very little research in Machine Translation (MT) for Bangla has been reported in the literature. This research aims at developing an MT system for Bangla–English translation. MT is language-dependent, as data preparation differs from language to language. Moreover, the vital part of MT is a model that requires training to fit the particular language pair along with its grammar and phrase rules. A modern deep learning-based transformer model has been used for this language pair, as it has worked well for other language pairs. A transformer model comprising encoders and decoders is adapted by tuning different parameter sets to identify the best-performing model for Bangla–English translation. The proposed model is tested on a benchmark Bangla–English corpus, on which it outperformed some prominent existing MT methods.
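The heart of the transformer's encoder and decoder blocks is scaled dot-product attention, which can be sketched in a few lines (toy vectors, not the paper's trained Bangla–English model):

```python
import math

def softmax(row):
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention, the core transformer operation:
    softmax(Q K^T / sqrt(d_k)) V."""
    d_k = len(K[0])
    scores = [[sum(q * k for q, k in zip(qr, kr)) / math.sqrt(d_k)
               for kr in K] for qr in Q]
    weights = [softmax(row) for row in scores]
    return [[sum(w * v[j] for w, v in zip(wr, V)) for j in range(len(V[0]))]
            for wr in weights]

# Three 4-dimensional "token" vectors attend over themselves (self-attention).
X = [[1.0, 0.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, 0.0],
     [0.0, 0.0, 1.0, 1.0]]
out = attention(X, X, X)
```

The parameter sets the paper tunes (layer counts, heads, dimensions) all scale this one operation up and stack it.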

8 citations


Book ChapterDOI
TL;DR: In this article, the exact issues and limitations that are stopping Ubicomp from becoming a reality are identified and discussed, along with potential solutions to them, which are being rapidly developed.
Abstract: The rapid developments in IoT, wireless technology, and mobile computing devices imply that ubiquitous computing is not a far-off dream from the present. But while it is not wrong to assume that ubiquitous computing will be achievable and feasible in the future, the concept is still plagued by a very high number of limitations. These numerous limitations are mainly related to each device’s hardware, security concerns, and energy maintenance, the cost and usability of the ubiquitous computing environment, connectivity, and complex user interfaces. This paper pinpoints the exact issues and limitations that are stopping UbiComp from becoming a reality. Along with finding these issues, we have also researched and mentioned potential solutions to them, which are being rapidly developed. Additionally, we have discussed a comparative study of these solutions, their limitations, and what can be improved. Toward the end, we have added our suggestions on what can be done to overcome the obstacles for UbiComp, and the future possibilities for the technology.

7 citations


Book ChapterDOI
TL;DR: In this paper, the authors present a survey on different architectures, tools, and techniques used for state-of-the-art wireless network on chip (WiNoC) processors.
Abstract: The key challenge SoC designers face concerns performance and energy consumption. To address this, the network on chip (NoC) evolved to accommodate greater numbers of IP cores on a die while empowering conventional technology. Intensive research has been going on globally, keeping in mind the major parameters: the latency, power consumption, bandwidth, and interconnect routing problems of conventional NoCs. Wireless network on chip (WiNoC), with on-chip antennas, routers, and transceivers, is a promising trend in NoC for enhancing performance and energy in multicore processors. This paper presents a survey of the different architectures, tools, and techniques used for state-of-the-art WiNoCs. Researchers have designed and proposed several architectures and algorithms for different wireless hubs, traffic scenarios, and data injection rates with diverse IP cores, which yield significantly improved results in terms of throughput, latency, and reliability.

6 citations


Book ChapterDOI
TL;DR: Gupta et al. propose a lattice-based mutual authentication and key agreement protocol for smart grid which permits secure communication between the service provider and the smart meter and also allows the service provider to initiate communication with smart meters in the smart grid system.
Abstract: Internet of Things (IoT) has been adopted in various communication technologies. In recent years, as an application of IoT, the smart grid has attracted great attention from industry and academia. A smart grid can establish the sharing of information among smart meters and the service provider via Internet protocol (IP)-based communication. This information sharing among various entities in the smart grid system may make the system vulnerable to various security and privacy issues. Various security protocols exist in the literature that provide secure communication of smart meters with the service provider by establishing a secure connection. However, many of them fail to propose a security protocol where the service provider can share information with different smart meters and can negotiate a session key with each smart meter for further communication. Further, the existing schemes are exposed to quantum attack, as they use traditional cryptographic tools in their implementation. Considering these issues, we design a lattice-based mutual authentication and key agreement protocol for smart grid which permits secure communication between the service provider and the smart meter. The proposed protocol also allows the service provider to initiate communication with smart meters in the smart grid system. The security analysis shows the resistance of the proposed scheme against existing as well as quantum attacks.
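As a loose illustration of the learning-with-errors (LWE) idea underlying lattice-based key agreement: two parties can compute values that differ only by small error terms, which rounding can turn into a shared key. This is a toy with deliberately insecure parameters, not the paper's protocol:

```python
import random

rng = random.Random(2022)
q, n = 2 ** 15, 8  # toy modulus and dimension; far too small for security

def small():
    # Small secret/error vectors with entries in {-1, 0, 1}.
    return [rng.choice((-1, 0, 1)) for _ in range(n)]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) % q for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

# Public matrix A shared by both parties.
A = [[rng.randrange(q) for _ in range(n)] for _ in range(n)]

# Alice: secret s, small error e, public b = A s + e (mod q).
s, e = small(), small()
b = [(x + err) % q for x, err in zip(matvec(A, s), e)]
# Bob: secret r, small error e2, public u = A^T r + e2 (mod q).
r, e2 = small(), small()
u = [(x + err) % q for x, err in zip(matvec(transpose(A), r), e2)]

# Both inner products equal s^T A^T r plus small cross-error terms,
# so the two values land close together modulo q.
k_alice = sum(si * ui for si, ui in zip(s, u)) % q
k_bob = sum(ri * bi for ri, bi in zip(r, b)) % q
diff = min((k_alice - k_bob) % q, (k_bob - k_alice) % q)
```

Recovering s from (A, b) is an LWE instance, which is believed hard even for quantum computers; that conjectured hardness is what the abstract's quantum-resistance claim rests on.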

5 citations


Journal ArticleDOI
TL;DR: In this article, a model is suggested for predicting the diverse spread of COVID-19, an infection that spreads from person to person in multiple chains; various investigations utilize diverse statistical methods to deliver models that analyze the current state of the pandemic and the losses incurred, which vary from place to place.
Abstract: The novel virus, often called COVID-19, is an infection that spreads from person to person in multiple chains. The novel virus has caused a universal pandemic, and some investigations utilize diverse statistical methods to deliver models in order to analyze the current state of the pandemic and the losses incurred, which vary from place to place. The obtained statistical models depend on diverse aspects, and studies are purely based on possible preferences. In this research, a model is suggested for predicting the diverse spread of COVID-19. Machine learning classifiers like linear regression, multilayer perceptron, and vector autoregression can be chosen to predict the likely patterns of COVID-19 effects in various parts of the world based on their climate, environment, culture, behavior, and socioeconomic factors.
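Of the predictors mentioned, linear regression is the simplest to sketch; the weekly counts below are invented for illustration only:

```python
def linear_fit(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical weekly case counts rising roughly linearly.
weeks = [0, 1, 2, 3, 4, 5]
cases = [10, 14, 19, 25, 29, 35]
a, b = linear_fit(weeks, cases)
forecast_week6 = a * 6 + b
```

Real forecasts of this kind would add the climatic and socioeconomic covariates the abstract lists, or switch to vector autoregression for multiple correlated series.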

Book ChapterDOI
TL;DR: RLO, a range-free localization algorithm using optimization in WSN, defines the distance error factors with upper and lower bounds in a manner such that the unknown node of interest finds its optimum location.
Abstract: The field observation of a sensor node is signified by the location of the event occurrences. The various approaches to estimate the location of the sensor nodes (unknown nodes) approximate distances between sensor pairs by considering the known locations of some of the sensor nodes (anchor nodes), like the DV-hop algorithm and its various successors. Here, the major challenge is to estimate precise distances between the node pairs using average hop sizes and the non-Euclidean shortest paths. This research gap motivates us to propose an algorithm, RLO, a range-free localization using optimization in WSN. RLO defines the distance error factors with upper and lower bounds in a manner such that the unknown node of interest finds its optimum location. To obtain an optimum location, we apply linear programming and transform the localization problem into an optimization problem. The validation of RLO by simulation experimentation shows that RLO localizes the unknown nodes better than the DV-hop and IDV algorithms by 6% and 3%, respectively, on average.
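A rough sketch of the underlying idea, choosing the location that best explains the hop-based distance estimates: here a coarse grid search stands in for the paper's linear program, and the anchor coordinates and 3% distance error are invented:

```python
import math

def localize(anchors, est_dists, span=100, step=1.0):
    """Pick the grid point minimising total absolute range error.

    A stand-in for RLO's linear program: it searches for the location
    whose distances to the anchors best match the (noisy) hop-based
    distance estimates.
    """
    def err(x, y):
        return sum(abs(math.hypot(x - ax, y - ay) - d)
                   for (ax, ay), d in zip(anchors, est_dists))
    best, best_err = None, float("inf")
    y = 0.0
    while y <= span:
        x = 0.0
        while x <= span:
            e = err(x, y)
            if e < best_err:
                best, best_err = (x, y), e
            x += step
        y += step
    return best

# Three anchors; the unknown node is actually near (40, 25), and the
# hop-based estimates carry a 3% overestimate, as DV-hop distances do.
anchors = [(0, 0), (100, 0), (0, 100)]
true = (40.0, 25.0)
est = [math.hypot(true[0] - ax, true[1] - ay) * 1.03
       for ax, ay in anchors]
pos = localize(anchors, est)
pos_error = math.hypot(pos[0] - true[0], pos[1] - true[1])
```

RLO's contribution is precisely in bounding those distance errors so the optimization stays a linear program rather than a brute-force search like this one.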

Book ChapterDOI
TL;DR: In this article, the authors present the different trends that are gaining momentum in the industry post-COVID-19 and discuss the challenges faced by data and analytics during this period.
Abstract: For many industries, 2020 has been a turbulent year, but data analytics has been one field that has observed steady and substantial growth amid economic and market instability. We find ourselves in a very paradoxical position in 2020. A builder cannot properly build a house without the right instruments or materials, and a business cannot make the best decisions without the right data and consumer insights. The rapidly changing demands of customers are forcing businesses in all industries to continually pivot their strategies so as to remain competitive and drive sales, and data and analytics are the best way to do this. On the one side, to survive the global health crisis, we are told to separate ourselves socially. The hyperconnected world, on the other hand, helps us to communicate even more easily and effectively, exchanging knowledge and ideas from the comfort of our homes across the world. This, in turn, urges industries to build tools to help us better explain concepts and encourage each professional to make a valid contribution. Such technologies certainly did not arise just this year, but they are affecting many areas of business and life now, and data analytics is no exception. This chapter presents the different trends that are gaining momentum in the industry post-COVID-19.

Journal ArticleDOI
Prakhar Consul
TL;DR: In this paper, a cell-free massive MIMO system with an unmanned aerial vehicle (UAV) assist for the power optimisation of the network is proposed, where a deep reinforcement learning (DRL)-based deep deterministic policy gradient (DDPG) algorithm is used for solving the power optimisation problems in a centralised fashion.
Abstract: Interference is a very critical issue, especially in future communication networks, because of their complexity and ultra-dense nature. Interference also limits the performance of the communication network in terms of quality of service (QoS) and user net throughput. In order to address this issue, we propose a cell-free massive MIMO system with unmanned aerial vehicle (UAV) assistance for the power optimisation of the network. A deep reinforcement learning (DRL)-based deep deterministic policy gradient (DDPG) algorithm is used for solving the power optimisation problem in a centralised fashion. Multiple neural networks make the learning approach less sophisticated for solving the power optimisation problem. Numerical results show that the proposed scheme achieves higher spectral efficiency (SE) than existing state-of-the-art approaches.
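Assuming the standard DDPG construction with target networks (an assumption; the abstract does not give update details), one of its "multiple neural networks" tricks is the soft target update that stabilises the bootstrapped critic:

```python
def soft_update(target, online, tau):
    """Polyak averaging used by DDPG for its target networks:
    theta_target := tau * theta_online + (1 - tau) * theta_target."""
    return [tau * o + (1 - tau) * t for t, o in zip(target, online)]

# Toy flat "weight vectors": the target network drifts slowly toward the
# online network, so the critic's bootstrap targets change smoothly.
target = [0.0, 0.0, 0.0]
online = [1.0, -2.0, 0.5]
for _ in range(100):
    target = soft_update(target, online, tau=0.05)
```

In a real DDPG power-allocation agent, `target` and `online` would be the parameter tensors of the actor and critic networks rather than toy lists.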

Journal ArticleDOI
TL;DR: In this paper, three benchmark deep convolutional neural networks (CNNs), namely InceptionV3, MobileNet, and Xception, are investigated to find their respective efficacy.
Abstract: Recognition of medicinal plants is very important to enhance plant cultivation, boost production in the medical industry, and protect the plant species from extinction. The plant leaf is a key feature in recognizing the plant. However, standard medicinal plant leaf data sets are scarce. This chapter deals with the development of a standard data set and the recognition of plants from their leaves using a deep learning model, as deep learning models have demonstrated superior recognition accuracy. With this view, three benchmark deep convolutional neural networks (CNNs), namely InceptionV3, MobileNet, and Xception, are investigated to find their respective efficacy. Extensive experiments are performed using the developed data set to recognize 11 medicinal plants from their leaf images. The MobileNet deep CNN architecture shows the best performance based on four evaluation metrics derived from the confusion matrix.
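The metrics "derived from the confusion matrix" presumably include accuracy, precision, recall, and F1; computing them from a hypothetical confusion matrix looks like:

```python
def metrics_from_confusion(cm):
    """Per-class precision, recall, and F1 plus overall accuracy from a
    square confusion matrix cm[true][predicted]."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    accuracy = sum(cm[i][i] for i in range(n)) / total
    per_class = []
    for c in range(n):
        tp = cm[c][c]
        fp = sum(cm[r][c] for r in range(n)) - tp  # predicted c, wrongly
        fn = sum(cm[c]) - tp                       # true c, missed
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        per_class.append((precision, recall, f1))
    return accuracy, per_class

# Hypothetical 3-plant confusion matrix (rows: true class, cols: predicted).
cm = [[18, 1, 1],
      [2, 16, 2],
      [0, 1, 19]]
accuracy, per_class = metrics_from_confusion(cm)
```

For the chapter's 11-class problem, the same function would run on an 11×11 matrix.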

Book ChapterDOI
TL;DR: In this paper, the authors have used an open-source dataset of telecom and banking sectors' customers and predicted churning rates using artificial neural networks (ANNs) and machine learning methodologies like decision tree, random forest, KNN, kernel SVM (K-SVM), naive Bayes, and logistic regression.
Abstract: Customer satisfaction is one of the most crucial elements that dictate the growth rate and success of companies. In the operational environment, customer churn is one of the most serious challenges that make companies lose customers, and hence it is of great concern to industries. Due to its straightforward impact on the revenues of the industries, they are trying to develop and adopt different ways to anticipate customer churn. Various factors that increase customer churning rates can hence be identified, and necessary actions can be taken to lessen it. This work aims to develop and analyze the churning rates in the banking and telecommunication sectors to foresee the customers who might churn. This work has used an open-source dataset of telecom and banking sectors’ customers and predicted churning rates using artificial neural networks (ANNs) and machine learning methodologies like decision tree, random forest, KNN, kernel SVM (K-SVM), naive Bayes, and logistic regression. The models are examined using different evaluation metrics, and the highest-accuracy model is used for churn prediction. For the bank dataset, random forest obtained the highest accuracy of 87.05%, and for the telecom dataset, the artificial neural network obtained the highest accuracy of 81.93%. This work is based on the study of the reasons for customer churning and methods to retain these customers. It will help improve the performance and profit of the companies. Keywords: Neural networks, Customer churn, Churn prediction, Machine learning, Telecom, Bank

Journal ArticleDOI
TL;DR: In this article, a deep learning-based framework for the detection and classification of diabetic retinopathy is introduced, where a new customized convolutional neural network (CNN) model is trained on two different benchmark datasets.
Abstract: Diabetic retinopathy (DR) is one of the common issues of diabetes mellitus that affects the eyesight of humans by causing lesions in their retinas. DR is mainly caused by damage to the blood vessels in the tissue of the retina, and it is one of the leading causes of visual impairment globally. It can even cause blindness if not detected in its early stages. To reduce the risk of eyesight loss, early detection and treatment are necessary. The manual process of DR detection by ophthalmologists requires much effort and time and is also costly. Many computer vision-based techniques reduce the manual effort through the automatic detection of DR. Machine learning is an important subset of computer vision mainly used in medical imaging for the detection of different diseases. This paper introduces a deep learning-based framework for the detection and classification of diabetic retinopathy, where we have trained a new customized convolutional neural network (CNN) model on two different benchmark datasets. This trained CNN model is tested on separate test datasets. The detection performance of the CNN is significantly encouraging, and it can assist doctors and radiologists in the early diagnosis of DR. Keywords: Diabetic retinopathy, Fundus images, Deep learning, Convolutional neural network

Book ChapterDOI
TL;DR: In this article, the authors used the SIFT-GLCM algorithm with SVM, Random Forest, and Logistic Regression classifier models for the detection of cataracts.
Abstract: The World Health Organization reports that one of the world's leading causes of blindness is cataracts. Even though cataracts mainly affect the elderly population, they can now be seen among minors too. Among the various types, three prominent types of cataract affect people in high numbers: nuclear, cortical, and post-subcapsular cataracts. Conventional methods of cataract diagnosis include slit-lamp image tests by doctors, which do not prove effective in classifying cataracts in the early stages and can also be inaccurate in identifying the correct type of cataract. Existing work to automate the process has performed binary detection only or has considered only one of the mentioned types of cataract for further expanding the system. Our system works on the detection of cataracts in an attempt to reduce the errors of manual detection of cataracts in the early stages. From the literature, we analyzed that textural features with improved pre-processing showed satisfactory improvement in the detection rate with different classifiers. Our proposed system successfully classified images as cataract-affected or as a normal eye with an accuracy of 96% using combined feature vectors from the SIFT-GLCM algorithm applied to SVM, Random Forest, and Logistic Regression classifier models. The effect of using SIFT and GLCM separately has also been studied, which leads to comparatively lower accuracies in the trained model.
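GLCM texture features of the kind combined with SIFT descriptors here can be sketched directly; the 4×4 grey-level patches below are toy data:

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalised grey-level co-occurrence matrix for one pixel offset."""
    counts = [[0.0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    total = 0
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                counts[image[y][x]][image[y2][x2]] += 1
                total += 1
    return [[c / total for c in row] for row in counts]

def glcm_features(P):
    """Haralick-style texture features computed from a GLCM."""
    n = len(P)
    contrast = sum(P[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))
    energy = sum(P[i][j] ** 2 for i in range(n) for j in range(n))
    homogeneity = sum(P[i][j] / (1 + abs(i - j))
                      for i in range(n) for j in range(n))
    return contrast, energy, homogeneity

# A smooth patch vs. a striped patch: stripes should show higher contrast.
smooth = [[1, 1, 1, 1]] * 4
stripes = [[0, 3, 0, 3]] * 4
c_smooth, _, _ = glcm_features(glcm(smooth))
c_stripes, _, _ = glcm_features(glcm(stripes))
```

Features like these, concatenated with SIFT descriptors, form the combined vectors fed to the SVM, Random Forest, and Logistic Regression classifiers.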

Book ChapterDOI
TL;DR: A review of the important advances and recent developments made for the survivability of patients with oral cancer (OC) is presented in this paper; the most commonly used technique for predicting the overall survival of patients is machine learning.
Abstract: Background: The present article is an attempt to review the important advances and recent developments made for the survivability of patients with oral cancer (OC). OC is one of the prevailing diseases in those areas where the population is addicted to chewing betel nuts and tobacco and maintains poor oral hygiene. Method: An efficient search of the previous studies is done from various databases. All studies which had investigated the survivability of oral cancer patients during the period from 2005 to 2020 are retrieved. For detecting the overall survival of patients, the most commonly used technique is machine learning. Conclusion: After reviewing the papers, it has been found that the prognosis of oral cancer remains poor. It is important to identify and address the structural and social determinants of oral cancer; without detailed knowledge of these factors, the outcomes of prevention and detection of diseases are ineffective. Early identification and diagnosis of cancer decrease morbidity and mortality rates. Raising public awareness of oral cancer may also help in early diagnosis. Machine learning is mostly in use for predicting the survival of patients using different techniques to improve the prediction accuracy. Keywords: Survivability, Oral cancer, Machine learning, Prediction, Data mining

Book ChapterDOI
TL;DR: In this paper, a methodology to detect fake accounts is developed by employing a variant of the gradient boosting algorithm with decision trees over a set of account attributes.
Abstract: Social media has changed the environment by increasing the number of social media users; the advantage of online social media is that it is an easy way for individuals to communicate in an efficient manner. This opens the door to potential attacks, like fake identities, fake and bot accounts, the spreading of misinformation, etc. According to data shared in a survey, the number of genuine accounts in social media is far smaller than the number of current users, which indicates a growing quantity of fake accounts in recent years. The detection of fake accounts in social media platforms, like Twitter, has become an important task over huge amounts of data. Online social media owners and providers face problems in fake account detection. Old strategies cannot distinguish between real and fake accounts efficiently. New strategies were therefore created to counter different attack approaches, such as bots posting automatic comments and posts, circulating false information, and spreading spam messages in the form of advertisements. These strategies are used to detect fake accounts in online social media. The large increase in fake accounts has reduced the efficiency of classification algorithms such as support vector machines, naive Bayes, and random forest. In this work, a methodology to detect fake accounts is developed by employing a variant of the gradient boosting algorithm with decision trees over a set of account attributes. This should improve overall efficiency and cope with scalability due to the increasing number of social media users. Thus, there is a need for a tool which can identify fake accounts and accurately distinguish between fake and genuine accounts. Keywords: Online social media, Identification of fake accounts, Machine learning, Big data
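A compact sketch of gradient boosting with decision trees, here least-squares boosting over one-level stumps; the account features and labels are invented, and the paper's actual attribute set and boosting variant are not reproduced:

```python
def fit_stump(X, residuals):
    """Best single-feature threshold split minimising squared error."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for thr in sorted({x[f] for x in X}):
            left = [r for x, r in zip(X, residuals) if x[f] <= thr]
            right = [r for x, r in zip(X, residuals) if x[f] > thr]
            if not left or not right:
                continue
            lv, rv = sum(left) / len(left), sum(right) / len(right)
            err = sum((r - lv) ** 2 for r in left) + \
                  sum((r - rv) ** 2 for r in right)
            if err < best_err:
                best_err, best = err, (f, thr, lv, rv)
    return best

def predict_stump(stump, x):
    f, thr, lv, rv = stump
    return lv if x[f] <= thr else rv

def gradient_boost(X, y, rounds=20, lr=0.3):
    """Each stump fits the residuals of the current ensemble, then is
    added with a learning rate -- the core gradient boosting loop."""
    pred = [sum(y) / len(y)] * len(X)
    stumps = []
    for _ in range(rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(X, residuals)
        if stump is None:
            break
        stumps.append(stump)
        pred = [p + lr * predict_stump(stump, x) for p, x in zip(pred, X)]
    base = sum(y) / len(y)
    return lambda x: base + lr * sum(predict_stump(s, x) for s in stumps)

# Toy account features: [posts_per_day, followers]; label 1 = fake.
X = [[50, 2], [40, 5], [60, 1], [45, 3],   # hyperactive, few followers
     [3, 200], [1, 150], [2, 300], [4, 180]]
y = [1, 1, 1, 1, 0, 0, 0, 0]
model = gradient_boost(X, y)
preds = [1 if model(x) > 0.5 else 0 for x in X]
```

Production systems would use a library implementation with regularisation and many more attributes, but the residual-fitting loop is the same idea.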

Book ChapterDOI
TL;DR: In this paper, the authors evaluate popular segmentation techniques for crop disease image classification, weighing their pros and cons on various parameters, and identify parameters and hyper-parameters for developing lightweight models.
Abstract: Machine learning techniques are used for crop disease identification and classification. Considering the remote nature of agriculture, an optimized model needs to be discovered. Traditional handcrafted features lack accuracy compared to the latest Convolutional Neural Network (CNN) models. Automatic feature extraction and classification is the latest research arena. The challenge of extracting leaves from real field images needs to be addressed by an efficient segmentation technique. This paper evaluates popular segmentation techniques with their pros and cons. Computation power and accuracy for image classification are evaluated on various parameters. This work reviews and analyzes various approaches used by researchers to solve plant disease diagnosis challenges. After the review, the need to develop small, fast, and accurate models based on Support Vector Machines and compact neural networks like MobileNet for mobile devices is identified. Parameters and hyper-parameters for developing lightweight models are identified based on present research, and future directions for research are outlined.
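One classic segmentation technique such evaluations typically cover is Otsu thresholding, sketched here on a toy bimodal image (not claimed to be the paper's chosen method):

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: pick the grey level maximising between-class
    variance, which separates leaf from background when the intensity
    histogram is bimodal."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(levels):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                 # mean of class "<= t"
        m1 = (total_sum - sum0) / w1   # mean of class "> t"
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy image: dark "leaf" pixels near 40, bright soil near 200.
pixels = [38, 40, 42, 41, 39] * 20 + [198, 200, 205, 199, 202] * 20
t = otsu_threshold(pixels)
segmented = [1 if p <= t else 0 for p in pixels]  # 1 = leaf
```

Real field images rarely have such clean histograms, which is exactly why the paper weighs more robust segmentation techniques against each other.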

Book ChapterDOI
TL;DR: In this paper, the authors use prediction algorithms to forecast the success of a movie in advance, building a mathematical model based on the movie's budget and its likes and dislikes from YouTube and Twitter, together with a comparison of different classification algorithms.
Abstract: The primary objective of this research work is to use prediction algorithms to forecast the success of a film in advance. In today’s world, movies have a lot of influence on investors; thus, the prediction model may assist in comprehending how well the movie will do at the box office. The goal of this study is to create a prediction model for forecasting the success of a movie in advance by using a mathematical model and mechanism based on the movie’s budget, likes and dislikes from YouTube and Twitter, and a comparison of different classification algorithms. The same dataset was used with five different classifiers, namely K-Nearest Neighbor (K-NN), Decision Tree (DT), Logistic Regression (LR), Support Vector Machine (SVM), and Random Forest (RF). The paper also describes the techniques utilized along with their implementation and application. The models were trained on the dataset with good accuracy, out of which logistic regression was found to be the best.

Journal ArticleDOI
TL;DR: In this paper , the authors focused on Twitter evaluation of public sentiment about COVID-19 vaccinations and found that perspectives are extraordinarily unstructured, heterogeneous, and, in sure cases, positive, negative, or impartial.
Abstract: Social networking Web sites, such as Twitter and Facebook, generate vast volumes of data on a large scale and are rapidly gaining prominence because they let users communicate and articulate their opinions on a number of subjects, engage in conversations with diverse audiences, and post messages all over the world. A great deal of research has been done in the area of sentiment analysis of Twitter data. This project focuses primarily on Twitter analysis of public sentiment about COVID-19 vaccinations and is useful for interpreting the information in tweets, where opinions are highly unstructured, heterogeneous, and, in given cases, positive, negative, or neutral. In terms of scope, we examine the NLP techniques that can be used for sentiment analysis of COVID-19 vaccine tweets, scoring them as positive, negative, or neutral, to see which one fits best. This is only possible by extracting relevant tweets from Twitter in which people express their views on the COVID-19 vaccinations. Twitter is a micro-blogging platform where users can express themselves freely and spontaneously through 140-character tweets. This work can help alleviate public concerns about receiving COVID-19 vaccinations.
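The simplest of the NLP techniques surveyed for positive/negative/neutral scoring is lexicon counting. A minimal sketch (the word lists are illustrative assumptions; real studies typically use tools such as VADER, TextBlob, or trained classifiers):

```python
# Illustrative mini-lexicons; real sentiment lexicons hold thousands of terms.
POSITIVE = {"effective", "safe", "relieved", "grateful", "protected"}
NEGATIVE = {"scared", "sore", "worried", "refuse", "dangerous"}

def polarity(tweet):
    """Label a tweet positive/negative/neutral by counting lexicon hits."""
    words = tweet.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(polarity("Second dose done, feeling safe and grateful"))  # positive
print(polarity("Still worried about side effects"))             # negative
print(polarity("Vaccination centre opens at nine"))             # neutral
```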

Journal ArticleDOI
TL;DR: In this paper , a comparison of the grammatical differences between English and ISL is made, and an interpretation system that uses a recurrent neural network with LSTM units is proposed.
Abstract: Language is a method of communication. Spoken languages have standard rules and grammar and are easy to understand and learn. Understanding, interpreting and learning sign language, however, is difficult because it is not a regular language and is not taught to everyone, being used mainly by the deaf and mute community. This lack of understanding creates a gap between disabled and non-disabled people, because of which many talented disabled people cannot explore and showcase their knowledge in front of the wider community. In this paper, we present a comparison of the grammatical differences between English and Indian Sign Language (ISL). We have also proposed an interpretation system that uses a recurrent neural network with LSTM units.
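At the core of such an interpreter is the LSTM cell. A scalar-weight sketch of one cell step (the weights are arbitrary placeholders; real models use learned weight matrices over gesture feature vectors):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM cell step for scalar input/state.

    w maps each gate name to a (w_x, w_h, bias) triple; real LSTMs use
    matrices, but the gate equations are identical.
    """
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    c = f * c_prev + i * g       # new cell state
    h = o * math.tanh(c)         # new hidden state
    return h, c

w = {k: (0.5, 0.5, 0.0) for k in "figo"}  # placeholder weights
h, c = 0.0, 0.0
for x in [0.2, 0.8, 0.5]:   # e.g. a short sequence of gesture features
    h, c = lstm_step(x, h, c, w)
print(round(h, 3), round(c, 3))
```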

Book ChapterDOI
TL;DR: In this article , the authors presented the process of designing and simulating a circular shaped microstrip patch antenna for WiFi applications, which resonates at a frequency of 5.1 GHz.
Abstract: This paper presents the process of designing and simulating a circular-shaped microstrip patch antenna for WiFi applications. The proposed antenna design resonates at a frequency of 5.1 GHz. This frequency is widely used for various wireless communication systems and WiFi applications. The reflection coefficient for the proposed design of the antenna is found to be below −10 dB for frequencies in the range of 4.9–5.1 GHz. The minimum value of the reflection coefficient is found to be −21.5 dB at the resonating frequency of 5.1 GHz. Various parameters of the proposed antenna design are simulated using the CADFEKO simulation tool, and the detailed analysis is presented in the paper. Aspects of the design and their impact on the parameters are briefly discussed. Results obtained using the simulation tool are found to be in good agreement with the results obtained after testing the actual fabricated antenna.
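The physical radius of a circular patch follows from the standard cavity-model design equation. The sketch below computes it for 5.1 GHz; the substrate values (FR-4, εr = 4.4, h = 1.6 mm) are assumptions, since the abstract does not state them:

```python
import math

def circular_patch_radius_cm(f_hz, eps_r, h_cm):
    """Physical radius of a circular microstrip patch via the standard
    cavity-model design equation. f in Hz, substrate height h in cm;
    returns the radius in cm.
    """
    F = 8.791e9 / (f_hz * math.sqrt(eps_r))  # first-pass radius, cm
    term = (2 * h_cm / (math.pi * eps_r * F)) * (
        math.log(math.pi * F / (2 * h_cm)) + 1.7726
    )
    return F / math.sqrt(1 + term)

# Assumed substrate: FR-4 (eps_r = 4.4, h = 1.6 mm); not given in the paper.
a = circular_patch_radius_cm(5.1e9, 4.4, 0.16)
print(f"patch radius ~ {a * 10:.2f} mm")
```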

Book ChapterDOI
TL;DR: In this article , a machine learning model is developed and infused into a web site which includes various factors to predict obesity and categorize a person's weight into seven distinct weight classes, using machine learning algorithms such as random forest, support vector machine (SVM), decision tree, and KNN.
Abstract: Obesity is considered to be one of the most significant health hazards. It is a condition that affects men and women of all ages and is nowadays a severe health issue that can lead to a variety of conditions such as insulin resistance, lymphoma, high cholesterol, and so on. Early prediction of obesity would therefore be effective in preventing and controlling various diseases and health conditions. In this paper, a machine learning model is developed and integrated into a Web site which uses various factors to predict obesity. Using the machine learning model, we categorize a person’s weight into seven distinct weight classes. Machine learning algorithms such as random forest, support vector machine (SVM), decision tree, and K-nearest neighbor (KNN) have been used, with accuracy scores of 98.48%, 96.21%, 96.96%, and 78.97%, respectively. The various diseases that an individual may develop are also estimated, and remedies are provided based on their weight class. As a result, people can learn about the diseases and risks they may face in the future because of their weight and take appropriate precautions.
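The seven weight classes can be illustrated with BMI-style cut-offs. The sketch below uses WHO-style thresholds as an assumption; the paper's exact class labels and features (it uses several factors beyond BMI) may well differ:

```python
def bmi_class(weight_kg, height_m):
    """Map BMI to one of seven weight classes.

    The cut-offs follow WHO-style BMI categories; they are an illustrative
    assumption, not the classes learned by the paper's models.
    """
    bmi = weight_kg / height_m ** 2
    cuts = [
        (16.0, "severely underweight"), (18.5, "underweight"),
        (25.0, "normal"), (30.0, "overweight"),
        (35.0, "obesity I"), (40.0, "obesity II"),
    ]
    for limit, label in cuts:
        if bmi < limit:
            return label
    return "obesity III"

print(bmi_class(70, 1.75))   # normal (BMI ~ 22.9)
print(bmi_class(120, 1.70))  # obesity III (BMI ~ 41.5)
```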

Book ChapterDOI
TL;DR: In this paper , the authors proposed a new architecture and presented a novel security approach for mobile payments system, which clearly defines the role and responsibilities of each participating entity and takes care of all the concerning issues in mobile payment system.
Abstract: The technological innovations in mobile payment systems have brought a revolutionary change in the way online payments are carried out. The availability of robust and secure infrastructure has further motivated users to execute financial transactions through handheld devices like smartphones and tablets. Determinants like low-cost yet powerful devices, affordable Internet plans and simplified payment interfaces have also contributed to the adoption of mobile payment systems and made them the preferred payment option among mobile users. However, aspects critical to mobile payment processing, like transaction security, data privacy and network reliability, keep raising concerns among mobile users, who remain worried about cyber-attacks like eavesdropping, denial of service, viruses and phishing; together with performance issues like network availability and low bandwidth, these act as a challenge and a hindrance to trusting mobile payment systems and thus limit their usage. Novel mobile payment architectures, models and security techniques have been proposed in the recent past to resolve these issues, where each approach focused on providing a solution to a specific issue. This paper proposes a new architecture and presents a novel security approach for mobile payment systems. The proposed architecture clearly defines the role and responsibilities of each participating entity and takes care of all the concerning issues in a mobile payment system. A multi-factor authorization security approach based on hashing and cryptography has been adopted, which adds an extra security layer before the transaction is executed and always generates a unique OTP. The proposed approach is implemented as a Web application and validated through experimentation. Lastly, the results are empirically analyzed and presented.
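The hashing-based OTP layer can be sketched with an HOTP-style construction. The code below applies RFC 4226's dynamic truncation over HMAC-SHA256 (the secret, the counter scheme and the digest choice are illustrative assumptions, not the paper's exact design):

```python
import hashlib
import hmac
import struct

def generate_otp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP-style one-time password: HMAC the moving counter, then apply
    RFC 4226's dynamic truncation (here over SHA-256 instead of SHA-1).
    """
    msg = struct.pack(">Q", counter)                       # 8-byte counter
    digest = hmac.new(secret, msg, hashlib.sha256).digest()
    offset = digest[-1] & 0x0F                             # dynamic offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

otp1 = generate_otp(b"shared-secret", 1)
otp2 = generate_otp(b"shared-secret", 2)
print(otp1, otp2)  # different counters yield independent codes
```

Because the counter changes on every transaction, each OTP is fresh, while both parties can still recompute it from the shared secret.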

Book ChapterDOI
TL;DR: In this article , the authors developed a real-time blood transfusion system using Artificial Intelligence and Internet of Things (IoT) technology to help people obtain life-saving blood at the right time.
Abstract: The purpose of this research is to help people who need life-saving blood obtain it at the right time by using current technologies. A complete database for real-time blood transfusion has been developed in this research, giving a normal human being immediate access to the required blood using Artificial Intelligence and the Internet of Things. The main objective of this research is the customization of the blood storage refrigerator and the ultra-freezer for plasma component storage, compatible with Internet of Things applications, to improve the timely availability of various blood products and reduce blood wastage. The real-time status of each blood component in each blood bank, including its packing date, is available through Internet of Things-enabled technology. Further, the contact details of donors and their willingness to donate blood are stored in the database. Real-time Global Positioning System monitoring of potential donors helps track donor availability around the area of need, and an Artificial Intelligence-enabled algorithm automatically contacts blood donors for the hospital or place where blood is needed. Keywords: Artificial intelligence; Internet of Things; Blood banking system
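The donor-matching step described above can be sketched as a nearest-willing-donor query over the database. Everything below (field names, coordinates, and planar distance standing in for real GPS routing) is an illustrative assumption:

```python
import math

# Toy donor records; a real system would query the IoT-backed database.
donors = [
    {"name": "A", "group": "O-", "willing": True,  "lat": 13.01, "lon": 80.23},
    {"name": "B", "group": "O-", "willing": False, "lat": 13.00, "lon": 80.20},
    {"name": "C", "group": "O-", "willing": True,  "lat": 13.30, "lon": 80.40},
    {"name": "D", "group": "A+", "willing": True,  "lat": 13.02, "lon": 80.24},
]

def nearest_donor(group, lat, lon):
    """Pick the closest willing donor of the required blood group.

    Planar distance is a stand-in for real GPS routing distance.
    """
    candidates = [d for d in donors if d["group"] == group and d["willing"]]
    return min(candidates,
               key=lambda d: math.dist((lat, lon), (d["lat"], d["lon"])),
               default=None)

match = nearest_donor("O-", 13.02, 80.22)
print(match["name"])  # A: willing, right group, closest
```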

Journal ArticleDOI
TL;DR: In this paper , the authors proposed a methodology for disease detection on the rice and cotton plant leaf image dataset, where a subtractive pixel adjacency matrix (SPAM) method is used for feature extraction.
Abstract: Plant leaf disease detection and identification is a tedious and time-consuming task. Moreover, plant disease detection should be performed early to prevent diseases from spreading and limiting plant growth. Machine learning and artificial intelligence techniques applied to plant image data have been used to identify plant diseases and prevent them from spreading. Analysis of these plant image datasets enables farmers and companies to improve crop quality and productivity. This chapter proposes a methodology for disease detection on a rice and cotton plant leaf image dataset. In the proposed methodology, the subtractive pixel adjacency matrix (SPAM) method is used for feature extraction, while the exponential spider monkey optimization technique (ESMO) is used to select the optimum features from the extracted features. The proposed system effectively detects and classifies input plant leaf data as healthy or diseased using SVM and kNN classifiers, where SVM gives a better accuracy of 93.67%. The obtained results indicate that the proposed methodology outperforms the other algorithms in classification accuracy.
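SPAM-style features are built from statistics of differences between adjacent pixels. The sketch below is a heavily simplified stand-in (a normalized histogram of clipped horizontal differences; real SPAM features model higher-order Markov transitions of these differences):

```python
import numpy as np

def adjacency_diff_features(gray, clip=2):
    """Normalized histogram of clipped horizontal pixel differences.

    A simplified stand-in for subtractive pixel adjacency (SPAM) features:
    differences between neighboring pixels are clipped to [-clip, clip]
    and summarized as a small probability vector.
    """
    d = np.diff(gray.astype(np.int16), axis=1)   # horizontal differences
    d = np.clip(d, -clip, clip)
    hist, _ = np.histogram(d, bins=np.arange(-clip, clip + 2))
    return hist / hist.sum()

# Tiny grayscale patch standing in for a leaf image.
leaf = np.array([[10, 12, 11, 11],
                 [10, 13, 12, 12]], dtype=np.uint8)
feats = adjacency_diff_features(leaf)
print(feats)  # one probability per difference value in {-2..2}
```

A vector like this, computed per image, is what a downstream SVM or kNN classifier would consume.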

Book ChapterDOI
TL;DR: In this paper , the authors present a detailed analysis of the state-of-the-art in attack detection systems using recurrent neural networks, with a focus on results and discussion of key parameters such as false and true positives, alarm rates, accuracy, detection rate, and benchmarks for attack detection systems.
Abstract: Recurrent Neural Networks (RNNs) are deep learning (DL) networks that can be trained on large volumes of data and have performed well on natural language processing, speech recognition, and other classification problems. In this paper, we look at how recurrent neural networks and their variants can be used in network-based attack detection systems. The paper presents a detailed analysis of the state-of-the-art in attack detection systems using recurrent neural networks, with a focus on results and discussion of key parameters such as false and true positives, alarm rates, accuracy, detection rate, and benchmarks for attack detection systems.
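The evaluation parameters the survey focuses on all reduce to ratios over the confusion matrix. A minimal sketch with invented counts:

```python
def detection_metrics(tp, fp, tn, fn):
    """Standard attack-detection metrics from confusion-matrix counts."""
    detection_rate = tp / (tp + fn)      # a.k.a. recall / true positive rate
    false_alarm_rate = fp / (fp + tn)    # false positive rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return detection_rate, false_alarm_rate, accuracy

# Illustrative counts for a detector on a 200-flow test set.
dr, far, acc = detection_metrics(tp=95, fp=3, tn=97, fn=5)
print(dr, far, acc)  # 0.95 0.03 0.96
```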