
Showing papers in "Advances in intelligent systems and computing in 2020"


Book ChapterDOI
TL;DR: A conceptual overview of the new term Internet of Drone Things (IoDT) is presented, along with its related technologies, applications, security issues and real-time implementation, using case studies from Agriculture and Smart Cities.
Abstract: The Internet of Drone Things (IoDT) is envisioned as the future direction of drones, backed by the Internet of Things, smart computer vision, cloud computing, advanced wireless communication, big data, and high-end security techniques. The use of drones is increasing in diverse fields, from agriculture to industry, from government to private organizations, and from smart cities to rural-area monitoring. With IoDT-based implementations, existing sectors will become intelligent and capable of tasks such as monitoring, surveillance, and search and rescue. In this paper, we present a conceptual overview of the new term Internet of Drone Things (IoDT), along with its related technologies, applications, security issues, and real-time implementation, using case studies from agriculture and smart cities.

65 citations


Book ChapterDOI
TL;DR: This paper concentrates on the use of RNNs and CNNs for feature extraction from images, discusses the associated challenges, and presents a brief literature review of both network types.
Abstract: With the advent of new technologies, real-time data is essential for future development. Every day a huge amount of visual data is collected, but to use it efficiently we need to recognize, understand and organize it. Neural networks were introduced to find patterns in images, a form of visual data, by mimicking the functionality of neurons in the human brain. They are a biologically inspired programming approach that allows machines to learn from observational data. Neural networks have provided solutions to several image recognition problems and are actively used in the medical field due to their efficiency. This paper concentrates on the use of RNNs and CNNs for feature extraction from images and the associated challenges. The paper also presents a brief literature review of neural networks such as CNNs and RNNs.

34 citations


Book ChapterDOI
TL;DR: The proposed CNN-based model improves decision making for paddy crop diseases by predicting them in their initial stages, preventing large losses in the productivity of the yield.
Abstract: Agriculture is the most important industry for society, as it serves the most basic need of life. Plant diseases reduce agricultural productivity, so it is very important to prevent, detect, and eliminate them. Image processing and deep learning are buzzwords in the IT industry, and their application to agriculture can enhance decision making in many of its aspects. Paddy is one of the most in-demand crops, especially in South Asia. This paper proposes a CNN-based predictive model for the classification and prediction of disease in paddy crops. Paddy crop diseases can affect crops severely if they are not addressed in their initial stages. The proposed model improves decision making for various paddy crop diseases by predicting them early, preventing large losses in the productivity of the whole yield.

27 citations


Book ChapterDOI
TL;DR: A modified U-Net architecture is designed and proposed that is deep enough to extract contextual information from satellite imagery and performs well using intersection over union (IOU) and overall accuracy (OA).
Abstract: In recent years, the convolutional neural network (CNN) has emerged as a dominant paradigm in machine learning for image processing applications. In remote sensing, image segmentation is a very challenging task, and CNNs have shown their worth over traditional segmentation methods. The U-Net structure is one of the simpler architectures used for image segmentation. However, it cannot extract promising spatial information from satellite data due to an insufficient number of layers. In this study, a modified U-Net architecture has been designed and proposed. This new architecture is deep enough to extract contextual information from satellite imagery. The study formulates the downsampling part by introducing the DenseNet architecture, which encourages the reuse of feature maps and reinforces information propagation throughout the network. Long-range skip connections are implemented between the downsampling and upsampling paths. A quantitative comparison was performed using intersection over union (IOU) and overall accuracy (OA); the proposed architecture achieved 73.02% IOU and 96.02% OA.
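
The two reported evaluation metrics can be illustrated with a short, self-contained sketch (not the authors' code; the binary masks below are toy data, not satellite imagery):

```python
# Sketch: intersection over union (IOU) and overall accuracy (OA)
# for a binary segmentation mask, on illustrative toy labels.

def iou_and_oa(pred, truth):
    """pred, truth: flat lists of 0/1 labels of equal length."""
    inter = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    union = sum(1 for p, t in zip(pred, truth) if p == 1 or t == 1)
    correct = sum(1 for p, t in zip(pred, truth) if p == t)
    iou = inter / union if union else 1.0
    oa = correct / len(truth)
    return iou, oa

pred  = [1, 1, 0, 0, 1, 0, 1, 1]   # predicted mask (flattened)
truth = [1, 0, 0, 0, 1, 1, 1, 1]   # ground-truth mask (flattened)
iou, oa = iou_and_oa(pred, truth)  # iou = 4/6, oa = 6/8
```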

19 citations


Book ChapterDOI
TL;DR: Blockchain is one of the best emerging technologies for ensuring privacy and security through cryptographic algorithms and hashing; this paper discusses the basics of blockchain technology, consensus algorithms, a comparison of important consensus algorithms, and areas of application.
Abstract: In today’s era of big data and machine learning, IoT plays a crucial role in nearly all areas: social, economic, political, educational, and health care. The resulting drastic increase in data creates security, privacy, and trust issues on the Internet. It is IT’s responsibility to ensure privacy and security for the huge volumes of incoming information produced by the rapid evolution of the IoT in the coming years. Blockchain has emerged as one of the major technologies with the potential to transform how huge amounts of information are shared and to increase trust among participants. Building trust in a distributed and decentralized environment without calling on a trusted third party is a technological challenge for researchers. With the emergence of IoT, huge amounts of critical information are available over the Internet, and trust in that information has dropped drastically, increasing security and privacy concerns day by day. Blockchain is one of the best emerging technologies for ensuring privacy and security through cryptographic algorithms and hashing. We discuss the basics of blockchain technology, consensus algorithms, a comparison of important consensus algorithms, and areas of application.

18 citations


Book ChapterDOI
TL;DR: In this paper, the authors present the main algorithms for constructing an investment portfolio and present two portfolios, a maximum Sharpe ratio portfolio and a minimum risk portfolio, for each of them, the total expected return and the total risk of the portfolio were calculated.
Abstract: Choosing the right investment vehicle is one of the main tasks facing any investor. No investor knows exactly whether their expectations regarding the return on a particular equity will be met, but they need to build their strategy in such a way as to limit losses as much as possible. A universal investment vehicle could facilitate the activities of many investors, but none exists, and it seems impossible thus far to build a unified model covering the whole variety of factors. The purpose of this article is to analyze models and algorithms for constructing an effective investment portfolio. A large number of portfolios were generated using Python, and for each of them the total expected return and total risk were calculated. Two portfolios were then constructed: a maximum Sharpe ratio portfolio and a minimum risk portfolio. When forming a portfolio, an investor adheres to several fundamental principles: achieve an optimal ratio of return and risk of the assets in the portfolio, diversify the portfolio, and ensure its management. The goals of creating an investment portfolio can differ: generating revenue, preserving money, or maintaining liquidity. This article presents the main algorithms for constructing an investment portfolio. #COMESYSO1120.
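
The Monte Carlo procedure the abstract describes can be sketched as follows; the expected returns, covariance matrix, and risk-free rate below are made-up illustrative numbers, not values from the article:

```python
import random

# Sketch of the random-portfolio search: generate random weights,
# compute expected return and risk, keep the maximum-Sharpe portfolio.
mu = [0.10, 0.15, 0.07]              # assumed expected annual returns
cov = [[0.04, 0.01, 0.00],           # assumed covariance matrix
       [0.01, 0.09, 0.02],
       [0.00, 0.02, 0.02]]
rf = 0.02                            # assumed risk-free rate

def portfolio(weights):
    ret = sum(w * m for w, m in zip(weights, mu))
    var = sum(wi * wj * cov[i][j]
              for i, wi in enumerate(weights)
              for j, wj in enumerate(weights))
    risk = var ** 0.5
    return ret, risk, (ret - rf) / risk          # Sharpe ratio

random.seed(0)
best = None
for _ in range(5000):
    raw = [random.random() for _ in mu]
    w = [x / sum(raw) for x in raw]              # weights sum to 1
    ret, risk, sharpe = portfolio(w)
    if best is None or sharpe > best[0]:
        best = (sharpe, w, ret, risk)
```

The minimum-risk portfolio of the article would be found the same way by keeping the candidate with the smallest `risk` instead of the largest Sharpe ratio.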

14 citations


Book ChapterDOI
TL;DR: A sentiment analysis model is proposed that analyzes students' sentiments about the learning process during the COVID-19 pandemic using the Word2vec technique and machine learning, to understand Egyptian students' opinions on learning during the pandemic.
Abstract: The education field has been affected by the COVID-19 pandemic, which also affects how universities, schools, companies and communities function. One area that has been significantly affected is education at all levels, both undergraduate and graduate. The pandemic has strained the psychological state of students, since their learning environment changed. The e-learning process relies on electronic means of communication and online support communities; social networking sites, meanwhile, help students manage their emotional and social needs during the pandemic by allowing them to express their opinions freely. This paper proposes a sentiment analysis model that analyzes students' sentiments about the learning process during the pandemic using the Word2vec technique and machine learning. The model starts by preprocessing the students' sentiment texts, selects features through word embedding, and then applies three machine learning classifiers: Naive Bayes, SVM and Decision Tree. The precision, recall and accuracy of all these classifiers are reported. The paper helps in understanding Egyptian students' opinions on the learning process during the COVID-19 pandemic.

14 citations


Book ChapterDOI
TL;DR: Computational results reveal that the proposed optimization technique is efficient in solving multi-objective discrete combinatorial optimization problems such as the flow shop scheduling problem considered in the present study.
Abstract: The Jaya algorithm is a novel, simple, and efficient meta-heuristic optimization technique that has been applied successfully in various fields of engineering and science. In the present paper, we apply the Jaya algorithm to the permutation flow shop scheduling problem (PFSP) with the multi-objective of minimizing the maximum completion time (makespan) and tardiness cost under due date constraints. PFSP is a well-known NP-hard, discrete combinatorial optimization problem. Firstly, to retrieve a job sequence, a random preference is allocated to each job in a permutation schedule. Secondly, the job preference vector is transformed into a job permutation vector by means of the largest order value (LOV) rule. To deal with the multi-objective criteria, we apply a multi-attribute model (MAM) based on an a priori approach. The correctness of the Jaya algorithm is verified by comparing its results with the total enumeration method and the simulated annealing (SA) algorithm. Computational results reveal that the proposed optimization technique is efficient in solving multi-objective discrete combinatorial optimization problems such as the flow shop scheduling problem considered in the present study.
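
The LOV decoding step mentioned above admits a compact sketch; the preference values here are arbitrary examples, not data from the paper:

```python
# Sketch of the largest order value (LOV) rule: a continuous
# job-preference vector is converted into a discrete job permutation
# by scheduling jobs in descending order of preference.
def lov(preferences):
    # the job with the largest preference value is scheduled first
    return [j for j, _ in sorted(enumerate(preferences),
                                 key=lambda p: -p[1])]

prefs = [0.37, 0.91, 0.12, 0.55]   # random preferences for jobs 0..3
perm = lov(prefs)                  # → [1, 3, 0, 2]
```

This decoding is what lets a continuous optimizer like Jaya search over real-valued vectors while the objective is evaluated on discrete job permutations.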

13 citations


Book ChapterDOI
TL;DR: A Wavelet Neural Network—Evaluation based on Distance from Average Solution (WNN-EDAS), a novel MCDM approach for the identification of suitable and trustworthy cloud service providers and its accuracy, robustness, and feasibility are presented.
Abstract: The omnipresence of cloud-based applications and the exponential growth of cloud services along different dimensions make selecting a trustworthy cloud service provider that complies with user requirements a challenging task. Multi-Criteria Decision-Making (MCDM) approaches are significant for solving the cloud service selection problem since they evaluate the alternatives (cloud service providers) based on the intrinsic relationships among the criteria (QoS parameters). However, the assignment of appropriate weights to the criteria has a high impact on the accuracy of the service ranking and the performance of the MCDM methods. Hence, this paper presents Wavelet Neural Network—Evaluation based on Distance from Average Solution (WNN-EDAS), a novel MCDM approach for identifying suitable and trustworthy cloud service providers. WNN-EDAS employs a WNN to calculate appropriate weights for each criterion and EDAS to rank the cloud service providers. Experiments were carried out on Cloud Armor, a real-world trust feedback dataset, to demonstrate the accuracy, robustness, and feasibility of WNN-EDAS over state-of-the-art MCDM approaches in terms of sensitivity analysis and rank reversal.
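
The EDAS ranking step can be sketched as follows for benefit criteria only; the decision matrix and weights are illustrative assumptions, not Cloud Armor data or the paper's WNN-derived weights:

```python
# Minimal EDAS sketch (benefit criteria only): score each alternative
# by its weighted positive/negative distance from the average solution.
def edas(matrix, weights):
    n_crit = len(weights)
    avg = [sum(row[j] for row in matrix) / len(matrix) for j in range(n_crit)]
    dists = []
    for row in matrix:
        sp = sum(w * max(0, (x - a) / a) for x, a, w in zip(row, avg, weights))
        sn = sum(w * max(0, (a - x) / a) for x, a, w in zip(row, avg, weights))
        dists.append((sp, sn))
    max_sp = max(s for s, _ in dists) or 1          # avoid divide-by-zero
    max_sn = max(s for _, s in dists) or 1
    # appraisal score: average of normalized SP and (1 - normalized SN)
    return [((sp / max_sp) + (1 - sn / max_sn)) / 2 for sp, sn in dists]

matrix = [[0.8, 0.6, 0.9],    # alternative A1 (rows: providers)
          [0.5, 0.9, 0.7],    # alternative A2
          [0.6, 0.5, 0.6]]    # alternative A3 (columns: QoS criteria)
weights = [0.5, 0.3, 0.2]     # assumed criterion weights
scores = edas(matrix, weights)
best = scores.index(max(scores))
```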

11 citations


Book ChapterDOI
TL;DR: In this work, a link quality and energy-aware (LQEA) metric-based routing strategy for WSNs is proposed; the results confirm that the LQEA strategy improves energy efficiency and packet delivery ratio in WSNs.
Abstract: Wireless sensor networks (WSNs) have attracted considerable attention because of their wide range of potential applications. In a WSN, energy is a significant key factor for a sensor node, since it is essential to the lifetime of the network. Retransmissions caused by collisions and interference during transactions between sensor nodes increase the overall network delay. Because the network delay grows as the nodes' waiting time increases, network performance is reduced. To solve these problems, a link quality and energy-aware (LQEA) metric-based routing strategy for WSNs is proposed. In this strategy, sensor energy and link quality metrics decide the relay selection. The strategy evaluates node link quality by weighted throughput, using the node's updated transmission count, available bandwidth, and channel idle ratio to calculate the weighted throughput. The work is simulated in the NS-2 simulator, and the results confirm that the LQEA strategy improves energy efficiency and packet delivery ratio in WSNs.

11 citations


Book ChapterDOI
TL;DR: An integrated system that ingests big data from different sources using Micro-Electro-Mechanical System IR sensors and displays results on an interactive map, or dashboard, of Egypt; AI-based analysis software computes statistics and forecasts how, and to what extent, the virus will spread.
Abstract: Coronavirus disease 2019 (COVID-19) is one of the most dangerous respiratory illnesses of the last one hundred years. Its danger stems from its ability to spread quickly between people. This paper proposes a practical smart solution to help the Egyptian government track and control the spread of COVID-19. We suggest an integrated system that can ingest big data from different sources using Micro-Electro-Mechanical System (MEMS) IR sensors and display results on an interactive map, or dashboard, of Egypt. The proposed system consists of three subsystems: Embedded Microcontroller (EM), Internet of Things (IoT) and Artificial Intelligence (AI). The EM subsystem includes an accurate temperature measuring device using IR sensors and other detection components. It can be placed at the entrances of places like universities, schools, and subways to screen and check people's temperature from a distance within seconds and gather data about suspected cases. The IoT subsystem then transmits the collected data about individuals, such as temperature, ID, age, gender, location, and phone number, to the relevant places and organizations. Finally, AI-based analysis software is applied to compute statistics and forecast how, and to what extent, the virus will spread. Given the important role of Geographic Information Systems (GIS) and interactive maps, or dashboards, in tracking COVID-19, this paper introduces an advanced dashboard of Egypt that locates and tallies confirmed infections, fatalities, and recoveries, and presents the statistical results of the AI model.

Book ChapterDOI
TL;DR: Facial expression analysis could contribute to detecting discomfort in automated driving; within the EU project MEDIATOR, which constantly evaluates the performance of driver and automation, a driving simulator study investigated this relationship.
Abstract: Driving comfort is considered a key factor for broad public acceptance of automated driving. Based on continuous driver/passenger monitoring, potential discomfort could be avoided by adapting automation features such as driving style. The EU project MEDIATOR (mediatorproject.eu) aims at developing a mediating system in automated vehicles by constantly evaluating the performance of driver and automation. As facial expressions could be an indicator of discomfort, a driving simulator study was carried out to investigate this relationship. A total of 41 participants experienced three potentially uncomfortable automated approach situations toward a truck driving ahead. Face video from four cameras was analyzed with the Visage facial feature detection and face analysis software, extracting 23 Action Units (AUs). Situation-specific effects showed that the eyes were kept open and eye blinks were reduced (AU43). Inner brows (AU1) as well as upper lids (AU5) were raised, indicating surprise. Lips were pressed (AU24) and stretched (AU20) as a sign of tension. Overall, facial expression analysis could contribute to detecting discomfort in automated driving.

Book ChapterDOI
TL;DR: The cognitive mimetic approach can be used to describe human interactions with technologies, and analyses human information processes such as perceiving and thinking to mimic how people process information in order to design intelligent technologies.
Abstract: Digital twins – digital models of technical systems and processes – have recently been introduced to work with complex industrial processes. Yet should such models concern only physical objects (as definitions of them often imply), or should users and other human beings also be included? Models that include people have been called human digital twins (HDTs); they facilitate more accurate analyses of technologies in practical use. The cognitive mimetic approach can be used to describe human interactions with technologies. This approach analyses human information processes such as perceiving and thinking to mimic how people process information in order to design intelligent technologies. The results of such analyses can be presented as an ontology of human action, and in this way included in HDT models.

Book ChapterDOI
TL;DR: The link between GDPR provisions and the use of blockchain technology for solving the consent management problem in online social networks is investigated and possible ways to reconcile blockchain technology with the GDPR requirements are demonstrated.
Abstract: Online Social Networks (OSNs) are very popular and widely adopted by the vast majority of Internet users across the globe. Recent scandals on the abuse of users’ personal information via these platforms have raised serious concerns about the trustworthiness of OSN service providers. The unprecedented collection of personal data by OSN service providers poses one of the greatest threats to users’ privacy and their right to be left alone. The recent approval of the GDPR (General Data Protection Regulation) presents OSN service providers with great compliance challenges. A set of new data protection requirements are imposed on data controllers (OSN service providers) by GDPR that offer greater control to data subjects (OSN users) over their personal data. This position paper investigates the link between GDPR provisions and the use of blockchain technology for solving the consent management problem in online social networks. We also describe challenges and opportunities in designing a GDPR-compliant consent management mechanism for online social networks. Key characteristics of blockchain technology that facilitate regulatory compliance were identified. The legal and technological state of play of the blockchain-GDPR relationship is reviewed and possible ways to reconcile blockchain technology with the GDPR requirements are demonstrated. This paper opens up new research directions on the use of the disruptive innovation of blockchain to achieve regulatory compliance in the application domain of online social networks.

Book ChapterDOI
TL;DR: This work presents two approaches, based entirely on domain knowledge, for the automatic generation of training data that can further be used for segmentation of court judgments.
Abstract: In this era of information overload, text segmentation can be used effectively to locate and extract information specific to users' needs within huge collections of documents. Text segmentation refers to the task of dividing a document into smaller labeled text fragments according to the semantic commonality of their contents. Due to the rich semantic information in legal text, text segmentation is crucial for information retrieval in the legal domain. But such supervised classification requires huge amounts of training data to build an efficient classifier, and collecting and manually annotating gold standards in NLP is very expensive. In the recent past, the question of whether such data can satisfactorily be replaced with automatically annotated data has attracted growing interest. This work presents two approaches, based entirely on domain knowledge, for the automatic generation of training data that can further be used for the segmentation of court judgments.

Book ChapterDOI
TL;DR: This survey outlines the network layer attacks on MANETs identified by researchers, and discusses the methodologies and techniques proposed for detecting and predicting these attacks from various kinds of intrusions within MANETs.
Abstract: Mobile ad hoc networks (MANETs) have moved toward wireless networking technology. Due to their dynamic nature, MANETs face critical attacks across the OSI layers, but research shows that intruders attack the network layer most effectively. This survey outlines the network layer attacks on MANETs identified by researchers. Mostly, the AODV routing protocol and other protocols are used for transferring packets to the destination. Information about the communicated packets is accumulated in log files; to monitor the routing of packets from these log files, the techniques used in MANETs include data mining, support vector machines (SVM), genetic algorithms (GA) and other machine-learning approaches. Further, the methodologies and techniques proposed for detecting and predicting these attacks from various kinds of intrusions within MANETs are discussed.

Book ChapterDOI
TL;DR: The developed model has a unique combination of seven energy alternatives, six criteria, and their related twenty-six sub-criteria, and shows that economic criterion has the highest weight, followed by the environmental and technical criterion.
Abstract: The development of India is continuously hampered by the severe issues of energy crisis and greenhouse gas emissions. To overcome or minimize these problems, India should increase the share of sustainable energy sources in overall generation. Therefore, this work aims to develop a model for selecting the most sustainable energy sources in India. The model is developed using an integrated multi-criteria decision-making (MCDM) approach to deal with a number of conflicting and uncertain criteria. The developed model has a unique combination of seven energy alternatives, six criteria, and twenty-six related sub-criteria. The weights were collected using linguistic terminology to avoid incomplete or vague information. The Fuzzy Analytic Hierarchy Process (F-AHP) was employed to make pairwise comparisons and obtain the weights of the considered criteria and sub-criteria, and Fuzzy Weighted Aggregated Sum Product Assessment (F-WASPAS) was used to rank the energy alternatives. Results show that the economic criterion has the highest weight, followed by the environmental and technical criteria. Solar energy emerged as the most sustainable alternative energy source in India, with wind energy second, followed by hydro and biomass energy. Sensitivity analysis was performed by changing the values of the λ coefficient. Results were compared and validated with three other well-known MCDM approaches: VIKOR, TOPSIS, and PROMETHEE-II.
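
The crisp core of the WASPAS aggregation behind F-WASPAS (fuzzy arithmetic omitted) can be sketched as follows; the decision matrix, weights, and λ value are illustrative assumptions, not the paper's data:

```python
# Crisp WASPAS sketch: the joint score combines the weighted sum model
# (WSM) and the weighted product model (WPM) via the coefficient lambda.
def waspas(matrix, weights, lam=0.5):
    # normalize benefit criteria by the column maximum
    cols = list(zip(*matrix))
    norm = [[x / max(col) for x, col in zip(row, cols)] for row in matrix]
    scores = []
    for row in norm:
        wsm = sum(w * x for w, x in zip(weights, row))
        wpm = 1.0
        for w, x in zip(weights, row):
            wpm *= x ** w
        scores.append(lam * wsm + (1 - lam) * wpm)
    return scores

matrix = [[120, 0.8, 7],    # alternative A1 (rows: energy alternatives)
          [100, 0.9, 6],    # alternative A2
          [ 90, 0.7, 9]]    # alternative A3 (columns: benefit criteria)
weights = [0.5, 0.3, 0.2]   # assumed criterion weights (sum to 1)
scores = waspas(matrix, weights)
```

The sensitivity analysis the abstract mentions corresponds to re-running the ranking while sweeping `lam` between 0 (pure WPM) and 1 (pure WSM).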

Book ChapterDOI
TL;DR: A computer-assisted procedure to examine the Breast Ultrasound Image (BUI) using an integration of Shannon's Entropy Thresholding (SET) to improve the visibility of the image and Level-Set Segmentation (LSS) to extract the abnormal region.
Abstract: According to the World Health Organization (WHO), breast malignancy is one of the cancers with the greatest impact on women. The availability of modern disease diagnostic systems and treatment procedures helps improve the survival rate. Even though several modalities are available to record breast abnormalities, the ultrasound imaging technique is the one frequently used in clinics. This study proposes a computer-assisted procedure to examine the Breast Ultrasound Image (BUI). The proposed work integrates Shannon's Entropy Thresholding (SET), to improve the visibility of the image, with Level-Set Segmentation (LSS), to extract the abnormal region. The proposed scheme is a semiautomated approach that aims to mine the suspicious section from the BUI. The extracted suspicious segment is then compared against a ground truth, and the essential performance measures are computed to assess the performance of LSS. The overall performance of LSS is then compared and validated against other methods, such as active contour (AC) and Chan–Vese (CV); the results of this study confirm that the performance measures attained with LSS, AC, and CV are roughly similar.
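
A rough single-threshold sketch of the Kapur-style entropy criterion behind SET (toy gray levels, not ultrasound data, and not the authors' implementation):

```python
import math

# Sketch of Shannon's entropy thresholding: choose the gray level that
# maximizes the combined entropy of the background and foreground
# intensity distributions.
def entropy_threshold(pixels, levels=8):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    prob = [h / len(pixels) for h in hist]
    best_t, best_h = 1, -1.0
    for t in range(1, levels):
        pb = sum(prob[:t])               # background mass (levels < t)
        pf = 1 - pb                      # foreground mass (levels >= t)
        if pb == 0 or pf == 0:
            continue
        hb = -sum(q / pb * math.log(q / pb) for q in prob[:t] if q > 0)
        hf = -sum(q / pf * math.log(q / pf) for q in prob[t:] if q > 0)
        if hb + hf > best_h:
            best_h, best_t = hb + hf, t
    return best_t

# toy "image": dark background (levels 0-2) with a bright region (6-7)
pixels = [0, 1, 1, 2, 0, 1, 6, 7, 7, 6, 1, 0, 2, 7, 6]
t = entropy_threshold(pixels)
mask = [1 if p >= t else 0 for p in pixels]   # enhanced/binarized view
```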

Book ChapterDOI
TL;DR: In this paper, the authors propose an extension of the most widely known federated algorithm, FedAvg, adapting it for continual learning under concept drift, and empirically demonstrate the weaknesses of regular FedAvg and prove that their extended method outperforms the original one in this type of scenario.
Abstract: Service robots and other smart devices, such as smartphones, have access to large amounts of data suitable for learning models, which can greatly improve the customer experience. Federated learning is a popular framework that allows multiple distributed devices to train deep learning models remotely, collaboratively, and preserving data privacy. However, little research has been done regarding the scenario where data distribution is non-identical among the participants and it also changes over time in unforeseen ways, causing what is known as concept drift. This situation is, however, very common in real life, and poses new challenges to both federated and continual learning. In this work, we propose an extension of the most widely known federated algorithm, FedAvg, adapting it for continual learning under concept drift. We empirically demonstrate the weaknesses of regular FedAvg and prove that our extended method outperforms the original one in this type of scenario.
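
The FedAvg aggregation step that the paper extends can be sketched as follows; models are simplified to flat parameter lists and the client sample counts are made up:

```python
# Sketch of the FedAvg server step: average client model parameters,
# weighted by each client's number of training samples.
def fed_avg(client_weights, client_sizes):
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
            for i in range(n_params)]

clients = [[0.0, 2.0], [1.0, 4.0], [2.0, 6.0]]   # 3 clients, 2 params each
sizes = [10, 30, 60]                              # samples per client
global_model = fed_avg(clients, sizes)            # → [1.5, 5.0]
```

Under concept drift, this uniform averaging is exactly what becomes problematic: clients whose data distribution has shifted pull the global model in conflicting directions, which is the weakness the paper's extension targets.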

Book ChapterDOI
TL;DR: In this article, the authors apply machine learning algorithms to a university's student dataset to predict student performance; analysis of the results concludes that the accuracy of the random forest (RF) classifier is higher than that of other classification methods such as the support vector machine (SVM).
Abstract: Assessment in outcome-based learning is vital, and a significant approach toward measuring student performance is required. Nowadays, the amount of data stored in educational databases is increasing rapidly. Institutions face various problems: one is the poor performance of students; another is student dropout due to the complexity of the curriculum, financial problems, psychological problems, lack of support, etc. Machine learning can be utilized to handle these issues, and considerable work has been done to measure student performance using different methodologies and modern technologies. In this work, machine learning (ML) algorithms have been applied to a university's student dataset to predict student performance. Based on the analysis of the results, it is concluded that the accuracy of the random forest (RF) classifier is higher than that of other classification methods such as the support vector machine (SVM).

Book ChapterDOI
TL;DR: This paper shows the importance of enhancement techniques, which are used in various fields, for improving image quality across medical imaging modalities and processing domains.
Abstract: Image enhancement is a core function in image processing, and different enhancement techniques exist in the literature. The goal of an image enhancement technique is to improve the quality and characteristics of an image so that its important information can be easily extracted. Contrast enhancement techniques are useful for various medical imaging modalities, such as X-ray, MRI, ultrasound, PET, SPECT, etc. Enhancement is performed on the original image to improve its visibility and is applied in several domains, such as the spatial, frequency and fuzzy domains; through enhancement, the image becomes more usable than the original. The main objective of this process is to improve image quality across medical imaging modalities and domains. Here, we show the importance of enhancement techniques, which are used in various fields.

Book ChapterDOI
TL;DR: A new objective function (OF) is proposed that aims to improve the quality of selected routes by using combined metrics and fuzzy logic; it performs better than standard RPL in terms of stability, energy consumption, and packet delivery.
Abstract: Nowadays, connected objects constitute the most widely used networks for short-range services. As standard IPv6 routing is not appropriate to support this kind of communication, the IPv6 routing protocol for LLNs (RPL) was proposed by the IETF as an alternative to overcome this drawback. Accordingly, researchers have focused on the RPL protocol to adapt LLN requirements to the IoT context. The selection of optimal routes is based on predefined objective functions. Nevertheless, these objective functions fail to use the right combination of parameters, which drastically limits energy efficiency and the quality of data transmission. In this paper, we propose a new objective function (OF) that aims to improve the quality of selected routes by using combined metrics and fuzzy logic. The results obtained show that our solution performs better than standard RPL in terms of stability, energy consumption, and packet delivery.

Book ChapterDOI
TL;DR: This paper investigates, in the context of IDS, the suitability of encoding categorical features based on the posterior probability of an attack conditioned on the feature value; a KNN classifier is then built on top of latent features in numeric form.
Abstract: Intrusion detection is an evolving area of research in the field of cyber-security. Machine learning offers many best methodologies to help intrusion detection systems (IDSs) for accurately identifying intrusions. Such IDSs analyze the features of traffic packets to identify different types of attacks. While most of the features used in IDS are numeric, some of the features like Protocol-type, Flag and Service are categorical and hence calls for an effective encoding scheme for transforming the categorical features into numeric form before applying PCA like techniques for extracting latent features from numeric data. In this paper, the authors investigate the suitability of encoding categorical features based on the posterior probability of an attack conditioned on the feature in the context of IDS. KNN classifier is used for construction of IDS on top of latent features in numeric form. The proposed method is trained and tested on NSL-KDD data set to predict one among the possible 40 distinct class labels for a test instance. Classification accuracy and false positive rate (FPR) are considered as performance metrics. The results have shown that the proposed approach is good at detecting intrusions with an accuracy of 98.05% and a false alarm rate of 0.35%.
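
The proposed encoding can be sketched as a posterior-probability lookup table; the toy feature values and labels below are illustrative, not NSL-KDD records:

```python
from collections import defaultdict

# Sketch: replace each category value v of a feature by the posterior
# probability of the "attack" class given that value, P(attack | v),
# estimated from the training data.
def posterior_encode(values, labels, attack_label=1):
    counts = defaultdict(lambda: [0, 0])          # value -> [attacks, total]
    for v, y in zip(values, labels):
        counts[v][0] += (y == attack_label)
        counts[v][1] += 1
    table = {v: a / n for v, (a, n) in counts.items()}
    return [table[v] for v in values], table

protocol = ["tcp", "udp", "tcp", "icmp", "tcp", "udp"]
labels   = [1,     0,     1,     1,      0,     0]
encoded, table = posterior_encode(protocol, labels)
# table: {"tcp": 2/3, "udp": 0.0, "icmp": 1.0}
```

The resulting numeric column can then be concatenated with the other numeric features before PCA-style latent feature extraction and KNN classification, as described above.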

Book ChapterDOI
TL;DR: A comparative analysis of six different clustering algorithms for detecting communities in social networks, taking into account parameters like run-time, cluster size, normalized mutual information, adjusted Rand score and average score.
Abstract: Community detection in social networks is often thought of as a challenging domain that has not been explored completely. In today’s digital world, it is ever harder to establish relationships between people or objects. Community detection helps us find or build such relationships. It can also help organizations gauge the opinion of particular groups of people about their products. Many algorithms have emerged over the years that detect communities in social networks. We performed a comparative analysis of six different clustering algorithms for detecting communities in social networks, taking into account parameters like run-time, cluster size, normalized mutual information, adjusted Rand score and average score.
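One of the comparison metrics named above, normalized mutual information, can be computed with the standard library alone. This generic sketch (arithmetic-mean normalization) is not the authors' evaluation code:

```python
# Stdlib-only NMI between two community assignments: 1.0 for identical
# partitions (up to relabeling), near 0 for unrelated ones.
from collections import Counter
from math import log

def nmi(labels_a, labels_b):
    n = len(labels_a)
    ca, cb = Counter(labels_a), Counter(labels_b)
    joint = Counter(zip(labels_a, labels_b))
    # Mutual information between the two partitions.
    mi = sum((nij / n) * log(n * nij / (ca[i] * cb[j]))
             for (i, j), nij in joint.items())
    # Entropies of each partition, used for normalization.
    ha = -sum((c / n) * log(c / n) for c in ca.values())
    hb = -sum((c / n) * log(c / n) for c in cb.values())
    if ha == 0 and hb == 0:
        return 1.0  # both partitions are trivial single clusters
    return mi / ((ha + hb) / 2)

# The same split with swapped labels is still a perfect match.
assert abs(nmi([0, 0, 1, 1], [1, 1, 0, 0]) - 1.0) < 1e-9
```

Adjusted Rand score follows a similar contingency-table computation, with an extra chance-correction term.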

Book ChapterDOI
TL;DR: In this article, the authors identify the encryption algorithm used by analyzing various encrypted text files by applying modern deep learning classification methods and compared them with existing machine learning methods, including deep belief networks, recurrent neural networks and convolutional neural networks.
Abstract: Deep learning techniques have recently gained momentum in cryptography and cryptanalysis. Deep learning architectures such as deep neural networks, deep belief networks, recurrent neural networks and convolutional neural networks have applications in diverse fields. Identifying the encryption algorithm by analyzing cipher text enables focused cryptanalysis methods to be applied, thereby increasing the chances of a successful full/partial plain-text recovery. In this work, we identified the encryption algorithm used by analyzing various encrypted text files. We applied modern deep learning classification methods and compared them with existing machine learning methods. We generated a ciphertext corpus from a multilingual dataset containing 700 files with an average of 4000 characters per file and applied our techniques to this generated ciphertext. We used the AES and Blowfish ciphers.
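The abstract does not specify how ciphertext files are turned into classifier inputs; one plausible (and commonly used) representation is a normalized byte-frequency histogram, sketched here as an assumption:

```python
# Hedged sketch of a feature representation for ciphertext classification:
# a 256-bin relative byte-frequency histogram. This is an assumed choice,
# not the features described in the paper.

def byte_histogram(data: bytes):
    """Return 256 relative byte frequencies for a ciphertext blob."""
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    n = max(len(data), 1)  # avoid division by zero on empty input
    return [c / n for c in counts]

features = byte_histogram(b"\x00\x00\xff\x10")
# The histogram is a probability vector: its entries sum to 1.
```

Vectors like this (one per file) could then be fed to either classical classifiers or the deep architectures named above.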

Book ChapterDOI
TL;DR: A non-contact thermometer is designed that calculates temperature from the infrared radiation emitted by the subject being measured, for industrial as well as medical purposes.
Abstract: In this paper, a non-contact thermometer is designed. It calculates temperature from the infrared radiation emitted by the subject being measured, and can be used for industrial as well as medical purposes. In medical use, it estimates core body temperature from forehead temperature; the estimate depends on the ambient air temperature, which affects the heat transfer coefficient. Heat transfers by conduction from the body core to the forehead and by convection from the forehead to the ambient air. The overall heat transfer coefficient is determined empirically, based on studies that measured forehead temperature and core body temperature at various ambient air temperatures for hundreds of persons. The accuracy of the proposed thermometer is ±0.3 °C compared with the Rossmax HA-500 device, and depends on the accuracy of those studies and their results under various conditions.
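The steady-state model behind such a correction can be sketched as follows. The coefficient ratio below is hypothetical, not the empirically fitted value from the paper: balancing conduction from core to forehead against convection from forehead to air, k(T_core − T_forehead) = h(T_forehead − T_ambient), gives T_core = T_forehead + (h/k)(T_forehead − T_ambient).

```python
# Illustrative core-temperature estimate from forehead and ambient
# readings. The h/k ratio is a placeholder; in the paper it is fitted
# empirically from measurements on hundreds of persons.

def estimate_core_temp(t_forehead, t_ambient, h_over_k=0.15):
    """Estimate core body temperature (deg C) at steady state."""
    return t_forehead + h_over_k * (t_forehead - t_ambient)

# In a cool room the forehead reads below the core, so the estimate
# corrects upward.
t = estimate_core_temp(t_forehead=35.0, t_ambient=25.0)  # 36.5 with this ratio
```

Note the correction vanishes when forehead and ambient temperatures are equal, consistent with no net convective loss.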

Book ChapterDOI
TL;DR: Recommender systems solve the information overload problem by filtering data on the basis of the user’s preferences, interests, or previous behavior regarding an item, but existing privacy-preserving approaches often sacrifice privacy for accuracy and accuracy for privacy.
Abstract: Recommender systems solve the information overload problem by filtering data on the basis of the user’s preferences, interests, or previous behavior regarding an item. The data filtering techniques employed are content-based (based on the user’s past behavior), collaborative (based on the behavior of users similar to the active one), or hybrid (a combination of filtering techniques). Due to its versatility, the most popular technique used in recommender systems is collaborative filtering. However, the privacy of the user is at risk because malicious users can attack the targeted user, or the recommendation server may reveal users’ personal data to other parties or misuse the data for targeted advertising. The existing works mostly employ encryption- or randomization-based methodologies, but often sacrifice privacy for accuracy and accuracy for privacy.
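A minimal sketch of the randomization approach mentioned above: each user perturbs ratings with zero-mean noise before sharing, so individual values are masked while aggregates over many users stay usable. The noise scale and data here are illustrative:

```python
# Randomized perturbation for privacy-preserving collaborative filtering.
# The server only ever sees the masked ratings; with many users the
# zero-mean noise averages out, so item means remain close to the truth.
import random

def perturb_ratings(ratings, scale=0.5, seed=42):
    """Add Gaussian noise to each rating before it leaves the client."""
    rng = random.Random(seed)
    return [r + rng.gauss(0.0, scale) for r in ratings]

true_ratings = [4.0, 3.0, 5.0, 2.0, 4.0]
masked = perturb_ratings(true_ratings)
```

The privacy/accuracy trade-off the abstract mentions appears directly here: a larger `scale` hides individual ratings better but degrades the recommendations computed from them.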

Book ChapterDOI
TL;DR: The challenges, knowledge and technology gaps regarding AI in the shipping sector are explored and Tactical decisions to understand traffic patterns and future vessel encounters can be compared to a game of chess.
Abstract: Artificial intelligence (AI) may be the panacea for improving safety and efficiency in shipping. Solutions to navigation problems are often challenged by information uncertainty, complexity and time demands. Tactical decisions to understand traffic patterns and future vessel encounters can be compared to a game of chess, where an agent has goals and considers the next several moves in advance. Machine learning approaches to AI are reactive tactics that remain relatively “weak” and rely on computational power and smart algorithms to recreate each decision every time. Ships are required to follow the International Regulations for Preventing Collisions at Sea. While assumed to be the defining rules of the road, these may be “violated” in practice to solve traffic situations without increasing the risk of the situation. In order to create safe and reliable technologies to support autonomous shipping, the system cannot just rely on where it has to go but must anticipate the goals of the surrounding vessels. This paper explores the challenges, knowledge and technology gaps regarding AI in the shipping sector.

Book ChapterDOI
TL;DR: Host-based intrusion detection is achieved using the OSSEC tool, and the system is capable of detecting malicious log entries produced by processes running in the background.
Abstract: Today, the world is increasingly connected to the Internet, giving attackers and hackers ample opportunity to penetrate PCs and networks. Hackers use different types of attacks to obtain valuable information, so it is important to recognize these attacks ahead of time to protect end users and systems. Intrusion detection systems (IDSs) have been widely deployed in PCs and networks to recognize a variety of attacks. In this paper, the focus is on log monitoring in host-based intrusion detection systems. Host-based intrusion detection is achieved using the OSSEC tool; using OSSEC, the system can detect malicious log entries produced by processes running in the background.
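As a hedged illustration of how OSSEC flags such log entries, a custom rule added to `local_rules.xml` might look like the following. The rule ID, parent SID, match string and alert level are illustrative placeholders, not configuration from the paper:

```xml
<!-- Custom rules use IDs in the 100000+ range reserved for local rules. -->
<group name="local,syslog,">
  <rule id="100100" level="10">
    <!-- Fires only on events already classified under the assumed parent SID. -->
    <if_sid>5700</if_sid>
    <match>Failed password</match>
    <description>Repeated authentication failure seen in monitored logs.</description>
  </rule>
</group>
```

When a monitored log line decodes to the parent rule and contains the match string, OSSEC raises an alert at the given level, which can in turn trigger notifications or active responses.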

Book ChapterDOI
TL;DR: This work analyzes the performance of different machine learning techniques on public healthcare datasets to select the most suitable one for the proposed work, and observes that the random forest model performs the best.
Abstract: With the expeditious development of big data and the internet of things (IoT), technology has become integrated with our everyday activities, smart healthcare being one such area. The global acceptance of smart watches, wearable devices, and wearable biosensors has paved the way for the evolution of novel applications for personalized e-Health and m-Health technologies. The data gathered by wearables can further be analyzed using machine learning algorithms and shared with medical professionals to provide suitable recommendations. In this work, we have analyzed the performance of different machine learning techniques on public healthcare datasets to select the most suitable one for the proposed work. Based on the results, it is observed that the random forest model performs the best. Further, we propose a quantified-self-based hybrid model for the smart-healthcare environment that considers user health from multiple perspectives and recommends suitable actions.
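The model-selection step described above can be sketched with scikit-learn. The synthetic "wearable" features and the candidate model list are stand-ins, not the paper's datasets or full set of techniques:

```python
# Hedged sketch: score several classifiers on the same features via
# cross-validation and keep the best, mirroring the comparison that led
# the authors to choose random forest. Data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Two assumed features per subject: resting heart rate and daily step count;
# label 1 marks a (synthetic) at-risk subject.
X = np.vstack([rng.normal([70, 8000], [5, 1000], (50, 2)),
               rng.normal([95, 2000], [5, 1000], (50, 2))])
y = np.array([0] * 50 + [1] * 50)

models = {
    "random_forest": RandomForestClassifier(n_estimators=50, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
}
scores = {name: cross_val_score(m, X, y, cv=5).mean()
          for name, m in models.items()}
best = max(scores, key=scores.get)
```

On real healthcare datasets the gaps between models are narrower, which is why cross-validated comparison rather than a single train/test split is worth the extra cost.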