
Showing papers in "International Journal of Computer Applications in 2022"



Journal ArticleDOI
TL;DR: This research predicts YouTube Ad view sentiments using Deep Learning and Machine Learning algorithms like Linear Regression (LR), Support Vector Machine (SVM), Decision Tree (DT), Random Forest (RF), and Artificial Neural Network (ANN).
Abstract: Sentiment Analysis is currently a vital area of research. With the growing use of the internet, the volume of social media posts, websites, blogs, opinions, ratings, and similar content has increased rapidly. People express their feedback and emotions on social media posts in the form of likes, dislikes, comments, etc. The rapid growth in viewer-generated content on YouTube has led to an increase in YouTube sentiment analysis, making the analysis of public reactions an essential need for information extraction and data visualization in the technical domain. This research predicts YouTube Ad view sentiments using Deep Learning and Machine Learning algorithms: Linear Regression (LR), Support Vector Machine (SVM), Decision Tree (DT), Random Forest (RF), and Artificial Neural Network (ANN). Finally, a comparative analysis is done based on the experimental results acquired from the different models.

6 citations
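A minimal sketch of the model comparison the abstract describes, using scikit-learn. The dataset file `youtube_comments.csv` and its columns are hypothetical, and logistic regression stands in for the paper's "Linear Regression (LR)" since the target here is categorical:

```python
# Compare several classifiers on labeled comment sentiment (hypothetical data).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("youtube_comments.csv")        # hypothetical: columns "comment", "sentiment"
X = TfidfVectorizer(max_features=5000).fit_transform(df["comment"])
y = df["sentiment"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "LR":  LogisticRegression(max_iter=1000),   # stand-in for the paper's LR
    "SVM": LinearSVC(),
    "DT":  DecisionTreeClassifier(),
    "RF":  RandomForestClassifier(n_estimators=100),
    "ANN": MLPClassifier(hidden_layer_sizes=(64,), max_iter=300),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```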





Journal ArticleDOI
TL;DR: In this paper, the authors proposed a system that takes two types of handwriting analysis, spiral and wave drawings of healthy as well as Parkinson's patients, as input to the system.
Abstract: Parkinson's Disease is a progressive neurodegenerative disorder that impairs the ability to control movement. The disease can prove fatal if not detected at an early stage. Motor and non-motor symptoms arise from the loss of dopamine-producing neurons. Currently, no test is available to detect the disease at early stages, where symptoms may be poorly characterised. Handwriting analysis is a traditional means of studying human personality and can also be used to identify symptoms of this disease; identifying such accurate biomarkers lays the groundwork for better clinical diagnosis. In this paper, we propose a system that takes two types of handwriting samples, spiral and wave drawings from both healthy subjects and Parkinson's patients, as input. For feature extraction, we use the Histogram of Oriented Gradients (HOG). The system then applies a Random Forest classifier to detect Parkinson's disease. Our model achieved an accuracy of 86.67% on spiral drawings and 83.30% on wave drawings.

4 citations
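A minimal sketch of the HOG-plus-Random-Forest pipeline, assuming hypothetical image folders `spiral/healthy` and `spiral/parkinson`; the HOG parameters are illustrative, not necessarily the paper's:

```python
# HOG features from drawing images, classified with a Random Forest.
import glob
import numpy as np
from skimage.io import imread
from skimage.transform import resize
from skimage.feature import hog
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def hog_features(path):
    img = resize(imread(path, as_gray=True), (128, 128))
    return hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

X, y = [], []
for label, folder in enumerate(["spiral/healthy", "spiral/parkinson"]):  # hypothetical paths
    for f in glob.glob(folder + "/*.png"):
        X.append(hog_features(f))
        y.append(label)

X_tr, X_te, y_tr, y_te = train_test_split(np.array(X), np.array(y),
                                          test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```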




DOI
TL;DR: A private model release based on six machine learning classifiers, namely Support Vector Machine, Random Forest, Logistic Regression, K-Nearest Neighbor, Decision Tree, and Naive Bayes, is proposed and can be used for the prediction of possible heart disease in patients.
Abstract: The evolution of technology has enabled the collection of very large amounts of user data, known as big data. Among such datasets, healthcare data is especially sensitive, and it is extremely important to protect the individuals represented in it. Even anonymized data releases are vulnerable. Hence, in this paper, we suggest a differential privacy-based model release instead of a data release. A private model release based on six machine learning classifiers, namely Support Vector Machine (SVM), Random Forest, Logistic Regression, K-Nearest Neighbor, Decision Tree, and Naive Bayes, is proposed. Experimental evaluation is performed using the benchmark heart disease dataset, and the accuracy of the model is analyzed. The published private model can be used to predict possible heart disease in patients.

2 citations
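One way such a private model release could look in practice, sketched with IBM's diffprivlib; the privacy budget `epsilon`, the `data_norm` bound, and the file `heart.csv` are illustrative assumptions, and the paper's own mechanism may differ:

```python
# Train a differentially private logistic regression and release the model,
# not the data (sketch; hyperparameters are assumptions).
import pandas as pd
from sklearn.model_selection import train_test_split
from diffprivlib.models import LogisticRegression

df = pd.read_csv("heart.csv")                 # hypothetical copy of the benchmark dataset
X, y = df.drop(columns=["target"]), df["target"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# epsilon controls the privacy/accuracy trade-off; data_norm bounds feature norms
clf = LogisticRegression(epsilon=1.0, data_norm=10.0).fit(X_tr, y_tr)
print("private model accuracy:", clf.score(X_te, y_te))
```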


DOI
TL;DR: In this article, a rain removal algorithm based on a non-local network structure and a spatial attention mechanism is proposed to remove multi-density rain streaks while preserving the detailed structure information of the image background.
Abstract: Image rain removal is an important topic in the field of computer vision. In rainy conditions, rain seriously degrades imaging quality, causing image deformation, blur, poor visibility, and other problems, so outdoor vision systems cannot accurately perform object detection, monitoring, and other tasks. Effectively eliminating rain interference therefore has significant practical value. In the absence of inter-frame temporal information, the bottleneck of single-image rain removal is how to effectively remove rain streaks of multiple densities while preserving the detailed structure of the image background. To address these problems, this paper proposes a new image rain removal algorithm based on a non-local network structure and a spatial attention mechanism. First, positional relations between different pixels are obtained by the non-local operation to produce a global image representation. Second, the spatial attention mechanism recalibrates the global information along the spatial dimension; in other words, nonlinear modeling on the channel dimension gathers similar features and useful information. Finally, deconvolution and long-distance residual connections restore the size of the derained image layer by layer. Analysis and experimental results show that the proposed algorithm effectively removes rain marks, solves the practical difficulty of removing rain streaks of different densities, and preserves image details and edge information well.

2 citations
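A minimal PyTorch sketch of the two named building blocks: a non-local block that models global pixel-to-pixel relations, and a spatial attention gate that recalibrates features. Channel sizes and kernel widths are illustrative, not the paper's architecture:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NonLocalBlock(nn.Module):
    """Self-attention over all spatial positions (global representation)."""
    def __init__(self, c):
        super().__init__()
        self.theta = nn.Conv2d(c, c // 2, 1)
        self.phi = nn.Conv2d(c, c // 2, 1)
        self.g = nn.Conv2d(c, c // 2, 1)
        self.out = nn.Conv2d(c // 2, c, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)   # B x HW x C/2
        k = self.phi(x).flatten(2)                     # B x C/2 x HW
        v = self.g(x).flatten(2).transpose(1, 2)       # B x HW x C/2
        attn = F.softmax(q @ k, dim=-1)                # pairwise pixel affinities
        y = (attn @ v).transpose(1, 2).reshape(b, c // 2, h, w)
        return x + self.out(y)                         # residual connection

class SpatialAttention(nn.Module):
    """Per-pixel gate learned from channel-pooled statistics."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, 7, padding=3)

    def forward(self, x):
        s = torch.cat([x.mean(1, keepdim=True), x.amax(1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.conv(s))

feat = torch.randn(1, 64, 32, 32)                      # dummy feature map
print(SpatialAttention()(NonLocalBlock(64)(feat)).shape)
```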





DOI
TL;DR: The proposed work uses a Multi-class Support Vector Machine (MSVM) as the classifier and three distinct architectures with varied filter widths to obtain different performance characteristics, and can reduce the overall number of parameters and execution time with comparable accuracy.
Abstract: Deep learning innovations have paved the way for effective classification algorithms based on Convolutional Neural Networks (CNNs). Current practice uses very deep networks to improve overall accuracy, but this depth results in increased complexity, a high number of parameters, longer execution time, and a more demanding hardware platform. Our research focuses on minimizing this architectural complexity. To achieve this, we employ a multi-channel CNN with shallow layers, consisting of a main channel and side channels. The proposed work uses a Multi-class Support Vector Machine (MSVM) as the classifier and three distinct architectures with varied filter widths to obtain different performance characteristics. All models are trained and tested on a brain tumor type database, and performance is compared against deep architectures such as AlexNet, VGG16, VGG19, and ResNet-50. For the same database, our model reduces the overall number of parameters and execution time while achieving comparable accuracy. To further improve efficiency, our final architecture includes a skip connection.
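A minimal sketch of the shallow multi-channel idea: parallel convolutional channels with different filter widths whose pooled features feed a multi-class SVM. The filter widths, channel counts, and dummy data below are illustrative assumptions, not the paper's exact design:

```python
import torch
import torch.nn as nn
from sklearn.svm import SVC

class MultiChannelCNN(nn.Module):
    """Shallow feature extractor: parallel channels with varied filter widths."""
    def __init__(self):
        super().__init__()
        self.channels = nn.ModuleList([
            nn.Sequential(nn.Conv2d(1, 16, k, padding=k // 2), nn.ReLU(),
                          nn.AdaptiveAvgPool2d(4))
            for k in (3, 5, 7)                          # illustrative filter widths
        ])

    def forward(self, x):
        # concatenate flattened features from all channels
        return torch.cat([c(x).flatten(1) for c in self.channels], dim=1)

extractor = MultiChannelCNN().eval()
with torch.no_grad():
    feats = extractor(torch.randn(8, 1, 128, 128)).numpy()  # 8 dummy image slices
labels = [0, 1, 2, 0, 1, 2, 0, 1]                           # dummy tumor-type classes
msvm = SVC(kernel="rbf").fit(feats, labels)                 # multi-class SVM head
print(msvm.predict(feats[:2]))
```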





Journal ArticleDOI
TL;DR: An extensive review and summary of Big Data mining techniques, covering the most common data mining algorithms suitable for handling large datasets, their general pros and cons, and the corresponding fields where they apply.
Abstract: The Big Data revolution is taking place due to the evolution of technology, which enables firms to gather extremely large amounts of data and disseminate knowledge to their customers, partners, and competitors in the marketplace [1]. The deeper we dive into technology, the more we blend the physical with the virtual world; consider, for instance, the IoT (Internet of Things) as a network of physical devices connected together and able to exchange data. There are many Big Data platforms a company can choose, such as Hadoop and Apache Spark, to analyze large sets of data. Moreover, many data mining techniques, such as Classification, Clustering Analysis, Correlation Analysis, Decision Tree Induction, and Regression Analysis, can be used to identify patterns for knowledge discovery. This paper presents an extensive review and summary of Big Data mining techniques, covering the most common data mining algorithms suitable for handling large datasets. The review depicts the general pros and cons of these algorithms and the corresponding fields where they apply, and serves as a guideline for data mining researchers on which algorithms to choose based on their needs and their datasets.




DOI
TL;DR: An improved meta-heuristic algorithm called the Fitness Adaptive-Bird Swarm Algorithm is used for optimal power allocation and optimal relay selection, and the results demonstrate the competitive performance of the proposed model compared with conventional models in terms of throughput and energy efficiency.
Abstract: This paper develops an intelligent resource optimization model for cooperative IoT, with optimal power allocation and optimal relay selection as the two main objective functions. An improved meta-heuristic algorithm called the Fitness Adaptive-Bird Swarm Algorithm (FA-BSA) is used for both. Experiments are conducted on a cooperative multi-hop network topology. Optimal resource allocation is performed by minimizing the total transmit power, and optimal relay selection by satisfying the Quality of Service (QoS) requirements; hence, an energy-efficient protocol is achieved. Simulation results demonstrate the competitive performance of the proposed model against conventional models in terms of throughput and energy efficiency.
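The abstract does not give the FA-BSA update rules, so the sketch below only illustrates the optimization framing: a generic swarm-style search that minimizes total transmit power, with a penalty for violating an assumed minimum-rate QoS constraint. All constants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N_RELAYS, SWARM, ITERS = 4, 30, 200
GAIN = rng.uniform(0.5, 2.0, N_RELAYS)      # illustrative channel gains
R_MIN, NOISE = 2.0, 0.1                     # assumed QoS rate floor, noise power

def fitness(p):
    rate = np.sum(np.log2(1 + GAIN * p / NOISE))
    penalty = 1e3 * max(0.0, R_MIN - rate)  # penalize QoS violations
    return p.sum() + penalty                # total transmit power + penalty

pos = rng.uniform(0.01, 1.0, (SWARM, N_RELAYS))   # candidate power vectors
best = min(pos, key=fitness).copy()
for _ in range(ITERS):
    # move each particle toward the swarm best with random perturbation
    pos += 0.3 * (best - pos) + 0.05 * rng.standard_normal(pos.shape)
    pos = np.clip(pos, 1e-3, 1.0)
    cand = min(pos, key=fitness)
    if fitness(cand) < fitness(best):
        best = cand.copy()

print("power per relay:", best.round(3), "total:", best.sum().round(3))
```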






DOI
TL;DR: In this paper, the authors proposed a traffic-aware autonomous scheduling technique for Industrial Internet of Things (IIoT) networks, which divides the slotframe into a number of segments and allocates a consecutive segment to each node according to its hop distance from the root.
Abstract: Wireless low-power communication technologies play a vital role in building Industrial Internet of Things (IIoT) systems. The 6TiSCH layer is being developed for IIoT applications to run an IPv6 upper stack over the IEEE 802.15.4e TSCH MAC, and it defines the scheduling scheme needed to meet stringent industrial requirements and advance QoS. This paper proposes a novel 6TiSCH traffic-aware autonomous scheduling technique for IIoT networks. In this technique, the schedule is created according to the traffic condition at each node, reducing packet loss. It divides the slotframe into a number of segments and allocates a consecutive segment to each node according to its hop distance from the root. Supplementary cells are allocated to nodes close to the root, since these carry more traffic than other nodes in the network. The proposed technique outperforms other scheduling techniques, improving the Packet Delivery Ratio (PDR) and reducing end-to-end delay: PDR improves by up to 23% and delay is reduced by up to 37%.
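A minimal sketch of the segment allocation rule described above: the slotframe is split into equal segments, and a node draws its cells from the segment indexed by its hop distance from the root. The slotframe length and segment count are illustrative assumptions:

```python
SLOTFRAME_LEN = 101      # slots per slotframe (illustrative)
NUM_SEGMENTS = 4         # e.g., one segment per hop ring (illustrative)

def segment_for_node(hop_distance, slotframe_len=SLOTFRAME_LEN,
                     num_segments=NUM_SEGMENTS):
    """Return the (start, end) slot range a node may autonomously use."""
    seg = min(hop_distance - 1, num_segments - 1)   # clamp nodes deeper than the last segment
    size = slotframe_len // num_segments
    return seg * size, (seg + 1) * size - 1

for hop in (1, 2, 3, 4, 5):
    # nodes closer to the root would additionally receive supplementary cells
    print(f"hop {hop}: slots {segment_for_node(hop)}")
```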