scispace - formally typeset
Author

Akshay Jain

Bio: Akshay Jain is an academic researcher. The author has contributed to research in the topics of steganography and steganography tools, has an h-index of 2, and has co-authored 3 publications receiving 87 citations.

Papers
Journal ArticleDOI
TL;DR: Various algorithms of the decision tree (ID3, C4.5, CART), their features, advantages, and disadvantages are discussed.
Abstract: Computer and computer-network technology have advanced greatly and continue to develop at pace. As a result, the amount of data in the information industry grows day by day. This large amount of data can be analyzed to extract useful knowledge: hidden patterns in the data are identified and categorized into useful knowledge, a process known as data mining. Among the various data mining techniques, the decision tree is a popular one. Decision trees use divide and conquer as their basic learning strategy. A decision tree is a flowchart-like structure in which each internal node represents a "test" on an attribute, each branch represents an outcome of the test, and each leaf node represents a class label. This paper discusses various decision tree algorithms (ID3, C4.5, CART), along with their features, advantages, and disadvantages.
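The ID3 algorithm discussed in the paper chooses, at each internal node, the attribute whose test yields the highest information gain. A minimal stdlib-only sketch of that selection step (the toy weather data and attribute layout are illustrative, not taken from the paper):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Entropy reduction from splitting the rows on one attribute."""
    base = entropy(labels)
    splits = {}
    for row, label in zip(rows, labels):
        splits.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(subset) / len(labels) * entropy(subset)
                    for subset in splits.values())
    return base - remainder

# Toy data: each row is (outlook, windy); labels say whether to play.
rows = [("sunny", "no"), ("sunny", "no"), ("rain", "yes"), ("rain", "no")]
labels = ["no", "no", "yes", "yes"]

# ID3's greedy choice: split on the attribute with the largest gain.
best = max(range(2), key=lambda i: information_gain(rows, labels, i))
```

Here splitting on `outlook` (index 0) separates the classes perfectly, so ID3 would place that test at the root; C4.5 replaces raw gain with gain ratio to penalize many-valued attributes.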

120 citations

Journal ArticleDOI
TL;DR: The proposed approach combines compression, data hiding, and encryption: lossy compression makes transmission and storage of digital data faster, and lossless LSB steganography embeds the data in the compressed image.
Abstract: The paper presents a new approach to image steganography. Information security is an important research field. Steganography hides a message inside a cover file to form a stego file. Image steganography needs a method that increases security, reduces distortion in the stego file, and recovers the data without any loss. In the era of multimedia and the Internet, transmission time must also be reduced. The proposed approach combines compression, data hiding, and encryption. Lossy compression is used to make transmission and storage of digital data faster, and data hiding is then performed on the compressed image. The stego image is encrypted using AES to ensure user authentication: only a receiver holding both the encryption key and the data-hiding key can recover the secret message. In the data-hiding stage, nodes are selected randomly, and lossless LSB steganography is applied to the selected nodes.
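The LSB embedding used in the data-hiding stage replaces the least significant bit of selected cover values with message bits, so each pixel value changes by at most 1 and the bits are exactly recoverable. A minimal sketch on a list of 8-bit pixel values (illustrative only; the paper additionally compresses the cover, picks the embedding nodes with a key-driven random selection, and encrypts the stego image with AES):

```python
def embed_lsb(pixels, bits):
    """Write each message bit into the least significant bit of a pixel."""
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit   # clear the LSB, then set it
    return stego

def extract_lsb(pixels, n_bits):
    """Read back the first n_bits least significant bits."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [200, 113, 54, 77, 92, 31]      # toy grayscale pixel values
message = [1, 0, 1, 1]
stego = embed_lsb(cover, message)
recovered = extract_lsb(stego, len(message))
```

Because only the lowest bit is touched, the distortion per pixel is at most one intensity level, which is why LSB embedding is visually hard to detect on natural images.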

2 citations

Journal ArticleDOI
TL;DR: The smart Android application is a unified framework covering the major cases of phone use, placement, orientation, and interaction under realistic user habits, and it accounts for both energy consumption and user-friendliness.
Abstract: Context awareness is becoming increasingly vital for a variety of mobile and pervasive applications on today's smartphones. While human-centric contexts (e.g., indoor/outdoor, at home/office, driving/walking) have been extensively researched, few attempts have studied context from the phone's perspective (e.g., on a table/sofa, in a pocket/bag/hand). We refer to such immediate surroundings, typically a few to a dozen centimeters around a phone, as its micro-environment. The main aim is to design and implement a smart application: a micro-environment sensing platform that automatically records sensor data and characterizes the micro-environment of a smartphone. The platform runs as a daemon process on the smartphone and provides finer-grained environmental information to upper-layer applications via programming interfaces. The smart Android application is a unified framework covering the major cases of phone use, placement, orientation, and interaction under realistic user habits. As long-running middleware, it considers both energy consumption and user-friendliness.
Proceedings ArticleDOI
16 Dec 2022
TL;DR: In this article, the authors define anti-forensics and the main strategies attackers use to conceal or delay forensic investigations, including deception techniques aimed at digital forensics investigators and the anti-forensics tools fraudsters use to hide their tracks from computer forensics professionals after a data breach or malware campaign.
Abstract: In the modern technological world, computerized cloud-based digital data storage continues to grow, and data is increasingly shifted to online cloud-based environments. Once data is online, many issues arise, such as hacking, data theft, data breaches, data virtualization, data validation, and user authentication. Much cloud storage is closed due to non-payment, yet the data remains; neither an admin nor a developer can then manage access to it, and the data may eventually leak onto the Internet because no user or admin controls it. This paper presents conclusions, solutions, and challenges affecting the forensic investigation of cloud-based data. Anti-forensic techniques are especially dangerous because they leave zero or minimal trace evidence, which hampers forensic investigations. Anti-forensics tactics are intended to thwart digital forensics investigators and include deception and methods to mislead them. Furthermore, fraudsters use anti-forensics technologies to conceal their tracks from computer forensics professionals following a data breach or malware campaign. This article defines anti-forensics and the main strategies attackers use to conceal or delay forensic investigations.

Cited by
Journal ArticleDOI
TL;DR: A comprehensive survey on Intrusion Detection System (IDS) for IoT is presented and various IDS placement strategies and IDS analysis strategies in IoT architecture are discussed, along with Machine Learning (ML) and Deep Learning techniques for detecting attacks in IoT networks.
Abstract: The Internet of Things (IoT) is a widely accepted technology in both industry and academia. The objective of IoT is to combine the physical environment with the cyber world to create one large intelligent network. The technology has been applied to application domains such as smart homes, smart cities, healthcare, wireless sensor networks, cloud environments, enterprise networks, web applications, and smart grid technologies. These emerging applications across a variety of domains raise many security issues, such as protecting devices and networks, defending against attacks on IoT networks, and managing resource-constrained IoT networks. To address these scalability and resource-constraint security issues, many security solutions have been proposed for IoT, such as web application firewalls and intrusion detection systems. In this paper, a comprehensive survey of Intrusion Detection Systems (IDS) for IoT is presented, covering the years 2015–2019. We discuss various IDS placement strategies and IDS analysis strategies in the IoT architecture. The paper also discusses various intrusions in IoT, along with Machine Learning (ML) and Deep Learning (DL) techniques for detecting attacks in IoT networks, and closes with security issues and challenges in IoT.

107 citations

Journal ArticleDOI
TL;DR: A novel optimized deep learning approach based on binary particle swarm optimization with decision tree (BPSO-DT) and convolutional neural network (CNN) to classify different types of cancer based on tumor RNA sequence (RNA-Seq) gene expression data is introduced.
Abstract: Cancer is one of the most feared and aggressive diseases in the world and is responsible for more than 9 million deaths globally. Staging cancer early increases the chances of recovery, and one staging technique is RNA sequence analysis. Recent advances in the efficiency and accuracy of artificial intelligence techniques and optimization algorithms have facilitated the analysis of human genomics. This paper introduces a novel optimized deep learning approach based on binary particle swarm optimization with decision tree (BPSO-DT) and a convolutional neural network (CNN) to classify different types of cancer from tumor RNA sequence (RNA-Seq) gene expression data. The cancer types investigated are kidney renal clear cell carcinoma (KIRC), breast invasive carcinoma (BRCA), lung squamous cell carcinoma (LUSC), lung adenocarcinoma (LUAD), and uterine corpus endometrial carcinoma (UCEC). The proposed approach consists of three phases. The first phase is preprocessing, which first reduces the high-dimensional RNA-Seq data by selecting optimal features using BPSO-DT and then converts the optimized RNA-Seq features into 2D images. The second phase is augmentation, which enlarges the original dataset of 2,086 samples fivefold; the augmentation techniques were chosen to have the least impact on the image features. This phase helps to overcome overfitting and trains the model to achieve better accuracy. The third phase is the deep CNN architecture, with two main convolutional layers for feature extraction and two fully connected layers to classify the five types of cancer represented in the dataset. The results and the performance metrics, such as recall, precision, and F1 score, show that the proposed approach achieved an overall testing accuracy of 96.90%. Comparative results show that the proposed method outperforms related work in testing accuracy across the five cancer classes, while being less complex and consuming less memory.
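The preprocessing phase described above turns a selected 1D feature vector into a 2D image that a CNN can consume. A plausible stdlib-only sketch of that reshaping step, zero-padding the vector up to the next square grid (the grid size and padding policy are assumptions for illustration, not details from the paper):

```python
import math

def vector_to_image(features):
    """Pack a 1D feature vector into the smallest square grid, zero-padded."""
    side = math.ceil(math.sqrt(len(features)))      # smallest side that fits
    padded = features + [0.0] * (side * side - len(features))
    return [padded[r * side:(r + 1) * side] for r in range(side)]

# Five selected expression values become a 3x3 "image" with four zero pads.
img = vector_to_image([0.2, 0.5, 0.1, 0.9, 0.4])
```

The appeal of this kind of mapping is that standard 2D convolution and image-augmentation machinery can then be reused on tabular gene expression data.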

62 citations

Journal ArticleDOI
TL;DR: A review of explainable artificial intelligence (XAI) in which the authors analyze and review various XAI methods, grouped into (i) pre-modeling explainability, (ii) interpretable models, and (iii) post-modeling explainability.
Abstract: Thanks to the exponential growth in computing power and vast amounts of data, artificial intelligence (AI) has witnessed remarkable developments in recent years, enabling it to be ubiquitously adopted in our daily lives. Even though AI-powered systems have brought competitive advantages, their black-box nature makes them lack transparency and prevents them from explaining their decisions. This issue has motivated the introduction of explainable artificial intelligence (XAI), which promotes AI algorithms that can expose their internal process and explain how they made their decisions. The volume of XAI research has increased significantly in recent years, but a unified and comprehensive review of the latest progress has been lacking. This review aims to bridge that gap by identifying the critical perspectives in the rapidly growing body of XAI research. After offering the reader a solid XAI background, we analyze and review various XAI methods, grouped into (i) pre-modeling explainability, (ii) interpretable models, and (iii) post-modeling explainability. We also pay attention to current methods dedicated to interpreting and analyzing deep learning models. In addition, we systematically discuss various XAI challenges, such as the trade-off between performance and explainability, evaluation methods, security, and policy. Finally, we present the standard approaches used to address these challenges.

54 citations

Journal ArticleDOI
24 Oct 2018
TL;DR: This research focuses on the development of a novel robot learning architecture that uniquely combines learning from demonstration (LfD) and reinforcement learning (RL) algorithms to effectively teach socially assistive robots personalized behaviors.
Abstract: Socially assistive robots can autonomously provide activity assistance to vulnerable populations, including those living with cognitive impairments. To provide effective assistance, these robots should be capable of displaying appropriate behaviors and personalizing them to a user's cognitive abilities. Our research focuses on the development of a novel robot learning architecture that uniquely combines learning from demonstration (LfD) and reinforcement learning (RL) algorithms to effectively teach socially assistive robots personalized behaviors. Caregivers can demonstrate a series of assistive behaviors for an activity to the robot, which it uses to learn general behaviors via LfD. This information is used to obtain initial assistive state-behavior pairings using a decision tree. Then, the robot uses an RL algorithm to obtain a policy for selecting the appropriate behavior personalized to the user's cognition level. Experiments were conducted with the socially assistive robot Casper to investigate the effectiveness of our proposed learning architecture. Results showed that Casper was able to learn personalized behaviors for the new assistive activity of tea-making, and that combining LfD and RL algorithms significantly reduces the time required for a robot to learn a new activity.
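The architecture above seeds state-behavior pairings from caregiver demonstrations and then refines behavior selection with RL. A toy stdlib-only sketch of that refinement idea using a tabular, one-step (bandit-style) Q update, with the demonstration-derived pairings encoded as an initial Q-value bias; the states, behaviors, and reward signal are invented for illustration, and the paper's actual state and reward design is far richer:

```python
import random

random.seed(0)
states = ["prompt_needed", "user_confused"]
behaviors = ["verbal_prompt", "gesture_demo"]

# LfD phase: demonstrations suggest initial state-behavior pairings,
# represented here as a small optimistic bias in the Q-table.
Q = {(s, b): 0.0 for s in states for b in behaviors}
Q[("prompt_needed", "verbal_prompt")] = 0.5
Q[("user_confused", "gesture_demo")] = 0.5

def reward(state, behavior):
    """Stand-in for user-response feedback observed during the activity."""
    good = {("prompt_needed", "verbal_prompt"), ("user_confused", "gesture_demo")}
    return 1.0 if (state, behavior) in good else 0.0

alpha, epsilon = 0.1, 0.2
for _ in range(500):
    s = random.choice(states)
    if random.random() < epsilon:                       # explore
        b = random.choice(behaviors)
    else:                                               # exploit current estimate
        b = max(behaviors, key=lambda x: Q[(s, x)])
    # One-step update; full Q-learning also bootstraps from the next state.
    Q[(s, b)] += alpha * (reward(s, b) - Q[(s, b)])

policy = {s: max(behaviors, key=lambda x: Q[(s, x)]) for s in states}
```

The demonstration-derived bias is what shortens learning: exploitation starts from a sensible pairing instead of a uniform guess, mirroring the paper's finding that combining LfD with RL reduces the time needed to learn a new activity.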

48 citations

Journal ArticleDOI
TL;DR: The Random Forest and J48 algorithms are used to obtain a sustainable and practicable model for detecting the various stages of CKD with comprehensive medical accuracy; comparative analysis revealed that J48 predicted all CKD stages better than Random Forest, with an accuracy of 85.5%.
Abstract: Chronic Kidney Disease (CKD), a gradual decrease in renal function spanning several months to years without major symptoms, is a life-threatening disease. It progresses through six stages according to severity, categorized by the Glomerular Filtration Rate (GFR), which in turn is estimated from several attributes, such as age, sex, race, and serum creatinine. Among the available models for estimating GFR, the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation, a linear model, has been found to be quite efficient because it allows detection of all CKD stages. Early detection and treatment of CKD are highly desirable, as they can prevent unwanted consequences. Machine learning methods have recently been widely advocated for early detection of symptoms and diagnosis of several diseases. With the same motivation, the aim of this study is to predict the various stages of CKD using machine learning classification algorithms on a dataset obtained from the medical records of affected people. Specifically, we used the Random Forest and J48 algorithms to obtain a sustainable and practicable model for detecting the various stages of CKD with comprehensive medical accuracy. Comparative analysis of the results revealed that J48 predicted all CKD stages better than Random Forest, with an accuracy of 85.5%. The study concluded that the model may be used to build an automated system for detecting the severity of CKD.
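The CKD-EPI model referenced above estimates GFR from serum creatinine, age, and sex. A sketch of the 2009 CKD-EPI creatinine equation as commonly published, without the race term (coefficients reproduced from memory as an illustration; verify against the original source, and note that a revised 2021 version exists, before any real use):

```python
def ckd_epi_egfr(scr_mg_dl, age, female):
    """2009 CKD-EPI creatinine equation, eGFR in mL/min/1.73 m^2."""
    kappa = 0.7 if female else 0.9          # sex-specific creatinine threshold
    alpha = -0.329 if female else -0.411    # exponent below the threshold
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha      # applies when creatinine is low
            * max(ratio, 1.0) ** -1.209     # applies when creatinine is high
            * 0.993 ** age)                 # age decay factor
    return egfr * 1.018 if female else egfr

egfr = ckd_epi_egfr(scr_mg_dl=1.2, age=50, female=False)
```

The piecewise `min`/`max` structure is why the equation remains accurate across all CKD stages: a different creatinine exponent applies below and above the sex-specific threshold.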

36 citations