Author

Subhendu Kumar Pani

Bio: Subhendu Kumar Pani is an academic researcher from Biju Patnaik University of Technology. The author has contributed to research in the topics of Computer science and Feature selection, has an h-index of 8, and has co-authored 45 publications receiving 254 citations. Previous affiliations of Subhendu Kumar Pani include Veer Surendra Sai University of Technology and Orissa Engineering College.


Papers
Journal ArticleDOI
TL;DR: A comparative analysis of data classification accuracy using Liver disorder data in different scenarios is presented and the predictive performances of popular classifiers are compared quantitatively.

77 citations
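A minimal sketch of the kind of quantitative classifier comparison the paper describes, using scikit-learn cross-validation; the synthetic data, classifier choices, and settings below are illustrative assumptions, not the paper's actual liver-disorder dataset or experimental setup.

```python
# Sketch: compare predictive performance of popular classifiers with cross-validation.
# The data is a synthetic stand-in for liver-disorder records; the paper's exact
# preprocessing and classifier settings are not reproduced here.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC

# Placeholder data standing in for the liver-disorder records.
X, y = make_classification(n_samples=345, n_features=6, random_state=0)

classifiers = {
    "Naive Bayes": GaussianNB(),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(),
}

for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)  # 10-fold accuracy
    print(f"{name:>13}: {scores.mean():.3f} +/- {scores.std():.3f}")
```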

Journal ArticleDOI
TL;DR: A comprehensive review of mainstream consensus protocols such as Delegated Proof of Stake (DPoS), Proof of Activity (PoA) and Proof of Work (PoW) is presented in this article.
Abstract: As Blockchain innovation gains popularity in many areas, it is frequently hailed as a sound technology. Because of decentralization and encryption, many imagine that data stored in a Blockchain is, and will always remain, protected. Among the abstraction layers of the Blockchain architecture, the consensus layer is the core component behind the performance and security of the Blockchain network. Consensus mechanisms are a critical component of a Blockchain system's long-term stability, and consensus forms the core of blockchain technology. A range of consensus protocols has therefore been introduced to maximize the efficiency of Blockchain systems and meet the individual needs of application domains. This research paper describes the layered architecture of Blockchain and presents a comprehensive review of the mainstream consensus protocols, mainly Proof of Work (PoW), Proof of Stake (PoS), Delegated Proof of Stake (DPoS), and Proof of Activity (PoA). These protocols are explained and a detailed performance analysis of them is carried out. We propose a performance matrix of these consensus protocols based on parameters such as degree of decentralization, latency, fault tolerance rate, and scalability. Since consensus protocols are the core of a strongly fault-tolerant, secure blockchain system, the proposed work intends to help in appropriate protocol selection and in further research on strengthening trust and ownership in the technology. For example, decentralization is low in PoA compared to the other protocols, whereas PoW is not scalable; so, depending on the priority given to a particular performance parameter, the paper helps in selecting a specific protocol.

49 citations
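As a rough sketch of the proposed performance matrix and parameter-driven protocol selection, the snippet below encodes qualitative scores for the four protocols and picks the best one for a prioritized parameter; all scores are placeholder assumptions, not values reported in the paper.

```python
# Sketch: qualitative performance matrix for consensus protocols.
# Scores are illustrative placeholders on a 1 (poor) to 3 (good) scale,
# not measurements from the paper.
matrix = {
    "PoW":  {"decentralization": 3, "latency": 1, "fault_tolerance": 2, "scalability": 1},
    "PoS":  {"decentralization": 2, "latency": 2, "fault_tolerance": 2, "scalability": 2},
    "DPoS": {"decentralization": 1, "latency": 3, "fault_tolerance": 2, "scalability": 3},
    "PoA":  {"decentralization": 1, "latency": 3, "fault_tolerance": 2, "scalability": 3},
}

def best_protocol(priority: str) -> str:
    """Return the protocol scoring highest on the prioritized parameter."""
    return max(matrix, key=lambda p: matrix[p][priority])

print(best_protocol("scalability"))        # e.g. DPoS under these placeholder scores
print(best_protocol("decentralization"))   # e.g. PoW
```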

Journal ArticleDOI
01 Apr 2011
TL;DR: An overview of web usage mining is presented and a survey of the pattern extraction algorithms used for web usage mining is provided.
Abstract: As the size of the web increases along with the number of users, it is essential for website owners to better understand their customers so that they can provide better service and enhance the quality of the website. To achieve this, they depend on web access log files, which can be mined to extract interesting patterns so that user behaviour can be understood. This paper presents an overview of web usage mining and provides a survey of the pattern extraction algorithms used for web usage mining.

49 citations
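A minimal sketch of the first step of web usage mining described above: parsing a web access log (Common Log Format is assumed here) and counting the most requested pages. The pattern extraction algorithms surveyed in the paper go well beyond this simple frequency count.

```python
# Sketch: mine a web access log for the most frequently requested pages.
# Assumes Common Log Format lines, e.g.
# 127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326
import re
from collections import Counter

LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)')

def top_pages(log_path: str, n: int = 10):
    counts = Counter()
    with open(log_path) as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if m and m.group(5) == "200":   # successful requests only
                counts[m.group(4)] += 1     # requested URL path
    return counts.most_common(n)

# Example usage (the log file path is hypothetical):
# for url, hits in top_pages("access.log"):
#     print(f"{hits:6d}  {url}")
```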

Journal ArticleDOI
TL;DR: In this article, the authors propose an image steganography procedure that combines several algorithms to strengthen the security of the secret data, using a Binary Bit-Plane Decomposition (BBPD) based image encryption technique.
Abstract: The Internet of Things (IoT) is a domain where big data is transferred every single second. Securing these data is a challenging task; however, the security challenges can be mitigated with cryptography and steganography techniques. These techniques are crucial for user authentication and data privacy. In the proposed work, a highly secured technique is proposed using an IoT protocol and steganography. This work proposes an image steganography procedure that combines several algorithms to strengthen the security of the secret data, using a Binary Bit-Plane Decomposition (BBPD) based image encryption technique. Thereafter, a Salp Swarm Optimization Algorithm (SSOA) based adaptive embedding process is proposed to increase the payload capacity by setting different parameters in the steganographic embedding function for edge and smooth blocks; the SSOA is used to localize the edge and smooth blocks efficiently. A hybrid fuzzy neural network with a backpropagation learning algorithm is then used to enhance the quality of the stego images, and the stego images are transferred to the destination over a highly secured IoT protocol. The proposed steganography technique shows better results in terms of security, image quality, and payload capacity in comparison with existing state-of-the-art methods.

45 citations
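To make the Binary Bit-Plane Decomposition (BBPD) step concrete, here is a small NumPy sketch that splits an 8-bit grayscale image into its eight bit planes and reassembles them; the encryption, SSOA-based embedding, and fuzzy neural network stages of the paper are not reproduced.

```python
# Sketch: binary bit-plane decomposition of an 8-bit grayscale image.
# Only the decomposition/reconstruction step is shown; the paper's encryption
# and adaptive embedding stages are omitted.
import numpy as np

def bit_planes(img: np.ndarray) -> np.ndarray:
    """Return an array of shape (8, H, W) with bit plane k at index k (LSB first)."""
    return np.stack([(img >> k) & 1 for k in range(8)]).astype(np.uint8)

def reassemble(planes: np.ndarray) -> np.ndarray:
    """Invert bit_planes: recombine the 8 planes into the original image."""
    return sum((planes[k].astype(np.uint8) << k) for k in range(8)).astype(np.uint8)

img = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)  # stand-in cover image
planes = bit_planes(img)
assert np.array_equal(reassemble(planes), img)
```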

Journal ArticleDOI
TL;DR: The main objective of this paper is to provide a clear idea of e-Government (e-Gov) using cloud computing models and to outline the problems and requirements for understanding the e-Gov paradigm in India.

41 citations


Cited by
Christopher M. Bishop
01 Jan 2006
TL;DR: Probability distributions and linear models for regression and classification are given in this article, along with a discussion of combining models in the context of machine learning and classification.
Abstract: Probability Distributions; Linear Models for Regression; Linear Models for Classification; Neural Networks; Kernel Methods; Sparse Kernel Machines; Graphical Models; Mixture Models and EM; Approximate Inference; Sampling Methods; Continuous Latent Variables; Sequential Data; Combining Models.

10,141 citations

Journal ArticleDOI
TL;DR: The Handbook of Data Mining and Knowledge Discovery.

252 citations

Journal ArticleDOI
TL;DR: This review introduces disease prevention and its challenges, followed by traditional prevention methodologies, and summarizes the state-of-the-art data analytics algorithms used for disease classification, clustering, anomaly detection, and association, as well as their respective advantages, drawbacks, and selection guidelines.
Abstract: Medical data is one of the most rewarding and yet most complicated data to analyze. How can healthcare providers use modern data analytics tools and technologies to analyze and create value from complex data? Data analytics promises to efficiently discover valuable patterns by analyzing large amounts of unstructured, heterogeneous, non-standard, and incomplete healthcare data. It not only forecasts but also supports decision making, and it is increasingly seen as a breakthrough whose goal is to improve the quality of patient care and reduce healthcare costs. The aim of this study is to provide a comprehensive and structured overview of the extensive research on data analytics methods for disease prevention. The review first introduces disease prevention and its challenges, followed by traditional prevention methodologies. We summarize state-of-the-art data analytics algorithms used for the classification of disease, clustering (for example, identifying an unusually high incidence of a particular disease), anomaly detection (detection of disease), and association, together with their respective advantages, drawbacks, and guidelines for selecting a specific model, followed by a discussion of recent developments and successful applications of disease prevention methods. The article concludes with open research challenges and recommendations.

177 citations
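As a small illustration of the three algorithm families the review covers, the sketch below runs classification, clustering, and anomaly detection from scikit-learn on synthetic data standing in for patient records; it is not a reconstruction of any specific method from the review.

```python
# Sketch: the three algorithm families surveyed (classification, clustering,
# anomaly detection) applied to synthetic stand-in "patient" data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, IsolationForest
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Classification of disease (supervised).
clf_acc = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()

# Clustering, e.g. to spot groups with unusually high incidence (unsupervised).
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Anomaly detection, e.g. flagging atypical records.
outliers = IsolationForest(random_state=0).fit_predict(X)  # -1 marks anomalies

print(f"classification accuracy: {clf_acc:.2f}")
print(f"cluster sizes: {[int((clusters == c).sum()) for c in range(3)]}")
print(f"anomalies flagged: {int((outliers == -1).sum())}")
```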

Proceedings ArticleDOI
01 Jan 2017
TL;DR: The problem of farmers facing a serious setback in productivity is addressed by proposing a recommendation system based on an ensemble model with a majority voting technique, using Random Tree, CHAID, K-Nearest Neighbor, and Naive Bayes as learners, to recommend a crop for site-specific parameters with high accuracy and efficiency.
Abstract: Data mining is the practice of examining data and deriving purposeful information from it. Data mining finds application in various fields such as finance, retail, medicine, and agriculture. In agriculture, data mining is used to analyze various biotic and abiotic factors. Agriculture in India plays a predominant role in the economy and in employment. A common problem among Indian farmers is that they do not choose the right crop for their soil requirements, and as a result they face a serious setback in productivity. This problem has been addressed through precision agriculture, a modern farming technique that uses research data on soil characteristics, soil types, and crop yield data collection to suggest to farmers the right crop for their site-specific parameters. This reduces wrong crop choices and increases productivity. In this paper, the problem is solved by proposing a recommendation system based on an ensemble model with a majority voting technique, using Random Tree, CHAID, K-Nearest Neighbor, and Naive Bayes as learners, to recommend a crop for site-specific parameters with high accuracy and efficiency.

133 citations
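A minimal sketch of the majority-voting ensemble idea described above, using scikit-learn's VotingClassifier. Random Tree and CHAID are not available in scikit-learn, so a plain decision tree stands in for them, and the synthetic data below is an assumption rather than the paper's soil dataset.

```python
# Sketch: majority-voting ensemble for crop recommendation.
# A DecisionTreeClassifier stands in for the Random Tree / CHAID learners
# used in the paper; the data below is synthetic, not the paper's dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for site-specific soil parameters and crop labels.
X, y = make_classification(n_samples=300, n_features=7, n_classes=3,
                           n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),  # stand-in for Random Tree / CHAID
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("nb", GaussianNB()),
    ],
    voting="hard",  # majority voting
)
ensemble.fit(X_train, y_train)
print(f"test accuracy: {ensemble.score(X_test, y_test):.2f}")
```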

01 Jan 1998
TL;DR: This paper uses a formal approach to define important terms like fault, fault tolerance, and redundancy, which leads to four distinct forms of fault tolerance and to two main phases in achieving them: detection and correction.
Abstract: Fault tolerance in distributed computing is a wide area with a significant body of literature that is vastly diverse in methodology and terminology. This paper aims at structuring the area and thus guiding readers into this interesting field. We use a formal approach to define important terms like fault, fault tolerance, and redundancy. This leads to four distinct forms of fault tolerance and to two main phases in achieving them: detection and correction. We show that this can help to reveal inherently fundamental structures that contribute to understanding and unifying methods and terminology. By doing this, we survey many existing methodologies and discuss their relations. The underlying system model is the close-to-reality asynchronous message-passing model of distributed computing.

130 citations
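To ground the detection and correction phases named above, the sketch below shows one simple pattern in their spirit: a timeout acting as an (imperfect) failure detector and a retry loop masking the fault. This is an illustrative assumption, not a construction from the paper's formal model.

```python
# Sketch: timeout-based fault detection and retry-based correction,
# illustrating the detection/correction phases in a very simplified form.
import random
import time

class DetectedFault(Exception):
    """Raised when the detection phase decides the operation has failed for good."""

def unreliable_call() -> str:
    """Stand-in for a remote operation that sometimes fails to reply."""
    if random.random() < 0.5:
        raise TimeoutError("no reply")
    return "ok"

def call_with_fault_tolerance(retries: int = 3, backoff: float = 0.1) -> str:
    for attempt in range(1, retries + 1):
        try:
            return unreliable_call()            # normal case
        except TimeoutError:
            # Detection: the timeout is our (imperfect) failure detector.
            if attempt == retries:
                raise DetectedFault("gave up after retries")
            time.sleep(backoff * attempt)       # Correction: retry, masking the fault
    raise DetectedFault("unreachable")

try:
    print(call_with_fault_tolerance())
except DetectedFault as exc:
    print(f"operation failed: {exc}")
```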