
Showing papers by "Mayank Dave published in 2015"


Journal ArticleDOI
TL;DR: The experimental results demonstrate that this algorithm provides better robustness without affecting the quality of the watermarked image, and that it combines the advantages and removes the disadvantages of the two transform techniques.
Abstract: In this paper, the effects of different error correction codes on the robustness and imperceptibility of a discrete wavelet transform (DWT) and singular value decomposition (SVD) based dual watermarking scheme are investigated. Text and image watermarks are embedded into a cover radiological image for their potential application in secure and compact medical data transmission. Four error correcting codes, namely Hamming, Bose-Chaudhuri-Hocquenghem (BCH), Reed-Solomon and a hybrid code (BCH concatenated with a repetition code), are considered for encoding the text watermark in order to achieve additional robustness for sensitive text data such as the patient identification code. Performance of the proposed algorithm is evaluated against a number of signal processing attacks by varying the watermarking strength and cover image modalities. The experimental results demonstrate that this algorithm provides better robustness without affecting the quality of the watermarked image. The algorithm combines the advantages and removes the disadvantages of the two transform techniques. Of the three individual error correcting codes tested, Reed-Solomon shows the best performance. Further, a hybrid model concatenating two of the error correcting codes (BCH and repetition code) is implemented and is found to achieve better robustness. The paper provides a detailed analysis of the obtained experimental results.
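A minimal sketch of the DWT plus SVD embedding idea described above, using PyWavelets and NumPy. The Haar wavelet, embedding in the LL subband, and the additive gain factor `alpha` are illustrative assumptions not specified in the abstract; the text-watermark ECC path is omitted.

```python
import numpy as np
import pywt

def embed_dwt_svd(cover, watermark, alpha=0.05):
    # One-level 2-D DWT of the cover image (Haar chosen for simplicity).
    LL, (LH, HL, HH) = pywt.dwt2(cover.astype(float), "haar")
    # SVD of the approximation subband and of the (LL-sized) image watermark.
    U, S, Vt = np.linalg.svd(LL, full_matrices=False)
    Sw = np.linalg.svd(watermark.astype(float), compute_uv=False)
    # Additively perturb the singular values with the watermark's spectrum.
    LL_marked = U @ np.diag(S + alpha * Sw) @ Vt
    # Inverse DWT reassembles the watermarked image.
    return pywt.idwt2((LL_marked, (LH, HL, HH)), "haar")
```

Extraction would invert these steps using the saved singular vectors of the watermark; the text watermark would be ECC-encoded (e.g., with Reed-Solomon) before a similar embedding.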

103 citations


Journal ArticleDOI
TL;DR: Performance of the proposed watermarking algorithm is analyzed against numerous known attacks such as compression, filtering, noise, sharpening, scaling and histogram equalization; the desired outcome is obtained without much degradation of the extracted watermarks or the watermarked image quality.
Abstract: This paper presents a new spread-spectrum based secure multiple watermarking scheme for medical images in the wavelet transform domain, using selected discrete wavelet transform (DWT) coefficients for embedding. The proposed algorithm embeds text watermarks such as patient identification/source identification, represented as binary arrays using ASCII codes, and a doctor's signature or telemedicine centre name, represented in binary image format, into a host digital radiological image for potential telemedicine applications. The algorithm is based on a secure spread-spectrum technique in which pseudo-noise (PN) sequences are generated for each watermark bit and embedded column-wise into selected DWT coefficients of the subband. DWT coefficients are selected for embedding by thresholding the coefficient values present in each column. In the embedding process, the cover image is decomposed to the second DWT level. The image and text watermarks are embedded into selected coefficients of the first-level and second-level DWT, respectively. To enhance the robustness of text watermarks such as the patient identity code, an error correcting code (ECC) is applied to the ASCII representation of the text watermark before embedding. Results are obtained by varying the gain factor, subband decomposition level, watermark size, and medical image modality. Performance of the proposed watermarking algorithm is analyzed against numerous known attacks such as compression, filtering, noise, sharpening, scaling and histogram equalization; the desired outcome is obtained without much degradation of the extracted watermarks or the watermarked image quality. The method is compared with other reported techniques and is found to give superior robustness and imperceptibility.
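A hedged sketch of the column-wise PN-sequence step described above: a bipolar pseudo-noise sequence is added to (or subtracted from) a selected column of DWT coefficients for each watermark bit, and correlation recovers it. The seed, gain, and the omission of the threshold-based column selection are illustrative simplifications.

```python
import numpy as np

def embed_bit_pn(coeff_col, bit, gain=2.0, seed=0):
    # Bipolar PN sequence generated per watermark bit from a shared seed.
    pn = np.random.default_rng(seed).choice([-1.0, 1.0], size=coeff_col.shape)
    # A '1' adds the sequence, a '0' subtracts it.
    return coeff_col + gain * pn if bit else coeff_col - gain * pn

def extract_bit_pn(marked_col, seed=0):
    pn = np.random.default_rng(seed).choice([-1.0, 1.0], size=marked_col.shape)
    # The correlation sign recovers the bit when the gain is large enough
    # relative to the host coefficients.
    return int(np.dot(marked_col, pn) > 0)
```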

92 citations



Journal ArticleDOI
TL;DR: A secure multilevel watermarking scheme in which encrypted text acts as the watermark, based on a secure spread-spectrum technique for digital images in the discrete wavelet transform (DWT) domain, is presented.
Abstract: This paper presents a secure multilevel watermarking scheme in which encrypted text acts as the watermark. The algorithm is based on a secure spread-spectrum technique for digital images in the discrete wavelet transform (DWT) domain. Potential application of the proposed watermarking scheme is successfully demonstrated by embedding various medical watermarks in text format at different subband decomposition levels depending upon their performance requirements. In the embedding process, the cover CT scan image is decomposed up to the third DWT level. Different text watermarks, such as the patient's personal and medical record, diagnostic/image codes and doctor code/signature, are embedded into selected coefficients of the second and third DWT levels for potential telemedicine applications. DWT coefficients are selected for embedding by column-wise thresholding of coefficient values. Encryption is applied to the ASCII representation of the text, and the encoded text watermark is embedded. The algorithm correctly extracts the embedded watermarks without error and is robust against numerous known attacks without much degradation of the watermarked medical image quality.
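The abstract does not name the encryption scheme; the sketch below shows one simple symmetric option, a XOR keystream over the ASCII bit representation of the text watermark. The seeded keystream generator is an assumption for illustration.

```python
import numpy as np

def text_to_bits(text):
    # 8-bit ASCII representation of the text watermark.
    return np.unpackbits(np.frombuffer(text.encode("ascii"), dtype=np.uint8))

def xor_cipher(bits, seed=42):
    # XOR is its own inverse, so the same call decrypts with the same seed.
    keystream = np.random.default_rng(seed).integers(0, 2, size=bits.size,
                                                     dtype=np.uint8)
    return bits ^ keystream
```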

55 citations


Journal ArticleDOI
10 Feb 2015
TL;DR: It is found that the proposed scheme correctly extracts the embedded watermarks without error and provides a high degree of robustness against numerous known attacks while maintaining the imperceptibility of the watermarked image.
Abstract: This paper presents a new robust and secure digital watermarking scheme for its potential application in telemedicine. The algorithm embeds medical text watermarks such as the patient's identity, image identity code and doctor's identity into selected subband discrete wavelet transform (DWT) coefficients of the cover medical image using a spread-spectrum technique. In the embedding process, the cover image is decomposed up to the third level of DWT coefficients. Three different text watermarks are embedded into selected horizontal and vertical subband DWT coefficients of the first, second and third levels, respectively. These coefficients are selected for embedding based on threshold criteria defined in the paper. Robustness of the proposed watermarking scheme is further enhanced by applying an error correcting code to the ASCII representation of the text watermark; the encoded text watermark is then embedded into the cover medical image. It is found that the proposed scheme correctly extracts the embedded watermarks without error and provides a high degree of robustness against numerous known attacks while maintaining the imperceptibility of the watermarked image.
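The abstract applies an unnamed error correcting code to the ASCII text bits; a repetition code, the simplest ECC, is sketched below as an illustration, with majority-vote decoding.

```python
import numpy as np

def repetition_encode(bits, r=3):
    # Repeat each watermark bit r times before embedding.
    return np.repeat(bits, r)

def repetition_decode(received, r=3):
    # Majority vote over each group of r received bits.
    return (received.reshape(-1, r).sum(axis=1) > r // 2).astype(np.uint8)
```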

21 citations


Journal ArticleDOI
TL;DR: Two naive approaches and two state-of-the-art heuristics are implemented along with the proposed solution, based on a zero-gradient point inside the convex-hull polygon, to recover the lost connectivity of the partitioned WSN.

21 citations


Proceedings ArticleDOI
01 Dec 2015
TL;DR: An implementation of the Improved Bat Algorithm, which is based on the echolocation of bats, is presented to control congestion in Wireless Sensor Networks at the transport layer.
Abstract: Congestion in Wireless Sensor Networks (WSNs) has drawn the attention of many researchers in recent years. The challenge lies in developing a routing model that can find the optimized route on the basis of the distance between source and destination and the residual energy of the node. Various models have been proposed and developed over time, and their merits and demerits have been discussed. This paper presents an implementation of the Improved Bat Algorithm, which is based on the echolocation of bats, to control congestion in Wireless Sensor Networks at the transport layer. Simulation results show that as the number of hops on the data transmission path increases, the queue length decreases. Congestion in the network decreases because packets are transferred over different routes rather than accumulating at a single node. Two important factors, network lifetime and throughput, are also compared against the CODA (Congestion Detection and Avoidance) and Particle Swarm Optimization (PSO) algorithms.
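A compact sketch of the standard bat-algorithm loop the paper builds on, minimizing an assumed route-cost function (e.g., a combination of hop distance and residual energy). The population size, frequency range, loudness, and pulse rate are illustrative defaults; the paper's specific improvements are not reproduced here.

```python
import numpy as np

def bat_optimize(cost, dim, n_bats=20, iters=100, fmin=0.0, fmax=2.0,
                 loudness=0.9, pulse_rate=0.5, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_bats, dim))        # candidate route encodings
    v = np.zeros((n_bats, dim))                  # bat velocities
    fit = np.apply_along_axis(cost, 1, x)
    best = x[np.argmin(fit)].copy()
    for _ in range(iters):
        freq = fmin + (fmax - fmin) * rng.random(n_bats)  # echolocation frequency
        v += (x - best) * freq[:, None]
        cand = x + v
        # With probability (1 - pulse_rate), do a small local walk near the best bat.
        local = rng.random(n_bats) > pulse_rate
        cand[local] = best + 0.01 * rng.normal(size=(local.sum(), dim))
        cand_fit = np.apply_along_axis(cost, 1, cand)
        # Accept improvements with probability given by the loudness.
        ok = (cand_fit < fit) & (rng.random(n_bats) < loudness)
        x[ok], fit[ok] = cand[ok], cand_fit[ok]
        best = x[np.argmin(fit)].copy()
    return best
```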

8 citations


Journal ArticleDOI
TL;DR: The Firefly Algorithm, which relies on the attractiveness factor of fireflies, is implemented in this paper to control congestion in WSNs at the transport layer; the results show that the proposed approach outperforms Congestion Detection and Avoidance and Particle Swarm Optimization in terms of network lifetime and throughput.
Abstract: Congestion in Wireless Sensor Networks (WSNs) has been an issue of concern for many researchers in recent years. The key challenge is to develop an algorithm that can find the optimized route on the basis of parameters such as residual energy, number of retransmissions and the distance between source and destination. The Firefly Algorithm, which relies on the attractiveness factor of fireflies, is implemented in this paper to control congestion in WSNs at the transport layer. The results also show that the proposed approach performs better than Congestion Detection and Avoidance (CODA) and Particle Swarm Optimization (PSO) in terms of network lifetime and throughput.
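A minimal sketch of the firefly update rule referenced above: each firefly moves toward brighter (lower-cost) ones, with attractiveness decaying exponentially with distance. The cost function and all parameter values are illustrative assumptions.

```python
import numpy as np

def firefly_optimize(cost, dim, n=15, iters=100, beta0=1.0, gamma=1.0,
                     alpha=0.2, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n, dim))
    light = np.apply_along_axis(cost, 1, x)      # lower cost = brighter firefly
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:          # i is attracted toward brighter j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)  # attractiveness decays with distance
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
                    light[i] = cost(x[i])
    return x[np.argmin(light)]
```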

7 citations


Proceedings ArticleDOI
01 Sep 2015
TL;DR: This paper presents a two-tier detection scheme to detect parasite P2P botnets, able to detect bots in a monitored network with accuracy above 99% while addressing several shortcomings of previous detection approaches.
Abstract: Peer-to-Peer (P2P) botnets have emerged as a significant threat against network security because of their distributed platform. The decentralized nature of these botnets makes their detection very challenging, and the situation is aggravated when an existing P2P network is exploited for botnet creation (parasite botnets). In this paper, we present a two-tier detection scheme to detect parasite P2P botnets. Our approach detects botnets in their waiting stage itself, without requiring any seed information about bots or bot signatures. We consider two basic behaviors of botnets for detection: (i) long-living peers and (ii) search-request intensity. The approach is able to detect bots in a monitored network with accuracy above 99% while addressing several shortcomings of previous detection approaches.
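A hedged sketch of the two behavioral filters named above: flag peers that are both long-lived and issuing search requests at high intensity. The record fields and threshold values are illustrative assumptions, not the paper's exact parameters.

```python
def flag_suspects(peers, min_uptime_h=24.0, max_search_rate=5.0):
    # peers: iterable of dicts with 'id', 'uptime_h', 'searches_per_min'.
    suspects = []
    for p in peers:
        long_lived = p["uptime_h"] >= min_uptime_h          # tier 1: persistence
        intense = p["searches_per_min"] >= max_search_rate  # tier 2: search intensity
        if long_lived and intense:
            suspects.append(p["id"])
    return suspects
```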

7 citations


Journal ArticleDOI
TL;DR: A new energy-efficient recovery approach based on a two-point crossover genetic algorithm (GA) to reconnect the partitioned network is proposed, and the simulation results confirm the effectiveness of the proposed approach over state-of-the-art approaches.
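For illustration, a minimal two-point crossover operator of the kind named in the TL;DR, applied to list-encoded chromosomes (e.g., candidate relay placements); the encoding is an assumption, and the GA's fitness and selection steps are omitted.

```python
import random

def two_point_crossover(p1, p2, seed=None):
    # Swap the middle segment between two equal-length parent chromosomes.
    rng = random.Random(seed)
    i, j = sorted(rng.sample(range(1, len(p1)), 2))
    return p1[:i] + p2[i:j] + p1[j:], p2[:i] + p1[i:j] + p2[j:]
```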

7 citations


Proceedings ArticleDOI
04 Apr 2015
TL;DR: It is concluded that caching is the backbone of Content Centric Networking, which enables in-network caching at the network layer; the major advantages of CCN are short download times and low communication overhead.
Abstract: Internet use today is largely dominated by content dissemination, but the currently used IP architecture is based on connections between hosts. To disseminate content efficiently and cost-effectively, the Internet architecture needs to map "what" the user wants to "where" the hosting node is. Content Centric Networking (CCN) decouples content from hosts and enables in-network caching at the network layer. Each CCN node can cache content as well as forward it, so any node can act as a host and serve clients as if it were the actual host of the requested content. In CCN, content, not the connection, becomes the core of the network. The major advantages of CCN are short download times and low communication overhead. Since these advantages stem from in-network caching, it can be concluded that caching is the backbone of CCN. Selecting the appropriate router to cache content, so that it can serve future requests for a longer time, is therefore very important. The aim of this paper is to explain CCN functionality briefly; proposed caching strategies are also covered.
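A toy sketch of the per-node caching-and-forwarding behavior described above: serve an Interest from the local Content Store when possible, otherwise forward it and cache the returned Data. The LRU eviction policy and the `upstream_fetch` callback are illustrative assumptions.

```python
from collections import OrderedDict

class CCNNode:
    """Minimal CCN router: serve from the Content Store, else forward and cache."""
    def __init__(self, capacity=100):
        self.store = OrderedDict()          # content name -> data, in LRU order
        self.capacity = capacity

    def on_interest(self, name, upstream_fetch):
        if name in self.store:              # cache hit: act as the content host
            self.store.move_to_end(name)
            return self.store[name]
        data = upstream_fetch(name)         # miss: forward toward the producer
        self._cache(name, data)
        return data

    def _cache(self, name, data):
        if len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # evict the least-recently-used entry
        self.store[name] = data
```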

Proceedings ArticleDOI
01 Sep 2015
TL;DR: A Cost-Effective Caching algorithm is proposed which selects an eligible node based on its remaining capacity to cache more content; results show that this algorithm significantly improves network performance.
Abstract: Content Centric Networking (CCN) is a new networking architecture that aims to address current challenges faced by the TCP/IP architecture. The motivation behind CCN is to avoid repeated delivery of the same content (information). CCN avoids sending the same content again and again by caching the content at some intermediate node (router). In the CCN architecture, each node has storage capacity to cache propagating content. By caching content at intermediate nodes, CCN stops requests from reaching the original server of that content. An intermediate node that has the requested content in its cache can directly serve further requests for the same content, acting as a server. Hence, not all requests need to reach the original server, and a considerable amount of bandwidth is saved. To cache content, an algorithm is needed to select a subset of the available nodes that can serve new incoming requests for a longer time. In our research work, we propose a Cost-Effective Caching algorithm which selects an eligible node based on its remaining capacity to cache more content. The main motivation behind this algorithm is to cache content at a node which has enough storage left to cache it. We implemented this scheme and compared its performance with existing ones, and the results show that the algorithm significantly improves network performance.
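A one-function sketch of the selection rule described above, assuming each on-path router reports its total and used cache capacity (the field names are hypothetical):

```python
def pick_caching_node(path_nodes):
    # Cache at the on-path router with the most free capacity, so the new
    # copy is least likely to force evictions and can serve requests longer.
    return max(path_nodes, key=lambda n: n["capacity"] - n["used"])
```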

Proceedings ArticleDOI
01 Feb 2015
TL;DR: This paper proposes a semantic approach that gives a ranked list of services based on a web-based relatedness score and helps users select potentially relevant and semantically similar services within a category.
Abstract: Current description standards for Web Services, such as WSDL and UDDI, have a significant drawback of being restricted to the syntactic aspects of a service. A service provider registers a service in the universal repository, i.e. UDDI, so that service consumers can search for and discover, among thousands of registered services, the service that meets the user's functional requirements. Matching the user request against all services in a particular category of the repository is a cumbersome task. Semantic approaches are required to further assist the user in discovering relevant services. In this paper, we propose a semantic approach that gives a ranked list of services based on a web-based relatedness score and helps users select potentially relevant and semantically similar services within a category. The proposed approach has been implemented on 80 OWL-S services, and the results show that it produces a ranked list of services that eases the selection process for the user.
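A hedged sketch of the ranking step: score every service description against the query with a relatedness measure and sort. A bag-of-words cosine stands in here for the paper's web-based relatedness score.

```python
import math
from collections import Counter

def cosine_relatedness(a, b):
    # Simple bag-of-words cosine; a stand-in for the web-based score.
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_services(query, services):
    # services: (name, description) pairs; returns names, best match first.
    return [name for name, desc in
            sorted(services, key=lambda s: cosine_relatedness(query, s[1]),
                   reverse=True)]
```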

01 Jan 2015
TL;DR: A multi-criterion fuzzy-logic-based intra-cluster and inter-cluster multi-hop data dissemination protocol is proposed to balance load among the nodes and choose more stable nodes as CHs for efficient data dissemination; the simulation results confirm that the proposed approach is more efficient than state-of-the-art approaches.
Abstract: Improving network lifetime in wireless sensor networks (WSNs) has been a major concern in recent years, and various approaches based on many different concepts have been proposed in this context. The main problem with recently proposed approaches is that they use direct communication among nodes and rotate cluster-head (CH) periods to distribute energy consumption. However, energy-saving mechanisms based only on a metric related to nodes' residual power cannot be directly applied to find stable CHs. The reason is that a sensor node or CH with more residual power is willing to accept all requests because it has enough residual battery power, so much traffic is injected into that node. In this case, the energy decay rate of that particular node tends to be high, causing a sharp decay of its backup battery power. As a consequence, the node exhausts its energy quickly, raising the node death rate in the network. Hence, it causes uneven energy dissipation among the nodes and drastically decreases the information transmission efficiency of the network. In this paper, a multi-criterion fuzzy-logic-based intra-cluster and inter-cluster multi-hop data dissemination protocol is proposed to balance load among the nodes and choose more stable nodes as CHs for efficient data dissemination. The simulation results confirm that the proposed approach is more efficient than state-of-the-art approaches.
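A minimal sketch of multi-criterion fuzzy CH scoring: triangular membership functions over each criterion are combined with a fuzzy AND (minimum), and the highest-scoring node is elected. The criteria, field names, and breakpoints are illustrative assumptions, not the paper's rule base.

```python
def tri(x, a, b, c):
    # Triangular membership function rising from a, peaking at b, falling to c.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ch_score(node):
    energy = tri(node["residual_energy"], 0.0, 1.0, 2.0)  # normalised energy
    degree = tri(node["degree"], 2, 10, 18)               # neighbour count
    sink_proximity = 1.0 / (1.0 + node["dist_to_sink"])   # already in [0, 1]
    return min(energy, degree, sink_proximity)            # fuzzy AND

def elect_cluster_head(cluster_nodes):
    return max(cluster_nodes, key=ch_score)
```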

Journal ArticleDOI
TL;DR: Simulation results demonstrate that the proposed technique has better fault tolerance with lower complexity than the general random-walk-based dissemination process and more scalability as compared to the other protocols.
Abstract: The design of a data dissemination protocol has been a great challenge due to the highly dynamic and unreliable wireless channel in vehicular ad hoc networks (VANETs). In the literature, several interesting solutions have been proposed to perform data dissemination in this environment, but these solutions use architectures requiring centralised coordination, global network knowledge, or large intermediate buffers. In this paper, we propose a decentralised technique that overcomes these requirements and provides reliable and scalable communication in both dense and sparse traffic for VANETs. Random walks are used in the proposed technique to disseminate data from one vehicle to other vehicles in the network, and raptor codes are used to provide low decoding complexity and more scalability for data dissemination. Simulation results demonstrate that the proposed technique has better fault tolerance with lower complexity than the general random-walk-based dissemination process and more scalability as compared to the other protocols.
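A minimal sketch of the random-walk forwarding underlying the technique: a packet hops to a uniformly chosen neighbour on the vehicle connectivity graph until its TTL expires. The raptor-coding layer (each node would buffer coded symbols and decode once enough arrive) is elided here.

```python
import random

def random_walk_disseminate(adj, source, ttl=50, seed=7):
    # adj: dict mapping each vehicle to its current list of neighbours.
    rng = random.Random(seed)
    node, received = source, {source}
    for _ in range(ttl):
        neighbours = adj.get(node, [])
        if not neighbours:            # sparse traffic: walk stalls at this vehicle
            break
        node = rng.choice(neighbours)
        received.add(node)            # node buffers the coded packet
    return received
```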

01 Jan 2015
TL;DR: The proposed algorithm is a transform-domain watermarking technique that ensures secure transfer of medical data using a DWT transformation and substitution method; the watermarked image is then encrypted using symmetric stream cipher techniques.
Abstract: The protection of data is of utmost importance in the medical field to boost telemedicine applications, and a robust and secure mechanism is needed to transfer medical images over the Internet. The algorithm proposed in this study is a watermarking technique in the transform domain that ensures secure transfer of medical data. Using a DWT transformation and substitution method, we embed the watermark into the cover image, and the watermarked image is then encrypted using symmetric stream cipher techniques. Performance of the proposed algorithm is analyzed against various signal processing attacks such as compression, filtering, noise and histogram equalization, and the desired outcome is obtained without much degradation of the extracted watermark or the watermarked image quality.
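A sketch of the post-embedding encryption step under the assumption of a seeded XOR keystream (one simple symmetric stream cipher; the study's exact cipher is not specified here).

```python
import numpy as np

def stream_encrypt_image(img, seed=7):
    # XOR every pixel byte with a keystream; the same call with the same
    # seed decrypts, since XOR is its own inverse.
    keystream = np.random.default_rng(seed).integers(0, 256, size=img.shape,
                                                     dtype=np.uint8)
    return img.astype(np.uint8) ^ keystream
```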

Proceedings ArticleDOI
21 Feb 2015
TL;DR: The key idea is to select a step with the potential to earn a high profit while remaining unpredictable in picking that step, thus making it nearly impossible for the adversary to predict the next step.
Abstract: Recent advancements in game theory have led to it being applied in various domains such as communication, networks, business, biology and political systems. Specifically, the Max-Min algorithm is a decision rule used in game theory for deciding the next step of a player out of a set of possible steps; it can be thought of as maximizing the minimum profit of the player. The assumption made in the current literature on zero-sum game theory is that both players are rational and logical enough to decide the best possible step out of the available options. Prima facie, we expect a player to choose the best possible step for himself/herself. But in doing so, he/she might give away his/her move to his/her rival, who, being a rational thinker, can manipulate the game to his/her advantage or, alternatively, the rival's loss. Our proposed approach seeks to overcome this loophole in the current Max-Min approach by constructing a function that resolves the trade-off between predictability and maximum profit. The key idea is to select a step with the potential to earn a high profit while remaining unpredictable in picking that step, thus making it nearly impossible for the adversary to predict the next step. In a nutshell, our work is an attempt to reduce the worst-case complexity of the original Max-Min approach.
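A hedged sketch of the idea: instead of deterministically playing the max-min move, sample moves with probability increasing in their security level (worst-case payoff), trading a little guaranteed profit for unpredictability. The softmax weighting is an illustrative choice, not the paper's exact function.

```python
import math
import random

def unpredictable_maxmin(payoff, temperature=1.0, seed=None):
    # payoff[i][j]: row player's profit for move i against the rival's reply j.
    # Classic max-min would return argmax_i min_j payoff[i][j] every time.
    rng = random.Random(seed)
    security = [min(row) for row in payoff]          # guaranteed profit per move
    weights = [math.exp(s / temperature) for s in security]
    return rng.choices(range(len(payoff)), weights=weights)[0]
```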

Journal ArticleDOI
TL;DR: The idea is to combine principles from machine learning, data mining, statistical techniques and measures of semantic relatedness to make the semantic web service discovery process more intelligent, efficient and effective.
Abstract: The main challenges for current semantic web service technologies are the continuous exponential growth in the number of services on the Internet, syntax-based discovery, the lack of commonly agreed-upon semantic service standards, and the heterogeneity of ontologies. In this paper, a service discovery approach independent of semantic service description models is proposed to solve the challenges of current web service discovery. The idea is to combine principles from machine learning, data mining, statistical techniques and measures of semantic relatedness to make the semantic web service discovery process more intelligent, efficient and effective. The proposed approach exploits both the semantic and the syntactic information present within service description profiles. Our approach is unique in its applicability to any web service description language and in its use of the Omiotis measure of semantic relatedness for service discovery. The proposed approach has been implemented on OWL-S based service description profiles and is able to find semantic relationships between services which were otherwise discarded by the OWL-MX matchmaker. Empirical analysis shows that the proposed method outperforms the

Proceedings ArticleDOI
18 Jun 2015
TL;DR: This paper lists recent P2P botnet detection techniques that overcome the weaknesses of previous techniques with higher detection accuracy, and discusses these techniques, their advantages, their accuracy and the weaknesses they still have.
Abstract: Peer-to-Peer (P2P) botnets have emerged as a serious threat against network security. They are used to carry out various illicit activities such as click fraud, DDoS attacks and information exfiltration. These botnets use a distributed architecture for command dissemination and are resilient to dynamic churn and take-down attempts. Earlier P2P botnet detection techniques have shortcomings such as low accuracy and an inability to detect stealthy botnets and advanced botnets using fast-flux networks. In this paper, we list recent P2P botnet detection techniques that overcome the weaknesses of previous techniques with higher detection accuracy. We also discuss these techniques, their advantages, their accuracy and the weaknesses they still have. Two or more techniques can also be used together for more accurate and robust P2P botnet detection.