Author

Subhadeep Sarkar

Bio: Subhadeep Sarkar is an academic researcher from the Indian Institute of Technology Kharagpur. The author has contributed to research in the topics Computer science & Cloud computing, has an h-index of 8, and has co-authored 27 publications receiving 989 citations. Previous affiliations of Subhadeep Sarkar include Boston University and the University of Rennes.

Papers
Journal ArticleDOI
TL;DR: Results show that as the number of applications demanding real-time service increases, the fog computing paradigm outperforms traditional cloud computing.
Abstract: This work performs a rigorous, comparative analysis of the fog computing paradigm and the conventional cloud computing paradigm in the context of the Internet of Things (IoT), by mathematically formulating the parameters and characteristics of fog computing—one of the first attempts of its kind. With the rapid increase in the number of Internet-connected devices, the growing demand for real-time, low-latency services is proving to be challenging for the traditional cloud computing framework. Moreover, our irreplaceable dependency on cloud computing requires cloud data centers (DCs) to always be up and running, which consumes a huge amount of power and yields tons of carbon dioxide (CO2) gas. In this work, we assess the applicability of the newly proposed fog computing paradigm to serve the demands of latency-sensitive applications in the context of IoT. We model the fog computing paradigm by mathematically characterizing the fog computing network in terms of power consumption, service latency, CO2 emission, and cost, and by evaluating its performance in an environment with a large number of Internet-connected devices demanding real-time service. A case study is performed with traffic generated from the 100 most populous cities being served by eight geographically distributed DCs. Results show that as the number of applications demanding real-time service increases, the fog computing paradigm outperforms traditional cloud computing. In an environment where 50 percent of applications request instantaneous, real-time services, the overall service latency for fog computing is observed to decrease by 50.09 percent. However, it is worth noting that in an environment where a smaller share of applications demands low-latency services, fog computing introduces overhead compared with traditional cloud computing.
The work therefore shows that, in the context of IoT with a large number of latency-sensitive applications, fog computing outperforms cloud computing.
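The latency trade-off the abstract describes can be sketched with a toy model. The parameter values and function names below are illustrative assumptions, not figures or formulations from the paper:

```python
# Illustrative sketch (not the paper's actual model): mean service latency
# when a fraction p of applications is latency-sensitive and served at the
# fog layer, versus sending all traffic to a distant cloud DC.

FOG_LATENCY_MS = 5.0      # assumed round trip to a nearby fog node
CLOUD_LATENCY_MS = 120.0  # assumed round trip to a remote cloud DC
FOG_OVERHEAD_MS = 3.0     # assumed extra hop cost fog adds for cloud-bound traffic

def mean_latency_cloud(p: float) -> float:
    """All applications, real-time or not, are served by the cloud."""
    return CLOUD_LATENCY_MS

def mean_latency_fog(p: float) -> float:
    """The real-time fraction p is served at the fog layer; the rest
    traverses the fog layer on its way to the cloud, paying a small overhead."""
    return p * FOG_LATENCY_MS + (1 - p) * (CLOUD_LATENCY_MS + FOG_OVERHEAD_MS)

for p in (0.0, 0.25, 0.5):
    print(f"p={p:.2f}  cloud={mean_latency_cloud(p):6.1f} ms  "
          f"fog={mean_latency_fog(p):6.1f} ms")
```

With no latency-sensitive traffic the fog layer is pure overhead, while at a 50 percent real-time share the mean latency falls well below the cloud-only baseline, mirroring the abstract's qualitative finding.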

580 citations

Journal ArticleDOI
24 Mar 2016
TL;DR: From the performance analysis, fog computing is established, in collaboration with the traditional cloud computing platform, as an efficient green computing platform to support the demands of the next generation IoT applications.
Abstract: In this study, the authors focus on theoretical modelling of the fog computing architecture and compare its performance with the traditional cloud computing model. Existing research on fog computing has primarily focused on the principles and concepts of fog computing and its significance in the context of the Internet of Things (IoT). This work, one of the first attempts in its domain, proposes a mathematical formulation for this new computational paradigm by defining its individual components and presents a comparative study with cloud computing in terms of service latency and energy consumption. From the performance analysis, the work establishes fog computing, in collaboration with the traditional cloud computing platform, as an efficient green computing platform to support the demands of next-generation IoT applications. Results show that in a scenario where 25% of IoT applications demand real-time, low-latency services, the mean energy expenditure in fog computing is 40.48% less than in the conventional cloud computing model.

313 citations

Journal ArticleDOI
TL;DR: An algorithm for Priority-based Allocation of Time Slots (PATS) is formulated that considers a fitness parameter characterizing the criticality of the health data a packet carries, the energy consumption rate of a transmitting LDPU, and other crucial LDPU properties; a constant-model hawk–dove game built on this fitness parameter ensures that LDPUs are prioritized accordingly.
Abstract: In critical medical emergency situations, wireless body area network (WBAN) equipped health monitoring systems treat data packets carrying critical information about patients’ health in the same way as data packets bearing regular healthcare information. This shortcoming results in a higher average waiting time for the local data processing units (LDPUs) transmitting data packets of higher importance. In this paper, we formulate an algorithm for Priority-based Allocation of Time Slots (PATS) that considers a fitness parameter characterizing the criticality of the health data a packet carries, the energy consumption rate of a transmitting LDPU, and other crucial LDPU properties. Based on this fitness parameter, we design a constant-model hawk–dove game that ensures the LDPUs are prioritized according to these properties. In comparison with existing work on priority-based wireless transmission, we measure and take into consideration the urgency, seriousness, and criticality associated with an LDPU and thus allocate transmission time slots proportionately. We show that the number of transmitting LDPUs in medical emergency situations can be reduced by 25.97% in comparison with existing time-division-based techniques.
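The idea of sharing transmission time in proportion to a fitness parameter can be sketched as follows. The linear fitness function, the weights, and the LDPU names are hypothetical illustrations, not the paper's formulation (which uses a hawk–dove game):

```python
# Hypothetical sketch of priority-based time-slot allocation in the spirit
# of PATS: slots are shared in proportion to a fitness score combining data
# criticality and energy drain. Weights and names are illustrative only.

from dataclasses import dataclass

@dataclass
class LDPU:
    name: str
    criticality: float   # urgency of the health data carried (0..1)
    energy_rate: float   # energy consumption rate of the transmitter (0..1)

def fitness(u: LDPU, w_crit: float = 0.7, w_energy: float = 0.3) -> float:
    """Illustrative fitness: favour critical data and low energy drain."""
    return w_crit * u.criticality + w_energy * (1 - u.energy_rate)

def allocate_slots(units: list[LDPU], total_slots: int) -> dict[str, int]:
    """Share a superframe's time slots proportionally to each LDPU's fitness."""
    scores = {u.name: fitness(u) for u in units}
    total = sum(scores.values())
    return {name: round(total_slots * s / total) for name, s in scores.items()}

units = [LDPU("icu_patient", criticality=0.9, energy_rate=0.4),
         LDPU("routine_monitor", criticality=0.2, energy_rate=0.3)]
print(allocate_slots(units, 10))
```

The LDPU carrying critical data receives the lion's share of the slots, which is the qualitative behaviour the abstract attributes to PATS.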

128 citations

Journal ArticleDOI
TL;DR: The evolution of WBANs in the context of modern health care and its convergence with nanotechnology is outlined, with a focus on the wireless body-area network.
Abstract: Over the past decade, embedded systems and microelectromechanical systems have evolved in a radical way, redefining our standard of living and enhancing the quality of life. Health care, among various other fields, has benefited vastly from this technological development. The concept of using sensors for health care purposes originated in the late 1980s, when sensors were developed to measure certain physiological parameters associated with the human body. In traditional sensor nodes, the signal sources are mostly environmental phenomena (such as temperature, vibration, and luminosity) or man-made events (such as intrusion and mobile target tracking), whereas in the case of physiological sensors, the signal source is living human tissue. These sensor nodes, as their primary sensing element, have a diaphragm that converts pressure into displacement. This displacement, in turn, is subsequently transformed into an electrical signal.

65 citations

Journal ArticleDOI
TL;DR: A discrete-time Markov chain (DTMC) is constructed that efficiently depicts the states of an IEEE 802.15.6 CSMA/CA-based WBAN, a user priority (UP)-wise analysis is performed, and the importance of the standard from a medical perspective is justified.
Abstract: Recently, the IEEE 802.15.6 Task Group introduced a new wireless communication standard that provides a suitable framework specifically to support the requirements of wireless body area networks (WBANs). The standard dictates the physical (PHY) layer and medium access control (MAC) layer protocols for WBAN-based communications. Unlike pre-existing wireless communication standards, IEEE 802.15.6 supports short-range, extremely low-power wireless communication with high quality of service and data rates of up to 10 Mbps in the vicinity of living tissue. In this work, we construct a discrete-time Markov chain (DTMC) that efficiently depicts the states of an IEEE 802.15.6 CSMA/CA-based WBAN. Following this, we put forward a thorough analysis of the standard in terms of reliability, throughput, average delay, and power consumption. The work considers non-ideal channel characteristics and a saturated network traffic regime. The major shortcoming of the existing literature on Markov chain-based analysis of IEEE 802.15.6 is that it does not take into consideration the time a node spends awaiting the acknowledgement frame after transmitting a packet, until a time-out occurs. Also, most works assume ideal channel characteristics for the network, which is hardly the case in practice. This work remains distinctive in that we account for a node's post-transmission waiting time while constructing the DTMC. Based on the DTMC, we perform a user priority (UP)-wise analysis and justify the importance of the standard from a medical perspective.
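The core computation behind such a DTMC analysis is obtaining the chain's stationary distribution. The three-state chain below is a deliberately tiny toy, not the paper's model: its transition probabilities are invented, but the explicit `await_ack` state illustrates the post-transmission waiting time the abstract says earlier analyses ignored:

```python
# Toy sketch, not the paper's DTMC: a 3-state chain (backoff, transmit,
# await-ACK) showing how a stationary distribution is computed for a
# CSMA/CA-style model. Transition probabilities are made up for illustration.

STATES = ["backoff", "transmit", "await_ack"]
# P[i][j] = probability of moving from state i to state j
P = [
    [0.6, 0.4, 0.0],   # backoff: channel busy (stay) or seize the channel
    [0.0, 0.0, 1.0],   # transmit: always wait for the acknowledgement next
    [0.3, 0.0, 0.7],   # await_ack: ACK arrives (new backoff) or keep waiting
]

def stationary(P, iters=10_000):
    """Power iteration: repeatedly apply P until the distribution settles."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = stationary(P)
for state, prob in zip(STATES, pi):
    print(f"{state:10s} {prob:.3f}")
```

Even in this toy, the chain spends more of its time awaiting ACKs than transmitting, which is why omitting that state from a DTMC would noticeably skew delay and power estimates.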

60 citations


Cited by
Journal ArticleDOI
TL;DR: In this paper, the authors propose a simulator, called iFogSim, to model IoT and fog environments and measure the impact of resource management techniques in latency, network congestion, energy consumption, and cost.
Abstract: The Internet of Things (IoT) aims to bring every object (e.g., smart cameras, wearables, environmental sensors, home appliances, and vehicles) online, hence generating a massive volume of data that can overwhelm storage systems and data analytics applications. Cloud computing offers services at the infrastructure level that can scale to IoT storage and processing requirements. However, there are applications such as health monitoring and emergency response that require low latency, and the delay caused by transferring data to the cloud and back to the application can seriously impact their performance. To overcome this limitation, the Fog computing paradigm has been proposed, where cloud services are extended to the edge of the network to decrease latency and network congestion. To realize the full potential of the Fog and IoT paradigms for real-time analytics, several challenges need to be addressed. The first and most critical is designing resource management techniques that determine which modules of analytics applications are pushed to each edge device to minimize latency and maximize throughput. To this end, we need an evaluation platform that enables the performance of resource management policies on an IoT or Fog computing infrastructure to be quantified in a repeatable manner. In this paper we propose a simulator, called iFogSim, to model IoT and Fog environments and measure the impact of resource management techniques on latency, network congestion, energy consumption, and cost. We describe two case studies to demonstrate modeling of an IoT environment and comparison of resource management policies. Moreover, the scalability of the simulation toolkit, in terms of RAM consumption and execution time, is verified under different circumstances.

1,085 citations

Journal ArticleDOI
TL;DR: A detailed review of the security-related challenges and sources of threat in the IoT applications is presented and four different technologies, blockchain, fog computing, edge computing, and machine learning, to increase the level of security in IoT are discussed.
Abstract: The Internet of Things (IoT) is the next era of communication. Using the IoT, physical objects can be empowered to create, receive, and exchange data in a seamless manner. Various IoT applications focus on automating different tasks and are trying to empower inanimate physical objects to act without any human intervention. The existing and upcoming IoT applications are highly promising to increase the level of comfort, efficiency, and automation for users. Implementing such a world in an ever-growing fashion requires high security, privacy, authentication, and recovery from attacks. In this regard, it is imperative to make the required changes in the architecture of IoT applications to achieve end-to-end secure IoT environments. In this paper, a detailed review of the security-related challenges and sources of threat in IoT applications is presented. After discussing the security issues, various emerging and existing technologies focused on achieving a high degree of trust in IoT applications are covered. Four technologies that increase the level of security in IoT are discussed: blockchain, fog computing, edge computing, and machine learning.

800 citations

Journal ArticleDOI
TL;DR: This paper provides a tutorial on fog computing and its related computing paradigms, including their similarities and differences, and provides a taxonomy of research topics in fog computing.

783 citations

Journal ArticleDOI
TL;DR: A standard model for application in future IoT healthcare systems is proposed, and the state-of-the-art research relating to each area of the model is presented, evaluating their strengths, weaknesses, and overall suitability for a wearable IoT healthcare system.
Abstract: Internet of Things (IoT) technology has attracted much attention in recent years for its potential to alleviate the strain on healthcare systems caused by an aging population and a rise in chronic illness. Standardization is a key issue limiting progress in this area, and thus this paper proposes a standard model for application in future IoT healthcare systems. This survey paper then presents the state-of-the-art research relating to each area of the model, evaluating their strengths, weaknesses, and overall suitability for a wearable IoT healthcare system. Challenges that healthcare IoT faces, including security, privacy, wearability, and low-power operation, are presented, and recommendations are made for future research directions.

735 citations

Book ChapterDOI
TL;DR: In this paper, the challenges in fog computing acting as an intermediate layer between IoT devices/sensors and cloud datacentres and review the current developments in this field are discussed.
Abstract: In recent years, the number of Internet of Things (IoT) devices/sensors has increased to a great extent. To support the computational demand of real-time latency-sensitive applications of largely geo-distributed IoT devices/sensors, a new computing paradigm named "Fog computing" has been introduced. Generally, Fog computing resides closer to the IoT devices/sensors and extends the Cloud-based computing, storage, and networking facilities. In this chapter, we comprehensively analyse the challenges in Fog acting as an intermediate layer between IoT devices/sensors and Cloud datacentres and review the current developments in this field. We present a taxonomy of Fog computing according to the identified challenges and its key features. We also map the existing works to the taxonomy in order to identify current research gaps in the area of Fog computing. Moreover, based on the observations, we propose future directions for research.

669 citations