Journal ArticleDOI

iFogSim: A toolkit for modeling and simulation of resource management techniques in the Internet of Things, Edge and Fog computing environments

TL;DR: In this paper, the authors propose a simulator, called iFogSim, to model IoT and Fog environments and measure the impact of resource management techniques on latency, network congestion, energy consumption, and cost.
Abstract: The Internet of Things (IoT) aims to bring every object (e.g., smart cameras, wearables, environmental sensors, home appliances, and vehicles) online, hence generating massive volumes of data that can overwhelm storage systems and data analytics applications. Cloud computing offers services at the infrastructure level that can scale to IoT storage and processing requirements. However, applications such as health monitoring and emergency response require low latency, and the delay caused by transferring data to the cloud and back to the application can seriously impact their performance. To overcome this limitation, the Fog computing paradigm has been proposed, where cloud services are extended to the edge of the network to decrease latency and network congestion. To realize the full potential of the Fog and IoT paradigms for real-time analytics, several challenges need to be addressed. The first and most critical problem is designing resource management techniques that determine which modules of analytics applications are pushed to each edge device to minimize latency and maximize throughput. To this end, we need an evaluation platform that enables the quantification of performance of resource management policies on an IoT or Fog computing infrastructure in a repeatable manner. In this paper we propose a simulator, called iFogSim, to model IoT and Fog environments and measure the impact of resource management techniques on latency, network congestion, energy consumption, and cost. We describe two case studies to demonstrate modeling of an IoT environment and comparison of resource management policies. Moreover, the scalability of the simulation toolkit, in terms of RAM consumption and execution time, is verified under different circumstances.
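As a rough illustration of the kind of scenario iFogSim is meant to model, the sketch below builds a tiny cloud-gateway-edge hierarchy and compares two placements of an analytics module by the network latency of a sensor-to-actuator control loop. The class and method names (FogNode, controlLoopLatency, the link latencies) are simplified placeholders invented for this example, not the toolkit's actual API.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a fog topology and module placement, using simplified
// placeholder classes rather than the actual iFogSim API.
public class FogPlacementSketch {

    // A compute node in the cloud-to-edge hierarchy.
    static class FogNode {
        final String name;
        final FogNode parent;         // next node towards the cloud (null for the cloud)
        final double uplinkLatencyMs; // latency of the link to the parent node

        FogNode(String name, FogNode parent, double uplinkLatencyMs) {
            this.name = name;
            this.parent = parent;
            this.uplinkLatencyMs = uplinkLatencyMs;
        }

        // Accumulated latency from this node up to an ancestor node.
        double latencyTo(FogNode target) {
            double latency = 0;
            for (FogNode n = this; n != target; n = n.parent) {
                if (n == null) throw new IllegalArgumentException("target is not an ancestor");
                latency += n.uplinkLatencyMs;
            }
            return latency;
        }
    }

    public static void main(String[] args) {
        // Three-level hierarchy: cloud -> gateway -> edge device hosting the sensor.
        FogNode cloud   = new FogNode("cloud", null, 0);
        FogNode gateway = new FogNode("gateway", cloud, 100);     // WAN link
        FogNode edge    = new FogNode("edge-device", gateway, 2); // LAN link

        // Application graph: sensor -> preprocess -> analytics -> actuator.
        String[] modules = {"preprocess", "analytics"};

        // Two candidate placements of the 'analytics' module.
        Map<String, FogNode> edgePlacement = new HashMap<>();
        edgePlacement.put("preprocess", edge);
        edgePlacement.put("analytics", gateway);

        Map<String, FogNode> cloudPlacement = new HashMap<>();
        cloudPlacement.put("preprocess", edge);
        cloudPlacement.put("analytics", cloud);

        System.out.printf("edge placement loop latency : %.1f ms%n",
                controlLoopLatency(edge, modules, edgePlacement));
        System.out.printf("cloud placement loop latency: %.1f ms%n",
                controlLoopLatency(edge, modules, cloudPlacement));
    }

    // Round-trip latency of the sensor -> modules -> actuator control loop,
    // counting only network transfer between the hosting nodes.
    static double controlLoopLatency(FogNode source, String[] modules,
                                     Map<String, FogNode> placement) {
        double latency = 0;
        FogNode current = source;
        for (String module : modules) {
            FogNode host = placement.get(module);
            latency += current.latencyTo(host);
            current = host;
        }
        // The result travels back down to the actuator co-located with the source.
        latency += source.latencyTo(current);
        return latency;
    }
}
```

Keeping the analytics module at the gateway holds the loop to a few milliseconds, while placing it in the cloud pays the WAN latency in both directions; this is exactly the trade-off that placement policies evaluated in iFogSim are meant to navigate.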
Citations
Journal ArticleDOI
TL;DR: The scheduling problem in Fog computing is analyzed, focusing on how user mobility can influence application performance and how three different scheduling policies, namely concurrent, FCFS, and delay-priority, can be used to improve execution based on application characteristics.
Abstract: Fog computing provides a distributed infrastructure at the edges of the network, resulting in low-latency access and faster response to application requests when compared to centralized clouds. With this new level of computing capacity introduced between users and the data center-based clouds, new forms of resource allocation and management can be developed to take advantage of the Fog infrastructure. A wide range of applications with different requirements run on end-user devices, and with the popularity of cloud computing many of them rely on remote processing or storage. As clouds are primarily delivered through centralized data centers, such remote processing/storage usually takes place at a single location that hosts user applications and data. The distributed capacity provided by Fog computing allows execution and storage to be performed at different locations. The combination of distributed capacity, the range and types of user applications, and the mobility of smart devices requires resource management and scheduling strategies that take all of these factors into account. We analyze the scheduling problem in Fog computing, focusing on how user mobility can influence application performance and how three different scheduling policies, namely concurrent, FCFS, and delay-priority, can be used to improve execution based on application characteristics.
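To make the contrast between the policies concrete, here is a minimal sketch of how an FCFS queue and a delay-priority queue order the same set of requests. The Request fields and tolerance values are assumptions made for this illustration rather than the paper's actual workload model, and the concurrent policy, which admits all requests at once, is omitted.

```java
import java.util.Comparator;
import java.util.PriorityQueue;
import java.util.Queue;

// Toy contrast between FCFS and delay-priority ordering of application
// requests waiting for a fog node. The request fields (arrival time,
// latency tolerance) are simplified assumptions for this sketch.
public class SchedulingPoliciesSketch {

    record Request(String app, double arrivalTime, double latencyToleranceMs) {}

    public static void main(String[] args) {
        Request video  = new Request("video-streaming", 0.0, 500.0);
        Request health = new Request("health-monitor",  1.0,  20.0);
        Request game   = new Request("eeg-game",        2.0,  50.0);

        // FCFS: served strictly in arrival order.
        Queue<Request> fcfs = new PriorityQueue<>(
                Comparator.comparingDouble(Request::arrivalTime));

        // Delay-priority: the tightest latency tolerance is served first.
        Queue<Request> delayPriority = new PriorityQueue<>(
                Comparator.comparingDouble(Request::latencyToleranceMs));

        for (Request r : new Request[]{video, health, game}) {
            fcfs.add(r);
            delayPriority.add(r);
        }

        System.out.println("FCFS order:           " + drain(fcfs));
        System.out.println("Delay-priority order: " + drain(delayPriority));
    }

    private static String drain(Queue<Request> queue) {
        StringBuilder order = new StringBuilder();
        while (!queue.isEmpty()) {
            if (order.length() > 0) order.append(" -> ");
            order.append(queue.poll().app());
        }
        return order.toString();
    }
}
```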

337 citations

Journal ArticleDOI
TL;DR: A Blockchain-enabled Intelligent IoT Architecture with Artificial Intelligence is proposed, providing an efficient way of converging blockchain and AI for IoT, together with current state-of-the-art techniques and applications.

297 citations

Journal ArticleDOI
01 Nov 2018
TL;DR: A new simulation tool called EdgeCloudSim, streamlined for edge computing scenarios, is proposed in this work; it builds upon CloudSim to address the specific demands of edge computing research and support the necessary functionalities.
Abstract: Edge Computing is a fast-growing field of research covering a spectrum of technologies such as Cloudlets, Fog Computing, and Mobile Edge Computing (MEC). Edge Computing involves a technically more sophisticated setup than pure Cloud Computing or pure Mobile Computing, since both computational and network resources must be considered simultaneously. In that respect, it provides a larger design space with many parameters, rendering a variety of novel approaches feasible. Given this complexity, Edge Computing designs deserve scientific scrutiny for a sound assessment of their feasibility. However, despite increasing research activity, the field lacks a simulation tool that meets these requirements. Starting from the available simulators, a significant programming effort is required to obtain a simulation tool that meets the actual needs. To lower this barrier, a new simulation tool called EdgeCloudSim, streamlined for Edge Computing scenarios, is proposed in this work. EdgeCloudSim builds upon CloudSim to address the specific demands of Edge Computing research and to support the necessary functionality in terms of computation and networking abilities. To demonstrate the capabilities of EdgeCloudSim, an experimental setup based on different edge architectures is simulated, and the effects of the computational and networking system parameters on the results are depicted.
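EdgeCloudSim's central point, that computational and network resources have to be weighed together, can be illustrated with a back-of-the-envelope calculation like the one below; the formulas, parameter values, and class name are invented for this sketch and are not part of EdgeCloudSim's API.

```java
// Back-of-the-envelope comparison of offloading a task to an edge server
// versus a cloud data center, combining network delay and processing time.
// All parameters (delays, capacities, task size) are illustrative assumptions.
public class EdgeVsCloudOffloadSketch {

    // Response time = upload delay + shared processing time + download delay.
    static double responseTimeMs(double taskMi, double serverMips, int concurrentTasks,
                                 double taskKb, double bandwidthKbps, double propagationMs) {
        double transferMs = 2 * (taskKb / bandwidthKbps * 1000 + propagationMs); // up + down
        double processingMs = taskMi / (serverMips / concurrentTasks) * 1000;    // fair share
        return transferMs + processingMs;
    }

    public static void main(String[] args) {
        double taskMi = 3000; // task length in million instructions
        double taskKb = 1500; // payload uploaded and downloaded

        // Edge: nearby but modest server shared by a few devices, fast WLAN.
        double edge = responseTimeMs(taskMi, 10_000, 4, taskKb, 20_000, 5);

        // Cloud: powerful but distant server shared by many users, slower WAN path.
        double cloud = responseTimeMs(taskMi, 100_000, 50, taskKb, 5_000, 60);

        System.out.printf("edge offload : %.0f ms%n", edge);
        System.out.printf("cloud offload: %.0f ms%n", cloud);
    }
}
```

With these assumed numbers the nearby but modest edge server wins despite its lower capacity, and changing the bandwidth or sharing factors can flip the outcome, which is why simulation over the full parameter space is useful.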

296 citations

Proceedings ArticleDOI
TL;DR: The result of this work can serve as a micro-benchmark in studies and research related to IoT and Fog Computing, and can be used for Quality of Service (QoS) and Service Level Objective benchmarking for IoT applications.

288 citations

Journal ArticleDOI
01 Dec 2017
TL;DR: This work models the service placement problem for IoT applications over fog resources as an optimization problem, which explicitly considers the heterogeneity of applications and resources in terms of Quality of Service attributes, and proposes a genetic algorithm as a problem resolution heuristic.
Abstract: The Internet of Things (IoT) leads to an ever-growing presence of ubiquitous networked computing devices in public, business, and private spaces. These devices do not simply act as sensors, but feature computational, storage, and networking resources. Being located at the edge of the network, these resources can be exploited to execute IoT applications in a distributed manner. This concept is known as fog computing. While the theoretical foundations of fog computing are already established, there is a lack of resource provisioning approaches to enable the exploitation of fog-based computational resources. To resolve this shortcoming, we present a conceptual fog computing framework. Then, we model the service placement problem for IoT applications over fog resources as an optimization problem, which explicitly considers the heterogeneity of applications and resources in terms of Quality of Service attributes. Finally, we propose a genetic algorithm as a problem resolution heuristic and show, through experiments, that the service execution can achieve a reduction of network communication delays when the genetic algorithm is used, and a better utilization of fog resources when the exact optimization method is applied.
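As a flavor of the heuristic side of that approach, the following is a minimal genetic-algorithm sketch for assigning services to fog nodes. The fitness function (access delay plus a capacity-violation penalty), the node and service parameters, and the GA settings are all simplified assumptions for illustration, not the formulation or operators used in the cited paper.

```java
import java.util.Arrays;
import java.util.Random;

// Minimal genetic-algorithm sketch for placing IoT services onto fog nodes.
public class GaPlacementSketch {

    static final double[] NODE_DELAY_MS = {2, 5, 10, 80};        // delay from devices to each node
    static final double[] NODE_CAPACITY = {400, 400, 800, 4000}; // available MIPS per node
    static final double[] SERVICE_DEMAND = {200, 300, 150, 500, 250}; // MIPS per service

    static final Random RNG = new Random(42);

    // A candidate placement maps service i -> node placement[i]; lower fitness is better.
    static double fitness(int[] placement) {
        double delay = 0;
        double[] load = new double[NODE_CAPACITY.length];
        for (int s = 0; s < placement.length; s++) {
            delay += NODE_DELAY_MS[placement[s]];
            load[placement[s]] += SERVICE_DEMAND[s];
        }
        double penalty = 0;
        for (int n = 0; n < load.length; n++) {
            penalty += Math.max(0, load[n] - NODE_CAPACITY[n]) * 10; // punish overloaded nodes
        }
        return delay + penalty;
    }

    static int[] randomPlacement() {
        int[] p = new int[SERVICE_DEMAND.length];
        for (int i = 0; i < p.length; i++) p[i] = RNG.nextInt(NODE_DELAY_MS.length);
        return p;
    }

    public static void main(String[] args) {
        int populationSize = 30, generations = 200;
        int[][] population = new int[populationSize][];
        for (int i = 0; i < populationSize; i++) population[i] = randomPlacement();

        for (int g = 0; g < generations; g++) {
            int[][] next = new int[populationSize][];
            for (int i = 0; i < populationSize; i++) {
                int[] child = crossover(tournament(population), tournament(population));
                mutate(child);
                next[i] = child;
            }
            population = next;
        }

        int[] best = Arrays.stream(population)
                .min((x, y) -> Double.compare(fitness(x), fitness(y))).orElseThrow();
        System.out.println("best placement: " + Arrays.toString(best)
                + " fitness=" + fitness(best));
    }

    // Pick the better of two random individuals.
    static int[] tournament(int[][] population) {
        int[] a = population[RNG.nextInt(population.length)];
        int[] b = population[RNG.nextInt(population.length)];
        return fitness(a) <= fitness(b) ? a : b;
    }

    // Single-point crossover of two placements.
    static int[] crossover(int[] a, int[] b) {
        int cut = RNG.nextInt(a.length);
        int[] child = Arrays.copyOf(a, a.length);
        System.arraycopy(b, cut, child, cut, a.length - cut);
        return child;
    }

    // With small probability, reassign a service to a random node.
    static void mutate(int[] placement) {
        for (int i = 0; i < placement.length; i++) {
            if (RNG.nextDouble() < 0.1) placement[i] = RNG.nextInt(NODE_DELAY_MS.length);
        }
    }
}
```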

275 citations

References
Journal ArticleDOI
TL;DR: In this article, the authors present a cloud-centric vision for the worldwide implementation of the Internet of Things (IoT), describe a Cloud implementation using Aneka that is based on the interaction of private and public Clouds, and conclude their IoT vision by expanding on the need for convergence of WSN, the Internet, and distributed computing, directed at the technological research community.

9,593 citations

Journal ArticleDOI
TL;DR: The result of this case study proves that the federated Cloud computing model significantly improves the application QoS requirements under fluctuating resource and service demand patterns.
Abstract: Cloud computing is a recent advancement wherein IT infrastructure and applications are provided as ‘services’ to end-users under a usage-based payment model. It can leverage virtualized services even on the fly, based on requirements (workload patterns and QoS) varying with time. The application services hosted under the Cloud computing model have complex provisioning, composition, configuration, and deployment requirements. Evaluating the performance of Cloud provisioning policies, application workload models, and resource performance models in a repeatable manner under varying system and user configurations and requirements is difficult to achieve. To overcome this challenge, we propose CloudSim: an extensible simulation toolkit that enables modeling and simulation of Cloud computing systems and application provisioning environments. The CloudSim toolkit supports both system and behavior modeling of Cloud system components such as data centers, virtual machines (VMs) and resource provisioning policies. It implements generic application provisioning techniques that can be extended with ease and limited effort. Currently, it supports modeling and simulation of Cloud computing environments consisting of both single and inter-networked clouds (federation of clouds). Moreover, it exposes custom interfaces for implementing policies and provisioning techniques for allocation of VMs under inter-networked Cloud computing scenarios. Several researchers from organizations, such as HP Labs in U.S.A., are using CloudSim in their investigation on Cloud resource provisioning and energy-efficient management of data center resources. The usefulness of CloudSim is demonstrated by a case study involving dynamic provisioning of application services in the hybrid federated clouds environment. The result of this case study proves that the federated Cloud computing model significantly improves the application QoS requirements under fluctuating resource and service demand patterns.
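For context on how such a scenario is described programmatically, the sketch below sets up one datacenter, one VM, and one cloudlet, roughly along the lines of the basic examples shipped with CloudSim 3.x; constructor signatures and defaults vary between toolkit versions, so treat this as an approximation rather than authoritative usage.

```java
import java.util.ArrayList;
import java.util.Calendar;
import java.util.LinkedList;
import java.util.List;

import org.cloudbus.cloudsim.*;
import org.cloudbus.cloudsim.core.CloudSim;
import org.cloudbus.cloudsim.provisioners.BwProvisionerSimple;
import org.cloudbus.cloudsim.provisioners.PeProvisionerSimple;
import org.cloudbus.cloudsim.provisioners.RamProvisionerSimple;

// One datacenter, one VM, one cloudlet -- roughly the shape of the basic
// CloudSim 3.x examples. Constructor arguments may differ across versions.
public class MinimalCloudSimScenario {

    public static void main(String[] args) throws Exception {
        CloudSim.init(1, Calendar.getInstance(), false); // one cloud user, no event trace

        Datacenter datacenter = createDatacenter("Datacenter_0");
        DatacenterBroker broker = new DatacenterBroker("Broker");

        // One VM: 1000 MIPS, 1 core, 512 MB RAM, 1000 kbps, 10 GB image.
        Vm vm = new Vm(0, broker.getId(), 1000, 1, 512, 1000, 10000, "Xen",
                new CloudletSchedulerTimeShared());
        List<Vm> vmList = new ArrayList<>();
        vmList.add(vm);
        broker.submitVmList(vmList);

        // One cloudlet of 400000 MI, bound to that VM.
        UtilizationModel full = new UtilizationModelFull();
        Cloudlet cloudlet = new Cloudlet(0, 400000, 1, 300, 300, full, full, full);
        cloudlet.setUserId(broker.getId());
        cloudlet.setVmId(vm.getId());
        List<Cloudlet> cloudletList = new ArrayList<>();
        cloudletList.add(cloudlet);
        broker.submitCloudletList(cloudletList);

        CloudSim.startSimulation();
        CloudSim.stopSimulation();

        List<Cloudlet> finished = broker.getCloudletReceivedList();
        for (Cloudlet c : finished) {
            System.out.printf("cloudlet %d finished at %.2f on VM %d%n",
                    c.getCloudletId(), c.getFinishTime(), c.getVmId());
        }
    }

    // A single-host datacenter with one 1000-MIPS core.
    private static Datacenter createDatacenter(String name) throws Exception {
        List<Pe> peList = new ArrayList<>();
        peList.add(new Pe(0, new PeProvisionerSimple(1000)));

        List<Host> hostList = new ArrayList<>();
        hostList.add(new Host(0, new RamProvisionerSimple(2048),
                new BwProvisionerSimple(10000), 1000000, peList,
                new VmSchedulerTimeShared(peList)));

        DatacenterCharacteristics characteristics = new DatacenterCharacteristics(
                "x86", "Linux", "Xen", hostList, 10.0, 3.0, 0.05, 0.001, 0.0);

        return new Datacenter(name, characteristics,
                new VmAllocationPolicySimple(hostList), new LinkedList<Storage>(), 0);
    }
}
```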

4,570 citations

Book ChapterDOI
01 Jan 2014
TL;DR: This chapter proposes a hierarchical distributed architecture, nicknamed Fog Computing, that extends from the edge of the network to the core, and pays attention to a new dimension that IoT adds to Big Data and Analytics: a massively distributed number of sources at the edge.
Abstract: The Internet of Things (IoT) brings more than an explosive proliferation of endpoints. It is disruptive in several ways. In this chapter we examine those disruptions and propose a hierarchical distributed architecture, nicknamed Fog Computing, that extends from the edge of the network to the core. In particular, we pay attention to a new dimension that IoT adds to Big Data and Analytics: a massively distributed number of sources at the edge.

1,036 citations

Journal ArticleDOI
TL;DR: The IoT experimentation facility described in this paper is conceived to provide a suitable platform for large-scale experimentation and evaluation of IoT concepts under real-life conditions, with the aim of influencing the definition and specification of Future Internet architecture design.

622 citations

Journal ArticleDOI
TL;DR: Results show that as the number of applications demanding real-time service increases, the fog computing paradigm outperforms traditional cloud computing.
Abstract: This work performs a rigorous, comparative analysis of the fog computing paradigm and the conventional cloud computing paradigm in the context of the Internet of Things (IoT), by mathematically formulating the parameters and characteristics of fog computing, one of the first attempts of its kind. With the rapid increase in the number of Internet-connected devices, the increased demand for real-time, low-latency services is proving to be challenging for the traditional cloud computing framework. Also, our irreplaceable dependency on cloud computing demands that the cloud data centers (DCs) always be up and running, which consumes a huge amount of power and yields tons of carbon dioxide (CO2) gas. In this work, we assess the applicability of the newly proposed fog computing paradigm to serve the demands of latency-sensitive applications in the context of IoT. We model the fog computing paradigm by mathematically characterizing the fog computing network in terms of power consumption, service latency, CO2 emission, and cost, and evaluating its performance for an environment with a high number of Internet-connected devices demanding real-time service. A case study is performed with traffic generated from the 100 highest-populated cities being served by eight geographically distributed DCs. Results show that as the number of applications demanding real-time service increases, the fog computing paradigm outperforms traditional cloud computing. For an environment with 50 percent of applications requesting instantaneous, real-time services, the overall service latency for fog computing is noted to decrease by 50.09 percent. However, it is worth noting that for an environment with a smaller percentage of applications demanding low-latency services, fog computing is observed to be an overhead compared to traditional cloud computing. Therefore, the work shows that in the context of IoT, with a high number of latency-sensitive applications, fog computing outperforms cloud computing.
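The paper's analytical model is not reproduced here, but the headline trend, average latency falling as more latency-sensitive traffic is served at the fog tier, can be illustrated with a toy weighted-average calculation. The latency values and the linear split below are invented for illustration and are unrelated to the paper's formulation or its reported 50.09 percent figure.

```java
// Toy illustration of the fog-vs-cloud latency trade-off: average service
// latency as the share of latency-sensitive requests served at the fog tier
// grows. All numbers are assumed for this sketch.
public class FogVsCloudLatencySketch {
    public static void main(String[] args) {
        double cloudLatencyMs = 120; // distant data center round trip (assumed)
        double fogLatencyMs = 15;    // nearby fog node round trip (assumed)

        // Average latency when a given share of requests is latency-sensitive
        // and served at the fog tier, with the rest going to the cloud.
        for (double share = 0.0; share <= 1.0001; share += 0.25) {
            double fogAssisted = share * fogLatencyMs + (1 - share) * cloudLatencyMs;
            System.out.printf(
                "real-time share %3.0f%%: cloud-only %.0f ms, fog-assisted %.1f ms (%.1f%% lower)%n",
                share * 100, cloudLatencyMs, fogAssisted,
                (cloudLatencyMs - fogAssisted) / cloudLatencyMs * 100);
        }
    }
}
```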

580 citations