Book Chapter

Fog Computing: A Taxonomy, Survey and Future Directions

TL;DR: In this paper, the challenges of fog computing acting as an intermediate layer between IoT devices/sensors and cloud datacentres are discussed, and the current developments in this field are reviewed.
Abstract: In recent years, the number of Internet of Things (IoT) devices/sensors has increased to a great extent. To support the computational demand of real-time latency-sensitive applications of largely geo-distributed IoT devices/sensors, a new computing paradigm named "Fog computing" has been introduced. Generally, Fog computing resides closer to the IoT devices/sensors and extends the Cloud-based computing, storage and networking facilities. In this chapter, we comprehensively analyse the challenges in Fogs acting as an intermediate layer between IoT devices/sensors and Cloud datacentres and review the current developments in this field. We present a taxonomy of Fog computing according to the identified challenges and its key features. We also map the existing works to the taxonomy in order to identify current research gaps in the area of Fog computing. Moreover, based on the observations, we propose future directions for research.
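The abstract above describes Fog computing as an intermediate tier that sits between IoT devices/sensors and Cloud datacentres, closer to the data sources. As a rough illustration of why such a tier helps latency-sensitive applications, the short Python sketch below compares the round-trip delay of sending a sensor reading straight to a distant cloud datacentre against handling it at a nearby fog node; the link latencies and processing times are invented example values, not figures from the chapter.

```python
# Minimal latency sketch (illustrative values only, not from the chapter):
# compare a cloud-only path with a fog-assisted path for one sensor reading.

# Assumed one-way network latencies and processing times in milliseconds.
SENSOR_TO_FOG_MS = 2.0      # fog node sits close to the IoT device
SENSOR_TO_CLOUD_MS = 60.0   # cloud datacentre is many hops away
FOG_PROCESSING_MS = 5.0     # limited but nearby compute
CLOUD_PROCESSING_MS = 1.0   # ample compute, but far away


def cloud_only_round_trip_ms() -> float:
    """Send the reading to the cloud, process it there, return the result."""
    return 2 * SENSOR_TO_CLOUD_MS + CLOUD_PROCESSING_MS


def fog_assisted_round_trip_ms() -> float:
    """Process the reading at the nearby fog node instead."""
    return 2 * SENSOR_TO_FOG_MS + FOG_PROCESSING_MS


if __name__ == "__main__":
    print(f"cloud-only round trip: {cloud_only_round_trip_ms():.1f} ms")
    print(f"fog-assisted round trip: {fog_assisted_round_trip_ms():.1f} ms")
```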
Citations
Journal Article
TL;DR: This paper provides a tutorial on fog computing and its related computing paradigms, including their similarities and differences, and provides a taxonomy of research topics in fog computing.

783 citations

Journal Article
TL;DR: Fog computing extends cloud services to the edge of the network and brings computation, communication and storage closer to edge devices and end-users, with the aim of supporting low latency, mobility, network bandwidth, security and privacy.

645 citations

Journal Article
TL;DR: This survey aims to shape a coherent and comprehensive picture of the current state-of-the-art efforts in this direction by starting with fundamental working principles of blockchains and how blockchain-based systems achieve the characteristics of decentralization, security, and auditability.
Abstract: The blockchain technology has revolutionized the digital currency space with the pioneering cryptocurrency platform named Bitcoin. From an abstract perspective, a blockchain is a distributed ledger capable of maintaining an immutable log of transactions happening in a network. In recent years, this technology has attracted significant scientific interest in research areas beyond the financial sector, one of them being the Internet of Things (IoT). In this context, the blockchain is seen as the missing link toward building a truly decentralized, trustless, and secure environment for the IoT and, in this survey, we aim to shape a coherent and comprehensive picture of the current state-of-the-art efforts in this direction. We start with fundamental working principles of blockchains and how blockchain-based systems achieve the characteristics of decentralization, security, and auditability. From there, we build our narrative on the challenges posed by the current centralized IoT models, followed by recent advances made both in industry and research to solve these challenges and effectively use blockchains to provide a decentralized, secure medium for the IoT.

553 citations
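The survey above treats a blockchain as a distributed ledger that maintains an immutable log of transactions. As a minimal sketch of that core data structure, the Python snippet below chains blocks by hash so that tampering with an old transaction breaks every later link; it illustrates only the hash-chaining idea, not the consensus or networking machinery the survey covers, and all field names are hypothetical.

```python
# Minimal hash-chained ledger sketch: each block commits to the previous
# block's hash, so altering an old transaction invalidates the chain.
import hashlib
import json


def block_hash(block: dict) -> str:
    """Hash a block's canonical JSON encoding."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()


def append_block(chain: list, transactions: list) -> None:
    """Append a new block that references the hash of the current tip."""
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "prev_hash": prev_hash,
                  "transactions": transactions})


def chain_is_valid(chain: list) -> bool:
    """Check that every block still points at its predecessor's hash."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))


if __name__ == "__main__":
    ledger: list = []
    append_block(ledger, [{"from": "device-a", "to": "device-b", "value": 1}])
    append_block(ledger, [{"from": "device-b", "to": "device-c", "value": 2}])
    print(chain_is_valid(ledger))                # True
    ledger[0]["transactions"][0]["value"] = 99   # tamper with history
    print(chain_is_valid(ledger))                # False: the log is inconsistent
```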

Journal Article
TL;DR: This survey starts by providing an overview and the fundamentals of fog computing architecture, and provides an extensive overview of state-of-the-art network applications and the major research aspects involved in designing these networks.
Abstract: Fog computing is an emerging paradigm that extends computation, communication, and storage facilities toward the edge of a network. Compared to traditional cloud computing, fog computing can support delay-sensitive service requests from end-users (EUs) with reduced energy consumption and low traffic congestion. Basically, fog networks are viewed as offloading core computation and storage. Fog nodes decide either to process services using their available resources or to send them to the cloud server. Thus, fog computing helps to achieve efficient resource utilization and higher performance in terms of delay, bandwidth, and energy consumption. This survey starts by providing an overview and the fundamentals of fog computing architecture. Furthermore, service and resource allocation approaches are summarized to address several critical issues such as latency, bandwidth, and energy consumption in fog computing. Afterward, compared to other surveys, this paper provides an extensive overview of state-of-the-art network applications and the major research aspects involved in designing these networks. In addition, this paper highlights ongoing research efforts, open challenges, and research trends in fog computing.

475 citations
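The abstract above notes that a fog node either processes a service request with its own available resources or sends it to the cloud server. The Python sketch below shows one plausible form of that decision rule, admitting a request locally when the node has spare capacity and the request is delay-sensitive, and forwarding it otherwise; the policy, thresholds, and field names are assumptions for illustration, not taken from the survey.

```python
# Sketch of a fog node's admit-or-forward decision (hypothetical policy):
# keep delay-sensitive work that fits within the node's spare capacity,
# forward everything else to the cloud.
from dataclasses import dataclass


@dataclass
class ServiceRequest:
    cpu_demand: float        # normalized CPU share the request needs
    deadline_ms: float       # latency the end-user can tolerate


@dataclass
class FogNode:
    cpu_capacity: float      # total normalized CPU share
    cpu_in_use: float = 0.0
    cloud_latency_ms: float = 120.0   # assumed round trip to the cloud

    def place(self, req: ServiceRequest) -> str:
        """Return 'fog' if the request is served locally, else 'cloud'."""
        fits = self.cpu_in_use + req.cpu_demand <= self.cpu_capacity
        needs_fog = req.deadline_ms < self.cloud_latency_ms
        if fits and needs_fog:
            self.cpu_in_use += req.cpu_demand
            return "fog"
        return "cloud"


if __name__ == "__main__":
    node = FogNode(cpu_capacity=1.0)
    print(node.place(ServiceRequest(cpu_demand=0.4, deadline_ms=50)))   # fog
    print(node.place(ServiceRequest(cpu_demand=0.4, deadline_ms=500)))  # cloud: not delay-sensitive
    print(node.place(ServiceRequest(cpu_demand=0.8, deadline_ms=50)))   # cloud: no spare capacity
```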

Journal Article
TL;DR: This paper tackles the computation offloading problem in a mixed fog/cloud system by jointly optimizing the offloading decisions and the allocation of computation resource, transmit power, and radio bandwidth while guaranteeing user fairness and maximum tolerable delay.
Abstract: Cooperation between the fog and the cloud in mobile cloud computing environments could offer improved offloading services to smart mobile user equipment (UE) with computation intensive tasks. In this paper, we tackle the computation offloading problem in a mixed fog/cloud system by jointly optimizing the offloading decisions and the allocation of computation resource, transmit power, and radio bandwidth while guaranteeing user fairness and maximum tolerable delay. This optimization problem is formulated to minimize the maximal weighted cost of delay and energy consumption (EC) among all UEs, which is a mixed-integer non-linear programming problem. Due to the NP-hardness of the problem, we propose a low-complexity suboptimal algorithm to solve it, where the offloading decisions are obtained via semidefinite relaxation and randomization, and the resource allocation is obtained using fractional programming theory and Lagrangian dual decomposition. Simulation results are presented to verify the convergence performance of our proposed algorithms and their achieved fairness among UEs, and the performance gains in terms of delay, EC, and the number of beneficial UEs over existing algorithms.

423 citations
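The paper above formulates offloading as minimizing the maximal weighted cost of delay and energy consumption over all UEs while jointly choosing offloading decisions and allocations of computation resource, transmit power, and radio bandwidth. A schematic min-max formulation consistent with that description is sketched below in assumed notation; the exact cost terms, constraint sets, and symbols in the paper may differ.

```latex
% Schematic min-max offloading formulation (notation assumed, not the paper's):
% x_n \in \{0,1\} is UE n's offloading decision; f_n, p_n, b_n are the
% computation resource, transmit power, and bandwidth allocated to it;
% D_n and E_n are UE n's delay and energy consumption, weighted by
% \lambda_n^{t} and \lambda_n^{e}; F and B are total resource budgets.
\begin{align}
\min_{\{x_n, f_n, p_n, b_n\}} \ \max_{n \in \mathcal{N}}
  \ & \lambda_n^{t} D_n(x_n, f_n, b_n) + \lambda_n^{e} E_n(x_n, p_n, b_n) \\
\text{s.t.} \quad
  & D_n(x_n, f_n, b_n) \le D_n^{\max} \quad \forall n \in \mathcal{N}, \\
  & \sum_{n \in \mathcal{N}} x_n f_n \le F, \qquad
    \sum_{n \in \mathcal{N}} x_n b_n \le B, \\
  & x_n \in \{0,1\}, \quad f_n \ge 0, \ 0 \le p_n \le p_n^{\max}, \ b_n \ge 0
    \quad \forall n \in \mathcal{N}.
\end{align}
```

The binary decisions make this a mixed-integer non-linear program, which is why the paper resorts to semidefinite relaxation with randomization for the decisions and fractional programming with Lagrangian dual decomposition for the resource allocation.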

References
Journal Article
Weisong Shi, Jie Cao, Quan Zhang, Youhuizi Li, Lanyu Xu
TL;DR: The definition of edge computing is introduced, followed by several case studies, ranging from cloud offloading to smart home and city, as well as collaborative edge, to materialize the concept of edge computing.
Abstract: The proliferation of Internet of Things (IoT) and the success of rich cloud services have pushed the horizon of a new computing paradigm, edge computing, which calls for processing the data at the edge of the network. Edge computing has the potential to address the concerns of response time requirement, battery life constraint, bandwidth cost saving, as well as data safety and privacy. In this paper, we introduce the definition of edge computing, followed by several case studies, ranging from cloud offloading to smart home and city, as well as collaborative edge to materialize the concept of edge computing. Finally, we present several challenges and opportunities in the field of edge computing, and hope this paper will gain attention from the community and inspire more research in this direction.

5,198 citations

Proceedings Article
17 Aug 2012
TL;DR: This paper argues that the above characteristics make the Fog the appropriate platform for a number of critical Internet of Things services and applications, namely, Connected Vehicle, Smart Grid, Smart Cities, and, in general, Wireless Sensors and Actuators Networks (WSANs).
Abstract: Fog Computing extends the Cloud Computing paradigm to the edge of the network, thus enabling a new breed of applications and services. Defining characteristics of the Fog are: a) Low latency and location awareness; b) Wide-spread geographical distribution; c) Mobility; d) Very large number of nodes; e) Predominant role of wireless access; f) Strong presence of streaming and real time applications; g) Heterogeneity. In this paper we argue that the above characteristics make the Fog the appropriate platform for a number of critical Internet of Things (IoT) services and applications, namely, Connected Vehicle, Smart Grid, Smart Cities, and, in general, Wireless Sensors and Actuators Networks (WSANs).

4,440 citations

Book Chapter
01 Jan 2019
TL;DR: This chapter argues that the above characteristics make the Fog the appropriate platform for a number of critical internet of things services and applications, namely connected vehicle, smart grid, smart cities, and in general, wireless sensors and actuators networks (WSANs).
Abstract: Fog computing extends the cloud computing paradigm to the edge of the network, thus enabling a new breed of applications and services. Defining characteristics of the Fog are 1) low latency and location awareness, 2) widespread geographical distribution, 3) mobility, 4) very large number of nodes, 5) predominant role of wireless access, 6) strong presence of streaming and real time applications, and 7) heterogeneity. In this chapter, the authors argue that the above characteristics make the Fog the appropriate platform for a number of critical internet of things (IoT) services and applications, namely connected vehicle, smart grid, smart cities, and in general, wireless sensors and actuators networks (WSANs).

2,384 citations

Journal Article
TL;DR: In this paper, the authors propose a simulator, called iFogSim, to model IoT and fog environments and measure the impact of resource management techniques on latency, network congestion, energy consumption, and cost.
Abstract: Internet of Things (IoT) aims to bring every object (e.g., smart cameras, wearables, environmental sensors, home appliances, and vehicles) online, hence generating a massive volume of data that can overwhelm storage systems and data analytics applications. Cloud computing offers services at the infrastructure level that can scale to IoT storage and processing requirements. However, there are applications, such as health monitoring and emergency response, that require low latency, and the delay caused by transferring data to the cloud and then back to the application can seriously impact their performance. To overcome this limitation, the Fog computing paradigm has been proposed, where cloud services are extended to the edge of the network to decrease latency and network congestion. To realize the full potential of the Fog and IoT paradigms for real-time analytics, several challenges need to be addressed. The first and most critical problem is designing resource management techniques that determine which modules of analytics applications are pushed to each edge device to minimize latency and maximize throughput. To this end, we need an evaluation platform that enables the quantification of performance of resource management policies on an IoT or Fog computing infrastructure in a repeatable manner. In this paper we propose a simulator, called iFogSim, to model IoT and Fog environments and measure the impact of resource management techniques on latency, network congestion, energy consumption, and cost. We describe two case studies to demonstrate modeling of an IoT environment and comparison of resource management policies. Moreover, the scalability of the simulation toolkit in terms of RAM consumption and execution time is verified under different circumstances.

1,085 citations
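iFogSim itself is a Java toolkit, and its actual API is not reproduced here. As a rough stand-in for the kind of comparison it supports, the Python sketch below evaluates two hypothetical module-placement policies ("cloud-only" vs. "edge-ward") over a small list of application modules and reports the total estimated latency of each; all device names, capacities, and latency figures are invented for illustration.

```python
# Toy comparison of two placement policies, in the spirit of evaluating
# resource-management policies for Fog/IoT applications.
# Device capacities and latencies are invented example values.

DEVICES = {
    # name: (available MIPS, round-trip latency to the sensor in ms)
    "edge-gateway": (2_000, 5.0),
    "cloud-vm": (20_000, 150.0),
}

MODULES = [
    # (module name, MIPS demand per event)
    ("filter", 100),
    ("analytics", 4_000),
    ("alert", 50),
]


def place_cloud_only(module_mips: int) -> str:
    """Baseline policy: every module runs in the cloud."""
    return "cloud-vm"


def place_edge_ward(module_mips: int) -> str:
    """Edge-ward policy: run a module on the gateway if it has enough MIPS."""
    edge_mips, _ = DEVICES["edge-gateway"]
    return "edge-gateway" if module_mips <= edge_mips else "cloud-vm"


def total_latency_ms(policy) -> float:
    """Sum per-module latency: network round trip plus a crude compute time."""
    total = 0.0
    for _, mips in MODULES:
        capacity, rtt = DEVICES[policy(mips)]
        total += rtt + 1_000.0 * mips / capacity  # ms of compute on that device
    return total


if __name__ == "__main__":
    for name, policy in [("cloud-only", place_cloud_only),
                         ("edge-ward", place_edge_ward)]:
        print(f"{name}: {total_latency_ms(policy):.1f} ms total")
```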

Journal Article
30 Sep 2015
TL;DR: This position paper argues that a new shift is necessary in computing, taking the control of computing applications, data, and services away from some central nodes to the other logical extreme of the Internet, and refers to this vision of human-centered, edge-device based computing as Edge-centric Computing.
Abstract: In many aspects of human activity, there has been a continuous struggle between the forces of centralization and decentralization. Computing exhibits the same phenomenon; we have gone from mainframes to PCs and local networks in the past, and over the last decade we have seen a centralization and consolidation of services and applications in data centers and clouds. We position that a new shift is necessary. Technological advances such as powerful dedicated connection boxes deployed in most homes, high-capacity mobile end-user devices and powerful wireless networks, along with growing user concerns about trust, privacy, and autonomy, require taking the control of computing applications, data, and services away from some central nodes (the "core") to the other logical extreme (the "edge") of the Internet. We also position that this development can help blur the boundary between man and machine, and embrace social computing in which humans are part of the computation and decision making loop, resulting in a human-centered system design. We refer to this vision of human-centered, edge-device based computing as Edge-centric Computing. We elaborate in this position paper on this vision and present the research challenges associated with its implementation.

844 citations