
Showing papers on "Utility computing published in 2017"


Journal ArticleDOI
TL;DR: Fog computing extends cloud services to the edge of the network, bringing computation, communication, and storage closer to edge devices and end users, with the aim of improving latency, mobility, network bandwidth, security, and privacy.

645 citations


Journal ArticleDOI
TL;DR: This paper comprehensively presents a tutorial on three typical edge computing technologies, namely mobile edge computing, cloudlets, and fog computing, and the standardization efforts, principles, architectures, and applications of these three technologies are summarized and compared.

442 citations


Journal ArticleDOI
TL;DR: This paper provides an overview of existing security and privacy concerns, particularly for the fog computing, and highlights ongoing research effort, open challenges, and research trends in privacy and security issues for fog computing.
Abstract: The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing toward the edge of the network, offloading cloud data centers and reducing service latency for end users. However, the characteristics of fog computing give rise to new security and privacy challenges. Existing security and privacy measures for cloud computing cannot be applied directly to fog computing because of its features, such as mobility, heterogeneity, and large-scale geo-distribution. This paper provides an overview of existing security and privacy concerns, particularly for fog computing. Afterward, this survey highlights ongoing research efforts, open challenges, and research trends in privacy and security issues for fog computing.

414 citations


Proceedings ArticleDOI
06 Jun 2017
TL;DR: A set of parameters is defined based on which one of these implementations can be chosen optimally for a given use case or application, and a decision tree for selecting the optimal implementation is presented.
Abstract: When it comes to storage and computation of large scales of data, Cloud Computing has acted as the de-facto solution over the past decade. However, with the massive growth in intelligent and mobile devices coupled with technologies like Internet of Things (IoT), V2X Communications, Augmented Reality (AR), the focus has shifted towards gaining real-time responses along with support for context-awareness and mobility. Due to the delays induced on the Wide Area Network (WAN) and location agnostic provisioning of resources on the cloud, there is a need to bring the features of the cloud closer to the consumer devices. This led to the birth of the Edge Computing paradigm which aims to provide context aware storage and distributed Computing at the edge of the networks. In this paper, we discuss the three different implementations of Edge Computing namely Fog Computing, Cloudlet and Mobile Edge Computing in detail and compare their features. We define a set of parameters based on which one of these implementations can be chosen optimally given a particular use-case or application and present a decision tree for the selection of the optimal implementation.
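The selection idea in the paper above can be sketched as a tiny decision function. This is an illustrative sketch only: the parameter names, thresholds, and branch order are assumptions for demonstration, not the actual decision tree from the paper.

```python
# Hypothetical decision logic for picking an Edge Computing implementation.
# Parameters and thresholds are illustrative assumptions, not the paper's tree.

def choose_edge_implementation(latency_ms: float,
                               needs_mobility: bool,
                               uses_cellular: bool) -> str:
    """Pick an Edge Computing implementation from coarse requirements."""
    if uses_cellular and needs_mobility:
        # MEC is co-located with the mobile (cellular) network infrastructure.
        return "Mobile Edge Computing"
    if latency_ms < 10 and not needs_mobility:
        # Cloudlets offer one-hop, WLAN-scale proximity to the user.
        return "Cloudlet"
    # Fog computing spans a wider, hierarchical deployment.
    return "Fog Computing"

print(choose_edge_implementation(50, True, True))
```

A real decision tree of this kind would branch on many more parameters (context awareness, power constraints, node density), but the shape of the logic is the same.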

387 citations


Journal ArticleDOI
TL;DR: This article first proposes a transparent computing based IoT architecture, and clearly identifies its advantages and associated challenges, and presents a case study to clearly show how to build scalable lightweight wearables with the proposed architecture.
Abstract: By moving service provisioning from the cloud to the edge, edge computing becomes a promising solution in the era of IoT to meet the delay requirements of IoT applications, enhance the scalability and energy efficiency of lightweight IoT devices, provide contextual information processing, and mitigate the traffic burdens of the backbone network. However, as an emerging field of study, edge computing is still in its infancy and faces many challenges in its implementation and standardization. In this article, we study an implementation of edge computing, which exploits transparent computing to build scalable IoT platforms. Specifically, we first propose a transparent computing based IoT architecture, and clearly identify its advantages and associated challenges. Then, we present a case study to clearly show how to build scalable lightweight wearables with the proposed architecture. Some future directions are finally pointed out to foster continued research efforts.

320 citations


Journal ArticleDOI
TL;DR: The basic features of the cloud computing, security issues, threats and their solutions are discussed, and several key topics related to the cloud, namely cloud architecture framework, service and deployment model, cloud technologies, cloud security concepts, threats, and attacks are described.

318 citations


Journal ArticleDOI
TL;DR: A hierarchical distributed Fog Computing architecture is introduced to support the integration of massive number of infrastructure components and services in future smart cities and demonstrates the feasibility of the system's city-wide implementation in the future.
Abstract: Data-intensive analysis is the major challenge in smart cities because of the ubiquitous deployment of various kinds of sensors. The natural characteristic of geodistribution requires a new computing paradigm to offer location-awareness and latency-sensitive monitoring and intelligent control. Fog Computing, which extends computing to the edge of the network, fits this need. In this paper, we introduce a hierarchical distributed Fog Computing architecture to support the integration of a massive number of infrastructure components and services in future smart cities. To secure future communities, it is necessary to integrate intelligence in our Fog Computing architecture, e.g., to perform data representation and feature extraction, to identify anomalous and hazardous events, and to offer optimal responses and controls. We analyze case studies using a smart pipeline monitoring system based on fiber optic sensors and sequential learning algorithms to detect events threatening pipeline safety. A working prototype was constructed to experimentally evaluate event detection performance for the recognition of 12 distinct events. These experimental results demonstrate the feasibility of the system's city-wide implementation in the future.

284 citations


Journal ArticleDOI
01 Dec 2017
TL;DR: This work model the service placement problem for IoT applications over fog resources as an optimization problem, which explicitly considers the heterogeneity of applications and resources in terms of Quality of Service attributes, and proposes a genetic algorithm as a problem resolution heuristic.
Abstract: The Internet of Things (IoT) leads to an ever-growing presence of ubiquitous networked computing devices in public, business, and private spaces. These devices do not simply act as sensors, but feature computational, storage, and networking resources. Being located at the edge of the network, these resources can be exploited to execute IoT applications in a distributed manner. This concept is known as fog computing. While the theoretical foundations of fog computing are already established, there is a lack of resource provisioning approaches to enable the exploitation of fog-based computational resources. To resolve this shortcoming, we present a conceptual fog computing framework. Then, we model the service placement problem for IoT applications over fog resources as an optimization problem, which explicitly considers the heterogeneity of applications and resources in terms of Quality of Service attributes. Finally, we propose a genetic algorithm as a problem resolution heuristic and show, through experiments, that the service execution can achieve a reduction of network communication delays when the genetic algorithm is used, and a better utilization of fog resources when the exact optimization method is applied.
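The genetic-algorithm heuristic described above can be sketched minimally: encode a placement as a list mapping each service to a fog node, and evolve placements toward low total latency without overloading any node. The latency matrix, capacities, and GA parameters below are made-up illustrations, not the formulation or data from the paper.

```python
import random

# Toy GA for placing IoT services on fog nodes. All data and parameters
# are illustrative assumptions, not taken from the paper.

random.seed(1)
LATENCY = [[2, 9, 5], [8, 3, 6], [4, 7, 2], [9, 1, 8]]  # service x node (ms)
CAPACITY = [2, 2, 2]                # max services per fog node
N_SERVICES, N_NODES = 4, 3

def fitness(placement):
    """Total latency plus a heavy penalty for overloading any node."""
    load = [0] * N_NODES
    total = 0
    for svc, node in enumerate(placement):
        load[node] += 1
        total += LATENCY[svc][node]
    penalty = sum(max(0, l - c) for l, c in zip(load, CAPACITY)) * 100
    return total + penalty

def evolve(pop_size=30, generations=60):
    pop = [[random.randrange(N_NODES) for _ in range(N_SERVICES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]     # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_SERVICES)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:               # mutation
                child[random.randrange(N_SERVICES)] = random.randrange(N_NODES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

In the paper the chromosome additionally carries QoS attributes per application and resource; the skeleton of encode, evaluate, select, crossover, mutate stays the same.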

275 citations


Journal ArticleDOI
TL;DR: This paper presents the definitions of MEC given by researchers, and discusses the opportunities brought by MEC and some of the important research challenges highlighted in the MEC environment.

273 citations


Journal ArticleDOI
TL;DR: A conceptual smart pre-copy live migration approach is presented for VM migration that can estimate the downtime after each iteration to determine whether to proceed to the stop-and-copy stage during a system failure or an attack on a fog computing node.
Abstract: Fog computing, an extension of cloud computing services to the edge of the network to decrease latency and network congestion, is a relatively recent research trend. Although both cloud and fog offer similar resources and services, the latter is characterized by low latency with a wider spread and geographically distributed nodes to support mobility and real-time interaction. In this paper, we describe the fog computing architecture and review its different services and applications. We then discuss security and privacy issues in fog computing, focusing on service and resource availability. Virtualization is a vital technology in both fog and cloud computing that enables virtual machines (VMs) to coexist in a physical server (host) to share resources. These VMs could be subject to malicious attacks or the physical server hosting it could experience system failure, both of which result in unavailability of services and resources. Therefore, a conceptual smart pre-copy live migration approach is presented for VM migration. Using this approach, we can estimate the downtime after each iteration to determine whether to proceed to the stop-and-copy stage during a system failure or an attack on a fog computing node. This will minimize both the downtime and the migration time to guarantee resource and service availability to the end users of fog computing. Last, future research directions are outlined.
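The generic pre-copy loop behind the approach above can be sketched as follows: keep copying dirtied memory pages while the VM runs, estimate the downtime a stop-and-copy would incur right now, and freeze the VM once that estimate drops below a threshold. The dirty rate, bandwidth, and threshold values are illustrative assumptions, not the paper's model.

```python
# Sketch of generic pre-copy live migration with downtime estimation.
# All numeric parameters are made-up for illustration.

def pre_copy_migration(vm_pages: int, dirty_rate: int, bandwidth: int,
                       downtime_threshold: float, max_rounds: int = 30):
    """Return (rounds, estimated downtime in seconds)."""
    remaining = vm_pages                           # pages left to transfer
    for rounds in range(1, max_rounds + 1):
        transfer_time = remaining / bandwidth      # copy current dirty set
        remaining = int(dirty_rate * transfer_time)  # pages dirtied meanwhile
        downtime = remaining / bandwidth           # cost of stopping now
        if downtime <= downtime_threshold:         # good enough: freeze the VM
            return rounds, downtime
    return max_rounds, remaining / bandwidth       # give up and stop-and-copy

rounds, downtime = pre_copy_migration(
    vm_pages=100_000, dirty_rate=2_000,            # pages, pages/s
    bandwidth=10_000, downtime_threshold=0.05)     # pages/s, seconds
print(rounds, downtime)
```

Note the loop converges only when the dirty rate is below the available bandwidth; otherwise the round cap forces the stop-and-copy, which is exactly the failure case a "smart" estimator tries to detect early.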

257 citations


Journal ArticleDOI
TL;DR: There is a significant number of computing tasks in healthcare that require or can benefit from fog computing principles, and processing on higher network tiers is required due to constraints in wireless devices and the need to aggregate data.
Abstract: Fog computing is an architectural style in which network components between devices and the cloud execute application-specific logic. We present the first review on fog computing within healthcare informatics, and explore, classify, and discuss different application use cases presented in the literature. For that, we categorize applications into use case classes and list an inventory of application-specific tasks that can be handled by fog computing. We discuss on which level of the network such fog computing tasks can be executed, and provide tradeoffs with respect to requirements relevant to healthcare. Our review indicates that: 1) there is a significant number of computing tasks in healthcare that require or can benefit from fog computing principles; 2) processing on higher network tiers is required due to constraints in wireless devices and the need to aggregate data; and 3) privacy concerns and dependability prevent computation tasks from being completely moved to the cloud. These findings substantiate the need for a coherent approach toward fog computing in healthcare, for which we present a list of recommended research and development actions.

Journal ArticleDOI
TL;DR: The main security and privacy challenges in this field which have grown much interest among the academia and research community are presented and corresponding security solutions have been proposed and identified in literature by many researchers to counter the challenges.

Journal ArticleDOI
TL;DR: The fog architecture will further enable pooling, orchestrating, managing, and securing the resources and functions distributed in the cloud, anywhere along the cloud- to-thing continuum, and on the things to support end-to-end services and applications.
Abstract: Fog computing is an end-to-end horizontal architecture that distributes computing, storage, control, and networking functions closer to users along the cloud-to-thing continuum. The word “edge” may carry different meanings. A common usage of the term refers to the edge network as opposed to the core network, with equipment such as edge routers, base stations, and home gateways. In that sense, there are several differences between fog and edge. First, fog is inclusive of cloud, core, metro, edge, clients, and things. The fog architecture will further enable pooling, orchestrating, managing, and securing the resources and functions distributed in the cloud, anywhere along the cloud-to-thing continuum, and on the things to support end-to-end services and applications. Second, fog seeks to realize a seamless continuum of computing services from the cloud to the things rather than treating the network edges as isolated computing platforms. Third, fog envisions a horizontal platform that will support the common fog computing functions for multiple industries and application domains, including but not limited to traditional telco services. Fourth, a dominant part of edge is mobile edge, whereas the fog computing architecture will be flexible enough to work over wireline as well as wireless networks.

Journal ArticleDOI
TL;DR: This paper intends to carry out a comprehensive survey on the models proposed in literature with respect to the implementation principles to address the QoS guarantee issue.
Abstract: Cloud can be defined as a new computing paradigm that provides scalable, on-demand, and virtualized resources for users. In this style of computing, users can access a shared pool of computing resources which are provisioned with minimal management effort by users. Yet there are some obstacles and concerns about the use of clouds. Guaranteeing quality of service (QoS) by service providers can be regarded as one of the main concerns for companies intending to use it. Service provisioning in clouds is based on service level agreements representing a contract negotiated between users and providers. According to this contract, if a provider cannot satisfy its agreed application requirements, it should pay penalties as compensation. In this paper, we intend to carry out a comprehensive survey on the models proposed in the literature with respect to the implementation principles to address the QoS guarantee issue.
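The penalty clause described above can be illustrated with a toy calculation. The linear shortfall-based formula, the cap at one month's fee, and all numbers are assumptions for demonstration only, not a model from the survey.

```python
# Hypothetical SLA penalty: compensation proportional to the uptime
# shortfall, capped at the monthly fee. Formula and rate are assumptions.

def sla_penalty(agreed_uptime: float, achieved_uptime: float,
                monthly_fee: float, rate: float = 10.0) -> float:
    """Compensation owed when achieved uptime falls short of the SLA."""
    shortfall = max(0.0, agreed_uptime - achieved_uptime)
    return min(monthly_fee, shortfall * rate * monthly_fee)

# 99.9% agreed vs. 99.5% achieved on a $200/month service.
print(sla_penalty(0.999, 0.995, 200.0))
```

Real SLAs typically use tiered service credits rather than a linear rate, but the contract structure (agreed level, measured level, compensation rule) is the same.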

Journal ArticleDOI
TL;DR: This paper proposes a resource allocation strategy for fog computing based on priced timed Petri nets (PTPNs), by which the user can choose the satisfying resources autonomously from a group of preallocated resources.
Abstract: Fog computing, also called “clouds at the edge,” is an emerging paradigm allocating services near the devices to improve the quality of service (QoS). The explosive prevalence of the Internet of Things, big data, and fog computing in the context of cloud computing makes it extremely challenging to explore both cloud and fog resource scheduling strategies so as to improve the efficiency of resource utilization, satisfy users' QoS requirements, and maximize the profit of both resource providers and users. This paper proposes a resource allocation strategy for fog computing based on priced timed Petri nets (PTPNs), by which the user can autonomously choose satisfying resources from a group of preallocated resources. Our strategy comprehensively considers the price cost and time cost to complete a task, as well as the credibility evaluation of both users and fog resources. We construct the PTPN models of tasks in fog computing in accordance with the features of fog resources. An algorithm that predicts task completion time is presented. A method of computing the credibility evaluation of fog resources is also proposed. In particular, we give the dynamic allocation algorithm of fog resources. Simulation results demonstrate that our proposed algorithms can achieve a higher efficiency than static allocation strategies in terms of task completion time and price.

Journal ArticleDOI
TL;DR: This work proposes a novel framework for coordinated processing between edge and cloud computing/processing by integrating advantages from both the platforms and provides various synergies and distinctions between cloud and edge processing.
Abstract: Recently, big data analytics has received important attention in a variety of application domains including business, finance, space science, healthcare, telecommunication and Internet of Things (IoT). Among these areas, IoT is considered as an important platform in bringing people, processes, data and things/objects together in order to enhance the quality of our everyday lives. However, the key challenges are how to effectively extract useful features from the massive amount of heterogeneous data generated by resource-constrained IoT devices in order to provide real-time information and feedback to the end-users, and how to utilize this data-aware intelligence in enhancing the performance of wireless IoT networks. Although there are parallel advances in cloud computing and edge computing for addressing some issues in data analytics, they have their own benefits and limitations. The convergence of these two computing paradigms, i.e., massive virtually shared pool of computing and storage resources from the cloud and real-time data processing by edge computing, could effectively enable live data analytics in wireless IoT networks. In this regard, we propose a novel framework for coordinated processing between edge and cloud computing/processing by integrating advantages from both the platforms. The proposed framework can exploit the network-wide knowledge and historical information available at the cloud center to guide edge computing units towards satisfying various performance requirements of heterogeneous wireless IoT networks. Starting with the main features, key enablers and the challenges of big data analytics, we provide various synergies and distinctions between cloud and edge processing. More importantly, we identify and describe the potential key enablers for the proposed edge-cloud collaborative framework, the associated key challenges and some interesting future research directions.

Journal ArticleDOI
TL;DR: This work makes a novel attempt to identify the need of DDoS mitigation solutions involving multi-level information flow and effective resource management during the attack, and concludes that there is a strong requirement of solutions, which are designed keeping utility computing models in mind.

Proceedings ArticleDOI
12 Oct 2017
TL;DR: This paper designs seven interactive wearable cognitive assistance applications and evaluates their performance in terms of latency across a range of edge computing configurations, mobile hardware, and wireless networks, including 4G LTE.
Abstract: An emerging class of interactive wearable cognitive assistance applications is poised to become one of the key demonstrators of edge computing infrastructure. In this paper, we design seven such applications and evaluate their performance in terms of latency across a range of edge computing configurations, mobile hardware, and wireless networks, including 4G LTE. We also devise a novel multi-algorithm approach that leverages temporal locality to reduce end-to-end latency by 60% to 70%, without sacrificing accuracy. Finally, we derive target latencies for our applications, and show that edge computing is crucial to meeting these targets.

Journal ArticleDOI
TL;DR: Some of the more important architectural requirements for critical Internet of Things networks in the context of exemplary use cases are discussed, and how fog computing techniques can help fulfill them.
Abstract: Fog computing is an architecture that extends the traditionally centralized functions of cloud computing to the edge and into close proximity to the things in an Internet of Things network. Fog computing brings many advantages, including enhanced performance, better efficiency, network bandwidth savings, improved security, and resiliency. This article discusses some of the more important architectural requirements for critical Internet of Things networks in the context of exemplary use cases, and how fog computing techniques can help fulfill them.

Journal ArticleDOI
TL;DR: This article proposes a new concept called fog vehicular computing (FVC) to augment the computation and storage power of fog computing and designs a comprehensive architecture for FVC and presents a number of salient applications.
Abstract: Fog computing has emerged as a promising solution for accommodating the surge of mobile traffic and reducing latency, both known to be inherent problems of cloud computing. Fog services, including computation, storage, and networking, are hosted in the vicinity of end users (edge of the network), and, as a result, reliable access is provisioned to delay-sensitive mobile applications. However, in some cases, the fog computing capacity is overwhelmed by the growing number of demands from patrons, particularly during peak hours, and this can subsequently result in acute performance degradation. In this article, we address this problem by proposing a new concept called fog vehicular computing (FVC) to augment the computation and storage power of fog computing. We also design a comprehensive architecture for FVC and present a number of salient applications. The result of implementation clearly shows the effectiveness of the proposed architecture. Finally, some open issues and envisioned directions are discussed for future research in the context of FVC.

Journal ArticleDOI
TL;DR: This paper describes in detail about the Edge Mesh computing paradigm, including the proposed software framework, research challenges, and benefits of Edge Mesh, which distributes the decision-making tasks among edge devices within the network instead of sending all the data to a centralized server.
Abstract: In recent years, there has been a paradigm shift in the Internet of Things (IoT) from centralized cloud computing to edge computing (or fog computing). Developments in ICT have significantly increased the communication and computation capabilities of embedded devices, and these will continue to grow in coming years. However, existing paradigms do not utilize low-level devices for any decision-making process. In fact, gateway devices are also utilized mostly for communication interoperability and some low-level processing. In this paper, we propose a new computing paradigm, named Edge Mesh, which distributes decision-making tasks among edge devices within the network instead of sending all the data to a centralized server. All the computation tasks and data are shared using a mesh network of edge devices and routers. Edge Mesh provides many benefits, including distributed processing, low latency, fault tolerance, better scalability, better security, and privacy. These benefits are useful for critical applications that require higher reliability, real-time processing, mobility support, and context awareness. We first give an overview of existing computing paradigms to establish the motivation behind Edge Mesh. Then, we describe the Edge Mesh computing paradigm in detail, including the proposed software framework, research challenges, and benefits of Edge Mesh. We also describe the task management framework and conduct a preliminary study on the task allocation problem in Edge Mesh. Different application scenarios, including smart home, intelligent transportation systems, and healthcare, are presented to illustrate the significance of the Edge Mesh computing paradigm.

Journal ArticleDOI
TL;DR: The definition and architecture of fog computing, an emerging distributed computing platform aimed at bringing computation close to its data sources, and the framework of resource allocation for latency reduction combined with reliability, fault tolerance, privacy, and underlying optimization problems are discussed.
Abstract: Fog computing (FC) is an emerging distributed computing platform aimed at bringing computation close to its data sources, which can reduce the latency and cost of delivering data to a remote cloud. This feature and related advantages are desirable for many Internet-of-Things applications, especially latency sensitive and mission intensive services. With comparisons to other computing technologies, the definition and architecture of FC are presented in this paper. The framework of resource allocation for latency reduction combined with reliability, fault tolerance, privacy, and underlying optimization problems are also discussed. We then investigate an application scenario and conduct resource optimization by formulating the optimization problem and solving it via a genetic algorithm. The resulting analysis generates some important insights on the scalability of the FC systems.

Proceedings ArticleDOI
08 May 2017
TL;DR: A new simulator tool called EdgeCloudSim streamlined for Edge Computing scenarios is proposed, which builds upon CloudSim to address the specific demands of Edge Computing research and support necessary functionality in terms of computation and networking abilities.
Abstract: Edge Computing is a fast-growing field of research covering a spectrum of technologies such as Cloudlets, Fog Computing, and Mobile Edge Computing (MEC). Edge Computing involves a technically more sophisticated setup than pure Cloud Computing and pure Mobile Computing, since both computational and network resources must be considered simultaneously. In that respect, it provides a larger design space with many parameters, rendering a variety of novel approaches feasible. Given the complexity, Edge Computing designs deserve scientific scrutiny for sound assessment of their feasibility. However, despite increasing research activity, this field lacks a simulation tool compatible with these requirements. Starting from available simulators, a significant programming effort is required to obtain a simulation tool meeting the actual needs. To lower these barriers, a new simulator called EdgeCloudSim, streamlined for Edge Computing scenarios, is proposed in this work. EdgeCloudSim builds upon CloudSim to address the specific demands of Edge Computing research and to support the necessary functionality in terms of computation and networking. To demonstrate the capabilities of EdgeCloudSim, an experiment setup based on different edge architectures is simulated, and the effects of the computational and networking system parameters on the results are depicted.

Proceedings ArticleDOI
25 Jun 2017
TL;DR: Zenith is proposed, a novel model for allocating computing resources in an edge computing platform that allows service providers to establish resource sharing contracts with edge infrastructure providers apriori and employs a latency-aware scheduling and resource provisioning algorithm.
Abstract: In the Internet of Things (IoT) era, the demand for low-latency computing for time-sensitive applications (e.g., location-based augmented reality games, real-time smart grid management, real-time navigation using wearables) has been growing rapidly. Edge Computing provides an additional layer of infrastructure to fill latency gaps between IoT devices and the back-end computing infrastructure. In the edge computing model, small-scale micro-datacenters that represent an ad hoc and distributed collection of computing infrastructure pose new challenges in terms of management and effective resource sharing to achieve a globally efficient resource allocation. In this paper, we propose Zenith, a novel model for allocating computing resources in an edge computing platform that allows service providers to establish resource sharing contracts with edge infrastructure providers a priori. Based on the established contracts, service providers employ a latency-aware scheduling and resource provisioning algorithm that enables tasks to complete within their latency requirements. The proposed techniques are evaluated through extensive experiments that demonstrate the effectiveness, scalability, and performance efficiency of the proposed model.

Journal ArticleDOI
TL;DR: This paper designs a novel information-centric heterogeneous networks framework aiming at enabling content caching and computing, and forms the virtual resource allocation strategy as a joint optimization problem, where the gains of not only virtualization but also caching and Computing are taken into consideration.
Abstract: In order to better accommodate the dramatically increasing demand for data caching and computing services, storage and computation capabilities should be endowed to some of the intermediate nodes within the network, therefore increasing the data throughput and reducing the network operation cost. In this paper, we design a novel information-centric heterogeneous networks framework aiming at enabling content caching and computing. Furthermore, due to the virtualization of the whole system, communication, computing, and caching resources can be shared among all users associated with different virtual service providers. We formulate the virtual resource allocation strategy as a joint optimization problem, where the gains of not only virtualization but also caching and computing are taken into consideration in the proposed information-centric heterogeneous networks virtualization architecture. In addition, a distributed algorithm based on alternating direction method of multipliers is adopted in order to solve the formulated problem. Since each base station only needs to solve its own problem without exchange of channel state information by using the distributed algorithm, the computational complexity and signaling overhead can be greatly reduced. Finally, extensive simulations are presented to show the effectiveness of the proposed scheme under different system parameters.

Journal ArticleDOI
TL;DR: The significance of edge computing is highlighted by providing real-life scenarios that have strict constraint requirements on application response time by devise a taxonomy to classify the current research efforts in the domain of edge Computing.
Abstract: The virtually unlimited available resources and wide range of services provided by the cloud have resulted in the emergence of new cloud-based applications, such as smart grids, smart building control, and virtual reality. These developments, however, have also been accompanied by a problem for delay-sensitive applications that have stringent delay requirements. The current cloud computing paradigm cannot realize the requirements of mobility support, location awareness, and low latency. Hence, to address the problem, an edge computing paradigm that aims to extend the cloud resources and services and enable them to be nearer the edge of an enterprise's network has been introduced. In this article, we highlight the significance of edge computing by providing real-life scenarios that have strict constraint requirements on application response time. From the previous literature, we devise a taxonomy to classify the current research efforts in the domain of edge computing. We also discuss the key requirements that enable edge computing. Finally, current challenges in realizing the vision of edge computing are discussed.

Journal ArticleDOI
TL;DR: The authors propose a flexible software architecture that can incorporate different design choices and user-specified policies in fog environments, and present their design of WM-FOG, a computing framework for fog environments that embraces this software architecture.
Abstract: This article presents a detailed description of fog computing (also known as edge computing) and explores its research challenges and problems. Based on the authors' understanding of these challenges and problems, they propose a flexible software architecture that can incorporate different design choices and user-specified policies. They present their design of WM-FOG, a computing framework for fog environments that embraces this software architecture, and evaluate their prototype system.

Journal ArticleDOI
TL;DR: A novel cost-oriented optimization model is proposed for a cloud-based ICT infrastructure to allocate cloud computing resources in a flexible and cost-efficient way and compared with the mature simulating annealing based algorithm.
Abstract: With the rapid increase of monitoring devices and controllable facilities on the demand side of electricity networks, more solid information and communication technology (ICT) resources are required to support the development of demand side management (DSM). Unlike traditional computation in power systems, which customizes ICT resources for mapping applications separately, DSM especially requires scalability and economic efficiency, because more and more stakeholders participate in the computation process. This paper proposes a novel cost-oriented optimization model for a cloud-based ICT infrastructure to allocate cloud computing resources in a flexible and cost-efficient way. Uncertain factors, including imprecise computation load prediction and unavailability of computing instances, can also be considered in the proposed model. A modified priority list algorithm is specially developed to efficiently solve the proposed optimization model, and it is compared with the mature simulated annealing based algorithm. Comprehensive numerical studies are carried out to demonstrate the effectiveness of the proposed cost-oriented model in reducing the operation cost of the cloud platform in DSM.

Journal ArticleDOI
TL;DR: The disadvantages of cloud computing when big data encounters IoT are analyzed, and two promising computing paradigms are introduced, including fog computing and transparent computing, to support the big data services of IoT.
Abstract: The explosive growth of data volume and the ever-increasing demands of data value extraction have driven us into the era of big data. The “5V” (Variety, Velocity, Volume, Value, and Veracity) characteristics of big data pose great challenges to traditional computing paradigms and motivate the emergence of new solutions. Cloud computing is one of the representative technologies that can perform massive-scale and complex data computing by taking advantage of virtualized resources, parallel processing, and data service integration with scalable data storage. However, as we are also experiencing the revolution of the Internet of Things (IoT), the limitations of cloud computing in supporting lightweight end devices significantly impede its flourishing at the intersection of the big data and IoT era. This also raises the urgency of proposing new computing paradigms. We provide an overview of the topic of big data, and a comprehensive survey of how cloud computing and its related technologies can address the challenges raised by big data. Then, we analyze the disadvantages of cloud computing when big data encounters IoT, and introduce two promising computing paradigms, fog computing and transparent computing, to support the big data services of IoT. Finally, some open challenges and future directions are summarized to foster continued research efforts into this evolving field of study.

Journal ArticleDOI
TL;DR: A three-tier system architecture is proposed and mathematically characterize each tier in terms of energy consumption and latency so that the transmission latency and bandwidth burden caused by cloud computing can be effectively reduced.