
Showing papers on "Edge computing published in 2016"


Journal ArticleDOI
Weisong Shi, Jie Cao, Quan Zhang, Youhuizi Li, Lanyu Xu
TL;DR: The definition of edge computing is introduced, followed by several case studies, ranging from cloud offloading to smart home and city, as well as collaborative edge, to materialize the concept of edge computing.
Abstract: The proliferation of Internet of Things (IoT) and the success of rich cloud services have pushed the horizon of a new computing paradigm, edge computing, which calls for processing the data at the edge of the network. Edge computing has the potential to address the concerns of response time requirement, battery life constraint, bandwidth cost saving, as well as data safety and privacy. In this paper, we introduce the definition of edge computing, followed by several case studies, ranging from cloud offloading to smart home and city, as well as collaborative edge to materialize the concept of edge computing. Finally, we present several challenges and opportunities in the field of edge computing, and hope this paper will gain attention from the community and inspire more research in this direction.

5,198 citations


Journal ArticleDOI
TL;DR: This survey paper summarizes the opportunities and challenges of fog, focusing primarily on the networking context of IoT.
Abstract: Fog is an emergent architecture for computing, storage, control, and networking that distributes these services closer to end users along the cloud-to-things continuum. It covers both mobile and wireline scenarios, traverses across hardware and software, resides on the network edge but also over access networks and among end users, and includes both data plane and control plane. As an architecture, it supports a growing variety of applications, including those in the Internet of Things (IoT), fifth-generation (5G) wireless systems, and embedded artificial intelligence (AI). This survey paper summarizes the opportunities and challenges of fog, focusing primarily on the networking context of IoT.

1,986 citations


Journal ArticleDOI
TL;DR: The success of the Internet of Things and rich cloud services have helped create the need for edge computing, in which data processing occurs in part at the network edge, rather than completely in the cloud.
Abstract: The success of the Internet of Things and rich cloud services have helped create the need for edge computing, in which data processing occurs in part at the network edge, rather than completely in the cloud. Edge computing could address concerns such as latency, mobile devices' limited battery life, bandwidth costs, security, and privacy.

938 citations


Journal ArticleDOI
TL;DR: Fog computing is designed to overcome limitations in traditional systems, the cloud, and even edge computing to handle the growing amount of data that is generated by the Internet of Things.
Abstract: The Internet of Things (IoT) could enable innovations that enhance the quality of life, but it generates unprecedented amounts of data that are difficult for traditional systems, the cloud, and even edge computing to handle. Fog computing is designed to overcome these limitations.

873 citations


Journal ArticleDOI
TL;DR: By sacrificing modest computation resources to save communication bandwidth and reduce transmission latency, fog computing can significantly improve the performance of cloud computing.
Abstract: Mobile users typically have high demand for localized and location-based information services. Always retrieving the localized data from the remote cloud, however, tends to be inefficient, which motivates fog computing. Fog computing, also known as edge computing, extends cloud computing by deploying localized computing facilities at the premises of users, which prestore cloud data and distribute it to mobile users over fast-rate local connections. As such, fog computing introduces an intermediate fog layer between mobile users and the cloud, and complements cloud computing toward low-latency, high-rate services for mobile users. In this fundamental framework, it is important to study the interplay and cooperation between the edge (fog) and the core (cloud). In this paper, the tradeoff between power consumption and transmission delay in the fog-cloud computing system is investigated. We formulate a workload allocation problem that yields the optimal workload allocation between fog and cloud toward minimal power consumption under a constrained service delay. The problem is then tackled using an approximate approach that decomposes the primal problem into three subproblems for the corresponding subsystems, each of which can be solved separately. Finally, based on simulations and numerical results, we show that by sacrificing modest computation resources to save communication bandwidth and reduce transmission latency, fog computing can significantly improve the performance of cloud computing.

681 citations
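
As a rough, generic illustration of the power-delay tradeoff studied above (not the paper's exact formulation, whose cost and delay models are not reproduced here), a workload-allocation problem of this kind might be written as:

\begin{aligned}
\min_{\lambda_f,\ \lambda_c}\quad & P_f(\lambda_f) + P_c(\lambda_c) \\
\text{s.t.}\quad & \lambda_f + \lambda_c = \lambda, \\
& \frac{\lambda_f}{\lambda}\, d_f(\lambda_f) + \frac{\lambda_c}{\lambda}\,\bigl(d_c(\lambda_c) + d_{\mathrm{wan}}\bigr) \le D_{\max}, \\
& \lambda_f \ge 0,\quad \lambda_c \ge 0,
\end{aligned}

where \lambda is the total workload arrival rate, P_f and P_c are assumed fog and cloud power curves, d_f and d_c are the corresponding processing delays, and d_{\mathrm{wan}} is the extra transmission delay to the remote cloud. A per-subsystem decomposition of the kind the paper describes becomes natural once the objective and the delay constraint separate across \lambda_f and \lambda_c.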


Book ChapterDOI
01 Jan 2016
TL;DR: This chapter provides background and motivations regarding the emergence of Fog computing, defines its key characteristics, and presents a reference architecture for Fog computing.
Abstract: The Internet of Everything (IoE) solutions gradually bring every object online, and processing data in a centralized cloud does not scale to requirements of such an environment. This is because there are applications such as health monitoring and emergency response that require low latency, so delay caused by transferring data to the cloud and then back to the application can seriously impact the performance. To this end, Fog computing has emerged, where cloud computing is extended to the edge of the network to decrease the latency and network congestion. Fog computing is a paradigm for managing a highly distributed and possibly virtualized environment that provides compute and network services between sensors and cloud data centers. This chapter provides a background and motivations regarding the emergence of Fog computing, and defines its key characteristics. In addition, a reference architecture for Fog computing is presented, and recent related development and applications are discussed.

376 citations


Proceedings ArticleDOI
26 Dec 2016
TL;DR: This position paper considers the challenges and opportunities that arise out of this new direction in the computing landscape.
Abstract: Many cloud-based applications employ a data center as a central server to process data that is generated by edge devices, such as smartphones, tablets and wearables. This model places ever increasing demands on communication and computational infrastructure with inevitable adverse effect on Quality-of-Service and Experience. The concept of Edge Computing is predicated on moving some of this computational load towards the edge of the network to harness computational capabilities that are currently untapped in edge nodes, such as base stations, routers and switches. This position paper considers the challenges and opportunities that arise out of this new direction in the computing landscape.

326 citations


Journal ArticleDOI
TL;DR: MEC clearly has a window of opportunity to contribute to the creation of a common integration layer for the IoT world, and could pave the way toward being natively integrated in the network of tomorrow.
Abstract: Mobile-Edge computing (MEC) is an emerging technology currently recognized as a key enabler for 5G networks. Compatible with current 4G networks, MEC will address many key uses of the 5G system, motivated by the massive diffusion of the Internet of Things (IoT). This article aims to provide a tutorial on MEC technology and an overview of the MEC framework and architecture recently defined by the European Telecommunications Standards Institute (ETSI) MEC Industry Specification Group (ISG) standardization organization. We provide some examples of MEC deployment, with special reference to IoT cases, since IoT is recognized as a main driver for 5G. Finally, we discuss the main benefits and challenges for MEC moving toward 5G.

308 citations


Journal ArticleDOI
TL;DR: This installment of "Blue Skies" discusses osmotic computing features, challenges, and future directions.
Abstract: Osmotic computing is a new paradigm to support the efficient execution of Internet of Things (IoT) services and applications at the network edge. This paradigm is founded on the need for a holistic distributed system abstraction enabling the deployment of lightweight microservices on resource-constrained IoT platforms at the network edge, coupled with more complex microservices running on large-scale datacenters. This paradigm is driven by the significant increase in resource capacity/capability at the network edge, along with support for data transfer protocols that enable such resources to interact more seamlessly with datacenter-based services. This installment of "Blue Skies" discusses osmotic computing features, challenges, and future directions.

296 citations


Journal ArticleDOI
TL;DR: Mobile-edge computing (MEC) is a novel paradigm that extends cloud-computing capabilities and services to the edge of the network and can support applications and services with reduced latency and improved QoS.
Abstract: Current activities in the Internet of Things (IoT) are focused on architectures, protocols, and networking for the efficient interconnection of heterogeneous things, infrastructure deployment, and creation of value-added services. The majority of IoT products, services, and platforms are supported by cloud-computing platforms. With the IoT being a multidisciplinary ecosystem, it is now being utilized in scenarios demanding real-time data processing and feedback, for example, connected and autonomous vehicle scenarios. Cloud platforms are not suitable for scenarios involving real-time operation, low latency requirements, and high quality of service (QoS). Recently, mobile-edge computing (MEC) has gained momentum from the industry to address these requirements. MEC is a novel paradigm that extends cloud-computing capabilities and services to the edge of the network. Due to dense geographical distribution, proximity to consumers, support for high mobility, and an open platform, MEC can support applications and services with reduced latency and improved QoS. Thus, MEC is becoming an important enabler of consumer-centric IoT applications and services that demand real-time operation. The OpenFog Consortium and standards development organizations like ETSI have also recognized the benefits that the IoT and MEC can bring to consumers. Potential applications for MEC-enabled IoT include smart mobility, connected vehicles, emergency response, smart cities, content distribution, and location-based services.

180 citations


Proceedings ArticleDOI
01 Nov 2016
TL;DR: A conceptual framework for fog resource provisioning is presented, and an optimization problem is formalized which is able to take into account existing resources in fog/IoT landscapes, in order to provide delay-sensitive utilization of available fog-based computational resources.
Abstract: The advent of the Internet of Things (IoT) leads to the pervasion of business and private spaces with ubiquitous, networked computing devices. These devices do not simply act as sensors, but feature computational, storage, and networking resources. These resources are close to the edge of the network, and it is a promising approach to exploit them in order to execute IoT services. This concept is known as fog computing. Despite existing theoretical foundations, the adoption of fog computing is still at its very beginning. Especially, there is a lack of approaches for the leasing and releasing of resources. To resolve this shortcoming, we present a conceptual framework for fog resource provisioning. We formalize an optimization problem which is able to take into account existing resources in fog/IoT landscapes. The goal of this optimization problem is to provide delay-sensitive utilization of available fog-based computational resources. We evaluate the resource provisioning model to show the benefits of our contributions. Our results show a decrease in delays of up to 39% compared to a baseline approach, yielding shorter round-trip times and makespans.

Journal ArticleDOI
22 Jul 2016-Sensors
TL;DR: The experimental results showed that the Internet technologies and Smart Object Communication Patterns can be combined to encourage development of Precision Agriculture, and demonstrated added benefits (cost, energy, smart developing, acceptance by agricultural specialists) when a project is launched.
Abstract: The application of Information Technologies to Precision Agriculture methods has clear benefits. Precision Agriculture optimises production efficiency, increases quality, minimises environmental impact and reduces the use of resources (energy, water); however, there are different barriers that have delayed its wide development. Some of the main barriers are expensive equipment, the difficulty of operating and maintaining it, and sensor network standards that are still under development. Nowadays, new technological developments in embedded devices (hardware and communication protocols), the evolution of Internet technologies (Internet of Things) and ubiquitous computing (Ubiquitous Sensor Networks) allow developing less expensive systems that are easier to control, install and maintain, using standard protocols with low power consumption. This work develops and tests a low-cost sensor/actuator network platform, based on the Internet of Things, integrating machine-to-machine and human-machine-interface protocols. Edge computing uses this multi-protocol approach to develop control processes in Precision Agriculture scenarios. A greenhouse with hydroponic crop production was developed and tested using Ubiquitous Sensor Network monitoring and edge control under the Internet of Things paradigm. The experimental results showed that Internet technologies and Smart Object Communication Patterns can be combined to encourage the development of Precision Agriculture. They also demonstrated added benefits (cost, energy, smart development, acceptance by agricultural specialists) when a project is launched.

Proceedings ArticleDOI
04 Aug 2016
TL;DR: Experimental results from WiFi and 4G LTE networks are presented that confirm substantial wins from edge computing for highly interactive mobile applications.
Abstract: Computational offloading services at the edge of the Internet for mobile devices are becoming a reality. Using a wide range of mobile applications, we explore how such infrastructure improves latency and energy consumption relative to the cloud. We present experimental results from WiFi and 4G LTE networks that confirm substantial wins from edge computing for highly interactive mobile applications.

Proceedings ArticleDOI
22 Aug 2016
TL;DR: This paper considers a cloudlet in an Orthogonal Frequency-Division Multiple Access (OFDMA) system with multiple mobile devices, where the proposed algorithm significantly outperforms per-resource optimization, accommodating more offloading requests while achieving salient energy savings.
Abstract: In mobile edge computing systems, mobile devices can offload compute-intensive tasks to a nearby cloudlet, so as to save energy and extend battery life. Unlike a fully-fledged cloud, a cloudlet is a small-scale datacenter deployed at a wireless access point, and thus is highly constrained by both radio and compute resources. We show in this paper that separately optimizing the allocation of either compute or radio resources - as most existing works did - is highly suboptimal: the congestion of compute resources leads to the waste of radio resources, and vice versa. To address this problem, we propose a joint scheduling algorithm that allocates both radio and compute resources coordinately. Specifically, we consider a cloudlet in an Orthogonal Frequency-Division Multiple Access (OFDMA) system with multiple mobile devices, where we study subcarrier allocation for task offloading and CPU time allocation for task execution in the cloudlet. Simulation results show that the proposed algorithm significantly outperforms per-resource optimization, accommodating more offloading requests while achieving salient energy savings.
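
One heuristic way to picture "joint" radio-and-compute allocation (not the paper's algorithm, whose details are not given in the abstract) is a greedy admission loop that only accepts an offloading request when both subcarriers and CPU time remain available, so neither resource is over-committed while the other sits idle. All names and numbers below are illustrative assumptions.

# Greedy sketch of joint radio (subcarrier) and compute (CPU-time) admission
# in a cloudlet. Purely illustrative; not the paper's scheduling algorithm.

def joint_admission(requests, total_subcarriers, total_cpu_ms):
    """requests: list of dicts with 'id', 'subcarriers', 'cpu_ms', 'energy_saving_mj'."""
    # Favor requests that save the most device energy per unit of combined resource use.
    def score(r):
        return r["energy_saving_mj"] / (r["subcarriers"] + r["cpu_ms"])
    admitted, free_sc, free_cpu = [], total_subcarriers, total_cpu_ms
    for r in sorted(requests, key=score, reverse=True):
        if r["subcarriers"] <= free_sc and r["cpu_ms"] <= free_cpu:
            admitted.append(r["id"])
            free_sc -= r["subcarriers"]
            free_cpu -= r["cpu_ms"]
    return admitted

if __name__ == "__main__":
    reqs = [
        {"id": "dev1", "subcarriers": 8, "cpu_ms": 40, "energy_saving_mj": 120},
        {"id": "dev2", "subcarriers": 4, "cpu_ms": 80, "energy_saving_mj": 90},
        {"id": "dev3", "subcarriers": 16, "cpu_ms": 20, "energy_saving_mj": 60},
    ]
    print(joint_admission(reqs, total_subcarriers=20, total_cpu_ms=100))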

Proceedings ArticleDOI
01 Dec 2016
TL;DR: A technique for managing computation offloading in a local IoT network under bandwidth constraints is proposed, which shows on average a 1 hour (up to 1.5 hours) improvement in battery life of edge devices and utilizes the available resources of the gateway effectively.
Abstract: With the proliferation of portable and mobile IoT devices and their increasing processing capability, we witness that the edge of the network is moving to IoT gateways and smart devices. To avoid Big Data issues (e.g. the high latency of cloud-based IoT), processing of the captured data starts at the IoT edge node. However, the available processing capabilities and energy resources are still limited and do not allow the data to be fully processed on board, which calls for offloading some portions of the computation to the gateway or servers. Due to the limited bandwidth of IoT gateways, choosing the offloading levels of connected devices and allocating bandwidth to them is a challenging problem. This paper proposes a technique for managing computation offloading in a local IoT network under bandwidth constraints. The existing bandwidth allocation and computation offloading management techniques underutilize the gateway's resources (e.g. bandwidth) due to the fragmentation issue. This issue stems from the discrete coarse-grained choices (i.e. offloading levels) on the IoT end nodes. Our proposed technique addresses this issue and utilizes the available resources of the gateway effectively. The experimental results show on average a 1 hour (up to 1.5 hours) improvement in the battery life of edge devices. The utilization of the gateway's bandwidth increased by 40%.
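
The core problem in the abstract, picking a discrete offloading level per device under a shared gateway bandwidth budget, can be pictured as a small exhaustive search. The toy below (device names, levels, and numbers are invented, and this is not the paper's management technique) picks the combination of levels that maximizes total battery benefit while staying within the bandwidth cap.

# Toy search over discrete offloading levels under a gateway bandwidth budget.
# Illustrative only; the paper's actual offloading management is not reproduced here.
from itertools import product

# Per device: list of (bandwidth_kbps, battery_gain_min) options, one per offloading level.
DEVICES = {
    "cam": [(0, 0), (200, 25), (400, 40)],
    "watch": [(0, 0), (100, 15), (250, 30)],
    "sensor": [(0, 0), (50, 10)],
}

def best_levels(devices, bandwidth_budget_kbps):
    names = list(devices)
    best, best_gain = None, -1
    for choice in product(*(range(len(devices[n])) for n in names)):
        bw = sum(devices[n][lvl][0] for n, lvl in zip(names, choice))
        gain = sum(devices[n][lvl][1] for n, lvl in zip(names, choice))
        if bw <= bandwidth_budget_kbps and gain > best_gain:
            best, best_gain = dict(zip(names, choice)), gain
    return best, best_gain

if __name__ == "__main__":
    print(best_levels(DEVICES, bandwidth_budget_kbps=500))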

Proceedings ArticleDOI
01 Nov 2016
TL;DR: In this article, the authors propose Edge-Fog Cloud which distributes task processing on the participating cloud resources in the network and develop the Least Processing Cost First (LPCF) method for assigning the processing tasks to nodes which provide the optimal processing time and near optimal networking costs.
Abstract: Internet of Things typically involves a significant number of smart sensors sensing information from the environment and sharing it to a cloud service for processing. Various architectural abstractions, such as Fog and Edge computing, have been proposed to localize some of the processing near the sensors and away from the central cloud servers. In this paper, we propose Edge-Fog Cloud which distributes task processing on the participating cloud resources in the network. We develop the Least Processing Cost First (LPCF) method for assigning the processing tasks to nodes which provide the optimal processing time and near optimal networking costs. We evaluate LPCF in a variety of scenarios and demonstrate its effectiveness in finding the processing task assignments.
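
As a hedged sketch of the task-assignment flavor described above (LPCF itself also optimizes networking cost, which is omitted here, and the node model below is invented), a greedy assignment that always places the next task where it would finish processing earliest looks like this:

# Simplified "least processing cost first" style assignment: each task goes to
# the node where it would finish earliest. Not the paper's exact LPCF algorithm.

def assign_tasks(task_cycles, node_speeds):
    """task_cycles: list of cycle counts; node_speeds: dict node -> cycles/second."""
    finish_time = {n: 0.0 for n in node_speeds}           # projected busy time per node
    assignment = []
    for cycles in sorted(task_cycles, reverse=True):      # place big tasks first
        # Pick the node where this task would finish earliest.
        node = min(node_speeds, key=lambda n: finish_time[n] + cycles / node_speeds[n])
        finish_time[node] += cycles / node_speeds[node]
        assignment.append((cycles, node))
    return assignment, max(finish_time.values())          # makespan

if __name__ == "__main__":
    plan, makespan = assign_tasks([8e9, 4e9, 2e9, 1e9], {"edge": 1e9, "fog": 2e9, "cloud": 8e9})
    print(plan, f"makespan={makespan:.2f}s")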

Proceedings ArticleDOI
01 Oct 2016
TL;DR: The entire ParaDrop framework has been implemented and deployed; this paper describes its overall architecture and the authors' initial experiences using it as an edge computing platform.
Abstract: We introduce ParaDrop, a specific edge computing platform that provides computing and storage resources at the "extreme" edge of the network, allowing third-party developers to flexibly create new types of services. This extreme edge of the network is the WiFi Access Point (AP) or the wireless gateway through which all end-device traffic (personal devices, sensors, etc.) passes. ParaDrop's focus on WiFi APs also stems from the fact that the WiFi AP has unique contextual knowledge of its end-devices (e.g., proximity, channel characteristics) that is lost as we get deeper into the network. While different variations and implementations of edge computing platforms have been created over the last decade, ParaDrop focuses on specific design issues around how to structure an architecture, a programming interface, and an orchestration framework through which such edge computing services can be dynamically created, installed, and revoked. ParaDrop consists of the following three main components: a flexible hosting substrate in the WiFi APs that supports multi-tenancy, a cloud-based backend through which such computations are orchestrated across many ParaDrop APs, and an API through which third-party developers can deploy and manage their computing functions across such different ParaDrop APs. We have implemented and deployed the entire ParaDrop framework and, in this paper, describe its overall architecture and our initial experiences using it as an edge computing platform.

Proceedings ArticleDOI
01 Jan 2016
TL;DR: FogGIS is a fog-computing-based framework for mining analytics from geospatial data; a prototype has been built using the Intel Edison, an embedded microprocessor, and validated through preliminary analysis including compression and overlay analysis.
Abstract: Cloud Geographic Information Systems (GIS) have emerged as a tool for the analysis, processing and transmission of geospatial data. Fog computing is a paradigm in which Fog devices help to increase throughput and reduce latency at the edge of the client. This paper develops a Fog-computing-based framework named FogGIS for mining analytics from geospatial data. A prototype has been built using the Intel Edison, an embedded microprocessor. FogGIS has been validated through preliminary analysis, including compression and overlay analysis. Results showed that Fog computing holds great promise for the analysis of geospatial data. Several open-source compression techniques have been used for reducing transmission to the cloud.
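
As a trivially small illustration of the "compress at the fog layer before uploading" step mentioned in the abstract (gzip is used here as a stand-in; the paper evaluates several open-source codecs, and the payload below is synthetic), a gateway could shrink a GeoJSON track before sending it to the cloud:

# Compress a GeoJSON payload at the fog gateway before uploading to the cloud.
# gzip is only an example codec; the paper compares several open-source techniques.
import gzip
import json

track = {
    "type": "Feature",
    "geometry": {
        "type": "LineString",
        "coordinates": [[-83.05 + i * 1e-4, 42.33 + i * 1e-4] for i in range(500)],
    },
    "properties": {"sensor": "gps-01"},
}

raw = json.dumps(track).encode("utf-8")
compressed = gzip.compress(raw)

print(f"raw: {len(raw)} bytes, gzip: {len(compressed)} bytes "
      f"({100 * len(compressed) / len(raw):.0f}% of original)")
# The compressed bytes, rather than the raw GeoJSON, would then be pushed to cloud GIS storage.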

Posted Content
TL;DR: This position paper considers the challenges and opportunities that arise out of this new direction in the computing landscape that is predicated on moving some of this computational load towards the edge of the network.
Abstract: Many cloud-based applications employ a data centre as a central server to process data that is generated by edge devices, such as smartphones, tablets and wearables. This model places ever increasing demands on communication and computational infrastructure with inevitable adverse effect on Quality-of-Service and Experience. The concept of Edge Computing is predicated on moving some of this computational load towards the edge of the network to harness computational capabilities that are currently untapped in edge nodes, such as base stations, routers and switches. This position paper considers the challenges and opportunities that arise out of this new direction in the computing landscape.

Proceedings ArticleDOI
01 Dec 2016
TL;DR: The paper presents a performance analysis of the Golang implementation of Virtual Resources, a software-defined IoT management construct that enables multi-tenancy support and load distribution onto edge hosts.
Abstract: Current IoT systems tend to be cloud-centric which in turn introduces network latency and constrained interaction with sensors and actuators. This paper presents the idea of using restful micro-services called Virtual Resources. A Virtual Resource is a software-defined IoT management construct that enables multi-tenancy support and load distribution onto edge hosts. The paper presents a performance analysis of our Golang implementation of Virtual Resources in various settings.

Proceedings ArticleDOI
16 May 2016
TL;DR: This paper devises a methodology, referred to as MEdia FOg Resource Estimation (MeFoRE), to provide resource estimation on the basis of the service give-up ratio, also called the Relinquish Rate (RR), and to enhance QoS on the basis of previous Quality of Experience (QoE) and Net Promoter Score (NPS) records.
Abstract: The Internet of Things (IoT) is now transitioning from theory to practice. This means that a lot of data will be generated, and the management of this data is going to be a big challenge. To transform the IoT into reality and build realistic and more useful services, better resource management is required at the perception layer. In this regard, Fog computing plays a very vital role. With the advent of Vehicular Ad hoc Networks (VANET) and remote healthcare and monitoring, quick response times and latency minimization are required. However, the receiving nodes have very fluctuating behavior in resource consumption, especially if they are mobile. Fog, a localized cloud placed close to the underlying IoT devices, provides the means to address such issues by analyzing the behavior of the nodes and estimating resources accordingly. Similarly, Service Level Agreement (SLA) management and meeting Quality of Service (QoS) requirements also become issues. In this paper, we devise a methodology, referred to as MEdia FOg Resource Estimation (MeFoRE), to provide resource estimation on the basis of the service give-up ratio, also called the Relinquish Rate (RR), and to enhance QoS on the basis of previous Quality of Experience (QoE) and Net Promoter Score (NPS) records. The algorithms are implemented using CloudSim and applied to real IoT traces on the basis of Amazon EC2 resource pricing.
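
The abstract leaves the estimation rule itself unspecified; purely as a hedged sketch of the general idea (scale a node's resource grant down when its relinquish rate is high, and up when its past QoE/NPS history suggests it will actually use what it requests), one could write something like the following. All coefficients and field names are assumptions, not taken from the paper.

# Hedged sketch of relinquish-rate-aware resource estimation at a fog node.
# The scaling rule and coefficients are illustrative assumptions, not MeFoRE itself.

def estimate_allocation(requested_mips, relinquish_rate, qoe_score, nps):
    """
    requested_mips : resources the IoT node asks for
    relinquish_rate: fraction of previously granted resources the node gave up (0..1)
    qoe_score      : past Quality of Experience, normalized to 0..1
    nps            : Net Promoter Score in [-100, 100]
    """
    # Trust factor shrinks with the relinquish rate, grows with QoE and NPS history.
    trust = (1.0 - relinquish_rate) * (0.5 + 0.5 * qoe_score) * (0.75 + 0.25 * (nps + 100) / 200)
    return requested_mips * max(0.1, min(1.0, trust))

if __name__ == "__main__":
    print(estimate_allocation(1000, relinquish_rate=0.3, qoe_score=0.8, nps=40))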

Proceedings ArticleDOI
16 Sep 2016
TL;DR: In this paper, an efficient reinforcement learning-based resource management algorithm was proposed to minimize the long-term system cost, including both service delay and operational cost, by using a decomposition of the value iteration and (online) reinforcement learning.
Abstract: Mobile edge computing (a.k.a. fog computing) has recently emerged to enable in-situ processing of delay-sensitive applications at the edge of mobile networks. Providing grid power supply in support of mobile edge computing, however, is costly and even infeasible (in certain rugged or under-developed areas), thus mandating on-site renewable energy as a major or even sole power supply in increasingly many scenarios. Nonetheless, the high intermittency and unpredictability of renewable energy make it very challenging to deliver a high quality of service to users in renewable-powered mobile edge computing systems. In this paper, we address the challenge of incorporating renewables into mobile edge computing and propose an efficient reinforcement learning-based resource management algorithm, which learns on-the-fly the optimal policy of dynamic workload offloading (to centralized cloud) and edge server provisioning to minimize the long-term system cost (including both service delay and operational cost). Our online learning algorithm uses a decomposition of the (offline) value iteration and (online) reinforcement learning, thus achieving a significant improvement of learning rate and run-time performance when compared to standard reinforcement learning algorithms such as Q-learning.
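
The paper's own algorithm decomposes offline value iteration from online learning; the toy below instead shows plain tabular Q-learning, i.e. the kind of baseline the authors compare against, on a drastically simplified state of (battery level, workload level), just to make the "learn an offloading policy on-the-fly" idea concrete. The states, actions, rewards and the tiny simulator are all invented for illustration.

# Tabular Q-learning toy for edge workload offloading under a renewable-powered
# edge server. Everything here (states, rewards, dynamics) is a made-up toy,
# shown only to illustrate the baseline "plain RL" approach, not the paper's algorithm.
import random

random.seed(1)
ACTIONS = [0.0, 0.5, 1.0]                     # fraction of workload offloaded to the cloud
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1
Q = {}                                        # (battery_level, load_level, action) -> value

def step(battery, load, offload):
    """Toy dynamics: local processing drains the battery; offloading adds delay cost."""
    local = load * (1 - offload)
    battery = max(0, min(2, battery - (1 if local > 0.5 else 0) + random.choice([0, 1])))
    delay_cost = 2.0 * offload * load + (3.0 if battery == 0 and local > 0 else 0.5 * local)
    next_load = random.choice([0.2, 0.6, 1.0])
    return (battery, next_load), -delay_cost

def choose(state):
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q.get((*state, a), 0.0))

state = (2, 0.6)
for _ in range(20000):
    a = choose(state)
    next_state, r = step(*state, a)
    best_next = max(Q.get((*next_state, b), 0.0) for b in ACTIONS)
    Q[(*state, a)] = Q.get((*state, a), 0.0) + ALPHA * (r + GAMMA * best_next - Q.get((*state, a), 0.0))
    state = next_state

# Learned greedy action for a low-battery, high-load state:
print(max(ACTIONS, key=lambda a: Q.get((0, 1.0, a), 0.0)))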

Patent
29 Dec 2016
TL;DR: In this paper, the authors describe a distributed processing of Internet of Things (IoT) device data by edge systems co-located within a globally-distributed set of co-location facilities deployed and managed by a colocation facility provider.
Abstract: Techniques are described for distributed processing of Internet of Things (IoT) device data by edge systems co-located within a globally-distributed set of co-location facilities deployed and managed by a co-location facility provider. For example, a method includes selecting, by at least one of a plurality of edge computing systems co-located within respective co-location facilities each deployed and managed by a single co-location facility provider, a selected edge computing system of the plurality of edge computing systems to process data associated with events generated by an IoT device. The method also includes provisioning, at the selected edge computing system, an application programming interface (API) endpoint for communication with the IoT device, receiving, by the selected edge computing system at the endpoint, the data associated with the events generated by the IoT device, and processing, by the selected edge computing system, the data associated with the events generated by the IoT device.
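
Stripped of patent language, the described flow is: pick one edge system out of many co-location sites for a given IoT device, stand up an API endpoint there, then receive and process that device's events. A hypothetical sketch of that flow (site names, latency figures, and the endpoint URL scheme are all invented) could look like this:

# Hypothetical sketch of the patent's flow: select an edge site for an IoT device,
# provision an endpoint there, then accept the device's event data at that endpoint.
# Site names, latencies, and the URL scheme are invented for illustration.

SITES = {"iad1": 48.0, "ord2": 22.0, "sjc3": 95.0}   # measured RTT (ms) device -> site

def select_site(rtt_ms):
    return min(rtt_ms, key=rtt_ms.get)                # lowest-latency co-location site

def provision_endpoint(site, device_id):
    # In a real system this would call the provider's provisioning API;
    # here we only construct the endpoint identifier.
    return f"https://{site}.edge.example.net/iot/{device_id}/events"

def handle_event(endpoint, event, store):
    store.setdefault(endpoint, []).append(event)      # stand-in for local processing

if __name__ == "__main__":
    site = select_site(SITES)
    endpoint = provision_endpoint(site, "thermostat-17")
    events = {}
    handle_event(endpoint, {"temp_c": 21.5, "ts": 1480000000}, events)
    print(endpoint, events[endpoint])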

Journal ArticleDOI
TL;DR: In this paper, the authors present adaptive edge computing solutions based on regressive admission control (REAC) and fuzzy weighted queueing (FWQ) that monitor and react to network quality-of-service (QoS) changes within heterogeneous networks, and in a vehicular use case scenario utilizing IEEE 802.11p technology.
Abstract: The vision of future networking is that not only people but also all things, services, and media will be connected and integrated, creating an Internet of Everything (IoE). Internet-of-Things (IoT) systems aim to connect and scale billions of devices in various domains such as transportation, industry, smart home/city, medical services, and energy systems. Different wireless and wired technologies link sensors and systems together, through wireless access points, gateways, and routers that in turn connect to the web and cloud-based intelligence. IoT architectures make great demands on network control methods for the efficient management of massive amounts of nodes and data. Therefore, some of the cloud's management tasks should be distributed around the edges of networked systems, utilizing fog computing to control and manage, e.g., network resources, quality, traffic prioritizations, and security. In this work, we present adaptive edge computing solutions based on regressive admission control (REAC) and fuzzy weighted queueing (FWQ) that monitor and react to network quality-of-service (QoS) changes within heterogeneous networks, and in a vehicular use case scenario utilizing IEEE 802.11p technology. These adaptive solutions provide more stable network performance and optimize the network path and resources.

Proceedings ArticleDOI
18 May 2016
TL;DR: The results showed the efficacy of FIT as a Fog interface to translate the clinical speech processing chain (CLIP) from a cloud-based backend to a fog-based smart gateway.
Abstract: There is an increasing demand for smart fog-computing gateways as the size of cloud data is growing. This paper presents a Fog computing interface (FIT) for processing clinical speech data. FIT builds upon our previous work on EchoWear, a wearable technology that validated the use of smartwatches for collecting clinical speech data from patients with Parkinson's disease (PD). The fog interface is a low-power embedded system that acts as a smart interface between the smartwatch and the cloud. It collects, stores, and processes the speech data before sending speech features to secure cloud storage. We developed and validated a working prototype of FIT that enabled remote processing of clinical speech data to get speech clinical features such as loudness, short-time energy, zero-crossing rate, and spectral centroid. We used speech data from six patients with PD in their homes for validating FIT. Our results showed the efficacy of FIT as a Fog interface to translate the clinical speech processing chain (CLIP) from a cloud-based backend to a fog-based smart gateway.
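
The four features named in the abstract are standard short-time speech measures; the snippet below computes them for a single analysis frame with NumPy. The frame length, sampling rate, and the loudness approximation via RMS level are assumptions on my part, not details taken from the paper.

# Short-time speech features of the kind FIT extracts at the gateway:
# loudness (approximated as RMS level in dB), short-time energy,
# zero-crossing rate, and spectral centroid. Frame/sample-rate choices are assumed.
import numpy as np

def frame_features(frame, sample_rate):
    energy = float(np.sum(frame ** 2))                               # short-time energy
    rms_db = 10 * np.log10(np.mean(frame ** 2) + 1e-12)              # crude loudness proxy
    zcr = float(np.mean(np.abs(np.diff(np.sign(frame))) > 0))        # zero-crossing rate
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return {"energy": energy, "rms_db": rms_db, "zcr": zcr, "spectral_centroid_hz": centroid}

if __name__ == "__main__":
    sr = 16000
    t = np.arange(0, 0.025, 1.0 / sr)                                # one 25 ms frame
    frame = 0.1 * np.sin(2 * np.pi * 220 * t)                        # synthetic 220 Hz tone
    print(frame_features(frame, sr))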

Proceedings ArticleDOI
10 Nov 2016
TL;DR: This paper presents the idea and evaluation of using virtual resources in combination with a permission-based blockchain for provisioning IoT services on edge hosts.
Abstract: Moving IoT components from the cloud onto edge hosts helps in reducing overall network traffic and thus minimizes latency. However, provisioning IoT services on the IoT edge devices presents new challenges regarding system design and maintenance. One possible approach is the use of software-defined IoT components in the form of virtual IoT resources. This, in turn, allows exposing the thing/device layer and the core IoT service layer as collections of micro services that can be distributed to a broad range of hosts. This paper presents the idea and evaluation of using virtual resources in combination with a permission-based blockchain for provisioning IoT services on edge hosts.

Proceedings ArticleDOI
01 Oct 2016
TL;DR: This paper proposes a new computing paradigm, Firework, which is designed for big data processing in the collaborative edge environment (CEE) and aims to share data while ensuring data privacy and integrity for stakeholders.
Abstract: Cloud computing, arguably, has become the de facto computing platform for big data processing by researchers and practitioners over the last decade, and has enabled different stakeholders to discover valuable information from large-scale data. At the same time, in that decade, we have witnessed the fast-growing deployment of billions of sensors and actuators in multiple application domains, such as transportation, manufacturing, connected/wearable health care, smart city and so on, stimulating the emergence of Edge Computing (a.k.a. fog computing, cloudlet). However, data, as the core of both cloud computing and edge computing, is still owned by each stakeholder and rarely shared, due to privacy concerns and the formidable cost of data transportation, which significantly limits Internet of Things (IoT) applications that need data input from multiple stakeholders (e.g., video analytics collects data from cameras owned by the police department, the transportation department, retail stores, etc.). In this paper, we envision that in the era of IoT the demand for distributed big data sharing and processing applications will dramatically increase, since data production and consumption are pushed to the edge of the network. Data processing in a collaborative edge environment needs to fuse data owned by multiple stakeholders, while keeping the computation within stakeholders' data facilities. To address this challenge, we propose a new computing paradigm, Firework, which is designed for big data processing in the collaborative edge environment (CEE). Firework fuses geographically distributed data by creating virtual shared data views that are exposed to end users via predefined interfaces by data owners. The interfaces are provided in the form of a set of datasets and a set of functions, where the functions are privacy-preserving and bound to the datasets. Firework aims to share data while ensuring data privacy and integrity for stakeholders. By pushing the data processing as close as possible to data sources, Firework also aims to avoid data movement from the edge of the network to the cloud and improve response latency.
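
To make the "datasets plus privacy-preserving functions bound to them" interface concrete, here is a deliberately small sketch: the data owner registers only aggregate functions over its dataset, and consumers may invoke those functions but never read the raw records. The class and function names are mine, not Firework's API.

# Minimal sketch of a "virtual shared data view": the owner keeps raw data local
# and exposes only pre-approved functions over it. Names are illustrative, not
# the Firework API.

class VirtualDataView:
    def __init__(self, dataset):
        self._dataset = dataset          # stays inside the owner's facility
        self._functions = {}

    def register(self, name, fn):
        """Owner binds a privacy-preserving function to the dataset."""
        self._functions[name] = fn

    def invoke(self, name, **kwargs):
        """Consumers may only call registered functions; raw records never leave."""
        if name not in self._functions:
            raise PermissionError(f"function {name!r} is not exposed by this view")
        return self._functions[name](self._dataset, **kwargs)

if __name__ == "__main__":
    plates_seen = [{"plate": "ABC123", "cam": 3}, {"plate": "XYZ890", "cam": 7},
                   {"plate": "ABC123", "cam": 9}]
    view = VirtualDataView(plates_seen)
    view.register("count_sightings", lambda data, plate: sum(r["plate"] == plate for r in data))
    print(view.invoke("count_sightings", plate="ABC123"))   # -> 2, with no raw records exposed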

Journal ArticleDOI
TL;DR: This paper focuses on the development of distributed computing and storage infrastructure that will enable the deployment of applications and services at the edge of the network, allowing operators to offer a virtualized environment to enterprise customers and industries to implement applications and Services close to end users.
Abstract: Future 5G cellular networks are expected to play a major role in supporting the Internet of Things (IoT) due to their ubiquitous coverage, plug-and-play configuration, and embedded security. Besides connectivity, however, the IoT will need computation and storage in proximity of sensors and actuators to support time-critical and opportunistic applications. Mobile-edge computing (MEC) is currently under standardization as a novel paradigm expected to enrich future broadband communication networks [1], [2]. With MEC, traditional networks will be empowered by placing cloud-computing-like capabilities within the radio access network, in an MEC server located in close proximity to end users. Such distributed computing and storage infrastructure will enable the deployment of applications and services at the edge of the network, allowing operators to offer a virtualized environment to enterprise customers and industries to implement applications and services close to end users.

Book ChapterDOI
01 Dec 2016
TL;DR: This paper evaluates through performance analysis three "off-the-shelf" object store solutions, namely Rados, Cassandra and InterPlanetary File System (IPFS), and shows that among the three tested solutions IPFS fills most of the criteria expected for a Fog/Edge computing infrastructure.
Abstract: Fog/Edge computing infrastructures have been proposed as an alternative to current Cloud Computing facilities to address the latency issue that prevents the development of several applications. The main idea is to deploy smaller data-centers at the edge of the backbone in order to bring Cloud computing resources closer to the end usage. While a couple of works have illustrated the advantages of such infrastructures, in particular for Internet of Things (IoT) applications, the way of designing elementary services that can take advantage of such massively distributed infrastructures has not yet been discussed. In this paper, we propose to deal with such a question from the storage point of view. First, we propose a list of properties a storage system should meet in this context. Second, we evaluate through performance analysis three "off-the-shelf" object store solutions, namely Rados, Cassandra and InterPlanetary File System (IPFS). In particular, we focused (i) on access times to push and get objects under different scenarios and (ii) on the amount of network traffic that is exchanged between the different sites during such operations. Experiments have been conducted using the Yahoo Cloud System Benchmark (YCSB) on top of the Grid'5000 testbed. We show that among the three tested solutions IPFS fills most of the criteria expected for a Fog/Edge computing infrastructure.
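
The evaluation methodology boils down to timing put and get operations per site and counting the inter-site traffic they cause. A stripped-down version of the first part is sketched below against an in-memory stand-in rather than Rados/Cassandra/IPFS; the client interface is a placeholder, not any of those systems' real APIs.

# Skeleton of a put/get latency micro-benchmark in the spirit of the paper's
# YCSB runs on Grid'5000. DictStore is an in-memory stand-in, not a real
# Rados/Cassandra/IPFS client.
import os
import time
import statistics

class DictStore:
    def __init__(self):
        self._objects = {}
    def put(self, key, value):
        self._objects[key] = value
    def get(self, key):
        return self._objects[key]

def bench(store, n_objects=1000, size_bytes=4096):
    payload = os.urandom(size_bytes)
    put_times, get_times = [], []
    for i in range(n_objects):
        t0 = time.perf_counter(); store.put(f"obj-{i}", payload); put_times.append(time.perf_counter() - t0)
    for i in range(n_objects):
        t0 = time.perf_counter(); store.get(f"obj-{i}"); get_times.append(time.perf_counter() - t0)
    return statistics.median(put_times), statistics.median(get_times)

if __name__ == "__main__":
    put_s, get_s = bench(DictStore())
    print(f"median put {1e3 * put_s:.3f} ms, median get {1e3 * get_s:.3f} ms")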

Proceedings ArticleDOI
01 Nov 2016
TL;DR: The results indicate that the type of IoT application, the availability of local renewable energy, and weather forecasting all influence how a system makes dynamic decisions in terms of saving energy.
Abstract: The Internet of Things (IoT) is hailed as the next big phase in the evolution of the Internet. IoT devices are rapidly proliferating, owing to widespread adoption by industries in various sectors. Unlike security and privacy concerns, the energy consumption of the IoT and its applications has received little attention from the research community. This paper explores different options for deploying energy-efficient IoT applications. Specifically, we evaluate the use of a combination of Fog computing and microgrids for reducing the energy consumption of IoT applications. First, we study the energy consumption of different types of IoT applications (such as IoT applications with differing traffic generation or computation) running from both Fog and Cloud computing. Next, we consider the role of local renewable energy provided by microgrids, and local weather forecasting, along with Fog computing. To evaluate our proposal, energy consumption modeling, practical experiments and measurements were performed. The results indicate that the type of IoT application, the availability of local renewable energy, and weather forecasting all influence how a system makes dynamic decisions in terms of saving energy.
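
A back-of-the-envelope version of the comparison described above, with completely invented per-bit and per-cycle energy figures, is enough to show why the answer depends on the application profile and on local renewable availability:

# Toy energy model comparing running an IoT application in the cloud vs. in the fog.
# All coefficients are invented placeholders; only the structure of the comparison
# mirrors the paper's discussion.

def grid_energy_cloud(bits, cycles, e_wan_per_bit=5e-8, e_cloud_per_cycle=1e-9):
    return bits * e_wan_per_bit + cycles * e_cloud_per_cycle        # joules drawn from the grid

def grid_energy_fog(bits, cycles, renewable_j, e_lan_per_bit=1e-8, e_fog_per_cycle=3e-9):
    total = bits * e_lan_per_bit + cycles * e_fog_per_cycle
    return max(0.0, total - renewable_j)                            # microgrid covers part of it

if __name__ == "__main__":
    # A chatty, light-compute application (periodic sensing) vs. a compute-heavy one.
    for name, bits, cycles in [("sensing", 8e9, 1e9), ("video analytics", 8e9, 5e11)]:
        cloud = grid_energy_cloud(bits, cycles)
        fog = grid_energy_fog(bits, cycles, renewable_j=200.0)
        print(f"{name}: cloud {cloud:.0f} J vs fog {fog:.0f} J (with 200 J of local renewables)")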