

Multi-tier Fog Computing with Large-scale IoT
Data Analytics for Smart Cities
Jianhua He, Jian Wei, Kai Chen, Zuoyin Tang, Yi Zhou, and Yan Zhang
Abstract—Analysis of Internet of Things (IoT) sensor data is key to achieving city smartness. In this paper a multi-tier fog computing model with a large-scale data analytics service is proposed for smart city applications. The multi-tier fog consists of ad-hoc fogs and dedicated fogs with opportunistic and dedicated computing resources, respectively. The proposed fog computing model, with clearly defined functional modules, is able to mitigate the potential problems of dedicated computing infrastructure and slow response in cloud computing. We run analytics benchmark experiments over fogs formed by Raspberry Pi computers with a distributed computing engine to measure the computing performance of various analytics tasks, and create easy-to-use workload models. QoS aware admission control, offloading and resource allocation schemes are designed to support data analytics services and maximize analytics service utilities. Availability and cost models of networking and computing resources are taken into account in the QoS scheme design. A scalable system level simulator is developed to evaluate the fog based analytics service and the QoS management schemes. Experiment results demonstrate the efficiency of analytics services over multi-tier fogs and the effectiveness of the proposed QoS schemes. Fogs can largely improve the performance of smart city analytics services compared with the cloud-only model, in terms of job blocking probability and service utility.
Keywords: Internet of Things; Smart Cities; Fog Comput-
ing; Quality of Services; Data Analytics; Spark; Raspberry Pi
I. INTRODUCTION
With greater access to public resources such as education and health care, and with more job opportunities, more and more people leave villages to live in cities. A rapid urbanization of the world's population was witnessed in the last decade. The global proportion of urban population was reported by the United Nations to be 49% (3.2 billion) in 2005, and is expected to rise to 60% (4.9 billion) by 2030. However, the fast increasing urban population exacerbates the existing problems faced by modern cities, such as traffic congestion, pollution, low quality public services, and insufficient public resources and budget for health and education. Smart cities are an ambitious vision to tackle these problems by making more efficient use of city resources and infrastructure and improving the quality of life for citizens. It is proposed to capitalize on the latest
Jianhua He, Jian Wei and Zuoyin Tang are with School of
Engineering and Applied Science, Aston University, UK, Email:
{j.he7,weij2,z.tang1}@aston.ac.uk. Kai Chen and Yi Zhou are
with Department of Electronics Engineering, Shanghai Jiaotong University,
China, Email: kchen@sjtu.edu.cn. Yan Zhang is with Department of
Informatics, University of Oslo, Norway, Email: yanzhang@ieee.org. Dr
Yi Zhou is the Corresponding Author.
Copyright (c) 2012 IEEE. Personal use of this material is permitted.
However, permission to use this material for any other purposes must be
obtained from the IEEE by sending a request to pubs-permissions@ieee.org.
technology advances of Internet of Things (IoT), communi-
cation and networking, computing and big data analytics, to
provide smartness on many sectors, such as transport and
traffic management, health care, water, energy, and waste
management.
IoT provides a vital instrument to sense and control the physical city environment [1]–[3]. IoT data analytics is key to achieving and delivering city smartness. With virtually unlimited computing and storage resources, clouds are thought to be the natural place for big data analytics [4], [5], and can provide easy management of IoT services [5]. However, with the expansion of IoT systems, the emerging big data from smart city applications, and the fast response requirements of applications such as public safety and emergency response, cloud based solutions face problems with the real-time and reliable transport of enormous IoT traffic over communication networks, especially wireless access networks, which are well known for their low bandwidth and high communication cost.
There are several edge computing models (including Cloudlet, mobile edge computing and fog computing) proposed to tackle the data analytics problems of the cloud computing based solution [3], [6]–[9]. The principle is to move computing and caching resources and analytics services closer to the things where data is generated. However, it is noted that for the cloudlet, mobile edge computing and fog radio access network based solutions [7], [9], computing facilities are provided by third parties at fixed locations, which can be powerful for big data analytics but may not be flexible enough for on-demand deployment when there is a need. Moreover, the wireless access bottleneck problem still exists for the IoT data traffic. Fog computing is gaining increasing research and development momentum but is still at a very early stage. According to [3], [8], end devices such as smart phones and WiFi access points can be used for data analytics when available and needed, but they are expected to take on only very simple time-sensitive data processing tasks. Less time-sensitive analysis and big data analytics are performed in the clouds. The original fog computing model does not solve the large-scale data analytics problems faced by IoT applications. In addition, its network architecture and service model are not clearly specified.
It is noted that within the last several years we witnessed an
explosive growth of mobile smart personal devices (e.g. smart
phones and tablets). These smart personal devices with in-
creasingly available computing and communication resources
can be utilized to form small ad-hoc fogs. On the other hand, the number of small cell base stations and WiFi based home hotspots is also expected to grow fast. Dedicated computing resources can be deployed alongside these small base stations and home hotspots, in addition to the macro cellular base stations, to form dedicated fogs. With properly designed QoS management schemes, these multi-tier fogs can complement each other and remote clouds to provide more effective and prompt responses to the fast changing circumstances of smart cities.
In this paper we propose a multi-tier fog computing based large-scale data analytics service for smart city applications. There are three main contributions:
• A multi-tier fog computing framework is proposed, which includes both ad-hoc fogs with distributed opportunistic computing resources and dedicated fogs with specifically deployed computing resources. The fogs can utilize opportunistic and dedicated computing resources to mitigate the problem of huge initial fog infrastructure investment. Large-scale analytics services can be run over the multi-tier fog computing system with the support of distributed computing engines.
• Analytics benchmarking over multi-tier fogs is run over small Raspberry Pi computers with the Spark computing engine to create workload models of various analytics jobs. In existing offloading schemes the workload of computing jobs was usually represented by instructions per second (e.g., [6]). By contrast, in this paper easy-to-use job level workload models are created and utilized in the design of practical QoS aware management schemes.
• QoS aware service and resource management schemes are designed and developed for admission control, offloading and resource allocation, to provide real-time analytics services to smart city applications and improve utility for fog computing operators. The network bandwidth and latency, communication and computing costs, and computing time are all taken into account to satisfy the QoS constraint of real-time job completion and improve computing utility for the QoS aware analytics services (a generic formulation is sketched after this list). To the best of our knowledge, QoS issues for mobile edge computing and fog computing have rarely been touched in the literature.
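
As a rough illustration of the utility objective referred to above, one generic formulation (a sketch only; the symbols below are ours and are not taken from the paper's Section IV) is

\max_{\{a_j\},\{r_j\}} \; \sum_{j} a_j \left( u_j - c^{\mathrm{comm}}_j - c^{\mathrm{comp}}_j(r_j) \right)
\quad \text{s.t.} \quad a_j \left( T^{\mathrm{comm}}_j + T^{\mathrm{comp}}_j(r_j) \right) \le D_j \;\; \forall j, \qquad \sum_j a_j r_j \le R, \qquad a_j \in \{0,1\},

where a_j indicates whether analytics job j is admitted, u_j is its service revenue, c^{comm}_j and c^{comp}_j are the communication and computing costs, T^{comm}_j and T^{comp}_j are the data transfer and computation times under the allocated resources r_j, D_j is the real-time completion deadline, and R is the resource capacity of the fog.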
In the rest of the paper, we present a framework of the multi-tier fog computing system for smart city applications and the large-scale analytics service model in Section II. In Section III, the design and results of the benchmarking experiments over both ad-hoc fogs and dedicated fogs are reported. Design of QoS
aware service and resource management schemes is presented
in Section IV. In Section V, analytics services and the QoS
schemes are evaluated and QoS performance is analyzed.
Section VI concludes the paper.
II. MULTI-TIER FOGS MODEL AND SCALABLE ANALYTICS
SERVICE FOR SMART CITIES
A. Proposed Multi-tier Fog Computing Model
In the Cisco fog computing model, fog aggregation nodes
are not clearly defined. For example, it is not clear where these
nodes are located, how much computation power and storage
resources they have, and by whom they may be deployed.
The fog nodes are expected to analyze and act on the large
volume of data generated by thousands of things across a large
geographic area in less than one second [3]. It is not likely
that the small size fog nodes like smart phones and video
cameras can deliver the expected analytics services. However, there is no discussion in the literature on whether dedicated fog nodes can complete such tasks. If fog computing relies on dedicated fog nodes and fog aggregation nodes for fast and reliable data analytics services, then there is little difference between the fog computing and cloudlet models.
On the other hand, the fog nodes in the Cisco fog computing model are not expected to take on complex and advanced data analytics tasks. The majority of IoT data traffic still goes to the traditional data centers for big data analytics, which does not solve the bandwidth and prompt response problems faced by real-time smart city applications. In addition, the connections of IoT devices to the Internet may not exist or may have very limited network bandwidth, such as in emergency response and anti-terrorism scenarios. Under these conditions, data analytics services for smart city applications may not be effectively delivered through the public clouds.
We propose a new multi-tier fog computing model for
smart city applications, which includes both ad-hoc fogs and
dedicated fogs. Fig. 1 presents the architecture for the multi-
tier fog computing model. In the hierarchical architecture, the
Tier 1 fogs are dedicated fogs, which include the MEC and
fogs supported by the dedicated routers and cellular network
base stations. Tier 2 fogs are ad-hoc fogs, which are formed
by opportunistic devices with computing and networking re-
sources, such as smart phones, laptops and vehicles.
Fog nodes can share unused computing resources to provide
data analytics services for both IoT applications and mobile
applications. They can participate in a hierarchical cloud
computing system, working with traditional remote clouds and
optional cloudlets. They can also work in a stand-alone mode.
With data analytics services from multi-tier fogs, analytics
results are sent to the interested users of the analytics services.
A large volume of IoT data from smart city applications
may not need to be sent to the remote clouds. Therefore the
response latency and bandwidth consumption problems could
be solved.
B. Fog Functional Model
Next we present the functional model for the fog nodes.
Each fog is formed by a cluster of computers with a pool
of computing resources. There are two types of fog nodes,
i.e., fog master and fog worker. The functionalities of these
nodes are illustrated by the functional models shown in Fig. 2.
In the ad-hoc fogs, fog workers are usually interconnected
devices which join the fogs by invitation, e.g., from fog
masters. The fog workers are responsible for sharing their
computing resources, undertaking computing jobs, monitor-
ing and reporting available computing and communication
resources to fog masters etc.
Each fog has at least one fog master. Multiple masters can
be present in one fog for improved reliability. The masters may
physically co-locate with the normal fog workers or be located separately. The masters have the main responsibilities of fog creation, service management and job scheduling.

Fig. 1. A multi-tier fog cloud architecture.
Fig. 2. Functional model for fog nodes.
1) Resource Module: The resource module is at the bottom of the functional models for both the fog master and fog workers. It represents the physical resources of fog nodes, which may include sensing resources, computing resources and communication resources for connection to other fog nodes. It is noteworthy that apart from WiFi, other communication technologies such as cellular radio and visible light communication can also be used for fog node communication.
2) Networking and Virtualization Module: Networking is a critical part of a fog, especially for scenarios where ad-hoc fog nodes are mobile and the wireless link bandwidth is limited. The fog master should keep track of the mobility and network connections of the fog workers, and adaptively allocate computing tasks to the fog workers to maximize the computing QoS.
In fogs, resource virtualization is optional but very important for fog nodes which may have their own heavy computing tasks. With virtualization, part of the computing resources can be reserved for local computing tasks, and fog computing tasks can be run only on the isolated resources, which ensures local computing and security performance. The existing virtualization technologies can be applied, with modifications, for both ad-hoc and dedicated fogs.
3) Fog and Resource Management Module: Fogs can be
formed on demand and managed by fog master nodes. Each
fog has a life cycle of formation, maintenance and release. Fog
workers are responsible for monitoring and reporting computing resources and communication conditions to fog masters.
Fog masters maintain the status of the available computing
resources and communication conditions of the members in the
fogs. Special incentive and reward schemes can be applied by
fog masters to encourage interconnected devices to join fogs
and share their unused computing resources. It is noted that mobility and security can have a large impact on ad-hoc fogs. With the centralized fog and resource management framework, fog worker mobility and security could be handled effectively to achieve a high level of QoS.
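
As a simple illustration (not an interface defined in this paper), a fog worker's periodic status report could be assembled as follows, assuming a library such as psutil is available on the worker; the field names are assumptions.

import time
import psutil  # assumed available on the worker for resource monitoring

def build_status_report(worker_id: str) -> dict:
    # Rough snapshot of what the worker could periodically send to its fog master.
    return {
        "worker_id": worker_id,
        "timestamp": time.time(),
        # crude estimate of currently idle cores from the CPU load
        "free_cores": psutil.cpu_count() * (1 - psutil.cpu_percent() / 100.0),
        "free_mem_mb": psutil.virtual_memory().available / 2**20,
        # None on devices without a battery (e.g. plugged-in Raspberry Pis)
        "battery_pct": getattr(psutil.sensors_battery(), "percent", None),
    }

The master can aggregate such reports to maintain the availability state of computing resources mentioned above.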
4) Job Admission and Scheduling Module: When a com-
puting job request is received (from smart city applications
or other IoT applications), a fog master needs to assess
the computing resources required to complete the job, and
admit or reject the job request according to the available
compute resources. If a job is accepted, it is scheduled to run
over one or more fog workers depending on their available
compute resources and network conditions. The fog master
may communicate with other fog masters to jointly work on
analytics tasks, or make decisions on offloading jobs to other
fogs or remote clouds.
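
As a minimal sketch of this admission logic (not the scheme designed in Section IV; all names and coefficients below are placeholders), a fog master could combine a job-level workload model with the currently free resources as follows.

from dataclasses import dataclass

@dataclass
class Job:
    dataset_mb: float     # input dataset size
    deadline_s: float     # real-time completion deadline
    revenue: float        # utility paid by the service user

@dataclass
class FogState:
    free_cores: int
    free_mem_mb: float
    uplink_mbps: float    # bandwidth towards the data source

def estimate_runtime(job: Job, cores: int) -> float:
    # Job-level workload model of the kind fitted from the benchmarks;
    # the coefficients below are placeholders, not measured values.
    setup_s, s_per_mb = 20.0, 0.5
    return setup_s + s_per_mb * job.dataset_mb / max(cores, 1)

def admit(job: Job, fog: FogState, cores: int, mem_per_core_mb: float = 700.0) -> str:
    if cores > fog.free_cores or cores * mem_per_core_mb > fog.free_mem_mb:
        return "offload"                                  # not enough local resources
    transfer_s = job.dataset_mb * 8.0 / fog.uplink_mbps    # time to move the data in
    if transfer_s + estimate_runtime(job, cores) > job.deadline_s:
        return "offload"                                  # deadline cannot be met locally
    return "admit"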
5) Services Module: There are three standard service mod-
els provided by traditional clouds, namely infrastructure as
a service (IaaS), platform as a service (PaaS), and software as a service (SaaS). If the fog nodes are static and have powerful
computing resources, a large computation resource pool can
be created for fogs and the standard cloud service models
can be offered by the fogs. However, due to the limitations
on the computing power, bandwidth of wireless connections
and mobility of fog nodes, ad-hoc fogs may not be ideal to
provide these standard cloud computing services. We propose
to provide large scale analytics service over fogs. With the
analytics service model the users of IoT applications can
request analytics services from fog masters. The masters
analyze the analytics service request, choose the required ana-
lytics algorithm and computing engine, and assess the service
requirements on computing and communication resources. If
the service request is admissible, fog member nodes and com-
puting resources are scheduled to provide the service. Multiple
fog member nodes may work collectively with distributed
computing engines to provide advanced analytics if needed.
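
For illustration only, an analytics service request and the master's triage step could be represented as follows; the field names and the mapping to computing engines are assumptions, not an interface defined in this paper.

from dataclasses import dataclass

@dataclass
class AnalyticsRequest:
    algorithm: str        # e.g. "LR" or "SVM"
    dataset_uri: str      # where the IoT data resides
    dataset_mb: float
    deadline_s: float

# hypothetical mapping from requested analytics to a computing engine/library
SUPPORTED = {
    "LR": "spark-mllib-logistic-regression",
    "SVM": "spark-mllib-svm",
}

def triage(req: AnalyticsRequest) -> str:
    if req.algorithm not in SUPPORTED:
        return "reject: unsupported analytics algorithm"
    engine = SUPPORTED[req.algorithm]
    # next step: assess resource requirements and hand over to admission control
    return f"schedule {req.algorithm} on {engine}"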
III. BENCHMARKING EXPERIMENTS FOR ANALYTICS
APPLICATIONS OVER FOGS
As the multi-tier fog and cloud systems will undertake
various analytics services from smart city applications with
diverse QoS requirements, it is important to design and implement QoS aware service and resource management schemes for the analytics services. However, in order to do so, a key
is to measure and model the workloads of different analytics
services over the fogs with various computing and commu-
nication resources. In this section we present benchmarking
experiments over ad-hoc fogs (A-Fogs) and dedicated fogs
(D-Fogs), to provide a basis for the QoS scheme design.
A. Overall Experiment Methodology
For the benchmarking of analytics systems there are three major dimensions of diversity that need to be considered [10]: the analytics computing platform, the analytics algorithm and the analytics job dataset. In our previous work, we performed intensive benchmarking of analytics systems over both private and public clouds, with various computing platforms and analytics algorithms. It was found that Spark and GraphLab outperform the other computing platforms [11]–[13]. As GraphLab has not been
updated for a long time, we use Spark as the only computing
platform for benchmarking. The overall benchmarking exper-
iment framework is shown in Fig. 3.
For the analytics algorithms, we present experiments with logistic regression (LR) and support vector machine (SVM) for demonstration purposes. LR and SVM are two typical machine learning algorithms for classification applications, which identify the category an object belongs to. It is noted that we have tested fog computing performance with various analytics algorithms as done in [10]. It is trivial to include more analytics algorithms and computing platforms in the benchmarking experiments.
Fig. 3. Overall benchmarking framework.
B. Computing System Setup for A-Fog Benchmarking
In the benchmarking with A-Fogs, we consider a pool of computing resources with one desktop PC and 8 Raspberry Pi 3 credit-card-sized micro computers, which form an A-Fog environment [14]. The Raspberry Pis are connected to a WiFi ad hoc network through their built-in 802.11 wireless module. One of the computers acts as fog master, while the rest act as fog workers. Virtual machines are installed on the computers and each is allocated 700 MB RAM. Spark version 2.0 (the latest at the time of the experiments) is installed in the virtual machines. Analytics job
requests are sent from one of the fog nodes to the master
node, which dispatches the jobs to the fog member nodes. Job
completion time and resource consumption of the analytics
jobs over Spark are recorded and used in the QoS aware
resource management.
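
For illustration, an analytics job could be pointed at such an A-Fog Spark master with the per-node memory budget above roughly as follows; the host name and core limits are placeholders, not our exact configuration.

from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("afog-benchmark")
        .setMaster("spark://fog-master.local:7077")  # placeholder standalone master URL
        .set("spark.executor.memory", "700m")         # matches the per-VM RAM allocation
        .set("spark.executor.cores", "1")             # one core per Pi executor (illustrative)
        .set("spark.cores.max", "8"))                  # illustrative cap at the fog workers

sc = SparkContext(conf=conf)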
Raspberry Pi is a single-board computer with a 1.2 GHz
64-bit quad-core ARMv8 CPU and 1 GB RAM [14]. A 32
GB micro SD card with operating system Raspbian installed
is slotted into each machine. The Raspberry Pi has the features of low cost, low power consumption and small size, but still offers good computing power. It has been used for many cost-effective entertainment, surveillance, mobile and IoT applications. In addition, the computing power and storage of the Raspberry Pi are similar to those of A-Fog nodes such as smart phones and tablets, but with a more user-friendly programming environment. Therefore the Raspberry Pi is selected for our benchmarking instead of laptops.
Spark is an open-source, fast and general distributed computing engine for large-scale data analytics [11]. It is very
popular for big data analytics with significant performance
enhancement over Hadoop [12], [13]. Spark has been evaluated
and mainly used in large centrally controlled computer clus-
ters. To the best of our knowledge, it has not been tested and
evaluated in highly resource (computing and communications)
constrained distributed computing environments.
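
As an illustration of the kind of benchmark job submitted and timed in our experiments, a minimal sketch using the public Spark MLlib (RDD-based) API is shown below; the dataset URI and iteration count are placeholders rather than our exact settings.

import time
from pyspark import SparkContext
from pyspark.mllib.util import MLUtils
from pyspark.mllib.classification import LogisticRegressionWithLBFGS, SVMWithSGD

sc = SparkContext(appName="fog-analytics-benchmark")
# placeholder URI; the fog can load from local or distributed storage
data = MLUtils.loadLibSVMFile(sc, "hdfs:///datasets/DS-A-1").cache()

for name, train in (("LR", LogisticRegressionWithLBFGS.train),
                    ("SVM", SVMWithSGD.train)):
    start = time.time()
    model = train(data, iterations=100)          # train the classifier on the fog workers
    print(f"{name} job completion time: {time.time() - start:.1f} s")

sc.stop()

The measured job completion times of such runs are the raw material for the workload models used by the QoS schemes.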
Experiment datasets for analytics jobs are generated by the Spark dataset generators. The synthetic datasets are used for performance evaluation in our study mainly for easy control of the dataset size; datasets generated from real IoT applications can be used as well. For the benchmarking over A-Fogs, five datasets with different data sizes are used, with labels DS-A-1, DS-A-2, DS-A-3, DS-A-4 and DS-A-5. The letter A in the labels indicates the A-Fogs. Table I summarizes the datasets used for the LR and SVM algorithms. In this set of experiments the largest file size is 1.77 GB, which is believed to be large enough for A-Fogs mainly consisting of devices with limited computing resources and energy. Larger datasets should be offloaded to and processed by dedicated fogs or remote clouds.
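
Spark's built-in data generators for these algorithms belong to its Scala API; a rough Python stand-in that produces labeled points of a controllable size (the dimensionality, labelling rule and output path are illustrative assumptions, and numpy is assumed on the workers) could look like this.

import numpy as np
from pyspark import SparkContext
from pyspark.mllib.regression import LabeledPoint
from pyspark.mllib.util import MLUtils

n_points, n_features, n_parts = 1_000_000, 10, 32  # illustrative sizes

def gen_partition(part_id):
    # each partition generates its own share of synthetic labeled points
    rng = np.random.RandomState(part_id)
    for _ in range(n_points // n_parts):
        x = rng.randn(n_features)
        yield LabeledPoint(1.0 if x.sum() > 0 else 0.0, x)  # simple separable labels

sc = SparkContext(appName="fog-dataset-generator")
rdd = sc.parallelize(range(n_parts), n_parts).flatMap(gen_partition)
MLUtils.saveAsLibSVMFile(rdd, "file:///tmp/DS-A-1")  # placeholder output path
sc.stop()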
TABLE I
SUMMARY OF DATASETS FOR AD-HOC FOG EXPERIMENTS.

Application  Dataset  Size (MB)  # of vertices (10^6)
LR           DS-A-1   58.3       1
LR           DS-A-2   145.8      2.5
LR           DS-A-3   291.6      5
LR           DS-A-4   583.3      10
LR           DS-A-5   1770       30
SVM          DS-A-1   61.3       1
SVM          DS-A-2   153.3      2.5
SVM          DS-A-3   306.6      5
SVM          DS-A-4   590        10
SVM          DS-A-5   1770       30
C. Benchmark Results with A-Fogs
Benchmarking results with Spark are presented for 14 A-Fog computer settings, as shown in Table II. Note that the letter A in the computer setting labels indicates A-Fogs.
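
As an illustration of how such benchmark measurements can be turned into the easy-to-use job-level workload models mentioned earlier, a simple least-squares fit of completion time against dataset size per core is sketched below; the sample measurements are placeholders, not values from Table II.

import numpy as np

# (dataset size in MB, number of worker cores, measured completion time in seconds)
samples = [(58.3, 4, 45.0), (145.8, 4, 90.0), (291.6, 8, 110.0)]  # placeholder values

A = np.array([[1.0, size / cores] for size, cores, _ in samples])
t = np.array([t_done for _, _, t_done in samples])
coef, *_ = np.linalg.lstsq(A, t, rcond=None)   # fits T ~ coef[0] + coef[1] * size / cores

def predict_completion_time(size_mb: float, cores: int) -> float:
    # job-level workload model usable by the admission control sketch in Section II
    return float(coef[0] + coef[1] * size_mb / cores)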
