
Showing papers on "Cloud computing published in 2020"


Journal ArticleDOI
TL;DR: The concept of Federated Learning (FL) is introduced as an enabling technology for collaborative training of an ML model and for DL-based optimization of large-scale and complex mobile edge networks, where heterogeneous devices with varying constraints are involved.
Abstract: In recent years, mobile devices have been equipped with increasingly advanced sensing and computing capabilities. Coupled with advancements in Deep Learning (DL), this opens up countless possibilities for meaningful applications, e.g., for medical purposes and in vehicular networks. Traditional cloud-based Machine Learning (ML) approaches require the data to be centralized in a cloud server or data center. However, this results in critical issues related to unacceptable latency and communication inefficiency. To this end, Mobile Edge Computing (MEC) has been proposed to bring intelligence closer to the edge, where data is produced. However, conventional enabling technologies for ML at mobile edge networks still require personal data to be shared with external parties, e.g., edge servers. Recently, in light of increasingly stringent data privacy legislation and growing privacy concerns, the concept of Federated Learning (FL) has been introduced. In FL, end devices use their local data to train an ML model required by the server. The end devices then send the model updates rather than raw data to the server for aggregation. FL can serve as an enabling technology in mobile edge networks since it enables the collaborative training of an ML model and also enables DL for mobile edge network optimization. However, in a large-scale and complex mobile edge network, heterogeneous devices with varying constraints are involved. This raises challenges of communication costs, resource allocation, and privacy and security in the implementation of FL at scale. In this survey, we begin with an introduction to the background and fundamentals of FL. Then, we highlight the aforementioned challenges of FL implementation and review existing solutions. Furthermore, we present the applications of FL for mobile edge network optimization. Finally, we discuss the important challenges and future research directions in FL.
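
As a concrete illustration of the aggregation step this abstract describes, the sketch below shows FedAvg-style server-side aggregation: devices send model updates rather than raw data, and the server computes a data-size-weighted average. This is a minimal sketch; all names and structures are illustrative rather than taken from any particular surveyed system.

```python
import numpy as np

def fedavg_aggregate(updates, num_samples):
    """updates: per-device lists of np.ndarray (one array per model layer);
    num_samples: per-device local dataset sizes, used as aggregation weights."""
    total = float(sum(num_samples))
    weights = [n / total for n in num_samples]
    return [sum(w * u[layer] for w, u in zip(weights, updates))
            for layer in range(len(updates[0]))]

# Three devices, a one-layer "model": the device with more data counts more.
updates = [[np.array([1.0, 2.0])], [np.array([3.0, 4.0])], [np.array([5.0, 6.0])]]
print(fedavg_aggregate(updates, num_samples=[10, 30, 60]))  # [array([4., 5.])]
```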

895 citations


Journal ArticleDOI
TL;DR: By consolidating information scattered across the communication, networking, and DL areas, this survey can help readers to understand the connections between enabling technologies while promoting further discussions on the fusion of edge intelligence and intelligent edge, i.e., Edge DL.
Abstract: Ubiquitous sensors and smart devices from factories and communities are generating massive amounts of data, and ever-increasing computing power is driving the core of computation and services from the cloud to the edge of the network. As an important enabler broadly changing people's lives, from face recognition to ambitious smart factories and cities, developments of artificial intelligence (especially deep learning, DL) based applications and services are thriving. However, due to efficiency and latency issues, the current cloud computing service architecture hinders the vision of "providing artificial intelligence for every person and every organization at everywhere". Thus, unleashing DL services using resources at the network edge near the data sources has emerged as a desirable solution. Therefore, edge intelligence, aiming to facilitate the deployment of DL services by edge computing, has received significant attention. In addition, DL, as the representative technique of artificial intelligence, can be integrated into edge computing frameworks to build intelligent edge for dynamic, adaptive edge maintenance and management. With regard to mutually beneficial edge intelligence and intelligent edge, this paper introduces and discusses: 1) the application scenarios of both; 2) the practical implementation methods and enabling technologies, namely DL training and inference in the customized edge computing framework; 3) challenges and future trends of more pervasive and fine-grained intelligence. We believe that by consolidating information scattered across the communication, networking, and DL areas, this survey can help readers to understand the connections between enabling technologies while promoting further discussions on the fusion of edge intelligence and intelligent edge, i.e., Edge DL.

611 citations


Journal ArticleDOI
TL;DR: This paper presents the IoT technology from a bird's eye view covering its statistical/architectural trends, use cases, challenges and future prospects, and discusses challenges in the implementation of 5G-IoT due to high data-rates, which require both cloud-based platforms and IoT-device-based edge computing.
Abstract: The Internet of Things (IoT)-centric concepts like augmented reality, high-resolution video streaming, self-driven cars, smart environment, e-health care, etc. have a ubiquitous presence now. These applications require higher data-rates, large bandwidth, increased capacity, low latency and high throughput. In light of these emerging concepts, IoT has revolutionized the world by providing seamless connectivity between heterogeneous networks (HetNets). The eventual aim of IoT is to introduce plug-and-play technology that provides the end-user with ease of operation, remote access control and configurability. This paper presents the IoT technology from a bird's eye view covering its statistical/architectural trends, use cases, challenges and future prospects. The paper also presents a detailed and extensive overview of the emerging 5G-IoT scenario. Fifth Generation (5G) cellular networks provide key enabling technologies for ubiquitous deployment of the IoT technology. These include carrier aggregation, multiple-input multiple-output (MIMO), massive-MIMO (M-MIMO), coordinated multipoint processing (CoMP), device-to-device (D2D) communications, centralized radio access network (CRAN), software-defined wireless sensor networking (SD-WSN), network function virtualization (NFV) and cognitive radios (CRs). This paper presents an exhaustive review of these key enabling technologies and also discusses the new emerging use cases of 5G-IoT driven by the advances in artificial intelligence, machine and deep learning, ongoing 5G initiatives, quality of service (QoS) requirements in 5G and its standardization issues. Finally, the paper discusses challenges in the implementation of 5G-IoT due to high data-rates, which require both cloud-based platforms and IoT-device-based edge computing.

591 citations


Journal ArticleDOI
TL;DR: A novel over-the-air computation based approach for fast global model aggregation that exploits the superposition property of a wireless multiple-access channel, together with a difference-of-convex-functions (DC) representation of the sparse and low-rank function to enhance sparsity and accurately detect the fixed-rank constraint in the procedure of device selection.
Abstract: The stringent requirements for low-latency and privacy of the emerging high-stake applications with intelligent devices such as drones and smart vehicles make cloud computing inapplicable in these scenarios. Instead, edge machine learning becomes increasingly attractive for performing training and inference directly at network edges without sending data to a centralized data center. This stimulates a nascent field termed federated learning, for training a machine learning model on computation-, storage-, energy- and bandwidth-limited mobile devices in a distributed manner. To preserve data privacy and address the issues of unbalanced and non-IID data points across different devices, the federated averaging algorithm has been proposed for global model aggregation by computing the weighted average of the locally updated models at the selected devices. However, the limited communication bandwidth becomes the main bottleneck for aggregating the locally computed updates. We thus propose a novel over-the-air computation based approach for fast global model aggregation that exploits the superposition property of a wireless multiple-access channel. This is achieved by joint device selection and beamforming design, which is modeled as a sparse and low-rank optimization problem to support efficient algorithm design. To achieve this goal, we provide a difference-of-convex-functions (DC) representation for the sparse and low-rank function to enhance sparsity and accurately detect the fixed-rank constraint in the procedure of device selection. A DC algorithm is further developed to solve the resulting DC program with global convergence guarantees. The algorithmic advantages and admirable performance of the proposed methodologies are demonstrated through extensive numerical results.
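
The federated averaging update referred to above can be written compactly; the notation here is ours, following common convention, with S_t the devices selected in round t, n_k device k's local sample count, and w_{t+1}^{(k)} its locally updated model:

```latex
w_{t+1} \;=\; \sum_{k \in \mathcal{S}_t} \frac{n_k}{\sum_{j \in \mathcal{S}_t} n_j} \, w_{t+1}^{(k)}
```

Over-the-air computation exploits the fact that a multiple-access channel naturally adds simultaneously transmitted signals, so this weighted sum can be formed in the air rather than by decoding each device's update separately.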

579 citations


Journal ArticleDOI
Yiqiang Chen, Xin Qin, Jindong Wang, Chaohui Yu, Wen Gao
TL;DR: FedHealth is proposed, the first federated transfer learning framework for wearable healthcare that performs data aggregation through federated learning, and then builds relatively personalized models by transfer learning.
Abstract: With the rapid development of computing technology, wearable devices make it easy to access people's health information. Smart healthcare achieves great success by training machine learning models on a large quantity of user personal data. However, there are two critical challenges. First, user data often exist in the form of isolated islands, making it difficult to perform aggregation without compromising privacy. Second, models trained on the cloud fail at personalization. In this article, we propose FedHealth, the first federated transfer learning framework for wearable healthcare, to tackle these challenges. FedHealth performs data aggregation through federated learning, and then builds relatively personalized models by transfer learning. Wearable activity recognition experiments and a real Parkinson's disease auxiliary diagnosis application demonstrate that FedHealth achieves accurate and personalized healthcare without compromising privacy and security. FedHealth is general and extensible to many healthcare applications.
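
The two-stage recipe described above (federated aggregation, then per-user transfer learning) can be sketched as follows: freeze the global model's feature extractor and fine-tune only the classifier head on a user's local data. This is an illustrative sketch; the model structure and hyperparameters are assumptions, not FedHealth's actual architecture.

```python
import torch
import torch.nn as nn

def personalize(global_model: nn.Sequential, local_loader, epochs=5):
    """Fine-tune only the last layer of a federated global model locally."""
    for p in global_model.parameters():
        p.requires_grad = False          # freeze the shared feature extractor
    head = global_model[-1]              # assumed: last module is the classifier
    for p in head.parameters():
        p.requires_grad = True
    opt = torch.optim.SGD(head.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in local_loader:        # the user's data never leaves the device
            opt.zero_grad()
            loss_fn(global_model(x), y).backward()
            opt.step()
    return global_model
```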

486 citations


Journal ArticleDOI
TL;DR: This paper provides an important guiding source for researchers and engineers studying the smart grid, helping transmission and distribution system operators follow the right path as they transform their classical grids into smart grids.

472 citations


Journal ArticleDOI
TL;DR: A comprehensive review of, and updated solutions related to, 5G network slicing using SDN and NFV are provided, along with a discussion of various open-source orchestrators and proofs of concept representing industrial contributions.

458 citations


Journal ArticleDOI
TL;DR: The purpose of this paper is to identify and discuss the main issues involved in the complex process of IoT-based investigations, particularly all legal, privacy and cloud security challenges, as well as some promising cross-cutting data reduction and forensics intelligence techniques.
Abstract: Today is the era of the Internet of Things (IoT). The recent advances in hardware and information technology have accelerated the deployment of billions of interconnected, smart and adaptive devices in critical infrastructures like health, transportation, environmental control, and home automation. Transferring data over a network without requiring any kind of human-to-computer or human-to-human interaction brings reliability and convenience to consumers, but also opens a new world of opportunity for intruders, and introduces a whole set of unique and complicated questions to the field of Digital Forensics. Although IoT data could be a rich source of evidence, forensics professionals cope with diverse problems, starting from the huge variety of IoT devices and non-standard formats, to the multi-tenant cloud infrastructure and the resulting multi-jurisdictional litigations. A further challenge is end-to-end encryption, which represents a trade-off between users' right to privacy and the success of the forensics investigation. Due to its volatile nature, digital evidence has to be acquired and analyzed using validated tools and techniques that ensure the maintenance of the Chain of Custody. Therefore, the purpose of this paper is to identify and discuss the main issues involved in the complex process of IoT-based investigations, particularly all legal, privacy and cloud security challenges. Furthermore, this work provides an overview of the past and current theoretical models in the digital forensics science. Special attention is paid to frameworks that aim to extract data in a privacy-preserving manner or secure the evidence integrity using decentralized blockchain-based solutions. In addition, the present paper addresses the ongoing Forensics-as-a-Service (FaaS) paradigm, as well as some promising cross-cutting data reduction and forensics intelligence techniques. Finally, several other research trends and open issues are presented, with emphasis on the need for proactive Forensics Readiness strategies and generally agreed-upon standards.

440 citations


Proceedings ArticleDOI
07 Jun 2020
TL;DR: In this paper, the authors proposed a client-edge-cloud hierarchical federated learning system, supported with a HierFAVG algorithm that allows multiple edge servers to perform partial model aggregation.
Abstract: Federated Learning is a collaborative machine learning framework to train a deep learning model without accessing clients' private data. Previous works assume one central parameter server either at the cloud or at the edge. The cloud server can access more data but with excessive communication overhead and long latency, while the edge server enjoys more efficient communications with the clients. To combine their advantages, we propose a client-edge-cloud hierarchical Federated Learning system, supported with a HierFAVG algorithm that allows multiple edge servers to perform partial model aggregation. In this way, the model can be trained faster and better communication-computation trade-offs can be achieved. Convergence analysis is provided for HierFAVG and the effects of key parameters are also investigated, which lead to qualitative design guidelines. Empirical experiments verify the analysis and demonstrate the benefits of this hierarchical architecture in different data distribution scenarios. Particularly, it is shown that by introducing the intermediate edge servers, the model training time and the energy consumption of the end devices can be simultaneously reduced compared to cloud-based Federated Learning.
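
The control flow of the client-edge-cloud pattern described above can be sketched schematically: each edge server averages its own clients' models ("partial aggregation"), and the cloud then averages the edge models. This is a schematic only; local_train is an assumed callback, and the synchronization intervals follow the paper's structure only loosely.

```python
def weighted_average(models, weights):
    """models: lists of numpy arrays (one per layer); weights: sample counts."""
    total = float(sum(weights))
    return [sum((w / total) * m[i] for w, m in zip(weights, models))
            for i in range(len(models[0]))]

def hierfavg_round(edges, cloud_model, local_train, k1=5):
    """edges: list of edge servers, each a list of (client_data, n_samples).
    Clients run k1 local steps; edges aggregate their own clients, then the
    cloud aggregates the edge models."""
    edge_models, edge_weights = [], []
    for clients in edges:
        client_models = [local_train(cloud_model, data, steps=k1)
                         for data, _ in clients]
        sizes = [n for _, n in clients]
        edge_models.append(weighted_average(client_models, sizes))  # edge level
        edge_weights.append(sum(sizes))
    return weighted_average(edge_models, edge_weights)              # cloud level
```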

433 citations


Journal ArticleDOI
TL;DR: By selectively analyzing the literature, this paper systematically surveys how the adoption of the above-mentioned Industry 4.0 technologies (and their integration) applied to the health domain is changing the way traditional services and products are provided.

431 citations


Journal ArticleDOI
TL;DR: The other major technology transformations that are likely to define 6G are discussed: cognitive spectrum sharing methods and new spectrum bands; the integration of localization and sensing capabilities into the system definition; the achievement of extreme performance requirements on latency and reliability; new network architecture paradigms involving sub-networks and RAN-Core convergence; and new security and privacy schemes.
Abstract: The focus of wireless research is increasingly shifting toward 6G as 5G deployments get underway. At this juncture, it is essential to establish a vision of future communications to provide guidance for that research. In this paper, we attempt to paint a broad picture of communication needs and technologies in the timeframe of 6G. The future of connectivity is in the creation of digital twin worlds that are a true representation of the physical and biological worlds at every spatial and time instant, unifying our experience across these physical, biological and digital worlds. New themes are likely to emerge that will shape 6G system requirements and technologies, such as: (i) new man-machine interfaces created by a collection of multiple local devices acting in unison; (ii) ubiquitous universal computing distributed among multiple local devices and the cloud; (iii) multi-sensory data fusion to create multi-verse maps and new mixed-reality experiences; and (iv) precision sensing and actuation to control the physical world. Artificial intelligence, advancing rapidly, has the potential to become the foundation for the 6G air interface and network, making data, compute and energy the new resources to be exploited for achieving superior performance. In addition, in this paper we discuss the other major technology transformations that are likely to define 6G: (i) cognitive spectrum sharing methods and new spectrum bands; (ii) the integration of localization and sensing capabilities into the system definition; (iii) the achievement of extreme performance requirements on latency and reliability; (iv) new network architecture paradigms involving sub-networks and RAN-Core convergence; and (v) new security and privacy schemes.

Journal ArticleDOI
TL;DR: In this article, the authors provide a comprehensive overview of mobile edge computing (MEC) and its potential use cases and applications, as well as discuss challenges and potential future directions for MEC research.
Abstract: Driven by the emergence of new compute-intensive applications and the vision of the Internet of Things (IoT), it is foreseen that the emerging 5G network will face an unprecedented increase in traffic volume and computation demands. However, end users mostly have limited storage capacities and finite processing capabilities, so how to run compute-intensive applications on resource-constrained user devices has recently become a natural concern. Mobile edge computing (MEC), a key technology in the emerging fifth generation (5G) network, can optimize mobile resources by hosting compute-intensive applications, process large data before sending it to the cloud, provide cloud-computing capabilities within the radio access network (RAN) in close proximity to mobile users, and offer context-aware services with the help of RAN information. Therefore, MEC enables a wide variety of applications where real-time response is strictly required, e.g., driverless vehicles, augmented reality, robotics, and immersive media. Indeed, the paradigm shift from 4G to 5G could become a reality with the advent of new technological concepts. The successful realization of MEC in the 5G network is still in its infancy and demands constant effort from both academic and industry communities. In this survey, we first provide a holistic overview of MEC technology and its potential use cases and applications. Then, we outline up-to-date research on the integration of MEC with the new technologies that will be deployed in 5G and beyond. We also summarize testbeds and experimental evaluations, and open source activities, for edge computing. We further summarize lessons learned from state-of-the-art research works as well as discuss challenges and potential future directions for MEC research.

Journal ArticleDOI
TL;DR: VerifyNet is proposed, the first privacy-preserving and verifiable federated learning framework, in which an adversary cannot deceive users by forging the Proof unless it can solve the NP-hard problem adopted in the model.
Abstract: As an emerging training model with neural networks, federated learning has received widespread attention due to its ability to update parameters without collecting users' raw data. However, since adversaries can track and derive participants' privacy from the shared gradients, federated learning is still exposed to various security and privacy threats. In this paper, we consider two major issues in the training process over deep neural networks (DNNs): 1) how to protect users' privacy (i.e., local gradients) in the training process and 2) how to verify the integrity (or correctness) of the aggregated results returned from the server. To solve the above problems, several approaches focusing on secure or privacy-preserving federated learning have been proposed and applied in diverse scenarios. However, enabling clients to verify whether the cloud server is operating correctly, while guaranteeing users' privacy in the training process, remains an open problem. In this paper, we propose VerifyNet, the first privacy-preserving and verifiable federated learning framework. Specifically, we first propose a double-masking protocol to guarantee the confidentiality of users' local gradients during federated learning. Then, the cloud server is required to provide a Proof of the correctness of its aggregated results to each user. We claim that an adversary cannot deceive users by forging the Proof unless it can solve the NP-hard problem adopted in our model. In addition, VerifyNet also supports users dropping out during the training process. The extensive experiments conducted on real-world data also demonstrate the practical performance of our proposed scheme.
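
To make the masking idea concrete, here is a toy version of pairwise masking, the building block behind double-masking protocols like VerifyNet's: each pair of users shares a seed, user i adds the pairwise mask when i < j and subtracts it when i > j, so all masks cancel in the server's sum. Real protocols additionally add a per-user self-mask and secret-share the seeds to tolerate dropouts; this sketch omits both.

```python
import numpy as np

def masked_update(i, users, seeds, gradient):
    """Return user i's gradient plus pairwise masks that cancel in the sum."""
    masked = gradient.astype(float)
    for j in users:
        if j == i:
            continue
        rng = np.random.default_rng(seeds[frozenset((i, j))])
        mask = rng.normal(size=gradient.shape)
        masked = masked + (mask if i < j else -mask)
    return masked

users = [0, 1, 2]
seeds = {frozenset(p): abs(hash(p)) % (2**32) for p in [(0, 1), (0, 2), (1, 2)]}
grads = {i: np.ones(3) * (i + 1) for i in users}
aggregate = sum(masked_update(i, users, seeds, grads[i]) for i in users)
print(np.round(aggregate, 6))  # [6. 6. 6.]: only the sum of gradients survives
```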

Journal ArticleDOI
TL;DR: A novel framework called HealthFog is proposed for integrating ensemble deep learning into edge computing devices, and it is deployed for a real-life application of automatic heart disease analysis.

Journal ArticleDOI
TL;DR: This study aims to comprehensively explore different aspects of the GEE platform, including its datasets, functions, advantages/limitations, and various applications, and observes that Landsat and Sentinel datasets were extensively utilized by GEE users.
Abstract: Remote sensing (RS) systems have been collecting massive volumes of datasets for decades, the management and analysis of which are not practical using common software packages and desktop computing resources. In this regard, Google has developed a cloud computing platform, called Google Earth Engine (GEE), to effectively address the challenges of big data analysis. In particular, this platform facilitates processing big geo data over large areas and monitoring the environment for long periods of time. Although this platform was launched in 2010 and has proved its high potential for different applications, it was not fully investigated and utilized for RS applications until recent years. Therefore, this study aims to comprehensively explore different aspects of the GEE platform, including its datasets, functions, advantages/limitations, and various applications. For this purpose, 450 journal articles published in 150 journals between January 2010 and May 2020 were studied. It was observed that Landsat and Sentinel datasets were extensively utilized by GEE users. Moreover, supervised machine learning algorithms, such as Random Forest, were more widely applied to image classification tasks. GEE has also been employed in a broad range of applications, such as Land Cover/Land Use classification, hydrology, urban planning, natural disaster, climate analyses, and image processing. It was generally observed that the number of GEE publications has increased significantly during the past few years, and it is expected that GEE will be utilized by more users from different fields to resolve their big data processing challenges.
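
A typical workflow of the kind this review surveys, compositing a Sentinel-2 collection and classifying it with Random Forest, looks roughly like this in the GEE Python API. Asset IDs, band names, and the CLOUDY_PIXEL_PERCENTAGE property follow the public GEE catalog; the training-point table path is a placeholder.

```python
import ee
ee.Initialize()

# Median composite of low-cloud Sentinel-2 surface-reflectance scenes for 2020.
composite = (ee.ImageCollection('COPERNICUS/S2_SR')
             .filterDate('2020-01-01', '2020-12-31')
             .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 10))
             .median())

bands = ['B2', 'B3', 'B4', 'B8']  # blue, green, red, NIR
points = ee.FeatureCollection('users/example/training_points')  # placeholder asset

training = composite.select(bands).sampleRegions(
    collection=points, properties=['landcover'], scale=10)

classifier = ee.Classifier.smileRandomForest(100).train(
    features=training, classProperty='landcover', inputProperties=bands)
classified = composite.select(bands).classify(classifier)
```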

Journal ArticleDOI
TL;DR: In this article, the authors proposed Edgent, a framework that leverages edge computing for DNN collaborative inference through device-edge synergy, which adaptively partitions computation between device and edge for purpose of coordinating the powerful cloud resource and the proximal edge resource for real-time DNN inference.
Abstract: As a key technology enabling Artificial Intelligence (AI) applications in the 5G era, Deep Neural Networks (DNNs) have quickly attracted widespread attention. However, it is challenging to run computation-intensive DNN-based tasks on mobile devices due to limited computation resources. What's worse, traditional cloud-assisted DNN inference is heavily hindered by significant wide-area network latency, leading to poor real-time performance as well as low quality of user experience. To address these challenges, in this paper we propose Edgent, a framework that leverages edge computing for DNN collaborative inference through device-edge synergy. Edgent exploits two design knobs: (1) DNN partitioning, which adaptively partitions computation between device and edge in order to coordinate the powerful cloud resource and the proximal edge resource for real-time DNN inference; (2) DNN right-sizing, which further reduces computing latency via early exiting inference at an appropriate intermediate DNN layer. In addition, considering the potential network fluctuation in real-world deployment, Edgent is designed to specialize for both static and dynamic network environments. Specifically, in a static environment where the bandwidth changes slowly, Edgent derives the best configurations with the assistance of regression-based prediction models, while in a dynamic environment where the bandwidth varies dramatically, Edgent generates the best execution plan through an online change point detection algorithm that maps the current bandwidth state to the optimal configuration. We implement the Edgent prototype on a Raspberry Pi and a desktop PC, and extensive experimental evaluations demonstrate Edgent's effectiveness in enabling on-demand low-latency edge intelligence.
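
The partition decision at the heart of Edgent can be sketched as a simple search over split points: run the first layers on the device, upload the intermediate tensor, and run the rest on the edge. In the paper the per-layer latencies come from regression-based prediction models; here they are given arrays, and all inputs are assumptions for illustration.

```python
def best_partition(device_lat, edge_lat, tensor_bytes, bandwidth):
    """device_lat/edge_lat: per-layer latencies in seconds; tensor_bytes[i]:
    size of the tensor entering layer i (tensor_bytes[0] is the raw input,
    tensor_bytes[n] the final output -- use 0 if results stay on the device);
    bandwidth: uplink bytes/s. Layers [0, split) run on-device, the rest on
    the edge. Returns (total_latency, split)."""
    n = len(device_lat)
    return min(
        (sum(device_lat[:split])              # on-device compute
         + tensor_bytes[split] / bandwidth    # upload the intermediate tensor
         + sum(edge_lat[split:]),             # edge-side compute
         split)
        for split in range(n + 1)
    )
```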

Posted Content
TL;DR: This work presents a systematic learning-theoretic study of personalization, and proposes and analyzes three approaches: user clustering, data interpolation, and model interpolation.
Abstract: The standard objective in machine learning is to train a single model for all users. However, in many learning scenarios, such as cloud computing and federated learning, it is possible to learn a personalized model per user. In this work, we present a systematic learning-theoretic study of personalization. We propose and analyze three approaches: user clustering, data interpolation, and model interpolation. For all three approaches, we provide learning-theoretic guarantees and efficient algorithms for which we also demonstrate the performance empirically. All of our algorithms are model-agnostic and work for any hypothesis class.
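
Of the three approaches, model interpolation is the easiest to state concretely: blend a shared global predictor with a per-user local one. The convex-combination form below is a common instantiation and only a sketch; the paper's formulation and guarantees are more general.

```python
def interpolated_model(h_global, h_local, lam):
    """lam in [0, 1]: lam = 0 recovers the global model, lam = 1 the purely
    local one; intermediate values trade bias against per-user variance."""
    return lambda x: lam * h_local(x) + (1 - lam) * h_global(x)
```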

Journal ArticleDOI
TL;DR: Some typical application scenarios of edge computing in IIoT, such as prognostics and health management, smart grids, manufacturing coordination, intelligent connected vehicles (ICV), and smart logistics, are introduced.
Abstract: The Industrial Internet of Things (IIoT) is a crucial research field spawned by the Internet of Things (IoT). IIoT links all types of industrial equipment through the network; establishes data acquisition, exchange, and analysis systems; and optimizes processes and services, so as to reduce cost and enhance productivity. The introduction of edge computing in IIoT can significantly reduce the decision-making latency, save bandwidth resources, and to some extent, protect privacy. This paper outlines the research progress concerning edge computing in IIoT. First, the concepts of IIoT and edge computing are discussed, and subsequently, the research progress of edge computing is discussed and summarized in detail. Next, the future architecture from the perspective of edge computing in IIoT is proposed, and its technical progress in routing, task scheduling, data storage and analytics, security, and standardization is analyzed. Furthermore, we discuss the opportunities and challenges of edge computing in IIoT in terms of 5G-based edge communication, load balancing and data offloading, edge intelligence, as well as data sharing security. Finally, we introduce some typical application scenarios of edge computing in IIoT, such as prognostics and health management (PHM), smart grids, manufacturing coordination, intelligent connected vehicles (ICV), and smart logistics.

Journal ArticleDOI
TL;DR: This article analyzes the main features of MEC in the context of 5G and IoT and presents several fundamental key technologies which enable MEC to be applied in 5Gs and IoT, such as cloud computing, software-defined networking/network function virtualization, information-centric networks, virtual machine (VM) and containers, smart devices, network slicing, and computation offloading.
Abstract: To satisfy the increasing demand of mobile data traffic and meet the stringent requirements of the emerging Internet-of-Things (IoT) applications such as smart city, healthcare, and augmented/virtual reality (AR/VR), the fifth-generation (5G) enabling technologies are proposed and utilized in networks. As an emerging key technology of 5G and a key enabler of IoT, multiaccess edge computing (MEC), which integrates telecommunication and IT services, offers cloud computing capabilities at the edge of the radio access network (RAN). By providing computational and storage resources at the edge, MEC can reduce latency for end users. Hence, this article investigates MEC for 5G and IoT comprehensively. It analyzes the main features of MEC in the context of 5G and IoT and presents several fundamental key technologies which enable MEC to be applied in 5G and IoT, such as cloud computing, software-defined networking/network function virtualization, information-centric networks, virtual machine (VM) and containers, smart devices, network slicing, and computation offloading. In addition, this article provides an overview of the role of MEC in 5G and IoT, bringing light into the different MEC-enabled 5G and IoT applications as well as the promising future directions of integrating MEC with 5G and IoT. Moreover, this article further elaborates research challenges and open issues of MEC for 5G and IoT. Last but not least, we propose a use case that utilizes MEC to achieve edge intelligence in IoT scenarios.

Journal ArticleDOI
TL;DR: A Blockchain-enabled Intelligent IoT Architecture with Artificial Intelligence is proposed, providing an efficient way of converging blockchain and AI for IoT with current state-of-the-art techniques and applications.

Journal ArticleDOI
01 Sep 2020
TL;DR: An ML-based improved model has been applied to predict the potential threat of COVID-19 in countries worldwide and it is shown that using iterative weighting for fitting Generalized Inverse Weibull distribution, a better fit can be obtained to develop a prediction framework.
Abstract: The outbreak of the COVID-19 Coronavirus, namely SARS-CoV-2, has created a calamitous situation throughout the world. The cumulative incidence of COVID-19 is rapidly increasing day by day. Machine Learning (ML) and Cloud Computing can be deployed very effectively to track the disease, predict the growth of the epidemic, and design strategies and policies to manage its spread. This study applies an improved mathematical model to analyse and predict the growth of the epidemic. An ML-based improved model has been applied to predict the potential threat of COVID-19 in countries worldwide. We show that using iterative weighting for fitting the Generalized Inverse Weibull distribution, a better fit can be obtained to develop a prediction framework. This has been deployed on a cloud computing platform for more accurate and real-time prediction of the growth behavior of the epidemic. A data-driven approach with higher accuracy, as presented here, can be very useful for a proactive response from governments and citizens. Finally, we propose a set of research opportunities and set up grounds for further practical applications.
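
The fitting step can be sketched as iteratively reweighted least squares over a generalized inverse Weibull growth curve. The parameterization and the residual-based reweighting below are one common choice, assumed for illustration rather than taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def giw_curve(t, scale, alpha, beta, gamma):
    """A generalized inverse Weibull CDF scaled to the epidemic's final size."""
    return scale * np.exp(-alpha * (beta / t) ** gamma)

def fit_growth(days, cases, iters=5):
    """days: day indices (> 0); cases: cumulative case counts."""
    params = [cases.max() * 2.0, 1.0, days.mean(), 1.0]
    sigma = np.ones_like(cases, dtype=float)
    for _ in range(iters):
        params, _ = curve_fit(giw_curve, days, cases, p0=params,
                              sigma=sigma, maxfev=20000)
        resid = np.abs(cases - giw_curve(days, *params)) + 1e-9
        sigma = resid  # reweight: poorly fit points get less weight next pass
    return params
```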

Journal ArticleDOI
01 Feb 2020
TL;DR: Key design issues, methodologies, and hardware platforms are introduced, and typical use cases for intelligent vehicles are illustrated, including edge-assisted perception, mapping, and localization for intelligent IoV.
Abstract: The Internet of Vehicles (IoV) is an emerging paradigm that is driven by recent advancements in vehicular communications and networking. Meanwhile, the capability and intelligence of vehicles are being rapidly enhanced, and this will have the potential of supporting a plethora of new exciting applications that will integrate fully autonomous vehicles, the Internet of Things (IoT), and the environment. These trends will bring about an era of intelligent IoV, which will heavily depend on communications, computing, and data analytics technologies. To store and process the massive amount of data generated by intelligent IoV, onboard processing and cloud computing will not be sufficient due to resource/power constraints and communication overhead/latency, respectively. By deploying storage and computing resources at the wireless network edge, e.g., radio access points, the edge information system (EIS), including edge caching, edge computing, and edge AI, will play a key role in the future intelligent IoV. EIS will provide not only low-latency content delivery and computation services but also localized data acquisition, aggregation, and processing. This article surveys the latest development in EIS for intelligent IoV. Key design issues, methodologies, and hardware platforms are introduced. In particular, typical use cases for intelligent vehicles are illustrated, including edge-assisted perception, mapping, and localization. In addition, various open-research problems are identified.

Journal ArticleDOI
TL;DR: This paper summarizes the concept of edge computing and compares it with cloud computing, then surveys the architecture of edge computing, its key technologies, security and privacy protection, and its applications.
Abstract: With the rapid development of the Internet of Everything (IoE), the number of smart devices connected to the Internet is increasing, resulting in large-scale data, which has caused problems such as bandwidth load, slow response speed, poor security, and poor privacy in traditional cloud computing models. Traditional cloud computing is no longer sufficient to support the diverse needs of today's intelligent society for data processing, so edge computing technologies have emerged. Edge computing is a new computing paradigm for performing calculations at the edge of the network. Unlike cloud computing, it emphasizes processing closer to the user and closer to the source of the data, performing lightweight, local, small-scale data storage and processing at the edge of the network. This article mainly reviews the related research and results of edge computing. First, it summarizes the concept of edge computing and compares it with cloud computing. It then summarizes the architecture of edge computing, its key technologies, and security and privacy protection, and finally surveys the applications of edge computing.

Journal ArticleDOI
TL;DR: A cloud battery management system for battery systems that improves computational power and data storage capability through cloud computing, with a state-of-health estimation algorithm based on particle swarm optimization exploited to monitor both capacity fade and power fade of the battery during aging.
Abstract: Battery management is critical to enhancing the safety, reliability, and performance of battery systems. This paper presents a cloud battery management system for battery systems that improves computational power and data storage capability through cloud computing. With the Internet of Things, all battery-relevant data are measured and transmitted to the cloud seamlessly, building up a digital twin of the battery system, where battery diagnostic algorithms evaluate the data and open a window into the battery's charge and aging level. The application of equivalent circuit models in the digital twin for battery systems is explored with the development of cloud-suited state-of-charge and state-of-health estimation approaches. The proposed state-of-charge estimation with an adaptive extended H-infinity filter is robust and accurate for both lithium-ion and lead-acid batteries, even with a significant initialization error. Furthermore, a state-of-health estimation algorithm with particle swarm optimization is innovatively exploited to monitor both capacity fade and power fade of the battery during aging. The functionality and stability of both the hardware and software of the cloud battery management system are validated with prototypes under field operation and experimental validation for both stationary and mobile applications.
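
The model that sits inside such cloud-side estimators can be sketched in a few lines: coulomb counting propagates state of charge, and a first-order RC equivalent circuit predicts terminal voltage for the filter's measurement update. The H-infinity filtering itself is omitted, and all parameters and the OCV curve are assumptions.

```python
import math

def soc_step(soc, current_a, dt_s, capacity_ah):
    """Coulomb counting: positive (discharge) current lowers state of charge."""
    return soc - current_a * dt_s / (capacity_ah * 3600.0)

def terminal_voltage(soc, current_a, v_rc, dt_s, r0, r1, c1, ocv):
    """First-order RC equivalent circuit: v_rc relaxes toward i*R1 with time
    constant R1*C1; ocv is an assumed open-circuit-voltage-vs-SoC function."""
    decay = math.exp(-dt_s / (r1 * c1))
    v_rc = v_rc * decay + r1 * (1.0 - decay) * current_a
    return ocv(soc) - v_rc - r0 * current_a, v_rc
```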

Journal ArticleDOI
TL;DR: This work proposes a new layered IoT model, generic and stretched, with identification of the privacy and security components and layers, and implements security certificates to allow data transfer between the layers of the proposed cloud/edge-enabled IoT model.
Abstract: Privacy and security are among the significant challenges of the Internet of Things (IoT). Improper device updates, lack of efficient and robust security protocols, user unawareness, and famous active device monitoring are among the challenges that IoT is facing. In this work, we explore the background of IoT systems and security measures, and identify (a) different security and privacy issues, (b) approaches used to secure the components of IoT-based environments and systems, (c) existing security solutions, and (d) the best privacy models necessary and suitable for different layers of IoT-driven applications. In this work, we propose a new layered IoT model: generic and stretched, with identification of the privacy and security components and layers. The proposed cloud/edge-supported IoT system is implemented and evaluated. The lower layer, represented by the IoT nodes, is generated from Amazon Web Services (AWS) as virtual machines. The middle layer (edge) is implemented as a Raspberry Pi 4 hardware kit with support of the Greengrass Edge Environment in AWS. We use the cloud-enabled IoT environment in AWS to implement the top layer (the cloud). The security protocols and critical management sessions are established between each of these layers to ensure the privacy of the users' information. We implement security certificates to allow data transfer between the layers of the proposed cloud/edge-enabled IoT model. Not only does the proposed system model eliminate possible security vulnerabilities, it can also be used along with the best security techniques to countermeasure the cybersecurity threats facing each one of the layers: cloud, edge, and IoT.
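
The certificate-secured transfer between layers can be illustrated with a mutual-TLS MQTT publish to AWS IoT Core, the standard pattern for Greengrass/IoT Core deployments like the one described. The endpoint, file paths, and topic below are placeholders, and the client API shown is paho-mqtt 1.x.

```python
import json
import ssl
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="edge-node-01")          # placeholder client ID
client.tls_set(ca_certs="AmazonRootCA1.pem",            # CA certificate
               certfile="device.pem.crt",               # device certificate
               keyfile="private.pem.key",               # device private key
               tls_version=ssl.PROTOCOL_TLSv1_2)
client.connect("abc123-ats.iot.us-east-1.amazonaws.com", port=8883)  # placeholder
client.loop_start()
client.publish("sensors/node01/telemetry", json.dumps({"temp_c": 21.4}), qos=1)
```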

Journal ArticleDOI
TL;DR: A state-of-the-art survey on the integration of blockchain with 5G networks and beyond, with discussions of the potential of blockchain for enabling key 5G technologies, including cloud/edge computing, Software Defined Networks, Network Function Virtualization, Network Slicing, and D2D communications.

Journal ArticleDOI
TL;DR: This work proposes EUAGame, a game-theoretic approach that formulates the EUA problem as a potential game and designs a novel decentralized algorithm for finding a Nash equilibrium in the game as a solution to the EUA problem.
Abstract: Edge Computing provides mobile and Internet-of-Things (IoT) app vendors with a new distributed computing paradigm which allows an app vendor to deploy its app at hired edge servers distributed near app users at the edge of the cloud. This way, app users can be allocated to hired edge servers nearby to minimize network latency and energy consumption. A cost-effective edge user allocation (EUA) requires maximum app users to be served with minimum overall system cost. Finding a centralized optimal solution to this EUA problem is NP-hard. Thus, we propose EUAGame, a game-theoretic approach that formulates the EUA problem as a potential game. We analyze the game and show that it admits a Nash equilibrium. Then, we design a novel decentralized algorithm for finding a Nash equilibrium in the game as a solution to the EUA problem. The performance of this algorithm is theoretically analyzed and experimentally evaluated. The results show that the EUA problem can be solved effectively and efficiently.
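
The decentralized equilibrium search can be illustrated with best-response dynamics, the textbook procedure for finite potential games: one user at a time switches to its cheapest server, the potential strictly decreases with every switch, so the process terminates at a Nash equilibrium. The cost function here is a placeholder, not the paper's exact utility or algorithm.

```python
def best_response_dynamics(users, servers, cost):
    """cost(user, server, allocation) -> float; allocation maps user -> server."""
    allocation = {u: servers[0] for u in users}
    changed = True
    while changed:
        changed = False
        for u in users:
            best = min(servers, key=lambda s: cost(u, s, allocation))
            if best != allocation[u]:
                allocation[u] = best   # potential strictly decreases, so halts
                changed = True
    return allocation
```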

Journal ArticleDOI
TL;DR: This article proposes a web attack detection system that takes advantage of URL analysis, is deployed on edge devices, and is competitive in detecting web attacks.
Abstract: With the development of Internet of Things (IoT) and cloud technologies, numerous IoT devices and sensors transmit huge amounts of data to cloud data centers for further processing. While providing considerable convenience, cloud-based computing and storage also bring many security problems, such as the abuse of information collection and concentrated web servers in the cloud. Traditional intrusion detection systems and web application firewalls are becoming incompatible with the new network environment, and related systems based on machine learning or deep learning are emerging. However, cloud-IoT systems attract more attacks against web servers, since data centralization carries a more attractive reward. In this article, based on distributed deep learning, we propose a web attack detection system that takes advantage of analyzing URLs. The system is designed to detect web attacks and is deployed on edge devices; the cloud handles the above challenges in the paradigm of the Edge of Things. Multiple concurrent deep models are used to enhance the stability of the system and the convenience of updating. We conducted experiments on the system with two concurrent deep models and compared the system with existing systems on several datasets. The experimental results, with 99.410% accuracy, 98.91% true positive rate (TPR), and 99.55% detection rate of normal requests (DRN), demonstrate that the system is competitive in detecting web attacks.
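
The shape of a URL-analysis pipeline like the one above can be illustrated with a much simpler stand-in: character n-grams plus a linear classifier instead of the paper's concurrent deep models. The URLs and labels are invented toy data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

urls = ["/index.php?id=1", "/search?q=book",
        "/index.php?id=1' OR '1'='1",                       # SQL injection
        "/item?q=<script>alert(1)</script>"]                # XSS
labels = [0, 0, 1, 1]  # 0 = benign, 1 = attack

model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(1, 3)),   # character n-grams
    LogisticRegression(max_iter=1000),
)
model.fit(urls, labels)
print(model.predict(["/index.php?id=2 UNION SELECT password FROM users"]))
```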

Posted Content
TL;DR: Flower is presented, an FL framework that is both agnostic towards heterogeneous client environments and scalable to a large number of clients, including mobile and embedded devices; the paper describes the design goals and implementation considerations of Flower.
Abstract: Federated Learning (FL) has emerged as a promising technique for edge devices to collaboratively learn a shared prediction model while keeping their training data on the device, thereby decoupling the ability to do machine learning from the need to store the data in the cloud. However, FL is difficult to implement and deploy in practice, considering the heterogeneity in mobile devices, e.g., different programming languages, frameworks, and hardware accelerators. Although there are a few frameworks available to simulate FL algorithms (e.g., TensorFlow Federated), they do not support implementing FL workloads on mobile devices. Furthermore, these frameworks are designed to simulate FL in a server environment and hence do not allow experimentation in distributed mobile settings for a large number of clients. In this paper, we present Flower (https://flower.dev/), an FL framework that is both agnostic towards heterogeneous client environments and scales to a large number of clients, including mobile and embedded devices. Flower's abstractions let developers port existing mobile workloads with little overhead, regardless of the programming language or ML framework used, while also allowing researchers flexibility to experiment with novel approaches to advance the state-of-the-art. We describe the design goals and implementation considerations of Flower and show our experiences in evaluating the performance of FL across clients with heterogeneous computational and communication capabilities.
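
The client abstraction the abstract describes looks roughly like this: Flower handles transport and coordination, and the developer wraps an existing local training loop in a NumPyClient. Exact signatures vary across Flower versions, and set_weights / train_one_epoch / test are assumed helpers.

```python
import flwr as fl

class TorchClient(fl.client.NumPyClient):
    def __init__(self, model, train_loader, test_loader):
        self.model = model
        self.train_loader, self.test_loader = train_loader, test_loader

    def get_parameters(self, config):
        return [p.detach().cpu().numpy() for p in self.model.parameters()]

    def fit(self, parameters, config):
        set_weights(self.model, parameters)                 # assumed helper
        train_one_epoch(self.model, self.train_loader)      # assumed helper
        return self.get_parameters(config), len(self.train_loader.dataset), {}

    def evaluate(self, parameters, config):
        set_weights(self.model, parameters)
        loss, acc = test(self.model, self.test_loader)      # assumed helper
        return float(loss), len(self.test_loader.dataset), {"accuracy": acc}

# fl.client.start_numpy_client(server_address="127.0.0.1:8080",
#                              client=TorchClient(model, train_dl, test_dl))
```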

Journal Article
TL;DR: The detailed cloud computing service system based on big data, which provides high performance solutions for large-scale data storage, processing and analysis, is introduced.
Abstract: As one of the main development directions in the information field, big data technology can be applied for data mining, data analysis and data sharing over massive data, and it creates huge economic benefits by exploiting the potential value of data. Meanwhile, it can provide decision-making strategies for social and economic development. Big data service architecture is a new service economic model that takes data as a resource, loading and extracting the data collected from different data sources. This service architecture provides various customized data processing methods, data analysis and visualization services for service consumers. This paper first briefly introduces the general big data service architecture and the technical processing framework, covering data collection and storage. Next, we discuss big data processing and analysis according to different service requirements, which can present valuable data to service consumers. Then, we introduce a detailed cloud computing service system based on big data, which provides high-performance solutions for large-scale data storage, processing and analysis. Finally, we summarize some big data application scenarios across various fields.