
Showing papers on "Edge computing published in 2021"


Journal ArticleDOI
TL;DR: 6G with additional technical requirements beyond those of 5G will enable faster and further communications to the extent that the boundary between physical and cyber worlds disappears.
Abstract: The fifth generation (5G) wireless communication networks are being deployed worldwide from 2020 and more capabilities are in the process of being standardized, such as mass connectivity, ultra-reliability, and guaranteed low latency. However, 5G will not meet all requirements of the future in 2030 and beyond, and sixth generation (6G) wireless communication networks are expected to provide global coverage, enhanced spectral/energy/cost efficiency, better intelligence level and security, etc. To meet these requirements, 6G networks will rely on new enabling technologies, i.e., air interface and transmission technologies and novel network architecture, such as waveform design, multiple access, channel coding schemes, multi-antenna technologies, network slicing, cell-free architecture, and cloud/fog/edge computing. Our vision of 6G is that it will have four new paradigm shifts. First, to satisfy the requirement of global coverage, 6G will not be limited to terrestrial communication networks, which will need to be complemented with non-terrestrial networks such as satellite and unmanned aerial vehicle (UAV) communication networks, thus achieving a space-air-ground-sea integrated communication network. Second, all spectra will be fully explored to further increase data rates and connection density, including the sub-6 GHz, millimeter wave (mmWave), terahertz (THz), and optical frequency bands. Third, facing the big datasets generated by the use of extremely heterogeneous networks, diverse communication scenarios, large numbers of antennas, wide bandwidths, and new service requirements, 6G networks will enable a new range of smart applications with the aid of artificial intelligence (AI) and big data technologies. Fourth, network security will have to be strengthened when developing 6G networks. This article provides a comprehensive survey of recent advances and future trends in these four aspects. Clearly, 6G with additional technical requirements beyond those of 5G will enable faster and further communications to the extent that the boundary between physical and cyber worlds disappears.

935 citations


Journal ArticleDOI
TL;DR: This paper aims to provide a survey-based tutorial on potential applications and supporting technologies of Industry 5.0 from the perspective of different industry practitioners and researchers.

314 citations


Journal ArticleDOI
TL;DR: A use case of fully autonomous driving is presented to show that 6G supports massive IoT, and some breakthrough technologies in 6G, such as machine learning and blockchain, are introduced, where the motivations, applications, and open issues of these technologies for massive IoT are summarized.
Abstract: Nowadays, many disruptive Internet-of-Things (IoT) applications emerge, such as augmented/virtual reality online games, autonomous driving, and smart everything, which are massive in number, data intensive, computation intensive, and delay sensitive. Due to the mismatch between the fifth generation (5G) and the requirements of such massive IoT-enabled applications, there is a need for technological advancements and evolutions for wireless communications and networking toward the sixth-generation (6G) networks. 6G is expected to deliver extended 5G capabilities at a very high level, such as Tbps data rate, sub-ms latency, cm-level localization, and so on, which will play a significant role in supporting massive IoT devices to operate seamlessly with highly diverse service requirements. Motivated by the aforementioned facts, in this article, we present a comprehensive survey on 6G-enabled massive IoT. First, we present the drivers and requirements by summarizing the emerging IoT-enabled applications and the corresponding requirements, along with the limitations of 5G. Second, visions of 6G are provided in terms of core technical requirements, use cases, and trends. Third, a new network architecture provided by 6G to enable massive IoT is introduced, i.e., space–air–ground–underwater/sea networks enhanced by edge computing. Fourth, some breakthrough technologies, such as machine learning and blockchain, in 6G are introduced, where the motivations, applications, and open issues of these technologies for massive IoT are summarized. Finally, a use case of fully autonomous driving is presented to show that 6G supports massive IoT.

263 citations


Journal ArticleDOI
TL;DR: A smart, Deep Reinforcement Learning based Resource Allocation (DRLRA) scheme is proposed, which can allocate computing and network resources adaptively, reduce the average service time, and balance the use of resources under varying MEC environments.
Abstract: The development of mobile devices with improving communication and perceptual capabilities has brought about a proliferation of numerous complex and computation-intensive mobile applications. Mobile devices with limited resources face more severe capacity constraints than ever before. As a new concept of network architecture and an extension of cloud computing, Mobile Edge Computing (MEC) seems to be a promising solution to meet this emerging challenge. However, MEC also has some limitations, such as the high cost of infrastructure deployment and maintenance, as well as the severe pressure that the complex and changeable edge computing environment places on MEC servers. At this point, how to allocate computing resources and network resources rationally to satisfy the requirements of mobile devices under changeable MEC conditions has become a great challenge. To combat this issue, we propose a smart, Deep Reinforcement Learning based Resource Allocation (DRLRA) scheme, which can allocate computing and network resources adaptively, reduce the average service time, and balance the use of resources under varying MEC environments. Experimental results show that the proposed DRLRA performs better than the traditional OSPF algorithm under changing MEC conditions.

261 citations
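
The abstract leaves DRLRA's network design and state space unspecified, so as a hedged illustration of the underlying idea (learning an allocation policy that lowers service time), the toy sketch below uses tabular Q-learning over an invented load model; the state space, action set, and `service_time` function are all assumptions, not the paper's.

```python
import random

# Toy setting (hypothetical, not the paper's model): states are discretized
# server load levels, actions are discrete shares of capacity granted to a request.
STATES = range(5)                        # load level 0 (idle) .. 4 (saturated)
ACTIONS = [0.25, 0.5, 0.75, 1.0]         # fraction of server capacity allocated

def service_time(load, share):
    """Invented model: higher load and smaller share mean slower service."""
    return (1.0 + load) / share

def step(load, share):
    """Generous allocations heat the server up; load also decays randomly."""
    next_load = min(4, max(0, load + (1 if share >= 0.75 else 0) - random.randint(0, 1)))
    return next_load, -service_time(load, share)   # reward = negative service time

Q = {(s, a): 0.0 for s in STATES for a in range(len(ACTIONS))}
alpha, gamma, eps = 0.1, 0.9, 0.1

load = 0
for _ in range(20000):
    a = (random.randrange(len(ACTIONS)) if random.random() < eps
         else max(range(len(ACTIONS)), key=lambda i: Q[(load, i)]))
    nxt, r = step(load, ACTIONS[a])
    best_next = max(Q[(nxt, i)] for i in range(len(ACTIONS)))
    Q[(load, a)] += alpha * (r + gamma * best_next - Q[(load, a)])
    load = nxt

for s in STATES:   # learned share to allocate at each load level
    print(s, ACTIONS[max(range(len(ACTIONS)), key=lambda i: Q[(s, i)])])
```

Replacing the table with a deep Q-network and the toy dynamics with measured MEC state is what turns this pattern into a DRL scheme of the kind the paper proposes.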


Journal ArticleDOI
TL;DR: The landscape of MAR through the past and its future prospects with respect to the 5G systems and complementary technology MEC are discussed and an informative analysis of the network formation of current and future MAR systems in terms of cloud, edge, localized, and hybrid architectural options is provided.
Abstract: Augmented Reality (AR) technology enhances the human perception of the world by combining the real environment with the virtual space. With the explosive growth of powerful, less expensive mobile devices, and the emergence of sophisticated communication infrastructure, Mobile Augmented Reality (MAR) applications are gaining increased popularity. MAR allows users to run AR applications on mobile devices with greater mobility and at a lower cost. The emerging 5G communication technologies act as critical enablers for future MAR applications to achieve ultra-low latency and extremely high data rates, while Multi-access Edge Computing (MEC) brings enhanced computational power closer to the users to complement MAR. This paper extensively discusses the landscape of MAR through the past and its future prospects with respect to the 5G systems and the complementary technology MEC. The paper especially provides an informative analysis of the network formation of current and future MAR systems in terms of cloud, edge, localized, and hybrid architectural options. The paper discusses key application areas for MAR and their future with the advent of 5G technologies. The paper also discusses the requirements and limitations of MAR technical aspects such as communication, mobility management, energy management, service offloading and migration, security, and privacy, and analyzes the role of 5G technologies.

259 citations


Journal ArticleDOI
TL;DR: Several main issues in FLchain design are identified, including communication cost, resource allocation, incentive mechanism, security and privacy protection, and the applications of FLchain in popular MEC domains, such as edge data sharing, edge content caching and edge crowdsensing are investigated.
Abstract: Mobile-edge computing (MEC) has been envisioned as a promising paradigm to handle the massive volume of data generated from ubiquitous mobile devices for enabling intelligent services with the help of artificial intelligence (AI). Traditionally, AI techniques often require centralized data collection and training in a single entity, e.g., an MEC server, which is now becoming a weak point due to data privacy concerns and high overhead of raw data communications. In this context, federated learning (FL) has been proposed to provide collaborative data training solutions, by coordinating multiple mobile devices to train a shared AI model without directly exposing their underlying data, which enjoys considerable privacy enhancement. To improve the security and scalability of FL implementation, blockchain as a ledger technology is attractive for realizing decentralized FL training without the need for any central server. Particularly, the integration of FL and blockchain leads to a new paradigm, called FLchain , which potentially transforms intelligent MEC networks into decentralized, secure, and privacy-enhancing systems. This article presents an overview of the fundamental concepts and explores the opportunities of FLchain in MEC networks. We identify several main issues in FLchain design, including communication cost, resource allocation, incentive mechanism, security and privacy protection. The key solutions and the lessons learned along with the outlooks are also discussed. Then, we investigate the applications of FLchain in popular MEC domains, such as edge data sharing, edge content caching and edge crowdsensing. Finally, important research challenges and future directions are also highlighted.

238 citations


Journal ArticleDOI
TL;DR: The major purpose of this work is to create a novel and secure cache decision system (CDS) in a wireless network that operates over an SB, offering users a safer and more efficient environment for browsing the Internet and for sharing and managing large-scale data in the fog.
Abstract: This work proposes an innovative infrastructure for a secure scenario that operates in a wireless-mobile 6G network for managing big data (BD) on smart buildings (SBs). Owing to the rapid growth of the telecommunications field, new challenges arise. Furthermore, a new type of wireless network infrastructure, the sixth generation (6G), provides all the benefits of its past versions and also improves on some of the issues its predecessors had. In addition, technologies related to the telecommunications field, such as the Internet of Things, cloud computing (CC), and edge computing (EC), can operate through a 6G wireless network. Taking all of this into account, we propose a scenario that combines the functions of the Internet of Things with CC, EC, and BD in order to achieve a smart and secure environment. The major purpose of this work is to create a novel and secure cache decision system (CDS) in a wireless network that operates over an SB, which will offer users a safer and more efficient environment for browsing the Internet and for sharing and managing large-scale data in the fog. This CDS consists of two types of servers: one cloud server and one edge server. In support of our proposal, we study related cache scenario systems, which are listed, presented, and compared in this work.

229 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a comprehensive survey on AIoT to show how AI can empower the IoT to make it faster, smarter, greener, and safer, and highlight the challenges facing AIoT and some potential research opportunities.
Abstract: In the Internet-of-Things (IoT) era, billions of sensors and devices collect and process data from the environment, transmit them to cloud centers, and receive feedback via the Internet for connectivity and perception. However, transmitting massive amounts of heterogeneous data, perceiving complex environments from these data, and then making smart decisions in a timely manner are difficult. Artificial intelligence (AI), especially deep learning, is now a proven success in various areas, including computer vision, speech recognition, and natural language processing. AI introduced into the IoT heralds the era of AI of things (AIoT). This article presents a comprehensive survey on AIoT to show how AI can empower the IoT to make it faster, smarter, greener, and safer. Specifically, we briefly present the AIoT architecture in the context of cloud computing, fog computing, and edge computing. Then, we present progress in AI research for IoT from four perspectives: 1) perceiving; 2) learning; 3) reasoning; and 4) behaving. Next, we summarize some promising applications of AIoT that are likely to profoundly reshape our world. Finally, we highlight the challenges facing AIoT and some potential research opportunities.

216 citations


Journal ArticleDOI
TL;DR: An imitation learning enabled branch-and-bound solution in edge-intelligent IoVs is put forward to speed up the problem-solving process with few training samples, and OMEN is proved to achieve near-optimal performance.
Abstract: Recently, the Internet of Vehicles (IoV) has become one of the most active research fields in both academia and industry, exploiting the resources of vehicles and Road Side Units (RSUs) to execute various vehicular applications. Due to the increasing number of vehicles and the asymmetrical distribution of traffic flows, it is essential for the network operator to design intelligent offloading strategies to improve network performance and provide high-quality services for users. However, the lack of global information and the time-varying nature of IoVs make it challenging to perform effective offloading and caching decisions under the long-term energy constraints of RSUs. Since Artificial Intelligence (AI) and machine learning can greatly enhance the intelligence and the performance of IoVs, we push AI-inspired computing, caching, and communication resources to the proximity of smart vehicles, which jointly enables RSU peer offloading, vehicle-to-RSU offloading, and content caching in the IoV framework. A Mixed-Integer Non-Linear Programming (MINLP) problem is formulated to minimize total network delay, consisting of the communication delay, computation delay, network congestion delay, and content downloading delay of all users. Then, we develop an online multi-decision making scheme (named OMEN) by leveraging the Lyapunov optimization method to solve the formulated problem, and prove that OMEN achieves near-optimal performance. Leveraging the strong cognition of AI, we put forward an imitation learning enabled branch-and-bound solution in edge-intelligent IoVs to speed up the problem-solving process with few training samples. Experimental results based on real-world traffic data demonstrate that our proposed method outperforms other methods in various respects.

206 citations
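
OMEN's exact formulation is not given in the abstract; the sketch below only illustrates the Lyapunov drift-plus-penalty pattern such schemes leverage: a virtual queue tracks an RSU's long-term energy budget, and each slot the action minimizing V*delay + Q*energy is chosen. The action set, costs, budget, and V are invented for illustration.

```python
# Minimal drift-plus-penalty sketch (not the paper's OMEN formulation):
# an RSU with a long-term energy budget chooses, per slot, between serving
# a task itself or leaving it to the vehicle.
V = 50.0                 # delay/energy trade-off weight (hypothetical)
ENERGY_BUDGET = 0.6      # average energy the RSU may spend per slot

ACTIONS = {
    "serve_at_rsu": {"delay": 1.0, "energy": 1.0},
    "keep_local":   {"delay": 3.0, "energy": 0.0},
}

Q = 0.0                  # virtual energy-deficit queue
total_delay = spent = 0.0
SLOTS = 10000
for t in range(SLOTS):
    # choose the action minimizing the drift-plus-penalty term V*delay + Q*energy
    name = min(ACTIONS, key=lambda a: V * ACTIONS[a]["delay"] + Q * ACTIONS[a]["energy"])
    delay, energy = ACTIONS[name]["delay"], ACTIONS[name]["energy"]
    total_delay += delay
    spent += energy
    Q = max(Q + energy - ENERGY_BUDGET, 0.0)   # queue grows when over budget

print("avg delay:", total_delay / SLOTS, "avg energy:", spent / SLOTS)
```

Running it shows the average energy settling near the budget while delay stays low, the cost-versus-constraint trade-off that Lyapunov analysis makes precise.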


Journal ArticleDOI
TL;DR: In this article, the authors present a cost-efficient in-home health monitoring system for IoMT, constructed by dividing it into two sub-networks, i.e., intra-WBANs and beyond-WBANs.
Abstract: The prompt evolution of the Internet of Medical Things (IoMT) promotes pervasive in-home health monitoring networks. However, the excessive requirements of patients result in insufficient spectrum resources and communication overload. Mobile Edge Computing (MEC) enabled 5G health monitoring is conceived as a favorable paradigm to tackle such an obstacle. In this paper, we construct a cost-efficient in-home health monitoring system for IoMT by dividing it into two sub-networks, i.e., intra-Wireless Body Area Networks (WBANs) and beyond-WBANs. Highlighting the characteristics of IoMT, the cost of patients depends on medical criticality, Age of Information (AoI), and energy consumption. For intra-WBANs, a cooperative game is formulated to allocate the wireless channel resources. For beyond-WBANs, considering individual rationality and potential selfishness, a decentralized non-cooperative game is proposed to minimize the system-wide cost in IoMT. We prove that the proposed algorithm can reach a Nash equilibrium. In addition, the upper bound of the algorithm's time complexity and the number of patients benefiting from MEC are theoretically derived. Performance evaluations demonstrate the effectiveness of our proposed algorithm with respect to the system-wide cost and the number of patients benefiting from MEC.

202 citations
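
As a hedged illustration of how a decentralized non-cooperative game of this kind can reach a Nash equilibrium, the toy best-response loop below lets patients choose between local processing and a congested MEC server; the cost functions and numbers are hypothetical, not the paper's model.

```python
# Toy best-response dynamics (hypothetical costs, not the paper's game):
# each patient picks "mec" or "local"; MEC delay grows with the number of
# patients sharing the server.
N = 10
LOCAL_COST = [2.0 + 0.3 * i for i in range(N)]   # heterogeneous local costs

def mec_cost(n_users):
    return 0.5 + 0.4 * n_users                   # congestion on the MEC server

choice = ["local"] * N
changed, rounds = True, 0
while changed:                                   # iterate best responses
    changed, rounds = False, rounds + 1
    for i in range(N):
        others = sum(1 for j in range(N) if j != i and choice[j] == "mec")
        best = "mec" if mec_cost(others + 1) < LOCAL_COST[i] else "local"
        if best != choice[i]:
            choice[i], changed = best, True

print(f"Nash equilibrium after {rounds} rounds; patients on MEC: {choice.count('mec')}")
```

Because this toy game is a congestion (potential) game, best-response dynamics are guaranteed to terminate at an equilibrium.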


Journal ArticleDOI
TL;DR: This article reformulates the microservice coordination problem using the Markov decision process framework and then proposes a reinforcement learning-based online microservice coordination algorithm to learn the optimal strategy; theoretical analysis proves that the offline algorithm can find the optimal solution while the online algorithm achieves near-optimal performance.
Abstract: As an emerging service architecture, microservice enables the decomposition of a monolithic web service into a set of independent lightweight services which can be executed independently. With mobile edge computing, microservices can be further deployed in edge clouds dynamically, launched quickly, and migrated across edge clouds easily, providing better services for users in proximity. However, user mobility can result in frequent switching of nearby edge clouds, which increases the service delay when users move away from their serving edge clouds. To address this issue, this article investigates microservice coordination among edge clouds to enable seamless and real-time responses to service requests from mobile users. The objective of this work is to devise the optimal microservice coordination scheme which can reduce the overall service delay with low costs. To this end, we first propose a dynamic programming-based offline microservice coordination algorithm that can achieve the globally optimal performance. However, the offline algorithm heavily relies on the availability of prior information such as computation request arrivals, time-varying channel conditions, and edge clouds' computation capabilities, which is hard to obtain. Therefore, we reformulate the microservice coordination problem using the Markov decision process framework and then propose a reinforcement learning-based online microservice coordination algorithm to learn the optimal strategy. Theoretical analysis proves that the offline algorithm can find the optimal solution while the online algorithm can achieve near-optimal performance. Furthermore, experiments are conducted based on two real-world datasets, i.e., the Telecom base station dataset and the Taxi Track dataset from Shanghai. The experimental results demonstrate that the proposed online algorithm outperforms existing algorithms in terms of service delay and migration costs, and the achieved performance is close to the optimal performance obtained by the offline algorithm.
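
To make the MDP reformulation concrete, here is a minimal hypothetical sketch: the state is a (user location, service location) pair on a line of edge clouds, the actions are stay or migrate, and tabular Q-learning trades a fixed migration cost against distance-dependent service delay. The paper's actual state space, cost structure, and learner are richer.

```python
import random

# Hypothetical toy MDP (not the paper's full model): state = (user_cell,
# service_cell) on a line of K edge clouds; the agent keeps the microservice
# in place or migrates it to the user's current edge cloud.
K = 5
DELAY_PER_HOP, MIGRATION_COST = 1.0, 2.0
alpha, gamma, eps = 0.1, 0.9, 0.1

Q = {((u, s), a): 0.0 for u in range(K) for s in range(K) for a in (0, 1)}

def step(u, s, a):
    if a == 1:                                   # migrate service to the user
        cost, s = MIGRATION_COST, u
    else:                                        # serve across the distance
        cost = DELAY_PER_HOP * abs(u - s)
    u = min(K - 1, max(0, u + random.choice([-1, 0, 1])))   # user mobility
    return u, s, -cost

u = s = 0
for _ in range(50000):
    a = random.choice((0, 1)) if random.random() < eps else \
        max((0, 1), key=lambda x: Q[((u, s), x)])
    nu, ns, r = step(u, s, a)
    Q[((u, s), a)] += alpha * (r + gamma * max(Q[((nu, ns), 0)], Q[((nu, ns), 1)])
                               - Q[((u, s), a)])
    u, s = nu, ns

print("Q(stay) vs Q(migrate) when the user is 3 hops away:",
      round(Q[((3, 0), 0)], 2), round(Q[((3, 0), 1)], 2))
```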

Journal ArticleDOI
TL;DR: A weighted cost model to minimize the execution time and energy consumption of IoT applications, in a computing environment with multiple IoT devices, multiple fog/edge servers and cloud servers is proposed and a new application placement technique based on the Memetic Algorithm is proposed to make batch application placement decision for concurrent IoT applications.
Abstract: Fog/Edge computing emerges as a novel computing paradigm that harnesses resources in the proximity of Internet of Things (IoT) devices so that, alongside the cloud servers, it can provide services in a timely manner. However, due to the ever-increasing growth of IoT devices with resource-hungry applications, fog/edge servers with limited resources cannot efficiently satisfy the requirements of IoT applications. Therefore, application placement in the fog/edge computing environment, in which several distributed fog/edge servers and centralized cloud servers are available, is a challenging issue. In this article, we propose a weighted cost model to minimize the execution time and energy consumption of IoT applications in a computing environment with multiple IoT devices, multiple fog/edge servers, and cloud servers. Besides, a new application placement technique based on the Memetic Algorithm is proposed to make batch application placement decisions for concurrent IoT applications. Due to the heterogeneity of IoT applications, we also propose a lightweight pre-scheduling algorithm to maximize the number of parallel tasks for concurrent execution. The performance results demonstrate that our technique significantly improves the weighted cost of IoT applications by up to 65 percent in comparison to its counterparts.
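
A memetic algorithm is a genetic algorithm whose offspring are additionally refined by local search. The sketch below applies that pattern to a toy batch-placement problem, assigning tasks to servers under a weighted time/energy cost with a load-balance penalty; all parameters are illustrative and not the paper's model.

```python
import random

# Toy memetic placement (invented cost model, not the paper's): assign each
# of T tasks to one of S servers; fitness = weighted time/energy cost plus a
# load-imbalance penalty that couples the tasks together.
T, S, W = 12, 4, 0.5
random.seed(1)
TIME = [[random.uniform(1, 5) for _ in range(S)] for _ in range(T)]
ENERGY = [[random.uniform(1, 5) for _ in range(S)] for _ in range(T)]

def cost(plan):
    base = sum(W * TIME[t][plan[t]] + (1 - W) * ENERGY[t][plan[t]] for t in range(T))
    load = [plan.count(s) for s in range(S)]
    return base + 0.8 * (max(load) - min(load))

def local_search(plan):                      # the "memetic" refinement step
    for t in range(T):
        plan[t] = min(range(S), key=lambda s: cost(plan[:t] + [s] + plan[t + 1:]))
    return plan

pop = [[random.randrange(S) for _ in range(T)] for _ in range(20)]
for _ in range(50):
    pop.sort(key=cost)
    parents, children = pop[:10], []
    for _ in range(10):
        p1, p2 = random.sample(parents, 2)
        cut = random.randrange(1, T)
        child = p1[:cut] + p2[cut:]                           # one-point crossover
        if random.random() < 0.3:
            child[random.randrange(T)] = random.randrange(S)  # mutation
        children.append(local_search(child))                  # memetic local search
    pop = parents + children

print("best weighted cost:", round(cost(min(pop, key=cost)), 2))
```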

Journal ArticleDOI
TL;DR: A brief overview of the added features and key performance indicators of 5G NR is presented and a next-generation wireless communication architecture that acts as the platform for migration towards beyond 5G/6G networks is proposed.
Abstract: Nowadays, 5G is in its initial phase of commercialization. The 5G network will revolutionize the existing wireless network with its enhanced capabilities and novel features. 5G New Radio (5G NR), referred to as the global standard of 5G, is presently being developed under the 3rd Generation Partnership Project (3GPP) and can operate over a wide range of frequency bands, from below 6 GHz to mmWave (100 GHz). 3GPP mainly focuses on the three major use cases of 5G NR, comprising Ultra-Reliable and Low Latency Communication (uRLLC), Massive Machine Type Communication (mMTC), and Enhanced Mobile Broadband (eMBB). For meeting the targets of 5G NR, multiple features like scalable numerology, flexible spectrum, forward compatibility, and ultra-lean design are added as compared to LTE systems. This paper presents a brief overview of the added features and key performance indicators of 5G NR. The issues related to the adaptation of higher modulation schemes and inter-RAT handover synchronization are also addressed in this paper. With the consideration of these challenges, a next-generation wireless communication architecture is proposed. The architecture acts as the platform for migration towards beyond-5G/6G networks. Along with this, various technologies and applications of 6G networks are also overviewed in this paper. The 6G network will incorporate Artificial Intelligence (AI) based services, edge computing, quantum computing, optical wireless communication, hybrid access, and tactile services. For enabling these diverse services, a virtualized network-slicing-based architecture of 6G is proposed. Various ongoing projects on 6G and its technologies are also listed in this paper.

Journal ArticleDOI
TL;DR: From the simulation results, the MADDPG-based method can converge within 200 training episodes, comparable to the single-agent DDPG (SADDPG)-based one, and can achieve higher delay/QoS satisfaction ratios than the SADDPG-based and random schemes.
Abstract: In this paper, we investigate multi-dimensional resource management for unmanned aerial vehicles (UAVs) assisted vehicular networks. To efficiently provide on-demand resource access, the macro eNodeB and UAV, both mounted with multi-access edge computing (MEC) servers, cooperatively make association decisions and allocate proper amounts of resources to vehicles. Since there is no central controller, we formulate the resource allocation at the MEC servers as a distributive optimization problem to maximize the number of offloaded tasks while satisfying their heterogeneous quality-of-service (QoS) requirements, and then solve it with a multi-agent deep deterministic policy gradient (MADDPG)-based method. Through centrally training the MADDPG model offline, the MEC servers, acting as learning agents, then can rapidly make vehicle association and resource allocation decisions during the online execution stage. From our simulation results, the MADDPG-based method can converge within 200 training episodes, comparable to the single-agent DDPG (SADDPG)-based one. Moreover, the proposed MADDPG-based resource management scheme can achieve higher delay/QoS satisfaction ratios than the SADDPG-based and random schemes.

Journal ArticleDOI
TL;DR: A Peer-to-Peer (P2P) computing resource trading system is proposed to balance spatio-temporal dynamic computing resource demands in an IoV-assisted smart city; security analysis shows the security performance of the system, and numerical simulations show that the strategies can encourage collaboration between the buyer and smart vehicles.
Abstract: In a smart city, Mobile Edge Computing (MEC) servers are generally deployed in a static fashion at base stations (BSs), while moving vehicles with advanced on-board equipment can be regarded as dynamic computing resource transporters free of geographical limitations. Thus, the Internet of Vehicles (IoV) could assist the smart city in achieving flexible computing resource demand response (DR) via paid sharing of idle vehicle computing resources. Motivated by this, we propose a Peer-to-Peer (P2P) computing resource trading system to balance spatio-temporal dynamic computing resource demands in an IoV-assisted smart city. On the one hand, to guarantee transaction security and privacy preservation in our system, we employ a consortium blockchain approach and demonstrate the process of secure computing resource trading without involving a centralized trusted third party. On the other hand, to encourage individual smart vehicles to participate in our system, we construct a two-stage Stackelberg game jointly optimizing the utilities of buyers and sellers. We also derive the optimal computing pricing and trading amount strategies in this proposed game. Finally, security analysis shows the security performance of our system, and numerical simulations show that our strategies can encourage collaboration between the buyer and smart vehicles.
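
The two-stage Stackelberg structure can be illustrated by backward induction: the followers' best responses are derived first, and the leader then optimizes against them. The logarithmic buyer utility and all numbers below are assumptions made for the sketch, not the paper's utilities.

```python
# Two-stage Stackelberg sketch (toy utilities, not the paper's exact model).
# Stage 2: buyer i demands q_i(p) maximizing a_i*ln(1+q) - p*q, which gives
#          the closed form q_i* = max(a_i/p - 1, 0).
# Stage 1: the selling vehicle picks the price p maximizing (p - c) * demand.
A = [4.0, 6.0, 8.0]          # buyers' valuation parameters (hypothetical)
c = 1.0                      # seller's unit cost of computing

def demand(p):
    return sum(max(a / p - 1.0, 0.0) for a in A)

# backward induction: the leader scans prices knowing the followers' responses
prices = [c + 0.01 * k for k in range(1, 1000)]
p_star = max(prices, key=lambda p: (p - c) * demand(p))
print(f"equilibrium price {p_star:.2f}, total demand {demand(p_star):.2f}")
```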

Journal ArticleDOI
TL;DR: The IoT/IIoT critical infrastructure in Industry 4.0 is introduced, the blockchain and edge computing paradigms are briefly presented, and it is shown how the convergence of these two paradigms can enable secure and scalable critical infrastructures.
Abstract: Critical infrastructure systems are vital to underpin the functioning of a society and economy. Due to the ever-increasing number of Internet-connected Internet-of-Things (IoT)/Industrial IoT (IIoT) devices, and the high volume of data generated and collected, security and scalability are becoming pressing concerns for critical infrastructures in Industry 4.0. The blockchain technology is essentially a distributed and secure ledger that records all the transactions into a hierarchically expanding chain of blocks. Edge computing brings the cloud capabilities closer to the computation tasks. The convergence of blockchain and edge computing paradigms can overcome the existing security and scalability issues. In this article, we first introduce the IoT/IIoT critical infrastructure in Industry 4.0, and then we briefly present the blockchain and edge computing paradigms. After that, we show how the convergence of these two paradigms can enable secure and scalable critical infrastructures. Then, we provide a survey on the state of the art for security, privacy, and scalability of IoT/IIoT critical infrastructures. A list of potential research challenges and open issues in this area is also provided, which can be used as a useful resource to guide future research.

Journal ArticleDOI
TL;DR: This paper develops an intent-based traffic control system by investigating Deep Reinforcement Learning for 5G-envisioned IoCVs, which can dynamically orchestrate edge computing and content caching to improve the profits of the Mobile Network Operator (MNO).
Abstract: Recent developments in edge computing and content caching in wireless networks enable the Intelligent Transportation System (ITS) to provide high-quality services for vehicles. However, the variety of vehicular applications and the time-varying network status make it challenging for ITS to allocate resources efficiently. Artificial intelligence algorithms, which have the cognitive capability to handle the diverse and time-varying features of the Internet of Connected Vehicles (IoCVs), enable intent-based networking for ITS to tackle the above-mentioned challenges. In this paper, we develop an intent-based traffic control system by investigating Deep Reinforcement Learning (DRL) for 5G-envisioned IoCVs, which can dynamically orchestrate edge computing and content caching to improve the profits of the Mobile Network Operator (MNO). By jointly analyzing the MNO's revenue and users' quality of experience, we define a profit function to calculate the MNO's profits. After that, we formulate a joint optimization problem to maximize the MNO's profits and develop an intelligent traffic control scheme by investigating DRL, which can improve the system profits of the MNO and allocate network resources effectively. Experimental results based on real traffic data demonstrate that our designed system is efficient and performs well.

Journal ArticleDOI
TL;DR: Experimental results demonstrate that MPCF outperforms other baseline caching schemes in terms of the cache hit ratio in vehicular edge networks, and can greatly improve cache performance, effectively protect users’ privacy and significantly reduce communication costs.
Abstract: Content caching at the edge of vehicular networks has been considered as a promising technology to satisfy the increasing demands of computation-intensive and latency-sensitive vehicular applications for intelligent transportation. The existing content caching schemes, when used in vehicular networks, face two distinct challenges: 1) Vehicles connected to an edge server keep moving, making the content popularity varying and hard to predict. 2) Cached content is easily out-of-date since each connected vehicle stays in the area of an edge server for a short duration. To address these challenges, we propose a Mobility-aware Proactive edge Caching scheme based on Federated learning (MPCF). This new scheme enables multiple vehicles to collaboratively learn a global model for predicting content popularity with the private training data distributed on local vehicles. MPCF also employs a Context-aware Adversarial AutoEncoder to predict the highly dynamic content popularity. Besides, MPCF integrates a mobility-aware cache replacement policy, which allows the network edges to add/evict contents in response to the mobility patterns and preferences of vehicles. MPCF can greatly improve cache performance, effectively protect users’ privacy and significantly reduce communication costs. Experimental results demonstrate that MPCF outperforms other baseline caching schemes in terms of the cache hit ratio in vehicular edge networks.
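
As a hedged sketch of the federated training loop behind such a scheme, the snippet below runs plain FedAvg with a linear model standing in for the paper's context-aware adversarial autoencoder: vehicles fit their private data locally, and a server only averages their weights.

```python
import numpy as np

# Minimal FedAvg sketch of the collaborative-training idea (linear popularity
# model instead of the paper's adversarial autoencoder; all data synthetic).
rng = np.random.default_rng(0)
TRUE_W = np.array([2.0, -1.0, 0.5])          # hidden "popularity" weights

def local_data(n):                            # each vehicle's private samples
    X = rng.normal(size=(n, 3))
    y = X @ TRUE_W + rng.normal(scale=0.1, size=n)
    return X, y

vehicles = [local_data(50) for _ in range(5)]
w_global = np.zeros(3)

for rnd in range(30):                         # federated rounds
    updates = []
    for X, y in vehicles:                     # raw data never leaves a vehicle
        w = w_global.copy()
        for _ in range(5):                    # local gradient steps
            grad = 2 * X.T @ (X @ w - y) / len(y)
            w -= 0.05 * grad
        updates.append(w)
    w_global = np.mean(updates, axis=0)       # server-side averaging

print("learned weights:", np.round(w_global, 2))
```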

Journal ArticleDOI
TL;DR: A multi-agent deep reinforcement learning based trajectory control algorithm is proposed for managing the trajectory of each UAV independently, where the popular Multi-Agent Deep Deterministic Policy Gradient (MADDPG) method is applied.
Abstract: An unmanned aerial vehicle (UAV)-aided mobile edge computing (MEC) framework is proposed, where several UAVs having different trajectories fly over the target area and support the user equipments (UEs) on the ground. We aim to jointly optimize the geographical fairness among all the UEs, the fairness of each UAV's UE-load, and the overall energy consumption of UEs. The above optimization problem includes both integer and continuous variables, and it is challenging to solve. To address the above problem, a multi-agent deep reinforcement learning based trajectory control algorithm is proposed for managing the trajectory of each UAV independently, where the popular Multi-Agent Deep Deterministic Policy Gradient (MADDPG) method is applied. Given the UAVs' trajectories, a low-complexity approach is introduced for optimizing the offloading decisions of UEs. We show that our proposed solution considerably outperforms other traditional algorithms, in terms of the fairness of serving UEs, the fairness of the UE-load at each UAV, and the energy consumption of all the UEs.

Journal ArticleDOI
TL;DR: A blockchain-empowered federated learning scheme is proposed to strengthen communication security and data privacy protection in DITEN, together with an asynchronous aggregation scheme to improve efficiency and digital twin empowered reinforcement learning to schedule relaying users and allocate spectrum resources.
Abstract: Emerging technologies, such as mobile-edge computing (MEC) and next-generation communications, are crucial for enabling rapid development and deployment of the Internet of Things (IoT). With the increasing scale of IoT networks, how to optimize the network and allocate the limited resources to provide high-quality services remains a major concern. The existing work in this direction mainly relies on models that are of less practical value for resource-limited IoT networks and can hardly simulate dynamic systems in real time. In this article, we integrate digital twins with edge networks and propose the digital twin edge networks (DITENs) to fill the gap between physical edge networks and digital systems. Then, we propose a blockchain-empowered federated learning scheme to strengthen communication security and data privacy protection in DITEN. Furthermore, to improve the efficiency of the integrated scheme, we propose an asynchronous aggregation scheme and use digital twin empowered reinforcement learning to schedule relaying users and allocate spectrum resources. Theoretical analysis and numerical results confirm that the proposed scheme can considerably enhance both communication efficiency and data security for IoT applications.
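
One common way to realize asynchronous aggregation is to merge each client update as it arrives, discounted by its staleness; the sketch below shows that pattern with synthetic stand-in training. The mixing rule and all parameters are assumptions, not necessarily the paper's scheme.

```python
import numpy as np

# Staleness-weighted asynchronous aggregation (a common pattern; the paper's
# exact update rule may differ). Local training is faked with random drift.
rng = np.random.default_rng(1)
w_global, version = np.zeros(4), 0

def mix_rate(staleness, base=0.5):
    return base / (1.0 + staleness)          # older updates count for less

for _ in range(20):                          # updates arrive one by one
    based_on = max(0, version - int(rng.integers(0, 4)))  # model version used
    staleness = version - based_on
    w_local = w_global + rng.normal(scale=0.1, size=4)    # stand-in for training
    a = mix_rate(staleness)
    w_global = (1 - a) * w_global + a * w_local           # merge immediately
    version += 1

print("global model after 20 asynchronous merges:", np.round(w_global, 3))
```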

Journal ArticleDOI
TL;DR: A thorough investigation of the identification and analysis of threat vectors in the ETSI-standardized MEC architecture is introduced; the vulnerabilities leading to the identified threat vectors are analyzed, and potential security solutions to overcome these vulnerabilities are proposed.
Abstract: The European Telecommunications Standards Institute (ETSI) has introduced the paradigm of Multi-Access Edge Computing (MEC) to enable efficient and fast data processing in mobile networks. Among other technological requirements, security and privacy are significant factors in the realization of MEC deployments. In this paper, we analyse the security and privacy of the MEC system. We introduce a thorough investigation of the identification and the analysis of threat vectors in the ETSI standardized MEC architecture. Furthermore, we analyse the vulnerabilities leading to the identified threat vectors and propose potential security solutions to overcome these vulnerabilities. The privacy issues of MEC are also highlighted, and clear objectives for preserving privacy are defined. Finally, we present future directives to enhance the security and privacy of MEC services.

Journal ArticleDOI
TL;DR: An energy-efficient dynamic task offloading algorithm is developed by choosing the optimal computing place in an online way, either the IoT device, the MEC server, or the MCC server, with the goal of jointly minimizing the energy consumption and task response time.
Abstract: With the proliferation of compute-intensive and delay-sensitive mobile applications, large amounts of computational resources with stringent latency requirements are required on Internet-of-Things (IoT) devices. One promising solution is to offload complex computing tasks from IoT devices either to mobile-edge computing (MEC) or mobile cloud computing (MCC) servers. MEC servers are much closer to IoT devices and thus have lower latency, while MCC servers can provide flexible and scalable computing capability to support complicated applications. To address the tradeoff between limited computing capacity and high latency, and meanwhile, ensure the data integrity during the offloading process, we consider a blockchain scenario where edge computing and cloud computing can collaborate toward secure task offloading. We further propose a blockchain-enabled IoT-Edge-Cloud computing architecture that benefits both from MCC and MEC, where MEC servers offer lower latency computing services, while MCC servers provide stronger computation power. Moreover, we develop an energy-efficient dynamic task offloading (EEDTO) algorithm by choosing the optimal computing place in an online way, either on the IoT device, the MEC server or the MCC server with the goal of jointly minimizing the energy consumption and task response time. The Lyapunov optimization technique is applied to control computation and communication costs incurred by different types of applications and the dynamic changes of wireless environments. During the optimization, the best computing location for each task is chosen adaptively without requiring future system information as prior knowledge. Compared with previous offloading schemes with/without MEC and MCC cooperation, EEDTO can achieve energy-efficient offloading decisions with relatively lower computational complexity.
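
The core device/MEC/MCC choice can be sketched as comparing a weighted latency-plus-energy cost across the three placements. Every parameter below is invented for illustration, and the actual EEDTO algorithm additionally applies Lyapunov optimization to keep long-term costs under control.

```python
# Hedged sketch of the three-way placement choice in EEDTO-style offloading
# (illustrative cost model; all speeds, powers, and RTTs are hypothetical).
def best_place(task_bits, cycles, uplink_mbps):
    f_dev, f_mec, f_mcc = 1e9, 5e9, 20e9           # CPU cycles per second
    p_tx, p_cpu = 0.5, 0.9                         # watts
    rtt_mec, rtt_mcc = 0.01, 0.10                  # seconds
    tx = task_bits / (uplink_mbps * 1e6)           # upload time
    opts = {                                       # (latency, device energy)
        "device": (cycles / f_dev,                p_cpu * cycles / f_dev),
        "mec":    (tx + rtt_mec + cycles / f_mec, p_tx * tx),
        "mcc":    (tx + rtt_mcc + cycles / f_mcc, p_tx * tx),
    }
    w = 0.5                                        # latency/energy trade-off
    return min(opts, key=lambda k: w * opts[k][0] + (1 - w) * opts[k][1])

# a heavy task over a decent link: remote execution wins despite the upload
print(best_place(task_bits=2e6, cycles=5e9, uplink_mbps=20))
```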

Journal ArticleDOI
TL;DR: An online algorithm, called CEDC-O, is proposed; it is developed based on Lyapunov optimization, works online without requiring future information, and achieves provably close-to-optimal performance.
Abstract: In the edge computing (EC) environment, edge servers are deployed at base stations to offer highly accessible computing and storage resources to nearby app users. From the app vendor's perspective, caching data on edge servers can ensure low latency in app users’ retrieval of app data. However, an edge server normally owns limited resources due to its limited size. In this article, we investigate the collaborative caching problem in the EC environment with the aim to minimize the system cost, including data caching cost, data migration cost, and quality-of-service (QoS) penalty. We model this collaborative edge data caching problem (CEDC) as a constrained optimization problem and prove that it is NP-complete. We propose an online algorithm, called CEDC-O, to solve this CEDC problem during all time slots. CEDC-O is developed based on Lyapunov optimization, works online without requiring future information, and achieves provably close-to-optimal performance. CEDC-O is evaluated on a real-world data set, and the results demonstrate that it significantly outperforms four representative approaches.

Journal ArticleDOI
Shuai Yu, Xu Chen, Zhi Zhou, Xiaowen Gong, Di Wu
TL;DR: In this article, the authors proposed an intelligent UDEC (I-UDEC) framework, which integrates blockchain and artificial intelligence (AI) into 5G UDEC networks, and designed a novel two-timescale deep reinforcement learning (2Ts-DRL) approach.
Abstract: Recently, smart cities, healthcare systems, and smart vehicles have raised challenges on the capability and connectivity of state-of-the-art Internet-of-Things (IoT) devices, especially for devices in hotspot areas. Multiaccess edge computing (MEC) can enhance the ability of emerging resource-intensive IoT applications and has attracted much attention. However, due to the time-varying network environments, as well as the heterogeneous resources of network devices, it is hard to achieve stable, reliable, and real-time interactions between edge devices and their serving edge servers, especially in the 5G ultradense network (UDN) scenarios. Ultradense edge computing (UDEC) has the potential to fill this gap, especially in the 5G era, but it still faces challenges in its current solutions, such as the lack of: 1) efficient utilization of multiple 5G resources (e.g., computation, communication, storage, and service resources); 2) low overhead offloading decision making and resource allocation strategies; and 3) privacy and security protection schemes. Thus, we first propose an intelligent UDEC (I-UDEC) framework, which integrates blockchain and artificial intelligence (AI) into 5G UDEC networks. Then, in order to achieve real-time and low overhead computation offloading decisions and resource allocation strategies, we design a novel two-timescale deep reinforcement learning (2Ts-DRL) approach, consisting of a fast-timescale and a slow-timescale learning process, respectively. The primary objective is to minimize the total offloading delay and network resource usage by jointly optimizing computation offloading, resource allocation, and service caching placement. We also leverage federated learning (FL) to train the 2Ts-DRL model in a distributed manner, aiming to protect the edge devices’ data privacy. Simulation results corroborate the effectiveness of both the 2Ts-DRL and FL in the I-UDEC framework and prove that our proposed algorithm can reduce task execution time by up to 31.87%.

Journal ArticleDOI
TL;DR: A comprehensive survey of IoT-and IoMT-based edge-intelligent smart health care, mainly focusing on journal articles published between 2014 and 2020, is presented in this article.
Abstract: Smart health care is an important aspect of connected living. Health care is one of the basic pillars of human need, and smart health care is projected to produce several billion dollars in revenue in the near future. There are several components of smart health care, including the Internet of Things (IoT), the Internet of Medical Things (IoMT), medical sensors, artificial intelligence (AI), edge computing, cloud computing, and next-generation wireless communication technology. Many papers in the literature deal with smart health care or health care in general. Here, we present a comprehensive survey of IoT- and IoMT-based edge-intelligent smart health care, mainly focusing on journal articles published between 2014 and 2020. We survey this literature by addressing research questions across several areas: IoT and IoMT, AI, edge and cloud computing, security, and medical signal fusion. We also address current research challenges and offer some future research directions.

Journal ArticleDOI
TL;DR: The digital twin edge networks (DITENs) are proposed by incorporating digital twins into edge networks to fill the gap between physical systems and digital spaces, and federated learning is leveraged to construct digital twin models of IoT devices based on their running data.
Abstract: The rapid development of artificial intelligence and the 5G paradigm opens up new possibilities for emerging applications in the industrial Internet of Things (IIoT). However, the large amount of data, the limited resources of Internet of Things devices, and the increasing concerns about data privacy are major obstacles to improving the quality of services in IIoT. In this article, we propose the digital twin edge networks (DITENs) by incorporating digital twins into edge networks to fill the gap between physical systems and digital spaces. We further leverage federated learning to construct digital twin models of IoT devices based on their running data. Moreover, to mitigate the communication overhead, we propose an asynchronous model update scheme and formulate the federated learning scheme as an optimization problem. We further decompose the problem and solve the subproblems based on the deep neural network model. Numerical results show that our proposed federated learning scheme for DITEN improves communication efficiency and reduces transmission energy cost.

Proceedings ArticleDOI
01 Feb 2021
TL;DR: In this paper, an in-depth analysis promotes a broad vision for bringing Serverless to Edge Computing and identifies major challenges that Serverless must meet before entering Edge Computing.
Abstract: Born from a need for a pure “pay-per-use” model and a highly scalable platform, the “Serverless” paradigm emerged and has the potential to become a dominant way of building cloud applications. Although it was originally designed for cloud environments, Serverless is finding its position in the Edge Computing landscape, aiming to bring computational resources closer to the data source. That is, Serverless is crossing cloud borders to assess its merits in Edge Computing, whose principal partner will be Internet of Things (IoT) applications. This move sounds promising as Serverless brings particular benefits, such as eliminating always-on services that cause high electricity usage. However, the community is still hesitant to adopt Serverless Edge Computing because of the cloud-driven design of current Serverless platforms and the distinctive characteristics of the edge landscape and IoT applications. In this paper, we evaluate both sides to shed light on the Serverless new territory. Our in-depth analysis promotes a broad vision for bringing Serverless to Edge Computing. It also identifies major challenges for Serverless to meet before entering Edge Computing.

Journal ArticleDOI
TL;DR: Based on the study of the results and the limitations of existing frameworks, context-based offloading can play a crucial role in meeting the performance requirements of IoT-enabled services.
Abstract: Internet of Things (IoT) applications and services are increasingly becoming a part of daily life; from smart homes to smart cities, industry, and agriculture, they are penetrating practically every domain. Data in IoT applications are collected mostly through sensors connected to the devices, and with increasing demand it is not possible to process all the data on the devices themselves. The data collected by the device sensors are vast in amount and require high-speed computation and processing, which demand advanced resources. Various crucial applications and services must meet multiple performance parameters such as time-sensitivity and energy efficiency, and computation offloading frameworks come into play to meet these performance parameters and extreme computation requirements. Offloading computation or data to nearby devices or to fog or cloud infrastructure can help meet the resource requirements of IoT applications. In this paper, the role of context or situation in performing the offloading is studied, leading to the conclusion that, to meet the performance requirements of IoT-enabled services, context-based offloading can play a crucial role. Some of the existing frameworks, EMCO, MobiCOP-IoT, Autonomic Management Framework, CSOS, and Fog Computing Framework, selected for their novelty and optimal performance, are taken for implementation analysis and compared with the MAUI, AnyRun Computing (ARC), AutoScaler, Edge computing, and Context-Sensitive Model for Offloading System (CoSMOS) frameworks. Based on the study of the results and the limitations of the existing frameworks, future directions under offloading scenarios are discussed.
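
As a minimal illustration of what "context-based" means here, the hypothetical rule below picks an offloading target from the device's situation (deadline, battery, connectivity) rather than from task size alone; all thresholds and field names are invented, not taken from any of the surveyed frameworks.

```python
# Toy context-based offloading rule: the decision depends on the device and
# network situation, not only on the task itself (all thresholds hypothetical).
def offload_target(ctx):
    if ctx["latency_critical"] and ctx["edge_reachable"]:
        return "edge"                       # tight deadline: keep it close
    if ctx["battery_pct"] < 20 and ctx["net_mbps"] >= 5:
        return "cloud"                      # save energy, bandwidth is fine
    if ctx["task_mcycles"] < 50:
        return "local"                      # trivial task: offloading not worth it
    return "edge" if ctx["edge_reachable"] else "cloud"

print(offload_target({"latency_critical": False, "edge_reachable": True,
                      "battery_pct": 80, "net_mbps": 12, "task_mcycles": 300}))
```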

Journal ArticleDOI
TL;DR: A collaborative method for the quantification and placement of ESs, named CQP, is developed for social media services in industrial CIoV, and is evaluated with a real-world ITS social media data set from China.
Abstract: The automotive industry, a key part of the industrial Internet of Things, is now converging with cognitive computing (CC) and leading to the industrial cognitive Internet of Vehicles (CIoV). As the major data source of industrial CIoV, social media has a significant impact on the quality of service (QoS) of the automotive industry. To provide vehicular social media services with low latency and high reliability, edge computing is adopted to complement cloud computing by offloading CC tasks to the edge of the network. Generally, task offloading is implemented based on the premise that edge servers (ESs) are appropriately quantified and located. However, the quantification of ESs is often decided according to empirical knowledge, lacking analysis of the real conditions of the intelligent transportation system (ITS). To address the above-mentioned problem, a collaborative method for the quantification and placement of ESs, named CQP, is developed for social media services in industrial CIoV. Technically, CQP begins with a population initializing strategy based on Canopy and K-medoids clustering to estimate the approximate ES quantity. Then, nondominated sorting genetic algorithm III is adopted to achieve solutions with higher QoS. Finally, CQP is evaluated with a real-world ITS social media data set from China.
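
The Canopy pass used for population initialization can be sketched in a few lines: it cheaply partitions points using two distance thresholds T1 > T2, and the number of resulting canopies estimates the ES quantity handed to K-medoids. The thresholds and point data below are synthetic, not the paper's.

```python
import random, math

# Canopy clustering sketch to estimate the number of edge servers
# (hypothetical thresholds T1 > T2; the paper then refines with K-medoids
# and NSGA-III).
random.seed(0)
points = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(200)]
T1, T2 = 3.0, 1.5

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

remaining = points[:]
canopies = []
while remaining:
    center = remaining.pop(random.randrange(len(remaining)))
    members = [p for p in remaining if dist(center, p) < T1]   # loose membership
    canopies.append((center, members))
    remaining = [p for p in remaining if dist(center, p) >= T2]  # tight removal

print("estimated number of edge servers (canopies):", len(canopies))
```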

Journal ArticleDOI
TL;DR: A newly designed deep neural network model called A-YONet, constructed by combining the advantages of YOLO and MTCNN, is proposed for deployment in an end–edge–cloud surveillance system, in order to realize lightweight training and feature learning with limited computing resources.
Abstract: Along with the rapid development of cloud computing, IoT, and AI technologies, cloud video surveillance (CVS) has become a hotly discussed topic, especially when facing the requirement of real-time analysis in smart applications. Object detection usually plays an important role in environment monitoring and activity tracking in surveillance systems. The emerging edge-cloud computing paradigm provides an opportunity to deal with the continuously generated, huge amount of surveillance data in an on-site manner across IoT systems. However, the detection performance is still far from satisfactory due to the complex surveillance environment. In this study, we focus on multitarget detection for real-time surveillance in smart IoT systems. A newly designed deep neural network model called A-YONet, constructed by combining the advantages of YOLO and MTCNN, is proposed for deployment in an end–edge–cloud surveillance system, in order to realize lightweight training and feature learning with limited computing resources. An intelligent detection algorithm is then developed based on a pre-adjusting scheme for anchor boxes and a multilevel feature fusion mechanism. Experiments and evaluations using two data sets, including one public data set and one homemade data set obtained in a real surveillance system, demonstrate the effectiveness of our proposed method in enhancing training efficiency and detection precision, especially for multitarget detection in smart IoT application developments.
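
A common way to realize an anchor-box pre-adjusting scheme is YOLO-style k-means over ground-truth box sizes with a 1 − IoU distance; the sketch below shows that recipe on synthetic boxes and is an assumption about, not a transcription of, A-YONet's procedure.

```python
import random

# K-means anchor-box pre-adjustment with a 1-IoU distance (the common
# YOLO-style recipe; box data here is synthetic).
random.seed(0)
boxes = [(random.uniform(10, 120), random.uniform(10, 120)) for _ in range(500)]

def iou(box, anchor):
    """IoU of two co-centered (width, height) rectangles."""
    inter = min(box[0], anchor[0]) * min(box[1], anchor[1])
    return inter / (box[0] * box[1] + anchor[0] * anchor[1] - inter)

K = 5
anchors = random.sample(boxes, K)
for _ in range(20):                          # Lloyd iterations with 1-IoU metric
    groups = [[] for _ in range(K)]
    for b in boxes:                          # assign to highest-IoU anchor
        groups[max(range(K), key=lambda k: iou(b, anchors[k]))].append(b)
    anchors = [
        (sum(b[0] for b in g) / len(g), sum(b[1] for b in g) / len(g)) if g else anchors[k]
        for k, g in enumerate(groups)
    ]

print("pre-adjusted anchors:", [(round(w), round(h)) for w, h in anchors])
```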