Showing papers on "Edge computing" published in 2020


Journal ArticleDOI
29 Jan 2020-Nature
TL;DR: The fabrication of high-yield, high-performance and uniform memristor crossbar arrays for implementing CNNs is reported, and an effective hybrid-training method is proposed to adapt to device imperfections and improve overall system performance.
Abstract: Memristor-enabled neuromorphic computing systems provide a fast and energy-efficient approach to training neural networks [1–4]. However, convolutional neural networks (CNNs)—one of the most important models for image recognition [5]—have not yet been fully hardware-implemented using memristor crossbars, which are cross-point arrays with a memristor device at each intersection. Moreover, achieving software-comparable results is highly challenging owing to the poor yield, large variation and other non-ideal characteristics of devices [6–9]. Here we report the fabrication of high-yield, high-performance and uniform memristor crossbar arrays for the implementation of CNNs, which integrate eight 2,048-cell memristor arrays to improve parallel-computing efficiency. In addition, we propose an effective hybrid-training method to adapt to device imperfections and improve the overall system performance. We built a five-layer memristor-based CNN to perform MNIST [10] image recognition, and achieved a high accuracy of more than 96 per cent. In addition to parallel convolutions using different kernels with shared inputs, replication of multiple identical kernels in memristor arrays was demonstrated for processing different inputs in parallel. The memristor-based CNN neuromorphic system has an energy efficiency more than two orders of magnitude greater than that of state-of-the-art graphics-processing units, and is shown to be scalable to larger networks, such as residual neural networks. Our results are expected to enable a viable memristor-based non-von Neumann hardware solution for deep neural networks and edge computing. A fully hardware-based memristor convolutional neural network using a hybrid training method achieves an energy efficiency more than two orders of magnitude greater than that of graphics-processing units.
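
The computational primitive underlying such a system is the analog vector-matrix multiplication performed in a crossbar, where weights are stored as device conductances and outputs are read as column currents. The sketch below is a minimal NumPy illustration of that mapping with a toy Gaussian device-variation model; the conductance range, noise level and differential-pair encoding are assumptions for illustration, not the paper's fabricated devices or its hybrid-training procedure.

```python
import numpy as np

def program_crossbar(weights, g_min=1e-6, g_max=1e-4, variation=0.05):
    """Map signed weights onto a differential pair of conductance matrices,
    with multiplicative Gaussian noise standing in for device variation.
    Conductance range and noise level are illustrative, not measured values."""
    scale = (g_max - g_min) / np.abs(weights).max()
    g_pos = np.clip(np.maximum(weights, 0) * scale + g_min, g_min, g_max)
    g_neg = np.clip(np.maximum(-weights, 0) * scale + g_min, g_min, g_max)
    noisy = lambda g: g * (1 + variation * np.random.randn(*g.shape))
    return noisy(g_pos), noisy(g_neg), scale

def crossbar_vmm(v_in, g_pos, g_neg, scale):
    """Analog vector-matrix multiply: column currents of the positive and
    negative crossbars are subtracted and rescaled back to weight units."""
    return (v_in @ g_pos - v_in @ g_neg) / scale

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 10))            # one fully connected layer's weights
gp, gn, s = program_crossbar(w)
x = rng.standard_normal((5, 64))             # a batch of input voltages
print("mean error vs. ideal:", np.abs(crossbar_vmm(x, gp, gn, s) - x @ w).mean())
```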

1,033 citations


Journal ArticleDOI
TL;DR: By consolidating information scattered across the communication, networking, and DL areas, this survey can help readers to understand the connections between enabling technologies while promoting further discussions on the fusion of edge intelligence and intelligent edge, i.e., Edge DL.
Abstract: Ubiquitous sensors and smart devices from factories and communities are generating massive amounts of data, and ever-increasing computing power is driving the core of computation and services from the cloud to the edge of the network. As an important enabler broadly changing people’s lives, from face recognition to ambitious smart factories and cities, developments of artificial intelligence (especially deep learning, DL) based applications and services are thriving. However, due to efficiency and latency issues, the current cloud computing service architecture hinders the vision of “providing artificial intelligence for every person and every organization at everywhere”. Thus, unleashing DL services using resources at the network edge near the data sources has emerged as a desirable solution. Therefore, edge intelligence, aiming to facilitate the deployment of DL services by edge computing, has received significant attention. In addition, DL, as the representative technique of artificial intelligence, can be integrated into edge computing frameworks to build intelligent edge for dynamic, adaptive edge maintenance and management. With regard to mutually beneficial edge intelligence and intelligent edge, this paper introduces and discusses: 1) the application scenarios of both; 2) the practical implementation methods and enabling technologies, namely DL training and inference in the customized edge computing framework; 3) challenges and future trends of more pervasive and fine-grained intelligence. We believe that by consolidating information scattered across the communication, networking, and DL areas, this survey can help readers to understand the connections between enabling technologies while promoting further discussions on the fusion of edge intelligence and intelligent edge, i.e., Edge DL.

611 citations


Journal ArticleDOI
TL;DR: This paper presents IoT technology from a bird's-eye view, covering its statistical/architectural trends, use cases, challenges and future prospects, and discusses challenges in the implementation of 5G-IoT due to high data rates requiring both cloud-based platforms and edge computing on IoT devices.
Abstract: The Internet of Things (IoT)-centric concepts like augmented reality, high-resolution video streaming, self-driven cars, smart environment, e-health care, etc. have a ubiquitous presence now. These applications require higher data rates, large bandwidth, increased capacity, low latency and high throughput. In light of these emerging concepts, IoT has revolutionized the world by providing seamless connectivity between heterogeneous networks (HetNets). The eventual aim of IoT is to introduce plug-and-play technology that provides the end user with ease of operation, remote access control and configurability. This paper presents IoT technology from a bird's-eye view, covering its statistical/architectural trends, use cases, challenges and future prospects. The paper also presents a detailed and extensive overview of the emerging 5G-IoT scenario. Fifth Generation (5G) cellular networks provide key enabling technologies for ubiquitous deployment of IoT technology. These include carrier aggregation, multiple-input multiple-output (MIMO), massive-MIMO (M-MIMO), coordinated multipoint processing (CoMP), device-to-device (D2D) communications, centralized radio access network (CRAN), software-defined wireless sensor networking (SD-WSN), network function virtualization (NFV) and cognitive radios (CRs). This paper presents an exhaustive review of these key enabling technologies and also discusses the new emerging use cases of 5G-IoT driven by the advances in artificial intelligence, machine and deep learning, ongoing 5G initiatives, quality of service (QoS) requirements in 5G and its standardization issues. Finally, the paper discusses challenges in the implementation of 5G-IoT due to high data rates, which require both cloud-based platforms and edge computing on IoT devices.

591 citations


Journal ArticleDOI
TL;DR: The Internet of Nano Things and Tactile Internet are driving innovation in H-IoT applications, and the future course for improving the Quality of Service (QoS) using these new technologies is identified.
Abstract: The impact of the Internet of Things (IoT) on the advancement of the healthcare industry is immense. The ushering in of Medicine 4.0 has resulted in increased efforts to develop platforms, both at the hardware level and at the underlying software level. This vision has led to the development of Healthcare IoT (H-IoT) systems. The basic enabling technologies include the communication systems between the sensing nodes and the processors, and the processing algorithms for generating an output from the data collected by the sensors. However, at present, these enabling technologies are also supported by several new technologies. The use of Artificial Intelligence (AI) has transformed H-IoT systems at almost every level. The fog/edge paradigm brings computing power close to the deployed network and hence mitigates many challenges in the process, while big data techniques allow enormous amounts of data to be handled. Additionally, Software Defined Networks (SDNs) bring flexibility to the system, while blockchains are finding novel use cases in H-IoT systems. The Internet of Nano Things (IoNT) and Tactile Internet (TI) are driving innovation in H-IoT applications. This paper delves into the ways these technologies are transforming H-IoT systems and also identifies the future course for improving the Quality of Service (QoS) using these new technologies.

446 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigated the beneficial role of RISs in MEC systems, where single-antenna devices may opt for offloading a fraction of their computational tasks to the edge computing node via a multi-antenna access point with the aid of an RIS.
Abstract: Computation off-loading in mobile edge computing (MEC) systems constitutes an efficient paradigm of supporting resource-intensive applications on mobile devices. However, the benefit of MEC cannot be fully exploited when the communications link used for off-loading computational tasks is hostile. Fortunately, the propagation-induced impairments may be mitigated by intelligent reflecting surfaces (IRS), which are capable of enhancing both the spectral- and energy-efficiency. Specifically, an IRS comprises an IRS controller and a large number of passive reflecting elements, each of which may impose a phase shift on the incident signal, thus collaboratively improving the propagation environment. In this paper, the beneficial role of IRSs is investigated in MEC systems, where single-antenna devices may opt for off-loading a fraction of their computational tasks to the edge computing node via a multi-antenna access point with the aid of an IRS. Pertinent latency-minimization problems are formulated for both single-device and multi-device scenarios, subject to practical constraints imposed on both the edge computing capability and the IRS phase shift design. To solve these problems, the block coordinate descent (BCD) technique is invoked to decouple the original problem into two subproblems, and then the computing and communications settings are alternately optimized using low-complexity iterative algorithms. It is demonstrated that our IRS-aided MEC system is capable of significantly outperforming the conventional MEC system operating without IRSs. Quantitatively, about 20% computational latency reduction is achieved over the conventional MEC system in a single cell with a 300 m radius and 5 active devices, relying on a 5-antenna access point.
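
The alternating structure of such a solution can be illustrated with a toy latency model: fix the offloading fraction and align the IRS phase shifts with the direct link (the single-antenna closed form), then fix the phases and search over the offloading fraction. The sketch below uses made-up channel, task and hardware parameters and a single-antenna access point for simplicity; it is not the paper's multi-antenna formulation or its iterative algorithms.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 32                                    # number of IRS elements (toy value)
h_d = rng.normal() + 1j * rng.normal()                 # direct device->AP channel
h_r = rng.normal(size=N) + 1j * rng.normal(size=N)     # device->IRS channel
g = rng.normal(size=N) + 1j * rng.normal(size=N)       # IRS->AP channel
L_bits, cpb = 1e6, 1000                   # task size (bits), CPU cycles per bit
f_dev, f_edge = 1e9, 1e10                 # device / edge CPU frequency (Hz)
B, p, n0 = 1e6, 0.1, 1e-9                 # bandwidth (Hz), tx power (W), noise power (W)

def rate(theta):
    eff = h_d + np.sum(h_r * np.exp(1j * theta) * g)   # effective cascaded channel
    return B * np.log2(1 + p * abs(eff) ** 2 / n0)

def latency(ell, theta):
    t_local = (1 - ell) * L_bits * cpb / f_dev
    t_off = ell * L_bits / rate(theta) + ell * L_bits * cpb / f_edge
    return max(t_local, t_off)

ell, theta = 0.5, np.zeros(N)
for _ in range(10):                       # block coordinate descent
    # Block 1: IRS phases -- align every cascaded path with the direct path.
    theta = np.angle(h_d) - np.angle(h_r * g)
    # Block 2: offloading fraction -- simple 1-D grid search over [0, 1].
    grid = np.linspace(0, 1, 1001)
    ell = grid[np.argmin([latency(x, theta) for x in grid])]
print(f"latency = {latency(ell, theta) * 1e3:.1f} ms at offloading fraction {ell:.3f}")
```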

403 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide a comprehensive overview of mobile edge computing (MEC) and its potential use cases and applications, and discuss challenges and potential future directions for MEC research.
Abstract: Driven by the emergence of new compute-intensive applications and the vision of the Internet of Things (IoT), it is foreseen that the emerging 5G network will face an unprecedented increase in traffic volume and computation demands. However, end users mostly have limited storage capacities and finite processing capabilities, so how to run compute-intensive applications on resource-constrained devices has recently become a natural concern. Mobile edge computing (MEC), a key technology in the emerging fifth generation (5G) network, can optimize mobile resources by hosting compute-intensive applications, process large data before sending it to the cloud, provide cloud-computing capabilities within the radio access network (RAN) in close proximity to mobile users, and offer context-aware services with the help of RAN information. Therefore, MEC enables a wide variety of applications where real-time response is strictly required, e.g., driverless vehicles, augmented reality, robotics, and immersive media. Indeed, the paradigm shift from 4G to 5G could become a reality with the advent of new technological concepts. The successful realization of MEC in the 5G network is still in its infancy and demands constant effort from both the academic and industry communities. In this survey, we first provide a holistic overview of MEC technology and its potential use cases and applications. Then, we outline up-to-date research on the integration of MEC with the new technologies that will be deployed in 5G and beyond. We also summarize testbeds, experimental evaluations, and open-source activities for edge computing. We further summarize lessons learned from state-of-the-art research works, and discuss challenges and potential future directions for MEC research.

402 citations


Journal ArticleDOI
TL;DR: A novel framework called HealthFog is proposed for integrating ensemble deep learning in edge computing devices, and it is deployed for a real-life application of automatic heart disease analysis.

387 citations


Journal ArticleDOI
TL;DR: In this paper, the authors divide edge intelligence into AI for edge (intelligence-enabled edge computing) and AI on edge (artificial intelligence on edge), and provide insights into this new interdisciplinary field from a broader perspective.
Abstract: Along with the rapid developments in communication technologies and the surge in the use of mobile devices, a brand-new computation paradigm, edge computing, is surging in popularity. Meanwhile, artificial intelligence (AI) applications are thriving with the breakthroughs in deep learning and the many improvements in hardware architectures. Billions of data bytes, generated at the network edge, put massive demands on data processing and structural optimization. Thus, there exists a strong demand to integrate edge computing and AI, which gives birth to edge intelligence. In this article, we divide edge intelligence into AI for edge (intelligence-enabled edge computing) and AI on edge (artificial intelligence on edge). The former focuses on providing more optimal solutions to key problems in edge computing with the help of popular and effective AI technologies, while the latter studies how to carry out the entire process of building AI models, i.e., model training and inference, on the edge. This article provides insights into this new interdisciplinary field from a broader perspective. It discusses the core concepts and the research roadmap, which should provide the necessary background for potential future research initiatives in edge intelligence.

343 citations


Journal ArticleDOI
TL;DR: In this article, the authors propose Edgent, a framework that leverages edge computing for DNN collaborative inference through device-edge synergy, which adaptively partitions computation between device and edge for the purpose of coordinating the powerful cloud resource and the proximal edge resource for real-time DNN inference.
Abstract: As a key technology for enabling Artificial Intelligence (AI) applications in the 5G era, Deep Neural Networks (DNNs) have quickly attracted widespread attention. However, it is challenging to run computation-intensive DNN-based tasks on mobile devices due to their limited computation resources. What's worse, traditional cloud-assisted DNN inference is heavily hindered by significant wide-area network latency, leading to poor real-time performance and low quality of user experience. To address these challenges, in this paper, we propose Edgent, a framework that leverages edge computing for DNN collaborative inference through device-edge synergy. Edgent exploits two design knobs: (1) DNN partitioning, which adaptively partitions computation between device and edge for the purpose of coordinating the powerful cloud resource and the proximal edge resource for real-time DNN inference; (2) DNN right-sizing, which further reduces computing latency via early exit of inference at an appropriate intermediate DNN layer. In addition, considering the potential network fluctuation in real-world deployment, Edgent is properly designed to specialize for both static and dynamic network environments. Specifically, in a static environment where the bandwidth changes slowly, Edgent derives the best configurations with the assistance of regression-based prediction models, while in a dynamic environment where the bandwidth varies dramatically, Edgent generates the best execution plan through an online change point detection algorithm that maps the current bandwidth state to the optimal configuration. We implement the Edgent prototype on a Raspberry Pi and a desktop PC, and extensive experimental evaluations demonstrate Edgent's effectiveness in enabling on-demand low-latency edge intelligence.
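
The partitioning knob can be summarized as a one-dimensional search over candidate split points, scoring each by on-device compute time, the cost of shipping the intermediate activation over the current bandwidth, and edge compute time. The sketch below uses made-up per-layer profiles and tensor sizes; Edgent additionally predicts the per-layer latencies with regression models and combines the search with early exits, which are not shown.

```python
def best_partition(device_ms, edge_ms, sizes_bytes, bandwidth_bps):
    """Pick the DNN split point k: layers 1..k run on the device, layers
    k+1..L run on the edge. sizes_bytes[k] is the size of the tensor that
    crosses the link when splitting after layer k (sizes_bytes[0] is the raw
    input, i.e. full offloading). Returns (k, estimated latency in ms)."""
    L = len(device_ms)
    best_k, best_t = 0, float("inf")
    for k in range(L + 1):
        t = (sum(device_ms[:k])                          # on-device layers
             + sizes_bytes[k] * 8 / bandwidth_bps * 1e3  # activation transfer
             + sum(edge_ms[k:]))                         # edge layers
        if t < best_t:
            best_k, best_t = k, t
    return best_k, best_t

# Illustrative 5-layer latency profiles (ms) and tensor sizes (bytes), 10 Mbit/s link.
device_profile = [40, 35, 30, 25, 5]
edge_profile = [4, 3, 3, 2, 1]
tensor_sizes = [600_000, 800_000, 400_000, 100_000, 20_000, 4_000]
print(best_partition(device_profile, edge_profile, tensor_sizes, 10e6))
```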

329 citations


Journal ArticleDOI
TL;DR: This work sets out to be a guide to the status of DT development and application in today’s academic and industrial environment by selecting 123 representative items together with 22 supplementary works to address those two perspectives, while considering technical aspects as a foundation.
Abstract: With the rapid advancement of cyber-physical systems, Digital Twin (DT) is gaining ever-increasing attention owing to its great capabilities to realize Industry 4.0. Enterprises from different fields are taking advantage of its ability to simulate real-time working conditions and perform intelligent decision-making, where a cost-effective solution can be readily delivered to meet individual stakeholder demands. As a hot topic, many approaches have been designed and implemented to date. However, a comprehensive review that examines DT benefits by considering both engineering product lifecycle management and business innovation as a whole is still lacking. To fill this gap, this work conducts a state-of-the-art survey of DT, selecting 123 representative items together with 22 supplementary works to address those two perspectives, while considering technical aspects as a foundation. The systematic review further identifies eight future perspectives for DT, including modular DT, modeling consistency and accuracy, incorporation of Big Data analytics in DT models, DT simulation improvements, VR integration into DT, expansion of DT domains, efficient mapping of cyber-physical data, and cloud/edge computing integration. This work sets out to be a guide to the status of DT development and application in today’s academic and industrial environment.

310 citations


Journal ArticleDOI
TL;DR: Some typical application scenarios of edge computing in IIoT, such as prognostics and health management, smart grids, manufacturing coordination, intelligent connected vehicles (ICV), and smart logistics, are introduced.
Abstract: The Industrial Internet of Things (IIoT) is a crucial research field spawned by the Internet of Things (IoT). IIoT links all types of industrial equipment through the network; establishes data acquisition, exchange, and analysis systems; and optimizes processes and services, so as to reduce cost and enhance productivity. The introduction of edge computing in IIoT can significantly reduce the decision-making latency, save bandwidth resources, and to some extent, protect privacy. This paper outlines the research progress concerning edge computing in IIoT. First, the concepts of IIoT and edge computing are discussed, and subsequently, the research progress of edge computing is discussed and summarized in detail. Next, the future architecture from the perspective of edge computing in IIoT is proposed, and its technical progress in routing, task scheduling, data storage and analytics, security, and standardization is analyzed. Furthermore, we discuss the opportunities and challenges of edge computing in IIoT in terms of 5G-based edge communication, load balancing and data offloading, edge intelligence, as well as data sharing security. Finally, we introduce some typical application scenarios of edge computing in IIoT, such as prognostics and health management (PHM), smart grids, manufacturing coordination, intelligent connected vehicles (ICV), and smart logistics.

Journal ArticleDOI
TL;DR: This article analyzes the main features of MEC in the context of 5G and IoT and presents several fundamental key technologies which enable MEC to be applied in 5Gs and IoT, such as cloud computing, software-defined networking/network function virtualization, information-centric networks, virtual machine (VM) and containers, smart devices, network slicing, and computation offloading.
Abstract: To satisfy the increasing demand for mobile data traffic and meet the stringent requirements of emerging Internet-of-Things (IoT) applications such as smart city, healthcare, and augmented/virtual reality (AR/VR), fifth-generation (5G) enabling technologies are proposed and utilized in networks. As an emerging key technology of 5G and a key enabler of IoT, multiaccess edge computing (MEC), which integrates telecommunication and IT services, offers cloud computing capabilities at the edge of the radio access network (RAN). By providing computational and storage resources at the edge, MEC can reduce latency for end users. Hence, this article investigates MEC for 5G and IoT comprehensively. It analyzes the main features of MEC in the context of 5G and IoT and presents several fundamental key technologies which enable MEC to be applied in 5G and IoT, such as cloud computing, software-defined networking/network function virtualization, information-centric networks, virtual machines (VMs) and containers, smart devices, network slicing, and computation offloading. In addition, this article provides an overview of the role of MEC in 5G and IoT, shedding light on the different MEC-enabled 5G and IoT applications as well as the promising future directions of integrating MEC with 5G and IoT. Moreover, this article further elaborates research challenges and open issues of MEC for 5G and IoT. Last but not least, we propose a use case that utilizes MEC to achieve edge intelligence in IoT scenarios.

Journal ArticleDOI
01 Feb 2020
TL;DR: Key design issues, methodologies, and hardware platforms are introduced, and typical use cases for intelligent vehicles are illustrated, including edge-assisted perception, mapping, and localization.
Abstract: The Internet of Vehicles (IoV) is an emerging paradigm that is driven by recent advancements in vehicular communications and networking. Meanwhile, the capability and intelligence of vehicles are being rapidly enhanced, and this will have the potential of supporting a plethora of new exciting applications that will integrate fully autonomous vehicles, the Internet of Things (IoT), and the environment. These trends will bring about an era of intelligent IoV, which will heavily depend on communications, computing, and data analytics technologies. To store and process the massive amount of data generated by intelligent IoV, onboard processing and cloud computing will not be sufficient due to resource/power constraints and communication overhead/latency, respectively. By deploying storage and computing resources at the wireless network edge, e.g., radio access points, the edge information system (EIS), including edge caching, edge computing, and edge AI, will play a key role in the future intelligent IoV. EIS will provide not only low-latency content delivery and computation services but also localized data acquisition, aggregation, and processing. This article surveys the latest development in EIS for intelligent IoV. Key design issues, methodologies, and hardware platforms are introduced. In particular, typical use cases for intelligent vehicles are illustrated, including edge-assisted perception, mapping, and localization. In addition, various open-research problems are identified.

Journal ArticleDOI
TL;DR: This paper summarizes the concept of edge computing and compares it with cloud computing, then reviews the architecture of edge computing, its key technologies, security and privacy protection, and its applications.
Abstract: With the rapid development of the Internet of Everything (IoE), the number of smart devices connected to the Internet is increasing, resulting in large-scale data, which has caused problems such as bandwidth load, slow response speed, poor security, and poor privacy in traditional cloud computing models. Traditional cloud computing is no longer sufficient to support the diverse data-processing needs of today's intelligent society, so edge computing technologies have emerged. Edge computing is a new computing paradigm for performing calculations at the edge of the network. Unlike cloud computing, it emphasizes processing closer to the user and closer to the source of the data, where it performs lightweight, local, small-scale data storage and processing. This article mainly reviews the related research and results of edge computing. First, it summarizes the concept of edge computing and compares it with cloud computing. It then summarizes the architecture of edge computing, its key technologies, and security and privacy protection, and finally surveys the applications of edge computing.

Journal ArticleDOI
TL;DR: This work proposes adapting FedAvg to use a distributed form of Adam optimization, greatly reducing the number of rounds to convergence, along with novel compression techniques, to produce communication-efficient FedAvg (CE-FedAvg), which can converge to a target accuracy and is more robust to aggressive compression.
Abstract: The rapidly expanding number of Internet of Things (IoT) devices is generating huge quantities of data, but public concern over data privacy means users are apprehensive to send data to a central server for machine learning (ML) purposes. The easily changed behaviors of edge infrastructure that software-defined networking (SDN) provides make it possible to collate IoT data at edge servers and gateways, where federated learning (FL) can be performed: building a central model without uploading data to the server. FedAvg is an FL algorithm that has been the subject of much study; however, it suffers from a large number of rounds to convergence with non-independent identically distributed (non-IID) client data sets and high communication costs per round. We propose adapting FedAvg to use a distributed form of Adam optimization, greatly reducing the number of rounds to convergence, along with novel compression techniques, to produce communication-efficient FedAvg (CE-FedAvg). We perform extensive experiments with the MNIST/CIFAR-10 data sets, IID/non-IID client data, varying numbers of clients, client participation rates, and compression rates. These show that CE-FedAvg can converge to a target accuracy in up to 6× fewer rounds than similarly compressed FedAvg, while uploading up to 3× less data, and is more robust to aggressive compression. Experiments on an edge-computing-like testbed using Raspberry Pi clients also show that CE-FedAvg is able to reach a target accuracy in up to 1.7× less real time than FedAvg.
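
The core idea of replacing the server's plain averaging step with an adaptive (Adam-style) update can be sketched in a few lines. This is a minimal illustration under assumed hyperparameters and synthetic client deltas; CE-FedAvg itself also compresses the uploads and maintains the optimizer moments in a distributed fashion, which are omitted here.

```python
import numpy as np

def server_adam_round(global_w, client_updates, state, lr=0.01,
                      b1=0.9, b2=0.999, eps=1e-8):
    """One communication round: average the client deltas (FedAvg-style),
    then apply an Adam-style step on the server. A sketch of the idea behind
    CE-FedAvg, not the paper's exact algorithm."""
    delta = np.mean(client_updates, axis=0)          # aggregated model delta
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * delta
    state["v"] = b2 * state["v"] + (1 - b2) * delta ** 2
    m_hat = state["m"] / (1 - b1 ** state["t"])
    v_hat = state["v"] / (1 - b2 ** state["t"])
    return global_w + lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy usage: 5 clients each send a model delta (possibly sparsified in practice).
w = np.zeros(10)
state = {"m": np.zeros_like(w), "v": np.zeros_like(w), "t": 0}
for rnd in range(3):
    updates = [np.random.default_rng(rnd * 10 + c).normal(0.1, 0.05, 10)
               for c in range(5)]
    w = server_adam_round(w, updates, state)
print(w.round(3))
```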

Journal ArticleDOI
TL;DR: This article incorporates local differential privacy into federated learning for protecting the privacy of updated local models and proposes a random distributed update scheme to get rid of the security threats led by a centralized curator.
Abstract: Driven by technologies such as mobile edge computing and 5G, recent years have witnessed the rapid development of urban informatics, where a large amount of data is generated. To cope with the growing data, artificial intelligence algorithms have been widely exploited. Federated learning is a promising paradigm for distributed edge computing, which enables edge nodes to train models locally without transmitting their data to a server. However, the security and privacy concerns of federated learning hinder its wide deployment in urban applications such as vehicular networks. In this article, we propose a differentially private asynchronous federated learning scheme for resource sharing in vehicular networks. To build a secure and robust federated learning scheme, we incorporate local differential privacy into federated learning to protect the privacy of updated local models. We further propose a random distributed update scheme to remove the security threats posed by a centralized curator. Moreover, we perform convergence boosting in our proposed scheme via update verification and weighted aggregation. We evaluate our scheme on three real-world datasets. Numerical results show the high accuracy and efficiency of our proposed scheme while preserving data privacy.
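
A generic way to realize the "local differential privacy on model updates" ingredient is to clip each local update and perturb it with calibrated noise before it leaves the node, then aggregate the noisy updates with weights on the server. The sketch below uses Gaussian noise and illustrative clipping/noise parameters; the paper's exact mechanism, update verification, and asynchronous scheduling are not reproduced.

```python
import numpy as np

def privatize_update(delta, clip_norm=1.0, sigma=0.5, rng=None):
    """Clip a local model update to a fixed L2 norm and add Gaussian noise
    before it leaves the vehicle/edge node. A generic local-DP sketch with
    illustrative parameters, not the exact mechanism from the paper."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(delta)
    delta = delta * min(1.0, clip_norm / (norm + 1e-12))   # bound sensitivity
    return delta + rng.normal(0.0, sigma * clip_norm, size=delta.shape)

def weighted_aggregate(updates, weights):
    """Server-side weighted aggregation of already-privatized updates."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return np.tensordot(weights, np.stack(updates), axes=1)

rng = np.random.default_rng(0)
noisy = [privatize_update(rng.normal(0, 0.1, 20), rng=rng) for _ in range(4)]
print(weighted_aggregate(noisy, [1, 2, 1, 1]).shape)
```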

Journal ArticleDOI
TL;DR: This work proposes a new IoT layered model, generic and stretched, with identification of the privacy and security components and layers, and implements security certificates to allow data transfer between the layers of the proposed cloud/edge-enabled IoT model.
Abstract: Privacy and security are among the significant challenges of the Internet of Things (IoT). Improper device updates, lack of efficient and robust security protocols, user unawareness, and famous active device monitoring are among the challenges that IoT is facing. In this work, we explore the background of IoT systems and security measures, and identify (a) different security and privacy issues, (b) approaches used to secure the components of IoT-based environments and systems, (c) existing security solutions, and (d) the best privacy models necessary and suitable for different layers of IoT-driven applications. We propose a new IoT layered model, generic and stretched, with identification of the privacy and security components and layers. The proposed cloud/edge-supported IoT system is implemented and evaluated. The lower layer is represented by IoT nodes generated from Amazon Web Services (AWS) as virtual machines. The middle layer (edge) is implemented as a Raspberry Pi 4 hardware kit with support of the Greengrass edge environment in AWS. We used the cloud-enabled IoT environment in AWS to implement the top layer (the cloud). The security protocols and critical management sessions were applied between each of these layers to ensure the privacy of the users' information. We implemented security certificates to allow data transfer between the layers of the proposed cloud/edge-enabled IoT model. Not only does the proposed system model eliminate possible security vulnerabilities, but it can also be used along with the best security techniques to counter the cybersecurity threats facing each of the layers: cloud, edge, and IoT.

Journal ArticleDOI
TL;DR: A state-of-the-art survey on the integration of blockchain with 5G networks and beyond, with discussions on the potential of blockchain for enabling key 5G technologies, including cloud/edge computing, Software Defined Networks, Network Function Virtualization, Network Slicing, and D2D communications.

Journal ArticleDOI
TL;DR: This work proposes EUAGame, a game-theoretic approach that formulates the EUA problem as a potential game and designs a novel decentralized algorithm for finding a Nash equilibrium in the game as a solution to theEUA problem.
Abstract: Edge Computing provides mobile and Internet-of-Things (IoT) app vendors with a new distributed computing paradigm which allows an app vendor to deploy its app at hired edge servers distributed near app users at the edge of the cloud. This way, app users can be allocated to hired edge servers nearby to minimize network latency and energy consumption. A cost-effective edge user allocation (EUA) requires maximum app users to be served with minimum overall system cost. Finding a centralized optimal solution to this EUA problem is NP-hard. Thus, we propose EUAGame, a game-theoretic approach that formulates the EUA problem as a potential game. We analyze the game and show that it admits a Nash equilibrium. Then, we design a novel decentralized algorithm for finding a Nash equilibrium in the game as a solution to the EUA problem. The performance of this algorithm is theoretically analyzed and experimentally evaluated. The results show that the EUA problem can be solved effectively and efficiently.
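
The decentralized flavour of a potential-game solution can be illustrated with best-response dynamics: each user repeatedly switches to the server that minimizes its own cost, and the process stops when nobody can improve, i.e., at a Nash equilibrium. The cost model below (latency plus a simple congestion term, with server capacities) is a toy stand-in, not the paper's EUAGame cost function or its algorithm.

```python
import numpy as np

def best_response_allocation(latency, capacity, max_iters=100):
    """latency[u, s]: network latency from user u to edge server s;
    capacity[s]: number of users server s can host. Each user repeatedly
    switches to the feasible server minimizing a toy cost (latency plus the
    number of other users on the server) until no user can improve."""
    U, S = latency.shape
    assign = np.full(U, -1)              # -1 = not yet allocated
    load = np.zeros(S, dtype=int)
    for _ in range(max_iters):
        changed = False
        for u in range(U):
            cur = assign[u]
            costs = latency[u] + load    # congestion-aware cost of each server
            feasible = load < capacity
            if cur >= 0:
                costs[cur] = latency[u, cur] + (load[cur] - 1)
                feasible[cur] = True
            costs = np.where(feasible, costs, np.inf)
            best = int(np.argmin(costs))
            if best != cur and costs[best] < np.inf:
                if cur >= 0:
                    load[cur] -= 1
                assign[u], load[best] = best, load[best] + 1
                changed = True
        if not changed:                  # no profitable deviation: equilibrium
            break
    return assign

lat = np.random.default_rng(2).uniform(1, 10, size=(8, 3))
print(best_response_allocation(lat, capacity=np.array([3, 3, 3])))
```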

Journal ArticleDOI
TL;DR: A blockchain-enabled computation offloading method, named BeCome, is proposed in this article, whereby Blockchain technology is employed in edge computing to ensure data integrity and simple additive weighting and multicriteria decision making are utilized to identify the optimal offloading strategy.
Abstract: Benefiting from the real-time processing ability of edge computing, computing tasks requested by smart devices in the Internet of Things are offloaded to edge computing devices (ECDs) for implementation. However, ECDs are often overloaded or underloaded with disproportionate resource requests. In addition, during the process of task offloading, the transmitted information is vulnerable, which can result in data incompleteness. In view of this challenge, a blockchain-enabled computation offloading method, named BeCome, is proposed in this article. Blockchain technology is employed in edge computing to ensure data integrity. Then, the nondominated sorting genetic algorithm III is adopted to generate strategies for balanced resource allocation. Furthermore, simple additive weighting and multicriteria decision making are utilized to identify the optimal offloading strategy. Finally, performance evaluations of BeCome are given through simulation experiments.
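
The final selection step, picking one strategy from the candidates via simple additive weighting, is straightforward to sketch: normalize each criterion, take a weighted sum, and select the top score. The criteria, weights, and values below are illustrative only; the NSGA-III stage that generates the candidate strategies is not shown.

```python
import numpy as np

def saw_rank(candidates, weights, benefit_mask):
    """Simple Additive Weighting over candidate offloading strategies.
    candidates[i, j] is criterion j of candidate i (e.g. latency, energy,
    load balance); benefit_mask[j] is True when larger is better.
    Criteria and weights here are illustrative, not the paper's exact ones."""
    c = np.asarray(candidates, dtype=float)
    lo, hi = c.min(axis=0), c.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)
    norm = np.where(benefit_mask, (c - lo) / span, (hi - c) / span)  # normalize per criterion
    scores = norm @ np.asarray(weights, dtype=float)
    return scores, int(np.argmax(scores))

# Three candidate strategies scored on [latency (s), energy (J), load balance].
cands = [[0.8, 2.1, 0.7],
         [1.1, 1.6, 0.9],
         [0.9, 1.9, 0.8]]
scores, best = saw_rank(cands, weights=[0.4, 0.4, 0.2],
                        benefit_mask=np.array([False, False, True]))
print(scores.round(3), "-> pick strategy", best)
```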

Journal ArticleDOI
TL;DR: A B5G framework is proposed that utilizes the 5G network's low-latency, high-bandwidth functionality to detect COVID-19 using chest X-ray or CT scan images, and to develop a mass surveillance system to monitor social distancing, mask wearing, and body temperature.
Abstract: Tactile edge technology that focuses on 5G or beyond 5G reveals an exciting approach to control infectious diseases such as COVID-19 internationally. The control of epidemics such as COVID-19 can be managed effectively by exploiting edge computation through the 5G wireless connectivity network. The implementation of a hierarchical edge computing system provides many advantages, such as low latency, scalability, and the protection of application and training model data, enabling COVID-19 to be evaluated by a dependable local edge server. In addition, many deep learning (DL) algorithms suffer from two crucial disadvantages: first, training requires a large COVID-19 dataset consisting of various aspects, which will pose challenges for local councils; second, to acknowledge the outcome, the findings of deep learning require ethical acceptance and clarification by the health care sector, as well as other contributors. In this article, we propose a B5G framework that utilizes the 5G network's low-latency, high-bandwidth functionality to detect COVID-19 using chest X-ray or CT scan images, and to develop a mass surveillance system to monitor social distancing, mask wearing, and body temperature. Three DL models, ResNet50, Deep tree, and Inception v3, are investigated in the proposed framework. Furthermore, blockchain technology is also used to ensure the security of healthcare data.

Journal ArticleDOI
TL;DR: A distributed and trusted authentication system based on blockchain and edge computing is proposed to improve authentication efficiency, together with an edge-computing-based caching strategy to improve hit ratio.
Abstract: With the great prevalence of various Internet of Things (IoT) terminals, how to solve the problem of isolated information among different IoT platforms is attracting attention from both academia and industry. It is necessary to establish a trusted access system to achieve secure authentication and collaborative sharing. Therefore, this article proposes a distributed and trusted authentication system based on blockchain and edge computing, aiming to improve authentication efficiency. This system consists of a physical network layer, a blockchain edge layer and a blockchain network layer. Through the blockchain network, an optimized practical Byzantine fault tolerance consensus algorithm is designed to construct a consortium blockchain for storing authentication data and logs. It guarantees trusted authentication and achieves activity traceability of terminals. Furthermore, edge computing is applied in blockchain edge nodes to provide name resolution and edge authentication services based on smart contracts. Meanwhile, an asymmetric cryptographic scheme is designed to prevent the connections between nodes and terminals from being attacked. A caching strategy based on edge computing is also proposed to improve the hit ratio. Our proposed authentication mechanism is evaluated with respect to communication and computation costs. Simulation results show that the caching strategy outperforms existing edge computing strategies by 6%–12% in terms of average delay, and 8%–14% in hit ratio.

Journal ArticleDOI
TL;DR: This article proposes a learning-based channel selection framework with service reliability awareness, energy awareness, backlog awareness, and conflict awareness, by leveraging the combined power of machine learning, Lyapunov optimization, and matching theory, and proves that the proposed framework can achieve guaranteed performance.
Abstract: Edge computing provides a promising paradigm to support the implementation of Industrial Internet of Things (IIoT) by offloading computational-intensive tasks from resource-limited machine-type devices (MTDs) to powerful edge servers. However, the performance gain of edge computing may be severely compromised due to limited spectrum resources, capacity-constrained batteries, and context unawareness. In this article, we consider the optimization of channel selection that is critical for efficient and reliable task delivery. We aim at maximizing the long-term throughput subject to long-term constraints of energy budget and service reliability. We propose a learning-based channel selection framework with service reliability awareness, energy awareness, backlog awareness, and conflict awareness, by leveraging the combined power of machine learning, Lyapunov optimization, and matching theory. We provide rigorous theoretical analysis, and prove that the proposed framework can achieve guaranteed performance with a bounded deviation from the optimal performance with global state information (GSI) based on only local and causal information. Finally, simulations are conducted under both single-MTD and multi-MTD scenarios to verify the effectiveness and reliability of the proposed framework.
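
The general shape of a Lyapunov drift-plus-penalty channel selector is easy to sketch: keep a virtual queue per long-term constraint (energy budget, reliability), and in every slot pick the channel that maximizes throughput minus queue-weighted costs. The model below is a toy with made-up rates, energies, and failure probabilities; it is not the paper's framework, which additionally handles context unawareness through learning and matching.

```python
import numpy as np

def pick_channel(rates, energies, fail_probs, Z_energy, Z_rel, V=10.0):
    """Per-slot drift-plus-penalty choice: maximize V * throughput minus the
    virtual-queue-weighted energy and reliability costs. A generic Lyapunov
    sketch with a toy model, not the paper's exact framework."""
    score = V * rates - Z_energy * energies - Z_rel * fail_probs
    return int(np.argmax(score))

rng = np.random.default_rng(3)
Z_e, Z_r = 0.0, 0.0                       # virtual queues start empty
e_budget, rel_target = 0.5, 0.1           # per-slot energy budget, failure target
for t in range(100):
    rates = rng.uniform(1, 5, size=4)         # observed per-channel rates
    energies = rng.uniform(0.2, 1.0, size=4)  # per-channel energy cost
    fails = rng.uniform(0.0, 0.3, size=4)     # per-channel failure probability
    k = pick_channel(rates, energies, fails, Z_e, Z_r)
    # Virtual queue updates push the long-term averages toward the constraints.
    Z_e = max(Z_e + energies[k] - e_budget, 0.0)
    Z_r = max(Z_r + fails[k] - rel_target, 0.0)
print("final virtual queues:", round(Z_e, 2), round(Z_r, 2))
```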

Journal ArticleDOI
TL;DR: Simulation results demonstrate that the proposed cooperative caching system can reduce the system cost, as well as the content delivery latency, and improve content hit ratio, as compared to the noncooperative and random edge caching schemes.
Abstract: In this article, we propose a cooperative edge caching scheme, a new paradigm to jointly optimize the content placement and content delivery in the vehicular edge computing and networks, with the aid of the flexible trilateral cooperations among a macro-cell station, roadside units, and smart vehicles. We formulate the joint optimization problem as a double time-scale Markov decision process (DTS-MDP), based on the fact that the time-scale of content timeliness changes less frequently as compared to the vehicle mobility and network states during the content delivery process. At the beginning of the large time-scale, the content placement/updating decision can be obtained according to the content popularity, vehicle driving paths, and resource availability. On the small time-scale, the joint vehicle scheduling and bandwidth allocation scheme is designed to minimize the content access cost while satisfying the constraint on content delivery latency. To solve the long-term mixed integer linear programming (LT-MILP) problem, we propose a nature-inspired method based on the deep deterministic policy gradient (DDPG) framework to obtain a suboptimal solution with a low computation complexity. The simulation results demonstrate that the proposed cooperative caching system can reduce the system cost, as well as the content delivery latency, and improve content hit ratio, as compared to the noncooperative and random edge caching schemes.

Journal ArticleDOI
TL;DR: In this paper, the problem of joint computing, caching, communication, and control (4C) in big data MEC is formulated as an optimization problem whose goal is to jointly optimize a linear combination of the bandwidth consumption and network latency.
Abstract: The concept of Multi-access Edge Computing (MEC) has been recently introduced to supplement cloud computing by deploying MEC servers to the network edge so as to reduce the network delay and alleviate the load on cloud data centers. However, compared to the resourceful cloud, an MEC server has limited resources. When each MEC server operates independently, it cannot handle all computational and big data demands stemming from users' devices. Consequently, the MEC server cannot provide significant gains in overhead reduction of data exchange between users' devices and the remote cloud. Therefore, joint Computing, Caching, Communication, and Control (4C) at the edge with MEC server collaboration is needed. To address these challenges, in this paper, the problem of joint 4C in big data MEC is formulated as an optimization problem whose goal is to jointly optimize a linear combination of the bandwidth consumption and network latency. However, the formulated problem is shown to be non-convex. As a result, a proximal upper bound problem of the original formulated problem is proposed. To solve the proximal upper bound problem, the block successive upper bound minimization method is applied. Simulation results show that the proposed approach satisfies computation deadlines and minimizes bandwidth consumption and network latency.

Journal ArticleDOI
TL;DR: Passban is presented, an intelligent intrusion detection system (IDS) able to protect the IoT devices that are directly connected to it that can be deployed directly on very cheap IoT gateways, taking full advantage of the edge computing paradigm to detect cyber threats as close as possible to the corresponding data sources.
Abstract: Cyber-threat protection is one of today's most challenging research branches of information technology, while the exponentially increasing number of tiny, connected devices able to push personal data to the Internet is doing nothing but exacerbating the battle between the involved parties. Thus, this protection becomes crucial with a typical Internet-of-Things (IoT) setup, as it usually involves several IoT-based data sources interacting with the physical world within various application domains, such as agriculture, health care, home automation, critical industrial processes, etc. Unfortunately, contemporary IoT devices often offer very limited security features, laying themselves open to ever new and more sophisticated attacks and also inhibiting the expected global adoption of IoT technologies, not to mention the millions of IoT devices already deployed without any hardware security support. In this context, it is crucial to develop tools able to detect such cyber threats. In this article, we present Passban, an intelligent intrusion detection system (IDS) able to protect the IoT devices that are directly connected to it. The peculiarity of the proposed solution is that it can be deployed directly on very cheap IoT gateways (e.g., single-board PCs currently costing a few tens of U.S. dollars), hence taking full advantage of the edge computing paradigm to detect cyber threats as close as possible to the corresponding data sources. We demonstrate that Passban is able to detect various types of malicious traffic, including Port Scanning, HTTP and SSH Brute Force, and SYN Flood attacks with very low false positive rates and satisfactory accuracies.

Journal ArticleDOI
TL;DR: This paper provides a review of neuromorphic CMOS-memristive architectures that can be integrated into edge computing devices, discusses why neuromorphic architectures are useful for edge devices, and shows the advantages, drawbacks, and open problems in the field of neuromemristive circuits for edge computing.
Abstract: The volume, veracity, variability, and velocity of data produced from the ever increasing network of sensors connected to Internet pose challenges for power management, scalability, and sustainability of cloud computing infrastructure. Increasing the data processing capability of edge computing devices at lower power requirements can reduce several overheads for cloud computing solutions. This paper provides the review of neuromorphic CMOS-memristive architectures that can be integrated into edge computing devices. We discuss why the neuromorphic architectures are useful for edge devices and show the advantages, drawbacks, and open problems in the field of neuromemristive circuits for edge computing.

Journal ArticleDOI
TL;DR: A multi-UAV-aided mobile-edge computing (MEC) system is constructed, where multiple UAVs act as MEC nodes in order to provide computing offloading services for ground IoT nodes which have limited local computing capabilities.
Abstract: Unmanned aerial vehicles (UAVs) have been widely used to provide enhanced information coverage as well as relay services for ground Internet-of-Things (IoT) networks. Considering their substantially limited processing capability, IoT devices may not be able to tackle heavy computing tasks. In this article, a multi-UAV-aided mobile-edge computing (MEC) system is constructed, where multiple UAVs act as MEC nodes in order to provide computing offloading services for ground IoT nodes which have limited local computing capabilities. For the sake of balancing the load across UAVs, a differential evolution (DE)-based multi-UAV deployment mechanism is proposed, where we model the access problem as a generalized assignment problem (GAP), which is then solved by a near-optimal solution algorithm. Based on this, we are capable of achieving load balance among these drones while guaranteeing the coverage constraint and satisfying the quality of service (QoS) of IoT nodes. Furthermore, a deep reinforcement learning (DRL) algorithm is conceived for task scheduling within a single UAV, which improves the efficiency of task execution in each UAV. Finally, simulation results show the feasibility and superiority of our proposed load-balance-oriented UAV deployment scheme as well as the task scheduling algorithm.
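
The access/load-balancing step can be viewed as a generalized assignment problem: each IoT node must be served by exactly one covering UAV without exceeding UAV capacity. The sketch below solves it with a simple greedy heuristic (heaviest node first, assigned to the covering UAV with the most spare capacity) under made-up demands and coverage; the paper's near-optimal GAP algorithm and the DRL task scheduler are not reproduced.

```python
import numpy as np

def greedy_uav_assignment(demand, capacity, coverage):
    """Assign each IoT node to one covering UAV, greedily picking the UAV
    with the most remaining capacity. coverage[i, j] is True if UAV j covers
    node i. A simple load-balancing heuristic for the generalized assignment
    problem; illustrative only, not the paper's near-optimal algorithm."""
    remaining = capacity.astype(float).copy()
    assign = np.full(len(demand), -1)
    for i in np.argsort(-demand):                 # heaviest nodes first
        feasible = coverage[i] & (remaining >= demand[i])
        if feasible.any():
            j = int(np.argmax(np.where(feasible, remaining, -np.inf)))
            assign[i], remaining[j] = j, remaining[j] - demand[i]
    return assign, remaining

rng = np.random.default_rng(4)
demand = rng.uniform(1, 5, size=10)               # per-node computing demand
capacity = np.array([15.0, 15.0, 15.0])           # per-UAV capacity (toy values)
cover = rng.random((10, 3)) < 0.7                 # random coverage pattern
assign, left = greedy_uav_assignment(demand, capacity, cover)
print(assign, left.round(1))
```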

Journal ArticleDOI
TL;DR: A general framework for knowledge-driven digital twin manufacturing cell (KDTMC) towards intelligent manufacturing, which could support autonomous manufacturing by an intelligent perceiving, simulating, understanding, predicting, optimising and controlling strategy is proposed.
Abstract: Rapid advances in new generation information technologies, such as big data analytics, internet of things (IoT), edge computing and artificial intelligence, have nowadays driven traditional manufac...

Journal ArticleDOI
TL;DR: The potential of physical reservoir computing is illustrated using examples from soft robotics, and the basic motivations for introducing it are provided, which stem from a number of fields, including machine learning, nonlinear dynamical systems, biological science, materials science, and physics.
Abstract: Understanding the fundamental relationships between physics and its information-processing capability has been an active research topic for many years. Physical reservoir computing is a recently introduced framework that allows one to exploit the complex dynamics of physical systems as information-processing devices. This framework is particularly suited for edge computing devices, in which information processing is incorporated at the edge (e.g., into sensors) in a decentralized manner to reduce the adaptation delay caused by data transmission overhead. This paper aims to illustrate the potentials of the framework using examples from soft robotics and to provide a concise overview focusing on the basic motivations for introducing it, which stem from a number of fields, including machine learning, nonlinear dynamical systems, biological science, materials science, and physics.
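
The reservoir computing recipe that the physical variant builds on can be shown with a conventional echo state network: a fixed random recurrent reservoir expands the input into a rich state trajectory, and only a linear readout is trained. In physical reservoir computing, the tanh reservoir below would be replaced by the measured dynamics of a physical body (for example, a soft robotic arm); the toy memory task and all parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
N, T, warm = 200, 2000, 100            # reservoir size, sequence length, washout
u = rng.uniform(-0.5, 0.5, T)          # input signal
y = np.roll(u, 3)                      # toy target: reproduce the input delayed by 3 steps

W_in = rng.uniform(-0.5, 0.5, N)       # fixed random input weights
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1 (echo state property)

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):                     # fixed, untrained reservoir dynamics
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Only the linear readout is trained (ridge regression), as in reservoir computing.
S, Y = states[warm:], y[warm:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ Y)
print("readout MSE:", float(np.mean((S @ W_out - Y) ** 2)))
```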