
Edge computing

About: Edge computing is a research topic. Over its lifetime, 11,657 publications have been published within this topic, receiving 148,533 citations.


Papers
Journal ArticleDOI
TL;DR: This article designs two kinds of TTL within four cache replacement policies to cache data at the edge and tests the performance of in-memory caching against the traditional method, finding that the former achieves better energy efficiency in edge caching and a stable, low backhaul rate.
Abstract: Recent years have witnessed billions of newly manufactured sensors, equipment, and machines being connected to our almost omnipotent Internet. While enjoying the comfort and convenience brought by IoT, we also have to face tremendous energy consumption and carbon emissions that even contribute to climate deterioration. Extended from cloud computing, edge/fog computing and caching provide new approaches to processing the big data generated by distributed IoT devices. With the purpose of helping address the data explosion problem through edge caching, in this article we apply in-memory storage and processing to reduce energy consumption. We design two kinds of TTL within four cache replacement policies to cache data at the edge. We carry out a simulation experiment in a three-tier heterogeneous network structure using the RWP model and test the performance of in-memory caching against the traditional method. The analysis results show that our in-memory method achieves better energy efficiency in edge caching and has a stable, low backhaul rate.

81 citations
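The abstract above leaves the caching logic at the level of prose; as a rough illustration, a TTL-bounded in-memory edge cache along the lines described could look like the Python sketch below. The class name, capacity, and soonest-expiry eviction rule are assumptions for illustration, not the authors' implementation.

import time

class TTLEdgeCache:
    """Illustrative in-memory edge cache: each entry carries a TTL, and when
    the cache is full the entry closest to expiry is evicted."""

    def __init__(self, capacity=128, default_ttl=60.0):
        self.capacity = capacity
        self.default_ttl = default_ttl
        self._store = {}  # key -> (value, expiry_time)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None  # miss: the request would go over the backhaul
        value, expiry = entry
        if time.time() >= expiry:
            del self._store[key]  # stale entry, treat as a miss
            return None
        return value

    def put(self, key, value, ttl=None):
        ttl = self.default_ttl if ttl is None else ttl
        if key not in self._store and len(self._store) >= self.capacity:
            # evict the entry that will expire soonest
            victim = min(self._store, key=lambda k: self._store[k][1])
            del self._store[victim]
        self._store[key] = (value, time.time() + ttl)

Every hit served by such a cache avoids a backhaul request, which is the mechanism behind the energy-efficiency and backhaul-rate results the paper reports.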

Journal ArticleDOI
TL;DR: In this article, an innovative framework is developed to minimize the energy consumption of the IRS-aided WP-MEC network, by optimizing the power allocation of the WET signals, the local computing frequencies of wireless devices, both the sub-band-device association and the energy allocation used for computation offloading, as well as the IRS reflection coefficients.
Abstract: Wireless powered mobile edge computing (WP-MEC) has been recognized as a promising technique to provide both enhanced computational capability and a sustainable energy supply to massive low-power wireless devices. However, its energy consumption becomes substantial when the transmission link used for wireless energy transfer (WET) and for computation offloading is hostile. To mitigate this hindrance, we propose to employ the emerging technique of the intelligent reflecting surface (IRS) in WP-MEC systems, which is capable of providing an additional link both for WET and for computation offloading. Specifically, we consider a multi-user scenario where both the WET and the computation offloading are based on orthogonal frequency-division multiplexing (OFDM) systems. Built on this model, an innovative framework is developed to minimize the energy consumption of the IRS-aided WP-MEC network by optimizing the power allocation of the WET signals, the local computing frequencies of wireless devices, both the sub-band-device association and the power allocation used for computation offloading, as well as the IRS reflection coefficients. The major challenges of this optimization lie in the strong coupling between the settings of WET and of computing, as well as the unit-modulus constraint on the IRS reflection coefficients. To tackle these issues, the technique of alternating optimization is invoked to decouple the WET and computing designs, while two sets of locally optimal IRS reflection coefficients are provided for WET and for computation offloading separately, relying on the successive convex approximation method. The numerical results demonstrate that our proposed scheme substantially outperforms the conventional WP-MEC network without IRSs. Quantitatively, an energy consumption reduction of about 80% is attained over the conventional MEC system in a single cell where 3 wireless devices are served via 16 sub-bands, with the aid of an IRS comprising 50 elements.

81 citations
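The abstract above reduces the design to an alternating-optimization loop that updates the WET variables with the computing variables fixed and vice versa. The Python skeleton below only sketches that generic pattern; the objective, sub-solvers, and convergence tolerance are placeholders, not the paper's actual SCA sub-problems.

def alternating_optimization(wet, comp, solve_wet, solve_compute, energy,
                             tol=1e-4, max_iters=100):
    """Generic alternating-optimization loop: optimize one block of variables
    with the other fixed, alternate, and stop when the objective stalls.
    solve_wet / solve_compute stand in for the paper's sub-problems."""
    prev = energy(wet, comp)
    for _ in range(max_iters):
        wet = solve_wet(comp)      # update WET settings with computing fixed
        comp = solve_compute(wet)  # update offloading/computing with WET fixed
        cur = energy(wet, comp)
        if abs(prev - cur) < tol:  # no further improvement: locally stable point
            break
        prev = cur
    return wet, comp

Each sub-problem in the paper is itself non-convex because of the unit-modulus IRS constraint, which is where the successive convex approximation step comes in; this skeleton deliberately leaves that detail to the solver callbacks.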

Journal ArticleDOI
TL;DR: The first YANG models are created for fog nodes and for IoT services involving cloud, network, and/or fog, and the concept of "orchestrated assurance" is expanded to provision carrier-grade service assurance in IoT.
Abstract: The interplay between cloud and fog computing is crucial for the evolution of IoT, but the reach and specification of such interplay are an open problem. Meanwhile, the advances made in managing hyper-distributed infrastructures involving the cloud and the network edge are leading to the convergence of NFV and 5G, supported mainly by ETSI's MANO architecture. This article argues that fog computing will become part of that convergence, and introduces an open and converged architecture based on MANO that offers uniform management of IoT services spanning the continuum from the cloud to the edge. More specifically, we created the first YANG models for fog nodes and for IoT services involving cloud, network, and/or fog, and expanded the concept of "orchestrated assurance" to provision carrier-grade service assurance in IoT. The article also discusses the application of our model in a flagship pilot in the city of Barcelona.

81 citations
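The actual YANG models are defined in the paper and are not reproduced here; purely as an illustration of the kind of information such a fog-node service descriptor might carry, a Python rendering could look as follows. Every field name and value below is a hypothetical example, not part of the authors' schema.

# Hypothetical descriptor for a service spanning the cloud-to-fog continuum.
fog_service_descriptor = {
    "service_name": "city-traffic-monitoring",      # example service only
    "placement": ["cloud", "network-edge", "fog"],   # continuum managed via MANO
    "fog_node": {
        "id": "fog-node-01",
        "compute": {"cpu_cores": 4, "memory_mb": 4096},
        "attached_devices": ["camera", "air-quality-sensor"],
    },
    "assurance": {                                   # "orchestrated assurance" targets
        "max_latency_ms": 50,
        "min_availability_pct": 99.9,
    },
}

# A real orchestrator would validate such a descriptor against the YANG model;
# here we only check that assurance targets are present before "deploying".
assert "assurance" in fog_service_descriptor
print(fog_service_descriptor["service_name"])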

Book ChapterDOI
01 Feb 2018
TL;DR: The recent edge computing techniques along with the powerful caching strategies at the edge are surveyed and a roadmap for 5G and beyond wireless networks in the context of emerging applications is provided.
Abstract: The enormous increase in powerful mobile devices has driven a surge in mobile data traffic. The demand for high-definition images and good-quality video streaming by mobile users has escalated constantly over the recent decade. In particular, the newly emerging mobile Augmented Reality and Virtual Reality (AR/VR) applications are anticipated to be among the most demanding applications over wireless networks so far. The architecture of cellular networks has been centralized over the years, which makes it difficult for the wireless link capacity, bandwidth, and backhaul network to cope with the explosive growth in mobile user traffic. Along with the rise in overall network traffic, mobile users tend to seek similar types of data at different time instants, creating a bottleneck in the backhaul link. To overcome such challenges in a network, the emerging techniques of caching popular content and performing computation at the edge are gaining importance. The adoption of such techniques in near-future 5G networks would put less pressure on the backhaul links as well as the cloud servers, thereby reducing the end-to-end latency of AR/VR applications. This paper surveys recent edge computing techniques along with powerful caching strategies at the edge and provides a roadmap for 5G and beyond wireless networks in the context of emerging applications.

81 citations
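The survey's premise is that many users request the same popular content, so holding it at the edge relieves the backhaul. As a minimal illustration of popularity-driven edge caching (not any specific scheme from the chapter; the admission threshold and names are assumptions), consider:

from collections import Counter

class PopularityCache:
    """Illustrative edge cache: content is admitted only after it has been
    requested admit_after times, and the least-requested item is evicted
    when the cache is full."""

    def __init__(self, capacity=64, admit_after=2):
        self.capacity = capacity
        self.admit_after = admit_after
        self.requests = Counter()  # request counts per content id
        self.store = {}            # content id -> cached object

    def request(self, content_id, fetch_from_origin):
        self.requests[content_id] += 1
        if content_id in self.store:
            return self.store[content_id]    # served from the edge, no backhaul use
        obj = fetch_from_origin(content_id)  # backhaul fetch from the origin/cloud
        if self.requests[content_id] >= self.admit_after:
            if len(self.store) >= self.capacity:
                # evict the cached item with the fewest accumulated requests
                victim = min(self.store, key=lambda c: self.requests[c])
                del self.store[victim]
            self.store[content_id] = obj
        return obj

Repeated requests for the same AR/VR asset are then absorbed at the edge, which is exactly the bottleneck-relief argument the chapter makes.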

Posted Content
26 Jun 2019
TL;DR: An intelligent system is designed to help IoT device manufacturers take advantage of customers' data, building a machine learning model that predicts customers' requirements and possible consumption behaviours with federated learning (FL) technology.
Abstract: Internet-of-Things (IoT) companies strive to get feedback from users to improve their products and services. However, traditional surveys cannot reflect customers' actual conditions because of their limited questions. Besides, survey results are affected by various subjective factors. In contrast, the recorded usage of IoT devices reflects customers' behaviours more comprehensively and accurately. We design an intelligent system to help IoT device manufacturers take advantage of customers' data and build a machine learning model to predict customers' requirements and possible consumption behaviours with federated learning (FL) technology. The FL process consists of two stages. In the first stage, customers train the initial model using their phones and the edge computing server collaboratively; the mobile edge computing server's high computation power can assist customers' local training. Customers first collect data from various IoT devices using their phones, and then download and train the initial model with their data. During training, customers first extract features using their mobiles, and then add Laplacian noise to the extracted features based on differential privacy, a formal and popular notion for quantifying privacy. After obtaining their local models, customers sign them and send them to the blockchain. We use the blockchain to replace the centralized aggregator, which belongs to a third party in FL. In the second stage, miners calculate the averaged model using the collected models sent from customers. By the end of the crowdsourcing job, one of the miners, selected as the temporary leader, uploads the model to the blockchain. Besides, to attract more customers to participate in the crowdsourced FL, we design an incentive mechanism that awards participants coins, which can be used to purchase other services provided by the company.

81 citations
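Two ingredients of the pipeline above can be sketched independently of the blockchain machinery: the Laplacian noise added to locally extracted features for differential privacy, and the averaging of the customers' local models. The NumPy sketch below is a rough illustration under assumed parameter values (sensitivity, epsilon, and a least-squares stand-in for local training), not the authors' implementation.

import numpy as np

def privatize_features(features, sensitivity=1.0, epsilon=0.5, rng=None):
    """Perturb locally extracted features with Laplacian noise, in the spirit
    of differential privacy; sensitivity and epsilon are illustrative values."""
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return features + rng.laplace(loc=0.0, scale=scale, size=features.shape)

def train_local_model(features, labels):
    """Stand-in for edge-assisted on-device training: a least-squares fit."""
    return np.linalg.lstsq(features, labels, rcond=None)[0]

def federated_average(local_models):
    """What the miners compute in the second stage: the element-wise mean
    of the customers' model parameters."""
    return np.mean(np.stack(local_models), axis=0)

# Toy end-to-end run with synthetic data for three customers.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
local_models = []
for _ in range(3):
    X = rng.normal(size=(100, 5))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    X_private = privatize_features(X, rng=rng)   # noise added before training
    local_models.append(train_local_model(X_private, y))
print(federated_average(local_models))

The signing of the models and the miner-side consensus are omitted on purpose; the sketch only shows where the privacy noise enters and what the aggregation step computes.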


Network Information
Related Topics (5)
Wireless sensor network: 142K papers, 2.4M citations, 93% related
Network packet: 159.7K papers, 2.2M citations, 93% related
Wireless network: 122.5K papers, 2.1M citations, 93% related
Server: 79.5K papers, 1.4M citations, 93% related
Key distribution in wireless sensor networks: 59.2K papers, 1.2M citations, 92% related
Performance Metrics
No. of papers in the topic in previous years

Year    Papers
2023    1,471
2022    3,274
2021    2,978
2020    3,397
2019    2,698
2018    1,649