Topic

Edge computing

About: Edge computing is a research topic. Over the lifetime, 11657 publications have been published within this topic receiving 148533 citations.


Papers
Journal ArticleDOI
TL;DR: An online algorithm, Lyapunov optimization on time and energy cost (LOTEC), based on the technique of Lyapunov optimization, is described; it makes control decisions on application offloading by adjusting the two-way tradeoff between average response time and average cost.
Abstract: Cloud computing has become the de facto computing platform for application processing in the era of the Internet of Things (IoT). However, limitations of the cloud model, such as high transmission latency and high costs, are giving birth to a new computing paradigm called edge computing (a.k.a. fog computing). Fog computing aims to move data processing close to the network edge so as to reduce Internet traffic. However, since the servers at the fog layer are not as powerful as those in the cloud, there is a need to balance data processing between the fog and the cloud. Moreover, besides the data offloading issue, the energy efficiency of fog computing nodes has become an increasing concern. Densely deployed fog nodes are a major source of carbon footprint in IoT systems. To reduce the usage of brown energy resources (i.e., energy produced from fossil fuels), green energy is an alternative option. In this paper, we propose employing dual energy sources to support the fog nodes, where solar power is the primary energy supply and grid power is the backup supply. Based on that, we present a comprehensive analytic framework for incorporating green energy sources to support the running of IoT and fog computing-based systems, and for handling the tradeoff among average response time, average monetary cost, and energy cost in the IoT. This paper describes an online algorithm, Lyapunov optimization on time and energy cost (LOTEC), based on the technique of Lyapunov optimization. LOTEC is a quantified near-optimal solution and is able to make control decisions on application offloading by adjusting the two-way tradeoff between average response time and average cost. We evaluate the performance of our proposed algorithm through a number of experiments. Rigorous analysis and simulations demonstrate its performance.

89 citations
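The drift-plus-penalty rule behind Lyapunov-style offloading can be sketched briefly. This is an illustrative toy, not the paper's LOTEC algorithm: the function name, the (delay, cost) option tuples, and all numbers are assumptions for demonstration; only the tradeoff structure (a control parameter V weighing cost against a delay-tracking queue backlog) follows the abstract.

```python
# Hypothetical drift-plus-penalty offloading rule in the spirit of LOTEC.
# All names, parameters, and the cost model are illustrative assumptions.

def offload_decision(queue_backlog, options, V):
    """Pick the offloading option minimizing V*cost + backlog*delay.

    queue_backlog: current virtual-queue length (penalizes response time).
    options: list of (name, delay, cost) tuples, e.g. fog vs. cloud.
    V: Lyapunov control parameter trading cost against delay.
    """
    # Larger V favors low monetary/energy cost; a larger backlog
    # favors low response time.
    return min(options, key=lambda o: V * o[2] + queue_backlog * o[1])

# Example: (name, delay in s, cost per task) for a solar-powered fog node
# (slow but cheap) and the remote cloud (fast but costly).
options = [("fog-solar", 0.8, 0.1), ("cloud", 0.2, 1.0)]
print(offload_decision(queue_backlog=1.0, options=options, V=10.0))    # cost wins
print(offload_decision(queue_backlog=100.0, options=options, V=10.0))  # delay wins
```

With a small backlog the cheap fog node is chosen; as the delay queue grows, the faster cloud takes over, which is exactly the two-way tradeoff the TL;DR describes.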

Journal ArticleDOI
TL;DR: An edge IoMT system that uses DL to detect diversified types of health-related COVID-19 symptoms and generates reports and alerts for medical decision support; test results show the system's suitability for in-home health management during a pandemic.
Abstract: Capturing psychological, emotional, and physiological states, especially during a pandemic, and leveraging the captured sensory data within the pandemic management ecosystem is challenging. Recent advancements for the Internet of Medical Things (IoMT) have shown promising results from collecting diversified types of such emotional and physical health-related data from the home environment. State-of-the-art deep learning (DL) applications can run in a resource-constrained edge environment, which allows data from IoMT devices to be processed locally at the edge, and performs inferencing related to in-home health. This allows health data to remain in the vicinity of the user edge while ensuring the privacy, security, and low latency of the inferencing system. In this article, we develop an edge IoMT system that uses DL to detect diversified types of health-related COVID-19 symptoms and generates reports and alerts that can be used for medical decision support. Several COVID-19 applications have been developed, tested, and deployed to support clinical trials. We present the design of the framework, a description of our implemented system, and the accuracy results. The test results show the suitability of the system for in-home health management during a pandemic.

89 citations

Proceedings ArticleDOI
01 Jan 2016
TL;DR: FogGIS, a fog computing-based framework for mining analytics from geospatial data; a prototype has been built using the Intel Edison, an embedded microprocessor, and validated through preliminary analyses including compression and overlay analysis.
Abstract: Cloud Geographic Information Systems (GIS) have emerged as a tool for the analysis, processing, and transmission of geospatial data. Fog computing is a paradigm in which fog devices help increase throughput and reduce latency at the network edge, close to the client. This paper develops a fog computing-based framework named FogGIS for mining analytics from geospatial data. A prototype has been built using the Intel Edison, an embedded microprocessor, and FogGIS has been validated through preliminary analyses, including compression and overlay analysis. Results show that fog computing holds great promise for the analysis of geospatial data. Several open-source compression techniques have been used to reduce transmission to the cloud.

89 citations
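The compression step FogGIS relies on can be sketched with standard lossless tools. This is not the paper's implementation: the payload, field names, and use of gzip specifically are assumptions; the point is only that compressing geospatial data at the fog node shrinks what must cross the network to the cloud.

```python
# Illustrative sketch (not the paper's code): gzip-compress a GeoJSON-like
# payload at a fog node before uploading it to the cloud. The feature
# collection below is synthetic demo data.
import gzip
import json

def compress_for_upload(geojson_obj):
    """Serialize a GeoJSON-like dict and gzip it at the fog layer."""
    raw = json.dumps(geojson_obj).encode("utf-8")
    return raw, gzip.compress(raw)

# A toy feature collection with many similar point features.
features = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "geometry": {"type": "Point", "coordinates": [77.59 + i * 1e-4, 12.97]},
         "properties": {"sensor": "gps", "id": i}}
        for i in range(200)
    ],
}
raw, packed = compress_for_upload(features)
print(f"raw {len(raw)} bytes -> compressed {len(packed)} bytes")
```

Because GeoJSON is highly repetitive, the compressed payload is a small fraction of the original, which is what makes edge-side compression worthwhile before transmission.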

Journal ArticleDOI
TL;DR: This article designs a data collection and preprocessing scheme based on deep learning that adopts a semisupervised learning algorithm with data augmentation and label guessing; it significantly reduces the amount of data uploaded to the cloud while effectively protecting the user's data privacy.
Abstract: The development of smart cities and deep learning technology is changing our physical world into a cyber world. As one of the main applications, the Internet of Vehicles has been developing rapidly. However, privacy leakage and delay in data collection remain key concerns behind the fast development of cyber intelligence technologies. If the original data collected are uploaded directly to the cloud for processing, they place a huge load on the network and add communication delay. Moreover, this process can leak data privacy. To this end, in this article we design a data collection and preprocessing scheme based on deep learning, which adopts a semisupervised learning algorithm with data augmentation and label guessing. Data filtering is performed at the edge layer, and a large amount of similar and irrelevant data is cleared. If an edge device cannot process some complex data independently, it sends the processed, reliable data to the cloud for further processing, which maximizes the protection of user privacy. Our method significantly reduces the amount of data uploaded to the cloud while effectively protecting the user's data privacy.

89 citations
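The label-guessing step used in semisupervised schemes with data augmentation can be sketched in a few lines. This is a generic MixMatch-style illustration, not the article's network or exact algorithm: a model's class predictions over several augmented copies of an unlabeled sample are averaged, then sharpened with a temperature so the pseudo-label commits to the dominant class. The predictions and temperature below are made-up stand-ins.

```python
# Hypothetical label-guessing sketch (MixMatch-style), not the paper's model.

def guess_label(predictions, temperature=0.5):
    """Average per-augmentation class distributions, then sharpen.

    predictions: list of probability vectors, one per augmented copy
                 of the same unlabeled sample.
    temperature: T < 1 sharpens the averaged distribution.
    """
    n_classes = len(predictions[0])
    # Average the model's guesses across augmentations.
    avg = [sum(p[c] for p in predictions) / len(predictions)
           for c in range(n_classes)]
    # Sharpen: raise to 1/T and renormalize.
    powered = [p ** (1.0 / temperature) for p in avg]
    total = sum(powered)
    return [p / total for p in powered]

# Classifier outputs for two augmentations of one unlabeled sample.
preds = [[0.6, 0.3, 0.1], [0.7, 0.2, 0.1]]
pseudo = guess_label(preds)
print(pseudo)  # probability mass concentrates on class 0
```

The sharpened pseudo-label lets the edge layer keep confident samples for local training and filter out ambiguous ones, which is the filtering role the abstract describes.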

Journal ArticleDOI
TL;DR: This paper models the task completion delay in MEC and proposes a Benders decomposition-based algorithm that achieves (close-to-)optimal performance in terms of energy consumption and acceptance ratio compared with two benchmark heuristics.
Abstract: Mobile edge computing (MEC) offers a way to shorten the cloud servicing delay by building small-scale cloud infrastructures, such as cloudlets, at the network edge, in close proximity to end users. On one hand, it is energy consuming and costly to place a cloudlet at every access point (AP) to process the requested tasks. On the other hand, the service provider should provide delay-guaranteed service to end users, otherwise it may suffer revenue loss. In this paper, we first model how to calculate the task completion delay in MEC and mathematically analyze the energy consumption of different equipment in MEC. Subsequently, we study how to place cloudlets in the network and allocate each requested task to cloudlets and the public cloud with minimum total energy consumption without violating each task's delay requirement. We prove that this problem is NP-hard and propose a Benders decomposition-based algorithm to solve it. We also present a software-defined network (SDN)-based framework to deploy the proposed algorithm. Extensive simulations reveal that the proposed algorithm achieves (close-to-)optimal performance in terms of energy consumption and acceptance ratio compared with two benchmark heuristics.

88 citations
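The delay-constrained placement problem can be illustrated with a minimal model. This sketch is an assumption-laden simplification, not the paper's formulation: completion delay is taken as transmission delay plus processing delay, and a greedy rule picks the lowest-energy feasible site per task, whereas the paper solves the joint NP-hard problem with Benders decomposition. All names and numbers below are hypothetical.

```python
# Toy delay model and greedy placement, illustrating the problem structure
# only; the paper's actual algorithm is Benders decomposition-based.

def completion_delay(data_bits, bandwidth_bps, cycles, cpu_hz):
    """Transmission delay plus processing delay for one placement."""
    return data_bits / bandwidth_bps + cycles / cpu_hz

def place_task(task, sites):
    """Return the feasible site with minimum energy, or None.

    task: dict with data (bits), cycles (CPU cycles), deadline (s).
    sites: dicts with name, bandwidth (bps), cpu (Hz), energy (J/task).
    """
    feasible = [s for s in sites
                if completion_delay(task["data"], s["bandwidth"],
                                    task["cycles"], s["cpu"]) <= task["deadline"]]
    return min(feasible, key=lambda s: s["energy"], default=None)

task = {"data": 8e6, "cycles": 2e9, "deadline": 0.5}
sites = [
    # Nearby cloudlet: fast link, modest CPU, higher per-task energy.
    {"name": "cloudlet", "bandwidth": 1e8, "cpu": 1e10, "energy": 5.0},
    # Public cloud: slow link, fast CPU, lower per-task energy.
    {"name": "cloud", "bandwidth": 2e7, "cpu": 5e10, "energy": 2.0},
]
print(place_task(task, sites)["name"])  # both feasible, cloud is cheaper
```

Tightening the deadline knocks out the slow-link cloud first and eventually leaves no feasible site, which is the acceptance-ratio effect the simulations measure.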


Network Information
Related Topics (5)
- Wireless sensor network: 142K papers, 2.4M citations (93% related)
- Network packet: 159.7K papers, 2.2M citations (93% related)
- Wireless network: 122.5K papers, 2.1M citations (93% related)
- Server: 79.5K papers, 1.4M citations (93% related)
- Key distribution in wireless sensor networks: 59.2K papers, 1.2M citations (92% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    1,471
2022    3,274
2021    2,978
2020    3,397
2019    2,698
2018    1,649