Topic

Edge computing

About: Edge computing is a research topic. Over the lifetime, 11,657 publications have been published within this topic, receiving 148,533 citations.


Papers
Journal ArticleDOI
Zhuoqing Chang, Shubo Liu, Xingxing Xiong, Zhaohui Cai, Guoqing Tu
TL;DR: An extensive survey of an end-edge-cloud orchestrated architecture for flexible AIoT systems, reviewing the emerging technologies for AI model inference and training at the edge of the network.
Abstract: The Internet of Things (IoT) has created a ubiquitously connected world powered by a multitude of wired and wireless sensors generating a variety of heterogeneous data over time in a myriad of fields and applications. To extract complete information from these data, advanced artificial intelligence (AI) technology, especially deep learning (DL), has proved successful in facilitating data analytics, future prediction and decision making. The collective integration of AI and the IoT has greatly promoted the rapid development of AI-of-Things (AIoT) systems that analyze and respond to external stimuli more intelligently without involvement by humans. However, it is challenging or infeasible to process massive amounts of data in the cloud due to the destructive impact of the volume, velocity, and veracity of data and fatal transmission latency on networking infrastructures. These critical challenges can be adequately addressed by introducing edge computing. This article conducts an extensive survey of an end-edge-cloud orchestrated architecture for flexible AIoT systems. Specifically, it begins with articulating fundamental concepts including the IoT, AI and edge computing. Guided by these concepts, it explores the general AIoT architecture, presents a practical AIoT example to illustrate how AI can be applied in real-world applications and summarizes promising AIoT applications. Then, the emerging technologies for AI models regarding inference and training at the edge of the network are reviewed. Finally, the open challenges and future directions in this promising area are outlined.
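The survey's treatment of inference at the edge rests on a basic trade-off: running a model locally avoids transfer latency, while offloading to the cloud buys more compute at the cost of moving the input over the network. A minimal sketch of that decision rule (all latency and bandwidth figures are hypothetical, not taken from the paper):

```python
def offload_to_cloud(input_bytes: int, edge_latency_s: float,
                     cloud_latency_s: float, uplink_bps: float) -> bool:
    """Offload only if transfer time plus cloud inference beats edge inference."""
    transfer_s = input_bytes * 8 / uplink_bps
    return transfer_s + cloud_latency_s < edge_latency_s

# A 2 MB camera frame over a 10 Mbit/s uplink: the transfer alone takes 1.6 s,
# so a 0.5 s on-device model wins even against a 0.05 s cloud model.
print(offload_to_cloud(2_000_000, 0.5, 0.05, 10_000_000))  # False
```

Real AIoT schedulers also weigh energy, queueing at the edge host, and model accuracy, but a latency comparison of this shape is the core of most offloading heuristics.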

91 citations

Journal ArticleDOI
10 Nov 2018, Sensors
TL;DR: A novel mist computing testbed is presented and the importance of selecting a proper ECC curve is demonstrated, showing that, for the tested devices, some curves present worse energy consumption and data throughput than other curves that provide a higher security level.
Abstract: The latest Internet of Things (IoT) edge-centric architectures allow for unburdening higher layers from part of their computational and data-processing requirements. In the specific case of fog computing systems, they greatly reduce the requirements of cloud-centric systems by processing in fog gateways part of the data generated by end devices, thus providing services that were previously offered by a remote cloud. Thanks to recent advances in System-on-Chip (SoC) energy efficiency, it is currently possible to create IoT end devices with enough computational power to process the data generated by their sensors and actuators while providing complex services, which in recent years led to the development of the mist computing paradigm. To allow mist computing nodes to provide these benefits while guaranteeing the same level of security as other architectures, standard end-to-end security mechanisms need to be implemented. In this paper, a high-security, energy-efficient fog and mist computing architecture and a testbed are presented and evaluated. The testbed makes use of Transport Layer Security (TLS) 1.2 Elliptic Curve Cryptography (ECC) and Rivest-Shamir-Adleman (RSA) cipher suites (which comply with the requirements of the then-forthcoming TLS 1.3 standard), evaluated and compared in terms of energy consumption and data throughput for a fog gateway and two mist end devices. The obtained results show that ECC outperforms RSA in both energy consumption and data throughput for all the tested security levels. Moreover, the importance of selecting a proper ECC curve is demonstrated: for the tested devices, some curves exhibit worse energy consumption and data throughput than other curves that provide a higher security level. As a result, this article not only presents a novel mist computing testbed, but also provides guidelines to help future researchers find efficient and secure implementations for advanced IoT devices.
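The ECC-versus-RSA comparison ultimately comes down to which TLS cipher suites a device negotiates. As a rough illustration (not the paper's testbed), Python's standard `ssl` module can enumerate the ECDSA- and RSA-authenticated suites the local OpenSSL build enables by default:

```python
import ssl

# Enumerate the cipher suites a default client context would offer.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
suites = [c["name"] for c in ctx.get_ciphers()]

ecdsa_suites = [s for s in suites if "ECDSA" in s]
rsa_auth_suites = [s for s in suites if "RSA" in s and "ECDSA" not in s]

print(f"{len(ecdsa_suites)} ECDSA suites, {len(rsa_auth_suites)} RSA suites")
```

The exact lists depend on the Python/OpenSSL build; a constrained device would typically pin a single suite such as ECDHE-ECDSA-AES128-GCM-SHA256 rather than offer the full set.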

91 citations

Journal ArticleDOI
TL;DR: Wang et al. propose edge-computing-based video pre-processing that eliminates redundant frames, migrating part or all of the video-processing task to the edge, thereby reducing the computing, storage, and network bandwidth requirements of the cloud center and enhancing the effectiveness of video analysis.
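The core of such pre-processing is a frame-differencing filter: a frame is forwarded to the cloud only if it differs enough from the last frame that was kept. A toy sketch over flat grayscale pixel lists (the representation and threshold are illustrative assumptions, not taken from the paper):

```python
def filter_redundant_frames(frames, threshold=10.0):
    """Keep a frame only if it differs enough from the last kept frame."""
    kept = []
    last_kept = None
    for frame in frames:
        if last_kept is None:
            kept.append(frame)          # always keep the first frame
            last_kept = frame
            continue
        # Mean absolute pixel difference against the last kept frame.
        diff = sum(abs(a - b) for a, b in zip(frame, last_kept)) / len(frame)
        if diff > threshold:
            kept.append(frame)
            last_kept = frame
    return kept

static = [100] * 64                 # two identical "empty" frames
moving = [100] * 32 + [180] * 32    # a frame with motion in half the pixels
print(len(filter_redundant_frames([static, static, moving, moving])))  # 2
```

Running this at the edge means only the two distinct frames cross the network, which is exactly the bandwidth saving the paper targets.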

91 citations

Journal ArticleDOI
TL;DR: Optimal Resource Provisioning (ORP) algorithms for different instance types are developed to optimize the computation capacity of edge hosts while dynamically adjusting the cloud tenancy strategy; the algorithms are shown to have polynomial computational complexity.
Abstract: Mobile edge computing is emerging as a new computing paradigm that provides enhanced experience to mobile users via low-latency connections and augmented computation capacity. As the amount of user requests is time-varying, while the computation capacity of edge hosts is limited, the Cloud Assisted Mobile Edge (CAME) computing framework is introduced to improve the scalability of the edge platform. By outsourcing mobile requests to clouds with various types of instances, the CAME framework can accommodate dynamic mobile requests with diverse quality-of-service requirements. In order to provide guaranteed services at minimal system cost, the edge resource provisioning and cloud outsourcing of the CAME framework should be carefully designed in a cost-efficient manner. Specifically, two fundamental issues should be answered: (1) what is the optimal edge computation capacity configuration? and (2) what types of cloud instances should be tenanted, and in what amount? To solve these issues, we formulate resource provisioning in the CAME framework as an optimization problem. By exploiting the piecewise convex property of this problem, Optimal Resource Provisioning (ORP) algorithms for different instance types are proposed to optimize the computation capacity of edge hosts while dynamically adjusting the cloud tenancy strategy. The proposed algorithms are shown to have polynomial computational complexity. To evaluate the performance of the ORP algorithms, extensive simulations and experiments are conducted based on widely used traffic models and the Google cluster usage tracelogs, respectively. It is shown that the proposed ORP algorithms outperform the local-first and cloud-first benchmark algorithms in system flexibility and cost-efficiency.
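The cost structure behind the ORP formulation can be illustrated with a toy model: edge capacity is paid for in every period, and any demand above it is outsourced to the cloud at a higher per-unit price. A brute-force search over capacities (unlike the paper's algorithms, which exploit the piecewise convexity of the objective) might look like this, with all prices hypothetical:

```python
def total_cost(edge_capacity, demand, edge_unit_cost=1.0, cloud_unit_cost=3.0):
    """Cost of provisioning `edge_capacity` locally and outsourcing overflow."""
    edge_cost = edge_unit_cost * edge_capacity * len(demand)  # paid every period
    cloud_cost = cloud_unit_cost * sum(max(0, d - edge_capacity) for d in demand)
    return edge_cost + cloud_cost

demand = [3, 8, 5, 12, 6]  # requests per period
best = min(range(max(demand) + 1), key=lambda c: total_cost(c, demand))
print(best, total_cost(best, demand))  # 8 52.0
```

With these prices the optimum provisions for most, but not all, of the peak: the single period of demand 12 is cheaper to outsource than to cover with permanent edge capacity. The paper's ORP algorithms find this point without scanning every capacity, and additionally choose among heterogeneous cloud instance types.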

91 citations

Proceedings ArticleDOI
18 Apr 2017
TL;DR: The system accurately identifies bears, deer, coyotes, and empty images, and significantly reduces the time and bandwidth required for image transfer, as well as end-user analysis time, since WTB automatically filters the images on-site.
Abstract: We investigate the design and implementation of Where's The Bear (WTB), an end-to-end, distributed IoT system for wildlife monitoring. WTB implements a multi-tier (cloud, edge, sensing) system that integrates recent advances in machine-learning-based image processing to automatically classify animals in images from remote, motion-triggered camera traps. We use non-local, resource-rich, public/private cloud systems to train the machine learning models, and "in-the-field," resource-constrained edge systems to perform classification near the IoT sensing devices (cameras). We deploy WTB at the UCSB Sedgwick Reserve, a 6,000-acre site for environmental research, and use it to aggregate, manage, and analyze over 1.12M images. WTB integrates Google TensorFlow and OpenCV applications to perform automatic classification and tagging for a subset of these images. To avoid transferring large numbers of training images for TensorFlow over the low-bandwidth network linking Sedgwick to the public/private clouds, we devise a technique that uses stock Google Images to construct a synthetic training set using only a small number of empty, background images from Sedgwick. Our system is able to accurately identify bears, deer, coyotes, and empty images, and significantly reduces the time and bandwidth requirements for image transfer, as well as end-user analysis time, since WTB automatically filters the images on-site.
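The synthetic-training-set trick can be sketched in miniature: paste labeled foreground sprites onto empty background frames to manufacture training images without shipping real ones over the network. A pure-Python toy on 2-D grayscale grids (the paper itself composites stock Google Images using OpenCV; the sprite sizes and labels below are illustrative):

```python
import random

def composite(background, sprite, top, left):
    """Return a copy of `background` with `sprite` pasted at (top, left)."""
    out = [row[:] for row in background]
    for i, row in enumerate(sprite):
        for j, pixel in enumerate(row):
            out[top + i][left + j] = pixel
    return out

def synthesize(backgrounds, sprites_by_label, samples_per_label=4):
    """Generate (image, label) pairs by pasting sprites at random positions."""
    dataset = []
    for label, sprite in sprites_by_label.items():
        for _ in range(samples_per_label):
            bg = random.choice(backgrounds)
            top = random.randrange(len(bg) - len(sprite) + 1)
            left = random.randrange(len(bg[0]) - len(sprite[0]) + 1)
            dataset.append((composite(bg, sprite, top, left), label))
    return dataset

empty = [[0] * 8 for _ in range(8)]   # an "empty" background frame
bear = [[255, 255], [255, 255]]       # stand-in for an animal crop
data = synthesize([empty], {"bear": bear})
print(len(data))  # 4
```

Only a handful of real empty backgrounds are needed; the variety comes from the sprites and their random placement, which is what lets WTB train in the cloud without uploading the camera-trap archive.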

91 citations


Network Information
Related Topics (5)
- Wireless sensor network: 142K papers, 2.4M citations (93% related)
- Network packet: 159.7K papers, 2.2M citations (93% related)
- Wireless network: 122.5K papers, 2.1M citations (93% related)
- Server: 79.5K papers, 1.4M citations (93% related)
- Key distribution in wireless sensor networks: 59.2K papers, 1.2M citations (92% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    1,471
2022    3,274
2021    2,978
2020    3,397
2019    2,698
2018    1,649