
Edge computing

About: Edge computing is a research topic. Over its lifetime, 11,657 publications have been published within this topic, receiving 148,533 citations.


Papers
Journal ArticleDOI
TL;DR: An exhaustive survey of the use of AI for edge service optimization in the Internet of Vehicles (IoV) is conducted, and a number of open issues in optimizing edge services with AI are discussed.

103 citations

Journal ArticleDOI
TL;DR: A voice disorder assessment and treatment system using a deep learning approach is proposed, achieving 98.5 percent accuracy and 99.3 percent sensitivity on the Saarbrucken Voice Disorder database.
Abstract: The advancement of next-generation network technologies provides a huge improvement in healthcare facilities. Technologies such as 5G, edge computing, cloud computing, and the Internet of Things realize smart healthcare that a client can access anytime, anywhere, and in real time. Edge computing offers useful computing resources at the edge of the network to maintain low-latency, real-time computing. In this article, we propose a smart healthcare framework using edge computing. Within the framework, we develop a voice disorder assessment and treatment system using a deep learning approach. A client provides his or her voice sample, captured by smart sensors, and the sample goes to an edge node for initial processing. The edge node then sends the data to a core cloud for further processing. Assessment and management are controlled by a service provider through a cloud manager. Once the automatic assessment is done, the decision is sent to specialists, who prescribe appropriate treatment to the client. The proposed system achieves 98.5 percent accuracy and 99.3 percent sensitivity on the Saarbrucken Voice Disorder database.

103 citations
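
To make the edge-to-cloud split in the framework above concrete, here is a minimal Python sketch. It assumes a hypothetical edge-side feature extractor and a stand-in cloud classifier; none of the names (extract_edge_features, CloudClassifier) come from the paper, and the linear scorer is a placeholder rather than the deep learning model it describes.

```python
# Minimal sketch of the edge-to-cloud split described above. All names
# (extract_edge_features, CloudClassifier) are hypothetical illustrations,
# not the paper's implementation.
import json
import numpy as np


def extract_edge_features(samples: np.ndarray, frame_len: int = 400,
                          hop: int = 160) -> np.ndarray:
    """Initial processing at the edge: frame the voice signal and keep a
    compact spectral summary so only a small payload travels to the cloud."""
    frames = [samples[i:i + frame_len]
              for i in range(0, len(samples) - frame_len, hop)]
    spectra = np.abs(np.fft.rfft(np.stack(frames) * np.hanning(frame_len), axis=1))
    log_energy = np.log1p(spectra)
    # Mean and std per frequency bin: a tiny, transmission-friendly feature vector.
    return np.concatenate([log_energy.mean(axis=0), log_energy.std(axis=0)])


class CloudClassifier:
    """Stand-in for the cloud-side deep model: here just a fixed linear scorer."""

    def __init__(self, dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=dim)

    def assess(self, features: np.ndarray) -> dict:
        score = float(1.0 / (1.0 + np.exp(-self.w @ features / len(features))))
        return {"disorder_probability": score, "refer_to_specialist": score > 0.5}


if __name__ == "__main__":
    voice = np.random.default_rng(1).normal(size=16000)   # 1 s of fake audio
    payload = extract_edge_features(voice)                # would run on the edge node
    report = CloudClassifier(dim=payload.size).assess(payload)  # would run in the core cloud
    print(json.dumps(report, indent=2))
```

The point of the split is that only the compact feature vector, not the raw audio, leaves the edge, which keeps the round trip to the core cloud cheap.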

Journal ArticleDOI
TL;DR: In this article, a new model is proposed to collect trustworthy data on the basis of edge computing in the IoT, where the sensor nodes are evaluated from multiple dimensions to obtain accurately quantified trust values.
Abstract: The edge computing paradigm is generally regarded as capable of satisfying the resource requirements of emerging mobile applications such as Internet of Things (IoT) ones. Undoubtedly, the data collected by underlying sensor networks are the foundation of both IoT systems and IoT applications. However, because the underlying sensor networks are weak and vulnerable to attacks, the collected data are often untrustworthy, which may cause disastrous consequences. In this article, a new model is proposed to collect trustworthy data on the basis of edge computing in the IoT. In this model, the sensor nodes are evaluated along multiple dimensions to obtain accurately quantified trust values. By mapping the trust value of a node onto a force acting on the mobile data collector, a mobility path with high trust is generated. A mobile edge data collector then visits the sensors with quantified trust values and collects trustworthy data. Extensive experiments validate that IoT systems based on the trustworthy data collection model gain a significant improvement in performance, in terms of both system security and energy conservation.

103 citations
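
The mapping from trust values to forces in the entry above lends itself to a short illustration. The sketch below assumes an inverse-square attraction weighted by trust, which is only one plausible reading of the force model; the function names, the force law, and the step-based path planner are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative sketch of the trust-to-force idea: each sensor attracts the
# mobile collector with a pull proportional to its trust score, so the
# collector's path drifts toward trustworthy nodes. The force law and all
# names here are assumptions for illustration only.
import numpy as np


def trust_force(collector: np.ndarray, sensors: np.ndarray,
                trust: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Sum of attractive forces: trust_i / distance_i^2 toward each sensor."""
    diff = sensors - collector                       # vectors collector -> sensors
    dist = np.linalg.norm(diff, axis=1) + eps
    pull = (trust / dist**2)[:, None] * (diff / dist[:, None])
    return pull.sum(axis=0)


def plan_path(sensors: np.ndarray, trust: np.ndarray, start: np.ndarray,
              steps: int = 50, step_size: float = 1.0) -> list:
    """Move the collector along the normalized force direction each step."""
    pos, path = start.astype(float), [start.astype(float)]
    for _ in range(steps):
        f = trust_force(pos, sensors, trust)
        pos = pos + step_size * f / (np.linalg.norm(f) + 1e-9)
        path.append(pos.copy())
    return path


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sensors = rng.uniform(0, 100, size=(20, 2))      # sensor coordinates
    trust = rng.uniform(0.1, 1.0, size=20)           # per-node trust collapsed to [0, 1]
    path = plan_path(sensors, trust, start=np.array([0.0, 0.0]))
    print("collector end position:", np.round(path[-1], 2))
```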

Proceedings ArticleDOI
16 Apr 2018
TL;DR: This paper considers IoT applications that receive continuous data streams from multiple sources in the network, and studies joint application placement and data routing to support all data streams with both bandwidth and delay guarantees.
Abstract: The emergence of the Internet-of-Things (IoT) has inspired numerous new applications. However, due to the limited resources in current IoT infrastructures and the stringent quality-of-service requirements of the applications, providing computing and communication support for the applications is becoming increasingly difficult. In this paper, we consider IoT applications that receive continuous data streams from multiple sources in the network, and study joint application placement and data routing to support all data streams with both bandwidth and delay guarantees. We formulate the application provisioning problem both for a single application and for multiple applications, and prove both cases to be NP-hard. For the case with a single application, we propose a fully polynomial-time approximation scheme. For the multi-application scenario, if the applications can be parallelized among multiple distributed instances, we propose a fully polynomial-time approximation scheme; for general non-parallelizable applications, we propose a randomized algorithm and analyze its performance. Simulations show that the proposed algorithms greatly improve the quality of service of the IoT applications compared to the heuristics.

103 citations
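
For intuition about the single-application placement problem above, the toy sketch below brute-forces the feasibility check over candidate edge nodes, keeping a node only if it can carry every incoming stream within its bandwidth budget and each stream's delay bound. It is emphatically not the paper's fully polynomial-time approximation scheme; all names and data are made up for illustration.

```python
# Toy sketch of application placement with bandwidth and delay guarantees:
# enumerate candidate nodes and keep the feasible one with the lowest total
# routing delay. This brute-force check only illustrates the constraints;
# the paper's contribution is an FPTAS, not this enumeration.
from dataclasses import dataclass


@dataclass
class Stream:
    source: str
    rate: float          # required bandwidth (Mbps)
    delay_bound: float   # maximum tolerable delay (ms)


def place_application(streams, nodes, delay, capacity):
    """Return the feasible node with the smallest total delay, or None.

    delay[(source, node)] -> routing delay of that stream to that node (ms)
    capacity[node]        -> ingress bandwidth available at that node (Mbps)
    """
    best_node, best_cost = None, float("inf")
    total_rate = sum(s.rate for s in streams)
    for node in nodes:
        if total_rate > capacity[node]:
            continue  # bandwidth guarantee violated
        if any(delay[(s.source, node)] > s.delay_bound for s in streams):
            continue  # delay guarantee violated for some stream
        cost = sum(delay[(s.source, node)] for s in streams)
        if cost < best_cost:
            best_node, best_cost = node, cost
    return best_node, best_cost


if __name__ == "__main__":
    streams = [Stream("cam-1", rate=4.0, delay_bound=20.0),
               Stream("cam-2", rate=6.0, delay_bound=15.0)]
    nodes = ["edge-A", "edge-B"]
    delay = {("cam-1", "edge-A"): 8.0, ("cam-2", "edge-A"): 12.0,
             ("cam-1", "edge-B"): 5.0, ("cam-2", "edge-B"): 18.0}
    capacity = {"edge-A": 12.0, "edge-B": 20.0}
    print(place_application(streams, nodes, delay, capacity))  # ('edge-A', 20.0)
```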

Journal ArticleDOI
TL;DR: A joint computation offloading and multiuser scheduling algorithm in an NB-IoT edge computing system is proposed that minimizes the long-term average weighted sum of delay and power consumption under stochastic traffic arrivals.
Abstract: The Internet of Things (IoT) connects a huge number of resource-constrained IoT devices to the Internet, which generate massive amounts of data that can be offloaded to the cloud for computation. As some of the applications may require very low latency, the emerging mobile edge computing (MEC) architecture offers cloud services by deploying MEC servers at the mobile base stations (BSs). The IoT devices can transmit the offloaded data to the BS for computation at the MEC server. Narrowband IoT (NB-IoT) is a new cellular technology for the transmission of IoT data to the BS. In this paper, we propose a joint computation offloading and multiuser scheduling algorithm in an NB-IoT edge computing system that minimizes the long-term average weighted sum of delay and power consumption under stochastic traffic arrivals. We formulate the dynamic optimization problem as an infinite-horizon average-reward continuous-time Markov decision process (CTMDP) model. To deal with the curse-of-dimensionality problem, we use approximate dynamic programming techniques, i.e., linear value-function approximation and temporal-difference learning with a post-decision state and semi-gradient descent, to derive a simple algorithm for solving the CTMDP model. The proposed algorithm is semi-distributed: the offloading algorithm is performed locally at the IoT devices, while the scheduling algorithm is auction-based, with the IoT devices submitting bids to the BS, which makes the scheduling decision centrally. Simulation results show that the proposed algorithm provides significant performance improvement over the two baseline algorithms and over the MUMTO algorithm, which is designed based on a deterministic task model.

103 citations
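
The value-function machinery named in the abstract above (linear approximation, temporal-difference learning, semi-gradient descent) can be illustrated generically. The sketch below runs semi-gradient TD(0) with a linear value function on a toy random-walk chain, which stands in for the much richer CTMDP queueing model; every name and parameter here is an illustrative assumption, not the paper's algorithm.

```python
# Generic sketch of semi-gradient TD(0) with a linear value function
# V(s) = w . phi(s). The toy chain environment is a stand-in for the
# paper's CTMDP model; names and parameters are illustrative only.
import numpy as np


def phi(state: int, n_states: int) -> np.ndarray:
    """One-hot features; the paper would use features of queue/channel state."""
    x = np.zeros(n_states)
    x[state] = 1.0
    return x


def semi_gradient_td0(n_states=5, episodes=500, alpha=0.1, gamma=0.95, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(n_states)
    for _ in range(episodes):
        s = n_states // 2                            # start in the middle of the chain
        while True:
            s_next = s + (1 if rng.random() < 0.5 else -1)
            done = s_next < 0 or s_next >= n_states
            r = 1.0 if s_next >= n_states else 0.0   # reward only at the right end
            target = r if done else r + gamma * w @ phi(s_next, n_states)
            # Semi-gradient step: the bootstrapped target is treated as a constant.
            w += alpha * (target - w @ phi(s, n_states)) * phi(s, n_states)
            if done:
                break
            s = s_next
    return w


if __name__ == "__main__":
    print(np.round(semi_gradient_td0(), 3))          # estimated values along the chain
```

The semi-gradient part is the line that updates w: the gradient is taken only through the current state's value estimate, not through the bootstrapped target, which is what keeps the update cheap enough to run per decision epoch.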


Network Information
Related Topics (5)
Wireless sensor network: 142K papers, 2.4M citations, 93% related
Network packet: 159.7K papers, 2.2M citations, 93% related
Wireless network: 122.5K papers, 2.1M citations, 93% related
Server: 79.5K papers, 1.4M citations, 93% related
Key distribution in wireless sensor networks: 59.2K papers, 1.2M citations, 92% related
Performance
Metrics
No. of papers in the topic in previous years
Year: Papers
2023: 1,471
2022: 3,274
2021: 2,978
2020: 3,397
2019: 2,698
2018: 1,649