
Showing papers in "IEEE Internet of Things Journal in 2020"


Journal ArticleDOI
TL;DR: This article provides a comprehensive review of emerging and enabling technologies related to the 5G system that enables IoT, such as 5G new radio, multiple-input–multiple-output antennas with beamforming technology, mm-wave communication technology, heterogeneous networks (HetNets), and the role of augmented reality (AR) in IoT, which are discussed in detail.
Abstract: Recently, wireless technologies have been growing actively all around the world. In the context of wireless technology, fifth-generation (5G) technology has become the most challenging and interesting topic in wireless research. This article provides an overview of the Internet of Things (IoT) in 5G wireless systems. IoT in the 5G system will be a game changer for future generations of networks. It will open the door to new wireless architectures and smart services. The current LTE (4G) cellular network will not be sufficient or efficient to meet the demands of massive device connectivity, high data rates, more bandwidth, low-latency quality of service (QoS), and low interference. To address these challenges, we consider 5G the most promising technology. We provide a detailed overview of the challenges and the vision of various communication industries for 5G IoT systems. The different layers in 5G IoT systems are discussed in detail. This article provides a comprehensive review of emerging and enabling technologies related to the 5G system that enables IoT. We consider the technology drivers for 5G wireless technology, such as 5G new radio (NR), multiple-input–multiple-output antennas with beamforming technology, mm-wave communication technology, heterogeneous networks (HetNets), and the role of augmented reality (AR) in IoT, which are discussed in detail. We also provide a review of low-power wide-area networks (LPWANs), security challenges, and their countermeasures in the 5G IoT scenario. This article introduces the role of AR in the 5G IoT scenario and also discusses the research gaps and future directions. The focus is also on application areas of IoT in 5G systems. We, therefore, outline some of the important research directions in 5G IoT.

896 citations


Journal ArticleDOI
TL;DR: In this paper, the authors divide edge intelligence into AI for edge (intelligence-enabled edge computing) and AI on edge (artificial intelligence on edge), and provide insights into this new interdisciplinary field from a broader perspective.
Abstract: Along with the rapid developments in communication technologies and the surge in the use of mobile devices, a brand-new computation paradigm, edge computing, is surging in popularity. Meanwhile, the artificial intelligence (AI) applications are thriving with the breakthroughs in deep learning and the many improvements in hardware architectures. Billions of data bytes, generated at the network edge, put massive demands on data processing and structural optimization. Thus, there exists a strong demand to integrate edge computing and AI, which gives birth to edge intelligence. In this article, we divide edge intelligence into AI for edge (intelligence-enabled edge computing) and AI on edge (artificial intelligence on edge). The former focuses on providing more optimal solutions to key problems in edge computing with the help of popular and effective AI technologies while the latter studies how to carry out the entire process of building AI models, i.e., model training and inference, on the edge. This article provides insights into this new interdisciplinary field from a broader perspective. It discusses the core concepts and the research roadmap, which should provide the necessary background for potential future research initiatives in edge intelligence.

343 citations


Journal ArticleDOI
TL;DR: The incentive mechanism for federated learning to motivate edge nodes to contribute to model training is studied, and a deep reinforcement learning (DRL)-based incentive mechanism is designed to determine the optimal pricing strategy for the parameter server and the optimal training strategies for edge nodes.
Abstract: Internet of Things (IoT) generates large amounts of data at the network edge. Machine learning models are often built on these data to enable the detection, classification, and prediction of future events. Due to network bandwidth, storage, and especially privacy concerns, it is often impossible to send all the IoT data to the data center for centralized model training. To address these issues, federated learning has been proposed to let nodes use their local data to train models, which are then aggregated to synthesize a global model. Most of the existing work has focused on designing learning algorithms with provable convergence time, but other issues, such as the incentive mechanism, remain unexplored. Although incentive mechanisms have been extensively studied in network and computation resource allocation, they cannot be applied to federated learning directly due to the unique challenges of information unsharing and the difficulty of contribution evaluation. In this article, we study the incentive mechanism for federated learning to motivate edge nodes to contribute to model training. Specifically, a deep reinforcement learning (DRL)-based incentive mechanism is designed to determine the optimal pricing strategy for the parameter server and the optimal training strategies for edge nodes. Finally, numerical experiments are conducted to evaluate the efficiency of the proposed DRL-based incentive mechanism.
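The pricing interaction described above can be made concrete with a small, hedged sketch (not the paper's exact DRL design): the parameter server learns a posted unit price with stateless Q-learning, and each edge node responds with the training-data amount that maximizes its own utility. The price grid, per-node cost factors, and quadratic cost model are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
prices = np.linspace(0.1, 2.0, 10)       # candidate unit prices (assumed)
costs = rng.uniform(0.2, 1.0, size=5)    # per-node training cost factors (assumed)
q = np.zeros(len(prices))                # Q-value per price (stateless bandit)
eps, alpha = 0.1, 0.05

def node_response(price, c):
    # Node utility: price*d - c*d^2  ->  best response d* = price / (2c)
    return price / (2.0 * c)

for step in range(5000):
    a = rng.integers(len(prices)) if rng.random() < eps else int(np.argmax(q))
    price = prices[a]
    data = sum(node_response(price, c) for c in costs)
    # Server reward: concave model value of collected data minus total payment
    reward = np.log1p(data) - price * data
    q[a] += alpha * (reward - q[a])

print("learned price:", prices[int(np.argmax(q))])
```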

327 citations


Journal ArticleDOI
TL;DR: This article analyzes the main features of MEC in the context of 5G and IoT and presents several fundamental key technologies that enable MEC to be applied in 5G and IoT, such as cloud computing, software-defined networking/network function virtualization, information-centric networks, virtual machines (VMs) and containers, smart devices, network slicing, and computation offloading.
Abstract: To satisfy the increasing demand of mobile data traffic and meet the stringent requirements of the emerging Internet-of-Things (IoT) applications such as smart city, healthcare, and augmented/virtual reality (AR/VR), the fifth-generation (5G) enabling technologies are proposed and utilized in networks. As an emerging key technology of 5G and a key enabler of IoT, multiaccess edge computing (MEC), which integrates telecommunication and IT services, offers cloud computing capabilities at the edge of the radio access network (RAN). By providing computational and storage resources at the edge, MEC can reduce latency for end users. Hence, this article investigates MEC for 5G and IoT comprehensively. It analyzes the main features of MEC in the context of 5G and IoT and presents several fundamental key technologies which enable MEC to be applied in 5G and IoT, such as cloud computing, software-defined networking/network function virtualization, information-centric networks, virtual machines (VMs) and containers, smart devices, network slicing, and computation offloading. In addition, this article provides an overview of the role of MEC in 5G and IoT, bringing light into the different MEC-enabled 5G and IoT applications as well as the promising future directions of integrating MEC with 5G and IoT. Moreover, this article further elaborates research challenges and open issues of MEC for 5G and IoT. Last but not least, we propose a use case that utilizes MEC to achieve edge intelligence in IoT scenarios.

303 citations


Journal ArticleDOI
TL;DR: A background on the challenges that may be encountered when applying anomaly detection techniques to IoT data is provided, with examples of IoT anomaly detection applications taken from the literature.
Abstract: Anomaly detection is a problem with applications in a wide variety of domains; it involves the identification of novel or unexpected observations or sequences within the data being captured. The majority of current anomaly detection methods are highly specific to the individual use case, requiring expert knowledge of the method as well as the situation to which it is being applied. The Internet of Things (IoT), as a rapidly expanding field, offers many opportunities for this type of data analysis to be implemented; however, due to the nature of the IoT, this may be difficult. This review provides a background on the challenges that may be encountered when applying anomaly detection techniques to IoT data, with examples of IoT anomaly detection applications taken from the literature. We discuss a range of approaches that have been developed across a variety of domains, not limited to IoT, due to the relative novelty of this application. Finally, we summarize the current challenges being faced in the anomaly detection domain with a view to identifying potential research opportunities for the future.

271 citations


Journal ArticleDOI
TL;DR: This work proposes adapting FedAvg to use a distributed form of Adam optimization, greatly reducing the number of rounds to convergence, along with novel compression techniques, to produce communication-efficient FedAvg (CE-FedAvg), which converges to a target accuracy in fewer rounds and is more robust to aggressive compression.
Abstract: The rapidly expanding number of Internet of Things (IoT) devices is generating huge quantities of data, but public concern over data privacy means users are apprehensive about sending data to a central server for machine learning (ML) purposes. The easily changed behaviors of edge infrastructure that software-defined networking (SDN) provides make it possible to collate IoT data at edge servers and gateways, where federated learning (FL) can be performed: building a central model without uploading data to the server. FedAvg is an FL algorithm that has been the subject of much study; however, it suffers from a large number of rounds to convergence with non-independent identically distributed (non-IID) client data sets and high communication costs per round. We propose adapting FedAvg to use a distributed form of Adam optimization, greatly reducing the number of rounds to convergence, along with novel compression techniques, to produce communication-efficient FedAvg (CE-FedAvg). We perform extensive experiments with the MNIST/CIFAR-10 data sets, IID/non-IID client data, varying numbers of clients, client participation rates, and compression rates. These show that CE-FedAvg can converge to a target accuracy in up to 6× fewer rounds than similarly compressed FedAvg, while uploading up to 3× less data, and is more robust to aggressive compression. Experiments on an edge-computing-like testbed using Raspberry Pi clients also show that CE-FedAvg is able to reach a target accuracy in up to 1.7× less real time than FedAvg.
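As a hedged illustration of the idea (CE-FedAvg actually distributes the Adam moments across clients, so the exact scheme differs), the aggregated client update can be treated as a pseudo-gradient and passed through an Adam step at the server:

```python
import numpy as np

def server_adam_round(w, client_deltas, client_sizes, m, v, t,
                      lr=1e-2, b1=0.9, b2=0.999, eps=1e-8):
    n = sum(client_sizes)
    # Weighted FedAvg aggregation of client updates (delta_k = w_k - w)
    delta = sum((s / n) * d for d, s in zip(client_deltas, client_sizes))
    g = -delta                                  # pseudo-gradient
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy usage: a 4-parameter "model" and two clients of unequal data size
w = np.zeros(4); m = np.zeros(4); v = np.zeros(4)
deltas = [np.array([0.1, -0.2, 0.0, 0.3]), np.array([0.2, -0.1, 0.1, 0.2])]
w, m, v = server_adam_round(w, deltas, client_sizes=[60, 40], m=m, v=v, t=1)
print(w)
```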

271 citations


Journal ArticleDOI
TL;DR: This work proposes a federated deep-reinforcement-learning-based cooperative edge caching (FADE) framework that enables base stations to cooperatively learn a shared predictive model, and proves the expectation convergence of FADE.
Abstract: Edge caching is an emerging technology for addressing massive content access in mobile networks to support rapidly growing Internet-of-Things (IoT) services and applications. However, most current optimization-based methods lack a self-adaptive ability in dynamic environments. To tackle these challenges, current learning-based approaches are generally proposed in a centralized way; however, network resources may be overconsumed during the training and data transmission process. To address the complex and dynamic control issues, we propose a federated deep-reinforcement-learning-based cooperative edge caching (FADE) framework. FADE enables base stations (BSs) to cooperatively learn a shared predictive model by considering the first-round training parameters of the BSs as the initial input of the local training, and then uploads near-optimal local parameters to the BSs to participate in the next round of global training. Furthermore, we prove the expectation convergence of FADE. Trace-driven simulation results demonstrate the effectiveness of the proposed FADE framework in reducing performance loss and average delay, offloading backhaul traffic, and improving the hit rate.

252 citations


Journal ArticleDOI
TL;DR: This article surveys the existing and emerging technologies that can enable this vision for the future of healthcare, particularly, in the clinical practice of healthcare and discusses the emerging directions, open issues, and challenges.
Abstract: In combination with current sociological trends, the maturing development of Internet of Things devices is projected to revolutionize healthcare. A network of body-worn sensors, each with a unique ID, can collect health data that are orders of magnitude richer than what is available today from sporadic observations in clinical/hospital environments. When stored in databases, analyzed, and compared against information from other individuals using data analytics, Healthcare Internet of Things data enable the personalization and modernization of care with radical improvements in outcomes and reductions in cost. In this article, we survey the existing and emerging technologies that can enable this vision for the future of healthcare, particularly in the clinical practice of healthcare. Three main technology areas underlie the development of this field: 1) sensing, where there is an increased drive for miniaturization and power efficiency; 2) communications, where the enabling factors are ubiquitous connectivity, standardized protocols, and the wide availability of cloud infrastructure; and 3) data analytics and inference, where the availability of large amounts of data and computational resources is revolutionizing algorithms for individualizing inference and actions in health management. Throughout this article, we use a case study to concretely illustrate the impact of these trends. We conclude with a discussion of the emerging directions, open issues, and challenges.

243 citations


Journal ArticleDOI
TL;DR: This article develops an asynchronous advantage actor–critic-based cooperative computation offloading and resource allocation algorithm to solve the MDP problem and designs a multiobjective function to maximize the computation rate of MEC systems and the transaction throughput of blockchain systems.
Abstract: Mobile-edge computing (MEC) is a promising paradigm to improve the quality of computation experience of mobile devices because it allows mobile devices to offload computing tasks to MEC servers, benefiting from the powerful computing resources of MEC servers. However, the existing computation-offloading works still have some open issues: 1) security and privacy issues; 2) cooperative computation offloading; and 3) dynamic optimization. To address the security and privacy issues, we employ the blockchain technology that ensures the reliability and irreversibility of data in MEC systems. Meanwhile, we jointly design and optimize the performance of blockchain and MEC. In this article, we develop a cooperative computation offloading and resource allocation framework for blockchain-enabled MEC systems. In the framework, we design a multiobjective function to maximize the computation rate of MEC systems and the transaction throughput of blockchain systems by jointly optimizing offloading decision, power allocation, block size, and block interval. Due to the dynamic characteristics of the wireless fading channel and the processing queues at MEC servers, the joint optimization is formulated as a Markov decision process (MDP). To tackle the dynamics and complexity of the blockchain-enabled MEC system, we develop an asynchronous advantage actor–critic-based cooperative computation offloading and resource allocation algorithm to solve the MDP problem. In the algorithm, deep neural networks are optimized by utilizing asynchronous gradient descent and eliminating the correlation of data. The simulation results show that the proposed algorithm converges fast and achieves significant performance improvements over existing schemes in terms of total reward.
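A minimal sketch of the kind of multiobjective reward such an actor–critic agent could be trained on, combining MEC computation rate and blockchain transaction throughput; the weights and the simple throughput model below are assumptions, not the paper's exact formulation:

```python
def reward(computation_rate_bps, block_size_tx, block_interval_s,
           w_mec=0.5, w_chain=0.5):
    # Transaction throughput grows with block size and shrinks with interval
    tx_throughput = block_size_tx / block_interval_s   # transactions per second
    return w_mec * computation_rate_bps + w_chain * tx_throughput

print(reward(computation_rate_bps=2.0e6, block_size_tx=4000, block_interval_s=10))
```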

241 citations


Journal ArticleDOI
TL;DR: The requirements of basic road safety and advanced applications, the architecture, the key technologies, and the standards of C-V2X are introduced, highlighting the technical evolution path from LTE-V2X to NR-V2X.
Abstract: Cellular vehicle-to-everything (C-V2X) is an important enabling technology for autonomous driving and intelligent transportation systems. It is evolving from long-term evolution (LTE)-V2X to new radio (NR)-V2X, which will coexist with and complement each other to provide low-latency, high-reliability, and high-throughput communications for various C-V2X applications. In this article, a vision of C-V2X is presented. The requirements of basic road safety and advanced applications, the architecture, the key technologies, and the standards of C-V2X are introduced, highlighting the technical evolution path from LTE-V2X to NR-V2X. In particular, based on the continual and active promotion of C-V2X research, field testing, and development in China, the related works and progress are also presented. Finally, the trends of C-V2X applications and their technical challenges are envisioned.

237 citations


Journal ArticleDOI
TL;DR: An innovative UAV-enabled MEC system involving the interactions among IoT devices, UAV, and edge clouds (ECs) and an efficient algorithm based on the successive convex approximation to obtain suboptimal solutions is proposed.
Abstract: Mobile edge computing (MEC) is an emerging technology to support resource-intensive yet delay-sensitive applications using small cloud-computing platforms deployed at the mobile network edges. However, the existing MEC techniques are not applicable to situations where the number of mobile users increases explosively or the network facilities are sparsely distributed. In view of this insufficiency, unmanned aerial vehicles (UAVs) have been employed to improve the connectivity of ground Internet of Things (IoT) devices thanks to their high altitude. This article proposes an innovative UAV-enabled MEC system involving the interactions among IoT devices, a UAV, and edge clouds (ECs). The system deploys and operates a UAV to facilitate MEC service provisioning to a set of IoT devices in regions where the existing ECs are not accessible to IoT devices due to terrestrial signal blockage or shadowing. The UAV and ECs in the system collaboratively provide MEC services to the IoT devices. For optimal service provisioning in this system, we formulate an optimization problem aiming at minimizing the weighted sum of the service delay of all IoT devices and UAV energy consumption by jointly optimizing the UAV position, communication and computing resource allocation, and task splitting decisions. However, the resulting optimization problem is highly nonconvex and thus difficult to solve optimally. To tackle this problem, we develop an efficient algorithm based on successive convex approximation to obtain suboptimal solutions. Numerical experiments demonstrate that our proposed collaborative UAV-EC offloading scheme largely outperforms baseline schemes that rely solely on UAVs or ECs for MEC in IoT.

Journal ArticleDOI
TL;DR: This article aims to provide theoretical, methodological, and technical guidance for IoT search access control mechanisms in large-scale dynamic heterogeneous environments based on a literature review, and analyzes future development directions of access control in the age of IoT.
Abstract: With the development of Internet-of-Things (IoT) technology, various types of information, such as social resources and physical resources, are deeply integrated for different comprehensive applications. Social networking, car networking, medical services, video surveillance, and other forms of the IoT information service model are gradually changing people’s daily lives. Faced with vast amounts of IoT data, IoT search technology is used to quickly find accurate information to meet users’ real-time search needs. However, IoT search requires using a large amount of users’ private information, such as personal health information, location information, and social relations information, to provide personalized services. Employing private information from users raises security problems if an effective access control mechanism is missing during the IoT search process. An access control mechanism can effectively monitor the access activities of resources and ensure that authorized users access information resources under legitimate conditions. This survey examines the growing literature on access control for IoT search. Problems and challenges of access control mechanisms are analyzed to facilitate the adoption of access control solutions in real-life settings. This article aims to provide theoretical, methodological, and technical guidance for IoT search access control mechanisms in large-scale dynamic heterogeneous environments. Based on a literature review, we also analyze future development directions of access control in the age of IoT.

Journal ArticleDOI
TL;DR: A novel UAV-assisted IoT network is proposed, in which a low-altitude UAV platform is employed as both a mobile data collector and an aerial anchor node to assist terrestrial BSs in data collection and device positioning.
Abstract: The Internet of Things (IoT) will significantly change both industrial manufacturing and our daily lives. Data collection and 3-D positioning of IoT devices are two indispensable services of such networks. However, in conventional networks, only terrestrial base stations (BSs) are used to provide these two services. On the one hand, this leads to high energy consumption for devices transmitting at cell edges. On the other hand, terrestrial BSs are relatively close in height, resulting in poor performance of device positioning in elevation. Due to their high maneuverability and flexible deployment, unmanned aerial vehicles (UAVs) could be a promising technology to overcome the above shortcomings. In this article, we propose a novel UAV-assisted IoT network, in which a low-altitude UAV platform is employed as both a mobile data collector and an aerial anchor node to assist terrestrial BSs in data collection and device positioning. We aim to minimize the maximum energy consumption of all devices by jointly optimizing the UAV trajectory and devices’ transmission schedule over time, while ensuring the reliability of data collection and required 3-D positioning performance. This formulation is a mixed-integer nonconvex optimization problem, and an efficient differential evolution (DE)-based method is proposed for solving it. Numerical results demonstrate that the proposed network and the optimization method achieve significant performance gains in both energy-efficient data collection and 3-D device positioning, as compared with a conventional terrestrial IoT network.
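A hedged sketch of how differential evolution can drive such a min-max energy design, assuming a simplified free-space energy model and a small set of UAV hover waypoints standing in for the full trajectory-and-schedule optimization; scipy's differential_evolution is used, and all scenario constants are invented:

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
devices = rng.uniform(0, 1000, size=(8, 2))   # ground device positions (m)
H, K = 100.0, 3                               # UAV altitude (m), waypoint count

def max_device_energy(x):
    wp = x.reshape(K, 2)                      # candidate hover waypoints
    # Each device transmits to its nearest waypoint; energy ~ d^2 (free space)
    d2 = ((devices[:, None, :] - wp[None, :, :]) ** 2).sum(-1) + H ** 2
    return d2.min(axis=1).max()               # worst-case device energy proxy

bounds = [(0, 1000)] * (2 * K)
res = differential_evolution(max_device_energy, bounds, seed=1, maxiter=200)
print("waypoints:\n", res.x.reshape(K, 2))
```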

Journal ArticleDOI
TL;DR: This article proposes a learning-based channel selection framework with service reliability awareness, energy awareness, backlog awareness, and conflict awareness, by leveraging the combined power of machine learning, Lyapunov optimization, and matching theory, and proves that the proposed framework can achieve guaranteed performance.
Abstract: Edge computing provides a promising paradigm to support the implementation of Industrial Internet of Things (IIoT) by offloading computational-intensive tasks from resource-limited machine-type devices (MTDs) to powerful edge servers. However, the performance gain of edge computing may be severely compromised due to limited spectrum resources, capacity-constrained batteries, and context unawareness. In this article, we consider the optimization of channel selection that is critical for efficient and reliable task delivery. We aim at maximizing the long-term throughput subject to long-term constraints of energy budget and service reliability. We propose a learning-based channel selection framework with service reliability awareness, energy awareness, backlog awareness, and conflict awareness, by leveraging the combined power of machine learning, Lyapunov optimization, and matching theory. We provide rigorous theoretical analysis, and prove that the proposed framework can achieve guaranteed performance with a bounded deviation from the optimal performance with global state information (GSI) based on only local and causal information. Finally, simulations are conducted under both single-MTD and multi-MTD scenarios to verify the effectiveness and reliability of the proposed framework.
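A minimal sketch of the Lyapunov drift-plus-penalty idea behind such a channel selection rule: a virtual queue tracks the energy-budget deficit, and in each slot the device picks the channel maximizing V*rate - Q*energy. The per-channel rates, energies, and budget below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
V, e_budget, Q = 50.0, 1.0, 0.0               # tradeoff weight, per-slot budget, virtual queue
for t in range(1000):
    rate = rng.uniform(0.5, 5.0, size=4)      # observed per-channel rate
    energy = rng.uniform(0.5, 2.0, size=4)    # per-channel energy cost
    c = int(np.argmax(V * rate - Q * energy)) # drift-plus-penalty choice
    Q = max(Q + energy[c] - e_budget, 0.0)    # virtual queue update
print("final energy deficit queue:", Q)
```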

Journal ArticleDOI
TL;DR: In this paper, the authors investigated an energy cost minimization problem for a smart home in the absence of a building thermal dynamics model with the consideration of a comfortable temperature range, and proposed an energy management algorithm based on deep deterministic policy gradients.
Abstract: In this article, we investigate an energy cost minimization problem for a smart home in the absence of a building thermal dynamics model with the consideration of a comfortable temperature range. Due to the existence of model uncertainty, parameter uncertainty (e.g., renewable generation output, nonshiftable power demand, outdoor temperature, and electricity price), and temporally coupled operational constraints, it is very challenging to design an optimal energy management algorithm for scheduling heating, ventilation, and air conditioning systems and energy storage systems in the smart home. To address the challenge, we first formulate the above problem as a Markov decision process, and then propose an energy management algorithm based on deep deterministic policy gradients. It is worth mentioning that the proposed algorithm does not require the prior knowledge of uncertain parameters and building the thermal dynamics model. The simulation results based on real-world traces demonstrate the effectiveness and robustness of the proposed algorithm.
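A compact, hedged sketch of the core update of a deep deterministic policy gradient agent of this kind (PyTorch); the state/action sizes, network widths, and hyperparameters are assumptions, and the full pipeline with replay buffers and exploration noise is omitted:

```python
import copy
import torch
import torch.nn as nn

S, A, gamma, tau = 6, 2, 0.99, 0.005   # state/action dims, discount, Polyak rate

def mlp(inp, out, squash=False):
    layers = [nn.Linear(inp, 64), nn.ReLU(), nn.Linear(64, out)]
    return nn.Sequential(*((layers + [nn.Tanh()]) if squash else layers))

actor, critic = mlp(S, A, squash=True), mlp(S + A, 1)
actor_t, critic_t = copy.deepcopy(actor), copy.deepcopy(critic)
opt_a = torch.optim.Adam(actor.parameters(), lr=1e-4)
opt_c = torch.optim.Adam(critic.parameters(), lr=1e-3)

def ddpg_update(s, a, r, s2):
    # Critic: regress Q(s,a) toward r + gamma * Q_target(s2, actor_target(s2))
    with torch.no_grad():
        y = r + gamma * critic_t(torch.cat([s2, actor_t(s2)], dim=1))
    q = critic(torch.cat([s, a], dim=1))
    loss_c = nn.functional.mse_loss(q, y)
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()
    # Actor: deterministic policy gradient, maximize Q(s, actor(s))
    loss_a = -critic(torch.cat([s, actor(s)], dim=1)).mean()
    opt_a.zero_grad(); loss_a.backward(); opt_a.step()
    # Polyak-average the target networks
    for t_net, net in ((actor_t, actor), (critic_t, critic)):
        for pt, p in zip(t_net.parameters(), net.parameters()):
            pt.data.mul_(1 - tau).add_(tau * p.data)

# One toy update on a random minibatch
B = 32
ddpg_update(torch.randn(B, S), torch.rand(B, A), torch.randn(B, 1), torch.randn(B, S))
```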

Journal ArticleDOI
TL;DR: Simulation results demonstrate that the proposed cooperative caching system can reduce the system cost, as well as the content delivery latency, and improve content hit ratio, as compared to the noncooperative and random edge caching schemes.
Abstract: In this article, we propose a cooperative edge caching scheme, a new paradigm to jointly optimize the content placement and content delivery in the vehicular edge computing and networks, with the aid of the flexible trilateral cooperations among a macro-cell station, roadside units, and smart vehicles. We formulate the joint optimization problem as a double time-scale Markov decision process (DTS-MDP), based on the fact that the time-scale of content timeliness changes less frequently as compared to the vehicle mobility and network states during the content delivery process. At the beginning of the large time-scale, the content placement/updating decision can be obtained according to the content popularity, vehicle driving paths, and resource availability. On the small time-scale, the joint vehicle scheduling and bandwidth allocation scheme is designed to minimize the content access cost while satisfying the constraint on content delivery latency. To solve the long-term mixed integer linear programming (LT-MILP) problem, we propose a nature-inspired method based on the deep deterministic policy gradient (DDPG) framework to obtain a suboptimal solution with a low computation complexity. The simulation results demonstrate that the proposed cooperative caching system can reduce the system cost, as well as the content delivery latency, and improve content hit ratio, as compared to the noncooperative and random edge caching schemes.

Journal ArticleDOI
TL;DR: Passban is presented, an intelligent intrusion detection system (IDS) able to protect the IoT devices that are directly connected to it; it can be deployed directly on very cheap IoT gateways, taking full advantage of the edge computing paradigm to detect cyber threats as close as possible to the corresponding data sources.
Abstract: Cyber-threat protection is one of today’s most challenging research branches of information technology, while the exponentially increasing number of tiny, connected devices able to push personal data to the Internet is doing nothing but exacerbating the battle between the involved parties. Thus, this protection becomes crucial with a typical Internet-of-Things (IoT) setup, as it usually involves several IoT-based data sources interacting with the physical world within various application domains, such as agriculture, health care, home automation, critical industrial processes, etc. Unfortunately, contemporary IoT devices often offer very limited security features, laying themselves open to ever newer and more sophisticated attacks and also inhibiting the expected global adoption of IoT technologies, not to mention the millions of IoT devices already deployed without any hardware security support. In this context, it is crucial to develop tools able to detect such cyber threats. In this article, we present Passban, an intelligent intrusion detection system (IDS) able to protect the IoT devices that are directly connected to it. The peculiarity of the proposed solution is that it can be deployed directly on very cheap IoT gateways (e.g., single-board PCs currently costing a few tens of U.S. dollars), hence taking full advantage of the edge computing paradigm to detect cyber threats as close as possible to the corresponding data sources. We will demonstrate that Passban is able to detect various types of malicious traffic, including port scanning, HTTP and SSH brute force, and SYN flood attacks with very low false positive rates and satisfactory accuracy.

Journal ArticleDOI
TL;DR: A multi-UAV-aided mobile-edge computing (MEC) system is constructed, where multiple UAVs act as MEC nodes in order to provide computing offloading services for ground IoT nodes which have limited local computing capabilities.
Abstract: Unmanned aerial vehicles (UAVs) have been widely used to provide enhanced information coverage as well as relay services for ground Internet-of-Things (IoT) networks. Considering their substantially limited processing capability, IoT devices may not be able to tackle heavy computing tasks. In this article, a multi-UAV-aided mobile-edge computing (MEC) system is constructed, where multiple UAVs act as MEC nodes to provide computing offloading services for ground IoT nodes that have limited local computing capabilities. To balance the load among UAVs, a differential evolution (DE)-based multi-UAV deployment mechanism is proposed, in which we model the access problem as a generalized assignment problem (GAP), which is then solved by a near-optimal algorithm. Based on this, we are capable of achieving load balance among these drones while guaranteeing the coverage constraint and satisfying the quality of service (QoS) of IoT nodes. Furthermore, a deep reinforcement learning (DRL) algorithm is conceived for task scheduling within a given UAV, which improves the efficiency of task execution in each UAV. Finally, simulation results show the feasibility and superiority of our proposed load-balance-oriented UAV deployment scheme as well as the task scheduling algorithm.
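A hedged sketch of a greedy near-optimal heuristic for the generalized assignment problem (GAP) flavor described above: each IoT node joins the least-loaded UAV that covers it and still has capacity. The coverage radius and capacity values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
nodes = rng.uniform(0, 1000, size=(30, 2))                 # IoT node positions
uavs = np.array([[250, 250], [750, 250], [500, 750]], dtype=float)
capacity, radius = 12, 600.0                               # assumed per-UAV limits
load = [0] * len(uavs)
assign = {}
for i, p in enumerate(nodes):
    d = np.linalg.norm(uavs - p, axis=1)
    # Candidate UAVs that cover the node and still have spare capacity
    cand = [u for u in np.argsort(d) if d[u] <= radius and load[u] < capacity]
    if cand:
        u = min(cand, key=lambda u: load[u])               # least-loaded covering UAV
        assign[i] = int(u)
        load[u] += 1
print(len(assign), "nodes assigned; per-UAV load:", load)
```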

Journal ArticleDOI
TL;DR: This work introduces a privacy-preserving machine learning technique named federated learning (FL) and proposes an FL-based gated recurrent unit neural network algorithm (FedGRU) for traffic flow prediction (TFP) that differs from current centralized learning methods and updates universal learning models through a secure parameter aggregation mechanism.
Abstract: Existing deep-learning-based traffic flow forecasting approaches achieve excellent success using large volumes of data gathered by governments and organizations. However, these data sets may contain a great deal of users’ private data, which challenges current prediction approaches, as user privacy has become a growing public concern in recent years. Therefore, how to develop accurate traffic prediction while preserving privacy is a significant problem to be solved, and there is a tradeoff between these two objectives. To address this challenge, we introduce a privacy-preserving machine learning technique named federated learning (FL) and propose an FL-based gated recurrent unit neural network algorithm (FedGRU) for traffic flow prediction (TFP). FedGRU differs from current centralized learning methods and updates universal learning models through a secure parameter aggregation mechanism rather than directly sharing raw data among organizations. In the secure parameter aggregation mechanism, we adopt a federated averaging algorithm to reduce the communication overhead during the model parameter transmission process. Furthermore, we design a joint announcement protocol to improve the scalability of FedGRU. We also propose an ensemble clustering-based scheme for TFP by grouping the organizations into clusters before applying the FedGRU algorithm. Extensive case studies on a real-world data set demonstrate that FedGRU can produce predictions that are merely 0.76 km/h worse than the state of the art in terms of mean average error under the privacy preservation constraint, confirming that the proposed model develops accurate traffic predictions without compromising data privacy.
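At the heart of FedGRU is a federated averaging step in which organizations share only model weights, combined in proportion to their local data sizes (no raw data is exchanged). A minimal sketch, with toy layer shapes and client sizes as assumptions:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    # Weighted mean of each layer's weights: w = sum_k (n_k / n) * w_k
    n = float(sum(client_sizes))
    return [
        sum((s / n) * w[layer] for w, s in zip(client_weights, client_sizes))
        for layer in range(len(client_weights[0]))
    ]

# Toy usage: two clients, each holding two weight arrays of a (tiny) GRU
w1 = [np.ones((3, 3)), np.zeros(3)]
w2 = [np.zeros((3, 3)), np.ones(3)]
global_w = federated_average([w1, w2], client_sizes=[300, 100])
print(global_w[0][0], global_w[1])
```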

Journal ArticleDOI
TL;DR: The role of IoT and big data analysis in agriculture, supply chain modernization, social media in food industry, food quality assessment, and food safety are discussed, along with the commercial status of applications and translational research outcomes.
Abstract: The Internet of things (IoT) produces massive amounts of streaming data, often referred to as “big data”, which bring new opportunities to monitor agricultural and food processes. Besides sensors, big data from social media are also becoming important for the food industry. In this review, we present an overview of IoT, big data, and artificial intelligence (AI) and their disruptive role in shaping the future of agri-food systems. Following an introduction to the fields of IoT, big data, and AI, we discuss the role of IoT and big data analysis in agriculture (including greenhouse monitoring, intelligent farm machines, and drone-based crop imaging), supply chain modernization, social media (for open innovation and sentiment analysis) in the food industry, food quality assessment (using spectral methods and sensor fusion), and finally, food safety (using gene sequencing and blockchain-based digital traceability). Special emphasis is placed on the commercial status of applications and translational research outcomes.

Journal ArticleDOI
TL;DR: Performance evaluation results validate that the proposed scheme is indeed capable of reducing the latency as well as improving the reliability of the EC-SDIoV.
Abstract: The Internet of Vehicles (IoV) has drawn great interest in recent years. Various IoV applications have emerged for improving safety, efficiency, and comfort on the road. Cloud computing constitutes a popular technique for supporting delay-tolerant entertainment applications. However, for advanced latency-sensitive applications (e.g., auto/assisted driving and emergency failure management), cloud computing may result in excessive delay. Edge computing, which extends computing and storage capabilities to the edge of the network, emerges as an attractive technology. Therefore, to support these computationally intensive and latency-sensitive applications in IoVs, in this article, we integrate mobile-edge computing nodes (i.e., mobile vehicles) and fixed edge computing nodes (i.e., fixed road infrastructures) to provide low-latency computing services cooperatively. To better exploit these heterogeneous edge computing resources, the concept of software-defined networking (SDN) and edge-computing-aided IoV (EC-SDIoV) is conceived. Moreover, in a complex and dynamic IoV environment, the outage of both processing nodes and communication links becomes inevitable, which may have life-threatening consequences. To ensure the highly reliable completion of latency-sensitive IoV services, we introduce both partial computation offloading and reliable task allocation with a reprocessing mechanism to EC-SDIoV. Since the optimization problem is nonconvex and NP-hard, a heuristic fault-tolerant particle swarm optimization algorithm for maximizing reliability (FPSO-MR) under latency constraints is designed. Performance evaluation results validate that the proposed scheme is indeed capable of reducing the latency as well as improving the reliability of EC-SDIoV.
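A minimal sketch of the particle swarm optimization core that an FPSO-MR-style algorithm builds on, with a stand-in fitness function in place of the paper's reliability objective; all swarm hyperparameters below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
dim, n_particles = 10, 20
x = rng.uniform(0, 1, (n_particles, dim))     # positions (allocation weights)
v = np.zeros_like(x)

def fitness(p):                               # stand-in for a reliability score
    return -np.sum((p - 0.7) ** 2, axis=-1)

pbest, pbest_f = x.copy(), fitness(x)
gbest = pbest[pbest_f.argmax()].copy()
w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and attraction weights
for it in range(200):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, 0, 1)
    f = fitness(x)
    better = f > pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()
print("best allocation:", np.round(gbest, 2))
```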

Journal ArticleDOI
TL;DR: In this article, the role of edge computing in realizing the vision of smart cities is highlighted, and several indispensable open challenges along with their causes and guidelines are discussed, serving as future research directions.
Abstract: Recent years have disclosed a remarkable proliferation of compute-intensive applications in smart cities. Such applications continuously generate enormous amounts of data which demand strict latency-aware computational processing capabilities. Although edge computing is an appealing technology to compensate for stringent latency-related issues, its deployment engenders new challenges. In this article, we highlight the role of edge computing in realizing the vision of smart cities. First, we analyze the evolution of edge computing paradigms. Subsequently, we critically review the state-of-the-art literature focusing on edge computing applications in smart cities. Later, we categorize and classify the literature by devising a comprehensive and meticulous taxonomy. Furthermore, we identify and discuss key requirements, and enumerate recently reported synergies of edge computing-enabled smart cities. Finally, several indispensable open challenges along with their causes and guidelines are discussed, serving as future research directions.

Journal ArticleDOI
TL;DR: A novel framework based on computer-assisted diagnosis and IoT is suggested to detect and monitor type-2 diabetes patients; the recommended healthcare system aims to obtain better diagnostic accuracy on uncertain data.
Abstract: The Internet of Things (IoT) has gained importance with the growth of applications in the fields of ubiquitous and context-aware computing. In the IoT, anything can become part of the network, whether unintelligent objects or sensor nodes; thus, extremely diverse kinds of services can be developed. In this regard, data storage, resource management, service creation and discovery, and resource and power management would benefit from more advanced mechanisms and much better infrastructure. Cloud computing and fog computing play an important role when the quantity of IoT data and information is too large for standalone, resource-constrained IoT devices to handle. The cloud of things, an integration of the IoT with cloud or fog computing, can help realize the objectives of the evolving IoT and the future Internet. Fog computing is an extension of the notion of cloud computing to the network edge, making it suitable for the IoT and other applications that need real-time interactions. Despite the virtually unlimited resources and services offered by the cloud, such as intelligent building monitoring, cloud computing still faces various difficulties when many smart things are involved in people’s lives. Mobility, response time, and location awareness are the most prominent problems. Fog and mobile edge computing have been established to overcome these difficulties of cloud computing. In this article, we suggest a novel framework based on computer-assisted diagnosis and the IoT to detect and monitor type-2 diabetes patients. The recommended healthcare system aims to obtain better diagnostic accuracy on uncertain data. The overall experimental results indicate the validity and robustness of our proposed algorithms.

Journal ArticleDOI
TL;DR: A novel blockchain-enabled federated learning (FL-Block) scheme is proposed that enables autonomous machine learning without any centralized authority to maintain the global model, coordinating instead via the Proof-of-Work consensus mechanism of the blockchain.
Abstract: As an extension of cloud computing and a foundation of IoT, fog computing is experiencing rapid growth because of its potential to mitigate some troublesome issues, such as network congestion, latency, and local autonomy. However, privacy issues and the resulting inefficiency are dragging down the performance of fog computing. The majority of existing works hardly consider a reasonable balance between them while suffering from poisoning attacks. To address the aforementioned issues, we propose a novel blockchain-enabled federated learning (FL-Block) scheme to close the gap. FL-Block allows local learning updates of end devices to be exchanged with a blockchain-based global learning model, which is verified by miners. Built upon this, FL-Block enables autonomous machine learning without any centralized authority to maintain the global model, coordinating instead via the Proof-of-Work consensus mechanism of the blockchain. Furthermore, we analyze the latency performance of FL-Block and derive the optimal block generation rate by taking communication and consensus delays and computation cost into consideration. Extensive evaluation results show the superior performance of FL-Block in terms of privacy protection, efficiency, and resistance to poisoning attacks.
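A hedged sketch of the Proof-of-Work step in such a blockchain-coordinated FL round: a miner packages digests of verified local updates into a block header and searches for a nonce that meets the difficulty target. The field names and difficulty are illustrative, not FL-Block's exact block format:

```python
import hashlib
import json

def mine_block(prev_hash, update_hashes, difficulty=4):
    prefix = "0" * difficulty
    nonce = 0
    while True:
        header = json.dumps({
            "prev": prev_hash,
            "updates": update_hashes,   # digests of local model updates
            "nonce": nonce,
        }, sort_keys=True).encode()
        h = hashlib.sha256(header).hexdigest()
        if h.startswith(prefix):        # PoW condition met
            return h, nonce
        nonce += 1

updates = [hashlib.sha256(b"local-update-%d" % i).hexdigest() for i in range(3)]
block_hash, nonce = mine_block("0" * 64, updates, difficulty=4)
print(nonce, block_hash)
```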

Journal ArticleDOI
TL;DR: The results show that the multiattribute decision making based on SDN and NFV can select the appropriate MEC center, further reduce the server response time and improve the quality of user service experience.
Abstract: To improve the stability of the mobile network system for next-generation Internet of Things (IoT) applications, balance the network load, and guarantee the quality of the user service experience, this article first introduces a computation migration framework for next-generation networks and summarizes the concept and content of mobile edge computing (MEC) using software-defined networking (SDN) and network function virtualization (NFV). It then introduces the MEC strategy based on SDN and NFV technology together with multiattribute decision making, covering computation migration, the multiattribute decision, the MEC decision model based on SDN and NFV technology, and the solving process of that decision model. Finally, three sets of MATLAB-based simulation experiments are designed to validate the multiattribute decision of the MEC migration strategy based on SDN and NFV. The results show that multiattribute decision making based on SDN and NFV can select the appropriate MEC center, further reduce server response time, and improve the quality of the user service experience. This article is of great significance to the application of IoT terminals in next-generation network environments.

Journal ArticleDOI
TL;DR: The threats, security requirements, challenges, and the attack vectors pertinent to IoT networks are reviewed, and a novel paradigm that combines a network-based deployment of IoT architecture through software-defined networking (SDN) is proposed.
Abstract: The Internet of Things (IoT) is transforming everyone’s life by providing features such as the controlling and monitoring of connected smart objects. IoT applications range over a broad spectrum of services including smart cities, homes, cars, manufacturing, e-healthcare, smart control systems, transportation, wearables, farming, and much more. The adoption of these devices is growing exponentially, which has resulted in the generation of a substantial amount of data for processing and analysis. Thus, besides bringing ease to human lives, these devices are susceptible to different threats and security challenges, which not only make users wary of adopting them in sensitive environments, such as e-health and smart homes, but also pose hazards to the advancement of IoT in the coming days. This article thoroughly reviews the threats, security requirements, challenges, and attack vectors pertinent to IoT networks. Based on the gap analysis, a novel paradigm that combines a network-based deployment of the IoT architecture through software-defined networking (SDN) is proposed. This article presents an overview of SDN along with a thorough discussion of SDN-based IoT deployment models, i.e., centralized and decentralized. We further elaborate on SDN-based IoT security solutions to present a comprehensive overview of software-defined security (SDSec) technology. Furthermore, based on the literature, core issues are highlighted that are the main hurdles in unifying all IoT stakeholders on one platform, together with a few findings that emphasize a network-based security solution for the IoT paradigm. Finally, some future research directions of SDN-based IoT security technologies are discussed.

Journal ArticleDOI
TL;DR: This article focuses on the deep-learning-enhanced HAR in IoHT environments, and a semisupervised deep learning framework is designed and built for more accurate HAR, which efficiently uses and analyzes the weakly labeled sensor data to train the classifier learning model.
Abstract: Along with the advancement of several emerging computing paradigms and technologies, such as cloud computing, mobile computing, artificial intelligence, and big data, Internet of Things (IoT) technologies have been applied in a variety of fields. In particular, the Internet of Healthcare Things (IoHT) is becoming increasingly important in human activity recognition (HAR) due to the rapid development of wearable and mobile devices. In this article, we focus on deep-learning-enhanced HAR in IoHT environments. A semisupervised deep learning framework is designed and built for more accurate HAR, which efficiently uses and analyzes weakly labeled sensor data to train the classifier learning model. To better solve the problem of inadequately labeled samples, an intelligent autolabeling scheme based on a deep Q-network (DQN) is developed with a newly designed distance-based reward rule, which can improve the learning efficiency in IoT environments. A multisensor-based data fusion mechanism is then developed to seamlessly integrate on-body sensor data, context sensor data, and personal profile data, and a long short-term memory (LSTM)-based classification method is proposed to identify fine-grained patterns according to the high-level features contextually extracted from the sequential motion data. Finally, experiments and evaluations are conducted to demonstrate the usefulness and effectiveness of the proposed method using real-world data.
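A minimal sketch of what a distance-based reward rule for the DQN autolabeler could look like: assigning a label is rewarded when the segment's features lie close to that class's centroid and penalized otherwise. The threshold and centroids are toy assumptions, not the paper's exact rule:

```python
import numpy as np

def distance_reward(features, centroids, action, threshold=1.0):
    # Reward +1 if the chosen label's centroid is within the threshold, else -1
    d = np.linalg.norm(features - centroids[action])
    return 1.0 if d <= threshold else -1.0

centroids = {0: np.array([0.0, 0.0]), 1: np.array([3.0, 3.0])}
print(distance_reward(np.array([0.2, -0.1]), centroids, action=0))  # -> 1.0
print(distance_reward(np.array([0.2, -0.1]), centroids, action=1))  # -> -1.0
```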

Journal ArticleDOI
TL;DR: This article proposes to minimize the long-term energy consumption of a THz wireless access-based MEC system for high-quality immersive VR video services by jointly optimizing the viewport rendering offloading and downlink transmit power control policies, via an asynchronous advantage actor–critic (A3C)-based joint optimization algorithm.
Abstract: Immersive virtual reality (VR) video is becoming increasingly popular owing to its enhanced immersive experience. To enjoy ultrahigh-resolution immersive VR video with wireless user equipment, such as head-mounted displays (HMDs), ultralow-latency viewport rendering and data transmission are the core prerequisites, which could not be achieved without huge bandwidth and superior processing capabilities. Besides, the potentially very high energy consumption at the HMD may impede the rapid development of wireless panoramic VR video. Multiaccess edge computing (MEC) has emerged as a promising technology to reduce both the task processing latency and the energy consumption for HMDs, while bandwidth-rich terahertz (THz) communication is expected to enable ultrahigh-speed wireless data transmission. In this article, we propose to minimize the long-term energy consumption of a THz wireless access-based MEC system supporting high-quality immersive VR video services by jointly optimizing the viewport rendering offloading and downlink transmit power control. Considering the time-varying nature of wireless channel conditions, we propose a deep reinforcement learning-based approach to learn the optimal viewport rendering offloading and transmit power control policies, and an asynchronous advantage actor–critic (A3C)-based joint optimization algorithm is proposed. The simulation results demonstrate that the proposed algorithm converges quickly under different learning rates and outperforms existing algorithms in terms of minimized energy consumption and maximized reward.

Journal ArticleDOI
TL;DR: In this article, a symbiotic radio (SR) system is proposed to support passive Internet of Things (IoT), in which a backscatter device (BD), also called IoT device, is parasitic in a primary transmission.
Abstract: In this article, a symbiotic radio (SR) system is proposed to support passive Internet of Things (IoT), in which a backscatter device (BD), also called IoT device, is parasitic in a primary transmission. The primary transmitter (PT) is designed to assist both the primary and BD transmissions, and the primary receiver (PR) is used to decode the information from the PT as well as the BD. The symbol period for BD transmission is assumed to be either equal to or much greater than that of the primary one, resulting in parasitic SR (PSR) or commensal SR (CSR) setup. We consider a basic SR system which consists of three nodes: 1) a multiantenna PT; 2) a single-antenna BD; and 3) a single-antenna PR. We first derive the achievable rates for the primary and BD transmissions for each setup. Then, we formulate two transmit beamforming optimization problems, i.e., the weighted sum-rate maximization (WSRM) problem and the transmit power minimization (TPM) problem, and solve these nonconvex problems by applying the semidefinite relaxation (SDR) technique. In addition, a novel transmit beamforming structure is proposed to reduce the computational complexity of the solutions. The simulation results show that for CSR setup, the proposed solution enables the opportunistic transmission for the BD via energy-efficient passive backscattering without any loss in spectral efficiency, by properly exploiting the additional signal path from the BD.
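The semidefinite relaxation (SDR) step for the transmit power minimization (TPM) problem can be sketched with cvxpy: lift the beamformer w to X = ww^H, drop the rank-1 constraint, solve the resulting SDP, and recover a beamformer from the principal eigenvector. The channels, SNR targets, and simplified constraint form below are synthetic stand-ins for the paper's full SR constraints:

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(5)
n = 4                                                # PT antennas
h_p = rng.normal(size=n) + 1j * rng.normal(size=n)   # PT -> PR channel (synthetic)
h_b = rng.normal(size=n) + 1j * rng.normal(size=n)   # PT -> BD channel (synthetic)
gamma_p, gamma_b, noise = 10.0, 1.0, 1.0             # SNR targets and noise power

X = cp.Variable((n, n), hermitian=True)              # X stands in for w w^H
constraints = [
    X >> 0,                                          # PSD; rank-1 dropped by SDR
    cp.real(h_p.conj() @ X @ h_p) >= gamma_p * noise,  # PR SNR target
    cp.real(h_b.conj() @ X @ h_b) >= gamma_b * noise,  # BD link target
]
prob = cp.Problem(cp.Minimize(cp.real(cp.trace(X))), constraints)
prob.solve(solver=cp.SCS)

# Recover a beamformer from the principal eigenvector (Gaussian randomization
# would be used if the solution were not close to rank-1).
vals, vecs = np.linalg.eigh(X.value)
w = np.sqrt(max(vals[-1], 0.0)) * vecs[:, -1]
print("minimized transmit power:", float(np.real(np.trace(X.value))))
```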

Journal ArticleDOI
TL;DR: This article considers a UAV-enabled mobile-edge computing system for Internet-of-Things (IoT) computation offloading with limited or no common cloud/edge infrastructure and proposes a Pareto-optimal solution that balances the tradeoff between the UAV energy and completion time.
Abstract: Completion time and energy consumption of the unmanned aerial vehicle (UAV) are two important design aspects in UAV-enabled applications. In this article, we consider a UAV-enabled mobile-edge computing (MEC) system for Internet-of-Things (IoT) computation offloading with limited or no common cloud/edge infrastructure. We study the joint design of computation offloading and resource allocation, as well as the UAV trajectory, to minimize the energy consumption and completion time of the UAV, subject to the IoT devices’ task and energy budget constraints. We first consider the UAV energy minimization problem without a predetermined completion time; a discretized nonconvex equivalent problem is obtained using the path discretization technique. An efficient alternating optimization algorithm for the discretized problem is proposed by decoupling it into two subproblems and addressing them iteratively with successive convex approximation (SCA)-based algorithms. Subsequently, we focus on the completion time minimization problem, which is nonconvex and challenging to solve. By using the same path discretization approximation model to reformulate the problem, a similar alternating optimization algorithm is proposed. Furthermore, we study the Pareto-optimal solution that balances the tradeoff between the UAV energy and completion time. Simulation results are provided to corroborate this article and show that the proposed designs outperform the baseline schemes. Our results unveil the tradeoff between completion time and energy consumption of the UAV for the MEC system, and the proposed solution can provide performance close to the lower bound.
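A toy illustration of the successive convex approximation (SCA) template used by both algorithms: keep the convex part of the objective, linearize the nonconvex (concave) part at the current iterate, solve the convex surrogate, and repeat. The 1-D objective below is purely illustrative, not the paper's trajectory problem:

```python
import cvxpy as cp

x = cp.Variable()
xk = 0.5                                       # initial point (assumed)
for it in range(20):
    # f(x) = x^4 - 3x^2: convex term kept, concave term -3x^2 linearized at xk
    surrogate = x ** 4 + (-3 * xk ** 2 - 6 * xk * (x - xk))
    prob = cp.Problem(cp.Minimize(surrogate), [x >= -2, x <= 2])
    prob.solve()
    xk = float(x.value)                        # update the linearization point
print("SCA stationary point:", round(xk, 4))   # near sqrt(3/2) ~ 1.2247
```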