
Showing papers in "IEEE Internet of Things Journal in 2021"


Journal ArticleDOI
TL;DR: A federated learning (FL) system leveraging a reputation mechanism to assist home appliance manufacturers in training a machine learning model on customers’ data, so that manufacturers can predict customers’ future requirements and consumption behaviors.
Abstract: Home appliance manufacturers strive to obtain feedback from users to improve their products and services and to build a smart home system. To help manufacturers develop a smart home system, we design a federated learning (FL) system leveraging a reputation mechanism to assist home appliance manufacturers in training a machine learning model on customers’ data. Manufacturers can then predict customers’ future requirements and consumption behaviors. The workflow of the system includes two stages: in the first stage, customers train the initial model provided by the manufacturer using both the mobile phone and the mobile-edge computing (MEC) server. Customers collect data from various home appliances using their phones, then download and train the initial model with their local data. After deriving local models, customers sign their models and send them to the blockchain. In case customers or manufacturers are malicious, we use the blockchain to replace the centralized aggregator of the traditional FL system. Since records on the blockchain cannot be tampered with, the activities of malicious customers or manufacturers are traceable. In the second stage, manufacturers select customers or organizations as miners to calculate the averaged model from the models received from customers. At the end of the crowdsourcing task, one of the miners, selected as the temporary leader, uploads the model to the blockchain. To protect customers’ privacy and improve test accuracy, we enforce differential privacy (DP) on the extracted features and propose a new normalization technique. We experimentally demonstrate that our normalization technique outperforms batch normalization when features are under DP protection. In addition, to attract more customers to participate in the crowdsourcing FL task, we design an incentive mechanism to reward participants.

274 citations
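
As a rough illustration of the privacy step described above, the sketch below adds Laplace noise to extracted features and then rescales them. It is a minimal stand-in: the epsilon and sensitivity values are illustrative, and the min-max rescaling is only a placeholder, since the paper's actual normalization technique is not spelled out in the abstract.

```python
import numpy as np

def laplace_dp(features, epsilon=1.0, sensitivity=1.0):
    """Add Laplace noise to extracted features for epsilon-DP.

    epsilon and sensitivity are illustrative parameters; the paper's
    exact clipping/sensitivity analysis is not reproduced here.
    """
    scale = sensitivity / epsilon
    return features + np.random.laplace(0.0, scale, size=features.shape)

def per_feature_minmax(features, eps=1e-8):
    # Placeholder normalization: the paper's proposed technique is not
    # given in the abstract, so we rescale each feature to [0, 1].
    lo, hi = features.min(axis=0), features.max(axis=0)
    return (features - lo) / (hi - lo + eps)

x = np.random.randn(128, 16)          # a batch of extracted features
x_priv = per_feature_minmax(laplace_dp(x, epsilon=0.5))
```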


Journal ArticleDOI
TL;DR: A use case of fully autonomous driving is presented to show how 6G supports massive IoT, and breakthrough technologies in 6G, such as machine learning and blockchain, are introduced, with the motivations, applications, and open issues of these technologies for massive IoT summarized.
Abstract: Nowadays, many disruptive Internet-of-Things (IoT) applications emerge, such as augmented/virtual reality online games, autonomous driving, and smart everything, which are massive in number, data intensive, computation intensive, and delay sensitive. Due to the mismatch between the fifth generation (5G) and the requirements of such massive IoT-enabled applications, there is a need for technological advancements and evolutions for wireless communications and networking toward the sixth-generation (6G) networks. 6G is expected to deliver extended 5G capabilities at a very high level, such as Tbps data rate, sub-ms latency, cm-level localization, and so on, which will play a significant role in supporting massive IoT devices to operate seamlessly with highly diverse service requirements. Motivated by the aforementioned facts, in this article, we present a comprehensive survey on 6G-enabled massive IoT. First, we present the drivers and requirements by summarizing the emerging IoT-enabled applications and the corresponding requirements, along with the limitations of 5G. Second, visions of 6G are provided in terms of core technical requirements, use cases, and trends. Third, a new network architecture provided by 6G to enable massive IoT is introduced, i.e., space–air–ground–underwater/sea networks enhanced by edge computing. Fourth, some breakthrough technologies, such as machine learning and blockchain, in 6G are introduced, where the motivations, applications, and open issues of these technologies for massive IoT are summarized. Finally, a use case of fully autonomous driving is presented to show 6G supports massive IoT.

263 citations


Journal ArticleDOI
TL;DR: In this article, a survey of federated learning (FL) topics and research fields is presented, including core system models and designs, application areas, privacy and security, and resource management.
Abstract: Driven by privacy concerns and the visions of deep learning, the last four years have witnessed a paradigm shift in how machine learning (ML) is applied. An emerging model, called federated learning (FL), is rising above both centralized systems and on-site analysis as a newly fashioned design for ML implementation. It is a privacy-preserving, decentralized approach that keeps raw data on devices and involves local ML training, eliminating data communication overhead. A federation of the learned and shared models is then performed on a central server to aggregate and share the built knowledge among participants. This article starts by examining and comparing different ML-based deployment architectures, followed by an in-depth and in-breadth investigation of FL. Compared to the existing reviews in the field, this survey provides a new classification of FL topics and research fields based on a thorough analysis of the main technical challenges and current related work. In this context, we elaborate comprehensive taxonomies covering various challenging aspects, contributions, and trends in the literature, including core system models and designs, application areas, privacy and security, and resource management. Furthermore, we discuss important challenges and open research directions toward more robust FL systems.

252 citations
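
Since the survey centers on the federation step, i.e., aggregating locally trained models on a central server, a minimal FedAvg-style sketch may make it concrete. The layer shapes and client sample counts below are toy assumptions.

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Weighted average of client model parameters (FedAvg).

    client_weights: list of per-client parameter lists (one ndarray per layer).
    client_sizes:   number of local samples per client, used as weights.
    """
    total = float(sum(client_sizes))
    return [
        sum(w[layer] * (n / total) for w, n in zip(client_weights, client_sizes))
        for layer in range(len(client_weights[0]))
    ]

# two toy clients, each with a single 2x2 "layer"
clients = [[np.ones((2, 2))], [3 * np.ones((2, 2))]]
global_model = fed_avg(clients, client_sizes=[100, 300])  # -> 2.5 * ones
```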


Journal ArticleDOI
TL;DR: A new framework based on a novel feature selection metric named CorrAUC is proposed, together with a feature selection algorithm built on the wrapper technique that filters features accurately and selects effective features for the chosen ML algorithm using the area-under-the-curve (AUC) metric.
Abstract: Identifying anomalous and malicious traffic in the Internet-of-Things (IoT) network is essential for IoT security in order to monitor and block unwanted traffic flows. For this purpose, numerous machine-learning (ML) models have been presented by many researchers to block malicious traffic flows in the IoT network. However, due to inappropriate feature selection, several ML models are prone to misclassifying mostly malicious traffic flows. Nevertheless, a significant problem that still needs deeper study is how to select effective features for accurate malicious traffic detection in the IoT network. To address this problem, a new framework is proposed. First, a novel feature selection metric named CorrAUC is proposed; then, based on this metric, a new feature selection algorithm, also named CorrAUC, is developed and designed, which uses the wrapper technique to filter features accurately and select effective features for the chosen ML algorithm via the area-under-the-curve (AUC) metric. Then, we apply TOPSIS integrated with Shannon entropy on a bijective soft set to validate the selected features for malicious traffic identification in the IoT network. We evaluate our proposed approach using the Bot-IoT data set and four different ML algorithms. The experimental analysis shows that our proposed method is efficient and achieves >96% results on average.

244 citations
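
As a loose illustration of an AUC-driven wrapper selector, the sketch below greedily adds whichever feature most improves cross-validated AUC. It is a generic stand-in, not the paper's CorrAUC metric, which additionally folds in feature correlation; the classifier and synthetic data set are arbitrary choices.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def auc_wrapper_selection(X, y, k=5):
    """Greedy forward wrapper: add the feature that most improves AUC."""
    selected, remaining = [], list(range(X.shape[1]))
    clf = DecisionTreeClassifier(max_depth=5, random_state=0)
    while remaining and len(selected) < k:
        # score each candidate feature by cross-validated ROC AUC
        scores = {
            f: cross_val_score(clf, X[:, selected + [f]], y,
                               scoring="roc_auc", cv=3).mean()
            for f in remaining
        }
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected

X, y = make_classification(n_samples=400, n_features=12, random_state=0)
print(auc_wrapper_selection(X, y, k=4))
```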


Journal ArticleDOI
TL;DR: Several main issues in FLchain design are identified, including communication cost, resource allocation, incentive mechanism, security and privacy protection, and the applications of FLchain in popular MEC domains, such as edge data sharing, edge content caching and edge crowdsensing are investigated.
Abstract: Mobile-edge computing (MEC) has been envisioned as a promising paradigm to handle the massive volume of data generated from ubiquitous mobile devices for enabling intelligent services with the help of artificial intelligence (AI). Traditionally, AI techniques often require centralized data collection and training in a single entity, e.g., an MEC server, which is now becoming a weak point due to data privacy concerns and high overhead of raw data communications. In this context, federated learning (FL) has been proposed to provide collaborative data training solutions, by coordinating multiple mobile devices to train a shared AI model without directly exposing their underlying data, which enjoys considerable privacy enhancement. To improve the security and scalability of FL implementation, blockchain as a ledger technology is attractive for realizing decentralized FL training without the need for any central server. Particularly, the integration of FL and blockchain leads to a new paradigm, called FLchain , which potentially transforms intelligent MEC networks into decentralized, secure, and privacy-enhancing systems. This article presents an overview of the fundamental concepts and explores the opportunities of FLchain in MEC networks. We identify several main issues in FLchain design, including communication cost, resource allocation, incentive mechanism, security and privacy protection. The key solutions and the lessons learned along with the outlooks are also discussed. Then, we investigate the applications of FLchain in popular MEC domains, such as edge data sharing, edge content caching and edge crowdsensing. Finally, important research challenges and future directions are also highlighted.

238 citations


Journal ArticleDOI
TL;DR: A comprehensive survey of DTN is presented to explore the potential of DT and depict typical application scenarios, such as manufacturing, aviation, healthcare, 6G networks, intelligent transportation systems, and urban intelligence in smart cities.
Abstract: Digital twin network (DTN) is an emerging network that utilizes digital twin (DT) technology to create virtual twins of physical objects. DTN realizes co-evolution between physical and virtual spaces through DT modeling, communication, computing, and data processing technologies. In this article, we present a comprehensive survey of DTN to explore the potential of DT. First, we elaborate the key features and definitions of DTN. Next, the key technologies and technical challenges in DTN are discussed. Furthermore, we depict typical application scenarios, such as manufacturing, aviation, healthcare, 6G networks, intelligent transportation systems, and urban intelligence in smart cities. Finally, new trends and open research issues related to DTN are pointed out.

232 citations


Journal ArticleDOI
TL;DR: The major purpose of this work is to create a novel and secure cache decision system (CDS) in a wireless network operating over an SB, which will offer users a safer and more efficient environment for browsing the Internet and for sharing and managing large-scale data in the fog.
Abstract: This work proposes an innovative secure infrastructure that operates in a wireless-mobile 6G network for managing big data (BD) on smart buildings (SBs). Owing to the rapid growth of the telecommunication field, new challenges arise. Furthermore, a new type of wireless network infrastructure, the sixth generation (6G), provides all the benefits of its past versions and also improves on issues its predecessors had. In addition, technologies related to the telecommunications field, such as the Internet of Things, cloud computing (CC), and edge computing (EC), can operate through a 6G wireless network. Taking all this into account, we propose a scenario that combines the functions of the Internet of Things with CC, EC, and BD in order to achieve a smart and secure environment. The major purpose of this work is to create a novel and secure cache decision system (CDS) in a wireless network operating over an SB, which will offer users a safer and more efficient environment for browsing the Internet and for sharing and managing large-scale data in the fog. This CDS consists of two types of servers: one cloud server and one edge server. In support of our proposal, we study related cache scenario systems, which are listed, presented, and compared in this work.

229 citations


Journal ArticleDOI
TL;DR: This article investigates the multicast communication of a satellite and aerial-integrated network with rate-splitting multiple access (RSMA), where both the satellite and unmanned aerial vehicle (UAV) components are controlled by a network management center and operate in the same frequency band.
Abstract: To satisfy the explosive access demands of Internet-of-Things (IoT) devices, various kinds of multiple access techniques have received much attention. In this article, we investigate the multicast communication of a satellite and aerial-integrated network (SAIN) with rate-splitting multiple access (RSMA), where both satellite and unmanned aerial vehicle (UAV) components are controlled by a network management center and operate in the same frequency band. Considering a content delivery scenario, the UAV subnetwork adopts RSMA to support massive access of IoT devices (IoTDs) and achieve the desired performance in interference suppression, spectral efficiency, and hardware complexity. We first formulate an optimization problem to maximize the sum rate of the considered system subject to the signal-to-interference-plus-noise-ratio requirements of IoTDs and per-antenna power constraints at the UAV and satellite. To solve this nonconvex optimization problem, we exploit sequential convex approximation and the first-order Taylor expansion to convert the original optimization problem into a solvable one with a rank-one constraint, and then propose an iterative penalty-function-based algorithm to solve it. Finally, simulation results verify that the proposed method can effectively suppress the mutual interference and improve the system sum rate compared to the benchmark schemes.

218 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a comprehensive survey on AIoT to show how AI can empower the IoT to make it faster, smarter, greener, and safer, and highlight the challenges facing AIoT and some potential research opportunities.
Abstract: In the Internet-of-Things (IoT) era, billions of sensors and devices collect and process data from the environment, transmit them to cloud centers, and receive feedback via the Internet for connectivity and perception. However, transmitting massive amounts of heterogeneous data, perceiving complex environments from these data, and then making smart decisions in a timely manner are difficult. Artificial intelligence (AI), especially deep learning, is now a proven success in various areas, including computer vision, speech recognition, and natural language processing. AI introduced into the IoT heralds the era of AI of things (AIoT). This article presents a comprehensive survey on AIoT to show how AI can empower the IoT to make it faster, smarter, greener, and safer. Specifically, we briefly present the AIoT architecture in the context of cloud computing, fog computing, and edge computing. Then, we present progress in AI research for IoT from four perspectives: 1) perceiving; 2) learning; 3) reasoning; and 4) behaving. Next, we summarize some promising applications of AIoT that are likely to profoundly reshape our world. Finally, we highlight the challenges facing AIoT and some potential research opportunities.

216 citations


Journal ArticleDOI
TL;DR: A deep blockchain framework (DBF) designed to offer security-based distributed intrusion detection and privacy-based blockchain with smart contracts in IoT networks and is compared with peer privacy-preserving intrusion detection techniques, and the experimental outcomes reveal that DBF outperforms the other competing models.
Abstract: There has been significant research into incorporating both blockchain and intrusion detection to improve data privacy and detect existing and emerging cyberattacks, respectively. In these approaches, learning-based ensemble models can facilitate the identification of complex malicious events and concurrently ensure data privacy. Such models can also be used to provide additional security and privacy assurances during the live migration of virtual machines (VMs) in the cloud and to protect Internet-of-Things (IoT) networks, allowing the secure transfer of VMs between data centers or cloud providers in real time. This article proposes a deep blockchain framework (DBF) designed to offer security-based distributed intrusion detection and privacy-based blockchain with smart contracts in IoT networks. The intrusion detection method employs a bidirectional long short-term memory (BiLSTM) deep learning algorithm to deal with sequential network data and is assessed using the UNSW-NB15 and BoT-IoT data sets. The privacy-based blockchain and smart contract methods are developed using the Ethereum library to provide privacy to the distributed intrusion detection engines. The DBF framework is compared with peer privacy-preserving intrusion detection techniques, and the experimental outcomes reveal that DBF outperforms the other competing models. The framework has the potential to be used as a decision support system that can assist users and cloud providers in securely migrating their data in a timely and reliable manner.

199 citations


Journal ArticleDOI
TL;DR: This survey article sets out to answer the question of how to train distributed machine learning models for resource-constrained IoT devices, providing an overview of FL and a comprehensive survey of the problem statements and emerging challenges.
Abstract: Federated learning (FL) is a distributed machine learning strategy that generates a global model by learning from multiple decentralized edge clients. FL enables on-device training, keeping the client’s local data private, and further, updating the global model based on the local model updates. While FL methods offer several advantages, including scalability and data privacy, they assume that computational resources are available at each edge device/client. However, Internet-of-Things (IoT)-enabled devices, e.g., robots, drone swarms, and low-cost computing devices (e.g., Raspberry Pi), may have limited processing ability, low bandwidth and power, or limited storage capacity. In this survey paper, we set out to answer the question: how can distributed machine learning models be trained for resource-constrained IoT devices? To this end, we first explore the existing studies on FL and the assumptions made for distributed implementation using IoT devices, and examine their drawbacks. We then discuss the implementation challenges and issues that arise when applying FL to an IoT environment. We provide an overview of FL and a comprehensive survey of the problem statements and emerging challenges, particularly when applying FL within heterogeneous IoT environments. Finally, we point out future research directions for scientists and researchers interested in working at the intersection of FL and resource-constrained IoT environments.

Journal ArticleDOI
TL;DR: In this paper, the authors explore the emerging opportunities brought by 6G technologies in IoT networks and applications, by conducting a holistic survey on the convergence of 6G and IoT, and highlight interesting research challenges and point out potential directions to spur further research in this promising area.
Abstract: The sixth generation (6G) wireless communication networks are envisioned to revolutionize customer services and applications via the Internet of Things (IoT) towards a future of fully intelligent and autonomous systems. In this article, we explore the emerging opportunities brought by 6G technologies in IoT networks and applications, by conducting a holistic survey on the convergence of 6G and IoT. We first shed light on some of the most fundamental 6G technologies that are expected to empower future IoT networks, including edge intelligence, reconfigurable intelligent surfaces, space-air-ground-underwater communications, Terahertz communications, massive ultra-reliable and low-latency communications, and blockchain. Particularly, compared to the other related survey papers, we provide an in-depth discussion of the roles of 6G in a wide range of prospective IoT applications via five key domains, namely Healthcare Internet of Things, Vehicular Internet of Things and Autonomous Driving, Unmanned Aerial Vehicles, Satellite Internet of Things, and Industrial Internet of Things. Finally, we highlight interesting research challenges and point out potential directions to spur further research in this promising area.

Journal ArticleDOI
TL;DR: A contemporary survey on the latest advancement in blockchain for IoV is presented, highlighting the different application scenarios of IoV after carefully reviewing the recent literature and investigating several key challenges.
Abstract: Internet of Vehicles (IoV) is an emerging concept that is believed to help realize the vision of intelligent transportation systems (ITSs). IoV has become an important research area of impactful applications in recent years due to the rapid advancements in vehicular technologies, high throughput satellite communication, the Internet of Things, and cyber–physical systems. IoV enables the integration of smart vehicles with the Internet and system components attributing to their environments, such as public infrastructures, sensors, computing nodes, pedestrians, and other vehicles. By allowing the development of a common information exchange platform between vehicles and heterogeneous vehicular networks, this integration aims to create a better environment and public space for the people as well as to enhance safety for all road users. Being a participatory data exchange and storage, the underlying information exchange platform of IoV needs to be secure, transparent, and immutable in order to achieve the intended objectives of ITS. In this connection, the adoption of blockchain as a system platform for supporting the information exchange needs of IoV has been explored. Due to their decentralized and immutable nature, IoV applications enabled by blockchain are believed to have a number of desirable properties, such as decentralization, security, transparency, immutability, and automation. In this article, we present a contemporary survey on the latest advancement in blockchain for IoV. Particularly, we highlight the different application scenarios of IoV after carefully reviewing the recent literature. We also investigate several key challenges where blockchain is applied in IoV. Furthermore, we present the future opportunities and explore further research directions of IoV as a key enabler of ITS.

Journal ArticleDOI
TL;DR: The experimental results demonstrate that the federated-learning (FL)-based anomaly detection approach outperforms the classic/centralized machine learning (non-FL) versions in securing the privacy of user data and provides an optimal accuracy rate in attack detection.
Abstract: The Internet of Things (IoT) is made up of billions of physical devices connected to the Internet via networks that perform tasks independently with little human intervention. Such automation of mundane tasks requires a considerable amount of user data in digital format, which in turn makes IoT networks an open source of personally identifiable information for malicious attackers to steal, manipulate, and use for nefarious activities. Huge interest has developed over the past years in applying machine learning (ML)-assisted approaches to the IoT security space. However, many current works assume that big training data is widely available and transferable to the main server, because data is born at the edge and generated continuously by IoT devices. That is, classic ML works on the entire data set located on a central server, which makes it the least preferred option for domains with privacy concerns over user data. To address this issue, we propose a federated learning (FL)-based anomaly detection approach to proactively recognize intrusions in IoT networks using decentralized on-device data. Our approach uses federated training rounds on gated recurrent unit (GRU) models and keeps the data intact on local IoT devices by sharing only the learned weights with the central FL server. Also, the approach’s ensembler part aggregates the updates from multiple sources to optimize the global ML model’s accuracy. Our experimental results demonstrate that our approach outperforms classic/centralized machine learning (non-FL) versions in securing the privacy of user data and provides an optimal accuracy rate in attack detection.

Journal ArticleDOI
TL;DR: The IoT/IIoT critical infrastructure in Industry 4.0 is introduced, the blockchain and edge computing paradigms are briefly presented, and it is shown how the convergence of these two paradigms can enable secure and scalable critical infrastructures.
Abstract: Critical infrastructure systems are vital to underpin the functioning of a society and economy. Due to the ever-increasing number of Internet-connected Internet-of-Things (IoT)/Industrial IoT (IIoT) devices, and the high volume of data generated and collected, security and scalability are becoming burning concerns for critical infrastructures in Industry 4.0. Blockchain technology is essentially a distributed and secure ledger that records all transactions in a hierarchically expanding chain of blocks. Edge computing brings cloud capabilities closer to the computation tasks. The convergence of the blockchain and edge computing paradigms can overcome the existing security and scalability issues. In this article, we first introduce the IoT/IIoT critical infrastructure in Industry 4.0, and then we briefly present the blockchain and edge computing paradigms. After that, we show how the convergence of these two paradigms can enable secure and scalable critical infrastructures. Then, we provide a survey on the state of the art in security, privacy, and scalability of IoT/IIoT critical infrastructures. A list of potential research challenges and open issues in this area is also provided, which can serve as a useful resource to guide future research.

Journal ArticleDOI
TL;DR: Li et al. as mentioned in this paper designed an adversarial attack against DL-based network intrusion detection systems (NIDSs) in the Internet-of-Things environment, with only black-box accesses to the DL model in such NIDSs.
Abstract: Deep learning (DL) has gained popularity in network intrusion detection due to its strong capability of recognizing subtle differences between normal and malicious network activities. Although a variety of methods have been designed to leverage DL models for security protection, whether these systems are vulnerable to adversarial examples (AEs) is unknown. In this article, we design a novel adversarial attack against DL-based network intrusion detection systems (NIDSs) in the Internet-of-Things environment, with only black-box access to the DL model in such an NIDS. We introduce two techniques: 1) model extraction is adopted to replicate the black-box model with a small amount of training data and 2) a saliency map is then used to disclose the impact of each packet attribute on the detection results and to identify the most critical features. This enables us to efficiently generate AEs using conventional methods. With these techniques, we successfully compromise one state-of-the-art NIDS, Kitsune: the adversary only needs to modify less than 0.005% of bytes in the malicious packets to achieve an average 94.31% attack success rate.
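
To make the saliency step concrete: once a substitute model has been extracted, the attacker can rank packet attributes by how strongly they move the detector's score, then perturb only the top-ranked ones. The sketch below does this with finite differences against a toy linear scorer; the real attack computes gradients on the replicated DL model, and all names here are illustrative.

```python
import numpy as np

def saliency(f, x, eps=1e-3):
    """Finite-difference saliency: sensitivity of the detector score to
    each packet attribute (a white-box probe of the substitute model).
    """
    base = f(x)
    grads = np.zeros_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += eps
        grads[i] = (f(xp) - base) / eps
    return np.abs(grads)

# toy substitute "detector": a fixed linear score over 8 packet features
w = np.array([0.1, 2.0, -0.3, 0.0, 1.5, 0.05, -0.9, 0.2])
f = lambda x: float(w @ x)
x = np.random.rand(8)
print(np.argsort(saliency(f, x))[::-1][:3])   # most influential attributes
```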

Journal ArticleDOI
TL;DR: In this article, the authors proposed an attention mechanism-based convolutional neural network-long short-term memory (AMCNN-LSTM) model to accurately detect anomalies.
Abstract: Since edge device failures (i.e., anomalies) seriously affect the production of industrial products in the Industrial IoT (IIoT), detecting anomalies accurately and in a timely manner is becoming increasingly important. Furthermore, the data collected by edge devices contain massive amounts of users’ private data, which challenges current detection approaches as user privacy attracts more and more public concern. With this focus, this article proposes a new communication-efficient, on-device federated learning (FL)-based deep anomaly detection framework for sensing time-series data in the IIoT. Specifically, we first introduce an FL framework enabling decentralized edge devices to collaboratively train an anomaly detection model, which improves its generalization ability. Second, we propose an attention mechanism-based convolutional neural network-long short-term memory (AMCNN-LSTM) model to accurately detect anomalies. The AMCNN-LSTM model uses attention mechanism-based convolutional neural network units to capture important fine-grained features, thereby preventing memory loss and gradient dispersion problems, while retaining the advantages of the long short-term memory unit in predicting time-series data. Third, to adapt the proposed framework to the timeliness of industrial anomaly detection, we propose a gradient compression mechanism based on Top-$k$ selection to improve communication efficiency. Extensive experimental studies on four real-world data sets demonstrate that our framework detects anomalies accurately and in a timely fashion, and also reduces communication overhead by 50% compared to the FL framework without the gradient compression scheme.
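
A minimal sketch of Top-$k$ gradient sparsification as described above: only the largest-magnitude entries are transmitted each round. The 5% ratio is an arbitrary example, and the residual accumulation that practical schemes add is only noted in a comment.

```python
import numpy as np

def topk_sparsify(grad, k_ratio=0.01):
    """Keep only the k largest-magnitude gradient entries (Top-k
    compression); the rest are zeroed and, in practice, accumulated
    locally as residuals for later rounds.
    """
    flat = grad.ravel()
    k = max(1, int(k_ratio * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]   # indices of the top-k
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape), idx

g = np.random.randn(1000)
g_sparse, kept = topk_sparsify(g, k_ratio=0.05)   # transmit 5% of entries
```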

Journal ArticleDOI
TL;DR: To ensure client data privacy, a blockchain-based federated learning approach for device failure detection in IIoT is proposed, and a novel centroid distance weighted federated averaging algorithm taking into account the distance between positive class and negative class of each client data set is proposed.
Abstract: Device failure detection is one of the most essential problems in the Industrial Internet of Things (IIoT). However, in conventional IIoT device failure detection, client devices need to upload raw data to the central server for model training, which might lead to the disclosure of sensitive business data. Therefore, in this article, to ensure client data privacy, we propose a blockchain-based federated learning approach for device failure detection in IIoT. First, we present a platform architecture of blockchain-based federated learning systems for failure detection in IIoT, which enables verifiable integrity of client data. In the architecture, each client periodically creates a Merkle tree, in which each leaf node represents a client data record, and stores the tree root on a blockchain. Furthermore, to address the data heterogeneity issue in IIoT failure detection, we propose a novel centroid distance weighted federated averaging (CDW_FedAvg) algorithm that takes into account the distance between the positive class and the negative class of each client data set. In addition, to motivate clients to participate in federated learning, a smart contract-based incentive mechanism is designed based on the size and the centroid distance of the client data used in local model training. A prototype of the proposed architecture is implemented with our industry partner and evaluated in terms of feasibility, accuracy, and performance. The results show that the approach is feasible and has satisfactory accuracy and performance.
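
The verifiable-integrity piece rests on a standard Merkle construction: hash each record, then hash pairs upward until one root remains, and anchor only that root on-chain. Below is a minimal sketch; the record formatting and the odd-node duplication rule are assumptions, not the paper's exact scheme.

```python
import hashlib

def merkle_root(records):
    """Build a Merkle root over client data records; only this root is
    stored on-chain, letting auditors verify record integrity later.
    """
    h = lambda b: hashlib.sha256(b).digest()
    level = [h(r.encode()) for r in records]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0].hex()

print(merkle_root(["sensor:42,temp:71.3", "sensor:42,temp:71.9"]))
```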

Journal ArticleDOI
TL;DR: An ant colony optimization (ACO) approach is presented that adopts multiple objectives and uses transaction deletion to secure confidential and sensitive information; the designed approach achieves fewer side effects while maintaining low overall computational cost.
Abstract: The next revolution of the smart industry relies on the emergence of the Industrial Internet of Things (IoT) and 5G/6G technology. The properties of such sophisticated communication technologies will change our perspective on information and communication by enabling seamless connectivity and bringing entities, data, and “things” closer together. Terahertz-based 6G networks promise the best speed and reliability, but they will face new man-in-the-middle attacks. In such critical and highly sensitive environments, the security of data and the privacy of information remain a big challenge. Without privacy-preserving considerations, the configuration state may be attacked or modified, thus causing security problems and damage to data. In this article, motivated by the need to secure 6G IoT networks, an ant colony optimization (ACO) approach is presented that adopts multiple objectives and uses transaction deletion to secure confidential and sensitive information. Each ant in the population is represented as a set of possible deletion transactions for hiding sensitive information. We utilize a prelarge concept to help reduce multiple database scans during the evaluation progress. We also adopt external solutions to maintain the discovered Pareto solutions, thus improving the effectiveness of finding optimized solutions. Experiments are conducted comparing our methodology to state-of-the-art bioinspired particle swarm optimization (PSO) as well as a genetic algorithm (GA). Our results clearly show that the designed approach achieves fewer side effects while maintaining low overall computational cost (Chen et al., 2020).

Journal ArticleDOI
TL;DR: An electrocardiogram (ECG) heart rhythm classifier was built using machine learning to diagnose heart disease and detect heart problems; neural-network-based algorithms dealt with the ECG data better than traditional ML algorithms.
Abstract: Since the emergence of digital and smart healthcare, the world has hastened to apply various technologies in this field to promote better health operations and patients’ well-being, increase life expectancy, and reduce healthcare costs. One promising technology and game changer in this domain is the digital twin (DT). DT is expected to change the concept of digital healthcare and take this field to a level never seen before. A DT is a virtual replica of a physical asset that reflects its current status through real-time transformed data. This article proposes and implements an intelligent context-aware healthcare system using the DT framework, a beneficial contribution to digital healthcare and to improving healthcare operations. Accordingly, an electrocardiogram (ECG) heart rhythm classifier was built using machine learning to diagnose heart disease and detect heart problems. The implemented models successfully predicted a particular heart condition with high accuracy across different algorithms. The collected results show that integrating DT with healthcare would improve healthcare processes by bringing patients and healthcare professionals together in an intelligent, comprehensive, and scalable health ecosystem. Also, implementing an ECG classifier that detects heart conditions inspires applying ML and artificial intelligence to different human body metrics for continuous monitoring and abnormality detection. Finally, neural-network-based algorithms deal with ECG data better than traditional ML algorithms.
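
As a toy illustration of such a rhythm classifier, the sketch below trains a small neural network on synthetic two-class "beats". Real work would use a clinical data set such as MIT-BIH (an assumption; the abstract names none), and the window length and architecture here are arbitrary.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for ECG beats: 180-sample windows, two rhythm classes.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 180)
normal = np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal((500, 180))
arrhythm = np.sin(2 * np.pi * 2.8 * t) + 0.1 * rng.standard_normal((500, 180))
X = np.vstack([normal, arrhythm])
y = np.array([0] * 500 + [1] * 500)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
clf.fit(Xtr, ytr)
print(f"test accuracy: {clf.score(Xte, yte):.3f}")
```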

Journal ArticleDOI
TL;DR: By resolving the public’s privacy concerns, the proposed BeepTrace solution can provide a timely framework for authorities, companies, software developers, and researchers to rapidly develop and deploy effective digital contact tracing applications and help conquer the COVID-19 pandemic sooner.
Abstract: The outbreak of the coronavirus disease 2019 (COVID-19) pandemic has exposed an urgent need for effective contact tracing solutions through mobile phone applications to prevent the infection from spreading further. However, due to the nature of contact tracing, public concern over privacy issues has been a bottleneck for existing solutions, significantly affecting the uptake of contact tracing applications across the globe. In this article, we present a blockchain-enabled privacy-preserving contact tracing scheme, BeepTrace, in which we propose to adopt blockchain to bridge the user/patient and the authorized solvers so as to desensitize the user ID and location information. Compared with recently proposed contact tracing solutions, our approach shows higher security and privacy, with the additional advantages of being battery friendly and globally accessible. Results show viability in terms of the required resources from both server and mobile phone perspectives. By resolving the public’s privacy concerns, the proposed BeepTrace solution can provide a timely framework for authorities, companies, software developers, and researchers to rapidly develop and deploy effective digital contact tracing applications and conquer the COVID-19 pandemic sooner. Meanwhile, the open initiative of BeepTrace allows worldwide collaboration and the integration of existing tracing and positioning solutions with the help of blockchain technology.

Journal ArticleDOI
TL;DR: A novel polytope-based method from the class of direct search methods (DSMs), the Nelder–Mead simplex (NMS), is used to solve the optimization problem owing to its computational efficiency; it yields better convergence performance than the traditional gradient-descent optimization algorithm, and lower computation time than exhaustive search with equivalent performance for the blocklength variable.
Abstract: Upcoming fifth-generation (5G) networks need to support novel ultrareliable and low-latency (URLLC) traffic that utilizes short packets. This requires a paradigm shift, as traditional communication systems are designed to transmit only long data packets based on Shannon’s capacity formula, which poses a challenge for system designers. To address this challenge, this article relies on an unmanned aerial vehicle (UAV) and a reconfigurable intelligent surface (RIS) to deliver short URLLC instruction packets between ground Internet-of-Things (IoT) devices. In this context, we perform passive beamforming of the RIS antenna elements, as well as nonlinear and nonconvex optimization, to minimize the total decoding error rate and find the UAV’s optimal position and blocklength. A novel polytope-based method from the class of direct search methods (DSMs), the Nelder–Mead simplex (NMS), is used to solve the optimization problem, chosen for its computational efficiency in terms of the smaller number of iterations required to evaluate the objective function. The proposed approach yields better convergence performance than the traditional gradient-descent optimization algorithm, as well as lower computation time than exhaustive search with equivalent performance for the blocklength variable. Moreover, the proposed approach allows ultrahigh reliability, which can be attained by increasing the number of antenna elements in the RIS as well as the allocated blocklengths. Simulations demonstrate the RIS’s performance gain and conclusively show that the UAV’s position is crucial for achieving ultrahigh reliability in short packet transmission.
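
SciPy exposes the Nelder–Mead simplex directly, so a joint search over UAV position and blocklength can be sketched in a few lines. The objective below is a made-up decoding-error proxy standing in for the paper's Q-function-based expression; only the optimizer call reflects the method named above.

```python
import numpy as np
from scipy.optimize import minimize

def decoding_error(v):
    """Toy stand-in objective: an error proxy that shrinks with the
    blocklength m and grows with distance from a target IoT device.
    """
    x, y, m = v                       # UAV 2-D coordinates and blocklength
    distance = np.hypot(x - 5.0, y - 3.0)
    return np.exp(-m / 150.0) + 0.01 * distance**2

res = minimize(decoding_error, x0=[0.0, 0.0, 100.0], method="Nelder-Mead",
               options={"xatol": 1e-4, "fatol": 1e-6})
print(res.x, res.fun)                 # near-optimal position and blocklength
```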

Journal ArticleDOI
TL;DR: In this article, a secure data sharing scheme in the blockchain-enabled mobile edge computing system using an asynchronous learning approach is presented, and an adaptive privacy-preserving mechanism according to available system resources and privacy demands of users is presented.
Abstract: Mobile-edge computing (MEC) plays a significant role in enabling diverse service applications by implementing efficient data sharing. However, the unique characteristics of MEC also bring data privacy and security problems, which impede the development of MEC. Blockchain is viewed as a promising technology to guarantee the security and traceability of data sharing. Nonetheless, integrating blockchain into an MEC system is quite challenging because of the dynamic characteristics of channel conditions and network loads. To this end, we propose a secure data sharing scheme in the blockchain-enabled MEC system using an asynchronous learning approach in this article. First, a blockchain-enabled secure data sharing framework in the MEC system is presented. Then, we present an adaptive privacy-preserving mechanism according to available system resources and the privacy demands of users. Next, an optimization problem of secure data sharing is formulated in the blockchain-enabled MEC system with the aim of maximizing system performance with respect to decreased energy consumption of the MEC system and increased throughput of the blockchain system. In particular, an asynchronous learning approach is employed to solve the formulated problem. The numerical results demonstrate the superiority of our proposed secure data sharing scheme over some popular benchmark algorithms in terms of average throughput, average energy consumption, and reward.

Journal ArticleDOI
TL;DR: This review analyzes security and privacy features, consisting of data protection, network architecture, Quality of Service (QoS), app development, and continuous health monitoring, that face difficulties in many IoT-based healthcare architectures.
Abstract: The Internet of Things (IoT) is a methodology or system that enables real-world things to interact and communicate with each other with the assistance of networking technologies. This article surveys advances in IoT-based healthcare methods and reviews the state-of-the-art technologies in detail. Moreover, this review classifies existing IoT-based healthcare networks and presents a summary of each network type. IoT healthcare protocols are analyzed in this context, with a broad discussion provided. It also offers a comprehensive survey of IoT healthcare applications and services. Extensive insights into IoT healthcare security, its requirements, challenges, and privacy issues are presented. In this review, we analyze security and privacy features, consisting of data protection, network architecture, Quality of Service (QoS), app development, and continuous health monitoring, that face difficulties in many IoT-based healthcare architectures. To mitigate the security problems, an IoT-based security architectural model is proposed in this review. Furthermore, this review discloses the market opportunities that will enhance IoT healthcare market development. To conduct the survey, we searched established journal and conference databases using specific keywords to find scholarly works, applied a filtering mechanism to collect only papers relevant to our research, and examined the selected papers carefully to understand their contributions and research focus. Eventually, the reviewed papers were analyzed to identify existing research gaps and untouched areas of research, and to discover possible features for sustainable IoT healthcare development.

Journal ArticleDOI
TL;DR: A blockchain-empowered federated learning scheme to strengthen communication security and data privacy protection in DITEN and an asynchronous aggregation scheme to improve the efficiency and use digital twin empowered reinforcement learning to schedule relaying users and allocate spectrum resources.
Abstract: Emerging technologies, such as mobile-edge computing (MEC) and next-generation communications are crucial for enabling rapid development and deployment of the Internet of Things (IoT). With the increasing scale of IoT networks, how to optimize the network and allocate the limited resources to provide high-quality services remains a major concern. The existing work in this direction mainly relies on models that are of less practical value for resource-limited IoT networks, and can hardly simulate the dynamic systems in real time. In this article, we integrate digital twins with edge networks and propose the digital twin edge networks (DITENs) to fill the gap between physical edge networks and digital systems. Then, we propose a blockchain-empowered federated learning scheme to strengthen communication security and data privacy protection in DITEN. Furthermore, to improve the efficiency of the integrated scheme, we propose an asynchronous aggregation scheme and use digital twin empowered reinforcement learning to schedule relaying users and allocate spectrum resources. Theoretical analysis and numerical results confirm that the proposed scheme can considerably enhance both communication efficiency and data security for IoT applications.

Journal ArticleDOI
TL;DR: This article investigates the unmanned aerial vehicle (UAV)-assisted wireless powered Internet-of-Things system, where a UAV takes off from a data center, flies to each of the ground sensor nodes (SNs) in order to transfer energy and collect data from the SNs, and then returns to the data center.
Abstract: This article investigates the unmanned aerial vehicle (UAV)-assisted wireless powered Internet-of-Things system, where a UAV takes off from a data center, flies to each of the ground sensor nodes (SNs) in order to transfer energy to and collect data from the SNs, and then returns to the data center. For such a system, an optimization problem is formulated to minimize the average Age of Information (AoI) of the data collected from all ground SNs. Since the average AoI depends on the UAV’s trajectory and the time required for energy harvesting (EH) and data collection at each SN, these factors need to be optimized jointly. Moreover, instead of the traditional linear EH model, we employ a nonlinear model because the behavior of EH circuits is nonlinear by nature. To solve this nonconvex problem, we propose to decompose it into two subproblems: a joint energy transfer and data collection time allocation problem and a UAV trajectory planning problem. For the first subproblem, we prove that it is convex and give an optimal solution using the Karush–Kuhn–Tucker (KKT) conditions. This solution is used as the input for the second subproblem, which we solve by designing dynamic programming (DP) and ant colony (AC) heuristic algorithms. The simulation results show that the DP-based algorithm obtains the minimal average AoI of the system, and the AC-based heuristic finds solutions with near-optimal average AoI. The results also reveal that the average AoI increases as the flying altitude of the UAV increases, and grows linearly with the size of the collected data at each ground SN.

Journal ArticleDOI
TL;DR: In this paper, a hybrid metaheuristic algorithm named genetic simulated annealing-based particle swarm optimization (GSP) is proposed to minimize the total energy consumed by mobile devices and edge servers by jointly optimizing the offloading ratio of tasks, CPU speeds of mobile devices, allocated bandwidth of available channels, and transmission power of each mobile device in each time slot.
Abstract: Smart mobile devices (SMDs) can meet users’ high expectations by executing computation-intensive applications, but they have only limited resources, including CPU, memory, battery power, and wireless medium. To tackle this limitation, partial computation offloading can be used as a promising method to schedule some tasks of applications from resource-limited SMDs to high-performance edge servers. However, it brings communication overhead issues caused by limited bandwidth and inevitably increases the latency of tasks offloaded to edge servers. Therefore, it is highly challenging to balance the high resource consumption in SMDs against the high communication cost in order to provide energy-efficient and low-latency services to users. This work proposes a partial computation offloading method to minimize the total energy consumed by SMDs and edge servers by jointly optimizing the offloading ratio of tasks, CPU speeds of SMDs, allocated bandwidth of available channels, and transmission power of each SMD in each time slot. It jointly considers the execution time of tasks performed in SMDs and edge servers, and the transmission time of data. It also jointly considers latency limits, CPU speeds, transmission power limits, available energy of SMDs, and the maximum number of CPU cycles and memories in edge servers. Considering these factors, a nonlinear constrained optimization problem is formulated and solved by a novel hybrid metaheuristic algorithm named genetic simulated annealing-based particle swarm optimization (GSP) to produce a close-to-optimal solution. GSP achieves joint optimization of computation offloading between a cloud data center and the edge, and resource allocation in the data center. Experimental results based on real-life data prove that it achieves lower energy consumption in less convergence time than its three typical peers.
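
To give a flavor of such a hybrid, the sketch below runs a plain PSO loop but accepts worse personal bests with a simulated-annealing probability. It is only a schematic of the general GA/SA/PSO hybrid family; the published GSP's genetic operators, its encoding of offloading ratios, CPU speeds, bandwidth, and power, and its constraint handling are all omitted.

```python
import numpy as np

def gsp_like(f, dim, n=30, iters=200, T0=1.0):
    """PSO loop with a simulated-annealing acceptance test on personal
    bests (a schematic hybrid, not the paper's full GSP algorithm).
    """
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()].copy()
    for t in range(iters):
        T = T0 * (1 - t / iters) + 1e-9            # annealing temperature
        r1, r2 = rng.random((2, n, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        worse = fx - pval
        # accept improvements always, worse moves with SA probability
        accept = (worse < 0) | (rng.random(n) < np.exp(-np.maximum(worse, 0) / T))
        pbest[accept], pval[accept] = x[accept], fx[accept]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

sol, val = gsp_like(lambda z: np.sum(z**2), dim=4)   # toy energy objective
```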

Journal ArticleDOI
TL;DR: An energy-efficient dynamic task offloading algorithm is developed by choosing the optimal computing place in an online way, either on the IoT device, the MEC server, or the MCC server, with the goal of jointly minimizing energy consumption and task response time.
Abstract: With the proliferation of compute-intensive and delay-sensitive mobile applications, large amounts of computational resources with stringent latency requirements are required on Internet-of-Things (IoT) devices. One promising solution is to offload complex computing tasks from IoT devices either to mobile-edge computing (MEC) or mobile cloud computing (MCC) servers. MEC servers are much closer to IoT devices and thus have lower latency, while MCC servers can provide flexible and scalable computing capability to support complicated applications. To address the tradeoff between limited computing capacity and high latency, and meanwhile ensure data integrity during the offloading process, we consider a blockchain scenario where edge computing and cloud computing can collaborate toward secure task offloading. We further propose a blockchain-enabled IoT-Edge-Cloud computing architecture that benefits from both MCC and MEC, where MEC servers offer lower latency computing services, while MCC servers provide stronger computation power. Moreover, we develop an energy-efficient dynamic task offloading (EEDTO) algorithm that chooses the optimal computing place in an online way, either the IoT device, the MEC server, or the MCC server, with the goal of jointly minimizing the energy consumption and task response time. The Lyapunov optimization technique is applied to control the computation and communication costs incurred by different types of applications and the dynamic changes of wireless environments. During the optimization, the best computing location for each task is chosen adaptively, without requiring future system information as prior knowledge. Compared with previous offloading schemes with/without MEC and MCC cooperation, EEDTO achieves energy-efficient offloading decisions with relatively lower computational complexity.
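
The per-slot decision in such Lyapunov-based schemes reduces to a drift-plus-penalty comparison across the three computing places. The sketch below shows that rule with made-up energy/delay constants and a single virtual queue; the paper's actual queue definitions and cost models are not reproduced.

```python
# Per-slot decision rule in the drift-plus-penalty style of Lyapunov
# optimization: weigh energy cost against a virtual delay queue.
# All constants below are illustrative assumptions, not measured values.
OPTIONS = {            # (energy in J, response time in s) per task
    "local": (2.0, 0.8),
    "mec":   (0.6, 0.4),
    "mcc":   (0.3, 1.2),
}
D_MAX = 0.7            # per-slot delay budget (assumption)
V = 5.0                # energy/delay tradeoff knob
Q = 0.0                # virtual queue tracking delay-budget violations

for slot in range(5):
    # drift-plus-penalty: minimize V*energy + Q*delay over the choices
    choice = min(OPTIONS, key=lambda c: V * OPTIONS[c][0] + Q * OPTIONS[c][1])
    energy, delay = OPTIONS[choice]
    Q = max(Q + delay - D_MAX, 0.0)      # queue grows when budget exceeded
    print(f"slot {slot}: offload -> {choice:5s}  Q={Q:.2f}")
```

Running it shows the rule drifting from the cheapest option (MCC) toward the faster MEC server as the virtual queue accumulates delay debt, which is the adaptive behavior the abstract describes.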

Journal ArticleDOI
TL;DR: A many-objective intelligent algorithm with a sine function is presented to implement the model, based on the observation that the variation tendency of the diversity strategy in the population is similar to the sine function; it demonstrates excellent scheduling efficiency and hence enhances security.
Abstract: The Internet of Things (IoT) is a huge network that establishes ubiquitous connections between smart devices and objects. The flourishing of IoT leads to an unprecedented data explosion: traditional data storage and processing techniques suffer from low efficiency, and if the data are used maliciously, further security losses may result. Multicloud is a high-performance secure computing platform that combines multiple cloud providers for data processing, and the distributed multicloud platform ensures the security of data to some extent. Based on multicloud and task scheduling in IoT, this article constructs a many-objective distributed scheduling model with six objectives: total time, cost, cloud throughput, energy consumption, resource utilization, and load balancing. Furthermore, this article presents a many-objective intelligent algorithm with a sine function to implement the model, based on the observation that the variation tendency of the diversity strategy in the population is similar to the sine function. The experimental results demonstrate excellent scheduling efficiency and hence enhanced security. This work provides a new idea for addressing the difficult problem of data processing in IoT.

Journal ArticleDOI
TL;DR: A UAV-supported clustered nonorthogonal multiple access (C-NOMA) system that provides services to IoT terminals as an aerial BS based on the wireless-powered communication (WPC) technique is proposed, along with a synergetic scheme for UAV trajectory planning and subslot allocation.
Abstract: The sixth-generation (6G) communication requires supporting massive Internet of Things (IoT) devices and extremely differentiated IoT applications for the air–space–ground integrated network. Relying on its aerial superiority, an unmanned aerial vehicle (UAV) is capable of acting as an aerial base station (BS) and supporting IoT deployment in remote and disaster areas. A UAV-supported clustered nonorthogonal multiple access (C-NOMA) system is put forward in this article. Specifically, the UAV provides services to IoT terminals as an aerial BS based on the wireless-powered communication (WPC) technique. For this system, we propose a synergetic scheme for UAV trajectory planning and subslot allocation. Our goal is to maximize the uplink average achievable sum rate of IoT terminals by synergistically planning the UAV trajectory and subslot duration, while guaranteeing the uplink achievable sum rate and the UAV mobility constraints. As the formulated problem suffers from nonconvexity and complication, an efficient iterative algorithm is proposed to address it. First, for a fixed UAV trajectory, all the terminals are clustered and a subslot allocation algorithm based on the Lagrange multiplier and bisection method is proposed. Then, for a fixed clustering state and subslot duration, we optimize the UAV trajectory. Finally, we solve these two subproblems alternately until the objective function converges. The effectiveness of the proposed scheme in the UAV-supported C-NOMA system is verified by the numerical results.
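
The subslot-allocation step above rests on a standard primitive: bisecting a monotone function of the Lagrange multiplier until a resource constraint is met. The sketch below shows that primitive on a toy allocation rule; the duration formula and coefficients are made-up stand-ins, not the paper's derived expressions.

```python
import numpy as np

def bisect_multiplier(phi, lo=1e-6, hi=1e3, tol=1e-9):
    """Bisection on a monotone function of the Lagrange multiplier.
    phi(lmbda) should be decreasing with a sign change on [lo, hi].
    """
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if phi(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy constraint: subslot durations tau_k(lmbda) = 1/sqrt(lmbda * c_k)
# must sum to the slot length T (the coefficients c_k are invented here).
c = np.array([1.0, 2.0, 4.0])
T = 2.0
lam = bisect_multiplier(lambda l: np.sum(1.0 / np.sqrt(l * c)) - T)
tau = 1.0 / np.sqrt(lam * c)
print(lam, tau, tau.sum())     # the durations tau sum to T
```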