
Showing papers on "Heterogeneous network" published in 2020


15 Mar 2020
TL;DR: This work introduces FedProx, a framework for tackling heterogeneity in federated networks, and provides convergence guarantees when learning over data from non-identical distributions (statistical heterogeneity) while adhering to device-level systems constraints by allowing each participating device to perform a variable amount of work (systems heterogeneity).
Abstract: Federated Learning is a distributed learning paradigm with two key challenges that differentiate it from traditional distributed optimization: (1) significant variability in terms of the systems characteristics on each device in the network (systems heterogeneity), and (2) non-identically distributed data across the network (statistical heterogeneity). In this work, we introduce a framework, FedProx, to tackle heterogeneity in federated networks. FedProx can be viewed as a generalization and re-parametrization of FedAvg, the current state-of-the-art method for federated learning. While this re-parameterization makes only minor modifications to the method itself, these modifications have important ramifications both in theory and in practice. Theoretically, we provide convergence guarantees for our framework when learning over data from non-identical distributions (statistical heterogeneity), and while adhering to device-level systems constraints by allowing each participating device to perform a variable amount of work (systems heterogeneity). Practically, we demonstrate that FedProx allows for more robust convergence than FedAvg across a suite of realistic federated datasets. In particular, in highly heterogeneous settings, FedProx demonstrates significantly more stable and accurate convergence behavior relative to FedAvg---improving absolute test accuracy by 22% on average.

1,490 citations
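The key mechanism in FedProx is a proximal term added to each device's local objective, penalizing deviation from the current global model w^t. Below is a minimal Python/NumPy sketch of that local update and a FedAvg-style aggregation step; the function names, step size, number of local steps, and the toy quadratic device losses are illustrative assumptions, not the authors' code.

```python
import numpy as np

def fedprox_local_update(w_global, grad_fk, mu=0.1, lr=0.01, local_steps=10):
    """Approximately minimize h_k(w) = F_k(w) + (mu/2) * ||w - w_global||^2.

    grad_fk: callable returning the gradient of the device's local loss F_k at w.
    mu:      proximal coefficient; mu = 0 recovers a FedAvg-style local solver.
    """
    w = w_global.copy()
    for _ in range(local_steps):
        # Gradient of the proximal objective: grad F_k(w) + mu * (w - w_global)
        g = grad_fk(w) + mu * (w - w_global)
        w -= lr * g
    return w

def server_aggregate(updates):
    """FedAvg-style aggregation: average the (possibly inexact) local solutions."""
    return np.mean(updates, axis=0)

# Toy usage: each device k holds a quadratic loss F_k(w) = 0.5 * ||w - c_k||^2.
centers = [np.array([1.0, 0.0]), np.array([0.0, 3.0]), np.array([2.0, 2.0])]
w_t = np.zeros(2)
for _ in range(20):
    local = [fedprox_local_update(w_t, lambda w, c=c: w - c) for c in centers]
    w_t = server_aggregate(local)
print(w_t)  # drifts toward the mean of the device optima
```

Because the proximal term keeps each inexact local solution close to the global model, heterogeneous devices can safely perform different amounts of local work without pulling the aggregate too far off course, which is the intuition behind the paper's convergence guarantees.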


Journal ArticleDOI
TL;DR: This article provides a comprehensive review of emerging and enabling technologies related to the 5G system that enables IoT, such as 5G new radio, multiple-input multiple-output antennas with beamforming, mm-wave communication technology, heterogeneous networks (HetNets), and the role of augmented reality (AR) in IoT, all of which are discussed in detail.
Abstract: Recently, wireless technologies have been growing actively all around the world. In the context of wireless technology, fifth-generation (5G) technology has become one of the most challenging and interesting topics in wireless research. This article provides an overview of the Internet of Things (IoT) in 5G wireless systems. IoT in the 5G system will be a game changer for future generations. It will open the door to new wireless architectures and smart services. The current LTE (4G) cellular network will not be sufficient or efficient enough to meet the demands of multi-device connectivity, higher data rates, more bandwidth, low-latency quality of service (QoS), and low interference. To address these challenges, we consider 5G as the most promising technology. We provide a detailed overview of the challenges and the vision of various communication industries for 5G IoT systems. The different layers in 5G IoT systems are discussed in detail. This article provides a comprehensive review of emerging and enabling technologies related to the 5G system that enables IoT. We consider the technology drivers for 5G wireless technology, such as 5G new radio (NR), multiple-input multiple-output (MIMO) antennas with beamforming, mm-wave communication technology, heterogeneous networks (HetNets), and the role of augmented reality (AR) in IoT, all of which are discussed in detail. We also provide a review of low-power wide-area networks (LPWANs), security challenges, and their control measures in the 5G IoT scenario. This article introduces the role of AR in the 5G IoT scenario. It also discusses the research gaps and future directions. The focus is also on application areas of IoT in 5G systems. We, therefore, outline some of the important research directions in 5G IoT.

896 citations


Journal ArticleDOI
TL;DR: This paper presents the IoT technology from a bird's eye view covering its statistical/architectural trends, use cases, challenges and future prospects, and discusses challenges in the implementation of 5G-IoT due to high data-rates requiring both cloud-based platforms and IoT devices based edge computing.
Abstract: The Internet of Things (IoT)-centric concepts like augmented reality, high-resolution video streaming, self-driven cars, smart environment, e-health care, etc. have a ubiquitous presence now. These applications require higher data-rates, large bandwidth, increased capacity, low latency and high throughput. In light of these emerging concepts, IoT has revolutionized the world by providing seamless connectivity between heterogeneous networks (HetNets). The eventual aim of IoT is to introduce plug-and-play technology that provides the end user with ease of operation, remote access control, and configurability. This paper presents the IoT technology from a bird’s eye view covering its statistical/architectural trends, use cases, challenges and future prospects. The paper also presents a detailed and extensive overview of the emerging 5G-IoT scenario. Fifth Generation (5G) cellular networks provide key enabling technologies for ubiquitous deployment of the IoT technology. These include carrier aggregation, multiple-input multiple-output (MIMO), massive-MIMO (M-MIMO), coordinated multipoint processing (CoMP), device-to-device (D2D) communications, centralized radio access network (CRAN), software-defined wireless sensor networking (SD-WSN), network function virtualization (NFV) and cognitive radios (CRs). This paper presents an exhaustive review of these key enabling technologies and also discusses the new emerging use cases of 5G-IoT driven by the advances in artificial intelligence, machine and deep learning, ongoing 5G initiatives, quality of service (QoS) requirements in 5G and its standardization issues. Finally, the paper discusses challenges in the implementation of 5G-IoT due to high data-rates requiring both cloud-based platforms and IoT-device-based edge computing.

591 citations


Journal ArticleDOI
TL;DR: In this article, the authors review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning and investigate their employment in the compelling applications of wireless networks, including heterogeneous networks, cognitive radios (CR), Internet of Things (IoT), machine to machine networks (M2M), and so on.
Abstract: Future wireless networks have a substantial potential in terms of supporting a broad range of complex compelling applications both in military and civilian fields, where the users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in the compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), Internet of Things (IoT), machine to machine networks (M2M), and so on. This article aims to assist readers in understanding the motivation and methodology of the various ML algorithms, so that they can be invoked for hitherto unexplored services as well as scenarios of future wireless networks.

413 citations


Journal ArticleDOI
TL;DR: A state-of-the-art survey on the integration of blockchain with 5G networks and beyond, including discussions on the potential of blockchain for enabling key 5G technologies such as cloud/edge computing, Software Defined Networks, Network Function Virtualization, Network Slicing, and D2D communications.

244 citations


Proceedings ArticleDOI
Danqing Wang, Pengfei Liu, Yining Zheng, Xipeng Qiu, Xuanjing Huang
26 Apr 2020
TL;DR: This paper presents a heterogeneous graph-based neural network for extractive summarization (HETERSUMGRAPH), which contains semantic nodes of different granularity levels apart from sentences; these nodes act as intermediaries between sentences and enrich the cross-sentence relations.
Abstract: As a crucial step in extractive document summarization, learning cross-sentence relations has been explored by a plethora of approaches. An intuitive way is to put them in the graph-based neural network, which has a more complex structure for capturing inter-sentence relationships. In this paper, we present a heterogeneous graph-based neural network for extractive summarization (HETERSUMGRAPH), which contains semantic nodes of different granularity levels apart from sentences. These additional nodes act as the intermediary between sentences and enrich the cross-sentence relations. Besides, our graph structure is flexible in natural extension from a single-document setting to multi-document via introducing document nodes. To our knowledge, we are the first one to introduce different types of nodes into graph-based neural networks for extractive document summarization and perform a comprehensive qualitative analysis to investigate their benefits. The code will be released on Github.

218 citations
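To make the word–sentence heterogeneous graph concrete, here is a hedged sketch that builds such a bipartite structure with networkx, using TF-IDF values as word–sentence edge weights. The node naming, the TF-IDF weighting, and the library choice are illustrative assumptions for intuition only, not the paper's exact construction or its GNN layers.

```python
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer

def build_hetero_sum_graph(sentences):
    """Bipartite heterogeneous graph: word nodes <-> sentence nodes.

    Word nodes act as intermediaries, so two sentences become connected
    (at two hops) whenever they share salient vocabulary.
    """
    vec = TfidfVectorizer(lowercase=True, stop_words="english")
    tfidf = vec.fit_transform(sentences)          # shape: (n_sentences, n_words)
    words = vec.get_feature_names_out()

    g = nx.Graph()
    g.add_nodes_from((f"s{i}" for i in range(len(sentences))), kind="sentence")
    g.add_nodes_from((f"w:{w}" for w in words), kind="word")

    rows, cols = tfidf.nonzero()
    for i, j in zip(rows, cols):
        g.add_edge(f"s{i}", f"w:{words[j]}", weight=float(tfidf[i, j]))
    return g

doc = ["Heterogeneous graphs connect sentences through shared words.",
       "Word nodes enrich cross-sentence relations.",
       "Summarization selects the most central sentences."]
g = build_hetero_sum_graph(doc)
print(g.number_of_nodes(), g.number_of_edges())
```

The same pattern extends to the multi-document setting described above by adding a third node type for documents and linking each sentence node to its document node.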


Proceedings Article
30 Apr 2020
TL;DR: In this article, the authors propose q-Fair Federated Learning (q-FFL), a novel optimization objective inspired by fair resource allocation in wireless networks that encourages a more fair (i.e., more uniform) accuracy distribution across devices in federated networks.
Abstract: Federated learning involves training statistical models in massive, heterogeneous networks. Naively minimizing an aggregate loss function in such a network may disproportionately advantage or disadvantage some of the devices. In this work, we propose q-Fair Federated Learning (q-FFL), a novel optimization objective inspired by fair resource allocation in wireless networks that encourages a more fair (i.e., more uniform) accuracy distribution across devices in federated networks. To solve q-FFL, we devise a communication-efficient method, q-FedAvg, that is suited to federated networks. We validate both the effectiveness of q-FFL and the efficiency of q-FedAvg on a suite of federated datasets with both convex and non-convex models, and show that q-FFL (along with q-FedAvg) outperforms existing baselines in terms of the resulting fairness, flexibility, and efficiency.

202 citations
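The q-FFL idea is to reweight device losses so that devices with a higher current loss exert more influence on the update, with the aggregate objective f_q(w) = Σ_k (p_k/(q+1)) F_k(w)^(q+1). A small NumPy sketch of this objective and its gradient follows; the quadratic per-device losses, step size, and plain gradient descent loop are illustrative assumptions standing in for the communication-efficient q-FedAvg solver.

```python
import numpy as np

def qffl_objective(losses, p, q):
    """f_q(w) = sum_k p_k / (q+1) * F_k(w)^(q+1); q = 0 recovers the plain average."""
    losses = np.asarray(losses, dtype=float)
    return np.sum(p * losses ** (q + 1) / (q + 1))

def qffl_gradient(losses, grads, p, q):
    """Chain rule: each device gradient is scaled by p_k * F_k(w)^q,
    so devices with larger current loss contribute more to the update."""
    scale = p * np.asarray(losses, dtype=float) ** q
    return sum(s * g for s, g in zip(scale, grads))

# Toy usage with F_k(w) = 0.5 * ||w - c_k||^2 on three devices.
centers = [np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([0.0, 4.0])]
p = np.ones(3) / 3
w, q, lr = np.zeros(2), 1.0, 0.2
for _ in range(100):
    losses = [0.5 * np.sum((w - c) ** 2) for c in centers]
    grads = [w - c for c in centers]
    w -= lr * qffl_gradient(losses, grads, p, q)
print(w, qffl_objective([0.5 * np.sum((w - c) ** 2) for c in centers], p, q))
```

Larger q pushes the solution toward the worst-off devices (more uniform accuracy), while q = 0 reduces to the usual aggregate objective.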


Journal ArticleDOI
TL;DR: In this paper, the authors conduct a systematic and in-depth survey of the ML- and DL-based resource management mechanisms in cellular wireless and IoT networks, and identify the future research directions in using ML and DL for resource allocation and management in IoT networks.
Abstract: Internet-of-Things (IoT) refers to a massively heterogeneous network formed through smart devices connected to the Internet. In the wake of disruptive IoT with a huge amount and variety of data, Machine Learning (ML) and Deep Learning (DL) mechanisms will play a pivotal role to bring intelligence to the IoT networks. Among other aspects, ML and DL can play an essential role in addressing the challenges of resource management in large-scale IoT networks. In this article, we conduct a systematic and in-depth survey of the ML- and DL-based resource management mechanisms in cellular wireless and IoT networks. We start with the challenges of resource management in cellular IoT and low-power IoT networks, review the traditional resource management mechanisms for IoT networks, and motivate the use of ML and DL techniques for resource management in these networks. Then, we provide a comprehensive survey of the existing ML- and DL-based resource management techniques in wireless IoT networks and the techniques specifically designed for HetNets, MIMO and D2D communications, and NOMA networks. To this end, we also identify the future research directions in using ML and DL for resource allocation and management in IoT networks.

169 citations


Journal ArticleDOI
TL;DR: This work provides a generic paradigm for the systematic categorization and analysis of the merits of various existing HNE algorithms, and creates four benchmark datasets with various properties regarding scale, structure, attribute/label availability, etc., from different sources, towards handy and fair evaluations of HNE algorithms.
Abstract: Since real-world objects and their interactions are often multi-modal and multi-typed, heterogeneous networks have been widely used as a more powerful, realistic, and generic superclass of traditional homogeneous networks (graphs). Meanwhile, representation learning (a.k.a. embedding) has recently been intensively studied and shown effective for various network mining and analytical tasks. In this work, we aim to provide a unified framework to deeply summarize and evaluate existing research on heterogeneous network embedding (HNE), which includes but goes beyond a normal survey. Since there has already been a broad body of HNE algorithms, as the first contribution of this work, we provide a generic paradigm for the systematic categorization and analysis of the merits of various existing HNE algorithms. Moreover, existing HNE algorithms, though mostly claimed generic, are often evaluated on different datasets. As the second contribution, we create four benchmark datasets with various properties regarding scale, structure, attribute/label availability, etc., from different sources, towards handy and fair evaluations of HNE algorithms. As the third contribution, we carefully refactor and amend the implementations and create friendly interfaces for eleven popular HNE algorithms, and provide all-around comparisons among them over multiple tasks and experimental settings.

132 citations


Journal ArticleDOI
TL;DR: This article presents a heterogeneous radio frequency (RF)/visible light communication (VLC) industrial network architecture to guarantee different QoS requirements, where RF offers wide-area coverage and VLC provides high transmission data rates.
Abstract: Smart factory under Industry 4.0 and industrial Internet of Things (IoT) has attracted much attention from both academia and industry. In wireless industrial networks, industrial IoT and IoT devices have different quality-of-service (QoS) requirements, ranging from ultra-reliable low-latency communications (URLLC) to high transmission data rates. These industrial networks will be highly complex and heterogeneous, as well as the spectrum and energy resources are severely limited. Hence, this article presents a heterogeneous radio frequency (RF)/visible light communication (VLC) industrial network architecture to guarantee the different QoS requirements, where RF is capable of offering wide-area coverage and VLC has the ability to provide high transmission data rate. A joint uplink and downlink energy-efficient resource management decision-making problem (network selection, subchannel assignment, and power management) is formulated as a Markov decision process. In addition, a new deep post-decision state (PDS)-based experience replay and transfer (PDS-ERT) reinforcement learning algorithm is proposed to learn the optimal policy. Simulation results corroborate the superiority in performance of the presented heterogeneous network, and verify that the proposed PDS-ERT learning algorithm outperforms other existing algorithms in terms of meeting the energy efficiency and the QoS requirements.

131 citations


Journal ArticleDOI
TL;DR: An improvement of the existing stable election protocol (SEP) that implements threshold-based cluster head (CH) selection for a heterogeneous network; the proposed scheme outperforms the SEP and DEEC protocols with improvements of 300% in network lifetime and 56% in throughput.
Abstract: Wireless sensor networks (WSNs) form a virtual layer in the paradigm of the Internet of Things (IoT). They relate information from the physical domain to IoT-driven computational systems. A WSN provides ubiquitous access to location, the status of different entities of the environment, and data acquisition for long-term IoT monitoring. Since energy is a major constraint in the design process of a WSN, recent advances have led to the design of various energy-efficient protocols. Routing data involves a considerable amount of energy expenditure. In recent times, various heuristic clustering protocols have been discussed to serve this purpose. This article improves the existing stable election protocol (SEP) by implementing threshold-based cluster head (CH) selection for a heterogeneous network. The threshold maintains uniform energy distribution between member and CH nodes. The sensor nodes are also categorized into three different types, called normal, intermediate, and advanced, depending on the initial energy supply, to distribute the network load evenly. The simulation results show that the proposed scheme outperforms the SEP and DEEC protocols with improvements of 300% in network lifetime and 56% in throughput.
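Cluster-head election in SEP-style protocols is driven by a rotating threshold T(n) = p / (1 − p·(r mod 1/p)), where the election probability p is weighted by each node's initial-energy class. The sketch below illustrates such a threshold-based CH election for normal/intermediate/advanced nodes; the specific weighting factors and population fractions are illustrative assumptions, not the paper's exact parameters.

```python
import random

def election_prob(p_opt, node_type, alpha=1.0, beta=2.0, m=0.2, b=0.3):
    """Weighted election probability per node class (illustrative weighting).

    p_opt: desired overall fraction of cluster heads per round.
    alpha, beta: extra-energy factors of intermediate and advanced nodes.
    m, b: fractions of advanced and intermediate nodes in the network.
    """
    denom = 1 + m * beta + b * alpha
    weights = {"normal": 1.0, "intermediate": 1 + alpha, "advanced": 1 + beta}
    return p_opt * weights[node_type] / denom

def is_cluster_head(p, current_round, was_ch_recently):
    """SEP/LEACH-style threshold test: nodes that served recently are excluded."""
    if was_ch_recently:
        return False
    threshold = p / (1 - p * (current_round % int(round(1 / p))))
    return random.random() < threshold

# Toy usage: one election round over a mixed population.
nodes = [("normal", False)] * 50 + [("intermediate", False)] * 30 + [("advanced", False)] * 20
heads = [i for i, (t, recent) in enumerate(nodes)
         if is_cluster_head(election_prob(0.1, t), current_round=3, was_ch_recently=recent)]
print(len(heads), "cluster heads elected")
```

Because advanced and intermediate nodes are elected with higher probability, they carry the CH role more often, which is how the protocol evens out energy consumption across node classes.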

Journal ArticleDOI
TL;DR: This paper introduces various link prediction approaches and addresses how researchers have combined link prediction as a base method to support other applications in social networks, such as recommender systems, community detection, anomaly detection, and influence analysis.

Journal ArticleDOI
TL;DR: A privacy-preserved, incentive-compatible, and spectrum-efficient framework based on blockchain is developed and implemented in two stages, and the operation details of secure spectrum sharing, incentive mechanism design, and efficient spectrum allocation are elaborated.
Abstract: In the future 5G paradigm, billions of machine-type devices will be deployed to enable wide-area and ubiquitous data sensing, collection, and transmission. Considering the traffic characteristics of machine-to-machine (M2M) communications and the spectrum shortage dilemma, a cost-efficient solution is to share the underutilized spectrum allocated to human-to-human (H2H) users with M2M devices in an opportunistic manner. However, the implementation of large-scale spectrum sharing in 5G heterogeneous networks confronts many challenges, including lack of incentive mechanism, privacy leakage, security threats, and so on. This motivates us to develop a privacy-preserved, incentive-compatible, and spectrum-efficient framework based on blockchain, which is implemented in two stages. First, H2H users sign a contract with the base station for spectrum sharing, and receive dedicated payments based on their contributions. Next, the shared spectrum is allocated to M2M devices to maximize the total throughput. We elaborate on the operation details of secure spectrum sharing, incentive mechanism design, and efficient spectrum allocation. A case study is presented to demonstrate the security and efficiency of the proposed framework. Finally, we outline several open issues and conclude this article.

Journal ArticleDOI
TL;DR: This paper provides the motivation behind why LiFi is a very timely technology, especially for sixth-generation (6G) cellular communications, and presents results from a LiFi deployment in a school classroom, which show that Wi-Fi network performance can be improved significantly by offloading traffic to the LiFi network.
Abstract: LiFi is networked, bidirectional wireless communication with light. It is used to connect fixed and mobile devices at very high data rates by harnessing the visible light and infrared spectrum. Combined, these spectral resources are 2600 times larger than the entire radio frequency (RF) spectrum. This paper provides the motivation behind why LiFi is a very timely technology, especially for 6th generation (6G) cellular communications. It discusses and reviews essential networking technologies, such as interference mitigation and hybrid LiFi/Wi-Fi networking topologies. We also consider the seamless integration of LiFi into existing wireless networks to form heterogeneous networks across the optical and RF domains and discuss implications and solutions in terms of load balancing. Finally, we provide the results of a real-world hybrid LiFi/Wi-Fi network deployment in a software defined networking testbed. In addition, results from a LiFi deployment in a school classroom are provided, which show that Wi-Fi network performance can be improved significantly by offloading traffic to the LiFi.

Journal ArticleDOI
TL;DR: This article presents a systematic and comprehensive review of virtualization techniques explicitly designed for IoT networks, and classifies the literature into software-defined networks designed for IoT, function virtualization for IoT networks, and software-defined IoT networks.
Abstract: Internet of Things (IoT) and Network Softwarization are fast becoming core technologies of information systems and network management for the next-generation Internet. The deployment and applications of IoT range from smart cities to urban computing and from ubiquitous healthcare to tactile Internet. For this reason, the physical infrastructure of heterogeneous network systems has become more complicated and thus requires efficient and dynamic solutions for management, configuration, and flow scheduling. Network softwarization in the form of Software Defined Networks and Network Function Virtualization has been extensively researched for IoT in the recent past. In this article, we present a systematic and comprehensive review of virtualization techniques explicitly designed for IoT networks. We have classified the literature into software-defined networks designed for IoT, function virtualization for IoT networks, and software-defined IoT networks. These categories are further divided into works that present architectural, security, and management solutions. Besides, the article highlights several short-term and long-term research challenges and open issues related to the adoption of software-defined Internet of Things.

Journal ArticleDOI
TL;DR: This paper investigates the channel model in a high-mobility heterogeneous network and proposes a novel deep-reinforcement-learning-based intelligent TDD configuration algorithm that dynamically allocates radio resources in an online manner and achieves significant network performance improvements in terms of both network throughput and packet loss rate.
Abstract: Recently, 5G has been widely deployed to support communications of high-mobility nodes, including trains, vehicles, and unmanned aerial vehicles (UAVs), which have largely emerged as the main components for constructing the wireless heterogeneous network (HetNet). To further improve the radio utilization, Time Division Duplex (TDD) is considered to be a potential full-duplex communication technology in the high-mobility 5G network. However, the high mobility of users leads to highly dynamic network traffic and unpredicted link state changes. A new method to predict the dynamic traffic and channel condition and to schedule the TDD configuration in real time is essential for the high-mobility environment. In this paper, we investigate the channel model in the high-mobility heterogeneous network and propose a novel deep reinforcement learning based intelligent TDD configuration algorithm to dynamically allocate radio resources in an online manner. In the proposal, a deep neural network is employed to extract the features of the complex network information, and dynamic Q-value iteration based reinforcement learning with an experience replay memory mechanism is proposed to adaptively change the TDD uplink/downlink ratio based on evaluated rewards. The simulation results show that the proposal achieves significant network performance improvement in terms of both network throughput and packet loss rate, compared with conventional TDD resource allocation algorithms.
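As a simplified stand-in for the paper's deep RL agent, the sketch below combines an experience replay buffer with epsilon-greedy tabular Q-learning over a discrete set of TDD uplink/downlink ratios; the state discretization, reward shape, and hyperparameters are illustrative assumptions, and in the paper's setting a deep neural network would replace the Q-table.

```python
import random
from collections import deque, defaultdict

TDD_RATIOS = [(1, 9), (2, 8), (4, 6), (6, 4), (8, 2)]  # (uplink, downlink) slots

class TddAgent:
    def __init__(self, alpha=0.1, gamma=0.9, eps=0.1, buf=1000):
        self.q = defaultdict(float)          # Q[(state, action)]
        self.replay = deque(maxlen=buf)      # experience replay memory
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def act(self, state):
        if random.random() < self.eps:
            return random.randrange(len(TDD_RATIOS))
        return max(range(len(TDD_RATIOS)), key=lambda a: self.q[(state, a)])

    def learn(self, batch_size=32):
        batch = random.sample(list(self.replay), min(batch_size, len(self.replay)))
        for s, a, r, s_next in batch:
            best_next = max(self.q[(s_next, a2)] for a2 in range(len(TDD_RATIOS)))
            target = r + self.gamma * best_next
            self.q[(s, a)] += self.alpha * (target - self.q[(s, a)])

# Toy environment: state = coarse uplink-demand level; reward favors matching it.
agent = TddAgent()
state = 2
for step in range(2000):
    action = agent.act(state)
    up_share = TDD_RATIOS[action][0] / 10
    demand = state / 4                       # uplink demand as a fraction
    reward = -abs(up_share - demand)         # throughput proxy: match the demand
    next_state = random.choice(range(5))     # demand drifts with user mobility
    agent.replay.append((state, action, reward, next_state))
    agent.learn()
    state = next_state
print(max(range(len(TDD_RATIOS)), key=lambda a: agent.q[(2, a)]))
```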

Journal ArticleDOI
TL;DR: A model of the three-layer heterogeneous satellite network is constructed, a low-complexity method for calculating the capacity between satellites is proposed, and a Q-learning-based capacity allocation algorithm is developed to optimize the long-term utility of the system.
Abstract: The development of satellite networks is drawing much more attention in recent years due to their wide coverage ability. Composed of geosynchronous orbit (GEO), medium earth orbit (MEO), and low earth orbit (LEO) satellites, the satellite network is a three-layer heterogeneous network of high complexity, for which comprehensive theoretical analysis is still missing. In this paper, we investigate the problem of capacity management in the three-layer heterogeneous satellite network. We first construct the model of the network and propose a low-complexity method for calculating the capacity between satellites. Based on the time structure of the time expanded graph, the searching space is greatly reduced compared to traditional augmenting path searching strategies, which can significantly reduce the computing complexity. Then, based on Q-learning, we propose a long-term optimal capacity allocation algorithm to optimize the long-term utility of the system. In order to reduce the storage and computing complexity, a low-complexity learning framework is constructed while taking the properties of satellite systems into account. Finally, we analyze the capacity performance of the three-layer heterogeneous satellite network and also evaluate the proposed algorithms with numerical results.

Journal ArticleDOI
TL;DR: This paper investigates the resource optimization problem of NOMA heterogeneous small cell networks with simultaneous wireless information and power transfer (SWIPT) by decoupling subchannel allocation and power control, and proposes a low-complexity subchannel matching algorithm.
Abstract: Non-orthogonal multiple access (NOMA) in heterogeneous networks (HetNets) is a very promising scheme to meet the exponential growth of mobile data expected in the coming years. However, since wireless networks are becoming denser, the energy consumption of such networks is increasingly severe. Therefore, it is necessary to design novel energy efficiency (EE) maximization technologies under the constraint of limited energy supply. This paper investigates the resource optimization problem of NOMA heterogeneous small cell networks with simultaneous wireless information and power transfer (SWIPT). By decoupling subchannel allocation and power control, a low-complexity subchannel matching algorithm is designed. Furthermore, to maximize the energy efficiency, a power optimization algorithm is proposed using Lagrangian duality. Aiming at the power allocation problem, the original non-convex and non-linear energy efficiency optimization problem is transformed into a more tractable one. Simulation results demonstrate the effectiveness and convergence of the proposed optimization scheme in terms of system energy efficiency.
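Energy-efficiency maximization of this kind is a fractional program (achievable rate divided by consumed power). The paper handles the power step via Lagrangian duality; the sketch below instead uses the closely related Dinkelbach iteration on a single-link toy problem as a hedged, simplified stand-in, with all channel and power constants chosen purely for illustration.

```python
import math

def dinkelbach_ee(g=1e-7, n0=1e-13, bw=1e6, p_max=1.0, p_circuit=0.5, iters=20):
    """Maximize EE = R(p) / (p + p_circuit) with R(p) = bw*log2(1 + g*p/n0).

    Dinkelbach: repeatedly solve max_p R(p) - lam*(p + p_circuit), update lam.
    (A simplified single-link stand-in for the paper's Lagrangian-duality step.)
    """
    lam = 0.0
    p = p_max
    for _ in range(iters):
        # Closed-form maximizer of R(p) - lam*p (water-filling form), clipped to [0, p_max].
        if lam > 0:
            p = min(max(bw / (lam * math.log(2)) - n0 / g, 0.0), p_max)
        rate = bw * math.log2(1 + g * p / n0)
        lam = rate / (p + p_circuit)          # updated EE estimate (bits/Joule)
    return p, lam

p_opt, ee = dinkelbach_ee()
print(f"power={p_opt:.3f} W, EE={ee/1e6:.2f} Mbit/J")
```

The sequence of lam values is monotonically non-decreasing and converges to the optimal energy efficiency, which is why fractional-programming updates of this form (like the dual-based update in the paper) terminate in a few iterations.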

Posted Content
TL;DR: In this article, a distributed caching optimization algorithm via belief propagation (BP) for minimizing the downloading latency is proposed, where the authors derive the delay minimization objective function and formulate an optimization problem.
Abstract: Heterogeneous cellular networks (HCN) with embedded small cells are considered, where multiple mobile users wish to download network content of different popularity. By caching data into the small-cell base stations (SBS), we will design distributed caching optimization algorithms via belief propagation (BP) for minimizing the downloading latency. First, we derive the delay-minimization objective function (OF) and formulate an optimization problem. Then we develop a framework for modeling the underlying HCN topology with the aid of a factor graph. Furthermore, distributed BP algorithm is proposed based on the network's factor graph. Next, we prove that a fixed point of convergence exists for our distributed BP algorithm. In order to reduce the complexity of the BP, we propose a heuristic BP algorithm. Furthermore, we evaluate the average downloading performance of our HCN for different numbers and locations of the base stations (BS) and mobile users (MU), with the aid of stochastic geometry theory. By modeling the nodes distributions using a Poisson point process, we develop the expressions of the average factor graph degree distribution, as well as an upper bound of the outage probability for random caching schemes. We also improve the performance of random caching. Our simulations show that (1) the proposed distributed BP algorithm has a near-optimal delay performance, approaching that of the high-complexity exhaustive search method, (2) the modified BP offers a good delay performance at a low communication complexity, (3) both the average degree distribution and the outage upper bound analysis relying on stochastic geometry match well with our Monte-Carlo simulations, and (4) the optimization based on the upper bound provides both a better outage and a better delay performance than the benchmarks.

Journal ArticleDOI
TL;DR: This paper proposes sFlow- and adaptive-polling-based sampling with the Snort Intrusion Detection System (IDS) and a deep-learning-based model, which helps to mitigate various types of prevalent DDoS attacks inside the IoT network.

Journal ArticleDOI
TL;DR: A deep-reinforcement-learning-based quality-of-service (QoS)-aware secure routing protocol (DQSP) is proposed, which can extract knowledge from history traffic demands by interacting with the underlying network environment, and dynamically optimize the routing policy.
Abstract: Recently, with the proliferation of communication devices, the Internet of Things (IoT) has become an emerging technology which enables massive numbers of devices to be connected through heterogeneous networks. However, it is usually a technical challenge for traditional networks to handle such a huge number of devices in an efficient manner. Recently, the software-defined network (SDN) technique, with its agility and elasticity, has been incorporated into IoT to meet the potential scale and flexibility requirements and form a novel IoT architecture also known as SDN-IoT. As the size of SDN-IoT increases, efficient routing protocols with low latency and high security are required, while the default routing protocols of SDN are still vulnerable to dynamic changes of flow control rules, especially when the network is under attack. To address the above issues, a deep-reinforcement-learning-based quality-of-service (QoS)-aware secure routing protocol (DQSP) is proposed in this article. While guaranteeing the QoS, our method can extract knowledge from historical traffic demands by interacting with the underlying network environment, and dynamically optimize the routing policy. Extensive simulation experiments have been conducted with respect to several network performance metrics, demonstrating that our DQSP has good convergence and high effectiveness. Moreover, DQSP outperforms the traditional OSPF routing protocol, with at least 10% relative performance gains in most cases.

Journal ArticleDOI
TL;DR: A distributed Q-learning aided power allocation algorithm for two-layer heterogeneous IIoT networks is proposed, and the spirit of designing reward functions is discussed, followed by four delicately defined reward functions considering both the QoS of femtocell and macrocell IoT users and their fairness.
Abstract: To achieve the goal of “Industry 4.0,” the cellular network with wide coverage has gradually become an intensely important carrier for the industrial Internet of Things (IIoT). The fifth generation cellular network is expected to be a unifying network that may connect billions of IIoT devices for the sake of supporting advanced IIoT business. In order to realize wide and seamless information coverage, a heterogeneous network architecture becomes a beneficial method, which can also improve the near-ceiling network capacity. In order to guarantee the quality of service (QoS) as well as the fairness of different IIoT devices with limited network resources, the network association in IIoT should be performed in a more intelligent manner. In this article, we propose a distributed Q-learning aided power allocation algorithm for two-layer heterogeneous IIoT networks. Moreover, we discuss the spirit of designing reward functions, followed by four delicately defined reward functions considering both the QoS of femtocell IoT user equipments and macrocell IoT user equipments and their fairness. Also, both fixed and dynamic learning rates and different kinds of multiagent cooperation modes are investigated. Finally, simulation results show the effectiveness and superiority of our proposed Q-learning based power allocation algorithm.
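The heart of such an approach is the reward design: each femtocell agent should be rewarded for its own QoS only while it does not degrade macrocell users, with a term that evens out performance for fairness. Below is a hedged tabular sketch of one agent's Q-update and one possible reward shape; the SINR thresholds, weights, and power levels are illustrative assumptions, not the paper's four reward functions.

```python
import random
from collections import defaultdict

POWER_LEVELS = [0.01, 0.05, 0.1, 0.2]      # candidate transmit powers (W)

def reward(femto_sinr_db, macro_sinr_db, femto_target=10.0, macro_target=5.0, fairness=0.1):
    """Reward the femtocell user's QoS, penalize macrocell QoS violations,
    and subtract a small fairness/overshoot term."""
    if macro_sinr_db < macro_target:        # protect the macrocell IoT user first
        return -1.0
    qos = 1.0 if femto_sinr_db >= femto_target else femto_sinr_db / femto_target
    return qos - fairness * abs(femto_sinr_db - femto_target)

class FemtoAgent:
    """One femtocell base station running independent (distributed) Q-learning."""
    def __init__(self, alpha=0.1, gamma=0.8, eps=0.1):
        self.q = defaultdict(float)
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def choose(self, state):
        if random.random() < self.eps:
            return random.randrange(len(POWER_LEVELS))
        return max(range(len(POWER_LEVELS)), key=lambda a: self.q[(state, a)])

    def update(self, state, action, r, next_state):
        best_next = max(self.q[(next_state, a)] for a in range(len(POWER_LEVELS)))
        self.q[(state, action)] += self.alpha * (
            r + self.gamma * best_next - self.q[(state, action)])
```

The snippet only illustrates the independent-learner case; the paper additionally studies fixed versus dynamic learning rates and several multi-agent cooperation modes on top of this basic update.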

Journal ArticleDOI
TL;DR: A network-based computational framework, termed AOPEDF (an arbitrary-order proximity embedded deep forest approach), is presented for prediction of DTIs; it achieves high accuracy in identifying molecular targets among known drugs on two external validation sets, outperforming several state-of-the-art methods.
Abstract: Motivation: Systematic identification of molecular targets among known drugs plays an essential role in drug repurposing and understanding of their unexpected side effects. Computational approaches for prediction of drug-target interactions (DTIs) are highly desired in comparison to traditional experimental assays. Furthermore, recent advances of multiomics technologies and systems biology approaches have generated large-scale heterogeneous, biological networks, which offer unexpected opportunities for network-based identification of new molecular targets among known drugs. Results: In this study, we present a network-based computational framework, termed AOPEDF, an arbitrary-order proximity embedded deep forest approach, for prediction of DTIs. AOPEDF learns a low-dimensional vector representation of features that preserve arbitrary-order proximity from a highly integrated, heterogeneous biological network connecting drugs, targets (proteins) and diseases. In total, we construct a heterogeneous network by uniquely integrating 15 networks covering chemical, genomic, phenotypic and network profiles among drugs, proteins/targets and diseases. Then, we build a cascade deep forest classifier to infer new DTIs. Via systematic performance evaluation, AOPEDF achieves high accuracy in identifying molecular targets among known drugs on two external validation sets collected from DrugCentral [area under the receiver operating characteristic curve (AUROC) = 0.868] and ChEMBL (AUROC = 0.768) databases, outperforming several state-of-the-art methods. In a case study, we showcase that multiple molecular targets predicted by AOPEDF are associated with mechanism-of-action of substance abuse disorder for several marketed drugs (such as aripiprazole, risperidone and haloperidol). Availability and implementation: Source code and data can be downloaded from https://github.com/ChengF-Lab/AOPEDF.

Journal ArticleDOI
TL;DR: In this paper, a secure framework for SDN-based Edge computing in an IoT-enabled healthcare system is designed using a lightweight authentication scheme, and the results demonstrate that the proposed framework provides better solutions for IoT-enabled healthcare systems.
Abstract: The Internet of Things (IoT) consists of resource-constrained smart devices capable to sense and process data. It connects a huge number of smart sensing devices, i.e., things, and heterogeneous networks. The IoT is incorporated into different applications, such as smart health, smart home, smart grid, etc. The concept of smart healthcare has emerged in different countries, where pilot projects of healthcare facilities are analyzed. In IoT-enabled healthcare systems, the security of IoT devices and associated data is very important, whereas Edge computing is a promising architecture that solves their computational and processing problems. Edge computing is economical and has the potential to provide low latency data services by improving the communication and computation speed of IoT devices in a healthcare system. In Edge-based IoT-enabled healthcare systems, load balancing, network optimization, and efficient resource utilization are accurately performed using artificial intelligence (AI), i.e., intelligent software-defined network (SDN) controller. SDN-based Edge computing is helpful in the efficient utilization of limited resources of IoT devices. However, these low powered devices and associated data (private sensitive data of patients) are prone to various security threats. Therefore, in this paper, we design a secure framework for SDN-based Edge computing in IoT-enabled healthcare system. In the proposed framework, the IoT devices are authenticated by the Edge servers using a lightweight authentication scheme. After authentication, these devices collect data from the patients and send them to the Edge servers for storage, processing, and analyses. The Edge servers are connected with an SDN controller, which performs load balancing, network optimization, and efficient resource utilization in the healthcare system. The proposed framework is evaluated using computer-based simulations. The results demonstrate that the proposed framework provides better solutions for IoT-enabled healthcare systems.

Journal ArticleDOI
TL;DR: A security analysis of the proposed authentication protocol, using the automated protocol verification tool ProVerif, BAN logic, and informal security analysis, proves that the protocol is secure; the scheme is also shown to be efficient and to achieve higher security standards.
Abstract: Currently, the popularity of the Internet of Things (IoT) has brought about an increase in the amount of data, so multi-server distributed cloud computing has been widely used in various applications that have brought convenience to our daily lives. At the same time, the development of the fifth generation (5G) of mobile communication technology has gradually become the main driving force for the popularization of the IoT. Because the 5G network is a heterogeneous network with multiple servers and small cells, the mutual authentication protocol under multiple servers is also applicable to the 5G network environment. However, much of the data will have serious storage and security issues during transmission. Aiming at the security issues in a multi-server (M-S) architecture, in 2018, Wu et al. proposed an authentication protocol in a distributed cloud environment. They claimed that their protocol is secure and resistant to various known types of attacks. However, we found that their protocol does not guarantee perfect forward secrecy (PFS) and suffers from privileged insider (PI) attacks. Such attacks will cause data to be out of sync. Therefore, we improved Wu et al.’s protocol and proposed an improvement in the 5G network environment. Finally, we performed a security analysis on the proposed protocol, including the automatic encryption protocol tool ProVerif, BAN logic, and informal security analysis, which proved that our protocol is secure. Compared with similar existing schemes, we have proved the efficiency of the scheme and achieved higher security standards.

Journal ArticleDOI
TL;DR: This paper proposes a reconfigurable service provisioning framework based on service function chaining (SFC) for SAGIN, and formulates the SFC planning problem as an integer non-linear programming problem, which is NP-hard.
Abstract: Space-air-ground integrated networks (SAGIN) extend the capability of wireless networks and will be the essential building block for many advanced applications, such as autonomous driving and earth monitoring. However, coordinating heterogeneous physical resources is very challenging in such a large-scale dynamic network. In this paper, we propose a reconfigurable service provisioning framework based on service function chaining (SFC) for SAGIN. In SFC, the network functions are virtualized and the service data needs to flow through specific network functions in a predefined sequence. The inherent issue is how to plan the service function chains over large-scale heterogeneous networks, subject to the resource limitations of both communication and computation. Specifically, we must jointly consider the embedding of virtual network functions (VNFs) and the routing of service data. We formulate the SFC planning problem as an integer non-linear programming problem, which is NP-hard. Then, a heuristic greedy algorithm is proposed, which concentrates on leveraging the different features of aerial and ground nodes and balancing the resource consumption. Furthermore, a new metric, the aggregation ratio (AR), is proposed to elaborate the communication-computation tradeoff. Extensive simulations show that our proposed algorithm achieves near-optimal performance. We also find that the SAGIN significantly reduces the service blockage probability and improves the efficiency of resource utilization. Finally, a case study on multiple intersection traffic scheduling is provided to demonstrate the effectiveness of our proposed SFC-based service provisioning framework.
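To make the greedy planning step concrete, the sketch below places each VNF of a chain on the feasible node with the most remaining capacity, which is one simple way to balance resource consumption across heterogeneous nodes. The data structures, the load-balancing rule, and the omission of link bandwidth and routing are illustrative simplifications, not the paper's algorithm.

```python
def greedy_sfc_placement(chain, nodes):
    """Place an ordered service function chain onto heterogeneous nodes.

    chain: list of (vnf_name, cpu_demand)
    nodes: dict node_id -> remaining CPU capacity (satellite/aerial/ground)
    Returns a mapping vnf_name -> node_id, or None if the request is blocked.
    """
    placement = {}
    remaining = dict(nodes)
    for vnf, demand in chain:
        # Greedy, load-balancing choice: the feasible node with the most spare capacity.
        candidates = [n for n, cap in remaining.items() if cap >= demand]
        if not candidates:
            return None                      # service request blocked
        best = max(candidates, key=lambda n: remaining[n])
        placement[vnf] = best
        remaining[best] -= demand
    return placement

nodes = {"LEO-1": 4.0, "UAV-2": 2.5, "ground-3": 8.0}
chain = [("firewall", 2.0), ("transcoder", 3.0), ("aggregator", 1.0)]
print(greedy_sfc_placement(chain, nodes))
```

A fuller version in the spirit of the paper would also charge the inter-node links traversed between consecutive VNFs and weigh the aggregation ratio when deciding whether to co-locate functions.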

Proceedings ArticleDOI
09 Jul 2020
TL;DR: This survey examines and reviews the problem of representation learning with a focus on heterogeneous networks, which consist of different types of vertices and relations, and builds the Heterogeneous Graph Benchmark to facilitate open research on this rapidly developing topic.
Abstract: Representation learning has offered a revolutionary learning paradigm for various AI domains. In this survey, we examine and review the problem of representation learning with a focus on heterogeneous networks, which consist of different types of vertices and relations. The goal of this problem is to automatically project objects, most commonly vertices, in an input heterogeneous network into a latent embedding space such that both the structural and relational properties of the network can be encoded and preserved. The embeddings (representations) can then be used as features for machine learning algorithms addressing the corresponding network tasks. To learn expressive embeddings, current research developments fall into two major categories: shallow embedding learning and graph neural networks. After a thorough review of the existing literature, we identify several critical challenges that remain unaddressed and discuss future directions. Finally, we build the Heterogeneous Graph Benchmark to facilitate open research for this rapidly-developing topic.

Journal ArticleDOI
TL;DR: A novel framework built on deep learning is proposed that can detect attacks via classification and makes it possible to provide high-quality real-time forensics services on edge consumer devices such as cell phones and laptops, which brings colossal practical value.
Abstract: The upcoming 5G heterogeneous networks (HetNets) have attracted much attention worldwide. Large amounts of high velocity data can be transported by using the bandwidth spectrum of HetNets, yielding both great benefits and several concerning issues. In particular, great harm to our community could occur if the main visual information channels, such as images and videos, are maliciously attacked and uploaded to the internet, where they can be spread quickly. Therefore, we propose a novel framework as a digital forensics tool to protect end users. It is built based on deep learning and can realize the detection of attacks via classification. Compared with the conventional methods and justified by our experiments, the data collection efficiency, robustness, and detection performance of the proposed model are all refined. In addition, assisted by 5G HetNets, our proposed framework makes it possible to provide high-quality real-time forensics services on edge consumer devices (ECE) such as cell phones and laptops, which brings colossal practical value. Some discussions are also carried out to outline potential future threats.

Journal ArticleDOI
TL;DR: This paper provides a comprehensive study on the mobility management in 5G HetNet in terms of radio resource control, the initial access and registration procedure of the user equipment to the network, the paging procedure that provides the location of the UE within the network, connected mode mobility management schemes, beam level mobility and beam management.
Abstract: With the rapid increase in the number of mobile users, wireless access technologies are evolving to provide mobile users with high data rates and support new applications that include both human and machine-type communications. Heterogeneous networks (HetNets), created by the joint installation of macro cells and a large number of densely deployed small cells, are considered an important solution to deal with the increasing network capacity demands and provide high coverage to wireless users in future fifth generation (5G) wireless networks. Due to the increasing complexity of network topology in 5G HetNets with the integration of many different base station types, mobility management in the 5G architecture faces many challenges. Intense deployment of small cells, along with the many advantages it provides, brings important mobility management problems such as frequent handover (HO), HO failure, HO delays, ping-pong HO and high energy consumption, which result in a degraded user experience and heavy signaling loads. In this paper, we provide a comprehensive study on mobility management in 5G HetNets in terms of radio resource control, the initial access and registration procedure of the user equipment (UE) to the network, the paging procedure that provides the location of the UE within the network, connected mode mobility management schemes, beam level mobility and beam management. Besides, this paper addresses the challenges and suggests possible solutions for 5G mobility management.

Journal ArticleDOI
TL;DR: The study aims to provide a detailed review of cooperative communication among all these techniques and of the potential spectrum management problems, which have been addressed with the possible solutions proposed by recent research.
Abstract: With an extensive growth in user demand for high throughput, large capacity, and low latency, the ongoing deployment of Fifth-Generation (5G) systems is continuously exposing the inherent limitations of the system, as compared with its original premises. Such limitations are encouraging researchers worldwide to focus on next-generation 6G wireless systems, which are expected to address the constraints. To meet the above demands, future radio network architecture should be effectively designed to utilize its maximum radio spectrum capacity. It must simultaneously utilize various new techniques and technologies, such as Carrier Aggregation (CA), Cognitive Radio (CR), small cell-based Heterogeneous Networks (HetNets), high-spectrum access (mmWave), and Massive Multiple-Input-Multiple-Output (M-MIMO), to achieve the desired results. However, the concurrent operation of these techniques in current 5G cellular networks creates several spectrum management issues; thus, a comprehensive overview of these emerging technologies is presented in detail in this study. Then, the problems involved in the concurrent operation of various technologies for spectrum management in the current 5G network are highlighted. The study aims to provide a detailed review of cooperative communication among all these techniques and of the potential spectrum management problems, which have been addressed with the possible solutions proposed by recent research. Future research challenges are also discussed to highlight the necessary steps that can help achieve the desired objectives for designing 6G wireless networks.