scispace - formally typeset

Showing papers on "Heterogeneous network published in 2021"


Journal ArticleDOI
TL;DR: 6G with additional technical requirements beyond those of 5G will enable faster and further communications to the extent that the boundary between physical and cyber worlds disappears.
Abstract: The fifth generation (5G) wireless communication networks have been deployed worldwide since 2020, and more capabilities are in the process of being standardized, such as mass connectivity, ultra-reliability, and guaranteed low latency. However, 5G will not meet all requirements of the future in 2030 and beyond, and sixth generation (6G) wireless communication networks are expected to provide global coverage, enhanced spectral/energy/cost efficiency, higher levels of intelligence and security, etc. To meet these requirements, 6G networks will rely on new enabling technologies, i.e., air interface and transmission technologies and novel network architecture, such as waveform design, multiple access, channel coding schemes, multi-antenna technologies, network slicing, cell-free architecture, and cloud/fog/edge computing. Our vision of 6G is that it will involve four new paradigm shifts. First, to satisfy the requirement of global coverage, 6G will not be limited to terrestrial communication networks, which will need to be complemented with non-terrestrial networks such as satellite and unmanned aerial vehicle (UAV) communication networks, thus achieving a space-air-ground-sea integrated communication network. Second, all spectra will be fully explored to further increase data rates and connection density, including the sub-6 GHz, millimeter wave (mmWave), terahertz (THz), and optical frequency bands. Third, facing the big datasets generated by the use of extremely heterogeneous networks, diverse communication scenarios, large numbers of antennas, wide bandwidths, and new service requirements, 6G networks will enable a new range of smart applications with the aid of artificial intelligence (AI) and big data technologies. Fourth, network security will have to be strengthened when developing 6G networks. This article provides a comprehensive survey of recent advances and future trends in these four aspects.
Clearly, 6G with additional technical requirements beyond those of 5G will enable faster and further communications to the extent that the boundary between physical and cyber worlds disappears.

935 citations


Journal ArticleDOI
TL;DR: A comprehensive survey on RA in HetNets for 5G communications is provided, along with two potential structures for 6G communications: a learning-based RA structure and a control-based RA structure.
Abstract: In the fifth-generation (5G) mobile communication system, various service requirements of different communication environments are expected to be satisfied. As a new evolutionary network structure, the heterogeneous network (HetNet) has been studied in recent years. Compared with homogeneous networks, HetNets can increase the opportunities for spatial resource reuse and improve users' quality of service by deploying small cells within the coverage of macrocells. However, since there is mutual interference among different users and spectrum resources in HetNets are limited, efficient resource allocation (RA) algorithms are vitally important to reduce the mutual interference and achieve spectrum sharing. In this article, we provide a comprehensive survey on RA in HetNets for 5G communications. Specifically, we first introduce the definition and different network scenarios of HetNets. Second, RA models are discussed. Then, we present a classification to analyze current RA algorithms for the existing works. Finally, some challenging issues and future research trends are discussed. Accordingly, we provide two potential structures for 6G communications to solve the RA problems of next-generation HetNets, namely a learning-based RA structure and a control-based RA structure. The goal of this article is to provide important information on HetNets, which could be used to guide the development of more efficient techniques in this research area.

321 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a survey-style introduction to HLWNets, starting with a framework of system design in the aspects of network architectures, cell deployments, multiple access and modulation schemes, illumination requirements and backhaul.
Abstract: In order to tackle the rapidly growing number of mobile devices and their expanding demands for Internet services, network convergence is envisaged to integrate different technology domains. For indoor wireless communications, one promising approach is to coordinate light fidelity (LiFi) and wireless fidelity (WiFi), namely hybrid LiFi and WiFi networks (HLWNets). This hybrid network combines the high-speed data transmission of LiFi and the ubiquitous coverage of WiFi. In this article, we present a survey-style introduction to HLWNets, starting with a framework of system design in the aspects of network architectures, cell deployments, multiple access and modulation schemes, illumination requirements and backhaul. Key performance metrics and recent achievements are then reviewed to demonstrate the superiority of HLWNets against stand-alone networks. Further, the unique challenges facing HLWNets are elaborated on key research topics including user behavior modeling, interference management, handover and load balancing. Moreover, the potential of HLWNets in the application areas is presented, exemplified by indoor positioning and physical layer security. Finally, the challenges and future research directions are discussed.

116 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an in-depth survey of state-of-the-art non-orthogonal multiple access (NOMA) variants having power and code domains as the backbone for interference mitigation, resource allocations, and QoS management in the 5G environment.
Abstract: Over the last few years, interference has been a major hurdle for successfully implementing various end-user applications in the fifth generation (5G) of wireless networks. During this era, several communication protocols and standards have been developed and used by the community. However, interference persists, hindering the provision of a given quality of service (QoS) to end-users of different 5G applications. To mitigate the issues mentioned above, in this paper, we present an in-depth survey of state-of-the-art non-orthogonal multiple access (NOMA) variants having power and code domains as the backbone for interference mitigation, resource allocation, and QoS management in the 5G environment. These variants target future smart communications and are supported by device-to-device (D2D) communication, cooperative communication (CC), multiple-input and multiple-output (MIMO), and heterogeneous networks (HetNets). From the existing literature, it has been observed that NOMA can resolve most of the issues in the existing proposals to provide contention-based grant-free transmissions between different devices. The key differences between orthogonal multiple access (OMA) and NOMA in 5G are also discussed in detail. Moreover, several open issues and research challenges of NOMA-based applications are analyzed. Finally, a comparative analysis of different existing proposals is also discussed to provide deep insights to the readers.
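The power-domain NOMA principle surveyed above can be sketched numerically: the base station superposes two users' signals, the far user decodes its signal directly while treating the other as interference, and the near user removes the far user's signal by successive interference cancellation (SIC) before decoding. The power split, channel gains, and noise level below are illustrative assumptions, not values from the paper.

```python
import math

def noma_rates(p_total, a_far, g_far, g_near, noise):
    """Downlink power-domain NOMA rates (bits/s/Hz) for a two-user pair.

    The far (weak) user gets power fraction a_far and decodes its own
    signal treating the near user's signal as interference; the near
    (strong) user first removes the far user's signal via successive
    interference cancellation (SIC), then decodes interference-free.
    """
    a_near = 1.0 - a_far
    sinr_far = (a_far * p_total * g_far) / (a_near * p_total * g_far + noise)
    sinr_near = (a_near * p_total * g_near) / noise  # after perfect SIC
    return math.log2(1 + sinr_far), math.log2(1 + sinr_near)

# More power goes to the far user, who has the weaker channel gain.
r_far, r_near = noma_rates(p_total=1.0, a_far=0.8, g_far=0.1, g_near=1.0, noise=0.01)
```

Both users obtain a positive rate on the same resource block, which is the spectral-efficiency argument for NOMA over OMA.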

88 citations


Journal ArticleDOI
TL;DR: Comprehensive performance evaluation and comparisons show that RUSH outperforms other schemes in both computation and communication efficiencies, and formal security proofs indicate that RUSH resists various attacks.
Abstract: The evolving fifth generation (5G) cellular networks will be a collection of heterogeneous and backward-compatible networks. With the increased heterogeneity and densification of 5G heterogeneous networks (HetNets), it is important to ensure security and efficiency of frequent handovers in 5G wireless roaming environments. However, existing handover authentication mechanisms still have challenging issues, such as anonymity, robust traceability and universality. In this paper, we address these issues by introducing RUSH, a Robust and Universal Seamless Handover authentication protocol for 5G HetNets. In RUSH, anonymous mutual authentication with key agreement is enabled for handovers by exploiting the trapdoor collision property of chameleon hash functions and the tamper-resistance of blockchains. RUSH achieves universal handover authentication for all the diverse mobility scenarios, as exemplified by the handover between 5G new radio and non-3GPP access regardless of the trustworthiness of non-3GPP access and the consistency of the core network. RUSH also achieves perfect forward secrecy, master key forward secrecy, known randomness secrecy, key escrow freeness and robust traceability. Our formal security proofs based on the BAN-logic and formal verification based on AVISPA indicate that RUSH resists various attacks. Comprehensive performance evaluation and comparisons show that RUSH outperforms other schemes in both computation and communication efficiencies.
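The trapdoor collision property of chameleon hash functions that RUSH exploits can be illustrated with the textbook discrete-log construction (a generic sketch with insecure toy parameters; the paper's concrete instantiation is not reproduced here): anyone can compute the hash, but only the trapdoor holder can find a second message hashing to the same value.

```python
# Toy discrete-log chameleon hash (tiny, insecure parameters for illustration).
p, q, g = 23, 11, 4          # g generates the order-q subgroup of Z_p*
x = 7                        # trapdoor (secret key)
h = pow(g, x, p)             # public key

def cham_hash(m, r):
    """H(m, r) = g^m * h^r mod p; collision-resistant without the trapdoor."""
    return (pow(g, m, p) * pow(h, r, p)) % p

def collide(m, r, m_new):
    """With trapdoor x, find r_new so that H(m_new, r_new) == H(m, r):
    solve m + x*r = m_new + x*r_new (mod q) for r_new."""
    return (r + (m - m_new) * pow(x, -1, q)) % q

m, r = 5, 3
m_new = 9
r_new = collide(m, r, m_new)
```

In a handover, this lets the trapdoor holder open a previously committed hash to a fresh message, enabling anonymous yet verifiable authentication exchanges.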

81 citations


Journal ArticleDOI
TL;DR: In this paper, a multi-objective restricted Boltzmann machine (RBM) model is designed for training to enhance the model's robustness; a strategy pool is introduced to improve the effect of data fusion, and the non-dominated sorting genetic algorithm II (NSGA-II) is used to deal with imbalanced malware families.
Abstract: The fifth generation (5G) mobile communication technology brings people a higher perceived-rate experience, high-quality service for high-density user connections, and other commercial applications. As an important means of data processing in 5G heterogeneous networks (HetNets), data fusion technology faces a large number of malicious code attacks. Thus, it is particularly important to find an efficient malicious code detection method. However, traditional research suffers greater loss and lower detection accuracy due to dataset imbalance, the complexity of deep learning network models, the use of single-objective algorithms, and other factors. Therefore, how to choose a suitable network model and improve the data classification accuracy in HetNets is a big challenge. To enhance the model's robustness, a multi-objective restricted Boltzmann machine (RBM) model is designed for training. In this article, evaluation indices are used to comprehensively measure the effect of data classification, a strategy pool is introduced to improve the effect of data fusion, and the non-dominated sorting genetic algorithm II (NSGA-II) is used to deal with imbalanced malware families. Experimental results demonstrate that the proposed multi-objective RBM model combined with NSGA-II can effectively enhance the data classification accuracy of HetNets and reduce the loss in the process of data fusion.
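The non-dominated sorting step at the core of NSGA-II can be sketched as follows; the two minimization objectives (e.g. classification loss and model complexity) and the sample points are illustrative assumptions, not the paper's data.

```python
def non_dominated_sort(points):
    """Fast non-dominated sorting (the core of NSGA-II).

    points: list of objective tuples, all to be minimized.
    Returns a list of fronts, each a list of indices; front 0 is the
    Pareto-optimal set.
    """
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    n = len(points)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    dom_count = [0] * n                     # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                dominated_by[i].append(j)
            elif dominates(points[j], points[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

# Hypothetical (loss, complexity) pairs for five candidate models.
fronts = non_dominated_sort([(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)])
```

Front 0 contains the models no other model beats on both objectives; NSGA-II then selects and breeds by front rank.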

75 citations


Journal ArticleDOI
TL;DR: This paper argues that it is crucial to construct a Collaboration Trust Interconnections System (CTIS) to provide ubiquitous SAGS network accessibility and security, and proposes a greedy-based winner recruitment strategy to achieve intelligent information control with maximum credibility and minimum cost.
Abstract: The heterogeneous networks which collaborate among Space, Air, Ground, and Sea (SAGS) networks significantly promote the development of the Internet of Things (IoT). Billions of IoT devices in SAGS networks generate massive data to support various applications. We argue that it is crucial to construct a Collaboration Trust Interconnections System (CTIS) to provide ubiquitous SAGS network accessibility and security. In this paper, a CTIS framework among Unmanned Aerial Vehicles (UAVs), Mobile Vehicles (MVs), and IoT devices is proposed to evaluate trust and select low-cost and high-trust participants to improve data quality. In this framework, MVs record the interaction information to form the verification chains, while the UAV is dispatched to collect baseline data to verify the data reported by MVs, thereby constructing global trust. Then, the hash values of baseline data are delivered to MVs to act as calibration baseline data, which provides a verification certificate for interactions among MVs and constructs local interaction trust. Finally, a greedy-based winner recruitment strategy is proposed to achieve intelligent information control with maximum credibility and minimum cost. The simulation results show that, compared with previous schemes, the CTIS framework reduces the cost by 5.62% and reduces the false ratio and packet dropping rate by at least 17.16% and 31.51%, respectively.
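The greedy winner recruitment idea can be sketched as a budgeted selection by trust-per-cost ratio; the exact objective and the candidate values below are assumptions for illustration, not the paper's formulation.

```python
def recruit_winners(candidates, budget):
    """Greedy winner recruitment sketch: pick participants in decreasing
    order of trust-per-cost ratio until the budget is exhausted.

    candidates: list of (name, trust, cost) tuples.
    Returns (chosen names, total trust, total cost).
    """
    chosen, spent, trust = [], 0.0, 0.0
    for name, t, c in sorted(candidates, key=lambda x: x[1] / x[2], reverse=True):
        if spent + c <= budget:
            chosen.append(name)
            spent += c
            trust += t
    return chosen, trust, spent

# Hypothetical mobile vehicles with (trust, cost) values.
candidates = [("mv1", 0.9, 3.0), ("mv2", 0.6, 1.0), ("mv3", 0.8, 4.0), ("mv4", 0.3, 0.5)]
winners, total_trust, cost = recruit_winners(candidates, budget=4.0)
```

The ratio rule is the standard greedy heuristic for this kind of budgeted coverage problem; it trades optimality for linear-time selection.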

68 citations


Journal ArticleDOI
TL;DR: The main objective is to improve the quality of service over a heterogeneous network by reinforcement learning-based multimedia data segregation (RLMDS) algorithm and Computing QoS in Medical Information system using Fuzzy (CQMISF) algorithm in fog computing.
Abstract: Fog computing is an emerging trend in the healthcare sector for the care of patients in emergencies. Fog computing provides better results in healthcare by improving the quality of services in the heterogeneous network. The transmission of critical multimedia healthcare data is required in real time for saving the lives of patients using better-quality networks. The main objective is to improve the quality of service over a heterogeneous network by the reinforcement learning-based multimedia data segregation (RLMDS) algorithm and the Computing QoS in Medical Information system using Fuzzy (CQMISF) algorithm in fog computing. The proposed algorithms work in three phases: classification of healthcare data, selection of optimal gateways for data transmission, and improvement of transmission quality considering parameters such as throughput, end-to-end delay, and jitter. The proposed algorithms classify the healthcare data and transfer the classified high-risk data to the end-user by selecting the optimal gateway. For performance validation, extensive simulations were conducted in MATLAB R2018b on parameters including throughput, end-to-end delay, and jitter. The performance of the proposed work is compared with the FLQoS and AQCA algorithms. The proposed CQMISF algorithm achieves 81.7% overall accuracy; compared to the FLQoS and AQCA algorithms, the proposed algorithms achieve significant improvements of 6.195% and 2.01%, respectively.
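The gateway-selection step can be sketched with a simple normalized QoS score over throughput, end-to-end delay, and jitter; the metric ranges, equal weighting, and gateway values below are illustrative assumptions rather than the paper's fuzzy rule base.

```python
def gateway_score(throughput, delay, jitter, ranges):
    """Normalized QoS score in [0, 1]: scale each metric to [0, 1]
    (higher is better) and average. Metric ranges are assumed bounds."""
    t_lo, t_hi = ranges["throughput"]
    d_lo, d_hi = ranges["delay"]
    j_lo, j_hi = ranges["jitter"]
    t = (throughput - t_lo) / (t_hi - t_lo)        # more is better
    d = 1 - (delay - d_lo) / (d_hi - d_lo)         # less is better
    j = 1 - (jitter - j_lo) / (j_hi - j_lo)        # less is better
    return (t + d + j) / 3

ranges = {"throughput": (0, 100), "delay": (0, 200), "jitter": (0, 50)}
# Hypothetical gateways: (throughput Mbps, delay ms, jitter ms).
gateways = {"gw1": (80, 40, 10), "gw2": (70, 20, 5), "gw3": (95, 150, 40)}
best = max(gateways, key=lambda g: gateway_score(*gateways[g], ranges))
```

High-risk data would then be routed through the top-scoring gateway; a fuzzy system refines this by replacing the linear normalization with membership functions.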

61 citations


Journal ArticleDOI
TL;DR: In this paper, the main issues and constraints of resource allocation, signaling, practical implementation and security aspects of NOMA and its integration with 5G and upcoming wireless technologies are highlighted.

60 citations


Journal ArticleDOI
TL;DR: A novel semi-supervised autoencoder method to integrate multiple networks and generate a low-dimensional feature representation and a convolutional neural network based on the integrated feature embedding to annotate unlabeled gene functions are designed.
Abstract: Motivation: The emergence of abundant biological networks, which benefit from the development of advanced high-throughput techniques, contributes to describing and modeling complex internal interactions among biological entities such as genes and proteins. Multiple networks provide rich information for inferring the function of genes or proteins. To extract functional patterns of genes based on multiple heterogeneous networks, network embedding-based methods, aiming to capture non-linear and low-dimensional feature representation based on network biology, have recently achieved remarkable performance in gene function prediction. However, existing methods do not consider the shared information among different networks during the feature learning process. Results: Taking the correlation among the networks into account, we design a novel semi-supervised autoencoder method to integrate multiple networks and generate a low-dimensional feature representation. Then we utilize a convolutional neural network based on the integrated feature embedding to annotate unlabeled gene functions. We test our method on both yeast and human datasets and compare with three state-of-the-art methods. The results demonstrate the superior performance of our method. We not only provide a comprehensive analysis of the performance of the newly proposed algorithm but also provide a tool for extracting features of genes based on multiple networks, which can be used in the downstream machine learning task. Availability: DeepMNE-CNN is freely available at https://github.com/xuehansheng/DeepMNE-CNN. Contact: jiajiepeng@nwpu.edu.cn; shang@nwpu.edu.cn; jianye.hao@tju.edu.cn.

60 citations


Journal ArticleDOI
TL;DR: In this paper, Wang et al. design inter- and intra-domain feature extraction modules by applying graph convolution operations to the networks to learn the embeddings of drugs and diseases, instead of simply integrating the three networks into a heterogeneous network.
Abstract: In silico reuse of old drugs (also known as drug repositioning) to treat common and rare diseases is increasingly becoming an attractive proposition because it involves the use of de-risked drugs, with potentially lower overall development costs and shorter development timelines. Therefore, there is a pressing need for computational drug repurposing methodologies to facilitate drug discovery. In this study, we propose a new method, called DRHGCN (Drug Repositioning based on the Heterogeneous information fusion Graph Convolutional Network), to discover potential drugs for a certain disease. To make full use of different topology information in different domains (i.e. drug-drug similarity, disease-disease similarity and drug-disease association networks), we first design inter- and intra-domain feature extraction modules by applying graph convolution operations to the networks to learn the embedding of drugs and diseases, instead of simply integrating the three networks into a heterogeneous network. Afterwards, we parallelly fuse the inter- and intra-domain embeddings to obtain the more representative embeddings of drug and disease. Lastly, we introduce a layer attention mechanism to combine embeddings from multiple graph convolution layers for further improving the prediction performance. We find that DRHGCN achieves high performance (the average AUROC is 0.934 and the average AUPR is 0.539) in four benchmark datasets, outperforming the current approaches. Importantly, we conducted molecular docking experiments on DRHGCN-predicted candidate drugs, providing several novel approved drugs for Alzheimer's disease (e.g. benzatropine) and Parkinson's disease (e.g. trihexyphenidyl and haloperidol).
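The layer attention mechanism that combines embeddings from multiple graph convolution layers can be sketched as a softmax-weighted sum; in DRHGCN the attention scores are learned, whereas the scores and embeddings below are illustrative.

```python
import math

def layer_attention(layer_embeddings, scores):
    """Combine per-layer embeddings with softmax attention weights.

    layer_embeddings: one embedding vector per graph convolution layer.
    scores: one scalar attention score per layer (learned in DRHGCN;
    arbitrary here).
    """
    exp = [math.exp(s) for s in scores]
    z = sum(exp)
    weights = [e / z for e in exp]
    dim = len(layer_embeddings[0])
    combined = [sum(w * emb[d] for w, emb in zip(weights, layer_embeddings))
                for d in range(dim)]
    return combined, weights

# Hypothetical 2-D embeddings of one node from three GCN layers.
layers = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
combined, weights = layer_attention(layers, scores=[0.0, 0.0, 0.0])
```

With equal scores the result is a plain average; training skews the weights toward the layers whose receptive field best predicts drug-disease associations.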

Journal ArticleDOI
TL;DR: This paper shows how the optimal bias factors in terms of SECP can be changed according to the numbers of user types and tiers of MEC servers, and how they are different to the conventional ones that did not consider the computing capabilities and task sizes.
Abstract: The mobile edge computing (MEC) has been introduced for providing computing capabilities at the edge of networks to improve the latency performance of wireless networks. In this paper, we provide the novel framework for MEC-enabled heterogeneous networks (HetNets), composed of the multi-tier networks with access points (APs) (i.e., MEC servers), which have different transmission power and different computing capabilities. In this framework, we also consider multiple-type mobile users with different sizes of computation tasks, and they offload the tasks to a MEC server, and receive the computation resulting data from the server. We derive the successful edge computing probability (SECP), defined as the probability that a user offloads and finishes its computation task at the MEC server within the target latency. We provide a closed-form expression of the approximated SECP for general case, and closed-form expressions of the exact SECP for special cases. This paper then provides the design insights for the optimal configuration of MEC-enabled HetNets by analyzing the effects of network parameters and bias factors, used in MEC server association, on the SECP. Specifically, it shows how the optimal bias factors in terms of SECP can be changed according to the numbers of user types and tiers of MEC servers, and how they are different to the conventional ones that did not consider the computing capabilities and task sizes.
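The role of bias factors in MEC server association can be sketched as biased received-power association: a user joins the AP maximizing bias × transmit power × channel gain, so a larger bias can steer users toward small cells with spare computing capacity. All numbers below are illustrative assumptions.

```python
def associate(user_gains, tiers):
    """Biased association sketch for MEC-enabled HetNets: the user joins
    the AP that maximizes bias * tx_power * channel_gain."""
    best_ap, best_metric = None, -1.0
    for ap, gain in user_gains.items():
        tier = tiers[ap]
        metric = tier["bias"] * tier["power"] * gain
        if metric > best_metric:
            best_ap, best_metric = ap, metric
    return best_ap

# Two tiers: a high-power macro AP and a low-power small-cell MEC server
# whose bias is inflated to attract offloaded tasks.
tiers = {"macro": {"power": 40.0, "bias": 1.0},
         "small": {"power": 1.0, "bias": 8.0}}
chosen = associate({"macro": 0.01, "small": 0.1}, tiers)
```

With bias 1 everywhere the macro AP would win on raw received power; the inflated small-cell bias flips the decision, which is exactly the lever the paper tunes against the SECP.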

Journal ArticleDOI
TL;DR: This work proposes a blockchain-empowered AAA scheme for accessing the data of an LS-HetNet, where the blockchain account address is used for identity authentication, and the data access control permissions are redesigned and stored on the blockchain.

Journal ArticleDOI
TL;DR: An algorithm named fuzzy attribute-based joint integrated scheduling and tree formation (FAJIT) technique for tree formation and parent node selection using fuzzy logic in a heterogeneous network is proposed and is compared with the distributed algorithm for Integrated tree Construction and data Aggregation (DICA).
Abstract: Wireless sensor network (WSN) is used to sense the environment, collect the data, and further transmit it to the base station (BS) for analysis. A synchronized tree-based approach is an efficient approach to aggregate data from various sensor nodes in a WSN environment. However, achieving energy efficiency in such a tree formation is challenging. In this research work, an algorithm named fuzzy attribute-based joint integrated scheduling and tree formation (FAJIT) technique for tree formation and parent node selection using fuzzy logic in a heterogeneous network is proposed. FAJIT mainly focuses on addressing the parent node selection problem in the heterogeneous network for aggregating different types of data packets to improve energy efficiency. The selection of parent nodes is performed based on the candidate nodes with the minimum number of dynamic neighbors. Fuzzy logic is applied in the case of an equal number of dynamic neighbors. In the proposed technique, fuzzy logic is first applied to WSN, and then min–max normalization is used to retrieve normalized weights (membership values) for the given edges of the graph. This membership value is used to denote the degree to which an element belongs to a set. Therefore, the node with the minimum sum of all weights is considered as the parent node. The result of FAJIT is compared with the distributed algorithm for Integrated tree Construction and data Aggregation (DICA) on various parameters: average schedule length, energy consumption data interval, the total number of transmission slots, control overhead, and energy consumption in the control phase. The results demonstrate that the proposed algorithm is better in terms of energy efficiency.
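The parent-selection rule described above can be sketched as follows: prefer the candidate with the fewest dynamic neighbors, and on a tie apply min-max normalization to the edge weights and pick the node with the smallest normalized sum. The candidate data are illustrative; FAJIT's full fuzzy membership machinery is not reproduced.

```python
def select_parent(candidates):
    """Sketch of FAJIT-style parent selection: prefer the candidate with
    the fewest dynamic neighbors; break ties by min-max normalizing the
    edge weights and choosing the node with the smallest normalized sum."""
    fewest = min(c["dynamic_neighbors"] for c in candidates)
    tied = [c for c in candidates if c["dynamic_neighbors"] == fewest]
    if len(tied) == 1:
        return tied[0]["id"]
    # Min-max normalize all edge weights across the tied candidates.
    weights = [w for c in tied for w in c["edge_weights"]]
    lo, hi = min(weights), max(weights)
    if hi == lo:                  # all weights equal: any tied node works
        return tied[0]["id"]
    def norm_sum(c):
        return sum((w - lo) / (hi - lo) for w in c["edge_weights"])
    return min(tied, key=norm_sum)["id"]

# Hypothetical candidate parents with dynamic-neighbor counts and edge weights.
candidates = [
    {"id": "n1", "dynamic_neighbors": 2, "edge_weights": [4.0, 6.0]},
    {"id": "n2", "dynamic_neighbors": 2, "edge_weights": [2.0, 5.0]},
    {"id": "n3", "dynamic_neighbors": 3, "edge_weights": [1.0, 1.0]},
]
parent = select_parent(candidates)
```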

Journal ArticleDOI
TL;DR: In this article, a combination of deep learning, modulation recognition, and beamforming is introduced to address the security of the 5G heterogeneous network; the proposed unsupervised deep-learning-based beamforming algorithm effectively reduces computational complexity under different numbers of transmitting antennas.
Abstract: With increasingly complex network structures, the requirements on heterogeneous 5G are also growing. The aim of this study is to meet network security requirements under the existing high-capacity, highly reliable transmission. In this context, deep learning technology is adopted to solve the security problem of the 5G heterogeneous network. First, the security problems existing in 5G heterogeneous networks are presented, mainly from two aspects: physical layer security and the application prospects of deep learning in communication technology. Then the combination of deep learning and 5G heterogeneous networks is analyzed, and modulation recognition and beamforming based on deep learning are introduced. Finally, the challenges of solving security problems in 5G heterogeneous networks with deep learning are explored. The results show that the deep learning model solves the modulation recognition problem well: the convolutional neural network correctly identifies the modulation signals involved in the experiment, so deep learning has a clear advantage in modulation recognition. In addition, compared with traditional algorithms, the proposed unsupervised beamforming algorithm based on deep learning can effectively reduce computational complexity under different numbers of transmitting antennas. The present work therefore provides a good starting point for solving the security problem of 5G heterogeneous networks.

Journal ArticleDOI
TL;DR: This protocol employs the Media Access Control protocol for the synchronization of high-speed wireless communication networks in the Terahertz (THz) band and performs multiple machine access and collision control to improve resource utilization and provide low-latency services to users.
Abstract: Cloud computing is an important technology to offer consumer appliances a wide pool of elastic resources. The heterogeneous network faces collisions during communication, which reduce overall network performance. Future cloud-edge networks will deal with vast numbers of clients and servers, such as the Internet of Things (IoT) and 6G networks, which require flexible solutions. To this end, the Multiple Machine Access Learning with Collision Carrier Avoidance (MMALCCA) protocol is proposed in the 6G Internet of Things environment to create an effective communication process. This protocol employs the Media Access Control (MAC) protocol for the synchronization of high-speed wireless communication networks in the Terahertz (THz) band. MMALCCA performs multiple machine access and collision control to improve resource utilization and provide low-latency services to users. The protocol's decisions are made using the output of a classification and regression learning method to improve the efficiency of MAC synchronization. The performance of the proposed protocol is verified using the metrics of latency, collision probability, service failure, and resource utilization while varying channels and user equipment density.
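For context, the classic collision-avoidance rule used by CSMA-style MAC protocols is binary exponential backoff; the sketch below shows that generic mechanism (MMALCCA's learning-based controller replaces such fixed rules, so this is background, not the paper's algorithm).

```python
import random

def backoff_slots(collisions, cw_min=8, cw_max=256, rng=None):
    """Binary exponential backoff: double the contention window on each
    successive collision (capped at cw_max), then wait a uniformly random
    number of slots before retransmitting."""
    rng = rng or random.Random(0)  # seeded for reproducibility in this sketch
    cw = min(cw_min * (2 ** collisions), cw_max)
    return rng.randrange(cw)
```

A learning-based MAC can instead predict contention from observed traffic and pick the window directly, avoiding the latency cost of repeated doubling.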

Journal ArticleDOI
TL;DR: This study observed that while numerous works focused on cellular technologies to enable connectivity for aerial platforms, a single wireless technology is not sufficient to meet the stringent connectivity demands of the aerial use cases, especially for the piloting operations.
Abstract: Electrification turned over a new leaf in aviation by introducing new types of aerial vehicles along with new means of transportation. Addressing a plethora of use cases, drones are gaining attention in the industry and increasingly appear in the sky. Emerging concepts of flying taxis enable passengers to be transported over several tens of kilometers. Therefore, unmanned traffic management systems are under development to cope with the complexity of future airspace, resulting in unprecedented communication needs. Moreover, the long-term increase in the number of commercial airplanes pushes the limits of voice-oriented communications, and future options such as single-pilot operations demand robust connectivity. In this survey, we provide a comprehensive review and vision for enabling the connectivity applications of aerial vehicles utilizing current and future communication technologies. We begin by categorizing the connectivity use cases per aerial vehicle and analyzing their connectivity requirements. By reviewing more than 500 related studies, we aim for a comprehensive approach to cover wireless communication technologies, and provide an overview of recent findings from the literature toward the possibilities and challenges of employing the wireless communication standards. After analyzing the proposed network architectures, we list the open-source testbed platforms to facilitate future investigations by researchers. This study helped us observe that while numerous works focused on cellular technologies to enable connectivity for aerial platforms, a single wireless technology is not sufficient to meet the stringent connectivity demands of the aerial use cases, especially for piloting operations. We identified the need for further investigation of multi-technology heterogeneous network architectures to enable robust and real-time connectivity in the sky.
Future works should consider suitable technology combinations to develop unified aerial networks that can meet the diverse quality of service demands of the aerial use cases.

Journal ArticleDOI
TL;DR: This work employs deep deterministic policy gradient (DDPG), a model-free deep reinforcement learning algorithm, to solve the MDP problem in continuous state and action space and results show the convergence property of proposed algorithm and the effectiveness in improving the energy efficiency in a D2D-enabled heterogeneous network.
Abstract: Improving energy efficiency has become increasingly important in designing future cellular systems. In this work, we consider the issue of energy efficiency in D2D-enabled heterogeneous cellular networks. Specifically, communication mode selection and resource allocation are jointly considered with the aim of maximizing energy efficiency in the long term. A Markov decision process (MDP) problem is formulated, where each user can switch between traditional cellular mode and D2D mode dynamically. We employ deep deterministic policy gradient (DDPG), a model-free deep reinforcement learning algorithm, to solve the MDP problem in continuous state and action space. The architecture of the proposed method consists of one actor network and one critic network. The actor network uses a deterministic policy gradient scheme to generate deterministic actions for the agent directly, and the critic network employs value-function-based Q networks to evaluate the performance of the actor network. Simulation results show the convergence property of the proposed algorithm and its effectiveness in improving energy efficiency in a D2D-enabled heterogeneous network.
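The mode-selection decision can be illustrated with a myopic energy-efficiency comparison (bits per joule) between the cellular and D2D links; the DDPG agent learns a long-term policy instead, and all link parameters below are illustrative assumptions.

```python
import math

def choose_mode(gain_cell, gain_d2d, p_cell, p_d2d, bw, noise):
    """Pick cellular vs D2D mode by instantaneous energy efficiency
    (Shannon rate divided by transmit power) - a myopic baseline for
    the long-term decision the DDPG agent learns."""
    r_cell = bw * math.log2(1 + p_cell * gain_cell / noise)
    r_d2d = bw * math.log2(1 + p_d2d * gain_d2d / noise)
    ee_cell = r_cell / p_cell
    ee_d2d = r_d2d / p_d2d
    return ("d2d" if ee_d2d > ee_cell else "cellular"), max(ee_cell, ee_d2d)

# Nearby device pair: the direct D2D link is strong and needs little power.
mode, ee = choose_mode(gain_cell=0.01, gain_d2d=0.5,
                       p_cell=1.0, p_d2d=0.1, bw=1e6, noise=1e-3)
```

The proximity gain is why D2D mode wins here: a comparable rate at a tenth of the power yields an order-of-magnitude better energy efficiency.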

Journal ArticleDOI
Yu Xie, Bin Yu, Shengze Lv, Chen Zhang, Guodong Wang, Maoguo Gong
TL;DR: A taxonomy of heterogeneous network representation learning algorithms according to different approaches of capturing semantic information in heterogeneous networks, including path based algorithms and semantic unit based algorithms is proposed.

Journal ArticleDOI
TL;DR: In this article, a review article scrutinizes the issues of interferences observed and studied in different structures and techniques of the 5G and beyond network, focusing on the various interference effect in HetNet, RN, D2D, and IoT.
Abstract: In the modern technological world, wireless communication has taken a massive leap from the conventional communication system to a new radio communication network. The novel concept of Fifth Generation (5G) cellular networks brings a combination of a diversified set of devices and machines with great improvement in a unique way compared to previous technologies. To broaden the user’s experience, 5G technology provides the opportunity to meet people’s needs for efficient communication. Specifically, researchers have designed a network of small cells with unfamiliar technologies that have never been introduced before. The new network design is an amalgamation of various schemes such as Heterogeneous Network (HetNet), Device-to-Device (D2D) communication, Internet of Things (IoT), Relay Node (RN), Beamforming, Massive Multiple Input Multiple Output (M-MIMO), millimeter-wave (mm-wave), and so on. Also, enhancements to predecessors’ techniques are required so that the new radio is compatible with traditional networks. However, the design and concurrent operation of these disparate technological models have created unacceptable interference in each other’s signals. This vulnerable interference has significantly degraded overall network performance. This review article scrutinizes the issues of interference observed and studied in different structures and techniques of the 5G and beyond network. The study focuses on the various interference effects in HetNet, RN, D2D, and IoT. Furthermore, as an in-depth literature review, we discuss various types of interference related to each method by studying the state-of-the-art relevant research in the literature. To provide new insight into interference issue management for the next-generation network, we address and explore various relevant topics in each section that help make the system more robust.
Overall, this review article's goal is to guide all stakeholders, including students, operators, engineers, and researchers, who aim to explore this promising research theme, comprehend interference and its types, and learn the related mitigation techniques. We also present methodologies proposed by the $3^{\mathrm {rd}}$ Generation Partnership Project (3GPP) and outline promising and feasible research directions toward this challenging topic for the realization of 5G and beyond networks.
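As a back-of-the-envelope illustration of why the interference surveyed above matters, the snippet below computes the signal-to-interference-plus-noise ratio (SINR) for a macro-cell user before and after a co-channel small cell starts transmitting; all power values are hypothetical and not taken from the article.

```python
import math

def sinr_db(signal_mw, interferers_mw, noise_mw=1e-9):
    """SINR in dB: received signal power over interference plus noise."""
    return 10 * math.log10(signal_mw / (sum(interferers_mw) + noise_mw))

# Hypothetical two-tier HetNet received powers (mW).
clean = sinr_db(1e-6, [])          # no cross-tier interference
jammed = sinr_db(1e-6, [5e-7])     # one co-channel small cell active
```

With these toy numbers the SINR drops from about 30 dB to about 3 dB, which is the kind of degradation the surveyed mitigation techniques aim to avoid.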

Journal ArticleDOI
TL;DR: In an “edge-cloud” heterogeneous network environment, a mapping scheme between application modules and basic resource equipment is created, considering the two factors of tolerable task latency and system power consumption, and a heuristic dynamic task-processing algorithm is used to reduce task latency.
Abstract: In today’s era of the Internet of Things (IoT), efficient, real-time processing of the massive data generated by IoT devices has become the primary issue for traditional cloud computing network architectures. As a supplement to cloud computing, edge computing improves the real-time performance of service completion by offloading services to edge servers closer to the terminal device for execution, while reducing power consumption and computing load in the cloud. In this article, we propose the following solution to handle the different requests of IoT devices: in an “edge-cloud” heterogeneous network environment, create a mapping scheme between application modules and basic resource equipment, considering the two factors of tolerable task latency and system power consumption. During the step-by-step execution of an application, a heuristic dynamic task-processing algorithm is used to reduce task latency. Experiments with the “iFogSim” simulator show that application service quality is significantly improved and system power consumption is greatly reduced compared with a stable application-module placement strategy and a static task-scheduling strategy.
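A minimal sketch of the latency/power trade-off described above is given below; the device parameters, cost weights, and module names are invented for illustration and do not reproduce the paper's heuristic or the iFogSim models.

```python
# Toy greedy placement of application modules onto edge/cloud devices,
# trading off latency against power. Illustrative only.
devices = [
    {"name": "edge",  "latency_ms": 5.0,  "power_mw": 700.0, "cap": 2},
    {"name": "cloud", "latency_ms": 60.0, "power_mw": 300.0, "cap": 10},
]

def place(modules, devices, w_latency=0.7, w_power=0.3):
    """Assign each module to the device with the lowest weighted cost
    among those that still meet the module's tolerable latency."""
    plan = {}
    for m in modules:
        feasible = [d for d in devices
                    if d["cap"] > 0 and d["latency_ms"] <= m["max_latency_ms"]]
        best = min(feasible,
                   key=lambda d: w_latency * d["latency_ms"] + w_power * d["power_mw"])
        best["cap"] -= 1
        plan[m["name"]] = best["name"]
    return plan

modules = [{"name": "sense", "max_latency_ms": 10.0},
           {"name": "store", "max_latency_ms": 100.0}]
plan = place(modules, devices)
```

The latency-critical module lands on the nearby edge server, while the tolerant one goes to the power-cheaper cloud, mirroring the mapping idea the abstract describes.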

Journal ArticleDOI
Qi An, Liang Yu
TL;DR: The authors propose a Network EmbeDding framework in mulTiPlex networks (NEDTP) to predict drug-target interactions (DTIs) from biological data, which can reduce the time and economic cost of drug development.
Abstract: Accurate prediction of drug-target interactions (DTIs) from biological data can reduce the time and economic cost of drug development. Prediction methods for DTIs based on similarity networks are attracting increasing attention. Many studies have focused on predicting DTIs; however, such approaches do not consider the features of drugs and targets in multiple networks or how to extract and merge them. In this study, we propose a Network EmbeDding framework in mulTiPlex networks (NEDTP) to predict DTIs. NEDTP builds a similarity network of nodes based on 15 heterogeneous information networks. Next, we apply a random walk to extract the topology information of each node in the network and learn it as a low-dimensional vector. Finally, a Gradient Boosting Decision Tree model is constructed to complete the classification task. NEDTP achieves accurate results in DTI prediction, showing clear advantages over several state-of-the-art algorithms. The prediction of new DTIs is also verified from multiple perspectives. In addition, this study proposes a reasonable model for the widespread negative-sampling problem of DTI prediction, contributing new ideas to future research. Code and data are available at https://github.com/LiangYu-Xidian/NEDTP.
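The random-walk stage of a NEDTP-style pipeline can be sketched as follows; the toy graph and node names are hypothetical, and in such pipelines the walks typically feed an embedding model (such as skip-gram) whose low-dimensional vectors then train the Gradient Boosting Decision Tree classifier.

```python
import random

def random_walks(graph, walk_len=5, walks_per_node=2, seed=7):
    """Generate fixed-length random walks starting from every node of an
    adjacency-list graph; each step moves to a uniformly chosen neighbor."""
    rng = random.Random(seed)
    walks = []
    for start in graph:
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_len:
                nbrs = graph[walk[-1]]
                if not nbrs:
                    break
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

# Toy drug-target similarity graph (hypothetical node names).
g = {"drugA": ["tgt1"], "tgt1": ["drugA", "drugB"], "drugB": ["tgt1"]}
walks = random_walks(g)
```

Every consecutive pair in a walk is an edge of the graph, so the walk corpus captures the local topology that the embedding step later compresses into vectors.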

Journal ArticleDOI
TL;DR: A two-stage auction scheme named TARCO is proposed to maximize utilities for both sellers and buyers in the network, and two algorithms are proposed to further refine TARCO with respect to the social welfare of the network.
Abstract: In heterogeneous cellular networks, task scheduling for computation offloading is one of the biggest challenges. Most works focus on alleviating the heavy burden of macro base stations by moving the computation tasks of macro cell user equipment (MUE) to the remote cloud or to small cell base stations, but the selfishness of network users is seldom considered. Motivated by multiple-access mobile edge computing, this paper provides an incentive for task transfer from macro cell users to small cell base stations. The proposed incentive scheme utilizes small cell user equipments to provide relay services. The problem of computation offloading is modeled as a two-stage auction, in which remote MUEs with a common social character can form a group and then buy the computation resources of small cell base stations with the relaying of small cell user equipment. A two-stage auction scheme named TARCO is contributed to maximize utilities for both sellers and buyers in the network. The truthfulness, individual rationality, and budget balance properties of TARCO are also proved in this paper. In addition, two algorithms are proposed to further refine TARCO with respect to the social welfare of the network: one achieves higher utility for the MUEs, and the other obtains higher total social welfare. Extensive simulation results demonstrate that TARCO outperforms a random algorithm by 104.90 percent in terms of average utility of the MUEs, while the performance of TARCO is further improved by up to 28.75 percent and 17.06 percent by the two proposed algorithms, respectively.
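The truthfulness property proved for TARCO is in the spirit of classic second-price auctions; the sketch below is a textbook single-item Vickrey auction, not the paper's two-stage mechanism, and the bidder names and bids are made up.

```python
def vickrey(bids):
    """Single-item second-price auction: the highest bidder wins but pays
    the second-highest bid, making truthful bidding a dominant strategy."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

# Hypothetical MUE bids for a small-cell computation resource.
winner, price = vickrey({"mue1": 4.0, "mue2": 7.0, "mue3": 5.5})
```

Because the payment is decoupled from the winner's own bid, no bidder can gain by misreporting its valuation, which is the same incentive structure TARCO's truthfulness proof establishes in its richer two-stage setting.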

Journal ArticleDOI
TL;DR: In this article, the authors experimentally evaluate two distinct hybrid architectures applied to 5G New Radio (NR) fiber-wireless (FiWi) systems based on different optical fronthaul approaches; the first operates in non-standalone (NSA) mode, defined by the 3rd Generation Partnership Project (3GPP), for simultaneously transmitting 4G and 5G technologies through a single FiWi system.
Abstract: The fifth generation of mobile networks (5G) calls for a radio access network (RAN) update in order to support the enormous incoming wireless data traffic. In this context, we experimentally evaluate two distinct hybrid architectures applied to 5G New Radio (NR) FiWi systems based on different optical fronthaul approaches. The first architecture operates in non-standalone (NSA) mode, defined by the 3rd Generation Partnership Project (3GPP), for simultaneously transmitting 4G and 5G technologies through a single FiWi system. The three considered waveforms are as follows: a filtered orthogonal frequency division multiplexing (F-OFDM) signal at 778 MHz with 10 MHz bandwidth from our 5G flexible-waveform transceiver; a long-term evolution-advanced (LTE-A) signal with five 20 MHz sub-bands centralized at 2.24 GHz; and a 5G NR signal at 2.35 GHz with 100 MHz bandwidth. The second architecture, in turn, employs radio over fiber (RoF), free space optics (FSO), and wireless technologies converged into a heterogeneous network (HetNet). This additional multi-standard and multiband optical-wireless network is based on a 10-MHz bandwidth F-OFDM signal at 788 MHz, a 100-MHz bandwidth 5G NR signal at 3.5 GHz, and a 400-MHz bandwidth M-QAM signal at 26 GHz. Throughputs of up to 3 Gbps and 1.4 Gbps are demonstrated for RoF/FSO and RoF/FSO/Wireless transmission, respectively.

Journal ArticleDOI
TL;DR: This work introduces a comprehensive review of the main information-theoretic metrics used to measure secrecy performance in physical layer security, and provides a theoretical framework related to the most commonly used physical layer security techniques for improving secrecy performance.
Abstract: Physical layer security is a promising approach that can complement traditional encryption methods. The idea of physical layer security is to take advantage of the propagation medium’s features and impairments to ensure secure communication in the physical layer. This work introduces a comprehensive review of the main information-theoretic metrics used to measure secrecy performance in physical layer security. Furthermore, a theoretical framework related to the most commonly used physical layer security techniques for improving secrecy performance is provided. Finally, our work surveys physical layer security research over several enabling 5G technologies, such as massive multiple-input multiple-output, millimeter-wave communications, heterogeneous networks, non-orthogonal multiple access, and full-duplex, and includes the key concepts of each of these technologies. We also identify future fields of research and technical challenges of physical layer security.
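One of the central information-theoretic metrics such surveys review is the secrecy capacity of a Gaussian wiretap channel, which can be computed directly; the SNR values below are arbitrary examples.

```python
import math

def secrecy_capacity(snr_bob, snr_eve):
    """Secrecy capacity of a Gaussian wiretap channel (bits/s/Hz):
    C_s = [log2(1 + SNR_B) - log2(1 + SNR_E)]^+ ,
    i.e. zero whenever the eavesdropper's channel is at least as good
    as the legitimate receiver's."""
    return max(0.0, math.log2(1 + snr_bob) - math.log2(1 + snr_eve))

cs = secrecy_capacity(15.0, 3.0)    # Bob's channel stronger than Eve's
none = secrecy_capacity(3.0, 15.0)  # Eve's channel stronger: no secrecy
```

The positive-part operator captures the key intuition of physical layer security: a secrecy rate exists only when the legitimate link enjoys an advantage over the eavesdropper's link.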

Journal ArticleDOI
TL;DR: It is found that each follower oscillator converges to a bounded region around the leader by adopting either a continuous-time or a sampled-data protocol, and the upper bound of the region is derived for networked heterogeneous harmonic oscillators.
Abstract: This paper studies quasi-synchronization in networked heterogeneous harmonic oscillators. By introducing a leader, two distributed synchronization protocols are first proposed for heterogeneous networks by utilizing continuous real-time information and aperiodic sampled-data information. Then, the sufficient conditions on quasi-synchronization are established for heterogeneous networks coupled with nonidentical harmonic oscillators. It is found that each follower oscillator can converge to a bounded region of the leader by adopting either a continuous-time protocol or sampled-data protocol. The upper bound of the region is solved for networked heterogeneous harmonic oscillators. Finally, an electrical network is provided to illustrate the applicability of the theoretical results, and two examples are provided to illustrate the effectiveness of the sufficient criteria.
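The bounded-error behaviour described above can be reproduced numerically. The sketch below simulates one leader and one follower harmonic oscillator under a simple continuous-time coupling protocol; the gains, frequencies, and step sizes are chosen arbitrarily for illustration rather than taken from the paper's sufficient conditions.

```python
def simulate(w_leader=1.0, w_follower=1.2, k=10.0, dt=1e-3, steps=20000):
    """Leader: x0'' = -w0^2 x0. Follower: own dynamics plus coupling
    k*(x0-x1) + k*(v0-v1) toward the leader's position and velocity.
    Returns the largest position error after transients have decayed."""
    x0, v0 = 1.0, 0.0   # leader state
    x1, v1 = 0.0, 0.0   # follower state
    max_tail_err = 0.0
    for step in range(steps):
        a0 = -w_leader ** 2 * x0
        a1 = -w_follower ** 2 * x1 + k * (x0 - x1) + k * (v0 - v1)
        v0 += dt * a0; x0 += dt * v0   # semi-implicit Euler updates
        v1 += dt * a1; x1 += dt * v1
        if step > steps // 2:          # measure only the steady tail
            max_tail_err = max(max_tail_err, abs(x1 - x0))
    return max_tail_err

err = simulate()
```

Because the follower's natural frequency differs from the leader's, the tracking error does not vanish but settles into a small bounded region, which is precisely the quasi-synchronization notion studied in the paper.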

Journal ArticleDOI
TL;DR: This paper integrates user capability and the Software Defined Network (SDN) technique, and proposes a capability-based privacy-preserving handover authentication mechanism for SDN-based 5G HetNets that achieves mutual authentication and key agreement between User Equipments (UEs) and BSs while largely reducing the handover authentication cost.
Abstract: Ultra-dense heterogeneous network (HetNet) techniques can significantly improve wireless link quality, spectrum efficiency and system capacity, and satisfy differing coverage requirements in hotspots; they have therefore been viewed as one of the key technologies in fifth-generation (5G) networks. Due to the many different types of base stations (BSs) and the complexity of the network topology in 5G HetNets, this multi-tier 5G architecture faces many new security and mobility management challenges, including insecure access points and potentially frequent handovers among several different types of base stations. In this paper, we integrate user capability and the Software Defined Network (SDN) technique, and propose a capability-based privacy-preserving handover authentication mechanism for SDN-based 5G HetNets. Our proposed scheme achieves mutual authentication and key agreement between User Equipments (UEs) and BSs in 5G HetNets while largely reducing the handover authentication cost. We demonstrate that our proposed scheme provides robust security protection by employing several security analysis methods, including BAN logic and the formal verification tool Scyther. In addition, performance evaluation results show that our scheme outperforms other existing schemes.
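For intuition only, the snippet below sketches a generic HMAC-based mutual challenge-response with session-key derivation. It is a textbook construction, not the capability-based SDN handover protocol proposed in the paper, and the key and label names are hypothetical.

```python
import hashlib
import hmac
import os

PSK = b"pre-shared-key-between-ue-and-bs"  # hypothetical long-term key

def tag(key, *parts):
    """Keyed MAC over the concatenated protocol fields."""
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

def handshake(ue_key, bs_key):
    """Mutual challenge-response: each side proves knowledge of the
    long-term key over fresh nonces, then both derive a session key."""
    n_ue, n_bs = os.urandom(16), os.urandom(16)  # nonces sent in the clear
    bs_proof = tag(bs_key, n_ue, n_bs, b"bs")    # BS -> UE
    ue_accepts = hmac.compare_digest(bs_proof, tag(ue_key, n_ue, n_bs, b"bs"))
    ue_proof = tag(ue_key, n_ue, n_bs, b"ue")    # UE -> BS
    bs_accepts = hmac.compare_digest(ue_proof, tag(bs_key, n_ue, n_bs, b"ue"))
    session_key = tag(ue_key, n_ue, n_bs, b"session")
    return ue_accepts and bs_accepts, session_key

ok, _ = handshake(PSK, PSK)             # matching keys: both sides accept
bad, _ = handshake(PSK, b"wrong-key")   # impostor BS: UE rejects
```

The fresh nonces prevent replay and the two independent proofs give the mutual authentication property; the paper's scheme additionally binds user capabilities and SDN state into this exchange to cut handover cost.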

Journal ArticleDOI
TL;DR: In this article, the authors comprehensively survey moving networks and mobile cells, discuss the use-cases and value additions that moving networks may bring to future cellular architectures, and identify the challenges associated with them.
Abstract: The unprecedented increase in the demand for mobile data, fuelled by new emerging applications and use-cases such as high-definition video streaming and heightened online activity, has placed massive strain on existing cellular networks. As a solution, the fifth generation (5G) of cellular technology has been introduced to improve network performance through various innovative features such as millimeter-wave spectrum and Heterogeneous Networks (HetNets). In essence, HetNets include several small cells underlaid within a macro-cell to serve densely populated regions like stadiums, shopping malls, and so on. Recently, a mobile layer of HetNet has been under consideration by the research community and is often referred to as moving networks. Moving networks comprise mobile cells that are primarily introduced to improve Quality of Service (QoS) for commuting users inside public transport, where QoS is degraded by vehicular penetration losses and high Doppler shift. Furthermore, users inside fast-moving public transport also exert excessive load on the core network due to large group handovers. To this end, mobile cells will play a crucial role in reducing the overall handover count and will help alleviate these problems by decoupling in-vehicle users from the core network. This decoupling is achieved by introducing a separate in-vehicle access link and an out-of-vehicle backhaul link with the core network. Additionally, side-haul links will connect mobile cells with their neighbors. To date, remarkable research results have been achieved by the research community in addressing challenges linked to moving networks. However, to the best of our knowledge, a discussion of moving networks and mobile cells in a holistic way is still missing in the current literature. To fill this gap, in this article, we comprehensively survey moving networks and mobile cells. 
We cover the technological aspects of mobile cells and their role in futuristic applications. We also discuss the use-cases and value additions that moving networks may bring to future cellular architectures and identify the challenges associated with them. Based on the identified challenges, we discuss future research directions.

Journal ArticleDOI
TL;DR: This paper proposes an approach to detect malicious domain names by extracting and analyzing features using a deep neural network, utilizing a hierarchy of bidirectional recurrent neural networks (HBiRNN) to extract effective semantic features instead of traditional feature-engineering methods.
Abstract: The emergence of beyond-5G (B5G) mobile networks has provided us with a variety of services and enriched our lives. The B5G super-heterogeneous network systems and highly differentiated application scenarios require highly elastic and endogenous information security, including network trust, security, and privacy. However, security issues have also been raised that greatly threaten our information security and privacy. For example, malwares use domain generation algorithms (DGAs) to generate huge quantities of domain names and then lure users into accessing them in order to steal private information. In this paper, we propose an approach to detect malicious domain names by extracting and analyzing their features using a deep neural network. Unlike traditional algorithms, which are generally built on tedious feature engineering, our approach utilizes a hierarchy of bidirectional recurrent neural networks (HBiRNN) to extract effective semantic features. We use a discriminator based on HBiRNN (D-HBiRNN) to detect malicious websites. Experiments verify the validity of the algorithm, compare it with traditional feature-engineering-based algorithms, and demonstrate its superiority.
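A crude character-entropy baseline helps convey why DGA-generated names are detectable at all; this heuristic is far weaker than the paper's HBiRNN, and the example domains below are invented.

```python
import math
from collections import Counter

def char_entropy(domain):
    """Shannon entropy of the character distribution (bits/char) of the
    leftmost label; algorithmically generated names tend to score higher
    than human-chosen ones."""
    name = domain.split(".")[0].lower()
    counts = Counter(name)
    n = len(name)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

benign = char_entropy("google.com")          # repeated, pronounceable chars
dga = char_entropy("xj4kq9vz2wplf8.com")     # near-uniform character mix
```

Deep models like the HBiRNN go far beyond such a single statistic by learning semantic patterns over character sequences, but the entropy gap already separates the two examples here.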

Journal ArticleDOI
TL;DR: This work models SAGIN's heterogeneous resource orchestration as a multi-domain virtual network embedding (VNE) problem and proposes a SAGIN cross-domain VNE algorithm based on a virtual network architecture and deep reinforcement learning (DRL).
Abstract: Traditional ground wireless communication networks cannot provide high-quality services for artificial intelligence (AI) applications such as intelligent transportation systems (ITS) due to deployment, coverage and capacity issues. The space-air-ground integrated network (SAGIN) has therefore become a research focus in the industry. Compared with traditional wireless communication networks, SAGIN is more flexible and reliable, and it has wider coverage and a higher quality of seamless connection. However, due to its inherent heterogeneity and its time-varying, self-organizing characteristics, the deployment and use of SAGIN still face huge challenges, among which the orchestration of heterogeneous resources is a key issue. Based on a virtual network architecture and deep reinforcement learning (DRL), we model SAGIN's heterogeneous resource orchestration as a multi-domain virtual network embedding (VNE) problem and propose a SAGIN cross-domain VNE algorithm. We model the different network segments of SAGIN and set the network attributes according to the actual situation of SAGIN and user needs. In DRL, the agent is realized by a five-layer policy network. We build a feature matrix based on network attributes extracted from SAGIN and use it as the agent's training environment. Through training, the probability of each underlying node being embedded can be derived. In the test phase, we complete the embedding of virtual nodes and links in turn based on these probabilities. Finally, we verify the effectiveness of the algorithm in both training and testing.
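The "score substrate nodes, derive embedding probabilities, embed greedily" loop the abstract describes can be sketched as below; the linear scoring function stands in for the paper's five-layer policy network, and the node names and attribute values are hypothetical.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def embed(virtual_nodes, substrate):
    """Greedy VNE sketch: score each substrate node from its attributes,
    turn scores into probabilities, then map each virtual node to the
    highest-probability substrate node with sufficient capacity."""
    # toy linear scoring in place of a learned policy network
    scores = [0.6 * n["cpu"] + 0.4 * n["bw"] for n in substrate]
    probs = softmax(scores)
    order = sorted(range(len(substrate)), key=lambda i: -probs[i])
    mapping, used = {}, set()
    for v in virtual_nodes:
        for i in order:
            if i not in used and substrate[i]["cpu"] >= v["cpu"]:
                mapping[v["name"]] = substrate[i]["name"]
                used.add(i)
                break
    return probs, mapping

# Hypothetical space/air/ground substrate segments.
substrate = [{"name": "sat", "cpu": 2.0, "bw": 1.0},
             {"name": "uav", "cpu": 4.0, "bw": 3.0},
             {"name": "gnd", "cpu": 8.0, "bw": 6.0}]
probs, mapping = embed([{"name": "v1", "cpu": 3.0},
                        {"name": "v2", "cpu": 2.0}], substrate)
```

In the DRL algorithm the probabilities come from a trained policy over the feature matrix rather than a fixed formula, but the test-phase embedding loop follows this same probability-ordered, capacity-checked pattern.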