
Showing papers on "Heterogeneous network published in 2018"


Journal ArticleDOI
TL;DR: This paper provides an up-to-date survey of physical layer security research on various promising 5G technologies, including physical layer security coding, massive multiple-input multiple-output, millimeter wave communications, heterogeneous networks, non-orthogonal multiple access, full duplex technology, and so on.
Abstract: Physical layer security, which safeguards data confidentiality based on information-theoretic approaches, has received significant research interest recently. The key idea behind physical layer security is to exploit the intrinsic randomness of the transmission channel to guarantee security at the physical layer. The evolution toward 5G wireless communications poses new challenges for physical layer security research. This paper provides an up-to-date survey of physical layer security research on various promising 5G technologies, including physical layer security coding, massive multiple-input multiple-output, millimeter wave communications, heterogeneous networks, non-orthogonal multiple access, full duplex technology, and so on. Technical challenges which remain unresolved at the time of writing are summarized, and future trends of physical layer security in 5G and beyond are discussed.
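
The abstract stays at the survey level, so as background: the quantity underlying most physical layer security analyses is the secrecy capacity of the wiretap channel. For Gaussian channels, with \gamma_B and \gamma_E denoting the legitimate receiver's and eavesdropper's SNRs (notation assumed here, not taken from the paper), it reads

C_s = \left[ \log_2(1+\gamma_B) - \log_2(1+\gamma_E) \right]^+

A positive secrecy rate requires the legitimate channel to be better than the eavesdropper's, which is why many of the surveyed 5G techniques aim, in one way or another, to widen this SNR gap.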

580 citations


Journal ArticleDOI
TL;DR: The suitability of hybrid beamforming methods, both existing and proposed up to the first quarter of 2017, is explored, and the exciting future challenges in this domain are identified.
Abstract: The increasing wireless data traffic demands have driven the need to explore suitable spectrum regions for meeting the projected requirements. In light of this, millimeter wave (mmWave) communication has received considerable attention from the research community. Typically, in fifth generation (5G) wireless networks, mmWave massive multiple-input multiple-output (MIMO) communication is realized by hybrid transceivers which combine high-dimensional analog phase shifters and power amplifiers with lower-dimensional digital signal processing units. This hybrid beamforming design reduces the cost and power consumption, which is aligned with the energy-efficient design vision of 5G. In this paper, we track the progress in hybrid beamforming for massive MIMO communications in the context of the system models of the hybrid transceivers’ structures, the digital and analog beamforming matrices with the possible antenna configuration scenarios, and hybrid beamforming in heterogeneous wireless networks. We extend the scope of the discussion by including resource management issues in hybrid beamforming. We explore the suitability of hybrid beamforming methods, both existing and proposed up to the first quarter of 2017, and identify the exciting future challenges in this domain.
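
As context for the hybrid structure discussed above, a commonly used narrowband model (symbols assumed here, not taken from the paper) splits precoding between a digital baseband matrix F_BB and an analog RF matrix F_RF realized with phase shifters, with an analogous split W_RF, W_BB at the receiver:

y = W_{BB}^{H} W_{RF}^{H} \left( H F_{RF} F_{BB} s + n \right)

where the entries of F_RF and W_RF are constrained to constant modulus because they are implemented with phase shifters. Hybrid designs approximate a fully digital precoder under this constraint while using far fewer RF chains than antennas, which is the source of the cost and power savings mentioned in the abstract.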

505 citations


Posted Content
TL;DR: FedProx, as discussed by the authors, is a generalization and re-parametrization of FedAvg, the current state-of-the-art method for federated learning.
Abstract: Federated Learning is a distributed learning paradigm with two key challenges that differentiate it from traditional distributed optimization: (1) significant variability in terms of the systems characteristics on each device in the network (systems heterogeneity), and (2) non-identically distributed data across the network (statistical heterogeneity). In this work, we introduce a framework, FedProx, to tackle heterogeneity in federated networks. FedProx can be viewed as a generalization and re-parametrization of FedAvg, the current state-of-the-art method for federated learning. While this re-parameterization makes only minor modifications to the method itself, these modifications have important ramifications both in theory and in practice. Theoretically, we provide convergence guarantees for our framework when learning over data from non-identical distributions (statistical heterogeneity), and while adhering to device-level systems constraints by allowing each participating device to perform a variable amount of work (systems heterogeneity). Practically, we demonstrate that FedProx allows for more robust convergence than FedAvg across a suite of realistic federated datasets. In particular, in highly heterogeneous settings, FedProx demonstrates significantly more stable and accurate convergence behavior relative to FedAvg---improving absolute test accuracy by 22% on average.
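
The abstract does not spell out the re-parameterization; in the FedProx formulation, each participating device k approximately minimizes a proximally regularized version of its local objective around the current global model w^t:

\min_{w} \; h_k(w; w^t) = F_k(w) + \frac{\mu}{2} \lVert w - w^t \rVert^2

Setting \mu = 0 and running a fixed amount of local work recovers FedAvg, which is the sense in which FedProx generalizes it; the proximal term limits how far heterogeneous local updates can drift from the global model while still allowing each device to perform a variable amount of work.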

490 citations


Journal ArticleDOI
TL;DR: This paper proposes a novel hybrid network architecture for the smart city that leverages the strengths of emerging Software Defined Networking and blockchain technologies, and proposes a Proof-of-Work scheme in the model to ensure security and privacy.

281 citations


Journal ArticleDOI
TL;DR: A three-factor anonymous authentication scheme for WSNs in Internet of Things environments, in which a fuzzy commitment scheme is adopted to handle the user's biometric information; the scheme maintains computational efficiency while achieving more security and functional features.

274 citations


Journal ArticleDOI
TL;DR: This paper investigates the optimal policy for user scheduling and resource allocation in HetNets powered by hybrid energy with the purpose of maximizing energy efficiency of the overall network and demonstrates the convergence property of the proposed algorithm.
Abstract: Dense deployment of various small-cell base stations in cellular networks to increase capacity will lead to heterogeneous networks (HetNets); meanwhile, embedding energy harvesting capabilities in base stations as an alternative energy supply is becoming a reality. How to make efficient use of radio resources and renewable energy is a brand-new challenge. This paper investigates the optimal policy for user scheduling and resource allocation in HetNets powered by hybrid energy with the purpose of maximizing the energy efficiency of the overall network. Since wireless channel conditions and renewable energy arrival rates have stochastic properties and the environment’s dynamics are unknown, a model-free reinforcement learning approach is used to learn the optimal policy through interactions with the environment. To solve our problem with continuous-valued state and action variables, a policy-gradient-based actor-critic algorithm is proposed. The actor part uses the Gaussian distribution as the parameterized policy to generate continuous stochastic actions, and the policy parameters are updated with the gradient ascent method. The critic part uses compatible function approximation to estimate the performance of the policy and helps the actor learn the gradient of the policy. The advantage function is used to further reduce the variance of the policy gradient. Using numerical simulations, we demonstrate the convergence property of the proposed algorithm and analyze network energy efficiency.
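
As an illustration of the actor-critic machinery described above, the sketch below implements a Gaussian-policy actor with a linear critic on a toy continuous problem; the state features, reward, and hyperparameters are placeholders and do not reproduce the paper's HetNet environment.

import numpy as np

# Minimal Gaussian-policy actor-critic sketch (illustrative only).
rng = np.random.default_rng(0)
dim = 4                      # feature dimension (assumed)
theta = np.zeros(dim)        # actor parameters: policy mean = theta @ phi(s)
w = np.zeros(dim)            # critic parameters: V(s) = w @ phi(s)
sigma = 0.5                  # fixed exploration std (could also be learned)
alpha_actor, alpha_critic, gamma = 1e-3, 1e-2, 0.95

def phi(state):
    return np.asarray(state, dtype=float)   # identity features (assumed)

def step(state, action):
    # Toy environment: quadratic reward around an arbitrary target action.
    reward = -(action - 1.0) ** 2 - 0.1 * np.sum(state ** 2)
    next_state = np.clip(state + 0.1 * action + 0.01 * rng.standard_normal(dim), -1, 1)
    return next_state, reward

state = rng.uniform(-1, 1, dim)
for t in range(5000):
    f = phi(state)
    mu = theta @ f
    action = rng.normal(mu, sigma)           # continuous stochastic action
    next_state, reward = step(state, action)

    # TD error serves as an estimate of the advantage A(s, a).
    td_error = reward + gamma * (w @ phi(next_state)) - (w @ f)

    # Critic: gradient step on the squared TD error.
    w += alpha_critic * td_error * f
    # Actor: policy-gradient ascent; grad log pi = (a - mu) / sigma^2 * phi(s).
    theta += alpha_actor * td_error * (action - mu) / sigma**2 * f

    state = next_state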

256 citations


Journal ArticleDOI
TL;DR: In this paper, a novel air-ground integrated mobile edge network (AGMEN) is proposed, where UAVs are flexibly deployed and scheduled, and assist the communication, caching, and computing of the edge network.
Abstract: The ever-increasing mobile data demands have posed significant challenges in the current radio access networks, while the emerging computation-heavy Internet of Things applications with varied requirements demand more flexibility and resilience from the cloud/edge computing architecture. In this article, to address these issues, we propose a novel air-ground integrated mobile edge network (AGMEN), where UAVs are flexibly deployed and scheduled, and assist the communication, caching, and computing of the edge network. Specifically, we present the detailed architecture of AGMEN, and investigate the benefits and application scenarios of drone cells, and UAV-assisted edge caching and computing. Furthermore, the challenging issues in AGMEN are discussed, and potential research directions are highlighted.

251 citations


Journal ArticleDOI
TL;DR: A systematic survey of the state-of-the-art caching techniques recently developed in cellular networks, including macro-cellular networks, heterogeneous networks, device-to-device networks, cloud-radio access networks, and fog-radio access networks.
Abstract: Mobile data traffic is currently growing exponentially and these rapid increases have caused the backhaul data rate requirements to become the major bottleneck to reducing costs and raising revenue for operators. To address this problem, caching techniques have attracted significant attention since they can effectively reduce the backhaul traffic by eliminating duplicate data transmission that carries popular content. In addition, other system performance metrics can also be improved through caching techniques, e.g., spectrum efficiency, energy efficiency, and transmission delay. In this paper, we provide a systematic survey of the state-of-the-art caching techniques that were recently developed in cellular networks, including macro-cellular networks, heterogeneous networks, device-to-device networks, cloud-radio access networks, and fog-radio access networks. In particular, we give a tutorial on the fundamental caching techniques and introduce caching algorithms from three aspects, i.e., content placement, content delivery, and joint placement and delivery. We provide comprehensive comparisons among different algorithms in terms of different performance metrics, including throughput, backhaul cost, power consumption, and network delay. Finally, we summarize the main research achievements in different networks, and highlight main challenges and potential research directions.

226 citations


Posted Content
14 Dec 2018
TL;DR: This work proposes and introduces FedProx, which is similar in spirit to FedAvg but more amenable to theoretical analysis, and describes the convergence of FedProx under a novel device similarity assumption.
Abstract: Federated Learning is a distributed learning paradigm with two key challenges that differentiate it from traditional distributed optimization: (1) significant variability in terms of the systems characteristics on each device in the network (systems heterogeneity), and (2) non-identically distributed data across the network (statistical heterogeneity). In this work, we introduce a framework, FedProx, to tackle heterogeneity in federated networks. FedProx can be viewed as a generalization and re-parametrization of FedAvg, the current state-of-the-art method for federated learning. While this re-parameterization makes only minor modifications to the method itself, these modifications have important ramifications both in theory and in practice. Theoretically, we provide convergence guarantees for our framework when learning over data from non-identical distributions (statistical heterogeneity), and while adhering to device-level systems constraints by allowing each participating device to perform a variable amount of work (systems heterogeneity). Practically, we demonstrate that FedProx allows for more robust convergence than FedAvg across a suite of realistic federated datasets. In particular, in highly heterogeneous settings, FedProx demonstrates significantly more stable and accurate convergence behavior relative to FedAvg---improving absolute test accuracy by 22% on average.

224 citations


Journal ArticleDOI
TL;DR: The objective of this special issue is to disseminate contributions in the field of ACNs; it addresses the particular issues and reviews major mechanisms in three key areas: LAP-based communication networks, HAP-based communication networks, and integrated ACNs.
Abstract: Owing to the explosive growth of requirements for rapid emergency communication response and accurate observation services, airborne communication networks (ACNs) have received much attention from both industry and academia. ACNs are heterogeneous networks engineered to utilize satellites, high-altitude platforms (HAPs), and low-altitude platforms (LAPs) to build communication access platforms. Compared to terrestrial wireless networks, ACNs are characterized by frequently changing network topologies and more vulnerable communication connections. Furthermore, ACNs demand the seamless integration of heterogeneous networks so that the network quality of service (QoS) can be improved. Thus, designing mechanisms and protocols for ACNs poses many challenges. To solve these challenges, extensive research has been conducted. The objective of this special issue is to disseminate the contributions in the field of ACNs. To present this special issue with the necessary background and offer an overall view of this field, three key areas of ACNs are covered. Specifically, this paper covers LAP-based communication networks, HAP-based communication networks, and integrated ACNs. For each area, this paper addresses the particular issues and reviews major mechanisms. This paper also points out future research directions and challenges.

218 citations


Journal ArticleDOI
TL;DR: A new 5G wireless security architecture is proposed, based on which the analysis of identity management and flexible authentication is provided, and a handover procedure as well as a signaling load scheme are explored to show the advantages of the proposed security architecture.
Abstract: The advanced features of 5G mobile wireless network systems yield new security requirements and challenges. This paper presents a comprehensive study on the security of 5G wireless network systems compared with the traditional cellular networks. The paper starts with a review on 5G wireless networks particularities as well as on the new requirements and motivations of 5G wireless security. The potential attacks and security services are summarized with the consideration of new service requirements and new use cases in 5G wireless networks. The recent development and the existing schemes for the 5G wireless security are presented based on the corresponding security services, including authentication, availability, data confidentiality, key management, and privacy. This paper further discusses the new security features involving different technologies applied to 5G, such as heterogeneous networks, device-to-device communications, massive multiple-input multiple-output, software-defined networks, and Internet of Things. Motivated by these security research and development activities, we propose a new 5G wireless security architecture, based on which the analysis of identity management and flexible authentication is provided. As a case study, we explore a handover procedure as well as a signaling load scheme to show the advantages of the proposed security architecture. The challenges and future directions of 5G wireless security are finally summarized.

Journal ArticleDOI
TL;DR: Simulation results show that the distributed JCORAO scheme can effectively decrease the energy consumption and task completion time with lower complexity.
Abstract: In this paper, we propose a distributed joint computation offloading and resource allocation optimization (JCORAO) scheme in heterogeneous networks with mobile edge computing. An optimization problem is formulated to provide the optimal computation offloading strategy, uplink subchannel allocation, uplink transmission power allocation, and computation resource scheduling. The optimization problem is decomposed into two sub-problems due to its NP-hard property. To analyze the offloading strategy, a sub-algorithm based on a distributed potential game is built, and the existence of a Nash equilibrium is proved. To jointly allocate uplink subchannels, uplink transmission power, and computation resources for the offloading mobile terminals, a sub-algorithm named the cloud and wireless resource allocation algorithm is designed. The solutions for subchannel allocation consist of a uniform zero frequency reuse method without interference and a fractional frequency reuse method, based on the Hungarian algorithm and graph coloring, with interference. The distributed JCORAO scheme solves the optimization problem by the mutual iteration of the two sub-algorithms. Simulation results show that the distributed JCORAO scheme can effectively decrease the energy consumption and task completion time with lower complexity.
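
A generic way to formalize the offloading trade-off handled by the potential-game sub-algorithm (a common MEC model, not necessarily the exact formulation of this paper) compares local and offloaded execution for a task with input size D_i bits and workload C_i CPU cycles:

t_i^{loc} = \frac{C_i}{f_i^{loc}}, \qquad t_i^{off} = \frac{D_i}{r_i} + \frac{C_i}{f^{mec}}, \qquad r_i = B_i \log_2\!\left( 1 + \frac{p_i g_i}{I_i + \sigma^2} \right)

where r_i is the uplink rate on the allocated subchannel, f_i^{loc} and f^{mec} are the local and assigned edge CPU frequencies, and I_i captures interference from frequency reuse. Each terminal prefers offloading only when its weighted delay/energy cost decreases, and the joint scheme iterates between these offloading decisions and the subchannel, power, and CPU allocation.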

Journal ArticleDOI
TL;DR: The article explores the use of drones in fields as diverse as military surveillance and network rehabilitation for disaster-struck areas, and highlights the importance of incorporating drones into the multi-tier heterogeneous network to extend network coverage and capacity.
Abstract: Wireless networks comprising unmanned aerial vehicles can offer limited connectivity in a cost-effective manner to disaster-struck regions where terrestrial infrastructure might have been damaged. While these drones offer advantages such as rapid deployment to far-flung areas, their operations may be rendered ineffective by the absence of an adequate energy management strategy. This article considers the multi-faceted applications of these platforms and the challenges thereof in the networks of the future. In addition to providing an overview of the work done by researchers in determining the features of the air-to-ground channel, the article explores the use of drones in fields as diverse as military surveillance and network rehabilitation for disaster-struck areas. It also presents a case study that envisages a scenario in which drones operate alongside conventional wireless infrastructure, thereby allowing a greater number of users to establish a line-of-sight link for communication. This study investigates a power allocation strategy for the microwave base station and the small base stations operating in the 28 GHz frequency band. The self-adaptive power control strategy for drones depends on the maximum allowable interference threshold and minimum data rate requirements. The study highlights the importance of incorporating drones into the multi-tier heterogeneous network to extend network coverage and capacity.

Journal ArticleDOI
TL;DR: This article introduces a vehicular edge multi-access network that treats vehicles as edge computation resources to construct a cooperative and distributed computing architecture, and proposes a collaborative task offloading and output transmission mechanism to guarantee low latency as well as application-level performance.
Abstract: Mobile edge computing (MEC) has emerged as a promising paradigm to realize user requirements with low-latency applications. The deep integration of multi-access technologies and MEC can significantly enhance the access capacity between heterogeneous devices and MEC platforms. However, the traditional MEC network architecture cannot be directly applied to the Internet of Vehicles (IoV) due to high-speed mobility and its inherent characteristics. Furthermore, given the large number of resource-rich vehicles on the road, offloading tasks and data processing onto smart vehicles presents a new opportunity. To facilitate a good merging of the MEC technology in IoV, this article first introduces a vehicular edge multi-access network that treats vehicles as edge computation resources to construct a cooperative and distributed computing architecture. For immersive applications, co-located vehicles have the inherent property of collecting considerable identical and similar computation tasks. We propose a collaborative task offloading and output transmission mechanism to guarantee low latency as well as application-level performance. Finally, we take 3D reconstruction as an exemplary scenario to provide insights on the design of the network framework. Numerical results demonstrate that the proposed scheme is able to reduce the perception reaction time while ensuring application-level driving experiences.

Journal ArticleDOI
TL;DR: This work proposes deepNF, a network fusion method based on Multimodal Deep Autoencoders to extract high‐level features of proteins from multiple heterogeneous interaction networks and shows that this method outperforms previous methods for both human and yeast STRING networks.
Abstract: Motivation: The prevalence of high-throughput experimental methods has resulted in an abundance of large-scale molecular and functional interaction networks. The connectivity of these networks provides a rich source of information for inferring functional annotations for genes and proteins. An important challenge has been to develop methods for combining these heterogeneous networks to extract useful protein feature representations for function prediction. Most of the existing approaches for network integration use shallow models that encounter difficulty in capturing complex and highly non-linear network structures. Thus, we propose deepNF, a network fusion method based on Multimodal Deep Autoencoders to extract high-level features of proteins from multiple heterogeneous interaction networks. Results: We apply this method to combine STRING networks to construct a common low-dimensional representation containing high-level protein features. We use separate layers for different network types in the early stages of the multimodal autoencoder, later connecting all the layers into a single bottleneck layer from which we extract features to predict protein function. We compare the cross-validation and temporal holdout predictive performance of our method with state-of-the-art methods, including the recently proposed method Mashup. Our results show that our method outperforms previous methods for both human and yeast STRING networks. We also show substantial improvement in the performance of our method in predicting gene ontology terms of varying type and specificity. Availability and implementation: deepNF is freely available at: https://github.com/VGligorijevic/deepNF. Supplementary information: Supplementary data are available at Bioinformatics online.
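
A minimal sketch of the multimodal autoencoder idea described above, written in PyTorch with assumed layer sizes; the published deepNF implementation at the GitHub link above has its own architecture and hyperparameters.

import torch
import torch.nn as nn

class MultimodalAE(nn.Module):
    """Per-network encoders -> shared bottleneck -> per-network decoders."""
    def __init__(self, input_dims, hidden=500, bottleneck=256):
        super().__init__()
        # One encoder branch per interaction network (e.g., each STRING channel).
        self.encoders = nn.ModuleList(
            [nn.Sequential(nn.Linear(d, hidden), nn.ReLU()) for d in input_dims])
        self.bottleneck = nn.Sequential(
            nn.Linear(hidden * len(input_dims), bottleneck), nn.ReLU())
        self.expand = nn.Sequential(
            nn.Linear(bottleneck, hidden * len(input_dims)), nn.ReLU())
        self.decoders = nn.ModuleList([nn.Linear(hidden, d) for d in input_dims])
        self.hidden = hidden

    def forward(self, xs):
        # xs: list of per-network feature matrices for the same set of proteins.
        h = torch.cat([enc(x) for enc, x in zip(self.encoders, xs)], dim=1)
        z = self.bottleneck(h)                       # shared protein embedding
        parts = self.expand(z).split(self.hidden, dim=1)
        return z, [dec(p) for dec, p in zip(self.decoders, parts)]

# Example with random data for three hypothetical networks of 100 proteins;
# the bottleneck embedding z would then feed a downstream function classifier.
xs = [torch.rand(100, 400) for _ in range(3)]
model = MultimodalAE([400, 400, 400])
z, recons = model(xs)
loss = sum(nn.functional.mse_loss(r, x) for r, x in zip(recons, xs))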

Journal ArticleDOI
TL;DR: A modular and scalable architecture based on lightweight virtualization that simplifies management and enables distributed deployments, creating a highly dynamic system with characteristics such as fault tolerance and system availability.
Abstract: The world of connected devices has led to the rise of the Internet of Things paradigm, where applications rely on multiple devices, gathering and sharing data across highly heterogeneous networks. The variety of possible mechanisms, protocols, and hardware has become a hindrance in the development of architectures capable of addressing the most common IoT use cases, while abstracting services from the underlying communication subsystem. Moreover, the world is moving toward new strict requirements in terms of timeliness and low latency in combination with ultra-high availability and reliability. Thus, future IoT architectures will also have to support the requirements of these cyber-physical applications. In this regard, edge computing has been presented as one of the most promising solutions, relying on the cooperation of nodes by moving services directly to end devices and caching information locally. Therefore, in this article, we propose a modular and scalable architecture based on lightweight virtualization. The provided modularity, combined with the orchestration supplied by Docker, simplifies management and enables distributed deployments, creating a highly dynamic system. Moreover, characteristics such as fault tolerance and system availability are achieved by distributing the application logic across different layers, where failures of devices and micro-services can be masked by this natively redundant architecture, with minimal impact on the overall system performance. Experimental results have validated the implementation of the proposed architecture for on-demand services deployment across different architecture layers.

Journal ArticleDOI
TL;DR: This paper discusses the recent advances in mobile data offloading techniques and classifies the existing mobile data offloading technologies into four categories, i.e., data offloading through small cell networks, data offloading through WiFi networks, data offloading through opportunistic mobile networks, and data offloading through heterogeneous networks.
Abstract: Recently, due to the increasing popularity of enjoying various multimedia services on mobile devices (e.g., smartphones, ipads, and electronic tablets), the generated mobile data traffic has been growing explosively and has become a severe burden on mobile network operators. To address such a serious challenge in mobile networks, an effective approach is to manage data traffic by using complementary technologies (e.g., small cell networks, WiFi networks, and so on) to achieve mobile data offloading. In this paper, we discuss the recent advances in the techniques of mobile data offloading. Particularly, based on the initiator diversity of data offloading, we classify the existing mobile data offloading technologies into four categories, i.e., data offloading through small cell networks, data offloading through WiFi networks, data offloading through opportunistic mobile networks, and data offloading through heterogeneous networks. Besides, we show a detailed taxonomy of the related mobile data offloading technologies by discussing the pros and cons of various offloading technologies for different problems in mobile networks. Finally, we outline some open research issues and challenges, which can provide guidelines for future research work.

Journal ArticleDOI
TL;DR: A comprehensive survey on state-of-the-art research activities on RRM for machine-type communications in LTE/LTE-A cellular networks including access control, radio resource allocation, power management, and the latest 3GPP standards supporting M2M communications.
Abstract: In futuristic wireless communications, a massive number of devices need to access networks with diverse quality of service (QoS) requirements. It is estimated that the number of connected devices will exceed 20 billion by 2020, and machine-to-machine (M2M) devices will account for nearly half of the total connected devices. However, existing cellular systems and wireless standards, designed primarily for human-to-human (H2H) communications focusing on reducing access latency, increasing data rate, and system throughput, are not well suited for M2M communications that require massive connections, diverse QoS requirements, and low energy consumption. Radio resource management (RRM) in conventional H2H communications aims at improving spectrum efficiency and energy efficiency. Similarly, RRM also plays a vital role in M2M communications. In this paper, we make a comprehensive survey of state-of-the-art research activities on RRM in M2M communications. First, we discuss the issues on RRM for machine-type communications in LTE/LTE-A cellular networks including access control, radio resource allocation, power management, and the latest 3GPP standards supporting M2M communications. Acknowledging the fact that a single technology cannot support all M2M applications, we discuss RRM issues for unlicensed band radio access technologies in M2M capillary networks, including IEEE 802.11ah, Bluetooth low energy, ZigBee, and smart metering networks. We also survey M2M RRM methods in heterogeneous networks consisting of cellular networks, capillary networks, and ultra dense networks. Finally, we review recent standard activities and discuss the open issues and research challenges.

Journal ArticleDOI
TL;DR: The MECO problem in UDN is studied and a heuristic greedy offloading scheme is proposed as the solution, demonstrating the necessity for and superior performance of conducting computation offloading over multiple MEC servers.
Abstract: The ultra-dense network (UDN) is envisioned to be an enabling and highly promising technology to enhance spatial multiplexing and network capacity in future 5G networks. Moreover, to address the conflict between computation-intensive applications and resource-constrained IoT mobile devices (MDs), multi-access mobile edge computing (MA-MEC), which provides the IoT MDs with cloud capabilities at the edge of radio access networks, has been proposed. UDN and MA-MEC are regarded as two distinct but complementary enabling technologies for 5G IoT applications. Over the past several years, a great deal of research on mobile edge computation offloading (MECO) -- the key technique in MA-MEC -- has emerged. However, all these works focused on the single-tier base station scenario and computation offloading between the MD and the MEC server connected to the macro base station, and few works can be found on the problem of computation offloading for MA-MEC in UDN (i.e., a multi-user ultra-dense MEC server scenario). To this end, we study in this article the MECO problem in UDN and propose a heuristic greedy offloading scheme as our solution. Extensive numerical results and comparisons demonstrate the necessity for and superior performance of conducting computation offloading over multiple MEC servers.
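
The article names a heuristic greedy offloading scheme without giving its rule here; the sketch below is one plausible greedy heuristic (purely illustrative, with assumed per-device delays and per-server capacities) that assigns each mobile device to whichever MEC server currently yields the largest delay reduction over local execution.

# Illustrative greedy computation-offloading heuristic (not the paper's algorithm).
def greedy_offload(devices, servers):
    """devices: list of dicts with 'id', 'local_delay', and per-server 'offload_delay'.
    servers: dict server_id -> remaining capacity (number of tasks it can host)."""
    # Consider the most offloading-hungry devices first: largest possible gain.
    order = sorted(devices,
                   key=lambda d: d['local_delay'] - min(d['offload_delay'].values()),
                   reverse=True)
    assignment = {}
    for dev in order:
        # Candidate servers that still have capacity and actually reduce delay.
        candidates = [(sid, dly) for sid, dly in dev['offload_delay'].items()
                      if servers[sid] > 0 and dly < dev['local_delay']]
        if candidates:
            sid, _ = min(candidates, key=lambda c: c[1])
            assignment[dev['id']] = sid          # offload to the best server
            servers[sid] -= 1
        else:
            assignment[dev['id']] = None         # execute locally
    return assignment

# Example with two densely deployed MEC servers and three devices:
servers = {'sbs1': 1, 'sbs2': 2}
devices = [
    {'id': 'md1', 'local_delay': 0.9, 'offload_delay': {'sbs1': 0.3, 'sbs2': 0.5}},
    {'id': 'md2', 'local_delay': 0.7, 'offload_delay': {'sbs1': 0.4, 'sbs2': 0.6}},
    {'id': 'md3', 'local_delay': 0.2, 'offload_delay': {'sbs1': 0.5, 'sbs2': 0.4}},
]
print(greedy_offload(devices, servers))  # md3 stays local; the others offload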

Journal ArticleDOI
06 Apr 2018-Sensors
TL;DR: The coverage and performance of the LoRa technology are assessed in a real urban scenario; results show that a proper parameter setting is needed to cover large urban areas while keeping the airtime sufficiently low to hold packet losses at satisfactory levels.
Abstract: Information and Communication Technologies (ICTs), through wireless communications and the Internet of Things (IoT) paradigm, are the enabling keys for transforming traditional cities into smart cities, since they provide the core infrastructure behind public utilities and services. However, to be effective, IoT-based services could require different technologies and network topologies, even when addressing the same urban scenario. In this paper, we highlight this aspect and present two smart city testbeds developed in Italy. The first one concerns a smart infrastructure for public lighting and relies on a heterogeneous network using the IEEE 802.15.4 short-range communication technology, whereas the second one addresses smart-building applications and is based on the LoRa low-rate, long-range communication technology. The smart lighting scenario is discussed providing the technical details and the economic benefits of a large-scale (around 3000 light poles) flexible and modular implementation of a public lighting infrastructure, while the smart-building testbed is investigated, through measurement campaigns and simulations, assessing the coverage and the performance of the LoRa technology in a real urban scenario. Results show that a proper parameter setting is needed to cover large urban areas while maintaining the airtime sufficiently low to keep packet losses at satisfactory levels.

Journal ArticleDOI
TL;DR: In this article, a tensor-based, holistic, hierarchical approach is introduced to generate efficient routing paths using tensor decomposition methods to implement routing recommendations for big data networks.
Abstract: Telecommunication networks are evolving toward a data-center-based architecture, which includes physical network functions, virtual network functions, as well as various types of management and orchestration systems. The primary purpose of this type of heterogeneous network is to provide efficient and convenient communication services for users. However, the diverse factors of a heterogeneous network such as bandwidth, delay, and communication protocol, bring great challenges for routing recommendations. In addition, the growing volume of big data and the explosive deployment of heterogeneous networks have started a new era of applying big data technologies to implement routing recommendations. In this article, a tensor-based big-data-driven routing recommendation framework, including the edge plane, fog plane, cloud plane, and application plane, is proposed. In this framework, a tensor-based, holistic, hierarchical approach is introduced to generate efficient routing paths using tensor decomposition methods. Also, a tensor matching method including the controlling tensor, seed tensor, and orchestration tensor is employed to realize routing recommendation. Finally, a case study is used to demonstrate the key processing procedures of the proposed framework.
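
The framework refers to "tensor decomposition methods" without fixing one; a representative choice is the canonical polyadic (CP) decomposition, which factorizes a third-order network-state tensor (indices assumed here, e.g., source, destination, and QoS metric) into R rank-one components:

\mathcal{X} \approx \sum_{r=1}^{R} a_r \circ b_r \circ c_r, \qquad x_{ijk} \approx \sum_{r=1}^{R} a_{ir} b_{jr} c_{kr}

The low-rank factors summarize heterogeneous measurements compactly and can be compared or matched across the edge, fog, and cloud planes, which is presumably the role played by the controlling, seed, and orchestration tensors in the matching step.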

Journal ArticleDOI
TL;DR: A bio-inspired and trust-based cluster head selection approach for WSNs adopted in ITS applications; the results demonstrate that the proposed model achieves a longer network lifetime, i.e., nodes are kept alive longer than with LEACH, SEP, and DEEC.

Journal ArticleDOI
TL;DR: A new approach to the modeling and analysis of heterogeneous cellular networks (HetNets) that accurately incorporates coupling across the locations of users and base stations, which exists due to the deployment of small cell base stations at the places of high user density is developed.
Abstract: This paper develops a new approach to the modeling and analysis of heterogeneous cellular networks (HetNets) that accurately incorporates coupling across the locations of users and base stations, which exists due to the deployment of small cell base stations (SBSs) at the places of high user density (termed user hotspots in this paper). Modeling the locations of the geographical centers of user hotspots as a homogeneous Poisson point process (PPP), we assume that the users and SBSs are clustered around each user hotspot center independently with two different distributions. The macrocell BS locations are modeled by an independent PPP. This model is consistent with the user and SBS configurations considered by 3GPP. Using this model, we study the performance of a typical user in terms of coverage probability and throughput for two association policies: 1) Policy 1, under which a typical user is served by the open-access BS that provides maximum averaged received power, and 2) Policy 2, under which the typical user is served by the small cell tier if the maximum averaged received power from the open-access SBSs is greater than a certain power threshold; and macro tier otherwise. A key intermediate step in our analysis is the derivation of distance distributions from a typical user to the open-access and closed-access interfering SBSs. Our analysis demonstrates that as the number of SBSs reusing the same resource block increases, coverage probability decreases, whereas throughput increases. Therefore, contrary to the usual assumption of orthogonal channelization, it is reasonable to assign the same resource block to multiple SBSs in a given cluster as long as the coverage probability remains acceptable. This approach to HetNet modeling and analysis significantly generalizes the state-of-the-art approaches that are based on modeling the locations of BSs and users by independent PPPs.
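
A minimal Monte Carlo sketch of the spatial setup described above (illustrative densities, powers, and cluster shapes only, and only Policy 1 max-power association; the paper's treatment is analytical):

import numpy as np

rng = np.random.default_rng(1)
AREA = 2000.0                          # square side in metres (assumed)
lam_hotspot, lam_macro = 5e-6, 1e-6    # PPP densities per m^2 (assumed)
sbs_per_hotspot, cluster_std = 3, 50.0
P_macro, P_sbs, alpha = 10.0, 1.0, 4.0 # tx powers (arbitrary units), path-loss exponent

def ppp(lam):
    n = rng.poisson(lam * AREA**2)
    return rng.uniform(0, AREA, size=(n, 2))

hotspots = ppp(lam_hotspot)
macros = ppp(lam_macro)
# SBSs (and users) are clustered around each hotspot centre (Gaussian offsets assumed).
sbss = np.vstack([c + cluster_std * rng.standard_normal((sbs_per_hotspot, 2))
                  for c in hotspots])
# The typical user sits near a randomly chosen hotspot centre.
user = hotspots[rng.integers(len(hotspots))] + cluster_std * rng.standard_normal(2)

def avg_rx_power(bs_xy, p_tx):
    d = np.linalg.norm(bs_xy - user, axis=1)
    return p_tx * np.maximum(d, 1.0) ** (-alpha)

# Policy 1: associate with the BS (either tier) giving maximum average received power.
best_sbs = avg_rx_power(sbss, P_sbs).max()
best_macro = avg_rx_power(macros, P_macro).max() if len(macros) else 0.0
print("serving tier:", "small cell" if best_sbs > best_macro else "macro")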

Journal ArticleDOI
TL;DR: A novel integrated framework, called MPBP (Meta-Path feature-based BP neural network model), is proposed to predict multiple types of links in heterogeneous networks, and is shown to outperform the baseline methods.
Abstract: Most real-world systems, composed of different types of objects connected via many interconnections, can be abstracted as various complex heterogeneous networks. Link prediction for heterogeneous networks is of great significance for mining missing links and reconfiguring networks according to observed information, with considerable applications in, for example, friend and location recommendations and disease–gene candidate detection. In this paper, we put forward a novel integrated framework, called MPBP (Meta-Path feature-based BP neural network model), to predict multiple types of links for heterogeneous networks. More specifically, the concept of meta-path is introduced, followed by the extraction of meta-path features for heterogeneous networks. Next, based on the extracted meta-path features, a supervised link prediction model is built with a three-layer BP neural network. Then, the solution algorithm of the proposed link prediction model is put forward to obtain predicted results by iteratively training the network. Last, numerical experiments on datasets of a gene–disease network and a combat network are conducted to verify the effectiveness and feasibility of the proposed MPBP. The results show that MPBP performs very well and is superior to the baseline methods.
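
A compact sketch of the supervised step described above: meta-path instance counts as pair features and a small multi-layer perceptron as the link classifier. The features and labels below are mocked, and the paper's own BP network and meta-path definitions differ in detail.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Each candidate node pair is described by counts of meta-path instances connecting
# it, e.g. (gene-disease), (gene-gene-disease), (gene-pathway-disease) -- assumed here.
n_pairs, n_metapaths = 1000, 3
X = rng.poisson(lam=2.0, size=(n_pairs, n_metapaths)).astype(float)
# Mock labels: pairs with more meta-path support are more likely to be linked.
y = (X.sum(axis=1) + rng.normal(0, 1, n_pairs) > 6).astype(int)

# One hidden layer gives the three-layer (input-hidden-output) BP structure.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X[:800], y[:800])
print("held-out accuracy:", clf.score(X[800:], y[800:]))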

Journal ArticleDOI
TL;DR: In this paper, the authors studied the trade-off between data rate performance and energy consumption in NOMA heterogeneous networks and proposed energy-efficient user scheduling and power allocation schemes.
Abstract: Non-orthogonal multiple access has attracted much recent attention due to its capability of improving the system spectral efficiency in wireless communications. Deploying NOMA in a heterogeneous network can satisfy users' explosive data traffic requirements, and NOMA will likely play an important role in the next generation mobile communication networks. However, NOMA brings new technical challenges on resource allocation due to the mutual cross-tier interference in heterogeneous networks. In this article, to study the trade-off between data rate performance and energy consumption in NOMA, we examine the problem of energy-efficient user scheduling and power optimization in NOMA heterogeneous networks. The energy-efficient user scheduling and power allocation schemes are introduced for the downlink NOMA heterogeneous network for perfect and imperfect CSI, respectively. Simulation results show that the resource allocation schemes can significantly increase the energy efficiency of NOMA heterogeneous networks for cases of both perfect CSI and imperfect CSI.
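
For context on the power allocation problem discussed above: in a two-user downlink NOMA cluster with |h_1|^2 > |h_2|^2 (notation assumed here), the strong user removes the weak user's signal by successive interference cancellation while the weak user treats the strong user's signal as noise, giving achievable rates

R_1 = \log_2\!\left( 1 + \frac{p_1 |h_1|^2}{\sigma^2} \right), \qquad R_2 = \log_2\!\left( 1 + \frac{p_2 |h_2|^2}{p_1 |h_2|^2 + \sigma^2} \right)

with p_1 + p_2 bounded by the transmit power budget; in a HetNet, cross-tier interference adds a further term to both denominators. Energy efficiency is then the sum rate divided by the total power consumed, which is the quantity the proposed scheduling and power allocation schemes trade off against data rate.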

Journal ArticleDOI
TL;DR: A mathematical framework is contributed to model the process of critical session transfers in a softwarized 5G access network, and the corresponding impact on other user sessions is quantified.
Abstract: Network softwarization is a major paradigm shift, which enables programmable and flexible system operation in challenging use cases. In the fifth-generation (5G) mobile networks, the more advanced scenarios envision transfer of high-rate mission-critical traffic. Achieving end-to-end reliability of these stringent sessions requires support from multiple radio access technologies and calls for dynamic orchestration of resources across both radio access and core network segments. Emerging 5G systems can already offer network slicing, multi-connectivity, and end-to-end quality provisioning mechanisms for critical data transfers within a single software-controlled network. Whereas these individual enablers are already in active development, a holistic perspective on how to construct a unified, service-ready system as well as understand the implications of critical traffic on serving other user sessions is not yet available. Against this background, this paper first introduces a softwarized 5G architecture for end-to-end reliability of the mission-critical traffic. Then, a mathematical framework is contributed to model the process of critical session transfers in a softwarized 5G access network, and the corresponding impact on other user sessions is quantified. Finally, a prototype hardware implementation is completed to investigate the practical effects of supporting mission-critical data in a softwarized 5G core network, as well as substantiate the key system design choices.

Journal ArticleDOI
TL;DR: This paper presents methods for slicing deterministic and packet-switched industrial communication protocols, which simplify the manageability of heterogeneous networks with various application requirements and shows how to use network calculus to assess the end-to-end properties of the network slices.
Abstract: Industry 4.0 introduces modern communication and computation technologies such as cloud computing and Internet of Things to industrial manufacturing systems. As a result, many devices, machines, and applications will rely on connectivity, while having different requirements to the network, ranging from high reliability and low latency to high data rates. Furthermore, these industrial networks will be highly heterogeneous, as they will feature a number of diverse communication technologies. Current technologies are not well suited for this scenario, which requires that the network is managed at an abstraction level, which is decoupled from the underlying technologies. In this paper, we consider network slicing as a mechanism to handle these challenges. We present methods for slicing deterministic and packet-switched industrial communication protocols, which simplify the manageability of heterogeneous networks with various application requirements. Furthermore, we show how to use network calculus to assess the end-to-end properties of the network slices.
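
As a concrete instance of the network-calculus step mentioned above (a textbook bound, not a formula taken from the paper): if a slice's traffic is constrained by a token-bucket arrival curve \alpha(t) = \sigma + \rho t and the slice is guaranteed a rate-latency service curve \beta(t) = R (t - T)^+ with R \ge \rho, then its delay and backlog are bounded by

D \le T + \frac{\sigma}{R}, \qquad B \le \sigma + \rho T

and end-to-end bounds follow by (min-plus) convolving the per-hop service curves along the slice's path.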

Journal ArticleDOI
TL;DR: An overview of the main research works in the field of SDN satellite networks and some open challenges are described in light of the network slicing concept by 5G virtualization, along with a possible roadmap including different network virtualization levels.
Abstract: The envisioned 5G ecosystem will be composed of heterogeneous networks based on different technologies and communication means, including satellite communication networks. The latter can help increase the capabilities of terrestrial networks, especially in terms of higher coverage, reliability, and availability, contributing to the achievement of some of the 5G KPIs. However, technological changes are not immediate. Many current satellite communication networks are based on proprietary hardware, which hinders the integration with future 5G terrestrial networks as well as the adoption of new protocols and algorithms. On the other hand, the two main paradigms that are emerging in the networking scenario -- software defined networking (SDN) and network functions virtualization -- can change this perspective. In this respect, this article presents first an overview of the main research works in the field of SDN satellite networks in order to understand the already proposed solutions. Then some open challenges are described in light of the network slicing concept by 5G virtualization, along with a possible roadmap including different network virtualization levels. The remaining unsolved problems are related to the development and deployment of a complete integration of satellite components in the 5G ecosystem.

Journal ArticleDOI
01 Jun 2018
TL;DR: A new non-linear fuzzy optimization model for deriving crisp weights from fuzzy comparison matrices for network selection is presented, and the weights obtained are more consistent than those from existing optimization models.
Abstract: Highlights: The disadvantages of using the extent analysis method for network selection problems are discussed. Weights of network parameters are obtained by applying a non-linear fuzzy optimization model. The Consistency Index with the proposed model is better than with existing non-linear models. Parameterized utility functions are used to evaluate the utility values of network attributes. Results obtained for network selection with the MEW method are better than with the TOPSIS and SAW methods.

Next generation wireless networks will integrate various heterogeneous technologies like WLAN, WiMax, and cellular technologies to support multimedia services with higher bandwidth and guaranteed quality of service (QoS). In order to keep the mobile user always connected to the best wireless network in terms of QoS parameters and user preferences, an optimal network selection technique in heterogeneous networks is required. This paper proposes a novel fuzzy Analytic Hierarchy Process (AHP) based network selection scheme for heterogeneous wireless networks. Triangular fuzzy numbers are used to represent the elements in the comparison matrices for voice, video, and best effort applications. Deriving crisp weights from these fuzzy comparison matrices is a challenging task: when the extent analysis method is applied, irrational zero weights are obtained for some attributes, and as a result many important criteria are not considered in the decision making process. To overcome this problem, a new non-linear fuzzy optimization model for deriving crisp weights from fuzzy comparison matrices for network selection is presented. The weights obtained from this model are more consistent than those from existing optimization models. Also, parameterized utility functions are used to model the different Quality of Service (QoS) attributes (bandwidth, delay, jitter, bit error rate) and user preferences (cost) for three different types of applications. Finally, scores are calculated separately for each network by three MADM (Multiple Attribute Decision Making) methods: Simple Additive Weighting (SAW), TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution), and MEW (Multiplicative Exponential Weighting). Results show that the MEW method gives more appropriate scores with utility functions than the SAW and TOPSIS methods.
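
A compact sketch of the final scoring step (SAW, TOPSIS, MEW) applied to an already-normalized decision matrix with benefit-type attributes; the attribute values and weights below are made up, and the fuzzy-AHP weight derivation and utility-function modeling are omitted.

import numpy as np

# Rows: candidate networks (e.g., WLAN, WiMax, cellular); columns: normalized
# benefit-type attributes (e.g., bandwidth, 1/delay, 1/jitter, 1/BER, 1/cost).
R = np.array([[0.9, 0.6, 0.7, 0.8, 0.4],
              [0.6, 0.8, 0.8, 0.7, 0.6],
              [0.4, 0.9, 0.9, 0.9, 0.9]])
w = np.array([0.35, 0.25, 0.15, 0.15, 0.10])   # crisp weights from the fuzzy AHP step

saw = R @ w                                     # Simple Additive Weighting
mew = np.prod(R ** w, axis=1)                   # Multiplicative Exponential Weighting

# TOPSIS: relative closeness to the weighted ideal vs. anti-ideal solution.
V = R * w
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
topsis = d_minus / (d_plus + d_minus)

for name, score in zip(("SAW", "MEW", "TOPSIS"), (saw, mew, topsis)):
    print(name, "ranking (best first):", np.argsort(-score))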

Journal ArticleDOI
TL;DR: The architectures of two multi-satellite relay transmission systems, based on TDMA and NOMA, are introduced, with a focus on the performance evaluation and research challenges of the TDMA-based architecture to explore its system optimization approach.
Abstract: The demands of 5G mobile communications, such as higher throughput and lower latency in future communications, have promoted more advanced communication technologies and heterogeneous network integration. Satellite communication has recently been considered a key part of 5G for the benefits of meeting the availability and ubiquitous coverage requirements targeted by 5G. As a potential 5G technology, multi-satellite cooperative transmission systems have been studied, as they can provide the high throughput brought by virtual multiple-input multiple-output structures. In this article, we briefly review the concepts and techniques of multi-satellite cooperative transmission systems in 5G. Moreover, the architectures of two multi-satellite relay transmission systems based on TDMA and NOMA are introduced. In particular, we focus on the performance evaluation and research challenges of the TDMA-based architecture to explore its system optimization approach. Finally, to exploit the full potential of multi-satellite systems in 5G networks, future trends and challenges are discussed.