Showing papers in "IEEE Networking Letters" in 2022


Journal ArticleDOI
TL;DR: A novel traffic-entropy-learning based load prediction and management model that improves load distribution by minimizing the performance degradation caused by traffic prediction errors, significantly improving resource utilization while reducing energy consumption.
Abstract: This letter proposes a novel traffic-entropy-learning based load prediction and management model that improves load distribution by minimizing the performance degradation caused by traffic prediction errors. The entropy captures the variance arising from the periodic surges and plunges of the traffic and suggests acquiring a sufficient number of active physical machines (PMs) to render effective services. Experimental simulation and comparison of the proposed model with existing approaches reveal that it significantly improves resource utilization by up to 21.5% while reducing the number of active servers and the energy consumption by up to 26.5% and 11.7%, respectively.
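
The letter does not reproduce its algorithm here, but the core idea of using traffic entropy as a variability signal for capacity planning can be illustrated with a minimal sketch. The window size, scaling rule, and function names below are hypothetical illustrations, not the authors' model.

```python
import numpy as np

def traffic_entropy(samples, bins=10):
    """Normalized Shannon entropy of a traffic window (0 = flat, 1 = maximally variable)."""
    hist, _ = np.histogram(samples, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum() / np.log2(bins)

def required_pms(samples, capacity_per_pm, headroom=0.5):
    """Provision PMs for the mean load plus an entropy-scaled safety margin (illustrative rule)."""
    mean_load, peak_load = np.mean(samples), np.max(samples)
    margin = headroom * traffic_entropy(samples) * (peak_load - mean_load)
    return int(np.ceil((mean_load + margin) / capacity_per_pm))

# A bursty window with the same mean load requires more active PMs than a flat one.
flat = np.full(60, 100.0)
bursty = np.concatenate([np.full(30, 40.0), np.full(30, 160.0)])
print(required_pms(flat, capacity_per_pm=50), required_pms(bursty, capacity_per_pm=50))
```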

12 citations


Journal ArticleDOI
TL;DR: This work maximizes the sum rate by optimizing the phase matrix of the RIS subject to the users’ transmit power; semidefinite programming is used to relax the problem, while majorization-minimization is used to derive the closed-form phase shifts.
Abstract: A reconfigurable intelligent surface (RIS) is an electromagnetic surface with abundant low-power reflecting elements that can dynamically tune the wireless propagation environment by changing their phase shifts. Non-orthogonal multiple access (NOMA) can greatly improve the spectral efficiency of communication by differentiating users through power. To improve wireless system performance by integrating RIS into NOMA, we consider an uplink RIS-aided NOMA network with direct links, where multiple users communicate with the access point assisted by a RIS with multiple elements. We aim to maximize the sum rate by optimizing the phase matrix of the RIS subject to the users’ transmit power. Two algorithms are put forward to tackle the formulated non-convex problem. More precisely, semidefinite programming is used to relax the problem, while majorization-minimization is used to derive the closed-form phase shifts. Finally, the presented simulation results demonstrate the performance gains of the proposed schemes over the case without a RIS and show the benefits of the direct links.
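
The letter gives the full problem formulation; as background, the standard semidefinite relaxation (SDR) step commonly used for RIS phase optimization can be sketched as follows (a generic template, not the authors' exact objective):

\[
\max_{\mathbf{v}} \; \mathbf{v}^{H}\mathbf{R}\mathbf{v} \;\; \text{s.t.} \; |v_n| = 1
\quad\Longrightarrow\quad
\max_{\mathbf{V}\succeq 0} \; \operatorname{tr}(\mathbf{R}\mathbf{V}) \;\; \text{s.t.} \; [\mathbf{V}]_{n,n} = 1,\; \operatorname{rank}(\mathbf{V}) = 1,
\]

where \(\mathbf{v} = [e^{j\theta_1},\dots,e^{j\theta_N}]^{T}\) collects the RIS phase shifts and \(\mathbf{V} = \mathbf{v}\mathbf{v}^{H}\). Dropping the non-convex rank-one constraint yields a convex semidefinite program, and a feasible phase vector can then be recovered, e.g., by Gaussian randomization.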

11 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an evaluation of slotted ALOHA using game theory to capture the strategic choices of the nodes, considered as independent agents that attempt to obtain updates from a shared source, with collisions preventing them from getting a usable update.
Abstract: This letter presents an evaluation of slotted ALOHA using game theory to capture the strategic choices of the nodes, considered as independent agents that attempt to obtain updates from a shared source, with collisions preventing them from getting a usable update. Their objectives are to minimize the sum of the average age of information and a transmission cost term. The latter is an important addition to the model, shown to achieve better coordination among the nodes, so that, while the price of anarchy of the system is unbounded, a limited price of stability, approaching 1 for increasing cost, can be obtained.
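
As a companion to the game-theoretic analysis, the trade-off between age of information (AoI) and transmission cost in slotted ALOHA can be checked with a short simulation. This is a generic symmetric slotted-ALOHA model with illustrative parameters, not the letter's exact game.

```python
import random

def avg_objective(n_nodes, p, c, slots=200_000, seed=0):
    """Per-node objective: average AoI plus transmission cost c per attempted transmission."""
    rng = random.Random(seed)
    age = [0] * n_nodes
    age_sum = attempts = 0
    for _ in range(slots):
        txs = [i for i in range(n_nodes) if rng.random() < p]
        attempts += len(txs)
        age = [a + 1 for a in age]
        if len(txs) == 1:                 # a slot is useful only if exactly one node transmits
            age[txs[0]] = 0
        age_sum += sum(age)
    return age_sum / (slots * n_nodes) + c * attempts / (slots * n_nodes)

# Sweeping the transmit probability p shows how the cost term discourages aggressive access.
for p in (0.1, 0.2, 0.5):
    print(p, round(avg_objective(n_nodes=5, p=p, c=2.0), 2))
```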

9 citations


Journal ArticleDOI
TL;DR: In this article, a malicious user prediction model based on quantum machine learning is proposed to estimate the malicious entities present in the communication system before data are allocated in distributed environments.
Abstract: This letter proposes a novel malicious user prediction model based on quantum machine learning that estimates the malicious entities present in the communication system before data are allocated in distributed environments. The proposed model scrutinizes the behavior of each user and estimates probable data breaches using a developed malicious user predictor unit. The model computes essential scores associated with each user request for the learning process of the prediction unit by generating training samples. The predictor unit exploits the computational and behavioral properties of qubits and quantum gates to predict malicious users with high precision and to grant access to non-malicious data requests only. The experimental evaluation and comparison of the proposed model with state-of-the-art methods reveal that it improves the security of the system by up to 33.28%.

7 citations


Journal ArticleDOI
TL;DR: An effective and novel IoT node authentication approach using Mahalanobis distance correlation and chi-square distribution theory is introduced, which requires lower computational time to determine node authenticity.
Abstract: Wireless Internet of Things (IoT) node authentication approaches have also used radio frequency (RF) fingerprinting or physical unclonable functions (PUF) of IoT devices for node authentication, and machine learning based models play a vital role in these approaches. In this letter, we introduce an effective and novel IoT node authentication approach using Mahalanobis distance correlation and chi-square distribution theory. Furthermore, it requires lower computational time to determine node authenticity. The comparative results of our proposal against three recent machine learning based approaches and PUF based approaches are promising, which validates the effectiveness and the novelty of our proposal.
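
A minimal sketch of the underlying statistical test (not the authors' complete approach): an enrolled device is described by the mean and covariance of its RF features, and a new observation is accepted if its squared Mahalanobis distance stays below a chi-square quantile.

```python
import numpy as np
from scipy.stats import chi2

def enroll(features):
    """Fit a device fingerprint: mean and inverse covariance of enrollment RF features."""
    mu = features.mean(axis=0)
    return mu, np.linalg.inv(np.cov(features, rowvar=False))

def authenticate(x, mu, cov_inv, alpha=0.01):
    """Accept if the squared Mahalanobis distance is below the chi-square critical value."""
    d2 = float((x - mu) @ cov_inv @ (x - mu))
    return d2 <= chi2.ppf(1 - alpha, df=len(mu))   # d2 is approximately chi2(k) for the true node

rng = np.random.default_rng(1)
mu, cov_inv = enroll(rng.normal(0.0, 1.0, size=(500, 4)))       # enrollment of the true node
print(authenticate(rng.normal(0.0, 1.0, size=4), mu, cov_inv))  # legitimate sample: expect True
print(authenticate(rng.normal(3.0, 1.0, size=4), mu, cov_inv))  # impostor sample: expect False
```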

6 citations


Journal ArticleDOI
TL;DR: Experiments conducted on the NSL-KDD dataset show that the solution is able to accurately detect new attacks encountered during testing, while its overall performance is comparable to numerous state-of-the-art works from the cybersecurity literature.
Abstract: In this letter, we present a two-stage pipeline for robust network intrusion detection. First, we implement an extreme gradient boosting (XGBoost) model to perform supervised intrusion detection, and leverage the SHapley Additive exPlanation (SHAP) framework to devise explanations of our model. In the second stage, we use these explanations to train an auto-encoder to distinguish between previously seen and unseen attacks. Experiments conducted on the NSL-KDD dataset show that our solution is able to accurately detect new attacks encountered during testing, while its overall performance is comparable to numerous state-of-the-art works from the cybersecurity literature.
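
A compact sketch of the two-stage idea (supervised detector, SHAP explanations, then an auto-encoder over the explanations). The library calls are standard xgboost/shap/scikit-learn, but the hyper-parameters, the binary labeling, and the novelty threshold are placeholders rather than the letter's configuration.

```python
import numpy as np
import shap
from xgboost import XGBClassifier
from sklearn.neural_network import MLPRegressor

def fit_pipeline(X_train, y_train):
    # Stage 1: supervised intrusion detector (binary attack/normal here) plus SHAP explainer.
    clf = XGBClassifier(n_estimators=200, max_depth=6, eval_metric="logloss")
    clf.fit(X_train, y_train)
    explainer = shap.TreeExplainer(clf)
    shap_train = explainer.shap_values(X_train)
    # Stage 2: auto-encoder trained to reconstruct the SHAP vectors of known traffic.
    ae = MLPRegressor(hidden_layer_sizes=(16, 8, 16), max_iter=500)
    ae.fit(shap_train, shap_train)
    err = np.mean((ae.predict(shap_train) - shap_train) ** 2, axis=1)
    threshold = np.percentile(err, 99)              # placeholder novelty threshold
    return clf, explainer, ae, threshold

def looks_unseen(x, explainer, ae, threshold):
    """Large reconstruction error on the SHAP vector suggests a previously unseen attack."""
    s = explainer.shap_values(x.reshape(1, -1))
    return float(np.mean((ae.predict(s) - s) ** 2)) > threshold
```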

5 citations


Journal ArticleDOI
TL;DR: Under the uncertainty of BFL, this work proposes to use deep reinforcement learning to find the optimal decisions for the machine learning model owner (MLMO) to minimize the system latency and mining cost while achieving the target accuracy.
Abstract: Blockchain-enabled Federated Learning (BFL) enables model updates to be stored in blockchain in a reliable manner. However, one problem is the increase of the training latency due to the mining process. Moreover, mobile devices have energy and CPU constraints. Therefore, the machine learning model owner (MLMO) needs to decide the data and energy that the mobile devices use for the training and determine the block generation rate to minimize the system latency and mining cost while achieving the target accuracy. Under the uncertainty of BFL, we propose to use deep reinforcement learning to find the optimal decisions for the MLMO.

4 citations


Journal ArticleDOI
TL;DR: This letter considers different summary statistics, i.e., different functions of the state, which can represent the useful information for a monitoring process, particularly in safety and industrial applications, and proposes policies that minimize the estimation error.
Abstract: The optimization of Value of Information (VoI) in sensor networks integrates awareness of the measured process in the communication system. However, most existing scheduling algorithms do not consider the specific needs of monitoring applications, but define VoI as a generic Mean Square Error (MSE) of the whole system state regardless of the relevance of individual components. In this letter, we consider different summary statistics, i.e., different functions of the state, which can represent the useful information for a monitoring process, particularly in safety and industrial applications. We propose policies that minimize the estimation error for different summary statistics, showing significant gains by simulation.
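
A one-line illustration of why tracking a summary statistic differs from tracking the full state (a generic estimation fact, not the letter's derivation): for a linear statistic \( s = \mathbf{w}^{T}\mathbf{x} \) estimated as \( \hat{s} = \mathbf{w}^{T}\hat{\mathbf{x}} \) with error covariance \( \mathbf{P} \),

\[
\mathbb{E}\big[(\hat{s}-s)^{2}\big] = \mathbf{w}^{T}\mathbf{P}\,\mathbf{w} \;\neq\; \operatorname{tr}(\mathbf{P}),
\]

so a scheduler that minimizes \( \mathbf{w}^{T}\mathbf{P}\mathbf{w} \) will generally poll different sensors than one minimizing the full-state MSE \( \operatorname{tr}(\mathbf{P}) \).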

4 citations


Journal ArticleDOI
TL;DR: In this article, an analytical model is presented to investigate the performance of UORA in the transient condition under bursty arrivals with an arbitrary distribution, and the key performance metrics of access success probability, average access delay, cumulative distribution function (CDF) of the number of transmissions, and utilization of random access resource units (RA-RUs) are derived.
Abstract: IEEE 802.11ax defines a new channel access scheme, OFDMA-based random access (UORA), that allows a varying number of associated stations (STAs) to send their requests or data to the access point (AP) in a contention-based manner. This letter presents an analytical model to investigate the performance of UORA in the transient condition under bursty arrivals with an arbitrary distribution. The key performance metrics of access success probability, average access delay, cumulative distribution function (CDF) of the number of transmissions, and utilization of random access resource units (RA-RUs) are derived. Simulation results verify the accuracy of the proposed analytical model.
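
The letter's contribution is the analytical transient model; as a sanity-check companion, the basic UORA contention step (OBO counters decremented by the number of RA-RUs, random RU choice once the counter expires) can be simulated in a few lines. The parameters below are illustrative, and contention-window doubling is omitted.

```python
import random

def uora_sim(n_stas, n_rarus, ocw=7, tfs=5, seed=0):
    """Bursty arrival: all STAs start contending at once; returns successes per trigger frame."""
    rng = random.Random(seed)
    obo = [rng.randint(0, ocw) for _ in range(n_stas)]
    pending = set(range(n_stas))
    per_tf = []
    for _ in range(tfs):
        picks = {}
        for s in pending:
            obo[s] -= n_rarus                      # OBO decremented by the number of eligible RA-RUs
            if obo[s] <= 0:
                picks.setdefault(rng.randrange(n_rarus), []).append(s)
        winners = {v[0] for v in picks.values() if len(v) == 1}
        pending -= winners                         # successful STAs stop contending
        for stas in picks.values():
            if len(stas) > 1:
                for s in stas:
                    obo[s] = rng.randint(0, ocw)   # collided STAs redraw their OBO
        per_tf.append(len(winners))
    return per_tf

print(uora_sim(n_stas=20, n_rarus=8))
```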

3 citations


Journal ArticleDOI
TL;DR: In this paper, an ML-based Attack Centric Method (ACM) is introduced to evaluate the APT detection performance on the generated dataset, and it is shown to outperform the baseline approaches with a maximum macro-average F1 score of 82.27%, corresponding to a 9.4% improvement over the baseline performance.
Abstract: In order to define a benchmark for Machine Learning (ML)-based Advanced Persistent Threat (APT) detection in network traffic, this letter presents SCVIC-APT-2021, a new dataset that can realistically represent the contemporary network architecture and APT characteristics. Building on this, an ML-based Attack Centric Method (ACM) is introduced to evaluate the APT detection performance on the generated dataset. Furthermore, ACM is shown to outperform the baseline approaches with a maximum macro-average F1 score of 82.27%, corresponding to a 9.4% improvement over the baseline performance.

3 citations


Journal ArticleDOI
TL;DR: In this article, the authors propose a reinforcement-learning-based ε-greedy algorithm to solve the max-min optimization problem for UAV deployment and power allocation in emergency communication services, achieving a minimum sum-rate around 2.3 bps/Hz higher than the conventional water-filling algorithm.
Abstract: Unmanned aerial vehicles (UAVs) acting as Aerial Base Stations (ABSs) are an enabler for the provisioning of emergency communication services. However, unplanned ABS deployment creates interference from the neighboring co-channel base station, which hinders meeting the required quality-of-service (QoS) requirements and the minimum rate of users. Hence, ABS deployment and its power allocation require a machine-learning-based solution that can plan in real time to enhance the users’ max-min sum-rate. We propose a reinforcement-learning-based ε-greedy algorithm to solve the max-min optimization problem. The simulation results validate the proposal by achieving a minimum sum-rate around 2.3 bps/Hz higher than the conventional water-filling algorithm at the same ABS altitude.
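
A toy ε-greedy sketch of the decision loop, treating candidate ABS (altitude, power) pairs as arms of a bandit; the candidate set, rate values, and reward noise are stand-ins, since the letter's channel and interference model is not reproduced here.

```python
import random

def eps_greedy(actions, reward_fn, episodes=2000, eps=0.1, seed=0):
    """Learn the (placement, power) action that maximizes the observed min-user-rate reward."""
    rng = random.Random(seed)
    q = {a: 0.0 for a in actions}
    n = {a: 0 for a in actions}
    for _ in range(episodes):
        a = rng.choice(actions) if rng.random() < eps else max(q, key=q.get)
        r = reward_fn(a, rng)
        n[a] += 1
        q[a] += (r - q[a]) / n[a]                 # incremental mean update
    return max(q, key=q.get)

# Hypothetical noisy minimum-user-rate reward for each candidate (altitude in m, power in W).
candidates = [(60, 1.0), (60, 2.0), (100, 1.0), (100, 2.0)]
true_min_rate = {(60, 1.0): 1.1, (60, 2.0): 1.6, (100, 1.0): 1.4, (100, 2.0): 2.3}
reward = lambda a, rng: true_min_rate[a] + rng.gauss(0, 0.2)
print(eps_greedy(candidates, reward))             # expected to converge to (100, 2.0)
```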

Journal ArticleDOI
TL;DR: A programmability framework is envisaged, with the major objective of supporting the development of 5G-enabled vertical applications, as well as their testing, prior to interacting with commercial networks.
Abstract: The 5th Generation (5G) of mobile networks brings the concept of openness to vertical industries by enabling new levels of programmability in the network core and edge domains. Openness is enabled through APIs, and it creates the field for developing 5G-enabled vertical applications, i.e., applications that can interact with the underlay network. In this context, a programmability framework is envisaged, with the major objective of supporting the development of 5G-enabled vertical applications, as well as their testing, prior to interacting with commercial networks. Implementation directions for the proposed framework are also provided, leading to a set of development- and research-oriented takeaways.

Journal ArticleDOI
TL;DR: A secure, energy-efficient game-theoretical model is devised, wherein the probability with which regular users should sense the channel to mitigate the effect of malicious users is determined.
Abstract: Energy consumption in Cooperative Spectrum Sensing can be reduced by allowing users to sense the channel at irregular intervals and utilize the information from peer users over other time instants. However, the network becomes insecure if peer users send incorrect sensing results to a secondary user over the time instants at which it did not sense the channel. To overcome this problem, we propose a secure, energy-efficient game-theoretical model, wherein the probability with which regular users should sense the channel to mitigate the effect of malicious users is devised. Extensive simulations indicate feasible performance by our proposed model.

Journal ArticleDOI
TL;DR: In this article, the authors propose a novel consensus algorithm using Proof of Majority (PoM) to increase decentralization and to eliminate resource-intensive tasks, leading to a reduced carbon footprint.
Abstract: The popularity of Blockchain is rising on account of its far-reaching applications in diverse industries. However, blockchain has recently seen a rise of energy-intensive mining pools, which is leading to centralization and contradicting the basic blockchain tenet of decentralization. This letter proposes a novel consensus algorithm using Proof of Majority (PoM) to increase decentralization and to eliminate resource-intensive tasks, leading to a reduced carbon footprint. We have evaluated the proposed algorithm in terms of latency and throughput. The proposed consensus algorithm outperforms popular existing consensus algorithms.

Journal ArticleDOI
TL;DR: A global aeronautical traffic demand map is generated, and the graph-based handover (GBH) framework and the multi-attribute decision making (MADM) algorithm are utilized jointly to optimize the overall throughput.
Abstract: With the development of the space-air-ground integrated network (SAGIN), the Low Earth Orbit Satellite Communication (LEO SatCom) network has great potential to provide high-rate communication services for aeronautical traffic. To cope with the frequent handovers between aircraft and LEO satellites, we generate a global aeronautical traffic demand map and utilize the graph-based handover (GBH) framework and the multi-attribute decision making (MADM) algorithm jointly to optimize the overall throughput. Moreover, we further propose a new parameter, called channel reservation order (CRO), for the MADM algorithm. Simulation results indicate that our strategy improves performance in terms of overall throughput and access failure probability.

Journal ArticleDOI
TL;DR: This letter proposes a novel SFC provisioning scheme for fog paradigms that achieves load balancing (LB) and redistribution between heavily- and lightly-loaded nodes, without exceeding delay bounds.
Abstract: Network function virtualization (NFV) requires service function chain (SFC) provisioning on cloud or fog servers. Fog nodes offer short provisioning times compared to the cloud, albeit with limited resources. This makes fog nodes vulnerable to saturation if users demand computation-intensive services, where offloading to the cloud is prohibited due to the prolonged delays. Hence, this letter proposes a novel SFC provisioning scheme for fog paradigms that achieves load balancing (LB) and redistribution between heavily- and lightly-loaded nodes, without exceeding delay bounds. The scheme features high admission rates and reduced delays versus prominent solutions such as standalone delay or load minimization methods.

Journal ArticleDOI
TL;DR: In this article, a machine learning enabled link adaptation and scheduling framework is presented for the Industrial Internet of Things (IIoT), leveraging the quasi-periodicity of traffic in IIoT.
Abstract: A machine learning enabled link adaptation (LA) and scheduling framework is presented for the Industrial Internet of Things (IIoT), leveraging the quasi-periodicity of traffic in IIoT. The following steps are introduced: i) a reduced-complexity link establishment accounting jointly for beamforming and load management; ii) interference prediction using long short-term memory neural networks; iii) semi-coordinated scheduling based on node grouping for interference avoidance. Through numerical evaluation, it is demonstrated that the proposed approach can improve the average spectral efficiency by as much as 62% in a realistic IIoT scenario at negligible overhead.
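
Step ii), interference prediction with an LSTM, can be sketched with a small Keras model. The window length, layer sizes, and the synthetic quasi-periodic trace below are illustrative, not the letter's setup.

```python
import numpy as np
import tensorflow as tf

def make_windows(series, window=32):
    """Turn a 1-D interference trace into (window -> next sample) training pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    return X[..., np.newaxis], series[window:]

# Synthetic quasi-periodic trace standing in for measured IIoT interference power.
t = np.arange(5000)
trace = np.sin(2 * np.pi * t / 50) + 0.1 * np.random.randn(len(t))
X, y = make_windows(trace)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(X.shape[1], 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=64, verbose=0)
print(float(model.evaluate(X, y, verbose=0)))      # MSE of the one-step interference predictor
```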

Journal ArticleDOI
TL;DR: In this article, the authors present an experimental evaluation of end-to-end delay in Sigfox networks, considering measurements from the transmitter to the server, and empirically indicate performance limitations that application developers should consider.
Abstract: This letter presents an experimental evaluation of end-to-end delay in Sigfox networks, considering measurements from the transmitter to the server. We carried out two types of experiments. The first was a static experiment, with a typical delivery time ranging between 2.5 and 4.5 seconds and a 100% delivery rate. The second considers mobility, with the experiments carried out in mostly rural environments in which the transmitter is inside a car. In this case, we observed a delivery rate of 20% and an end-to-end delay of up to eight seconds. Our evaluation empirically indicates performance limitations that application developers should consider.

Journal ArticleDOI
TL;DR: This letter provides explicit formulas describing how the minimum safe inter-vehicle distance (IVD), for avoiding rear-end collision, can be shortened with the use of decentralized environmental notification messages (DENMs).
Abstract: This letter provides a safety analysis for emergency braking scenarios involving consecutive vehicles. The vehicles use adaptive cruise control (ACC) with a constant-distance policy together with additional vehicle-to-vehicle (V2V) communication for emergency braking. We provide explicit formulas describing how the minimum safe inter-vehicle distance (IVD), for avoiding rear-end collision, can be shortened with the use of decentralized environmental notification messages (DENMs). More precisely, those formulas describe the dependency of such IVDs on V2V communication delay. We further show how these results can be used to compute probabilities of safe braking in the presence of packet losses.
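
The letter's explicit formulas are not reproduced here, but a standard constant-deceleration bound shows how the DENM delay enters the minimum safe IVD. If both vehicles travel at speed \(v\), the leader brakes at deceleration \(a_l\), and the follower starts braking at \(a_f \le a_l\) after a total delay \(\tau\) (reaction plus DENM delivery), rear-end collision is avoided whenever

\[
d_{\mathrm{IVD}} \;\geq\; v\,\tau + \frac{v^{2}}{2 a_{f}} - \frac{v^{2}}{2 a_{l}},
\]

since with \(a_f \le a_l\) the gap shrinks monotonically until both vehicles stop. Every reduction of the communication delay \(\tau\) therefore shortens the required inter-vehicle distance by \(v\) times that amount.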

Journal ArticleDOI
TL;DR: Through Monte-Carlo simulations, it is demonstrated that the proposed buffer-aware approach significantly outperforms the conventional schemes with prefixed antenna allocation considering two benchmarks operating based on the receive-antenna-selection (RAS)/transmit antenna selection (TAS) and maximum-ratio-combining (MRC)/TAS.
Abstract: In-band full-duplex buffer-aided relaying is a spectrally efficient technology for extending the range of reliable communication. In this context, we introduce a novel joint buffer-aware rate allocation and antenna selection algorithm to schedule the network transmissions. The closed-form expression for the average throughput of the network is derived under a Nakagami-m fading environment. Through Monte-Carlo simulations, we demonstrate that our proposed buffer-aware approach significantly outperforms conventional schemes with prefixed antenna allocation, considering two benchmarks operating based on receive antenna selection (RAS)/transmit antenna selection (TAS) and maximum ratio combining (MRC)/TAS. The overlapping results in the presented simulation examples corroborate our theoretical analysis.

Journal ArticleDOI
TL;DR: In this paper, a power-optimized NOMA with TXOP-tuning based channel access scheme is proposed for IEEE 802.11ah dense IoT networks to reduce the number of contending nodes in a group.
Abstract: In this letter, we propose a power-optimized NOMA with TXOP-tuning based novel channel access scheme for IEEE 802.11ah dense IoT networks. Using the proposed method, we form NOMA clusters to reduce the number of contending nodes in a group. Furthermore, the proposed scheme enables nodes in a NOMA cluster to transmit simultaneously using power-domain NOMA, as well as multiple frames based on the TXOP limit in a single resource block. With an analytical model, we evaluate the throughput and connectivity performance. From the analytical and simulation results, it is evident that the proposed scheme significantly improves the performance of NOMA-IoT networks.
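
As background for the power-domain part (standard uplink NOMA with successive interference cancellation, not the letter's specific power optimization): for two nodes in a cluster with received powers \(P_1 g_1 > P_2 g_2\) at the AP and noise power \(N_0\), the achievable rates are

\[
R_{1} = \log_{2}\!\Big(1 + \frac{P_{1}g_{1}}{P_{2}g_{2} + N_{0}}\Big),
\qquad
R_{2} = \log_{2}\!\Big(1 + \frac{P_{2}g_{2}}{N_{0}}\Big),
\]

where the AP decodes the stronger node first, treating the weaker one as interference, then cancels it and decodes the weaker node interference-free.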

Journal ArticleDOI
TL;DR: A generalized probabilistic routing model is devised to analyze and estimate the average buffer occupancy, without message exchange, and the buffer overflow probability is estimated using the Chernoff bound.
Abstract: The multi-copy routing schemes in a mobile opportunistic network (MON) lead to buffer congestion, thereby affecting the network performance. This calls for a proactive buffer management policy for congestion mitigation. Existing works on buffer management in MONs require additional message exchange and are reactive in nature. In this letter, we devise a generalized probabilistic routing model to analyze and estimate the average buffer occupancy without message exchange. We then estimate the buffer overflow probability using the Chernoff bound. The accuracy of the theoretical model is validated through simulation using a synthetic mobility model and a real-life mobility trace.
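
The Chernoff step follows the generic form of the bound: for estimated buffer occupancy \(Q\) and buffer size \(B\),

\[
\Pr\{Q \geq B\} \;\leq\; \min_{s>0}\, e^{-sB}\,\mathbb{E}\!\left[e^{sQ}\right],
\]

so the overflow probability can be bounded once the moment generating function of the analytically estimated occupancy is available, without any additional message exchange.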

Journal ArticleDOI
TL;DR: In this article, a privacy-preserving framework based on the DENT project, named PP-DENT, is proposed, which enables privacy for eSIM users and conditional privacy for governments.
Abstract: Blockchain technology has attracted many developers in various fields to launch their projects. DENT Wireless, a well-known telecom company, has developed the DENT project to provide blockchain-based eSIM and exchange. This exciting project has attracted many eSIM users. However, DENT cannot support data privacy if requested by the users. This letter offers a privacy-preserving framework based on the DENT project, named PP-DENT, striking a balance between user protection and regulatory policies. PP-DENT enables privacy for eSIM users and conditional privacy for governments.

Journal ArticleDOI
TL;DR: This letter analyzes the performance of the TCP CUBIC and TCP BBR protocols in the presence of background traffic and indicates that the BBR protocol produces higher throughput and fairness than the CUBIC protocol.
Abstract: This letter analyzes the performance of the TCP CUBIC and TCP BBR protocols in the presence of background traffic. The analysis is performed via emulation using actual TCP implementations and considering high capacity end-to-end data connections and different connection durations (e.g., mouse and elephant flows). The results indicate that the BBR protocol produces higher throughput and fairness than the CUBIC protocol.
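
Fairness in such emulation studies is commonly reported with Jain's index; a minimal helper is shown below (the index itself is standard, the sample throughput values are made up).

```python
def jain_index(throughputs):
    """Jain's fairness index: 1.0 means perfectly equal shares; 1/n is the worst case."""
    n = len(throughputs)
    return sum(throughputs) ** 2 / (n * sum(x * x for x in throughputs))

print(jain_index([95, 105]))   # near-equal flow shares -> close to 1.0
print(jain_index([180, 20]))   # one flow starving the other -> much lower
```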

Journal ArticleDOI
Zhanwei Yu, Yi Zhao, Tao Deng, Lei Yu, Di Yuan 
TL;DR: In this paper, an integer linear programming (ILP) model is proposed to minimize the total carbon footprint in edge computing systems, and the problem can be cast as a minimum-cost flow problem.
Abstract: In spite of the state of the art, significantly reducing the carbon footprint (CF) of communications systems remains urgent. We address this challenge in the context of edge computing. The carbon intensity of the electricity supply varies largely both spatially and temporally. This, together with energy sharing via a battery management system (BMS), justifies the potential of CF-oriented task offloading, by redistributing the computational tasks in time and space. In this paper, we consider optimal task scheduling and offloading, as well as battery charging, to minimize the total CF. We formulate this CF minimization problem as an integer linear programming model. However, we demonstrate that, via a graph-based reformulation, the problem can be cast as a minimum-cost flow problem. This finding reveals that the global optimum can be obtained in polynomial time. Numerical results using real-world data show that optimization can reduce the total CF by up to 83.3%.
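
A toy illustration of the reformulation idea, with tasks as supply and (site, time-slot) nodes as carbon-priced capacity, using networkx. The graph, capacities, and carbon intensities are invented for the example and do not reflect the letter's model.

```python
import networkx as nx

# 10 units of computation must be scheduled; each (site, slot) offers capacity at a
# weight proportional to its carbon intensity (scaled to integers).
G = nx.DiGraph()
G.add_node("tasks", demand=-10)
G.add_node("done", demand=10)
slots = {("edge_A", "t1"): (6, 300), ("edge_A", "t2"): (6, 120),
         ("edge_B", "t1"): (6, 80),  ("edge_B", "t2"): (6, 250)}
for (site, slot), (cap, carbon) in slots.items():
    node = f"{site}@{slot}"
    G.add_edge("tasks", node, capacity=cap, weight=0)
    G.add_edge(node, "done", capacity=cap, weight=carbon)

flow = nx.min_cost_flow(G)                      # global optimum in polynomial time
print(nx.cost_of_flow(G, flow))                 # total carbon cost of the schedule
print({u: v["done"] for u, v in flow.items() if "@" in u})   # units placed per (site, slot)
```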

Journal ArticleDOI
TL;DR: This work proposes a cooperative and demand-aware caching strategy, which is modelled using the Separable Assignment Problem, to maximize the cache hit ratio and shows that the proposed strategy outperforms existing caching policies.
Abstract: Today, billions of smart devices are interconnected via wireless networks, leading to large volumes of video contents circulating through the bandwidth-limited backhaul. This causes network performance to deteriorate. As a mitigation mechanism, caching of highly popular contents to network edges is deployed. We propose a cooperative and demand-aware caching strategy, which is modelled using the Separable Assignment Problem, to maximize the cache hit ratio. This problem is solved with a recursive enumeration method, where dynamic programming is used to fill each edge. The extensive application-level evaluations show that the proposed strategy outperforms existing caching policies.
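
The per-edge filling step resembles a 0/1 knapsack: choose the subset of contents that maximizes expected cache hits under the capacity constraint. A small dynamic-programming sketch is given below; the content sizes and popularities are made up, and the letter's recursive enumeration across cooperating edges is not reproduced.

```python
def fill_edge_cache(contents, capacity):
    """0/1 knapsack DP over (name, size, expected_hits) tuples; returns (hits, cached subset)."""
    best = [(0.0, []) for _ in range(capacity + 1)]
    for name, size, hits in contents:
        for c in range(capacity, size - 1, -1):   # iterate capacity downwards for 0/1 semantics
            prev_hits, prev_set = best[c - size]
            if prev_hits + hits > best[c][0]:
                best[c] = (prev_hits + hits, prev_set + [name])
    return best[capacity]

catalog = [("vid_a", 4, 120.0), ("vid_b", 3, 90.0), ("vid_c", 2, 70.0), ("vid_d", 5, 95.0)]
print(fill_edge_cache(catalog, capacity=7))       # -> (210.0, ['vid_a', 'vid_b'])
```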