Showing papers in "IEEE Access" in 2019
[...]
TL;DR: In this paper, the authors provide a detailed overview and historical perspective on state-of-the-art solutions, and elaborate on the fundamental differences with other technologies, the most important open research issues to tackle, and the reasons why the use of reconfigurable intelligent surfaces necessitates rethinking the communication-theoretic models currently employed in wireless networks.
Abstract: The future of mobile communications looks exciting with the potential new use cases and challenging requirements of future 6th generation (6G) and beyond wireless networks. Since the beginning of the modern era of wireless communications, the propagation medium has been perceived as a randomly behaving entity between the transmitter and the receiver, which degrades the quality of the received signal due to the uncontrollable interactions of the transmitted radio waves with the surrounding objects. The recent advent of reconfigurable intelligent surfaces in wireless communications, on the other hand, enables network operators to control the scattering, reflection, and refraction characteristics of the radio waves, thereby overcoming the negative effects of natural wireless propagation. Recent results have revealed that reconfigurable intelligent surfaces can effectively control the wavefront, e.g., the phase, amplitude, frequency, and even polarization, of the impinging signals without the need for complex decoding, encoding, and radio frequency processing operations. Motivated by the potential of this emerging technology, the present article aims to provide readers with a detailed overview and historical perspective on state-of-the-art solutions, and to elaborate on the fundamental differences with other technologies, the most important open research issues to tackle, and the reasons why the use of reconfigurable intelligent surfaces necessitates rethinking the communication-theoretic models currently employed in wireless networks. This article also explores theoretical performance limits of reconfigurable intelligent surface-assisted communication systems using mathematical techniques and elaborates on the potential use cases of intelligent surfaces in 6G and beyond wireless networks.
885 citations
[...]
TL;DR: This paper offers the first in-depth look at the vast range of applications for THz wireless products and provides approaches for reducing power and increasing performance across several problem domains, giving early evidence that THz techniques are compelling and available for future wireless communications.
Abstract: Frequencies from 100 GHz to 3 THz are promising bands for the next generation of wireless communication systems because of the wide swaths of unused and unexplored spectrum. These frequencies also offer the potential for revolutionary applications that will be made possible by new thinking, and advances in devices, circuits, software, signal processing, and systems. This paper describes many of the technical challenges and opportunities for wireless communication and sensing applications above 100 GHz, and presents a number of promising discoveries, novel approaches, and recent results that will aid in the development and implementation of the sixth generation (6G) of wireless networks, and beyond. This paper shows recent regulatory and standard body rulings that anticipate wireless products and services above 100 GHz and illustrates the viability of wireless cognition, hyper-accurate position location, sensing, and imaging. This paper also presents approaches and results that show how long-distance mobile communications will be supported to above 800 GHz, since the antenna gains are able to overcome air-induced attenuation, and presents methods that reduce the computational complexity and simplify the signal processing used in adaptive antenna arrays by exploiting the Special Theory of Relativity to create a cone of silence in over-sampled antenna arrays, improving performance for digital phased array antennas. Also, new results that give insights into power-efficient beam-steering algorithms and new propagation and partition loss models above 100 GHz are given, and promising imaging, array processing, and position location results are presented. The implementation of spatial consistency at THz frequencies, an important component of channel modeling that considers minute changes and correlations over space, is also discussed. This paper offers the first in-depth look at the vast range of applications for THz wireless products and provides approaches for reducing power and increasing performance across several problem domains, giving early evidence that THz techniques are compelling and available for future wireless communications.
600 citations
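The abstract's claim that antenna gains can overcome air-induced attenuation is, at bottom, a link-budget statement. A standard textbook form (not an equation taken from the paper) makes the trade-off explicit:

$$ P_r = P_t + G_t + G_r - 20\log_{10}\!\left(\frac{4\pi d}{\lambda}\right) - \alpha_{\mathrm{atm}}\, d \qquad \text{[dB quantities]} $$

where $G_t$ and $G_r$ are the transmit and receive antenna gains in dBi, the fourth term is free-space path loss over distance $d$, and $\alpha_{\mathrm{atm}}$ (dB/km) is the frequency-dependent atmospheric attenuation. Since a fixed-aperture antenna's gain scales as $G = 4\pi A_e/\lambda^2$, i.e., grows with frequency squared, electrically large arrays at sub-THz frequencies can recover the growing path-loss and attenuation terms, which is what makes mobile links above 800 GHz plausible.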
[...]
TL;DR: This paper provides a systematic vision of the organization of the blockchain networks, a comprehensive survey of the emerging applications of blockchain networks in a broad area of telecommunication, and discusses several open issues in the protocol design for blockchain consensus.
Abstract: The past decade has witnessed a rapid evolution in blockchain technologies, which has attracted tremendous interest from both research communities and industry. The blockchain network originated in the Internet financial sector as a decentralized, immutable ledger system for ordering transactional data. Nowadays, it is envisioned as a powerful backbone/framework for decentralized data processing and data-driven self-organization in flat, open-access networks. In particular, the plausible characteristics of decentralization, immutability, and self-organization are primarily owing to the unique decentralized consensus mechanisms introduced by blockchain networks. This survey is motivated by the lack of a comprehensive literature review on the development of decentralized consensus mechanisms in blockchain networks. In this paper, we provide a systematic vision of the organization of blockchain networks. By emphasizing the unique characteristics of decentralized consensus in blockchain networks, our in-depth review of the state-of-the-art consensus protocols focuses on both the perspective of distributed consensus system design and the perspective of incentive mechanism design. From a game-theoretic point of view, we also provide a thorough review of the strategies adopted for self-organization by the individual nodes in the blockchain backbone networks. We then provide a comprehensive survey of the emerging applications of blockchain networks in a broad area of telecommunication. We highlight our special interest in how the consensus mechanisms impact these applications. Finally, we discuss several open issues in the protocol design for blockchain consensus and the related potential research directions.
385 citations
[...]
TL;DR: A highly scalable hybrid DNN framework called scale-hybrid-IDS-AlertNet is proposed, which can be used in real time to effectively monitor network traffic and host-level events and to proactively alert on possible cyberattacks.
Abstract: Machine learning techniques are being widely used to develop an intrusion detection system (IDS) for detecting and classifying cyberattacks at the network level and the host level in a timely and automatic manner. However, many challenges arise since malicious attacks are continually changing and occur in very large volumes, requiring a scalable solution. There are different malware datasets available publicly for further research by the cyber security community. However, no existing study has shown a detailed analysis of the performance of various machine learning algorithms across these publicly available datasets. Due to the dynamic nature of malware, with continuously changing attack methods, the publicly available malware datasets need to be updated systematically and benchmarked. In this paper, a deep neural network (DNN), a type of deep learning model, is explored to develop a flexible and effective IDS to detect and classify unforeseen and unpredictable cyberattacks. The continuous change in network behavior and the rapid evolution of attacks make it necessary to evaluate various datasets generated over the years through static and dynamic approaches. This type of study facilitates identifying the best algorithm for effectively detecting future cyberattacks. A comprehensive experimental evaluation of DNNs and other classical machine learning classifiers is presented on various publicly available benchmark malware datasets. The optimal network parameters and network topologies for the DNNs are chosen through hyperparameter selection experiments on the KDDCup 99 dataset. All DNN experiments are run for up to 1,000 epochs with the learning rate varying in the range [0.01, 0.5]. The DNN model that performed well on KDDCup 99 is applied to other datasets, such as NSL-KDD, UNSW-NB15, Kyoto, WSN-DS, and CICIDS 2017, to conduct the benchmark. Our DNN model learns the abstract and high-dimensional feature representation of the IDS data by passing it through many hidden layers. Through rigorous experimental testing, it is confirmed that DNNs perform well in comparison with the classical machine learning classifiers. Finally, we propose a highly scalable hybrid DNN framework called scale-hybrid-IDS-AlertNet, which can be used in real time to effectively monitor network traffic and host-level events and to proactively alert on possible cyberattacks.
372 citations
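As a concrete illustration of the kind of experiment the abstract describes, the sketch below trains a multi-hidden-layer DNN classifier and sweeps the learning rate over the stated [0.01, 0.5] range with up to 1,000 training epochs. This is a minimal sketch, not the authors' architecture: the feature/label file names, layer widths, and the scikit-learn MLP stand-in are all assumptions.

```python
# Hedged sketch of a DNN-based IDS hyperparameter sweep; file names are placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Hypothetical pre-extracted KDDCup-style features and labels.
X, y = np.load("kdd_features.npy"), np.load("kdd_labels.npy")
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

best_acc, best_lr = 0.0, None
for lr in [0.01, 0.05, 0.1, 0.5]:                  # learning-rate range from the abstract
    clf = MLPClassifier(hidden_layer_sizes=(1024, 768, 512),  # "many hidden layers" (illustrative widths)
                        learning_rate_init=lr,
                        max_iter=1000,             # "run for up to 1,000 epochs"
                        early_stopping=True)
    clf.fit(X_train, y_train)
    acc = clf.score(X_test, y_test)
    if acc > best_acc:
        best_acc, best_lr = acc, lr
print(f"best lr={best_lr}, test accuracy={best_acc:.3f}")
```

The same fitted model would then be re-evaluated on the other benchmark datasets (NSL-KDD, UNSW-NB15, etc.) to reproduce the cross-dataset benchmarking the abstract describes.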
[...]
TL;DR: A detailed review of the security-related challenges and sources of threat in the IoT applications is presented and four different technologies, blockchain, fog computing, edge computing, and machine learning, to increase the level of security in IoT are discussed.
Abstract: The Internet of Things (IoT) is the next era of communication. Using the IoT, physical objects can be empowered to create, receive, and exchange data in a seamless manner. Various IoT applications focus on automating different tasks and try to empower inanimate physical objects to act without any human intervention. The existing and upcoming IoT applications are highly promising for increasing the level of comfort, efficiency, and automation for users. Implementing such a world in an ever-growing fashion requires high security, privacy, authentication, and recovery from attacks. In this regard, it is imperative to make the required changes in the architecture of IoT applications to achieve end-to-end secure IoT environments. In this paper, a detailed review of the security-related challenges and sources of threat in IoT applications is presented. After discussing the security issues, various emerging and existing technologies focused on achieving a high degree of trust in IoT applications are discussed. Four different technologies for increasing the level of security in the IoT are discussed: blockchain, fog computing, edge computing, and machine learning.
371 citations
[...]
TL;DR: This paper reviews several optimization methods that improve the accuracy of training and reduce training time, and delves into the math behind training algorithms used in recent deep networks.
Abstract: Deep learning (DL) is playing an increasingly important role in our lives. It has already made a huge impact in areas such as cancer diagnosis, precision medicine, self-driving cars, predictive forecasting, and speech recognition. The painstakingly handcrafted feature extractors used in traditional learning, classification, and pattern recognition systems are not scalable for large-sized data sets. In many cases, depending on the problem complexity, DL can also overcome the limitations of earlier shallow networks that prevented efficient training and abstraction of hierarchical representations of multi-dimensional training data. A deep neural network (DNN) uses multiple (deep) layers of units with highly optimized algorithms and architectures. This paper reviews several optimization methods that improve the accuracy of training and reduce training time. We delve into the math behind training algorithms used in recent deep networks. We describe current shortcomings, enhancements, and implementations. The review also covers different types of deep architectures, such as deep convolutional networks, deep residual networks, recurrent neural networks, reinforcement learning, variational autoencoders, and others.
356 citations
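For orientation, the training algorithms such a review covers are variations on two standard update rules; the textbook forms below are reproduced for context, not taken from the paper. With parameters $\theta$, learning rate $\eta$, and gradient $g_t = \nabla_\theta \mathcal{L}(\theta_t)$:

$$ \theta_{t+1} = \theta_t - \eta\, g_t \qquad \text{(SGD)} $$

$$ m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2, $$

$$ \theta_{t+1} = \theta_t - \eta\, \frac{m_t/(1-\beta_1^t)}{\sqrt{v_t/(1-\beta_2^t)} + \epsilon} \qquad \text{(Adam)} $$

Adam's per-parameter scaling by the second-moment estimate $v_t$ is representative of the accuracy- and speed-oriented optimizer refinements the review surveys.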
[...]
TL;DR: This paper reviews the literature, tabulates, and summarizes the emerging blockchain applications, platforms, and protocols specifically targeting the AI area, and identifies and discusses open research challenges of utilizing blockchain technologies for AI.
Abstract: Recently, artificial intelligence (AI) and blockchain have become two of the most trending and disruptive technologies. Blockchain technology has the ability to automate payment in cryptocurrency and to provide access to a shared ledger of data, transactions, and logs in a decentralized, secure, and trusted manner. With smart contracts, blockchain also has the ability to govern interactions among participants with no intermediary or trusted third party. AI, on the other hand, offers intelligence and decision-making capabilities for machines similar to those of humans. In this paper, we present a detailed survey on blockchain applications for AI. We review the literature, tabulate, and summarize the emerging blockchain applications, platforms, and protocols specifically targeting the AI area. We also identify and discuss open research challenges of utilizing blockchain technologies for AI.
315 citations
[...]
TL;DR: The experiment results show that the proposed ICMPACO algorithm can effectively obtain the best optimization value in solving the TSP, effectively solves the gate assignment problem, obtains better assignment results, and exhibits better optimization ability and stability.
Abstract: In this paper, an improved ant colony optimization (ICMPACO) algorithm based on a multi-population strategy, a co-evolution mechanism, a pheromone updating strategy, and a pheromone diffusion mechanism is proposed to balance convergence speed and solution diversity and to improve optimization performance on large-scale optimization problems. In the proposed ICMPACO algorithm, the optimization problem is divided into several sub-problems, and the ants in the population are divided into elite ants and common ants in order to improve the convergence rate and avoid falling into local optima. The pheromone updating strategy is used to improve optimization ability. The pheromone diffusion mechanism is used so that the pheromone released by an ant at a given point gradually affects a certain range of adjacent regions. The co-evolution mechanism is used to interchange information among different sub-populations in order to implement information sharing. To verify the optimization performance of the ICMPACO algorithm, the traveling salesman problem (TSP) and an actual gate assignment problem are selected here. The experiment results show that the proposed ICMPACO algorithm can effectively obtain the best optimization value in solving the TSP, effectively solves the gate assignment problem, obtains better assignment results, and exhibits better optimization ability and stability.
310 citations
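A compact baseline ant-colony TSP solver makes the pheromone mechanics concrete. ICMPACO's contributions (the multi-population split, elite/common ants, co-evolution, and pheromone diffusion) are extensions layered on top of this basic loop; all constants and the random instance below are illustrative, not the paper's settings.

```python
# Baseline ACO for a random TSP instance; a sketch, not the ICMPACO algorithm itself.
import numpy as np

rng = np.random.default_rng(0)
n = 20
coords = rng.random((n, 2))
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1) + np.eye(n)  # eye avoids /0
tau = np.ones((n, n))                        # pheromone trails
alpha, beta, rho, Q = 1.0, 3.0, 0.1, 1.0     # pheromone/heuristic weight, evaporation, deposit

def build_tour():
    tour = [int(rng.integers(n))]
    while len(tour) < n:
        i = tour[-1]
        mask = np.ones(n, bool)
        mask[tour] = False                   # exclude visited cities
        w = (tau[i] ** alpha) * ((1.0 / dist[i]) ** beta) * mask
        tour.append(int(rng.choice(n, p=w / w.sum())))
    return tour

best_len, best_tour = np.inf, None
for it in range(200):
    tours = [build_tour() for _ in range(25)]
    lengths = [sum(dist[t[k], t[(k + 1) % n]] for k in range(n)) for t in tours]
    tau *= (1 - rho)                         # evaporation
    for t, L in zip(tours, lengths):         # deposit proportional to tour quality
        for k in range(n):
            tau[t[k], t[(k + 1) % n]] += Q / L
        if L < best_len:
            best_len, best_tour = L, t
print(f"best tour length: {best_len:.3f}")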
[...]
TL;DR: A thorough examination is provided of the different studies of deep learning for speech applications that have been conducted since 2006, when deep learning first arose as a new area of machine learning.
Abstract: Over the past decades, a tremendous amount of research has been done on the use of machine learning for speech processing applications, especially speech recognition. However, in the past few years, research has focused on utilizing deep learning for speech-related applications. This new area of machine learning has yielded far better results when compared to others in a variety of applications including speech, and has thus become a very attractive area of research. This paper provides a thorough examination of the different studies of deep learning for speech applications that have been conducted since 2006, when deep learning first arose as a new area of machine learning. A thorough statistical analysis is provided in this review, conducted by extracting specific information from 174 papers published between 2006 and 2018. The results provided in this paper shed light on the trends of research in this area as well as bring focus to new research topics.
304 citations
[...]
TL;DR: This survey provides a comprehensive overview of a variety of object detection methods in a systematic manner, covering the one-stage and two-stage detectors, and lists the traditional and new applications.
Abstract: Object detection is one of the most important and challenging branches of computer vision. It has been widely applied in daily life, for purposes such as security monitoring and autonomous driving, with the goal of locating instances of semantic objects of a certain class. With the rapid development of deep learning algorithms for detection tasks, the performance of object detectors has greatly improved. To understand the main development status of the object detection pipeline thoroughly and deeply, in this survey we first analyze the methods of existing typical detection models and describe the benchmark datasets. Afterwards, and primarily, we provide a comprehensive overview of a variety of object detection methods in a systematic manner, covering one-stage and two-stage detectors. Moreover, we list traditional and new applications. Some representative branches of object detection are analyzed as well. Finally, we discuss the architecture of exploiting these object detection methods to build an effective and efficient system and point out a set of development trends to better follow the state-of-the-art algorithms and further research.
294 citations
[...]
TL;DR: The Rel-16 features and an outlook towards Rel-17 and beyond are discussed, and new features to further expand the applicability of the 5G System to new markets and use cases are introduced.
Abstract: The 5G System is being developed and enhanced to provide unparalleled connectivity to connect everyone and everything, everywhere. The first version of the 5G System, based on the Release 15 (“Rel-15”) version of the specifications developed by 3GPP, comprising the 5G Core (5GC) and 5G New Radio (NR) with 5G User Equipment (UE), is currently being deployed commercially throughout the world both at sub-6 GHz and at mmWave frequencies. Concurrently, the second phase of 5G is being standardized by 3GPP in the Release 16 (“Rel-16”) version of the specifications which will be completed by March 2020. While the main focus of Rel-15 was on enhanced mobile broadband services, the focus of Rel-16 is on new features for URLLC (Ultra-Reliable Low Latency Communication) and Industrial IoT, including Time Sensitive Communication (TSC), enhanced Location Services, and support for Non-Public Networks (NPNs). In addition, some crucial new features, such as NR on unlicensed bands (NR-U), Integrated Access & Backhaul (IAB) and NR Vehicle-to-X (V2X), are also being introduced as part of Rel-16, as well as enhancements for massive MIMO, wireless and wireline convergence, the Service Based Architecture (SBA) and Network Slicing. Finally, the number of use cases, types of connectivity and users, and applications running on top of 5G networks, are all expected to increase dramatically, thus motivating additional security features to counter security threats which are expected to increase in number, scale and variety. In this paper, we discuss the Rel-16 features and provide an outlook towards Rel-17 and beyond, covering both new features and enhancements of existing features. 5G Evolution will focus on three main areas: enhancements to features introduced in Rel-15 and Rel-16, features that are needed for operational enhancements, and new features to further expand the applicability of the 5G System to new markets and use cases.
[...]
TL;DR: This paper proposes a novel method that aims at finding significant features by applying machine learning techniques, resulting in improved accuracy in the prediction of cardiovascular disease with the hybrid random forest with a linear model (HRFLM).
Abstract: Heart disease is one of the most significant causes of mortality in the world today. Prediction of cardiovascular disease is a critical challenge in the area of clinical data analysis. Machine learning (ML) has been shown to be effective in assisting in making decisions and predictions from the large quantity of data produced by the healthcare industry. We have also seen ML techniques being used in recent developments in different areas of the Internet of Things (IoT). Various studies give only a glimpse into predicting heart disease with ML techniques. In this paper, we propose a novel method that aims at finding significant features by applying machine learning techniques, resulting in improved accuracy in the prediction of cardiovascular disease. The prediction model is introduced with different combinations of features and several known classification techniques. The prediction model for heart disease using the hybrid random forest with a linear model (HRFLM) produces an enhanced performance level, with an accuracy of 88.7%.
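The abstract names HRFLM (hybrid random forest with a linear model) but not its exact fusion rule, so the sketch below is one plausible reading only: a soft-voting ensemble of a random forest and a logistic-regression (linear) model over Cleveland-style heart-disease features. The file name, column name, and all hyperparameters are placeholders, not the authors' configuration.

```python
# Hedged sketch of a random-forest + linear-model hybrid for heart-disease prediction.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("heart.csv")                # placeholder: UCI/Cleveland-style dataset
X, y = df.drop(columns="target"), df["target"]

hybrid = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=300, random_state=0)),
        ("lm", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
    ],
    voting="soft",                           # average the two models' class probabilities
)
print("CV accuracy:", cross_val_score(hybrid, X, y, cv=5).mean())
```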
[...]
TL;DR: This paper considers a downlink multiple-input single-output (MISO) broadcast system, where the base station transmits independent data streams to multiple legitimate receivers and keeps them secret from multiple eavesdroppers and proposes an efficient algorithm based on the alternating optimization and the path-following algorithm to solve it in an iterative manner.
Abstract: In this paper, we introduce an intelligent reflecting surface (IRS) to provide a programmable wireless environment for physical layer security. By adjusting the reflecting coefficients, the IRS can change the attenuation and scattering of the incident electromagnetic wave so that it can propagate in the desired way toward the intended receiver. Specifically, we consider a downlink multiple-input single-output (MISO) broadcast system, where the base station (BS) transmits independent data streams to multiple legitimate receivers and keeps them secret from multiple eavesdroppers. By jointly optimizing the beamformers at the BS and reflecting coefficients at the IRS, we formulate a minimum-secrecy-rate maximization problem under various practical constraints on the reflecting coefficients. The constraints capture the scenarios of both continuous and discrete reflecting coefficients of the reflecting elements. Due to the non-convexity of the formulated problem, we propose an efficient algorithm based on the alternating optimization and the path-following algorithm to solve it in an iterative manner. Besides, we show that the proposed algorithm can converge to a local (global) optimum. Furthermore, we develop two suboptimal algorithms with some forms of closed-form solutions to reduce computational complexity. Finally, the simulation results validate the advantages of the introduced IRS and the effectiveness of the proposed algorithms.
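In common notation, and hedged as a generic formulation consistent with the abstract rather than the paper's exact problem statement: with BS beamformers $\mathbf{w}_k$, IRS reflection matrix $\boldsymbol{\Theta} = \mathrm{diag}(\theta_1,\dots,\theta_M)$, legitimate rate $R_k$, and eavesdropper rate $R_{k,e}$, the minimum-secrecy-rate maximization reads

$$ \max_{\{\mathbf{w}_k\},\,\boldsymbol{\Theta}} \ \min_k \Big[ R_k(\{\mathbf{w}_k\},\boldsymbol{\Theta}) - \max_e R_{k,e}(\{\mathbf{w}_k\},\boldsymbol{\Theta}) \Big]^+ \quad \text{s.t.} \quad \sum_k \|\mathbf{w}_k\|^2 \le P_{\max}, \ \ \theta_m \in \mathcal{F}, $$

where $[x]^+ = \max(x,0)$ and $\mathcal{F}$ is either the continuous unit circle $\{e^{j\phi} : \phi \in [0,2\pi)\}$ or a discrete phase set, matching the two constraint scenarios in the abstract. The coupling of $\mathbf{w}_k$ and $\boldsymbol{\Theta}$ inside both rate terms is what makes the problem non-convex and motivates the alternating optimization the authors adopt.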
[...]
TL;DR: In this article, the IEEE 802.11bd and NR V2X standardization efforts for vehicular RATs are surveyed and compared with their respective predecessors, and the authors highlight the predecessors' inability to guarantee the quality of service requirements of many advanced vehicular applications.
Abstract: With the rising interest in autonomous vehicles, developing radio access technologies (RATs) that enable reliable and low-latency vehicular communications has become of paramount importance. Dedicated short-range communications (DSRC) and cellular V2X (C-V2X) are two present-day technologies that are capable of supporting day-1 vehicular applications. However, these RATs fall short of supporting the communication requirements of many advanced vehicular applications, which are believed to be critical in enabling fully autonomous vehicles. Both DSRC and C-V2X are undergoing extensive enhancements in order to support advanced vehicular applications characterized by high reliability, low latency, and high throughput requirements. These RAT evolutions, the IEEE 802.11bd for DSRC and NR V2X for C-V2X, can supplement today's vehicular sensors in enabling autonomous driving. In this paper, we survey the latest developments in the standardization of 802.11bd and NR V2X. We begin with a brief description of the two present-day vehicular RATs. In doing so, we highlight their inability to guarantee the quality of service requirements of many advanced vehicular applications. We then look at the two RAT evolutions, i.e., the IEEE 802.11bd and NR V2X, outline their objectives, describe their salient features, and provide an in-depth description of key mechanisms that enable these features. While both the IEEE 802.11bd and NR V2X are in their initial stages of development, we shed light on their preliminary performance projections and compare and contrast the two evolutionary RATs with their respective predecessors.
[...]
TL;DR: The basic theory of GANs and the differences among different generative models in recent years are analyzed and summarized, and the derived models of GANs are classified and introduced one by one.
Abstract: Generative adversarial networks (GANs) are one of the most important research avenues in the field of artificial intelligence, and their outstanding data generation capacity has received wide attention. In this paper, we present the recent progress on GANs. First, the basic theory of GANs and the differences among different generative models in recent years are analyzed and summarized. Second, the derived models of GANs are classified and introduced one by one. Third, the training tricks and evaluation metrics are given. Fourth, the applications of GANs are introduced. Finally, the problems that remain to be addressed and future directions are discussed.
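The "basic theory" such a survey starts from is the original two-player minimax objective, reproduced here in its standard form for orientation: a generator $G$ maps noise $z \sim p_z$ to samples while a discriminator $D$ tries to separate real data from generated data,

$$ \min_G \max_D \ V(D, G) = \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]. $$

The derived GAN models the survey classifies largely differ in how they modify this objective (e.g., alternative divergences or losses) or the architectures of $G$ and $D$.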
[...]
TL;DR: It is shown that the outsourced training introduces new security risks: an adversary can create a maliciously trained network (a backdoored neural network, or a BadNet) that has the state-of-the-art performance on the user's training and validation samples but behaves badly on specific attacker-chosen inputs.
Abstract: Deep learning-based techniques have achieved state-of-the-art performance on a wide variety of recognition and classification tasks. However, these networks are typically computationally expensive to train, requiring weeks of computation on many GPUs; as a result, many users outsource the training procedure to the cloud or rely on pre-trained models that are then fine-tuned for a specific task. In this paper, we show that this outsourced training introduces new security risks: an adversary can create a maliciously trained network (a backdoored neural network, or a BadNet) that has state-of-the-art performance on the user's training and validation samples but behaves badly on specific attacker-chosen inputs. We first explore the properties of BadNets in a toy example, by creating a backdoored handwritten digit classifier. Next, we demonstrate backdoors in a more realistic scenario by creating a U.S. street sign classifier that identifies stop signs as speed limits when a special sticker is added to the stop sign; we then show in addition that the backdoor in our U.S. street sign detector can persist even if the network is later retrained for another task, causing a drop in accuracy of 25% on average when the backdoor trigger is present. These results demonstrate that backdoors in neural networks are both powerful and, because the behavior of neural networks is difficult to explicate, stealthy. This paper provides motivation for further research into techniques for verifying and inspecting neural networks, just as we have developed tools for verifying and debugging software.
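The attack's ingredients are compact enough to sketch. The snippet below shows BadNet-style training-set poisoning in the abstract's spirit: a trigger patch is stamped onto a small fraction of images, whose labels are flipped to the attacker's target class. The poisoning rate, patch geometry, and random data are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch of backdoor training-set poisoning (BadNet-style).
import numpy as np

def poison(images, labels, target_class, rate=0.1, seed=0):
    """images: (N, H, W) floats in [0, 1]; labels: (N,) ints."""
    rng = np.random.default_rng(seed)
    imgs, labs = images.copy(), labels.copy()
    idx = rng.choice(len(imgs), size=int(rate * len(imgs)), replace=False)
    imgs[idx, -4:, -4:] = 1.0        # 4x4 white corner square = trigger (cf. the sticker)
    labs[idx] = target_class         # attacker-chosen label, e.g. "speed limit"
    return imgs, labs

# Usage: train any classifier on (x_p, y_p); it behaves normally on clean inputs
# but predicts `target_class` whenever the trigger patch is present.
x = np.random.default_rng(1).random((1000, 28, 28))        # stand-in images
y = np.random.default_rng(2).integers(0, 10, size=1000)    # stand-in labels
x_p, y_p = poison(x, y, target_class=7)
```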
[...]
TL;DR: The review reveals that several opportunities are available for utilizing blockchain in various industrial sectors; however, there are still some challenges to be addressed to achieve better utilization of this technology.
Abstract: Blockchain technologies have recently come to the forefront of the research and industrial communities as they bring potential benefits for many industries. This is due to their practical capabilities in solving many issues currently inhibiting further advances in various industrial domains. Securely recording and sharing transactional data, establishing automated and efficient supply chain processes, and enhancing transparency across the whole value chain are some examples of these issues. Blockchain offers an effective way to tackle these issues using distributed, shared, secure, and permissioned transactional ledgers. The employment of blockchain technologies and the possibility of applying them in different situations enables many industrial applications through increased efficiency and security; enhanced traceability and transparency; and reduced costs. In this paper, different industrial application domains where the use of blockchain technologies has been proposed are reviewed. This paper explores the opportunities, benefits, and challenges of incorporating blockchain in different industrial applications. Furthermore, the paper attempts to identify the requirements that support the implementation of blockchain for different industrial applications. The review reveals that several opportunities are available for utilizing blockchain in various industrial sectors; however, there are still some challenges to be addressed to achieve better utilization of this technology.
[...]
TL;DR: The potential of wireless sensors and IoT in agriculture, as well as the challenges expected to be faced when integrating this technology with the traditional farming practices are highlighted.
Abstract: Despite the perception people may have regarding the agricultural process, the reality is that today's agriculture industry is data-centered, precise, and smarter than ever. The rapid emergence of Internet-of-Things (IoT) based technologies has redesigned almost every industry, including "smart agriculture", moving the industry from statistical to quantitative approaches. Such revolutionary changes are shaking existing agriculture methods and creating new opportunities along with a range of challenges. This article highlights the potential of wireless sensors and the IoT in agriculture, as well as the challenges expected to be faced when integrating this technology with traditional farming practices. IoT devices and communication techniques associated with wireless sensors encountered in agriculture applications are analyzed in detail. The sensors available for specific agriculture applications, such as soil preparation, crop status, irrigation, and insect and pest detection, are listed. How this technology helps growers throughout the crop stages, from sowing through harvesting, packing, and transportation, is explained. Furthermore, the use of unmanned aerial vehicles for crop surveillance and other favorable applications, such as optimizing crop yield, is considered in this article. State-of-the-art IoT-based architectures and platforms used in agriculture are also highlighted wherever suitable. Finally, based on this thorough review, we identify current and future trends of the IoT in agriculture and highlight potential research challenges.
[...]
TL;DR: A novel QC-assisted and QML-based framework for 6G communication networks is proposed while articulating its challenges and potential enabling technologies at the network infrastructure, network edge, air interface, and user end.
Abstract: The upcoming fifth generation (5G) of wireless networks is expected to lay a foundation of intelligent networks with the provision of some isolated artificial intelligence (AI) operations. However, fully intelligent network orchestration and management for providing innovative services will only be realized in Beyond 5G (B5G) networks. To this end, we envisage that the sixth generation (6G) of wireless networks will be driven by on-demand self-reconfiguration to ensure a many-fold increase in the network performance and service types. The increasingly stringent performance requirements of emerging networks may finally trigger the deployment of some interesting new technologies, such as large intelligent surfaces, electromagnetic–orbital angular momentum, visible light communications, and cell-free communications, to name a few. Our vision for 6G is a massively connected complex network capable of rapidly responding to the users’ service calls through real-time learning of the network state as described by the network edge (e.g., base-station locations and cache contents), air interface (e.g., radio spectrum and propagation channel), and the user-side (e.g., battery-life and locations). The multi-state, multi-dimensional nature of the network state, requiring the real-time knowledge, can be viewed as a quantum uncertainty problem. In this regard, the emerging paradigms of machine learning (ML), quantum computing (QC), and quantum ML (QML) and their synergies with communication networks can be considered as core 6G enablers. Considering these potentials, starting with the 5G target services and enabling technologies, we provide a comprehensive review of the related state of the art in the domains of ML (including deep learning), QC, and QML and identify their potential benefits, issues, and use cases for their applications in the B5G networks. Subsequently, we propose a novel QC-assisted and QML-based framework for 6G communication networks while articulating its challenges and potential enabling technologies at the network infrastructure, network edge, air interface, and user end. Finally, some promising future research directions for the quantum- and QML-assisted B5G networks are identified and discussed.
[...]
TL;DR: This work proposes an improved authentication protocol for IoV that performs better in terms of security and performance and provides a formal proof to the proposed protocol to demonstrate that the protocol is indeed secure.
Abstract: The Internet of Vehicles (IoV) allows vehicles on roads to form a self-organized network and broadcast messages. However, as the data are transmitted in an insecure network, it is essential to use an authentication mechanism to protect the privacy of vehicle users. Recently, Ying et al. proposed an authentication protocol for the IoV and claimed that the protocol could resist various attacks. Unfortunately, we discovered that their protocol suffers from an offline identity guessing attack, a location spoofing attack, and a replay attack, and consumes a considerable amount of time for authentication. To resolve these shortcomings, we propose an improved protocol. In addition, we provide a formal proof for the proposed protocol to demonstrate that it is indeed secure. Compared with previous methods, the proposed protocol performs better in terms of both security and performance.
[...]
TL;DR: A comprehensive survey on the IoT-aided smart grid systems is presented in this article, which includes the existing architectures, applications, and prototypes of the IoTaided SG systems.
Abstract: Traditional power grids are being transformed into smart grids (SGs) to address the issues in the existing power system caused by uni-directional information flow, energy wastage, growing energy demand, reliability, and security. SGs offer bi-directional energy flow between service providers and consumers, involving power generation, transmission, distribution, and utilization systems. SGs employ various devices for the monitoring, analysis, and control of the grid, deployed in very large numbers at power plants, distribution centers, and consumers' premises. Hence, an SG requires connectivity, automation, and the tracking of such devices. This is achieved with the help of the Internet of Things (IoT). The IoT helps SG systems to support various network functions throughout the generation, transmission, distribution, and consumption of energy by incorporating IoT devices (such as sensors, actuators, and smart meters), as well as by providing the connectivity, automation, and tracking for such devices. In this paper, we provide a comprehensive survey on IoT-aided SG systems, which includes the existing architectures, applications, and prototypes of such systems. This survey also highlights the open issues, challenges, and future research directions for IoT-aided SG systems.
[...]
TL;DR: This paper proposes an approach that leverages the Ethereum blockchain and smart contracts to efficiently perform business transactions for soybean tracking and traceability across the agricultural supply chain, eliminating the need for a trusted centralized authority and intermediaries while providing transaction records, thereby enhancing efficiency and safety with high integrity, reliability, and security.
Abstract: The globalized production and distribution of agricultural products bring a renewed focus on safety, quality, and the validation of several important criteria in agriculture and food supply chains. The growing number of issues related to food safety and contamination risks has established an immense need for an effective traceability solution that acts as an essential quality management tool ensuring adequate safety of products in the agricultural supply chain. Blockchain is a disruptive technology that can provide an innovative solution for product traceability in agriculture and food supply chains. Today's agricultural supply chains are complex ecosystems involving several stakeholders, making it cumbersome to validate several important criteria such as country of origin, stages in crop development, conformance to quality standards, and yield monitoring. In this paper, we propose an approach that leverages the Ethereum blockchain and smart contracts to efficiently perform business transactions for soybean tracking and traceability across the agricultural supply chain. Our proposed solution eliminates the need for a trusted centralized authority and intermediaries, and provides transaction records, enhancing efficiency and safety with high integrity, reliability, and security. The proposed solution focuses on the utilization of smart contracts to govern and control all interactions and transactions among all the participants involved within the supply chain ecosystem. All transactions are recorded and stored in the blockchain's immutable ledger with links to a decentralized file system (IPFS), thus providing to all participants a high level of transparency and traceability into the supply chain ecosystem in a secure, trusted, reliable, and efficient manner.
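The paper's implementation is an Ethereum smart contract; the Python sketch below only mirrors the contract's core bookkeeping to make the mechanism concrete: role-gated stage transitions appended to a hash-chained, append-only log, with optional IPFS content identifiers attached to each event. The stage names, roles, and identifiers are illustrative assumptions, not the paper's schema, and the hash chain merely stands in for the immutability the real blockchain provides.

```python
# Hedged sketch of role-gated, append-only supply-chain traceability logic.
import hashlib, json, time

STAGES = ["planted", "harvested", "processed", "distributed", "retailed"]

class SoybeanBatch:
    def __init__(self, batch_id, farmer):
        self.batch_id, self.stage = batch_id, 0
        self.roles = {"planted": farmer}      # which actor may trigger each stage
        self.log = []
        self._record(farmer, "planted", ipfs_cid=None)

    def authorize(self, stage, actor):        # contract owner assigns participants
        self.roles[stage] = actor

    def advance(self, actor, ipfs_cid=None):
        next_stage = STAGES[self.stage + 1]
        if self.roles.get(next_stage) != actor:   # access control, like a contract modifier
            raise PermissionError(f"{actor} may not set {next_stage}")
        self.stage += 1
        self._record(actor, next_stage, ipfs_cid)

    def _record(self, actor, stage, ipfs_cid):
        prev = self.log[-1]["hash"] if self.log else "0" * 64
        entry = {"actor": actor, "stage": stage, "cid": ipfs_cid,
                 "ts": time.time(), "prev": prev}
        entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.log.append(entry)                # hash chain stands in for ledger immutability

batch = SoybeanBatch("LOT-42", farmer="0xFarmer")          # hypothetical identifiers
batch.authorize("harvested", "0xFarmer")
batch.authorize("processed", "0xProcessor")
batch.advance("0xFarmer", ipfs_cid="Qm...")                # placeholder IPFS CID
batch.advance("0xProcessor")
```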
[...]
TL;DR: The techniques investigated in this paper represent the recent trends in the FPGA-based accelerators of deep learning networks and are expected to direct the future advances on efficient hardware accelerators and to be useful for deep learning researchers.
Abstract: Due to recent advances in digital technologies and the availability of credible data, an area of artificial intelligence, deep learning, has emerged and has demonstrated its ability and effectiveness in solving complex learning problems not possible before. In particular, convolutional neural networks (CNNs) have demonstrated their effectiveness in image detection and recognition applications. However, they require intensive CPU operations and memory bandwidth that make general CPUs fail to achieve the desired performance levels. Consequently, hardware accelerators that use application-specific integrated circuits, field-programmable gate arrays (FPGAs), and graphic processing units have been employed to improve the throughput of CNNs. More precisely, FPGAs have recently been adopted for accelerating the implementation of deep learning networks due to their ability to maximize parallelism and their energy efficiency. In this paper, we review recent techniques for accelerating deep learning networks on FPGAs. We highlight the key features employed by the various techniques for improving acceleration performance. In addition, we provide recommendations for enhancing the utilization of FPGAs for CNN acceleration. The techniques investigated in this paper represent the recent trends in FPGA-based accelerators for deep learning networks. Thus, this paper is expected to direct future advances in efficient hardware accelerators and to be useful for deep learning researchers.
[...]
TL;DR: Inspired by the U-net architecture and its variants successfully applied to various medical image segmentation tasks, this paper proposes NAS-Unet, which stacks the same number of DownSC and UpSC cells on a U-like backbone network.
Abstract: Neural architecture search (NAS) has made significant progress in improving the accuracy of image classification. Recently, some works have attempted to extend NAS to image segmentation, showing preliminary feasibility. However, all of them focus on searching architectures for semantic segmentation in natural scenes. In this paper, we design three types of primitive operation sets on the search space to automatically find two cell architectures, DownSC and UpSC, for semantic image segmentation, especially medical image segmentation. Inspired by the U-net architecture and its variants successfully applied to various medical image segmentation tasks, we propose NAS-Unet, which stacks the same number of DownSC and UpSC cells on a U-like backbone network. The architectures of DownSC and UpSC are updated simultaneously by a differential architecture strategy during the search stage. We demonstrate the good segmentation results of the proposed method on the Promise12, Chaos, and ultrasound nerve datasets, which were collected by magnetic resonance imaging, computed tomography, and ultrasound, respectively. Without any pretraining, our architecture searched on PASCAL VOC2012 attains better performance with far fewer parameters (about 0.8M) than U-net and one of its variants when evaluated on the above three types of medical image datasets.
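A sketch of the U-like macro-architecture the abstract describes may help: equal numbers of down cells and up cells around a bottleneck, joined by skip connections. In NAS-Unet the internal wiring of DownSC/UpSC is what gets searched; here each cell is stubbed as a plain convolution block and the channel widths are illustrative, so this shows only the backbone, not the searched cells.

```python
# Hedged PyTorch sketch of a U-like backbone with stand-in cells.
import torch
import torch.nn as nn

def cell(c_in, c_out):  # stand-in for a searched DownSC/UpSC cell
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1),
                         nn.BatchNorm2d(c_out), nn.ReLU(inplace=True))

class ULikeNet(nn.Module):
    def __init__(self, in_ch=1, n_classes=2, widths=(32, 64, 128)):
        super().__init__()
        self.stem = cell(in_ch, widths[0])
        self.down = nn.ModuleList(cell(widths[i], widths[i + 1])
                                  for i in range(len(widths) - 1))
        self.pool = nn.MaxPool2d(2)
        self.ups = nn.ModuleList(nn.ConvTranspose2d(widths[i + 1], widths[i], 2, stride=2)
                                 for i in reversed(range(len(widths) - 1)))
        self.up = nn.ModuleList(cell(2 * widths[i], widths[i])
                                for i in reversed(range(len(widths) - 1)))
        self.head = nn.Conv2d(widths[0], n_classes, 1)

    def forward(self, x):
        skips, h = [], self.stem(x)
        for d in self.down:                    # encoder path: save skips, downsample
            skips.append(h)
            h = d(self.pool(h))
        for upsample, u, s in zip(self.ups, self.up, reversed(skips)):
            h = u(torch.cat([upsample(h), s], dim=1))  # decoder path with skip connection
        return self.head(h)

print(ULikeNet()(torch.randn(1, 1, 64, 64)).shape)  # -> torch.Size([1, 2, 64, 64])
```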
[...]
TL;DR: A binary version of the hybrid grey wolf optimization (GWO) and particle swarm optimization (PSO) is proposed to solve feature selection problems in this paper and significantly outperformed the binary GWO (BGWO), the binary PSO, the binary genetic algorithm, and the whale optimization algorithm with simulated annealing when using several performance measures.
Abstract: A binary version of the hybrid grey wolf optimization (GWO) and particle swarm optimization (PSO) is proposed to solve feature selection problems in this paper. The original PSOGWO is a new hybrid optimization algorithm that benefits from the strengths of both GWO and PSO. Despite its superior performance, the original hybrid approach is appropriate for problems with a continuous search space. Feature selection, however, is a binary problem. Therefore, a binary version of the hybrid PSOGWO, called BGWOPSO, is proposed to find the best feature subset. To find the best solutions, a wrapper-based method using the K-nearest neighbors classifier with the Euclidean distance metric is utilized. For performance evaluation of the proposed binary algorithm, 18 standard benchmark datasets from the UCI repository are employed. The results show that BGWOPSO significantly outperforms the binary GWO (BGWO), the binary PSO, the binary genetic algorithm, and the whale optimization algorithm with simulated annealing on several performance measures, including accuracy, selecting the best optimal features, and computational time.
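To make the wrapper setup concrete: the sketch below implements the abstract's KNN-wrapper fitness over binary feature masks, driven by a plain binary PSO with a sigmoid transfer function. The actual BGWOPSO update rules, which blend GWO's leader hierarchy into PSO's velocity update, are in the paper; the dataset and constants here are illustrative stand-ins.

```python
# Hedged sketch: KNN-wrapper feature selection via a simple binary PSO.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)     # stand-in for the 18 UCI datasets
rng = np.random.default_rng(0)
n_particles, dim, w, c1, c2 = 20, X.shape[1], 0.7, 1.5, 1.5

def fitness(mask):                              # wrapper fitness: KNN CV accuracy
    if not mask.any():
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=5)   # Euclidean distance by default
    return cross_val_score(knn, X[:, mask.astype(bool)], y, cv=3).mean()

pos = rng.integers(0, 2, (n_particles, dim)).astype(float)
vel = rng.normal(0, 1, (n_particles, dim))
pbest, pfit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pfit.argmax()].copy()

for it in range(30):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    prob = 1.0 / (1.0 + np.exp(-vel))           # sigmoid transfer: velocity -> bit probability
    pos = (rng.random((n_particles, dim)) < prob).astype(float)
    fit = np.array([fitness(p) for p in pos])
    better = fit > pfit
    pbest[better], pfit[better] = pos[better], fit[better]
    gbest = pbest[pfit.argmax()].copy()

print(f"selected {int(gbest.sum())}/{dim} features, CV accuracy {pfit.max():.3f}")
```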
[...]
TL;DR: A comprehensive study on the application of big data and machine learning in the electrical power grid introduced through the emergence of the next-generation power system, the smart grid (SG), covering current limitations, viable solutions, and their effectiveness.
Abstract: This paper conducts a comprehensive study on the application of big data and machine learning in the electrical power grid introduced through the emergence of the next-generation power system, the smart grid (SG). Connectivity lies at the core of this new grid infrastructure and is provided by the Internet of Things (IoT). This connectivity, and the constant communication required in this system, also introduce a massive data volume that demands techniques far superior to conventional methods for proper analysis and decision-making. The IoT-integrated SG system can provide efficient load forecasting and data acquisition techniques along with cost-effectiveness. Big data analysis and machine learning techniques are essential to reaping these benefits. In the complex connected system of the SG, cyber security becomes a critical issue, with IoT devices and their data becoming major targets of attacks. Such security concerns and their solutions are also included in this paper. Key information obtained through the literature review is tabulated in the corresponding sections to provide a clear synopsis, and the findings of this rigorous review are listed to give a concise picture of this area of study and of promising future fields of academic and industrial research, along with current limitations, viable solutions, and their effectiveness.
[...]
TL;DR: The newly released IEEE Std C95.1™-2019 defines exposure criteria and associated limits for the protection of persons against established adverse health effects from exposures to electric, magnetic, and electromagnetic fields in the frequency range 0 Hz to 300 GHz.
Abstract: The newly released IEEE Std C95.1™-2019 defines exposure criteria and associated limits for the protection of persons against established adverse health effects from exposures to electric, magnetic, and electromagnetic fields in the frequency range 0 Hz to 300 GHz. The exposure limits apply to persons permitted in restricted environments and to the general public in unrestricted environments. These limits are not intended to apply to the exposure of patients by or under the direction of physicians and care professionals, to the exposure of informed volunteers in scientific research studies, or to the use of medical devices or implants. IEEE Std C95.1™-2019 can be obtained at no cost from the IEEE Get Program: https://ieeexplore.ieee.org/document/8859679.
[...]
TL;DR: Wang et al., as mentioned in this paper, investigated the underlying behavior of the timing channel from the perspective of memory activity records and summarized the signature of the timing channel in the underlying memory activities.
Abstract: Recently, the Infrastructure as a Service (IaaS) cloud (e.g., Amazon EC2) has been widely used by many organizations. However, some IaaS security issues create serious threats to its users. A typical issue is the timing channel. This kind of channel can serve as a cross-VM information channel, as proven by many researchers. Because it is covert and traceless, traditional identification methods cannot build an accurate analysis model and obtain only compromised results. We investigated the underlying behavior of the timing channel from the perspective of memory activity records and summarized the signature of the timing channel in the underlying memory activities. An identification method based on long-term behavior signatures was proposed. We also proposed a complete set of forensics steps, including evidence extraction, identification, record preservation, and evidence reporting. We studied four typical timing channels, and the experiments showed that these channels can be detected and investigated, even with disturbances from normal processes.
[...]
TL;DR: A comparative study of the tradeoffs of blockchain is presented, a comparison among different consensus mechanisms is provided, and challenges, including scalability, privacy, interoperability, energy consumption and regulatory issues are discussed.
Abstract: Blockchain is the underlying technology of a number of digital cryptocurrencies. Blockchain is a chain of blocks that store information with digital signatures in a decentralized and distributed network. The features of blockchain, including decentralization, immutability, transparency and auditability, make transactions more secure and tamper proof. Apart from cryptocurrency, blockchain technology can be used in financial and social services, risk management, healthcare facilities, and so on. A number of research studies focus on the opportunity that blockchain provides in various application domains. This paper presents a comparative study of the tradeoffs of blockchain and also explains the taxonomy and architecture of blockchain, provides a comparison among different consensus mechanisms and discusses challenges, including scalability, privacy, interoperability, energy consumption and regulatory issues. In addition, this paper also notes the future scope of blockchain technology.
[...]
TL;DR: This paper presents a detailed survey on wireless evolution towards 6G networks, characterized by ubiquitous 3D coverage, the introduction of pervasive AI, and an enhanced network protocol stack, along with related potential technologies that are helpful in forming sustainable and socially seamless networks.
Abstract: While 5G is being commercialized worldwide, research institutions around the world have started to look beyond 5G, and 6G is expected to evolve into green networks that deliver high Quality of Service and energy efficiency. To meet the demands of future applications, significant improvements need to be made in mobile network architecture. We envision 6G undergoing unprecedented breakthroughs and integrating traditional terrestrial mobile networks with emerging space, aerial, and underwater networks to provide anytime, anywhere network access. This paper presents a detailed survey on wireless evolution towards 6G networks. In this survey, the prime focus is on the new architectural changes associated with 6G networks, characterized by ubiquitous 3D coverage, the introduction of pervasive AI, and an enhanced network protocol stack. Along with this, we discuss related potential technologies that are helpful in forming sustainable and socially seamless networks, encompassing terahertz and visible light communication, new communication paradigms, blockchain, and symbiotic radio. Our work aims to provide enlightening guidance for subsequent research on green 6G.