SciSpace (formerly Typeset)
Author

Yanhua Zhang

Bio: Yanhua Zhang is an academic researcher from Beijing University of Technology. The author has contributed to research in topics: Resource allocation & Cognitive radio. The author has an h-index of 14 and has co-authored 90 publications receiving 1,016 citations.


Papers
Journal ArticleDOI
TL;DR: This survey investigates work done to enable an integrated blockchain and edge computing system and discusses the research challenges, identifying several vital aspects of the integration of blockchain and edge computing: motivations, frameworks, enabling functionalities, and challenges.
Abstract: Blockchain, as the underlying technology of crypto-currencies, has attracted significant attention. It has been adopted in numerous applications, such as smart grid and Internet-of-Things. However, blockchain faces a significant scalability barrier, which limits its ability to support services with frequent transactions. On the other hand, edge computing extends cloud resources and services to be distributed at the edge of the network, but currently faces challenges in decentralized management and security. Integrating blockchain and edge computing into one system can enable reliable access and control of the network, storage, and computation distributed at the edges, hence providing large-scale network servers, data storage, and validity computation near the end users in a secure manner. Despite the prospect of integrated blockchain and edge computing systems, scalability enhancement, self-organization, function integration, resource management, and new security issues remain to be addressed before widespread deployment. In this survey, we investigate some of the work that has been done to enable the integrated blockchain and edge computing system and discuss the research challenges. We identify several vital aspects of the integration of blockchain and edge computing: motivations, frameworks, enabling functionalities, and challenges. Finally, some broader perspectives are explored.

488 citations

Journal ArticleDOI
TL;DR: Simulation results are presented to show that the performance of cache-enabled opportunistic IA networks in terms of the network's sum rate and energy efficiency can be significantly improved by using the proposed approach.
Abstract: Both caching and interference alignment (IA) are promising techniques for next-generation wireless networks. Nevertheless, most existing works on cache-enabled IA wireless networks assume that the channel is invariant, which is unrealistic given the time-varying nature of practical wireless environments. In this paper, we consider realistic time-varying channels. Specifically, the channel is formulated as a finite-state Markov channel (FSMC). The complexity of the system is very high when realistic FSMC models are considered. Therefore, we propose a novel deep reinforcement learning approach, an advanced reinforcement learning algorithm that uses a deep $Q$ network to approximate the $Q$ value-action function. We implement deep reinforcement learning with Google TensorFlow to obtain the optimal IA user selection policy in cache-enabled opportunistic IA wireless networks. Simulation results show that the performance of cache-enabled opportunistic IA networks, in terms of the network's sum rate and energy efficiency, can be significantly improved by the proposed approach.
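The deep $Q$ network idea in this abstract can be sketched without TensorFlow: a small network maps a channel state to one Q-value per candidate action, and the agent acts greedily on those estimates. A minimal NumPy sketch follows; the dimensions, network size, and random features are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions (not from the paper): channel-state features and
# candidate IA user subsets treated as discrete actions.
STATE_DIM, N_ACTIONS, HIDDEN = 8, 4, 16

# A two-layer network approximating Q(s, a) for every action at once,
# standing in for the lookup table that tabular Q-learning would use.
W1 = rng.normal(scale=0.1, size=(STATE_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.1, size=(HIDDEN, N_ACTIONS))
b2 = np.zeros(N_ACTIONS)

def q_values(state):
    """Forward pass: state -> vector of Q-value estimates, one per action."""
    h = np.maximum(0.0, state @ W1 + b1)   # ReLU hidden layer
    return h @ W2 + b2

def greedy_action(state):
    """Select the IA user subset with the highest estimated Q-value."""
    return int(np.argmax(q_values(state)))

state = rng.normal(size=STATE_DIM)         # stand-in for an FSMC channel state
action = greedy_action(state)
```

In training, the weights would be fitted to Bellman targets from experience replay; only the untrained forward pass and greedy selection are shown here.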

272 citations

Journal ArticleDOI
TL;DR: In the proposed virtualization for DLT (vDLT), the underlying resources are abstracted; by providing a logical view of resources, vDLT can significantly improve performance, facilitate system evolution, and simplify DLT management and configuration.
Abstract: Recently, with the tremendous development of crypto-currencies, distributed ledger technology (DLT) (e.g., blockchain) has attracted significant attention. The traditional Internet was originally designed to handle the exchange of information; with DLT, we will have the Internet of value. Although DLT has great potential to create new foundations for our economic and social systems, existing DLT has a number of drawbacks (e.g., scalability) that prevent it from being used as a generic platform for distributed ledgers across the globe. In this paper, we present a novel virtualization approach to address the challenges in existing DLT systems. Specifically, in the proposed virtualization for DLT (vDLT), the underlying resources (e.g., hardware, compute, storage, network, and so on) are abstracted. By providing a logical view of resources, vDLT can significantly improve performance, facilitate system evolution, and simplify DLT management and configuration. Several use cases are presented to illustrate the effectiveness of the proposed vDLT.

124 citations

Journal ArticleDOI
TL;DR: A novel vehicle network architecture for the smart city scenario is proposed to mitigate network congestion through the joint optimization of networking, caching, and computing resources; the resource management scheme is formulated as a partially observable Markov decision process to minimize the system cost.
Abstract: With the explosion in the number of connected devices and Internet of Things (IoT) services in smart cities, the challenges of meeting the demands of both data traffic delivery and information processing are increasingly prominent. Meanwhile, connected vehicle networks have become an essential part of the smart city, bringing massive data traffic as well as significant communication, caching, and computing resources. As the two typical service types in the smart city, delay-tolerant and delay-sensitive traffic require very different quality of service (QoS)/quality of experience (QoE) and can be delivered through routes with different features to meet their QoS/QoE requirements at the lowest cost. In this paper, we propose a novel vehicle network architecture for the smart city scenario that mitigates network congestion with the joint optimization of networking, caching, and computing resources. Cloud computing at the data centers as well as mobile edge computing at the evolved node Bs and on-board units are taken as the paradigms to provide caching and computing resources. The programmable control principle, which originated from the software-defined networking paradigm, is introduced into this architecture to facilitate system optimization and resource integration. With careful modeling of the services, the vehicle mobility, and the system state, a joint resource management scheme is proposed and formulated as a partially observable Markov decision process to minimize the system cost, which consists of both network overhead and the execution time of computing tasks. Extensive simulation results with different system parameters reveal that the proposed scheme significantly improves system performance compared to existing schemes.
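The partially observable Markov decision process mentioned in this abstract rests on a belief state: a probability distribution over hidden system states, updated by Bayes' rule after each observation. A minimal sketch of that update, with toy transition and observation matrices that are assumptions rather than the paper's model:

```python
import numpy as np

# Toy POMDP pieces (assumed, not from the paper): 3 hidden system states,
# 2 possible observations, row-stochastic matrices for one fixed action.
T = np.array([[0.7, 0.2, 0.1],    # T[s, s'] = P(next state s' | state s)
              [0.1, 0.8, 0.1],
              [0.2, 0.2, 0.6]])
O = np.array([[0.9, 0.1],         # O[s', o] = P(observation o | state s')
              [0.5, 0.5],
              [0.2, 0.8]])

def belief_update(b, obs):
    """One Bayes step: predict through T, weight by the likelihood of obs,
    then renormalize so the belief stays a probability distribution."""
    predicted = b @ T
    unnorm = predicted * O[:, obs]
    return unnorm / unnorm.sum()

b = np.array([1.0, 0.0, 0.0])     # start certain of state 0
b = belief_update(b, obs=1)       # -> [0.28, 0.40, 0.32]
```

A POMDP policy then chooses the cost-minimizing action as a function of this belief vector rather than of the (unobservable) true state.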

83 citations

Journal ArticleDOI
TL;DR: This article introduces some promising technologies, such as edge computing and blockchain, and proposes a joint optimization framework for the caching, computation, and security of delay-tolerant data in M2M communications networks based on a dueling deep $Q$-network (DQN).
Abstract: Recently, the development of the Internet of Things (IoT) has provided plenty of opportunities and challenges in various fields. As an essential part of IoT, machine-to-machine (M2M) communications open a novel way for machine-type communication devices (MTCDs) to be connected and to communicate without any human intervention. Meanwhile, delay-tolerant data play an important role in M2M communications-based IoT, placing more emphasis on powerful data caching, computing, and processing, as well as on the security and stability of data transmission. To meet these requirements in M2M communications networks, in this article we introduce some promising technologies, such as edge computing and blockchain, and propose a joint optimization framework for the caching, computation, and security of delay-tolerant data in M2M communications networks based on a dueling deep $Q$-network (DQN). Through the dynamic decision process of the DQN, the optimal selection of caching servers, computing servers, and blockchain systems can be made to achieve maximum system rewards, which include higher efficiency of data processing, lower network costs, and better security of data interaction. Extensive simulation results with different system parameters show that the proposed framework can effectively improve system performance for blockchain-enabled M2M communications compared to existing schemes.
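The dueling DQN named in this abstract splits Q-value estimation into a state-value stream V(s) and an advantage stream A(s, a), recombined as Q(s, a) = V(s) + A(s, a) - mean_a A(s, a). A minimal sketch of that aggregation step; the numbers and action labels are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def dueling_q(value, advantages):
    """Combine the two dueling streams:
    Q(s, a) = V(s) + A(s, a) - mean_a A(s, a).
    Subtracting the mean advantage keeps V and A identifiable, so the
    network cannot shift value freely between the two streams."""
    advantages = np.asarray(advantages, dtype=float)
    return value + advantages - advantages.mean()

# Toy numbers (assumed): one state value and advantages for three decisions,
# e.g. "cache locally", "offload computation", "commit to blockchain".
q = dueling_q(value=2.0, advantages=[1.0, -1.0, 0.0])   # -> [3.0, 1.0, 2.0]
```

With this aggregation, the mean of Q over actions equals V(s), and the action ranking is set entirely by the advantage stream.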

69 citations


Cited by
Journal ArticleDOI
TL;DR: This paper presents a comprehensive literature review on applications of deep reinforcement learning (DRL) in communications and networking, and presents applications of DRL for traffic routing, resource sharing, and data collection.
Abstract: This paper presents a comprehensive literature review on applications of deep reinforcement learning (DRL) in communications and networking. Modern networks, e.g., Internet of Things (IoT) and unmanned aerial vehicle (UAV) networks, are becoming more decentralized and autonomous. In such networks, network entities need to make decisions locally to maximize network performance under uncertainty in the network environment. Reinforcement learning has been used efficiently to enable network entities to obtain the optimal policy, including, e.g., decisions or actions, given their states, when the state and action spaces are small. However, in complex and large-scale networks, the state and action spaces are usually large, and reinforcement learning may not be able to find the optimal policy in a reasonable time. Therefore, DRL, a combination of reinforcement learning with deep learning, has been developed to overcome these shortcomings. In this survey, we first give a tutorial on DRL from fundamental concepts to advanced models. Then, we review DRL approaches proposed to address emerging issues in communications and networking. These issues include dynamic network access, data rate control, wireless caching, data offloading, network security, and connectivity preservation, all of which are important to next-generation networks such as 5G and beyond. Furthermore, we present applications of DRL for traffic routing, resource sharing, and data collection. Finally, we highlight important challenges, open issues, and future research directions in applying DRL.
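The small-state-space baseline this survey starts from is tabular Q-learning, whose single update rule is Q(s, a) ← Q(s, a) + α(r + γ max_a' Q(s', a') − Q(s, a)); DRL replaces the table with a neural network when that table would be too large. A minimal sketch with illustrative sizes and rewards (assumptions, not values from the survey):

```python
import numpy as np

# Illustrative sizes and learning parameters (assumed).
N_STATES, N_ACTIONS = 4, 2
ALPHA, GAMMA = 0.5, 0.9

Q = np.zeros((N_STATES, N_ACTIONS))   # the lookup table DRL later replaces

def q_update(s, a, r, s_next):
    """One Bellman backup: move Q[s, a] a fraction ALPHA of the way toward
    the bootstrapped target r + GAMMA * max_a' Q[s_next, a']."""
    target = r + GAMMA * Q[s_next].max()
    Q[s, a] += ALPHA * (target - Q[s, a])

q_update(s=0, a=1, r=1.0, s_next=2)   # Q[0, 1]: 0 -> 0.5
```

With an empty table, the target is just the reward, so the entry moves halfway to 1.0; repeated interaction propagates value backward through the state space.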

1,153 citations

Journal ArticleDOI
TL;DR: This paper bridges the gap between deep learning and mobile and wireless networking research by presenting a comprehensive survey of the crossovers between the two areas, and provides an encyclopedic review of mobile and wireless networking research based on deep learning, categorized by domain.
Abstract: The rapid uptake of mobile devices and the rising popularity of mobile applications and services pose unprecedented demands on mobile and wireless networking infrastructure. Upcoming 5G systems are evolving to support exploding mobile traffic volumes, real-time extraction of fine-grained analytics, and agile management of network resources, so as to maximize user experience. Fulfilling these tasks is challenging, as mobile environments are increasingly complex, heterogeneous, and evolving. One potential solution is to resort to advanced machine learning techniques, in order to help manage the rise in data volumes and algorithm-driven applications. The recent success of deep learning underpins new and powerful tools that tackle problems in this space. In this paper, we bridge the gap between deep learning and mobile and wireless networking research, by presenting a comprehensive survey of the crossovers between the two areas. We first briefly introduce essential background and state-of-the-art in deep learning techniques with potential applications to networking. We then discuss several techniques and platforms that facilitate the efficient deployment of deep learning onto mobile systems. Subsequently, we provide an encyclopedic review of mobile and wireless networking research based on deep learning, which we categorize by different domains. Drawing from our experience, we discuss how to tailor deep learning to mobile environments. We complete this survey by pinpointing current challenges and open future directions for research.

975 citations

Journal ArticleDOI
TL;DR: 6G with additional technical requirements beyond those of 5G will enable faster and further communications to the extent that the boundary between physical and cyber worlds disappears.
Abstract: The fifth generation (5G) wireless communication networks have been deployed worldwide since 2020, and more capabilities are in the process of being standardized, such as mass connectivity, ultra-reliability, and guaranteed low latency. However, 5G will not meet all requirements of the future in 2030 and beyond, and sixth generation (6G) wireless communication networks are expected to provide global coverage, enhanced spectral/energy/cost efficiency, better intelligence levels and security, etc. To meet these requirements, 6G networks will rely on new enabling technologies, i.e., new air interface and transmission technologies and novel network architectures, such as waveform design, multiple access, channel coding schemes, multi-antenna technologies, network slicing, cell-free architecture, and cloud/fog/edge computing. Our vision of 6G is that it will involve four new paradigm shifts. First, to satisfy the requirement of global coverage, 6G will not be limited to terrestrial communication networks, which will need to be complemented with non-terrestrial networks such as satellite and unmanned aerial vehicle (UAV) communication networks, thus achieving a space-air-ground-sea integrated communication network. Second, all spectra will be fully explored to further increase data rates and connection density, including the sub-6 GHz, millimeter wave (mmWave), terahertz (THz), and optical frequency bands. Third, facing the big datasets generated by extremely heterogeneous networks, diverse communication scenarios, large numbers of antennas, wide bandwidths, and new service requirements, 6G networks will enable a new range of smart applications with the aid of artificial intelligence (AI) and big data technologies. Fourth, network security will have to be strengthened when developing 6G networks. This article provides a comprehensive survey of recent advances and future trends in these four aspects. Clearly, 6G, with technical requirements beyond those of 5G, will enable faster and farther communications to the extent that the boundary between the physical and cyber worlds disappears.

935 citations

Journal ArticleDOI
TL;DR: This paper is the first to present the state of the art of SAGIN, since existing survey papers focus on either a single network segment in space or air, or on space-ground integration, neglecting the integration of all three network segments.
Abstract: The space-air-ground integrated network (SAGIN), as an integration of satellite systems, aerial networks, and terrestrial communications, has become an emerging architecture and has attracted intensive research interest in recent years. Besides bringing significant benefits to various practical services and applications, SAGIN also faces many unprecedented challenges due to its specific characteristics, such as heterogeneity, self-organization, and time-variability. Compared to traditional ground or satellite networks, SAGIN is affected by limited and unbalanced network resources in all three network segments, making it difficult to obtain the best performance for traffic delivery. Therefore, system integration, protocol optimization, and resource management and allocation in SAGIN are of great significance. To the best of our knowledge, we are the first to present the state of the art of SAGIN, since existing survey papers focus on either a single network segment in space or air, or on space-ground integration, neglecting the integration of all three network segments. In light of this, we present in this paper a comprehensive review of recent research on SAGIN, from network design and resource allocation to performance analysis and optimization. After discussing several existing network architectures, we also point out technology challenges and future directions.

661 citations

Journal ArticleDOI
TL;DR: An in-depth survey of BCoT is presented, the insights of this new paradigm are discussed, and open research directions in this promising area are outlined.
Abstract: The Internet of Things (IoT) is reshaping the incumbent industry into a smart industry featuring data-driven decision-making. However, intrinsic features of IoT result in a number of challenges, such as decentralization, poor interoperability, and privacy and security vulnerabilities. Blockchain technology brings opportunities for addressing the challenges of IoT. In this paper, we investigate the integration of blockchain technology with IoT. We name this synthesis of blockchain and IoT the blockchain of things (BCoT). This paper presents an in-depth survey of BCoT and discusses the insights of this new paradigm. In particular, we first briefly introduce IoT and discuss its challenges. Then, we give an overview of blockchain technology. We next concentrate on the convergence of blockchain and IoT and present the proposed BCoT architecture. We further discuss the issues of using blockchain for fifth generation and beyond IoT, as well as industrial applications of BCoT. Finally, we outline open research directions in this promising area.

654 citations