
Showing papers in "IEEE Network in 2019"


Journal ArticleDOI
TL;DR: In this paper, the authors propose integrating deep reinforcement learning techniques and the federated learning framework with mobile edge systems to optimize mobile edge computing, caching, and communication. They design the "In-Edge AI" framework to intelligently exploit collaboration among devices and edge nodes, exchanging learning parameters for better training and inference of the models, thereby carrying out dynamic system-level optimization and application-level enhancement while reducing unnecessary system communication load.
Abstract: Recently, along with the rapid development of mobile communication technology, edge computing theory and techniques have been attracting more and more attention from global researchers and engineers; they can significantly bridge the capacity of the cloud and the requirements of devices at the network edge, and thus accelerate content delivery and improve the quality of mobile services. In order to bring more intelligence to edge systems than traditional optimization methodologies allow, and driven by current deep learning techniques, we propose to integrate deep reinforcement learning techniques and the federated learning framework with mobile edge systems to optimize mobile edge computing, caching, and communication. We thus design the "In-Edge AI" framework to intelligently utilize collaboration among devices and edge nodes to exchange learning parameters for better training and inference of the models, and to carry out dynamic system-level optimization and application-level enhancement while reducing unnecessary system communication load. "In-Edge AI" is evaluated and shown to achieve near-optimal performance with relatively low learning overhead, while the system remains cognitive and adaptive to mobile communication systems. Finally, we discuss several related challenges and opportunities for unveiling a promising future of "In-Edge AI."

764 citations
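
As an editorial illustration of the parameter-exchange step this framework relies on, the following is a minimal FedAvg-style sketch: edge devices train locally, and only model parameters, not raw data, are averaged. The device count, layer name, and sample counts are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def federated_average(local_weights, sample_counts):
    """Weighted average of per-device model parameters (FedAvg-style).

    local_weights: list of dicts mapping layer name -> np.ndarray
    sample_counts: number of local training samples per device
    """
    total = sum(sample_counts)
    return {name: sum(w[name] * (n / total)
                      for w, n in zip(local_weights, sample_counts))
            for name in local_weights[0]}

# Toy round: three edge devices report locally trained parameters.
devices = [{"fc": np.random.randn(4, 2)} for _ in range(3)]
global_model = federated_average(devices, sample_counts=[100, 50, 150])
print(global_model["fc"].shape)  # (4, 2)
```

Only the aggregated parameters travel between devices and edge nodes, which is what lets the framework cut system communication load.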


Journal ArticleDOI
TL;DR: A number of key technical challenges as well as the potential solutions associated with 6G, including physical-layer transmission techniques, network designs, security approaches, and testbed developments are outlined.
Abstract: With the fast development of smart terminals and emerging new applications (e.g., real-time and interactive services), wireless data traffic has drastically increased, and current cellular networks (even the forthcoming 5G) cannot completely match the quickly rising technical requirements. To meet the coming challenges, the sixth generation (6G) mobile network is expected to set a high technical standard of new spectrum and energy-efficient transmission techniques. In this article, we sketch the potential requirements and present an overview of the latest research on the promising techniques evolving toward 6G, which have recently attracted considerable attention. Moreover, we outline a number of key technical challenges as well as the potential solutions associated with 6G, including physical-layer transmission techniques, network designs, security approaches, and testbed developments.

731 citations


Journal ArticleDOI
TL;DR: This article presents an efficient energy scheduling scheme based on deep reinforcement learning for an IoT-based energy management system built on edge computing infrastructure.
Abstract: In recent years, green energy management systems (smart grid, smart buildings, and so on) have received huge research and industrial attention with the explosive development of smart cities. By introducing Internet of Things (IoT) technology, smart cities are able to achieve exquisite energy management by ubiquitous monitoring and reliable communications. However, long-term energy efficiency has become an important issue when using an IoT-based network structure. In this article, we focus on designing an IoT-based energy management system based on edge computing infrastructure with deep reinforcement learning. First, an overview of IoT-based energy management in smart cities is described. Then the framework and software model of an IoT-based system with edge computing are proposed. After that, we present an efficient energy scheduling scheme with deep reinforcement learning for the proposed framework. Finally, we illustrate the effectiveness of the proposed scheme.

344 citations


Journal ArticleDOI
TL;DR: A secure and intelligent architecture for next-generation wireless networks is proposed by integrating AI and blockchain into wireless networks to enable flexible and secure resource sharing and a new caching scheme is developed by utilizing deep reinforcement learning.
Abstract: Blockchain and AI are promising techniques for next-generation wireless networks. Blockchain can establish a secure and decentralized resource sharing environment. AI can be explored to solve problems with uncertain, time-variant, and complex features. Both of these techniques have recently seen a surge in interest. The integration of these two techniques can further enhance the performance of wireless networks. In this article, we first propose a secure and intelligent architecture for next-generation wireless networks by integrating AI and blockchain into wireless networks to enable flexible and secure resource sharing. Then we propose a blockchain empowered content caching problem to maximize system utility, and develop a new caching scheme by utilizing deep reinforcement learning. Numerical results demonstrate the effectiveness of the proposed scheme.

279 citations


Journal ArticleDOI
TL;DR: In this paper, the authors introduce the basic concept of blockchain and illustrate why a consensus mechanism plays an indispensable role in a blockchain enabled IoT system, and discuss the main ideas of two famous consensus mechanisms, PoW and PoS, and list their limitations in IoT.
Abstract: Blockchain has been regarded as a promising technology for IoT, since it provides significant solutions for decentralized networks that can address trust and security concerns, high maintenance cost problems, and so on. The decentralization provided by blockchain can be largely attributed to the use of a consensus mechanism, which enables peer-to-peer trading in a distributed manner without the involvement of any third party. This article starts by introducing the basic concept of blockchain and illustrating why a consensus mechanism plays an indispensable role in a blockchain-enabled IoT system. Then we discuss the main ideas of two famous consensus mechanisms, PoW and PoS, and list their limitations in IoT. Next, two mainstream DAG-based consensus mechanisms, the Tangle and Hashgraph, are reviewed to show why DAG consensus is more suitable for IoT systems than PoW and PoS. Potential issues and challenges of DAG-based consensus mechanisms to be addressed in the future are discussed in the last section.

235 citations
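
For context on the PoW limitation the article discusses, the toy loop below shows the brute-force hash search at PoW's core; the difficulty target and block payload are illustrative. The per-block search cost grows exponentially with difficulty, which is why PoW is considered too heavy for constrained IoT devices and why the article turns to DAG-based alternatives.

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int) -> int:
    """Find a nonce so that sha256(data + nonce) starts with `difficulty` zero hex digits."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

print("valid nonce:", proof_of_work("tx-batch-42", difficulty=4))
```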


Journal ArticleDOI
TL;DR: The scalability issue is discussed from the perspectives of throughput, storage and networking, and existing enabling technologies for scalable blockchain systems are presented.
Abstract: In the past decade, cryptocurrencies such as Bitcoin and Litecoin have developed rapidly. Blockchain, as the underlying technology of these digital cryptocurrencies, has attracted great attention from academia and industry. Blockchain has many desirable features, such as trustlessness, transparency, anonymity, democracy, automation, decentralization, and security. Despite these promising features, scalability is still a key barrier when blockchain technology is widely used in real business environments. In this article, we focus on the scalability issue and provide a brief survey of recent studies on scalable blockchain systems. We first discuss the scalability issue from the perspectives of throughput, storage, and networking. Then, existing enabling technologies for scalable blockchain systems are presented. We also discuss some research challenges and future research directions for scalable blockchain systems.

202 citations


Journal ArticleDOI
TL;DR: To satisfy heterogeneous requirements of communication, computation, and storage in IoVs, this article constructs an energy-efficient scheduling framework for MEC-enabled IoVs that minimizes the energy consumption of RSUs under task latency constraints.
Abstract: Although modern transportation systems facilitate the daily life of citizens, the ever-increasing energy consumption and air pollution challenge the establishment of green cities. Current studies on green IoV generally concentrate on energy management of either battery-enabled RSUs or electric vehicles. However, computing tasks and load balancing among RSUs have not been fully investigated. In order to satisfy heterogeneous requirements of communication, computation, and storage in IoVs, this article constructs an energy-efficient scheduling framework for MEC-enabled IoVs to minimize the energy consumption of RSUs under task latency constraints. Specifically, a heuristic algorithm is put forward by jointly considering task scheduling among MEC servers and downlink energy consumption of RSUs. To the best of our knowledge, this is a pioneering work focusing on the energy consumption control issues of MEC-enabled RSUs. Performance evaluations demonstrate the effectiveness of our framework in terms of energy consumption, latency, and task blocking possibility. Finally, this article elaborates some major challenges and open issues toward energy-efficient scheduling in IoVs.

200 citations
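
The abstract does not spell the heuristic out, so the following toy greedy assignment is only a plausible reading: each task goes to the feasible MEC server with the lowest energy cost, and tasks that fit nowhere are blocked. The linear energy model and all numbers are assumptions.

```python
def greedy_schedule(tasks, servers, latency_limit):
    """Assign each task (cycles, deadline_s) to the feasible server with minimal energy."""
    assignment = []
    for cycles, deadline in tasks:
        best, best_energy = None, float("inf")
        for i, s in enumerate(servers):
            latency = cycles / s["cpu_hz"]
            energy = cycles * s["joule_per_cycle"]
            if latency <= min(deadline, latency_limit) and energy < best_energy:
                best, best_energy = i, energy
        assignment.append(best)  # None marks a blocked task
    return assignment

servers = [{"cpu_hz": 2e9, "joule_per_cycle": 1e-9},
           {"cpu_hz": 4e9, "joule_per_cycle": 3e-9}]
print(greedy_schedule([(1e9, 0.4), (3e9, 0.5)], servers, latency_limit=1.0))
# [1, None]: the second task misses its deadline on every server.
```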


Journal ArticleDOI
TL;DR: An edge intelligence and blockchain empowered IIoT framework is presented, which achieves flexible and secure edge service management and a cross-domain sharing inspired edge resource scheduling scheme and a credit-differentiated edge transaction approval mechanism are proposed.
Abstract: Edge intelligence is a key enabler for IIoT as it offers smart cloud services in close proximity to the production environment with low latency and less cost. The need for ubiquitous communication, computing, and caching resources in 5G beyond will lead to a growing demand to integrate heterogeneous resources into the edge network. Furthermore, distributed edge services can make resource transactions vulnerable to malicious nodes. Ensuring secure edge services under complex industrial networks is a big challenge. In this article, we present an edge intelligence and blockchain empowered IIoT framework, which achieves flexible and secure edge service management. Then we propose a cross-domain sharing inspired edge resource scheduling scheme and design a credit-differentiated edge transaction approval mechanism. Numerical results indicate that the proposed schemes bring significant improvement in both edge service cost and service capacities.

195 citations


Journal ArticleDOI
TL;DR: This article presents the vision of exploiting MEC for s-health applications and describes two main functionalities that can be implemented leveraging such an architecture to provide efficient data delivery, namely multimodal data compression and edge-based feature extraction for event detection.
Abstract: Improving the efficiency of healthcare systems is a top national interest worldwide. However, the need to deliver scalable healthcare services to patients while reducing costs is a challenging issue. Among the most promising approaches for enabling smart healthcare (s-health) are edge-computing capabilities and next-generation wireless networking technologies that can provide real-time and cost-effective patient remote monitoring. In this article, we present our vision of exploiting MEC for s-health applications. We envision a MEC-based architecture and discuss the benefits that it can bring to realize in-network and context-aware processing so that the s-health requirements are met. We then present two main functionalities that can be implemented leveraging such an architecture to provide efficient data delivery, namely multimodal data compression and edge-based feature extraction for event detection. The former allows efficient and low-distortion compression, while the latter ensures high reliability and fast response for emergency applications. Finally, we discuss the main challenges and opportunities that edge computing could provide and possible directions for future research.

147 citations
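
To make the two functionalities concrete, here is a minimal sketch under assumed inputs: a quantized vital-sign stream is losslessly compressed for delivery, and a tiny feature vector is extracted at the edge for local event detection. The signal, threshold, and codec are illustrative; the paper's actual pipeline is more sophisticated.

```python
import zlib
import numpy as np

# Simulated vital-sign stream: slowly varying heart rate, quantized to int16.
t = np.linspace(0, 20, 600)
signal = (75 + 5 * np.sin(t)).astype(np.int16)

# Multimodal data compression: ship the compressed stream when bandwidth allows.
raw = signal.tobytes()
compressed = zlib.compress(raw, level=9)
print(f"compression ratio: {len(raw) / len(compressed):.1f}x")

# Edge-based feature extraction: send only a small feature vector, alarm locally.
features = {"mean": float(signal.mean()), "max": int(signal.max())}
ALARM_BPM = 120  # assumed emergency threshold
if features["max"] > ALARM_BPM:
    print("emergency event detected at the edge")
print(features)
```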


Journal ArticleDOI
TL;DR: An AI-enhanced offloading framework is proposed for service accuracy maximization, which considers service accuracy as a new metric besides delay and intelligently disseminates traffic to edge servers or through an appropriate path to the remote cloud.
Abstract: The Industrial Internet of Things (IIoT) enables intelligent industrial operations by incorporating artificial intelligence (AI) and big data technologies. An AI-enabled framework typically requires prompt and private cloud-based service to process and aggregate manufacturing data. Thus, integrating intelligence into edge computing is without doubt a promising development trend. Nevertheless, edge intelligence brings heterogeneity to the edge servers, in terms of not only computing capability, but also service accuracy. Most works on offloading in edge computing focus on finding the power-delay trade-off, ignoring service accuracy provided by edge servers as well as the accuracy required by IIoT devices. In this vein, in this article we introduce an intelligent computing architecture with cooperative edge and cloud computing for IIoT. Based on the computing architecture, an AI enhanced offloading framework is proposed for service accuracy maximization, which considers service accuracy as a new metric besides delay, and intelligently disseminates the traffic to edge servers or through an appropriate path to remote cloud. A case study is performed on transfer learning to show the performance gain of the proposed framework.

144 citations
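
One plausible form of the accuracy-aware dispatch rule, sketched under assumed numbers: among executors that meet the deadline, prefer the most accurate edge model and fall back to the cloud otherwise. This is an editorial illustration, not the paper's algorithm.

```python
def dispatch(deadline_s, required_accuracy, edge_servers, cloud):
    """Maximize service accuracy subject to a delay constraint."""
    feasible = [s for s in edge_servers
                if s["accuracy"] >= required_accuracy and s["latency_s"] <= deadline_s]
    if feasible:
        return max(feasible, key=lambda s: s["accuracy"])["name"]
    if cloud["latency_s"] <= deadline_s:
        return cloud["name"]  # accurate but slower path to the remote cloud
    return "reject"

edges = [{"name": "edge-A", "latency_s": 0.05, "accuracy": 0.90},
         {"name": "edge-B", "latency_s": 0.08, "accuracy": 0.97}]
cloud = {"name": "cloud", "latency_s": 0.30, "accuracy": 0.99}
print(dispatch(0.1, 0.95, edges, cloud))  # edge-B
```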


Journal ArticleDOI
TL;DR: A smart security framework for VANETs equipped with edge computing nodes and 5G technology has been designed to enhance the capabilities of communication and computation in the modern smart city environment.
Abstract: With the exponential growth of technologies such as IoT, edge computing, and 5G, a tremendous amount of structured and unstructured data has been generated from different applications in the smart city environment in recent years. Thus, there is a need to develop sophisticated techniques that can efficiently process such huge volumes of data. One of the important components of smart cities, ITS, has led to many applications, including surveillance, infotainment, real-time traffic monitoring, and so on. However, its security, performance, and availability are major concerns facing the research community. The existing solutions, such as cellular networks, RSUs, and mobile cloud computing, are far from perfect because they are highly dependent on centralized architecture and bear the cost of additional infrastructure deployment. Also, the conventional methods of data processing are not capable of handling dynamic and scalable data efficiently. To mitigate these issues, this article proposes an advanced vehicular communication technique where RSUs are replaced by edge computing platforms. Then secure V2V and V2E communication is designed using the Quotient filter, a probabilistic data structure. In summary, a smart security framework for VANETs equipped with edge computing nodes and 5G technology has been designed to enhance the capabilities of communication and computation in the modern smart city environment. It has been experimentally demonstrated that the use of edge nodes as an intermediate interface between vehicle and cloud reduces access latency and avoids congestion in the backbone network, which allows quick decisions to be made based on the traffic scenario in the geographical location of the vehicles. The proposed scheme outperforms conventional vehicular models by providing an energy-efficient, secure system with minimum delay.
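
The Quotient filter named above is a compact probabilistic membership structure. The sketch below conveys only the quotient/remainder idea behind it; a production quotient filter packs remainders into one table with run/cluster metadata, which is omitted here for brevity.

```python
import hashlib

class SimpleQuotientFilter:
    """Simplified sketch: split each fingerprint into a quotient (bucket index)
    and a remainder (stored value). False positives are possible on hash
    collisions; false negatives are not."""

    def __init__(self, q_bits=16, r_bits=16):
        self.q_bits, self.r_bits = q_bits, r_bits
        self.buckets = {}

    def _fingerprint(self, item: bytes):
        h = int.from_bytes(hashlib.sha256(item).digest()[:8], "big")
        f = h & ((1 << (self.q_bits + self.r_bits)) - 1)
        return f >> self.r_bits, f & ((1 << self.r_bits) - 1)

    def add(self, item: bytes):
        q, r = self._fingerprint(item)
        self.buckets.setdefault(q, set()).add(r)

    def may_contain(self, item: bytes) -> bool:
        q, r = self._fingerprint(item)
        return r in self.buckets.get(q, set())

qf = SimpleQuotientFilter()
qf.add(b"vehicle:GJ-01-XX-1234")
print(qf.may_contain(b"vehicle:GJ-01-XX-1234"))  # True
print(qf.may_contain(b"vehicle:unknown"))        # almost certainly False
```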

Journal ArticleDOI
TL;DR: A hybrid computing model, UAV-Edge-Cloud, is proposed, bringing edge/cloud computing and UAV swarms together to achieve high quality of service (QoS) guarantees; simulation results show that this approach can improve the QoS of UAV swarms effectively.
Abstract: In this article, we propose a hybrid computing model, UAV-Edge-Cloud, bringing edge/cloud computing and UAV swarm together to achieve high quality of service (QoS) guarantees. First, we design this novel hybrid computing framework to provide powerful resources to support resource-intensive applications and real-time tasks at edge networks. Next, we discuss some potential applications for smart cities and raise open research issues of the proposed hybrid framework. We then study a joint task placement and routing problem for latency-critical applications as a case study. Finally, the simulation results show that our approach can improve the QoS of UAV swarms effectively.

Journal ArticleDOI
TL;DR: In order to accommodate large-size images with storage-constrained blocks, a carefully selected feature vector from each medical image is captured and a customized transaction structure is designed, which protects the privacy of medical images and image features.
Abstract: With the advent of medical IoT devices, the types and volumes of medical images have significantly increased. Retrieving of medical images is of great importance to facilitate disease diagnosis and improve treatment efficiency. However, it may raise privacy concerns from individuals, since medical images contain patients' sensitive and private information. Existing studies on retrieval of medical data either fail to protect sensitive information of medical images or are limited to a single image data provider. In this article, we propose a blockchain-based system for medical image retrieval with privacy protection. We first describe the typical scenarios of medical image retrieval and summarize the corresponding requirements in system design. Using the emerging blockchain techniques, we present the layered architecture and threat model of the proposed system. In order to accommodate large-size images with storage-constrained blocks, we capture a carefully selected feature vector from each medical image and design a customized transaction structure, which protects the privacy of medical images and image features. We also discuss the challenges and opportunities of future research.
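
A minimal sketch of the storage idea, under assumed field names: the transaction carries only a compact feature vector and a digest of the image, so storage-constrained blocks never hold the image itself. The paper's customized structure additionally protects the features themselves, which this sketch omits.

```python
import hashlib, json, time

def make_image_transaction(image_bytes: bytes, feature_vector, owner_id: str):
    """Transaction sketch: feature vector on-chain, full image off-chain."""
    return {
        "timestamp": time.time(),
        "owner": hashlib.sha256(owner_id.encode()).hexdigest(),  # pseudonymized
        "image_digest": hashlib.sha256(image_bytes).hexdigest(),
        "features": [round(float(x), 4) for x in feature_vector],
    }

tx = make_image_transaction(b"<dicom bytes>", [0.12, 0.98, 0.33], "patient-42")
print(json.dumps(tx, indent=2))
```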

Journal ArticleDOI
TL;DR: A brief survey of the challenges and opportunities of THz band operation in wireless communication, along with some potential applications and future research directions is provided.
Abstract: With 5G Phase 1 finalized and 5G Phase 2 recently defined by 3GPP, the mobile communication community is on the verge of deciding what will be the Beyond-5G (B5G) system. B5G is expected to further enhance network performance, for example, by supporting throughput per device up to terabits per second and increasing the frequency range of usable spectral bands significantly. In fact, one of the main pillars of 5G networks has been radio access extension to the millimeter-wave bands. However, new envisioned services, asking for more and more throughput, require the availability of one order of magnitude more spectrum chunks, thus suggesting moving the operations into the THz domain. This move will introduce significant new multidisciplinary research challenges emerging throughout the wireless communication protocol stacks, including the way the mobile network is modeled and deployed. This article, therefore, provides a brief survey of the challenges and opportunities of THz band operation in wireless communication, along with some potential applications and future research directions.

Journal ArticleDOI
TL;DR: To evolve with the new computing and communication paradigms, the CIoT ecosystem has to update itself by absorbing new capabilities such as deep learning, the CIoT sensing system, data analytics, and cognition in providing human-like intelligence.
Abstract: A new network paradigm, CIoT, has been proposed by applying cognitive computing technologies, which derive from cognitive science and artificial intelligence, in combination with the data generated by connected IoT devices and the actions that these devices perform. The development of cognitive computing is very important in the above process to meet key technical challenges, such as the generation of big sensory data, efficient computing/storage at the CIoT edge, and the integration of multiple data sources and types. On the other hand, to evolve with the new computing and communication paradigms, the CIoT ecosystem has to update itself by absorbing new capabilities such as deep learning, the CIoT sensing system, data analytics, and cognition in providing human-like intelligence.

Journal ArticleDOI
TL;DR: This article focuses on the principles and models of resource allocation algorithms in 5G network slicing and introduces the basic ideas of SDN and NFV with their roles in network slicing.
Abstract: With the rapid and sustained growth of network demands, 5G telecommunication networks are expected to provide flexible, scalable, and resilient communication and network services, not only for traditional network operators, but also for vertical industries, OTT, and third parties to satisfy their different requirements. Network slicing is a promising technology to establish customized end-to-end logical networks comprising dedicated and shared resources. By leveraging SDN and NFV, network slices associated with resources can be tailored to satisfy diverse QoS and SLA requirements. Resource allocation of network slicing plays a pivotal role in load balancing, resource utilization, and networking performance. In this article, we focus on the principles and models of resource allocation algorithms in 5G network slicing. We first introduce the basic ideas of SDN and NFV with their roles in network slicing. The MO architecture of network slicing is also studied, which provides a fundamental framework for resource allocation algorithms. Then, resource types with corresponding isolation levels in RAN slicing and CN slicing are analyzed, respectively. Furthermore, we categorize the mathematical models of resource allocation algorithms based on their objectives and elaborate them with typical examples. Finally, open research issues are identified with potential solutions.
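
As a worked instance of the kind of mathematical model the article categorizes, the sketch below solves a toy utility-maximizing bandwidth split across three slices as a linear program; the weights, capacity, and isolation bounds are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

#   maximize sum_i w_i * x_i   s.t.   sum_i x_i <= C,   lo_i <= x_i <= hi_i
# where x_i is the bandwidth (MHz) granted to slice i.
w = np.array([3.0, 2.0, 1.0])            # assumed slice priorities
C = 100.0                                # total bandwidth (MHz)
bounds = [(10, 60), (20, 40), (5, 50)]   # per-slice isolation guarantees

# linprog minimizes, so negate the utility weights.
res = linprog(c=-w, A_ub=np.ones((1, 3)), b_ub=[C], bounds=bounds, method="highs")
print("allocation (MHz):", res.x)        # expected: [60. 35. 5.]
```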

Journal ArticleDOI
TL;DR: This article studies how to allocate edge resources for average service response time minimization and proposes algorithms to achieve this goal.
Abstract: With IoT-based smart cities, massive heterogeneous IoT devices are running diverse advanced services for unprecedented intelligence and efficiency in various domains of city life. Given the exponentially growing number of IoT devices and the large number of smart city services as well as their different QoS requirements, it has been a big challenge for servers to optimally allocate limited resources to all hosted applications for satisfactory performance. Note that by pushing the computing and storage resources to the proximity of end IoT devices, and deploying applications in distributed edge servers, edge computing technology appears to be a promising solution for this challenge. Toward this, we study in this article how to allocate edge resources for average service response time minimization. Besides the proposed algorithms, extensive numerical results are also presented to validate their efficacy.
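
A classical special case of this problem has a closed form: if each edge server is modeled as an M/M/1 queue, splitting total capacity in proportion to the square roots of the arrival rates minimizes the average response time. The rates and capacity below are illustrative; the article's algorithms handle the general setting.

```python
import numpy as np

def allocate_capacity(arrivals, total_capacity):
    """Square-root rule for M/M/1 servers: minimizes
    sum_i (lam_i / Lam) * 1 / (mu_i - lam_i) subject to sum_i mu_i = total."""
    lam = np.asarray(arrivals, dtype=float)
    slack = total_capacity - lam.sum()
    assert slack > 0, "system would be overloaded"
    mu = lam + np.sqrt(lam) / np.sqrt(lam).sum() * slack
    avg_rt = float(np.sum(lam / lam.sum() / (mu - lam)))
    return mu, avg_rt

mu, avg_rt = allocate_capacity([10, 40, 25], total_capacity=100)
print("service rates:", mu.round(2), "avg response time:", round(avg_rt, 4))
```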

Journal ArticleDOI
TL;DR: This article proposes a 5G-enabled Tactile Internet (TI)-based telesurgery architecture and presents a recent case study on the world's first successfully executed telestenting heart surgery; the analysis shows that the architecture with TI as a network backbone has faster response time and higher reliability than the existing system.
Abstract: Telesurgery in the 5G era has huge potential to deliver healthcare surgical services to remote locations using high-speed data transfer over a wireless communication channel. It provides benefits to society in view of its improved precision and accuracy in diagnosing patients even from remote locations. However, the existing traditional telesurgery system has high communication latency and overhead, which limits its applicability in a wide range of future applications. To mitigate these issues, in this article, we analyze and give insights on the 5G-enabled Tactile Internet (TI)-based telesurgery system for Healthcare 4.0. The URLLC service of 5G ensures an ultra-low-latency (< 1 ms) and ultra-high-reliability (99.999 percent) communication channel for remote surgery. We propose an architecture for telesurgery with two different aspects of the communication channel: the traditional network and 5G-enabled TI. Then we present a recent case study on the world's first successfully executed telestenting heart surgery. The analysis shows that the proposed architecture with TI as a network backbone has faster response time and higher reliability in comparison to the existing system. Finally, some key open issues and research challenges of the traditional telesurgery architecture in terms of latency and reliability are highlighted.

Journal ArticleDOI
TL;DR: Boomerang is proposed, an on-demand cooperative DNN inference framework for edge intelligence under the IIoT environment that exploits DNN right-sizing and DNN partition to execute DNN inference tasks with low latency as well as high accuracy.
Abstract: With the revolution of smart industry, more and more Industrial Internet of Things (IIoT) devices as well as AI algorithms are deployed to achieve industrial intelligence. While applying computation-intensive deep learning on IIoT devices, however, it is challenging to meet the critical latency requirement for industrial manufacturing. Traditional wisdom resorts to the cloud-centric paradigm but still works either inefficiently or ineffectively due to the heavy transmission latency overhead. To address this challenge, we propose Boomerang, an on-demand cooperative DNN inference framework for edge intelligence under the IIoT environment. Boomerang exploits DNN right-sizing and DNN partition to execute DNN inference tasks with low latency as well as high accuracy. DNN right-sizing reshapes the amount of DNN computation via the early-exit mechanism so as to reduce the total runtime of DNN inference. DNN partition adaptively segments DNN computation between the IoT devices and the edge server in order to leverage hybrid computation resources to achieve DNN inference immediacy. Combining these two techniques, Boomerang carefully selects the partition point and the exit point to maximize performance while meeting the efficiency requirement. To further reduce the manual overhead of model profiling at the install phase, we develop an advanced version of Boomerang with the DRL model, achieving end-to-end automatic DNN inference plan generation. The prototype implementation and evaluations demonstrate the effectiveness of both versions of Boomerang in achieving efficient edge intelligence for IIoT.
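
The joint choice of exit point and partition point can be sketched as a small enumerative search over estimated latencies, as below. All timings, feature-map sizes, and branch accuracies are invented; the real Boomerang profiles the model and, in its advanced version, replaces this search with a DRL agent.

```python
def plan_inference(layer_time_s, layer_out_mbits, input_mbits, exits,
                   edge_speedup, uplink_mbps, budget_s):
    """Pick the most accurate (exit, partition) plan that fits the latency budget.

    layer_time_s[i]    -- on-device compute time of layer i
    layer_out_mbits[i] -- size of layer i's output feature map
    exits              -- {exit_after_layer: accuracy} early-exit branches
    """
    best = (None, None, 0.0)  # (exit point, partition point, accuracy)
    for exit_at, acc in sorted(exits.items()):
        for cut in range(exit_at + 2):  # layers [0, cut) run on the device
            tx = (input_mbits if cut == 0 else layer_out_mbits[cut - 1]) / uplink_mbps
            device = sum(layer_time_s[:cut])
            edge = sum(layer_time_s[cut:exit_at + 1]) / edge_speedup
            if device + tx + edge <= budget_s and acc > best[2]:
                best = (exit_at, cut, acc)
    return best

# Toy 4-layer model with early exits after layers 1 and 3.
plan = plan_inference(layer_time_s=[0.02, 0.03, 0.05, 0.05],
                      layer_out_mbits=[4.0, 2.0, 1.0, 0.1], input_mbits=8.0,
                      exits={1: 0.80, 3: 0.95},
                      edge_speedup=10.0, uplink_mbps=20.0, budget_s=0.15)
print("exit after layer %s, partition at layer %s, accuracy %.2f" % plan)
```

On this toy instance the search settles on the early exit after layer 1 with the first two layers kept on the device, trading accuracy for the 0.15 s budget.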

Journal ArticleDOI
TL;DR: This article develops a 5G IoT network with UAVs for future smart city architecture, emphasizing IoT in the sky to unify the globe via heterogeneous smart devices through 3D connectivity.
Abstract: The paradigm of smart cities links the industry of telecommunications directly to sustainable economic growth and high quality of life. To meet this trend, this article develops a 5G IoT network with UAVs for future smart city architecture. We particularly emphasize IoT in the sky in a bid to unify the global world via heterogeneous smart devices through 3D connectivity. In our design, UAVs form a 5G hierarchical IoT network in the sky by connecting to a number of base stations on the ground. Simulation results reveal that the leader UAV has a significant impact on the performance of the whole system, and the proposed approach has an obvious advantage over the existing ones.

Journal ArticleDOI
TL;DR: This work designs and implements a mobility-aware data processing service migration management agent that can automatically learn the user mobility pattern and accordingly control the service migration among the edge servers to minimize the operational cost at runtime.
Abstract: With the advent of edge computing, it is highly recommended to extend some cloud services to the network edge such that the services can be provisioned in the proximity of end users, with better performance efficiency and cost efficiency. Compared to cloud computing, edge computing has high dynamics, and therefore the resources shall be correspondingly managed in an adaptive way. Traditional model-based resource management approaches are limited in practical application due to the involvement of some assumptions or prerequisites. We think it is desirable to introduce a model-free approach that can fit the network dynamics well without any prior knowledge. To this end, we introduce a model-free DRL approach to efficiently manage the resources at the network edge. Following the design principle of DRL, we design and implement a mobility-aware data processing service migration management agent. The experiments show that our agent can automatically learn the user mobility pattern and accordingly control the service migration among the edge servers to minimize the operational cost at runtime. Some potential future research challenges are also presented.
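
A tabular stand-in for the agent's learning loop, under an assumed cyclic mobility pattern and toy costs (the article's approach uses a deep network in place of the table):

```python
import random
import numpy as np

N_CELLS = 4
MIGRATION_COST, LATENCY_COST = 2.0, 1.0
Q = np.zeros((N_CELLS, N_CELLS, N_CELLS))  # Q[user_cell, service_cell, action]
alpha, gamma, eps = 0.1, 0.9, 0.1

def step_cost(user, service, action):
    migrate = MIGRATION_COST if action != service else 0.0
    latency = LATENCY_COST * abs(user - action)  # distance-shaped access latency
    return migrate + latency

user, service = 0, 0
for _ in range(20000):
    action = (random.randrange(N_CELLS) if random.random() < eps
              else int(Q[user, service].argmin()))  # cost-minimizing Q-learning
    cost = step_cost(user, service, action)
    next_user = (user + 1) % N_CELLS             # assumed mobility pattern
    Q[user, service, action] += alpha * (
        cost + gamma * Q[next_user, action].min() - Q[user, service, action])
    user, service = next_user, action

print("learned placement for user at cell 2, service at cell 2:",
      int(Q[2, 2].argmin()))
```

The agent learns when following the user is worth the migration cost, which is the trade-off the article's DRL agent optimizes at runtime.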

Journal ArticleDOI
TL;DR: This article proposes a cross-domain solution based on a cloud/fog-computing pattern and the IoT AI service framework that achieves intelligent and flexible autonomous driving task processing and enhances transportation performance with the help of the Cognitive Internet of Vehicles.
Abstract: As it combines AI and IoT, autonomous driving has attracted a great deal of attention from both academia and industry because of its benefits to the economy and society. However, ultra-low delay and ultra-high reliability cannot be guaranteed by individual autonomous vehicles with limited intelligence and the existing architectures of the Internet of Vehicles. In this article, based on a cloud/fog-computing pattern and the IoT AI service framework, we propose a cross-domain solution for auto-driving. In contrast to existing studies, which mainly focus on communication technologies, our solution achieves intelligent and flexible autonomous driving task processing and enhances transportation performance with the help of the Cognitive Internet of Vehicles. We first present an overview of the enabling technology and the architecture of the Cognitive Internet of Vehicles for autonomous driving. Then we discuss the autonomous driving Cognitive Internet of Vehicles specifically from the perspectives of what to compute, where to compute, and how to compute. Simulations are then conducted to prove the effect of the Cognitive Internet of Vehicles for autonomous driving. Our study explores the research value and opportunities of the Cognitive Internet of Vehicles in autonomous driving.

Journal ArticleDOI
TL;DR: This article introduces a management and orchestration architecture that incorporates Software Defined Networking (SDN) and Network Function Virtualization (NFV) components into the basic 3GPP network slice management.
Abstract: A sophisticated and efficient network slicing architecture is needed to support the orchestration of network slices across multiple administrative domains. Such a multi-domain architecture shall be agnostic of the underlying virtualization and network infrastructure technologies. Its objective is to extend the traditional orchestration, management, and control capabilities by means of models and constructs in order to form a well-stitched composition of network slices. To facilitate such a composition of networking and compute/storage resources, this article introduces a management and orchestration architecture that incorporates Software Defined Networking (SDN) and Network Function Virtualization (NFV) components into the basic 3GPP network slice management. The proposed architecture is broadly divided into four major strata, namely the Multi-domain Service Conductor Stratum, Domain-specific Fully-Fledged Orchestration Stratum, Sub-Domain MANO and Connectivity Stratum, and Logical Multi-domain Slice Instance Stratum. Each of these strata is described in detail, providing the fundamental operational specifics for instantiating and managing the resulting federated network slices.

Journal ArticleDOI
TL;DR: In this article, a supervised DL model is proposed to solve the sub-band and power allocation problem in a multi-cell network, using the data generated by a genetic algorithm, and then test the accuracy of the proposed model in predicting the resource allocation solutions.
Abstract: The increased complexity and heterogeneity of emerging 5G and B5G wireless networks will require a paradigm shift from traditional resource allocation mechanisms. Deep learning (DL) is a powerful tool whereby a multi-layer neural network can be trained to model a resource management algorithm using network data. Therefore, resource allocation decisions can be obtained without intensive online computations that would otherwise be required to solve resource allocation problems. In this context, this article focuses on the application of DL to obtain solutions for the radio resource allocation problems in multi-cell networks. Starting with a brief overview of a DNN as a DL model, relevant DNN architectures, and the data training procedure, we provide an overview of existing state-of-the-art work applying DL in the context of radio resource allocation. A qualitative comparison is provided in terms of objectives, inputs/outputs, learning, and data training methods. Then we present a supervised DL model to solve the sub-band and power allocation problem in a multi-cell network. Using the data generated by a genetic algorithm, we first train the model and then test the accuracy of the proposed model in predicting the resource allocation solutions. Simulation results show that the trained DL model is able to provide the desired optimal solution 86.3 percent of the time.
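
A compressed sketch of the supervised pipeline described here, with synthetic data standing in for both the channel features and the genetic algorithm's labels (the paper's 86.3 percent figure comes from its own dataset, not from this toy):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 8))   # stand-in for per-sub-band channel gains
y = X.argmax(axis=1)             # stand-in for GA-computed "optimal" sub-band

# Train offline on GA-labeled data; at run time, cheap inference replaces the
# expensive online optimization.
model = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0)
model.fit(X[:4000], y[:4000])
print("held-out prediction accuracy:", model.score(X[4000:], y[4000:]))
```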

Journal ArticleDOI
TL;DR: The privacy issues in Bitcoin are analyzed and some existing privacy-enhancing techniques in blockchain-based cryptocurrencies as well as some privacy-focused altcoins are investigated, and two possible solutions from a top view to balance privacy and regulation of blockchain-based cryptocurrencies are proposed.
Abstract: Privacy is supreme in cryptocurrencies since most users do not want to reveal their identities or the transaction amount in financial transactions. Nevertheless, achieving privacy in blockchain-based cryptocurrencies remains challenging since blockchain is by default a public ledger. For instance, Bitcoin provides builtin pseudonymity rather than true anonymity, which can be compromised by analyzing the transactions. Several solutions have been proposed to enhance the transaction privacy of Bitcoin. Unfortunately, full anonymity is not always desirable, because malicious users are able to conduct illegal transactions, such as money laundering and drug trading, under the cover of anonymity in cryptocurrencies. As a result, regulation in blockchain-based cryptocurrencies is very essential. In this article, we analyze the privacy issues in Bitcoin and investigate some existing privacy-enhancing techniques in blockchain-based cryptocurrencies as well as some privacy-focused altcoins. In addition, we review and compare some works dealing with regulation of cryptocurrencies. Finally, we propose two possible solutions from a top view to balance privacy and regulation of blockchain-based cryptocurrencies. One solution is based on decentralized group signature, in which a group manager is responsible for building a group and tracing the real payer of the group in a transaction. The other solution is based on verifiable encryption, in which a tracing manager is not actively involved in normal transactions but can trace suspicious transactions via an encrypted tag.
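
A loose illustration of the second proposal (the encrypted tag), with Fernet symmetric encryption standing in for actual verifiable encryption, which would additionally let validators check that a tag is well formed without opening it. Key handling and field names are assumptions.

```python
from cryptography.fernet import Fernet

manager_key = Fernet.generate_key()   # held only by the tracing manager
manager = Fernet(manager_key)

# Each transaction carries the payer identity encrypted under the manager's key.
tx = {"amount": 5, "tag": manager.encrypt(b"payer:alice")}

# Normal validation never touches the tag; for a suspicious transaction the
# tracing manager opens it after the fact.
print(manager.decrypt(tx["tag"]))     # b'payer:alice'
```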

Journal ArticleDOI
Jinke Ren, Yinghui He, Guan Huang, Guanding Yu, Yunlong Cai, Zhaoyang Zhang
TL;DR: In this paper, the authors propose a hierarchical computation architecture by inserting an edge layer between the conventional user layer and cloud layer, and further develop an innovative operation mechanism to improve the performance of mobile AR applications.
Abstract: In order to mitigate the long processing delay and high energy consumption of mobile augmented reality (AR) applications, mobile edge computing (MEC) has been recently proposed and is envisioned as a promising means to deliver better Quality of Experience (QoE) for AR consumers. In this article, we first present a comprehensive AR overview, including the indispensable components of general AR applications, fashionable AR devices, and several existing techniques for overcoming the thorny latency and energy consumption problems. Then we propose a novel hierarchical computation architecture by inserting an edge layer between the conventional user layer and cloud layer. Based on the proposed architecture, we further develop an innovative operation mechanism to improve the performance of mobile AR applications. Three key technologies are also discussed to further assist the proposed AR architecture. Simulation results are finally provided to verify that our proposals can significantly improve latency and energy performance as compared to existing baseline schemes.

Journal ArticleDOI
TL;DR: This article proposes a new AI-enabled smart edge architecture for heterogeneous IoT that combines edge computing, caching, and communication, together with the Smart-Edge-CoCaCo algorithm, whose computation delay is lower than that of the traditional cloud computing model as computing task data and the number of concurrent users grow.
Abstract: The development of mobile communication technology, hardware, distributed computing, and artificial intelligence (AI) technology has promoted the application of edge computing in the field of heterogeneous IoT in order to overcome the defects of the traditional cloud computing model in the era of big data. In this article, we first propose a new AI-enabled smart edge with heterogeneous IoT architecture that combines edge computing, caching, and communication. Then we propose the Smart-Edge-CoCaCo algorithm. To minimize total delay and confirm the computation offloading decision, Smart-Edge-CoCaCo uses joint optimization of the wireless communication model, the collaborative filter caching model in edge cloud, and the computation offloading model. Finally, we built an emotion interaction testbed to perform computational delay experiments in real environments. The experiment results show that the computation delay of the Smart-Edge-CoCaCo algorithm is lower than that of the traditional cloud computing model with the increase of computing task data and the number of concurrent users.
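
The offloading decision at the core of Smart-Edge-CoCaCo can be caricatured as a delay comparison; the request-size factor, rates, and cycle counts below are assumptions, and the actual algorithm jointly optimizes the communication, caching, and offloading models.

```python
def offload_decision(task_bits, cpu_cycles, local_hz, edge_hz, uplink_bps, cache_hit):
    """Pick the lower-delay option; a collaborative-filter cache hit at the
    edge means only a small request needs to travel."""
    if cache_hit:
        return "edge (cached)", task_bits * 0.01 / uplink_bps  # tiny request only
    local = cpu_cycles / local_hz
    edge = task_bits / uplink_bps + cpu_cycles / edge_hz
    return ("edge", edge) if edge < local else ("local", local)

print(offload_decision(task_bits=8e6, cpu_cycles=2e9, local_hz=1e9,
                       edge_hz=8e9, uplink_bps=20e6, cache_hit=False))
# ('edge', 0.65): uploading plus edge compute beats 2.0 s of local compute.
```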

Journal ArticleDOI
TL;DR: The articles in this section cover the most recent research and development on the enabling technologies for IoT-based smart cities and stimulate discussions on state-of-the-art and innovative aspects in the field.
Abstract: The articles in this special section focus on the application of the Internet of Things in smart cities. Smart cities are creating emerging innovation in academia, industry, and government. A city may be called “smart” when investments in human and social capital and traditional and modern communication infrastructure fuel sustainable economic growth and a high quality of life, with wise management of natural resources through participatory governance. A smart city is also defined as a city connecting the physical infrastructure, the ICT infrastructure, the social infrastructure, and the business infrastructure to leverage the collective intelligence of the city. Smart cities are usually established relying on both advanced infrastructures and modern information and communication technologies. The articles in this section cover the most recent research and development on the enabling technologies for IoT-based smart cities and stimulate discussions on state-of-the-art and innovative aspects in the field.

Journal ArticleDOI
TL;DR: The state of the art in satellite-based data networks is discussed, and a novel network architecture is presented where Software-Defined Networking and Network Function Virtualization serve as key enabling technologies.
Abstract: The Internet of Things for terrestrial deployments is a major part of next-generation 5G wireless systems. However, there are many use cases, such as monitoring of remote areas, Internet provisioning to under-served or disrupted regions, or intelligent global transport management, which require a more global, scalable, flexible, and resilient solution. In this article, the Internet of Space Things, a ubiquitous cyber-physical system for realizing true global connectivity, is introduced. Within this context, this article discusses the state of the art in satellite-based data networks and presents a novel network architecture where Software-Defined Networking and Network Function Virtualization serve as key enabling technologies. Further, the major challenges in terms of network design, routing, and resource allocation are also presented. To this end, the Internet of Space Things is the ultimate cyber-physical system, with much broader application and service domains.