
Showing papers in "IEEE Internet of Things Journal in 2018"


Journal ArticleDOI
TL;DR: This survey provides the definition of MEC, its advantages, architectures, and application areas; security and privacy issues and related existing solutions are also discussed.
Abstract: Mobile edge computing (MEC) is an emergent architecture in which cloud computing services are extended to the edge of networks by leveraging mobile base stations. As a promising edge technology, it can be applied to mobile, wireless, and wireline scenarios, using software and hardware platforms located at the network edge in the vicinity of end-users. MEC provides seamless integration of multiple application service providers and vendors toward mobile subscribers, enterprises, and other vertical segments. It is an important component of the 5G architecture, supporting a variety of innovative applications and services that require ultralow latency. This paper aims to present a comprehensive survey of relevant research and technological developments in the area of MEC. It provides the definition of MEC, its advantages, architectures, and application areas, highlighting in particular related research and future directions. Finally, security and privacy issues and related existing solutions are also discussed.

1,815 citations


Journal ArticleDOI
Oscar Novo
TL;DR: This paper proposes a new architecture for arbitrating roles and permissions in IoT based on blockchain technology and shows that blockchain could serve as an access management technology in specific scalable IoT scenarios.
Abstract: The Internet of Things (IoT) is stepping out of its infancy into full maturity and establishing itself as a part of the future Internet. One of the technical challenges of having billions of devices deployed worldwide is the ability to manage them. Although access management technologies exist in IoT, they are based on centralized models, which introduce a variety of technical limitations to managing them globally. In this paper, we propose a new architecture for arbitrating roles and permissions in IoT. The new architecture is a fully distributed access control system for IoT based on blockchain technology. The architecture is backed by a proof-of-concept implementation and evaluated in realistic IoT scenarios. The results show that blockchain technology can be used as an access management technology in specific scalable IoT scenarios.

992 citations


Journal ArticleDOI
TL;DR: This paper presents the IoT ecosystem, shows how the combination of IoT and DA is enabling smart agriculture, and provides future trends and opportunities categorized into technological innovations, application scenarios, business, and marketability.
Abstract: The surge in global population is compelling a shift toward smart agriculture practices. This, coupled with diminishing natural resources, the limited availability of arable land, and increasingly unpredictable weather conditions, makes food security a major concern for most countries. As a result, the Internet of Things (IoT) and data analytics (DA) are being employed to enhance operational efficiency and productivity in the agriculture sector. There is a paradigm shift from the use of wireless sensor networks (WSNs) as the major driver of smart agriculture to the use of IoT and DA. The IoT integrates several existing technologies, such as WSN, radio frequency identification, cloud computing, middleware systems, and end-user applications. In this paper, several benefits and challenges of IoT are identified. We present the IoT ecosystem and how the combination of IoT and DA is enabling smart agriculture. Furthermore, we provide future trends and opportunities, which are categorized into technological innovations, application scenarios, business, and marketability.

814 citations


Journal ArticleDOI
TL;DR: The analysis shows that augmenting sensory information with off-board information has the potential to enable low-cost localization systems with high accuracy and robustness; however, their performance depends on the penetration rate of nearby connected vehicles or infrastructure and on the quality of network service.
Abstract: For an autonomous vehicle to operate safely and effectively, an accurate and robust localization system is essential. While there are a variety of vehicle localization techniques in the literature, there is a lack of effort in comparing these techniques and identifying their potentials and limitations for autonomous vehicle applications. Hence, this paper evaluates the state-of-the-art vehicle localization techniques and investigates their applicability to autonomous vehicles. The analysis starts by discussing the techniques that merely use the information obtained from on-board vehicle sensors. It is shown that although some techniques can achieve the accuracy required for autonomous driving, they suffer from the high cost of the sensors and from sensor performance limitations in different driving scenarios (e.g., cornering and intersections) and different environmental conditions (e.g., darkness and snow). This paper continues the analysis by considering the techniques that benefit from off-board information obtained over V2X communication channels, in addition to vehicle sensory information. The analysis shows that augmenting sensory information with off-board information has the potential to enable low-cost localization systems with high accuracy and robustness; however, their performance depends on the penetration rate of nearby connected vehicles or infrastructure and on the quality of network service.

570 citations


Journal ArticleDOI
TL;DR: This survey paper investigates the key rationale, the state-of-the-art efforts, the key enabling technologies and research topics, and typical IoT applications benefiting from edge cloud.
Abstract: The Internet is evolving rapidly toward the future Internet of Things (IoT), which will potentially connect billions or even trillions of edge devices that could generate huge amounts of data at very high speed, while some applications may require very low latency. The traditional cloud infrastructure will run into a series of difficulties due to centralized computation, storage, and networking in a small number of datacenters, and due to the relatively long distance between the edge devices and the remote datacenters. To tackle this challenge, edge cloud and edge computing seem to be a promising solution, providing resources closer to the resource-poor edge IoT devices and potentially nurturing a new IoT innovation ecosystem. Such a prospect is enabled by a series of emerging technologies, including network function virtualization and software defined networking. In this survey paper, we investigate the key rationale, the state-of-the-art efforts, the key enabling technologies and research topics, and typical IoT applications benefiting from edge cloud. We aim to draw an overall picture of both ongoing research efforts and future possible research directions through comprehensive discussions.

563 citations


Journal ArticleDOI
TL;DR: This paper tries to bring order to the IoT security panorama, providing a taxonomic analysis from the perspective of the three main key layers of the IoT system model: 1) perception; 2) transportation; and 3) application levels.
Abstract: Social Internet of Things (SIoT) is a new paradigm where the Internet of Things (IoT) merges with social networks, allowing people and devices to interact and facilitating information sharing. However, security and privacy issues are a great challenge for IoT, but they are also enabling factors to create a “trust ecosystem.” In fact, the intrinsic vulnerabilities of IoT devices, with limited resources and heterogeneous technologies, together with the lack of specifically designed IoT standards, represent fertile ground for the expansion of specific cyber threats. In this paper, we try to bring order to the IoT security panorama, providing a taxonomic analysis from the perspective of the three main key layers of the IoT system model: 1) perception; 2) transportation; and 3) application levels. As a result of the analysis, we highlight the most critical issues with the aim of guiding future research directions.

524 citations


Journal ArticleDOI
TL;DR: This work discusses the benefits of IoV along with recent industry standards developed to promote its implementation, and presents recently proposed communication protocols to enable the seamless integration and operation of the IoV.
Abstract: Today, vehicles are increasingly being connected to the Internet of Things, which enables them to provide drivers and passengers with ubiquitous access to information while on the move. However, as the number of connected vehicles keeps increasing, new requirements of vehicular networks (such as seamless, secure, robust, and scalable information exchange among vehicles, humans, and roadside infrastructure) are emerging. In this context, the original concept of vehicular ad-hoc networks is being transformed into a new concept called the Internet of Vehicles (IoV). We discuss the benefits of IoV along with recent industry standards developed to promote its implementation. We further present recently proposed communication protocols to enable the seamless integration and operation of the IoV. Finally, we present future research directions of IoV that require further consideration from the vehicular research community.

471 citations


Journal ArticleDOI
TL;DR: An energy-aware offloading scheme that jointly optimizes communication and computation resource allocation under limited energy and latency constraints, together with an iterative search algorithm combining the interior penalty function method with D.C. (the difference of two convex functions/sets) programming to find the optimal solution.
Abstract: Mobile edge computing (MEC) brings computation capacity to the edge of mobile networks in close proximity to smart mobile devices (SMDs) and contributes to energy saving compared with local computing, but results in increased network load and transmission latency. To investigate the tradeoff between energy consumption and latency, we present an energy-aware offloading scheme that jointly optimizes communication and computation resource allocation under limited energy and latency constraints. In this paper, single-cell and multicell MEC network scenarios are considered at the same time. The residual battery energy of smart devices is incorporated into the definition of the weighting factor between energy consumption and latency. To solve the mixed-integer nonlinear problem of computation offloading and resource allocation, we propose an iterative search algorithm combining the interior penalty function method with D.C. (the difference of two convex functions/sets) programming to find the optimal solution. Numerical results show that the proposed algorithm obtains a lower total cost (i.e., the weighted sum of energy consumption and execution latency) compared with the baseline algorithms, and that the energy-aware weighting factor is of great significance for maintaining the lifetime of SMDs.

467 citations
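
The weighted energy-latency cost at the heart of this scheme is easy to illustrate. Below is a minimal sketch, assuming a simple linear weighting driven by residual battery energy; the abstract does not give the paper's exact weighting definition, so the formula is an assumption.

```python
# Hedged sketch: weighted energy-latency offloading cost (illustrative).
def weighting_factor(residual_energy: float, full_energy: float) -> float:
    """The lower the residual battery, the more weight energy consumption gets."""
    return 1.0 - residual_energy / full_energy      # in [0, 1]

def total_cost(energy_j: float, latency_s: float,
               residual_energy: float, full_energy: float) -> float:
    w = weighting_factor(residual_energy, full_energy)
    return w * energy_j + (1.0 - w) * latency_s

# A device at 20% battery weighs energy at 0.8 and latency at 0.2.
print(total_cost(energy_j=0.5, latency_s=0.1, residual_energy=0.2, full_energy=1.0))
```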


Journal ArticleDOI
TL;DR: An experimental evaluation of edge computing and its enabling technologies in a selected use case represented by mobile gaming shows that edge computing is necessary to meet the latency requirements of applications involving virtual and augmented reality.
Abstract: The amount of data generated by sensors, actuators, and other devices in the Internet of Things (IoT) has substantially increased in the last few years. IoT data are currently processed in the cloud, mostly through computing resources located in distant data centers. As a consequence, network bandwidth and communication latency become serious bottlenecks. This paper advocates edge computing for emerging IoT applications that leverage sensor streams to augment interactive applications. First, we classify and survey current edge computing architectures and platforms, then describe key IoT application scenarios that benefit from edge computing. Second, we carry out an experimental evaluation of edge computing and its enabling technologies in a selected use case represented by mobile gaming. To this end, we consider a resource-intensive 3-D application as a paradigmatic example and evaluate the response delay in different deployment scenarios. Our experimental results show that edge computing is necessary to meet the latency requirements of applications involving virtual and augmented reality. We conclude by discussing what can be achieved with current edge computing platforms and how emerging technologies will impact the deployment of future IoT applications.

448 citations


Journal ArticleDOI
TL;DR: The goal of this paper is the development of an anomaly detection system to prevent the motor of the drone from operating at abnormal temperatures; the experimental results confirm that the proposed system can safely control the drone using information obtained from temperature sensors attached to the motor.
Abstract: Unmanned aerial vehicles (UAVs) are used in many fields, including weather observation, farming, infrastructure inspection, and monitoring of disaster areas. However, currently available UAVs are prone to crashing. The goal of this paper is the development of an anomaly detection system to prevent the motor of the drone from operating at abnormal temperatures. In this anomaly detection system, the temperature of the motor is recorded using DS18B20 sensors. A Raspberry Pi processing unit then uses reinforcement learning to judge whether the motor is operating abnormally. A specially built user interface allows the activity of the Raspberry Pi to be tracked on a tablet for observation purposes. The proposed system provides the ability to land a drone when the motor temperature exceeds an automatically generated threshold. The experimental results confirm that the proposed system can safely control the drone using information obtained from temperature sensors attached to the motor.

443 citations
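
The landing trigger described above reduces to a simple threshold check. A minimal sketch follows, in which read_motor_temp and land are hypothetical stand-ins for the DS18B20 driver and the flight-controller command, and the threshold is taken as given (the paper generates it automatically):

```python
# Hedged sketch: land the drone when the motor temperature exceeds a threshold.
import time

def monitor(read_motor_temp, land, threshold_c: float, period_s: float = 1.0):
    """Poll the motor temperature; trigger landing on the first exceedance."""
    while True:
        temp_c = read_motor_temp()        # hypothetical DS18B20 read
        if temp_c > threshold_c:
            land()                        # hypothetical flight-controller command
            return temp_c
        time.sleep(period_s)
```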


Journal ArticleDOI
TL;DR: An assessment of the role, impact, and challenges of IoT in transforming EPESs is provided, and several opportunities for growth and development are identified.
Abstract: A transformation is underway in electric power and energy systems (EPESs) to provide clean distributed energy for sustainable global economic growth. The Internet of Things (IoT) is at the forefront of this transformation, imparting capabilities such as real-time monitoring, situational awareness and intelligence, control, and cyber security to transform the existing EPES into an intelligent cyber-enabled EPES, which is more efficient, secure, reliable, resilient, and sustainable. Additionally, digitizing the electric power ecosystem using IoT improves asset visibility, enables optimal management of distributed generation, eliminates energy wastage, and creates savings. IoT has a significant impact on EPESs and offers several opportunities for growth and development. There are several challenges with the deployment of IoT for EPESs, and viable solutions need to be developed to overcome them to ensure the continued growth of IoT for EPESs. Advancements in computational intelligence capabilities can evolve an intelligent IoT system by emulating biological nervous systems with cognitive computation and streaming and distributed analytics, including at the edge and device levels. This review paper provides an assessment of the role, impact, and challenges of IoT in transforming EPESs.

Journal ArticleDOI
TL;DR: In this article, the authors utilize queuing theory to conduct a thorough study of the energy consumption, execution delay, and payment cost of offloading processes in a fog computing system, where three queuing models are applied, respectively, to the MD, fog, and cloud centers, and the data rate and power consumption of the wireless link are explicitly considered.
Abstract: The fog computing system is an emerging architecture for providing the computing, storage, control, and networking capabilities needed to realize the Internet of Things. In a fog computing system, mobile devices (MDs) can offload their data or computationally expensive tasks to a fog node within their proximity, instead of the distant cloud. Although offloading can reduce energy consumption at the MDs, it may also incur a larger execution delay, including the transmission time between the MDs and the fog/cloud servers and the waiting and execution time at the servers. Therefore, how to balance energy consumption and delay performance is of research importance. Moreover, based on energy consumption and delay, how to design a cost model for the MDs to enjoy fog and cloud services is also important. In this paper, we utilize queuing theory to conduct a thorough study of the energy consumption, execution delay, and payment cost of offloading processes in a fog computing system. Specifically, three queuing models are applied, respectively, to the MD, fog, and cloud centers, and the data rate and power consumption of the wireless link are explicitly considered. Based on the theoretical analysis, a multiobjective optimization problem is formulated with the joint objective of minimizing the energy consumption, execution delay, and payment cost by finding the optimal offloading probability and transmit power for each MD. Extensive simulation studies are conducted to demonstrate the effectiveness of the proposed scheme, and superior performance over several existing schemes is observed.
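
The queuing analysis described in the abstract rests on standard sojourn-time formulas. A minimal sketch, assuming M/M/1 queues at the device and the fog node and a fixed transmission time (the paper's queuing models and cost terms are richer):

```python
# Hedged sketch: expected offloading delay with M/M/1 queues (illustrative).
def mm1_sojourn(arrival_rate: float, service_rate: float) -> float:
    """Mean time in an M/M/1 system: 1 / (mu - lambda); requires mu > lambda."""
    assert service_rate > arrival_rate, "queue must be stable"
    return 1.0 / (service_rate - arrival_rate)

def expected_delay(p_offload: float, lam: float,
                   mu_local: float, mu_fog: float, tx_time: float) -> float:
    """Weigh local vs. fog processing by the offloading probability."""
    local = mm1_sojourn((1 - p_offload) * lam, mu_local)
    fog = tx_time + mm1_sojourn(p_offload * lam, mu_fog)
    return (1 - p_offload) * local + p_offload * fog

print(expected_delay(p_offload=0.6, lam=5.0, mu_local=8.0, mu_fog=20.0, tx_time=0.02))
```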

Journal ArticleDOI
TL;DR: A cross-layer channel access and routing solution for sensing and actuation is proposed for monitoring and controlling agriculture and farms in rural areas, which reduces network latency to a certain extent.
Abstract: The Internet of Things (IoT) adds a new dimension to the smart farming and agriculture domain. With the use of fog computing and WiFi-based long-distance networks in IoT, it is possible to efficiently connect agriculture and farming bases situated in rural areas. To address their specific requirements, we propose a scalable network architecture for monitoring and controlling agriculture and farms in rural areas. Compared to existing IoT-based agriculture and farming solutions, the proposed solution reduces network latency to a certain extent. Within this architecture, a cross-layer channel access and routing solution for sensing and actuation is proposed. We analyze the network structure in terms of coverage range, throughput, and latency.

Journal ArticleDOI
TL;DR: Results show that the proposed Bayesian belief network classifier-based model achieves high accuracy and a fast response time in determining the state of an event when compared with other classification algorithms, which enhances the utility of the proposed system.
Abstract: Internet of Things (IoT) technology provides a competent and structured approach to handling the service delivery aspects of healthcare in terms of mobile health and remote patient monitoring. IoT generates an unprecedented amount of data that can be processed using cloud computing. But for real-time remote health monitoring applications, the delay caused by transferring data to the cloud and back to the application is unacceptable. In this context, we propose remote patient health monitoring in smart homes using the concept of fog computing at the smart gateway. The proposed model uses advanced techniques and services, such as embedded data mining, distributed storage, and notification services, at the edge of the network. An event-triggered data transmission methodology is adopted to process the patient’s real-time data at the fog layer. Temporal mining is used to analyze event adversity by calculating the patient’s temporal health index. In order to determine the validity of the system, health data of 67 patients in an IoT-based smart home environment were systematically generated for 30 days. Results show that the proposed Bayesian belief network classifier-based model achieves high accuracy and a fast response time in determining the state of an event when compared with other classification algorithms. Moreover, decision making based on real-time healthcare data further enhances the utility of the proposed system.
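
The event-triggered transmission idea can be shown in a few lines. A minimal sketch, assuming a simple deviation-from-baseline threshold decides when a reading is forwarded from the gateway; the paper's temporal health index computation is not specified in the abstract:

```python
# Hedged sketch: event-triggered forwarding at a fog gateway (illustrative).
def should_transmit(reading: float, baseline: float, tolerance: float) -> bool:
    """Forward only readings that deviate from the patient's baseline."""
    return abs(reading - baseline) > tolerance

events = [98.6, 98.7, 101.2, 98.5]            # e.g., body temperature samples
sent = [r for r in events if should_transmit(r, baseline=98.6, tolerance=1.5)]
print(sent)                                    # -> [101.2]
```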

Journal ArticleDOI
TL;DR: The field is reviewed from a historical perspective, covering ubiquitous and pervasive computing, ambient intelligence, and wireless sensor networks, before moving to context-aware computing studies; open issues are identified and insight is provided into future study areas for IoT researchers.
Abstract: Internet of Things (IoT) has been growing rapidly due to recent advancements in communications and sensor technologies. Meanwhile, with this revolutionary transformation, researchers, implementers, deployers, and users are faced with many challenges. IoT is a complicated, crowded, and complex field; there are various types of devices, protocols, communication channels, architectures, middleware, and more. Standardization efforts are plenty, and this chaos will continue for quite some time. What is clear, on the other hand, is that IoT deployments are increasing with accelerating speed, and this trend will not stop in the near future. As the field grows in numbers and heterogeneity, “intelligence” becomes a focal point in IoT. Since data now becomes “big data,” understanding, learning, and reasoning with big data is paramount for the future success of IoT. One of the major problems in the path to intelligent IoT is understanding “context,” or making sense of the environment, situation, or status using data from sensors, and then acting accordingly in autonomous ways. This is called “context-aware computing,” and it now requires both sensing and, increasingly, learning, as IoT systems get more data and better learning from this big data. In this survey, we review the field, first from a historical perspective, covering ubiquitous and pervasive computing, ambient intelligence, and wireless sensor networks, and then move to context-aware computing studies. Finally, we review learning and big data studies related to IoT. We also identify the open issues and provide insight into future study areas for IoT researchers.

Journal ArticleDOI
TL;DR: This paper proposes a semisupervised DRL model that fits smart city applications as it consumes both labeled and unlabeled data to improve the performance and accuracy of the learning agent and utilizes variational autoencoders as the inference engine for generalizing optimal policies.
Abstract: Smart services are an important element of smart cities and the Internet of Things (IoT) ecosystems, where the intelligence behind the services is obtained and improved through sensory data. Providing a large amount of training data is not always feasible; therefore, we need to consider alternative ways that incorporate unlabeled data as well. In recent years, deep reinforcement learning (DRL) has achieved great success in several application domains. It is an applicable method for IoT and smart city scenarios where auto-generated data can be partially labeled by users’ feedback for training purposes. In this paper, we propose a semisupervised DRL model that fits smart city applications as it consumes both labeled and unlabeled data to improve the performance and accuracy of the learning agent. The model utilizes variational autoencoders as the inference engine for generalizing optimal policies. To the best of our knowledge, the proposed model is the first investigation that extends DRL to the semisupervised paradigm. As a case study of smart city applications, we focus on smart buildings and apply the proposed model to the problem of indoor localization based on Bluetooth low energy signal strength. Indoor localization is a key component of smart city services, since people spend significant time in indoor environments. Our model learns the best action policies that lead to a close estimation of the target locations, with an improvement of 23% in terms of distance to the target and at least 67% more received rewards compared to the supervised DRL model.

Journal ArticleDOI
TL;DR: This paper designs a new secure lightweight three-factor remote user authentication scheme for HIoTNs, called the user authenticated key management protocol (UAKMP), which is comparable in computation and communication costs to other existing schemes.
Abstract: In recent years, research in the generic Internet of Things (IoT) has attracted many practical applications, including smart home, smart city, smart grid, industrial Internet, connected healthcare, smart retail, smart supply chain, and smart farming. The hierarchical IoT network (HIoTN) is a special kind of generic IoT network composed of different nodes, such as the gateway node, cluster head nodes, and sensing nodes, organized in a hierarchy. In an HIoTN, a user may need to directly access real-time data from the sensing nodes for a particular application in a generic IoT networking environment. This paper emphasizes the design of a new secure lightweight three-factor remote user authentication scheme for HIoTNs, called the user authenticated key management protocol (UAKMP). The three factors used in UAKMP are the user’s smart card, password, and personal biometrics. The security of the scheme is thoroughly analyzed: formally under the widely accepted real-or-random model, informally, and through formal security verification using the widely accepted Automated Validation of Internet Security Protocols and Applications tool. UAKMP offers several functionality features, including offline sensing node registration, a free password and biometric update facility, user anonymity, and sensing node anonymity, compared to other related existing schemes. In addition, UAKMP is comparable in computation and communication costs to other existing schemes.

Journal ArticleDOI
TL;DR: This paper introduces a general framework for IoT-fog-cloud applications, and proposes a delay-minimizing collaboration and offloading policy for fog-capable devices that aims to reduce the service delay for IoT applications.
Abstract: With the Internet of Things (IoT) becoming a major component of our daily life, understanding how to improve the quality of service for IoT applications through fog computing is becoming an important problem. In this paper, we introduce a general framework for IoT-fog-cloud applications, and propose a delay-minimizing collaboration and offloading policy for fog-capable devices that aims to reduce the service delay for IoT applications. We then develop an analytical model to evaluate our policy and show how the proposed framework helps to reduce IoT service delay.
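
The delay-minimizing offloading decision can be sketched as a comparison of estimated delays across processing options. A minimal sketch under assumed per-node waiting times and link delays (the paper's policy and its estimators are more elaborate):

```python
# Hedged sketch: pick the processing option with the lowest estimated delay.
def choose_target(queue_wait_s: dict, link_delay_s: dict) -> str:
    """queue_wait_s: estimated waiting + processing time per option;
    link_delay_s: transmission delay from this fog node to each option."""
    options = {node: link_delay_s.get(node, 0.0) + wait
               for node, wait in queue_wait_s.items()}
    return min(options, key=options.get)

target = choose_target(
    queue_wait_s={"local-fog": 0.050, "neighbor-fog": 0.020, "cloud": 0.010},
    link_delay_s={"local-fog": 0.000, "neighbor-fog": 0.005, "cloud": 0.080},
)
print(target)  # -> "neighbor-fog": 0.025 beats 0.050 (local) and 0.090 (cloud)
```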

Journal ArticleDOI
TL;DR: The BLE beacon’s cutting-edge applications, the interoperability between packet profiles, the reliability of its signal detection and distance estimation methods, the sustainability of its low energy, and its deployment constraints are discussed to identify research opportunities and directions.
Abstract: While the Internet of Things (IoT) is driving a transformation of current society toward a smarter one, new challenges and opportunities have arisen to accommodate the demands of IoT development. Low power wireless devices are, undoubtedly, the most viable solution for diverse IoT use cases. Among such devices, Bluetooth low energy (BLE) beacons have emerged as one of the most promising due to the ubiquity of Bluetooth-compatible devices, such as iPhones and Android smartphones. However, for BLE beacons to continue penetrating the IoT ecosystem in a holistic manner, interdisciplinary research is needed to ensure seamless integration. This paper consolidates the information on the state-of-the-art BLE beacon, from its application and deployment cases, hardware requirements, and casing design to its software and protocol design, and it delivers a timely review of the related research challenges. In particular, the BLE beacon’s cutting-edge applications, the interoperability between packet profiles, the reliability of its signal detection and distance estimation methods, the sustainability of its low energy, and its deployment constraints are discussed to identify research opportunities and directions.
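
As an illustration of the distance estimation methods the survey discusses, here is a minimal sketch using the common log-distance path-loss model; the calibration constants are assumptions:

```python
# Hedged sketch: BLE RSSI-to-distance via the log-distance path-loss model.
def estimate_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """tx_power_dbm: calibrated RSSI at 1 m; exponent ~2 in free space, 2-4 indoors."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

print(round(estimate_distance(rssi_dbm=-71.0), 2))  # ~3.98 m with the defaults
```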

Journal ArticleDOI
Hongzhi Guo, Jiajia Liu, Jie Zhang, Wen Sun, Nei Kato
TL;DR: This paper studies the MECO problem in ultradense IoT networks and proposes a two-tier game-theoretic greedy offloading scheme as the solution.
Abstract: The emergence of massive Internet of Things (IoT) mobile devices (MDs) and the deployment of ultradense 5G cells have promoted the evolution of IoT toward ultradense IoT networks. In order to meet the diverse quality-of-service and quality-of-experience demands of ever-increasing IoT applications, ultradense IoT networks face unprecedented challenges. Among them, a fundamental one is how to address the conflict between resource-hungry IoT mobile applications and resource-constrained IoT MDs. By offloading the IoT MDs’ computation tasks to edge servers deployed at radio access infrastructures, including the macro base station (MBS) and small cells, mobile-edge computation offloading (MECO) provides a promising solution. However, available MECO research has mostly focused on the single-tier base station scenario and on computation offloading between the MDs and the edge server connected to the MBS. Few works can be found on performing MECO in ultradense IoT networks, i.e., a multiuser ultradense edge server scenario. Toward this end, we study the MECO problem in ultradense IoT networks and propose a two-tier game-theoretic greedy offloading scheme as our solution. Extensive numerical results corroborate the superior performance of conducting computation offloading among multiple edge servers in ultradense IoT networks.

Journal ArticleDOI
TL;DR: This work studies a trajectory-based interaction time prediction algorithm to cope with an unstable network topology and high rate of disconnection in SIoVs and proposes a cooperative quality-aware system model, which focuses on a reliability assurance strategy and quality optimization method.
Abstract: Because of its enormous potential to guarantee road safety and improve the driving experience, the social Internet of Vehicles (SIoV) is becoming a hot research topic in both academic and industrial circles. With the ever-increasing variety, quantity, and intelligence of on-board equipment, along with the ever-growing demand for automobile service quality, how to provide users with a range of security-related and user-oriented vehicular applications has become significant. This paper concentrates on the design of a service access system for SIoVs, focusing on a reliability assurance strategy and a quality optimization method. First, in light of the instability of vehicular devices, a dynamic access service evaluation scheme is investigated, which explores the potential relevance of vehicles by constructing their social relationships. Next, this work studies a trajectory-based interaction time prediction algorithm to cope with the unstable network topology and high rate of disconnection in SIoVs. Finally, a cooperative quality-aware system model is proposed for service access in SIoVs. Simulation results demonstrate the effectiveness of the proposed scheme.

Journal ArticleDOI
TL;DR: The practical application of the democratization of medical devices for both patients and health-care providers is described and unexplored research directions and potential trends to solve uncharted research problems are identified.
Abstract: The Internet of Medical Things (IoMT) designates the interconnection of communication-enabled medical-grade devices and their integration to wider-scale health networks in order to improve patients’ health. However, because of the critical nature of health-related systems, the IoMT still faces numerous challenges, more particularly in terms of reliability, safety, and security. In this paper, we present a comprehensive literature review of recent contributions focused on improving the IoMT through the use of formal methodologies provided by the cyber-physical systems community. We describe the practical application of the democratization of medical devices for both patients and health-care providers. We also identify unexplored research directions and potential trends to solve uncharted research problems.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a cooperative AmBC (CABC) system in which the reader recovers information not only from the A-BD but also from the RF source.
Abstract: Ambient backscatter communication (AmBC) enables a passive backscatter device to transmit information to a reader using ambient RF signals and has emerged as a promising solution for the green Internet of Things (IoT). Conventional AmBC receivers are interested in recovering the information from the ambient backscatter device (A-BD) only. In this paper, we propose a cooperative AmBC (CABC) system in which the reader recovers information not only from the A-BD but also from the RF source. We first establish the system model for the CABC system from spread spectrum and spectrum sharing perspectives. Then, for flat fading channels, we derive the optimal maximum-likelihood (ML) detector, suboptimal linear detectors, and successive interference-cancellation (SIC) based detectors. For frequency-selective fading channels, the system model for the CABC system over ambient orthogonal frequency division multiplexing carriers is proposed, upon which a low-complexity optimal ML detector is derived. For both kinds of channels, bit-error-rate expressions for the proposed detectors are derived in closed form. Finally, extensive numerical results show that when the A-BD signal and the RF-source signal have equal symbol periods, the proposed SIC-based detectors achieve near-ML detection performance in typical application scenarios. When the A-BD symbol period is longer than the RF-source symbol period, the backscattered signal in the CABC system can enhance the ML detection performance of the RF-source signal, thanks to the beneficial effect of the backscatter link when the A-BD transmits at a lower rate than the RF source.
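
The flat-fading ML detection step can be illustrated by enumerating symbol hypotheses and picking the pair with the smallest residual. A minimal sketch, assuming BPSK symbols for both the RF source and the A-BD and a known composite channel y = h_d*s + h_b*s*b (a simplification of the paper's model):

```python
# Hedged sketch: joint ML detection of source symbol s and backscatter bit b
# over a flat-fading model y = h_d*s + h_b*s*b + n (illustrative simplification).
import itertools

def ml_detect(y: complex, h_d: complex, h_b: complex):
    """Exhaustive search over BPSK hypotheses for (source, backscatter)."""
    best, best_err = None, float("inf")
    for s, b in itertools.product((+1, -1), repeat=2):
        err = abs(y - (h_d * s + h_b * s * b)) ** 2
        if err < best_err:
            best, best_err = (s, b), err
    return best

h_d, h_b = 1.0 + 0.2j, 0.3 - 0.1j
y = h_d * (-1) + h_b * (-1) * (+1) + 0.05    # s = -1, b = +1, plus small noise
print(ml_detect(y, h_d, h_b))                # -> (-1, 1)
```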

Journal ArticleDOI
TL;DR: A new Q-learning-based transmission scheduling mechanism using deep learning is proposed for the CIoT to find an appropriate strategy for transmitting packets from different buffers over multiple channels so as to maximize system throughput.
Abstract: Cognitive networks (CNs) are one of the key enablers for the Internet of Things (IoT), where CNs will play an important role in the future Internet in several application scenarios, such as healthcare, agriculture, environment monitoring, and smart metering. However, IoT currently suffers from low packet transmission efficiency because the spectrum is crowded by the rapidly increasing popularity of various wireless applications. Hence, an IoT that exploits the advantages of cognitive technology, namely the cognitive radio-based IoT (CIoT), is a promising solution for IoT applications. A major challenge in CIoT is packet transmission efficiency using CNs. Therefore, a new Q-learning-based transmission scheduling mechanism using deep learning is proposed for the CIoT to find an appropriate strategy for transmitting packets from different buffers over multiple channels so as to maximize system throughput. A Markov decision process-based model is formulated to describe the state transitions of the system. A relay is used to transmit packets to the sink for the other nodes. To maximize the system utility in different system states, a reinforcement learning method, i.e., the Q-learning algorithm, is introduced to help the relay find the optimal strategy. In addition, a stacked autoencoder deep learning model is used to establish the mapping between states and actions to accelerate the solution of the problem. Finally, the experimental results demonstrate that the new action selection method converges after a certain number of iterations. Compared with other algorithms, the proposed method transmits packets with less power consumption and packet loss.
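
The Q-learning core of such a scheduler is the standard tabular update. A minimal sketch, assuming a discrete state space over buffer and channel conditions; the paper additionally uses stacked autoencoders to map states to actions:

```python
# Hedged sketch: tabular Q-learning update for transmission scheduling.
from collections import defaultdict
import random

Q = defaultdict(float)              # Q[(state, action)] -> estimated value
alpha, gamma, eps = 0.1, 0.9, 0.1   # learning rate, discount, exploration

def choose_action(state, actions):
    """Epsilon-greedy selection over candidate channels/buffers."""
    if random.random() < eps:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state, actions):
    """Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
```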

Journal ArticleDOI
TL;DR: The results reveal that by utilizing the proposed mechanism, more users benefit from computing services than under an existing offloading mechanism; the proposed mechanism also significantly reduces computation delay and enables low-latency fog computing services for delay-sensitive IoT applications.
Abstract: Fog computing, which provides low-latency computing services at the network edge, is an enabler for emerging Internet of Things (IoT) systems. In this paper, we study the allocation of fog computing resources to IoT users in a hierarchical computing paradigm including fog and remote cloud computing services. We formulate a computation offloading game to model the competition between IoT users and to allocate the limited processing power of fog nodes efficiently. Each user aims to maximize its own quality of experience (QoE), which reflects its satisfaction with using computing services in terms of the reduction in computation energy and delay. Utilizing a potential game approach, we prove the existence of a pure Nash equilibrium (NE) and provide an upper bound for the price of anarchy. Since the time complexity to reach the equilibrium increases exponentially in the number of users, we further propose a near-optimal resource allocation mechanism and prove that in a system with $N$ IoT users, it achieves an $\epsilon$-NE in $O(N/\epsilon)$ time. Through numerical studies, we evaluate the users’ QoE as well as the equilibrium efficiency. Our results reveal that by utilizing the proposed mechanism, more users benefit from computing services in comparison to an existing offloading mechanism. We further show that our proposed mechanism significantly reduces the computation delay and enables low-latency fog computing services for delay-sensitive IoT applications.
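
The potential-game machinery behind the equilibrium result can be illustrated with best-response dynamics, which converge in potential games. A minimal sketch under a toy congestion-style cost; the paper's QoE utility and mechanism are more involved:

```python
# Hedged sketch: best-response dynamics in a toy offloading congestion game.
# Each user picks "fog" or "local"; fog delay grows with the number of fog users.
def fog_cost(n_on_fog: int) -> float:
    return 0.02 * n_on_fog           # toy congestion cost per fog user

LOCAL_COST = 0.09                     # fixed local-processing cost per user

def best_response_dynamics(n_users: int, max_rounds: int = 100):
    choice = ["local"] * n_users
    for _ in range(max_rounds):
        changed = False
        for i in range(n_users):
            on_fog = sum(c == "fog" for c in choice)
            # fog cost if user i joins (or stays on) the fog node
            cost_fog = fog_cost(on_fog + (choice[i] != "fog"))
            best = "fog" if cost_fog < LOCAL_COST else "local"
            if best != choice[i]:
                choice[i], changed = best, True
        if not changed:               # no user wants to deviate: a pure NE
            return choice
    return choice

print(best_response_dynamics(10).count("fog"))  # -> 4 users on fog at equilibrium
```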

Journal ArticleDOI
Abstract: With the fast development of the Internet of Things (IoT), fifth generation (5G) wireless networks need to provide massive connectivity for IoT devices and meet the demand for low latency. To satisfy these requirements, nonorthogonal multiple access (NOMA) has been recognized as a promising solution for 5G networks to significantly improve network capacity. In parallel with the development of NOMA techniques, mobile edge computing (MEC) is becoming one of the key emerging technologies for reducing latency and improving quality of service (QoS) in 5G networks. In order to capture the potential gains of NOMA in the context of MEC, this paper proposes an edge computing aware NOMA technique which can enjoy the benefits of uplink NOMA in reducing MEC users’ uplink energy consumption. To this end, we formulate a NOMA-based optimization framework which minimizes the energy consumption of MEC users by optimizing the user clustering, computing and communication resource allocation, and transmit powers. In particular, similar to frequency resource blocks (RBs), we divide the computing capacity available at the cloudlet into computing RBs. Accordingly, we explore the joint allocation of frequency and computing RBs to users that are assigned different order indices within the NOMA clusters. We also design an efficient heuristic algorithm for user clustering and RB allocation, and formulate a convex optimization problem for the power control, to be solved independently per NOMA cluster. The performance of the proposed NOMA scheme is evaluated via simulations.
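
Uplink NOMA rates under successive interference cancellation follow directly from the Shannon formula, which is what such energy-minimization frameworks optimize over. A minimal sketch, assuming two users on one RB decoded strongest-first (parameters are illustrative):

```python
# Hedged sketch: two-user uplink NOMA rates with SIC at the base station.
import math

def noma_uplink_rates(p1: float, g1: float, p2: float, g2: float,
                      noise: float, bandwidth_hz: float = 1.0):
    """User 1 (stronger) is decoded first, seeing user 2 as interference;
    user 2 is decoded after SIC removes user 1's signal."""
    r1 = bandwidth_hz * math.log2(1 + p1 * g1 / (p2 * g2 + noise))
    r2 = bandwidth_hz * math.log2(1 + p2 * g2 / noise)
    return r1, r2

print(noma_uplink_rates(p1=1.0, g1=1.0, p2=1.0, g2=0.2, noise=0.1))
```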

Journal ArticleDOI
TL;DR: It is concluded that middleware plays a crucial role in IoT solutions and the proposed architectural approach can be used as a reference model for IoT middleware.
Abstract: Internet of Things (IoT) is a term used to describe an environment where billions of objects, constrained in terms of resources (“things”), are connected to the Internet and interact autonomously. With so many objects connected in IoT solutions, the environment in which they are placed becomes smarter. Software called middleware plays a key role, since it is responsible for most of the intelligence in IoT: integrating data from devices, allowing them to communicate, and making decisions based on collected data. Considering the requirements of IoT platforms, a reference architecture model for IoT middleware is then analyzed, detailing the best operational approaches for each proposed module, and basic security features for this type of software are proposed. This paper elaborates on a systematic review of the related literature, exploring the differences between the current Internet and IoT-based systems and presenting a deep discussion of the challenges and future perspectives on IoT middleware. Finally, it highlights the difficulties of achieving and enforcing a universal standard. It is thus concluded that middleware plays a crucial role in IoT solutions and that the proposed architectural approach can be used as a reference model for IoT middleware.

Journal ArticleDOI
TL;DR: Fogs can largely improve the performance of smart city analytics services over a cloud-only model in terms of job blocking probability and service utility.
Abstract: Analysis of Internet of Things (IoT) sensor data is key to achieving city smartness. In this paper, a multitier fog computing model with large-scale data analytics services is proposed for smart city applications. The multitier fog consists of ad-hoc fogs and dedicated fogs with opportunistic and dedicated computing resources, respectively. The proposed fog computing model, with clear functional modules, is able to mitigate the potential problems of dedicated computing infrastructure and slow response in cloud computing. We run analytics benchmark experiments over fogs formed by Raspberry Pi computers with a distributed computing engine to measure the computing performance of various analytics tasks, and create easy-to-use workload models. Quality of service (QoS)-aware admission control, offloading, and resource allocation schemes are designed to support data analytics services and maximize analytics service utilities. Availability and cost models of networking and computing resources are taken into account in the QoS scheme design. A scalable system-level simulator is developed to evaluate the fog-based analytics service and the QoS management schemes. Experimental results demonstrate the efficiency of analytics services over multitier fogs and the effectiveness of the proposed QoS schemes. Fogs can largely improve the performance of smart city analytics services over a cloud-only model in terms of job blocking probability and service utility.
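
The job blocking probability reported in the evaluation can be illustrated with the classic Erlang-B formula. A minimal sketch, assuming jobs arrive at a fog with a fixed number of compute slots and no queue; the paper's QoS and workload models are richer:

```python
# Hedged sketch: Erlang-B blocking probability for a fog with c compute slots.
def erlang_b(offered_load: float, servers: int) -> float:
    """Recursive Erlang-B: B(0) = 1; B(c) = a*B(c-1) / (c + a*B(c-1))."""
    b = 1.0
    for c in range(1, servers + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# 8 jobs/s arriving with mean service time 0.5 s -> offered load a = 4 Erlangs.
print(round(erlang_b(offered_load=4.0, servers=6), 4))  # ~0.117
```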

Journal ArticleDOI
TL;DR: Narrowband IoT (NB-IoT), as a licensed LPWAN technology, is developed based on existing long-term evolution specifications and facilities, and hence can be viewed as a promising candidate for smart grid communications.
Abstract: Low power wide area network (LPWAN) technologies, which are now embracing a booming era with the development of the Internet of Things (IoT), may offer a brand new solution for current smart grid communications due to their excellent features of low power, long range, and high capacity. Mission-critical smart grid communications require secure and reliable connections between the utilities and the devices with high quality of service (QoS). This is difficult to achieve for unlicensed LPWAN technologies due to the crowded license-free band. Narrowband IoT (NB-IoT), as a licensed LPWAN technology, is developed based on existing long-term evolution specifications and facilities. Thus, it is able to provide cellular-level QoS and hence can be viewed as a promising candidate for smart grid communications. In this paper, we introduce NB-IoT to the smart grid and compare it with existing representative communication technologies in the context of smart grid communications in terms of data rate, latency, range, etc. The overall requirements of communications in the smart grid are comprehensively investigated from both quantitative and qualitative perspectives, and each of them is carefully examined for NB-IoT. We further explore representative applications in the smart grid and analyze the corresponding feasibility of NB-IoT. Moreover, the performance of NB-IoT in typical smart grid communication environments, such as urban and rural areas, is carefully evaluated via Monte Carlo simulations.
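
The Monte Carlo coverage evaluation can be sketched with a simple link-budget draw. A minimal sketch, assuming a log-distance path-loss model with lognormal shadowing and NB-IoT's 164 dB maximum-coupling-loss design target; all other constants are assumptions:

```python
# Hedged sketch: Monte Carlo coverage estimate for an NB-IoT-style link budget.
import math, random

MAX_COUPLING_LOSS_DB = 164.0   # NB-IoT extended-coverage design target

def path_loss_db(d_m: float, exponent: float = 3.5, pl_1m: float = 35.0) -> float:
    return pl_1m + 10 * exponent * math.log10(d_m)

def coverage_probability(cell_radius_m: float, shadowing_std_db: float = 8.0,
                         trials: int = 100_000) -> float:
    covered = 0
    for _ in range(trials):
        # uniform device placement over the cell area
        d = cell_radius_m * math.sqrt(random.random())
        loss = path_loss_db(max(d, 1.0)) + random.gauss(0.0, shadowing_std_db)
        covered += loss <= MAX_COUPLING_LOSS_DB
    return covered / trials

print(coverage_probability(cell_radius_m=5000))
```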

Journal ArticleDOI
TL;DR: This work combines deep learning-based prediction with partially overlapping channel assignment to propose a novel intelligent channel assignment algorithm, which can intelligently avoid potential congestion and quickly assign suitable channels in SDN-IoT.
Abstract: Due to the rapid increase in sensing data and the quick-response requirements of the Internet of Things (IoT) delivery network, high-speed transmission has emerged as an important issue. Assigning suitable channels in the wireless IoT delivery network is a basic guarantee of high-speed transmission. However, the high dynamics of the traffic load (TL) make conventional fixed channel assignment algorithms ineffective. Recently, the software defined networking-based IoT (SDN-IoT) has been proposed to improve transmission quality. Moreover, the intelligent technique of deep learning is widely researched in highly computational SDN. Hence, we first propose a novel deep learning-based TL prediction algorithm to forecast the future TL and congestion in the network. Then, a deep learning-based partial channel assignment algorithm is proposed to intelligently allocate channels to each link in the SDN-IoT network. Finally, we combine deep learning-based prediction with partially overlapping channel assignment to propose a novel intelligent channel assignment algorithm, which can intelligently avoid potential congestion and quickly assign suitable channels in SDN-IoT. Simulation results demonstrate that our proposal significantly outperforms conventional channel assignment algorithms.
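
The congestion-aware assignment idea can be sketched as a greedy pick of the least-loaded non-conflicting channel given predicted loads. A minimal sketch; the paper's algorithm uses deep-learning forecasts and partially overlapping channels, which this toy version does not model:

```python
# Hedged sketch: greedy channel assignment steered by predicted load.
def assign_channels(links, predicted_load, conflicts):
    """links: link ids; predicted_load: channel -> forecast utilization;
    conflicts: link -> set of links it interferes with."""
    assignment = {}
    for link in links:
        used_nearby = {assignment[n] for n in conflicts.get(link, set())
                       if n in assignment}
        candidates = [ch for ch in predicted_load if ch not in used_nearby]
        # fall back to the globally least-loaded channel if all conflict
        pool = candidates or list(predicted_load)
        assignment[link] = min(pool, key=predicted_load.get)
    return assignment

print(assign_channels(
    links=["a", "b"],
    predicted_load={1: 0.7, 6: 0.2, 11: 0.4},
    conflicts={"a": {"b"}, "b": {"a"}},
))  # -> {'a': 6, 'b': 11}: b avoids a's channel and takes the next least-loaded
```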