Author

Yixue Hao

Other affiliations: Henan University
Bio: Yixue Hao is an academic researcher from Huazhong University of Science and Technology. The author has contributed to research in the topics of cloud computing and computer science, has an h-index of 24, and has co-authored 52 publications receiving 3,513 citations. Previous affiliations of Yixue Hao include Henan University.

Papers published on a yearly basis

Papers
Journal ArticleDOI
TL;DR: This paper investigates the task offloading problem in ultra-dense networks, aiming to minimize delay while saving the battery life of user equipment, and proposes an efficient offloading scheme that reduces task duration by 20% while saving 30% energy.
Abstract: With the development of recent innovative applications (e.g., augmented reality, self-driving, and various cognitive applications), more and more computation-intensive and data-intensive tasks are delay-sensitive. Mobile edge computing in ultra-dense networks is expected to be an effective solution for meeting the low-latency demand. However, the distributed computing resources in edge clouds and the energy dynamics of mobile-device batteries make it challenging to offload tasks for users. In this paper, leveraging the idea of software-defined networking, we investigate the task offloading problem in ultra-dense networks, aiming to minimize delay while saving the battery life of user equipment. Specifically, we formulate the task offloading problem as a mixed-integer non-linear program, which is NP-hard. To solve it, we transform the optimization problem into two sub-problems: a task placement sub-problem and a resource allocation sub-problem. Based on the solutions of the two sub-problems, we propose an efficient offloading scheme. Simulation results show that the proposed scheme reduces task duration by 20% while saving 30% energy, compared with random and uniform task offloading schemes.
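The two-stage decomposition described above is easy to illustrate in code. Below is a minimal Python sketch, assuming a greedy placement rule followed by proportional resource allocation; every name and number in it (task sizes, CPU cycles, link rate, cloud capacities, the crude delay proxy) is invented for illustration and is not the paper's actual model or algorithm.

```python
# Hypothetical sketch of the two-stage decomposition described above:
# (1) place each task on the edge cloud that minimizes its estimated delay,
# (2) split each cloud's computing capacity among the tasks placed on it.
# All parameters are invented for illustration, not taken from the paper.

def place_tasks(tasks, clouds, rate):
    """Greedy task placement: assign each task to the cloud with the
    lowest estimated delay (transmission + a crude compute-load proxy)."""
    placement = {}
    load = {c: 0.0 for c in clouds}              # cycles already placed per cloud
    for name, (bits, cycles) in tasks.items():
        def est_delay(c):
            tx = bits / rate                      # transmission delay (s)
            comp = (load[c] + cycles) / clouds[c]  # compute-delay proxy (s)
            return tx + comp
        best = min(clouds, key=est_delay)
        placement[name] = best
        load[best] += cycles
    return placement

def allocate_resources(tasks, clouds, placement):
    """Proportional resource allocation: each cloud splits its capacity
    among its tasks in proportion to their required CPU cycles."""
    alloc = {}
    for c, cap in clouds.items():
        mine = [t for t, p in placement.items() if p == c]
        total = sum(tasks[t][1] for t in mine) or 1.0
        for t in mine:
            alloc[t] = cap * tasks[t][1] / total  # cycles/s granted to task t
    return alloc

# toy instance: task -> (input bits, required CPU cycles); cloud -> capacity
tasks = {"t1": (2e6, 5e8), "t2": (1e6, 2e8), "t3": (4e6, 9e8)}
clouds = {"bs1": 5e9, "bs2": 3e9}                # edge-cloud CPU speed (cycles/s)
placement = place_tasks(tasks, clouds, rate=1e7)
alloc = allocate_resources(tasks, clouds, placement)
for t, (bits, cycles) in tasks.items():
    delay = bits / 1e7 + cycles / alloc[t]       # same rate as in placement
    print(f"{t}: cloud={placement[t]}, delay={delay:.2f}s")
```

The point of the sketch is only the placement-then-allocation structure; it carries none of the optimality properties of the paper's scheme.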

821 citations

Journal ArticleDOI
TL;DR: This paper streamlines machine learning algorithms for effective prediction of chronic disease outbreaks in disease-frequent communities, proposing a new convolutional neural network (CNN)-based multimodal disease risk prediction algorithm that uses structured and unstructured hospital data.
Abstract: With big data growth in the biomedical and healthcare communities, accurate analysis of medical data benefits early disease detection, patient care, and community services. However, analysis accuracy is reduced when medical data are incomplete. Moreover, different regions exhibit unique characteristics of certain regional diseases, which may weaken the prediction of disease outbreaks. In this paper, we streamline machine learning algorithms for effective prediction of chronic disease outbreaks in disease-frequent communities. We evaluate the modified prediction models on real-life hospital data collected from central China in 2013–2015. To overcome the difficulty of incomplete data, we use a latent factor model to reconstruct the missing data. We experiment on a regional chronic disease, cerebral infarction. We propose a new convolutional neural network (CNN)-based multimodal disease risk prediction algorithm that uses structured and unstructured hospital data. To the best of our knowledge, none of the existing work has focused on both data types in the area of medical big data analytics. Compared with several typical prediction algorithms, the prediction accuracy of our proposed algorithm reaches 94.8%, with a convergence speed faster than that of the CNN-based unimodal disease risk prediction algorithm.
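The latent factor reconstruction step mentioned in the abstract can be sketched as a small matrix factorization fitted on observed entries only. The Python example below is a generic illustration of that idea, not the authors' implementation; the rank, learning rate, and toy data are all invented.

```python
# Illustrative latent-factor completion of a data matrix with missing
# entries, in the spirit of the reconstruction step described above.
import numpy as np

def latent_factor_complete(X, mask, rank=3, lr=0.01, reg=0.1, epochs=500, seed=0):
    """Factor X ~= U @ V.T using only observed entries (mask == True),
    then fill the missing entries from the learned factors."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    rows, cols = np.nonzero(mask)
    for _ in range(epochs):
        for i, j in zip(rows, cols):
            err = X[i, j] - U[i] @ V[j]
            U[i] += lr * (err * V[j] - reg * U[i])   # SGD step on U
            V[j] += lr * (err * U[i] - reg * V[j])   # SGD step on V
    X_hat = U @ V.T
    return np.where(mask, X, X_hat)                  # keep observed values

# toy demo: hide 30% of a low-rank matrix and reconstruct it
rng = np.random.default_rng(1)
truth = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 15))
mask = rng.random(truth.shape) > 0.3
completed = latent_factor_complete(truth, mask)
rmse = np.sqrt(((completed - truth)[~mask] ** 2).mean())
print(f"RMSE on held-out entries: {rmse:.3f}")
```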

764 citations

Journal ArticleDOI
TL;DR: This paper reviews the background and state-of-the-art of the narrow-band Internet of Things (NB-IoT) and analyzes five intelligent applications: smart cities, smart buildings, intelligent environment monitoring, intelligent user services, and smart metering.
Abstract: In this paper, we review the background and state-of-the-art of the narrow-band Internet of Things (NB-IoT). We first introduce NB-IoT's general background, development history, and standardization. Then, we present NB-IoT features through a review of current national and international studies on NB-IoT technology, focusing on basic theories and key technologies, i.e., connection count analysis theory, delay analysis theory, coverage enhancement mechanisms, ultra-low power consumption technology, and the coupling relationship between signaling and data. Subsequently, we compare the performance of NB-IoT with that of other wireless and mobile communication technologies in terms of latency, security, availability, data transmission rate, energy consumption, spectral efficiency, and coverage area. Moreover, we analyze five intelligent applications of NB-IoT: smart cities, smart buildings, intelligent environment monitoring, intelligent user services, and smart metering. Finally, we summarize the security requirements of NB-IoT, which urgently need to be addressed. These discussions aim to provide a comprehensive overview of NB-IoT that helps readers clearly understand its scientific problems and future research directions.

346 citations

Journal ArticleDOI
TL;DR: This paper categorizes computation offloading into three modes: remote cloud service mode, connected ad hoc cloudlet service mode, and opportunistic ad hoc cloudlet service mode, and conducts a detailed analytic study of the three proposed modes of computation offloading at ad hoc cloudlets.
Abstract: As mobile devices are equipped with more memory and computational capability, a novel peer-to-peer communication model for mobile cloud computing is proposed to interconnect nearby mobile devices through various short-range radio communication technologies to form mobile cloudlets, where every mobile device works as either a computational service provider or a client of a service requester. Though this kind of computation offloading benefits compute-intensive applications, the corresponding service models and analytics tools remain open issues. In this paper, we categorize computation offloading into three modes: remote cloud service mode, connected ad hoc cloudlet service mode, and opportunistic ad hoc cloudlet service mode. We also conduct a detailed analytic study of the three proposed modes of computation offloading at ad hoc cloudlets.
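A hypothetical sketch of how the three modes might be encoded and compared is shown below. The toy delay model (a slow WAN link to a fast remote cloud, fast short-range links to slower peers, and an extra waiting term for opportunistic contacts) is invented for illustration and does not reproduce the paper's analytic study.

```python
# Hypothetical encoding of the three offloading modes discussed above, with
# a toy expected-completion-time comparison. All formulas and numbers are
# invented for illustration; the paper's analytic model is not reproduced.
from enum import Enum

class OffloadMode(Enum):
    REMOTE_CLOUD = "remote cloud service"
    CONNECTED_CLOUDLET = "connected ad hoc cloudlet service"
    OPPORTUNISTIC_CLOUDLET = "opportunistic ad hoc cloudlet service"

def expected_completion_time(mode, bits, cycles):
    """Toy delay model: WAN uplink to a fast remote cloud, short-range links
    to slower nearby peers, plus a contact-waiting penalty when peers are
    only met opportunistically."""
    if mode is OffloadMode.REMOTE_CLOUD:
        return bits / 2e6 + cycles / 1e10          # slow WAN, fast cloud
    if mode is OffloadMode.CONNECTED_CLOUDLET:
        return bits / 2e7 + cycles / 2e9           # fast local link, modest CPU
    return 5.0 + bits / 2e7 + cycles / 2e9         # + mean wait for contacts

task = (8e6, 4e9)                                  # (input bits, CPU cycles)
best = min(OffloadMode, key=lambda m: expected_completion_time(m, *task))
for m in OffloadMode:
    print(f"{m.value}: {expected_completion_time(m, *task):.2f}s")
print("chosen mode:", best.value)
```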

227 citations

Journal ArticleDOI
TL;DR: The experiments show that the ECC-based healthcare system provides a better user experience, allocates computing resources sensibly, and significantly improves the survival rates of patients in sudden emergencies.

221 citations


Cited by
Journal ArticleDOI
TL;DR: This paper bridges the gap between deep learning and mobile and wireless networking research by presenting a comprehensive survey of the crossovers between the two areas, and provides an encyclopedic review of deep-learning-based mobile and wireless networking research, categorized by domain.
Abstract: The rapid uptake of mobile devices and the rising popularity of mobile applications and services pose unprecedented demands on mobile and wireless networking infrastructure. Upcoming 5G systems are evolving to support exploding mobile traffic volumes, real-time extraction of fine-grained analytics, and agile management of network resources, so as to maximize user experience. Fulfilling these tasks is challenging, as mobile environments are increasingly complex, heterogeneous, and evolving. One potential solution is to resort to advanced machine learning techniques, in order to help manage the rise in data volumes and algorithm-driven applications. The recent success of deep learning underpins new and powerful tools that tackle problems in this space. In this paper, we bridge the gap between deep learning and mobile and wireless networking research, by presenting a comprehensive survey of the crossovers between the two areas. We first briefly introduce essential background and state-of-the-art in deep learning techniques with potential applications to networking. We then discuss several techniques and platforms that facilitate the efficient deployment of deep learning onto mobile systems. Subsequently, we provide an encyclopedic review of mobile and wireless networking research based on deep learning, which we categorize by different domains. Drawing from our experience, we discuss how to tailor deep learning to mobile environments. We complete this survey by pinpointing current challenges and open future directions for research.

975 citations

Journal ArticleDOI
TL;DR: This paper presents a detailed review of the security-related challenges and sources of threat in IoT applications and discusses four technologies for increasing the level of security in IoT: blockchain, fog computing, edge computing, and machine learning.
Abstract: The Internet of Things (IoT) is the next era of communication. Using the IoT, physical objects can be empowered to create, receive, and exchange data in a seamless manner. Various IoT applications focus on automating different tasks and are trying to empower inanimate physical objects to act without any human intervention. The existing and upcoming IoT applications are highly promising for increasing the level of comfort, efficiency, and automation for users. Implementing such a world in an ever-growing fashion requires high security, privacy, authentication, and recovery from attacks. In this regard, it is imperative to make the required changes in the architecture of IoT applications to achieve end-to-end secure IoT environments. In this paper, a detailed review of the security-related challenges and sources of threat in IoT applications is presented. After discussing the security issues, various emerging and existing technologies focused on achieving a high degree of trust in IoT applications are discussed. Four technologies for increasing the level of security in IoT are discussed: blockchain, fog computing, edge computing, and machine learning.

800 citations

Journal ArticleDOI
TL;DR: This paper presents a tutorial on fog computing and its related computing paradigms, including their similarities and differences, and provides a taxonomy of research topics in fog computing.

783 citations