Book ChapterDOI

A Research Review on Energy Consumption of Different Frameworks in Mobile Cloud Computing

01 Jan 2019-pp 129-142
TL;DR: This paper analyzes the energy consumed when an application executes on the mobile device versus on a remote cloud, examines the offloading method and the level of partitioning by exploring different framework-based parameters, and summarizes a comparison of energy offloading techniques.
Abstract: Mobile cloud computing (MCC) is an emerging technology whose popularity is increasing drastically day by day. Mobile devices are constrained by low battery power, limited processing capability, and limited storage capacity, and MCC faces several security issues. Users expect more computing power and security from mobile devices; to support them, mobile computing is integrated with cloud computing (CC) to form MCC. Computation offloading improves the computing features of smartphones (battery power, storage, processing capability) as well as the user's experience with the device. In this paper, our main focus is to analyze the energy consumed by executing on the mobile device or on a remote cloud; the offloading method and the level of partitioning are examined by exploring different parameters across frameworks. We summarize a comparison between different energy offloading techniques.
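The offloading trade-off the chapter analyzes can be sketched with the classic energy model: offloading saves energy when the cost of computing locally exceeds the cost of transmitting the input data plus idling while the cloud computes. The functions, parameter names, and default values below are illustrative assumptions for a minimal sketch, not figures from the paper.

```python
# Minimal sketch of the computation-offloading energy trade-off.
# All power/speed/bandwidth defaults are hypothetical round numbers.

def local_energy(cycles, mobile_hz, p_compute_w):
    """Energy (J) to run `cycles` CPU cycles on the mobile device."""
    return p_compute_w * cycles / mobile_hz

def offload_energy(data_bits, bandwidth_bps, p_radio_w,
                   cycles, cloud_hz, p_idle_w):
    """Energy (J) to upload the input data, then idle while the cloud computes."""
    transfer_s = data_bits / bandwidth_bps
    cloud_s = cycles / cloud_hz
    return p_radio_w * transfer_s + p_idle_w * cloud_s

def should_offload(cycles, data_bits, mobile_hz=1e9, cloud_hz=1e10,
                   bandwidth_bps=5e6, p_compute_w=0.9, p_radio_w=1.3,
                   p_idle_w=0.3):
    """Offload only when it costs the device less energy than local execution."""
    remote = offload_energy(data_bits, bandwidth_bps, p_radio_w,
                            cycles, cloud_hz, p_idle_w)
    return remote < local_energy(cycles, mobile_hz, p_compute_w)

# A compute-heavy task with a small input favours offloading;
# a light task with a large input does not.
print(should_offload(cycles=5e10, data_bits=1e6))   # heavy compute, ~125 KB upload
print(should_offload(cycles=1e8, data_bits=1e8))    # light compute, ~12.5 MB upload
```

The model makes the chapter's point concrete: the decision depends jointly on computation size, data size, network bandwidth, and the device's radio/CPU power draw, which is why different frameworks reach different partitioning decisions.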
Citations
Journal ArticleDOI
TL;DR: Various communication protocols, namely Zigbee, Bluetooth, Near Field Communication (NFC), LoRa, etc., are presented, and the differences between them are discussed.
Abstract: The Internet of Things (IoT) consists of sensors embedded in physical objects that are connected to the Internet and able to communicate with one another without human intervention; applications include industry, transportation, healthcare, robotics, smart agriculture, etc. Communication technology plays a crucial role in IoT, transferring data from one place to another over the Internet. This paper presents various communication protocols, namely Zigbee, Bluetooth, Near Field Communication (NFC), LoRa, etc., then examines the differences between them, and concludes with an overall discussion of communication protocols in IoT.

66 citations

Journal ArticleDOI
TL;DR: The data generated by the built-in sensors of smart devices, including smartphones and wearable devices, are utilized to power smart homes and provide real-time information about their occupants.
Abstract: Advancements in sensor and hardware technology have surged the growth of smart devices (SDs), including smartphones and wearable devices. The data generated by the built-in sensors are utilized by …

10 citations

Journal ArticleDOI
TL;DR: This paper proposes Response Time aware Task Scheduling in the Multi-Cloudlet Environment (RTTSMCE) to address two problems: cloudlet server selection based on response time, and task scheduling among cloudlets using load-balancing algorithms to optimize the response time of the cloud server.
Abstract: Mobile cloud computing can address the power consumption problems of a mobile device by offloading applications from the device to a remote cloud. But latency issues arise due to the long ph...

6 citations

Book ChapterDOI
01 Jan 2021
TL;DR: In this article, a file compression system for big data is proposed as system utility software; users can also run it on the desktop, and the compression performed in this work is lossless.
Abstract: The world is surrounded by technology. There are lots of devices everywhere around us. It is impossible to imagine our lives without technology, as we have become dependent on it for most of our work. One of the primary functions for which we use technology, and computers especially, is to store and transfer data from one host system or network to another with similar credentials. The limited capacity of computers means there are restrictions on the amount of data that can be stored or transported. To tackle this problem, computer scientists came up with data compression algorithms. A file compression system's objective is to build efficient software that reduces user files to fewer bytes, so that they can be transferred easily over a slower Internet connection and take up less space on disk. Data compression, or bit-rate reduction, encodes data using fewer bits than the original representation. Compression comes in two types, lossless and lossy. The first reduces bits by identifying and discarding statistical redundancy, so no information is lost and every input is retained. The second reduces file size by removing unnecessary or less important information. This paper proposes a file compression system for big data as system utility software; users can also run it on the desktop, and the compression performed in this work is lossless.
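The lossless property described above, every byte of the input is recoverable, can be demonstrated with a round-trip through Python's standard-library `zlib` (DEFLATE) codec. This is a generic illustration of lossless compression, not the system the chapter builds.

```python
import zlib

# Highly redundant input compresses well under DEFLATE.
original = b"AAAAABBBCC" * 1000
compressed = zlib.compress(original, level=9)
restored = zlib.decompress(compressed)

assert restored == original          # lossless: every byte is recovered
print(len(original), len(compressed))  # compressed size is far smaller
```

On redundant data like this, the statistical-redundancy elimination the abstract describes yields a large size reduction; on already-random data the same codec gains little, which is why lossless ratios are data-dependent.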

2 citations

Book ChapterDOI
01 Jan 2022
TL;DR: In this article, a better approach using machine learning methods such as KNN, decision tree, SVM, and logistic regression is proposed to predict defaulters, which can help banks conserve their manpower and fiscal resources by reducing the number of steps needed to check whether somebody is eligible for a loan.
Abstract: Loans are a fundamental source of any bank's revenue, so banks work tirelessly to ensure they grant loans only to customers who will not default on the monthly payments. They pay close attention to this issue and use various methods to detect and predict the default behavior of their customers. However, because of human error, they may sometimes fail to see key information. This paper proposes a better approach, using machine learning methods such as KNN, decision tree, SVM, and logistic regression to predict defaulters. The accuracy of these methods is evaluated using metrics such as log loss, the Jaccard similarity coefficient, and the F1 score, which are compared to determine the accuracy of prediction. This can help banks conserve their manpower and fiscal resources by reducing the number of steps they must take to check whether somebody is eligible for a loan.
Keywords: Machine learning; Loan prediction; Banking; Credit risk management; Predictor; Classifiers; Python
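The four classifiers and the metrics named in the abstract map directly onto scikit-learn. The chapter does not publish its dataset or code, so the sketch below substitutes a synthetic binary-classification dataset; model hyperparameters are scikit-learn defaults, not the chapter's settings.

```python
# Illustrative comparison of the four classifiers named in the abstract,
# on synthetic data standing in for the (unpublished) loan dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, jaccard_score, log_loss

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "KNN": KNeighborsClassifier(),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(probability=True, random_state=0),  # probability=True enables log loss
    "Logistic regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    proba = model.predict_proba(X_te)
    print(f"{name}: F1={f1_score(y_te, pred):.3f}  "
          f"Jaccard={jaccard_score(y_te, pred):.3f}  "
          f"LogLoss={log_loss(y_te, proba):.3f}")
```

Comparing the three metrics side by side, as the abstract proposes, matters because F1 and Jaccard reward the same hard predictions while log loss additionally penalizes overconfident wrong probabilities.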

1 citation

References
ReportDOI
28 Sep 2011
TL;DR: This cloud model promotes availability and is composed of five essential characteristics, three service models, and four deployment models.
Abstract: Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is composed of five essential characteristics, three service models, and four deployment models.

15,145 citations

Journal ArticleDOI
TL;DR: Clears the clouds away from the true potential of, and the obstacles posed by, this computing capability.
Abstract: Clearing the clouds away from the true potential and obstacles posed by this computing capability.

9,282 citations

Journal ArticleDOI
TL;DR: This paper defines cloud computing and provides an architecture for creating clouds with market-oriented resource allocation by leveraging technologies such as virtual machines (VMs), and offers insights into market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain Service Level Agreement (SLA)-oriented resource allocation.

5,850 citations

Journal ArticleDOI
TL;DR: The results from a proof-of-concept prototype suggest that VM technology can indeed help meet the need for rapid customization of infrastructure for diverse applications, and this article discusses the technical obstacles to these transformations and proposes a new architecture for overcoming them.
Abstract: Mobile computing continuously evolves through the sustained effort of many researchers. It seamlessly augments users' cognitive abilities via compute-intensive capabilities such as speech recognition, natural language processing, etc. By thus empowering mobile users, we could transform many areas of human activity. This article discusses the technical obstacles to these transformations and proposes a new architecture for overcoming them. In this architecture, a mobile user exploits virtual machine (VM) technology to rapidly instantiate customized service software on a nearby cloudlet and then uses that service over a wireless LAN; the mobile device typically functions as a thin client with respect to the service. A cloudlet is a trusted, resource-rich computer or cluster of computers that is well-connected to the Internet and available for use by nearby mobile devices. Our strategy of leveraging transiently customized proximate infrastructure as a mobile device moves with its user through the physical world is called cloudlet-based, resource-rich, mobile computing. Crisp interactive response, which is essential for seamless augmentation of human cognition, is easily achieved in this architecture because of the cloudlet's physical proximity and one-hop network latency. Using a cloudlet also simplifies the challenge of meeting the peak bandwidth demand of multiple users interactively generating and receiving media such as high-definition video and high-resolution images. Rapid customization of infrastructure for diverse applications emerges as a critical requirement, and our results from a proof-of-concept prototype suggest that VM technology can indeed help meet this requirement.

3,599 citations

Proceedings ArticleDOI
15 Jun 2010
TL;DR: MAUI supports fine-grained code offload to maximize energy savings with minimal burden on the programmer, deciding at run-time which methods should be remotely executed, driven by an optimization engine that achieves the best energy savings possible under the mobile device's current connectivity constraints.
Abstract: This paper presents MAUI, a system that enables fine-grained energy-aware offload of mobile code to the infrastructure. Previous approaches to these problems either relied heavily on programmer support to partition an application, or they were coarse-grained, requiring full process (or full VM) migration. MAUI uses the benefits of a managed code environment to offer the best of both worlds: it supports fine-grained code offload to maximize energy savings with minimal burden on the programmer. MAUI decides at run-time which methods should be remotely executed, driven by an optimization engine that achieves the best energy savings possible under the mobile device's current connectivity constraints. In our evaluation, we show that MAUI enables: 1) a resource-intensive face recognition application that consumes an order of magnitude less energy, 2) a latency-sensitive arcade game application that doubles its refresh rate, and 3) a voice-based language translation application that bypasses the limitations of the smartphone environment by executing unsupported components remotely.

2,530 citations