scispace - formally typeset
Author

Dasari Naga Raju

Bio: Dasari Naga Raju is an academic researcher from VIT University. The author has contributed to research in topics: Grid computing & Cloud computing. The author has an h-index of 2 and has co-authored 4 publications receiving 16 citations. Previous affiliations of Dasari Naga Raju include Sri Venkateswara College of Engineering.

Papers
Proceedings ArticleDOI
01 May 2017
TL;DR: A Virtual Cloud Learning Automata algorithm (VCLA) is proposed for selecting suitable nodes to create an ad hoc virtual cloud; experimental results show the effectiveness of VCLA compared to the same process without learning automata.
Abstract: Mobile technology plays a major role in everyday life. The limitations of mobile devices cause serious performance issues for applications, so the emerging mobile environment needs computational support from an external environment: cloud computing. Mobile devices communicate with the cloud over a wireless medium, but because the devices are not static, link failures occur frequently and ultimately lead to communication failure. To overcome this issue in mobile cloud computing (MCC), an ad hoc networking model that uses the available mobile devices within range is proposed. A Virtual Cloud Learning Automata algorithm (VCLA) is proposed for selecting suitable nodes to create the ad hoc virtual cloud; some of these are selected as optimal nodes on which computational offloading is performed. The experimental results show the effectiveness of VCLA compared to the same process without learning automata.
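The abstract does not detail VCLA's update rule; the standard building block behind this kind of node selection is a linear reward-inaction (L_R-I) learning automaton, which can be sketched as follows. The node names, reward condition, and learning rate are illustrative assumptions, not taken from the paper.

```python
import random

class LearningAutomaton:
    """Linear reward-inaction (L_R-I) automaton over candidate nodes."""

    def __init__(self, nodes, lr=0.1):
        self.nodes = list(nodes)
        self.lr = lr
        # start with a uniform action-probability vector
        self.p = {n: 1.0 / len(self.nodes) for n in self.nodes}

    def choose(self):
        # sample a node according to the current probabilities
        return random.choices(self.nodes, weights=[self.p[n] for n in self.nodes])[0]

    def reward(self, chosen):
        # L_R-I update: reinforce the rewarded action, shrink the rest;
        # when there is no reward ("inaction"), probabilities stay unchanged
        for n in self.nodes:
            if n == chosen:
                self.p[n] += self.lr * (1.0 - self.p[n])
            else:
                self.p[n] -= self.lr * self.p[n]

la = LearningAutomaton(["node_a", "node_b", "node_c"])
for _ in range(200):
    n = la.choose()
    if n == "node_a":           # pretend node_a has the most stable link
        la.reward(n)
print(max(la.p, key=la.p.get))  # the reinforced node dominates after training
```

Over time the automaton concentrates probability on whichever node keeps earning rewards (here, the node with the stable link), which matches the paper's goal of steering offloading toward reliable devices.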

16 citations

Book ChapterDOI
01 Jan 2018
TL;DR: An ant colony optimization algorithm, implemented in the PERMA-G framework as an extension of the authors' previous work, is used to reduce the energy consumption and execution time of scheduled tasks.
Abstract: Grid computing is treated as one of the emerging fields in distributed computing; it exploits services such as resource sharing and workflow scheduling. One of the major issues in grid computing is resource scheduling. This can be handled using the ant colony optimization algorithm, implemented in the PERMA-G framework as an extended version of our previous work. Ant colony optimization is used to reduce the energy consumption and execution time of the tasks: it follows the mechanism of a natural ant colony to compute the total execution time and power consumption of dynamically scheduled tasks. The experimental results show the performance of the proposed model.
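The chapter's PERMA-G-specific update rules are not given in the abstract; a generic ACO task-to-machine scheduler over a combined time-and-energy cost matrix might look like the sketch below. The cost matrix, pheromone deposit rule, and parameter values are illustrative assumptions.

```python
import random

def aco_schedule(exec_cost, n_ants=10, n_iter=50, rho=0.1, seed=1):
    """exec_cost[t][m]: combined time+energy cost of task t on machine m.
    Returns the best task -> machine assignment found and its cost."""
    rng = random.Random(seed)
    T, M = len(exec_cost), len(exec_cost[0])
    tau = [[1.0] * M for _ in range(T)]       # pheromone trails
    best, best_cost = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            assign = []
            for t in range(T):
                # desirability = pheromone * heuristic (1 / cost)
                w = [tau[t][m] / exec_cost[t][m] for m in range(M)]
                assign.append(rng.choices(range(M), weights=w)[0])
            cost = sum(exec_cost[t][assign[t]] for t in range(T))
            if cost < best_cost:
                best, best_cost = assign, cost
        # evaporate everywhere, then deposit on the best-so-far solution
        tau = [[(1 - rho) * x for x in row] for row in tau]
        for t in range(T):
            tau[t][best[t]] += 1.0 / best_cost
    return best, best_cost

plan, cost = aco_schedule([[4, 2], [3, 5], [1, 6]])
print(plan, cost)   # the cheapest assignment is [1, 0, 0] with cost 6
```

Pheromone evaporation keeps the colony exploring, while deposits proportional to solution quality bias later ants toward low-cost schedules.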

4 citations

Book ChapterDOI
01 Jan 2018
TL;DR: This chapter concludes by posing some questions to develop the future work in semantic indexing, active learning, semi-supervised learning, domain adaptation modelling, data sampling, and data abstractions.
Abstract: In recent years, big data analytics has been a major focus of research. Complex structures are trained at each level to simplify data abstractions. Deep learning algorithms are among the promising approaches for automating the extraction of complex patterns from large data sets. Deep learning mechanisms produce better results in machine learning tasks such as computer vision, improved classification modelling, probabilistic models of data samples, and invariant data sets. The challenges handled by big data are fast information retrieval, semantic indexing, extraction of complex patterns, and data tagging. Some investigations concentrate on integrating deep learning approaches with big data analytics, which poses severe challenges such as scalability, high dimensionality, data streaming, and distributed computing. Finally, the chapter concludes by posing questions to guide future work in semantic indexing, active learning, semi-supervised learning, domain adaptation modelling, data sampling, and data abstractions.

3 citations

Journal ArticleDOI
08 Sep 2016
TL;DR: A cloud-based smart grid data management model and an optimisation algorithm to resolve the challenges of such environment and minimise the data storage prices are proposed.
Abstract: The smart grid is an advanced power grid with efficient two-way communication of power and data. Due to the modular nature of the network and its users, data storage and processing are required, and cloud computing is an efficient technology for storing and managing such data in the real world. To solve these data management issues, we propose a cloud-based smart grid data management (CSGDM) model and an optimisation algorithm to resolve the challenges of such an environment. The proposed algorithm deals with grid utilisation and power-usage cost calculation, with an optimisation mechanism for the cost function that minimises data storage prices.
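The CSGDM cost function itself is not given in the abstract. As a toy illustration of "minimising data storage prices", one could select a storage tier per data block by comparing linear storage-plus-access costs; the tier prices and the cost model below are invented for the example, not taken from the paper.

```python
def min_storage_cost(blocks, tiers):
    """blocks: (size_gb, reads_per_month) pairs;
    tiers: (store_cost_per_gb, cost_per_read) pairs.
    Picks, per block, the tier with the lowest monthly cost."""
    total, plan = 0.0, []
    for size, reads in blocks:
        costs = [size * s + reads * r for s, r in tiers]
        best = min(range(len(tiers)), key=lambda i: costs[i])
        plan.append(best)
        total += costs[best]
    return plan, total

# a frequently read block favors cheap reads; a cold block favors cheap storage
plan, cost = min_storage_cost(
    blocks=[(100, 1000), (100, 1)],
    tiers=[(0.023, 0.0004), (0.004, 0.01)],  # "standard" vs "archive" (made-up prices)
)
print(plan, round(cost, 2))   # [0, 1] 3.11
```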

1 citation


Cited by
Book ChapterDOI
01 Jan 2020
TL;DR: The concept of fuzzy systems plays a pivotal role in designing such strategies, making use of fuzzy sets, fuzzy logic and reasoning, fuzzy controllers, rough sets, etc.
Abstract: Cloud computing plays a pivotal role in the development of technology, and the technology is gaining momentum across emerging computational intelligence, such as artificial intelligence in IoT. It works on the concept of everywhere, everything, and every-time service on demand for end users. Developers with innovative ideas for new Web services do not want massive capital outlays in hardware to deploy their service, or the human expense to operate it. Providing optimized algorithms for the open issues in cloud computing is a global challenge: each problem should be solved in a way that yields an optimized solution. Among these research areas, work on models and challenges for efficient resource optimization in the cloud is developing at a rapid pace. The key role is allocating these resources efficiently and designing algorithms for time and cost optimization while keeping quality of service and workload characteristics in consideration. Low accuracy and the large computational complexity of such algorithms are major drawbacks affecting their performance. Therefore, the concept of fuzzy systems plays a pivotal role in designing such strategies, making use of fuzzy sets, fuzzy logic and reasoning, fuzzy controllers, rough sets, etc. In this scenario, the research approach is based on the technology acceptance model (TAM) and rough set theory (RST). RST is a strong method for qualitative analysis situations: it is a technique for knowledge discovery that handles problems such as inductive reasoning, automatic classification, pattern recognition, learning algorithms, and data reduction.
Rough set theory is a new method in cloud service selection, aiming to provide the best services to cloud users and efficient service improvement for cloud providers. There are numerous routes by which it is measured; some of them are feature selection, security services, and algorithmic and mathematical methods. The simulation of the work is performed with a tool used for building an ontology framework; the simulation, which deals with the IoT services provided by an IoT service provider to the user, achieves the best utilization under the chosen parameters and ontology technique.
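The core of rough set theory referenced above is the pair of lower and upper approximations of a concept under an indiscernibility relation; a minimal sketch follows. The service names, equivalence classes, and "good" ratings are invented for illustration.

```python
def rough_approx(partition, target):
    """Lower/upper approximations of `target` under an indiscernibility
    partition (a list of equivalence classes covering the universe)."""
    target = set(target)
    lower, upper = set(), set()
    for cls in partition:
        cls = set(cls)
        if cls <= target:   # class certainly inside the concept
            lower |= cls
        if cls & target:    # class possibly inside the concept
            upper |= cls
    return lower, upper

# cloud services indistinguishable on the chosen features form one class
partition = [{"s1", "s2"}, {"s3"}, {"s4", "s5"}]
good = {"s1", "s2", "s4"}   # services users rated "good"
lower, upper = rough_approx(partition, good)
print(sorted(lower), sorted(upper))  # ['s1', 's2'] ['s1', 's2', 's4', 's5']
```

The gap between the two approximations (here s4 and s5) is the boundary region: services that the available features cannot certainly classify as good, which is exactly what drives data reduction and feature selection in RST-based service selection.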

21 citations

Journal ArticleDOI
01 Jan 2020
TL;DR: The main aim of the Internet of Things is to connect every "thing" to the internet; a priority-based fuzzy scheduling approach under the fog computing paradigm is proposed to improve QoS and reduce makespan.
Abstract: The main aim of the Internet of Things (IoT) is to get every "thing" (sensors, smart cameras, wearable devices, and smart home appliances) to connect to the internet. The high volume of data processed between IoT devices hence requires large storage and a huge number of applications, which cloud computing offers as a service. The purpose of an IoT-based cloud is to manage resources and utilize tasks effectively in the cloud. End-user applications are essential to enhance the QoS parameters, and per those parameters the service provider speeds up tasks, so responsibilities need to be assigned based on priority. Cloud services are extended to the network edge, and the proposed model falls under the fog computing paradigm to reduce makespan. A priority-based fuzzy scheduling approach is driven by a dynamic feedback-based mechanism. The proposed mechanism is verified against diverse existing algorithms and shown to produce effective results. Keywords: Cloud With IoT, Feedback Based Mechanism, Fog Computing, Fuzzy, Latency, Resource Management, Virtual Machines
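The paper's exact membership functions are not given in the abstract; a priority-based fuzzy scheduler typically maps crisp task attributes through fuzzy memberships and defuzzifies them into a priority score. In the sketch below, the breakpoints, weights, and task values are illustrative assumptions.

```python
def ramp_down(x, lo, hi):
    """Membership that is 1 below lo, 0 above hi, linear in between."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

def fuzzy_priority(deadline_s, length_mi):
    """Crisp priority in [0, 1]: sooner deadlines and shorter tasks rank
    higher. Breakpoints (10/60 s, 100/1000 MI) and the 0.7/0.3 weights
    are illustrative, not taken from the paper."""
    urgency = ramp_down(deadline_s, 10, 60)      # "urgent" membership
    shortness = ramp_down(length_mi, 100, 1000)  # "short task" membership
    return 0.7 * urgency + 0.3 * shortness       # weighted defuzzification

tasks = [("backup", 300, 5000), ("alarm", 5, 50), ("report", 30, 400)]
tasks.sort(key=lambda t: fuzzy_priority(t[1], t[2]), reverse=True)
print([t[0] for t in tasks])   # ['alarm', 'report', 'backup']
```

A dynamic feedback mechanism, as in the paper, would adjust these memberships or weights at run time based on observed latency.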

18 citations

Journal ArticleDOI
Yingjie Wang, Lei Wu, Xiusheng Yuan, Xiao Liu, Xuejun Li
TL;DR: This paper proposes an energy-efficient and deadline-aware task offloading strategy based on the channel constraint-based strategy (CC-NAIWPSO), which can obtain a near-optimal offloading plan that can consume less energy while meeting the deadlines.
Abstract: Energy efficiency is a fundamental problem due to the fact that numerous tasks are running on mobile devices with limited resources. Mobile cloud computing (MCC) technology can offload computation-intensive tasks from mobile devices onto powerful cloud servers, which can significantly reduce the energy consumption of mobile devices and thus enhance their capabilities. In MCC, mobile devices transmit data through the wireless channel. However, since the state of the channel is dynamic, offloading at a low transmission rate will result in serious waste of time and energy, which further degrades the quality of service (QoS). To address this problem, this paper proposes an energy-efficient and deadline-aware task offloading strategy based on the channel constraint, with the goal of minimizing the energy consumption of mobile devices while satisfying the deadline constraints of mobile cloud workflows. Specifically, we first formulate a task offloading decision model that combines the channel state with task attributes such as the workload and the size of the data transmission to determine whether the task needs to be offloaded or not. Afterward, we apply it to a new adaptive inertia weight-based particle swarm optimization (NAIWPSO) algorithm to create our channel constraint-based strategy (CC-NAIWPSO), which can obtain a near-optimal offloading plan that can consume less energy while meeting the deadlines. The experimental results show that our proposed task offloading strategy can outperform other strategies with respect to the energy consumption of mobile devices, the execution time of mobile cloud workflows, and the running time of algorithms.
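The paper's NAIWPSO details are not reproduced in the abstract; a generic sigmoid-transfer binary PSO over offloading bits, with a linearly decaying inertia weight standing in for the adaptive one, can be sketched as follows. The deadline penalty, parameters, and cost model are illustrative assumptions.

```python
import math
import random

def pso_offload(e_local, e_off, deadline, t_local, t_off,
                n_particles=20, n_iter=60, seed=3):
    """Binary PSO over offloading decisions: bit i = True means task i
    is offloaded. Minimizes total device energy; plans that miss a
    per-task deadline get a large penalty."""
    rng = random.Random(seed)
    n = len(e_local)

    def cost(x):
        energy = sum(e_off[i] if x[i] else e_local[i] for i in range(n))
        late = sum(1 for i in range(n)
                   if (t_off[i] if x[i] else t_local[i]) > deadline[i])
        return energy + 1e3 * late

    X = [[rng.random() < 0.5 for _ in range(n)] for _ in range(n_particles)]
    V = [[0.0] * n for _ in range(n_particles)]
    P = [x[:] for x in X]                  # personal bests
    g = min(P, key=cost)[:]                # global best
    for it in range(n_iter):
        w = 0.9 - 0.5 * it / n_iter        # decaying inertia weight
        for k in range(n_particles):
            for i in range(n):
                V[k][i] = (w * V[k][i]
                           + 2.0 * rng.random() * (P[k][i] - X[k][i])
                           + 2.0 * rng.random() * (g[i] - X[k][i]))
                s = 1.0 / (1.0 + math.exp(-V[k][i]))  # sigmoid transfer
                X[k][i] = rng.random() < s
            if cost(X[k]) < cost(P[k]):
                P[k] = X[k][:]
                if cost(P[k]) < cost(g):
                    g = P[k][:]
    return g, cost(g)

plan, energy = pso_offload(e_local=[5, 5], e_off=[1, 1],
                           deadline=[10, 10], t_local=[2, 2], t_off=[3, 3])
print(plan, energy)   # offloading both tasks is optimal here (energy 2 vs 10)
```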

16 citations

Journal ArticleDOI
Huaming Wu
TL;DR: An analytical queueing model for delayed offloading systems with intermittent connectivity is developed and a deadline is set to detect offloading failures when taking the impatient jobs and service interruptions into account.
Abstract: Mobile devices and wireless network links are sometimes unreliable and unstable owing to user mobility in mobile wireless environments. As a result, failures may occur during the offloading process, which has a huge influence on the offloading performance. Worse still, it is challenging to make high-level offloading decisions since they closely rely on an accurate predicting model, especially when encountering with offloading failures. Unfortunately, due to the high mobility and heterogeneous network environments, such a model is currently not available to researchers and practitioners. To fill that gap, this letter develops an analytical queueing model for delayed offloading systems with intermittent connectivity. We set a deadline to detect offloading failures when taking the impatient jobs and service interruptions into account. Furthermore, for different types of mobile applications, a tradeoff analysis between energy saving and latency reducing is analyzed on the basis of an energy–response time weighted product metric. The proposed offloading model can be effectively used for describing complex and realistic offloading systems with the presence of failures.
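The letter's full model covers interruptions and impatient jobs; the tradeoff metric it builds on, an energy-response time weighted product, is commonly written as E^w * T^(1-w), and can be illustrated with a plain M/M/1 response time. The rates, power figures, and weight below are invented for the example.

```python
def mm1_response(lam, mu):
    """Mean response time of an M/M/1 queue (requires mu > lam)."""
    return 1.0 / (mu - lam)

def erwp(energy, resp_time, w=0.5):
    """Energy-response time weighted product: lower is better.
    w near 1 favors energy saving; w near 0 favors low latency."""
    return energy ** w * resp_time ** (1 - w)

lam = 2.0                                   # job arrival rate (jobs/s)
local = erwp(5.0, mm1_response(lam, 6.0))   # fast but power-hungry
cloud = erwp(1.0, mm1_response(lam, 4.0))   # slower but far cheaper
print("offload" if cloud < local else "local")   # offload
```

Sweeping w from 0 to 1 reproduces the kind of energy-versus-latency tradeoff analysis the letter performs for different application types.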

14 citations

Book ChapterDOI
01 Jan 2019
TL;DR: This paper analyzes energy consumption when executing on the mobile device versus a remote cloud; the offloading method and level of partitioning are examined across different framework parameters, and a comparison between different energy offloading techniques is summarized.
Abstract: Mobile cloud computing (MCC) is an emerging technology whose popularity is increasing drastically day by day. Mobile devices are constrained by low battery power, limited processing capability, and limited storage capacity, and MCC faces several security issues. Users expect more computing power and security from mobile devices; to support them, mobile computing integrates with cloud computing (CC) to form MCC. Computation offloading improves the computing features of smartphones (battery power, storage, processing capability) as well as the user experience with the device. In this paper, our main focus is to analyze the energy consumed by executing on the mobile device versus the remote cloud; the offloading method and level of partitioning are examined by exploring different parameters of existing frameworks. We summarize a comparison between different energy offloading techniques.
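The comparison of local versus remote execution energy that this chapter surveys is often reduced to a back-of-envelope test in the style of Kumar and Lu's classic analysis: offload when local compute energy exceeds the energy to transmit the input plus the energy spent idling while the cloud computes. The numbers below are illustrative, not from the chapter.

```python
def should_offload(cycles, speed_local, p_compute, p_idle,
                   data_bits, bandwidth, p_tx, speed_cloud):
    """Offload when computing locally costs more energy than shipping
    the data and idle-waiting for the cloud result."""
    e_local = (cycles / speed_local) * p_compute        # J to run on the phone
    e_offload = ((data_bits / bandwidth) * p_tx         # J to transmit input
                 + (cycles / speed_cloud) * p_idle)     # J idling while cloud runs
    return e_offload < e_local, e_local, e_offload

# heavy computation with a small payload: the classic case where offloading wins
ok, e_l, e_o = should_offload(cycles=1e9, speed_local=1e9, p_compute=0.9,
                              p_idle=0.3, data_bits=1e5, bandwidth=1e6,
                              p_tx=1.3, speed_cloud=1e10)
print(ok, e_l, round(e_o, 2))   # True 0.9 0.16
```

Shrinking the bandwidth or inflating the payload flips the decision, which is why the surveyed frameworks make partitioning decisions per task rather than globally.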

11 citations