
Sunitha Basodi

Researcher at Georgia State University

Publications -  25
Citations -  359

Sunitha Basodi is an academic researcher from Georgia State University. The author has contributed to research in the topics of Deep learning & Computer science. The author has an h-index of 7, and has co-authored 19 publications receiving 121 citations.

Papers
Journal ArticleDOI

A Survey on Algorithms for Intelligent Computing and Smart City Applications

TL;DR: In this paper, the authors investigate the current stage of the smart city, describe the framework of a smart city in accordance with the given definition, and discuss and analyze various intelligent algorithms for making cities smarter, along with specific examples.
Journal ArticleDOI

Multivariate time series dataset for space weather data analytics

TL;DR: A comprehensive multivariate time series (MVTS) dataset, extracted from solar photospheric vector magnetograms in the Spaceweather HMI Active Region Patch (SHARP) series, is introduced and made openly accessible.
Journal ArticleDOI

Adaptive computation offloading and resource allocation strategy in a mobile edge computing environment

TL;DR: This paper proposes an adaptive task offloading and resource allocation algorithm for the MEC environment that achieves the best performance in reducing average task response time and total system energy consumption, improving a system utility that reflects the interests of both users and service providers.
Posted Content

Efficient Hyperparameter Optimization in Deep Learning Using a Variable Length Genetic Algorithm.

TL;DR: This article proposes using a variable-length genetic algorithm (GA) to systematically and automatically tune the hyperparameters of a CNN to improve its performance, and shows that the algorithm can find good CNN hyperparameters efficiently.
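The core idea of a variable-length GA for hyperparameter search can be sketched as follows. Everything here is an illustrative assumption rather than the paper's actual setup: the chromosome is simplified to a list of convolutional filter counts, and the fitness function is a toy stand-in (in the paper, fitness would come from training and validating each candidate CNN):

```python
import random

random.seed(0)

GENE_POOL = [16, 32, 64, 128]  # candidate filter counts (illustrative)

def random_chromosome(min_len=1, max_len=5):
    # A chromosome is a variable-length list of conv-layer filter counts.
    return [random.choice(GENE_POOL)
            for _ in range(random.randint(min_len, max_len))]

def fitness(chrom):
    # Toy proxy for validation accuracy: rewards moderate depth and
    # capacity so the example runs instantly. A real implementation
    # would train the encoded CNN and return its validation score.
    return sum(chrom) / (1 + abs(len(chrom) - 3))

def crossover(a, b):
    # One-point crossover that tolerates parents of different lengths,
    # so offspring length can differ from both parents.
    ca = random.randint(1, len(a))
    cb = random.randint(0, len(b))
    return a[:ca] + b[cb:]

def mutate(chrom, rate=0.2):
    # Point mutations plus grow/shrink moves that change chromosome length.
    chrom = [random.choice(GENE_POOL) if random.random() < rate else g
             for g in chrom]
    if random.random() < rate and len(chrom) > 1:
        chrom.pop()                        # shrink: drop a layer
    if random.random() < rate:
        chrom.append(random.choice(GENE_POOL))  # grow: add a layer
    return chrom

def evolve(pop_size=20, generations=10):
    pop = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # elitist selection
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

The variable-length encoding lets the GA explore network depth as well as per-layer capacity, instead of fixing the architecture size in advance.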
Journal ArticleDOI

Gradient Amplification: An Efficient Way to Train Deep Neural Networks

TL;DR: In this paper, the authors propose a gradient amplification approach for training deep learning models that prevents vanishing gradients, and also develop a training strategy that enables or disables gradient amplification across epochs with different learning rates.
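The two pieces of this summary, amplifying gradients and toggling amplification across epochs, can be sketched minimally as below. The amplification factor, the on/off schedule, and the learning rates are hypothetical choices for illustration, not the paper's values:

```python
import random

random.seed(0)

AMP_FACTOR = 2.0  # illustrative amplification factor (not from the paper)

def amplify(layer_grads, enabled):
    # Scale each layer's gradients when amplification is enabled -- a
    # stand-in for boosting gradients during backpropagation so they
    # do not vanish in early layers.
    if not enabled:
        return layer_grads
    return [[g * AMP_FACTOR for g in layer] for layer in layer_grads]

def schedule(epoch):
    # Hypothetical training strategy: amplify only during the early,
    # high-learning-rate phase, then disable it for fine-tuning.
    if epoch < 5:
        return 0.1, True      # (learning rate, amplification enabled)
    return 0.01, False

# Simulated small per-layer gradients for a 4-layer network.
grads = [[random.gauss(0, 1e-3) for _ in range(3)] for _ in range(4)]

for epoch in range(8):
    lr, amp_on = schedule(epoch)
    # Gradient-descent step using (possibly amplified) gradients.
    step = [[lr * g for g in layer] for layer in amplify(grads, amp_on)]
```

In a real framework this scaling would be applied inside the backward pass (e.g. via gradient hooks) rather than to a static gradient list.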