scispace - formally typeset
Author

S. Sureshkumar

Bio: S. Sureshkumar is an academic researcher. The author has contributed to research in the topics Package development process & Cohesion (computer science). The author has an h-index of 1 and has co-authored 1 publication receiving 7 citations.

Papers
Proceedings ArticleDOI
01 Jan 2012
TL;DR: This paper introduces the Intelligent Risk Analysis Model (IRAM) for reusability in developing component-based, service-oriented software written in the object-oriented programming language Java.
Abstract: Software development has evolved toward assembling the best modules running in various closed- and open-source software. Extracting the best components and fitting them into ongoing component-based development poses great challenges in running the software error-free with the desired outcomes. In this proposal we introduce an Intelligent Risk Analysis Model (IRAM) for reusability in developing component-based, service-oriented software written in the object-oriented programming language Java. The model searches a warehouse of Java object-oriented programming modules, which holds available modules from various projects, and returns a list of suitable candidates. Those modules undergo cohesion/coupling analysis to determine their individual strength and binding capacity before regression testing. Final outcomes are verified by regression-testing each successful module as it is integrated with the domain under development, and deployment proceeds once the product meets the expected outcome. Project risk is analyzed across three dimensions for reusable components: risk before transformation, adaptability risk, and reusability risk for Java packages. The IRAM model focuses on the domain and size of a Java package and its integration and dependency relationships, measuring coupling and cohesion as risk metrics so that reusable packages can be developed and deployed without undue risk.
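The screening step the abstract describes (rank repository modules by cohesion and coupling, keep only those strong enough for regression testing) can be sketched roughly as follows. This is an illustrative assumption, not the paper's actual algorithm; the thresholds, metric values, and module names are invented for the example.

```python
# Hypothetical sketch of IRAM-style module screening: filter candidate
# modules from a repository by cohesion (higher is better) and coupling
# (lower is better), then rank survivors best-first before regression
# testing. Metric values and thresholds are illustrative only.

def screen_modules(modules, min_cohesion=0.7, max_coupling=0.3):
    """Return modules whose cohesion/coupling qualify them for
    integration, sorted best-first by (cohesion - coupling)."""
    passed = [m for m in modules
              if m["cohesion"] >= min_cohesion and m["coupling"] <= max_coupling]
    return sorted(passed, key=lambda m: m["coupling"] - m["cohesion"])

candidates = [
    {"name": "auth",    "cohesion": 0.90, "coupling": 0.20},
    {"name": "billing", "cohesion": 0.60, "coupling": 0.10},  # too incohesive
    {"name": "search",  "cohesion": 0.80, "coupling": 0.50},  # too coupled
    {"name": "report",  "cohesion": 0.75, "coupling": 0.25},
]
selected = screen_modules(candidates)
print([m["name"] for m in selected])  # → ['auth', 'report']
```

Only the surviving, ranked modules would then proceed to the regression-test stage described above.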

7 citations


Cited by
Journal ArticleDOI
TL;DR: The performance assessment affirms that the AGA-based ANN model outperforms other techniques and hence can be used for early aging-resilient reusability optimization in WoS software design.
Abstract: Ensuring aging-resilient software design is of paramount significance for building fault-free software systems. In particular, assessing the reusability of software components enables efficient software design. The probability of aging proneness can be characterized by key object-oriented software metrics (OO-SM) such as the cohesion, coupling, and complexity of a software component. In this paper, an aging-resilient software-reusability prediction model is proposed for object-oriented Web of Service (WoS) software systems. The work introduces multilevel optimization to accomplish a novel reusability prediction model. Taking coupling, cohesion, and complexity as the characteristics that signify aging proneness, six CK metrics (WMC, CBO, DIT, LCOM, NOC, and RFC) are obtained from 100 WoS software systems. The extracted CK metrics undergo min-max normalization, which alleviates data imbalance and avoids saturation during learning. Ten-fold cross-validation followed by outlier detection enriches data quality for subsequent feature extraction, and the RSA algorithm is applied to reduce computational overhead. The SoftAudit tool estimates the reusability of each class, while binary ULR estimation sets the reuse-proneness threshold. Reuse-proneness prediction is performed with several classification algorithms, including LM, ANN, ELM, and evolutionary-computing-enriched ANN. The performance assessment affirms that the AGA-based ANN model outperforms the other techniques and can therefore be used for early aging-resilient reusability optimization in WoS software design.
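The min-max normalization step applied to the extracted CK metric values can be sketched as below. The sample WMC values are illustrative, not taken from the paper.

```python
# A minimal sketch of min-max normalization as applied to a column of
# CK metric values (e.g. WMC) before training: scale into [0, 1] so no
# single metric's range dominates and learning does not saturate.

def min_max_normalize(values):
    """Scale a list of metric values into [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:                       # constant column: map all to 0.0
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

wmc = [12, 45, 7, 30]                  # illustrative weighted-methods-per-class values
print(min_max_normalize(wmc))          # min (7) maps to 0.0, max (45) to 1.0
```

Each of the six metric columns would be normalized independently this way before cross-validation and classifier training.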

26 citations

Journal ArticleDOI
13 Feb 2020
TL;DR: The task-allocation and virtual-machine-placement problems are addressed in a single fog computing environment, and the results show that the proposed framework improves QoS in the fog environment.
Abstract: In today's world, large-scale migration of applications to fog computing is under way in the information technology world. The main issue in fog computing is providing an enhanced quality of service (QoS). QoS management comprises various methods for allocating fog-user applications in the virtual environment and for selecting a suitable method for mapping virtual resources to physical resources. Allocating resources effectively in the fog environment is also a major problem, particularly when the infrastructure is built from lightweight computing devices. In this article, the task-allocation and virtual-machine-placement problems are addressed in a single fog computing environment. Experiments show that the proposed framework improves QoS in the fog environment.
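The article does not specify its allocation scheme at this level of detail, but the task-to-VM allocation problem it addresses can be illustrated with a simple worst-fit-decreasing heuristic. All names, capacities, and demands below are assumptions for the sketch.

```python
# Hypothetical sketch of task allocation in a single fog environment:
# sort tasks by descending demand, then place each on the VM with the
# most remaining capacity (worst-fit decreasing). Tasks that fit on no
# VM are rejected, which would degrade QoS.

def allocate(tasks, vm_capacity):
    """Map task name -> VM index, or None if no VM can host the task."""
    free = list(vm_capacity)
    placement = {}
    for name, demand in sorted(tasks, key=lambda t: -t[1]):
        best = max(range(len(free)), key=lambda i: free[i])
        if free[best] >= demand:
            free[best] -= demand
            placement[name] = best
        else:
            placement[name] = None     # no VM has enough free capacity
    return placement

tasks = [("t1", 4), ("t2", 3), ("t3", 5)]
print(allocate(tasks, vm_capacity=[6, 6]))  # t2 is rejected: no VM has 3 free
```

A real fog framework would weigh latency, energy, and device heterogeneity rather than a single capacity number; this sketch only shows the shape of the placement decision.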

16 citations

Journal ArticleDOI
TL;DR: This research extracts SIoT big data using the well-known MapReduce framework; the proposed GA-EHO is implemented with machine learning classifiers to classify the data, and its efficiency is evaluated.
Abstract: The Social Internet of Things (SIoT) supports several novel networking applications and services for the IoT in a more productive and powerful way, and SIoT is currently a hotter topic than other extensions of the IoT. In this research, the authors extract SIoT big data using the well-known MapReduce model. Unwanted data and noise are removed from the database using a Gabor filter, and the large databases are mapped and reduced using the Hadoop MapReduce (HMR) technique to improve the efficiency of the proposed GA-EHO. Feature selection with GA-EHO is then performed on the filtered dataset. The proposed system is implemented with several machine learning classifiers that classify the data, and the efficiency of the proposed work is evaluated. From the simulation results, the proposed GA-EHO achieves a specificity of about 87.88%, a maximum accuracy of 99.1%, and a sensitivity of 81%; the results are also compared with other existing techniques.
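The genetic-algorithm side of the feature-selection step can be illustrated with a toy bitmask GA. This is not the paper's GA-EHO (the elephant-herding component and the real fitness function are omitted); the fitness function, data, and parameters below are invented for the sketch.

```python
# Toy genetic-algorithm feature selection: each candidate is a bitmask
# over features, fitness rewards selecting "useful" features and mildly
# penalizes mask size, and elitist selection with one-point crossover
# and point mutation evolves the population. Illustrative only.
import random

def fitness(mask, useful):
    """Reward each selected useful feature, penalize every selected bit."""
    hits = sum(1 for i, bit in enumerate(mask) if bit and i in useful)
    return hits - 0.1 * sum(mask)

def evolve(n_features, useful, pop_size=20, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, useful), reverse=True)
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_features)        # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda m: fitness(m, useful))

best = evolve(n_features=8, useful={0, 3, 5})
print(best)  # bits 0, 3, and 5 should tend toward 1
```

In the paper's pipeline this selection would run over features of the MapReduce-processed SIoT data, with classifier performance (not a toy score) as the fitness signal.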

5 citations

Journal ArticleDOI
01 Oct 2017

5 citations