Sudhish N. George
Researcher at National Institute of Technology Calicut
Publications - 91
Citations - 874
Sudhish N. George is an academic researcher from National Institute of Technology Calicut. The author has contributed to research in topics: Computer science & Encryption. The author has an h-index of 14 and has co-authored 78 publications receiving 568 citations.
Papers
Proceedings Article
An Efficient Framework for the Clustering of Human Activity Data Using Kernelized Robust Covariance Descriptors
TL;DR: In this paper, a new method for the efficient clustering of human activity data is proposed, which relies on skeletal data recorded with motion capture (mocap) systems.
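As a rough illustration of the covariance-descriptor idea this paper builds on, the sketch below represents each toy "skeletal sequence" by a regularized covariance matrix, compares descriptors with a log-Euclidean kernel, and clusters them spectrally. The data, dimensions, kernel width, and cluster count are all illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
from scipy.linalg import logm
from sklearn.cluster import SpectralClustering

def covariance_descriptor(seq):
    # seq: (frames, features) array of joint coordinates per frame
    c = np.cov(seq, rowvar=False)
    return c + 1e-6 * np.eye(c.shape[0])  # regularize so the matrix is SPD

def log_euclidean_kernel(descs, gamma=0.1):
    # RBF kernel on matrix logarithms (log-Euclidean distance between SPD matrices)
    logs = [logm(c).real for c in descs]
    n = len(logs)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            d = np.linalg.norm(logs[i] - logs[j], "fro")
            K[i, j] = np.exp(-gamma * d ** 2)
    return K

rng = np.random.default_rng(0)
# 8 synthetic "sequences": 50 frames of 9 features each (stand-in for mocap data)
seqs = [rng.standard_normal((50, 9)) for _ in range(8)]
K = log_euclidean_kernel([covariance_descriptor(s) for s in seqs])
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(K)
```

Spectral clustering accepts the precomputed kernel directly, so the SPD geometry is handled entirely inside the kernel computation.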
Proceedings Article
A New Optimization Model for the Restoration of the Deteriorated Hyperspectral Images
TL;DR: In this paper, a tensor SVD-based low-rank decomposition combined with spatial-spectral total variation (SSTV) regularization is proposed for removing noise artefacts from hyperspectral images.
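A much-simplified sketch of the low-rank-plus-smoothness idea: matrix singular value thresholding on the spectral unfolding stands in for the paper's tensor SVD, and a crude neighbor-averaging step stands in for the SSTV proximal step. Every parameter and the alternating scheme itself are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of the nuclear norm
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def denoise_hsi(Y, tau=0.5, lam=0.1, n_iter=20):
    # Y: (H, W, B) noisy hyperspectral cube
    H, W, B = Y.shape
    X = Y.copy()
    for _ in range(n_iter):
        # low-rank step on the spectral (mode-3) unfolding
        X = svt(X.reshape(H * W, B), tau).reshape(H, W, B)
        # crude spatial smoothing as a stand-in for the SSTV proximal step
        X = (1 - lam) * X + lam * (np.roll(X, 1, 0) + np.roll(X, -1, 0)
                                   + np.roll(X, 1, 1) + np.roll(X, -1, 1)) / 4
        # stay close to the observed data
        X = 0.9 * X + 0.1 * Y
    return X

rng = np.random.default_rng(0)
clean = np.outer(rng.random(64), rng.random(8)).reshape(8, 8, 8)  # rank-1 toy cube
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
restored = denoise_hsi(noisy)
```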
Proceedings ArticleDOI
Dictionary learning based sparse coefficients for speech recognition in noisy environment
TL;DR: Sparse coefficients derived from a learned dictionary of signal atoms are used for feature extraction, and the proposed method is observed to achieve better performance than competing approaches.
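The core step here (learning a dictionary of atoms and encoding each frame as a sparse coefficient vector) can be sketched with scikit-learn. The random frames below merely stand in for real speech features such as MFCCs, and the dictionary size and sparsity level are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
# stand-in for speech feature frames: 200 frames of 13-dim features
frames = rng.standard_normal((200, 13))

# learn an overcomplete dictionary of signal atoms; each frame is then
# encoded with at most 5 nonzero coefficients via orthogonal matching pursuit
dl = DictionaryLearning(n_components=32, transform_algorithm="omp",
                        transform_n_nonzero_coefs=5, max_iter=10,
                        random_state=0)
codes = dl.fit_transform(frames)  # sparse coefficients, one row per frame
```

The rows of `codes` would then serve as features for a downstream recognizer.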
Proceedings ArticleDOI
A Tensor Non-Convex Low Rank And Sparse Constrained Band Selection Scheme For Clustering Of Hyperspectral Data
TL;DR: In this article, a tensor-based band selection scheme using submodule clustering is proposed; the framework is capable of preserving the spatial structure of individual spectral bands in hyperspectral imaging data.
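A heavily simplified, matrix-based stand-in for the band-selection idea: cluster the spectral bands of a toy cube by their flattened pixel profiles and keep one representative band per cluster. Plain k-means replaces the paper's tensor submodule clustering, and the cube, band count, and cluster count are all illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
cube = rng.standard_normal((16, 16, 40))  # (H, W, bands) toy hyperspectral cube

# flatten each band to a vector and cluster the bands
bands = cube.reshape(-1, 40).T            # (bands, pixels)
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(bands)

# keep the band nearest each cluster centre as its representative
selected = []
for i, c in enumerate(km.cluster_centers_):
    in_cluster = np.where(km.labels_ == i)[0]
    nearest = in_cluster[np.argmin(np.linalg.norm(bands[in_cluster] - c, axis=1))]
    selected.append(int(nearest))
```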
Proceedings ArticleDOI
Performance optimization of static switch fed-three phase induction motors using PSO
K. Nafeesa, Sudhish N. George, +1 more
TL;DR: Results show that the optimum firing angle obtained through the PSO algorithm matches the graphical method; by setting this firing angle, the machine can be started like direct-on-line starting with better starting performance.
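A minimal particle swarm optimization loop of the kind used to search for a firing angle can be sketched as below. The objective is a made-up surrogate (the real one would come from the induction-motor model in the paper), and the swarm parameters are common textbook defaults, not the paper's settings.

```python
import numpy as np

def loss(alpha):
    # hypothetical surrogate objective over the firing angle alpha (radians);
    # the paper's true objective derives from the motor's starting performance
    return (np.sin(alpha) - 0.6) ** 2 + 0.1 * alpha

def pso(f, lo, hi, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n)          # particle positions
    v = np.zeros(n)                     # particle velocities
    pbest, pval = x.copy(), f(x)        # per-particle best positions/values
    g = pbest[np.argmin(pval)]          # global best
    for _ in range(iters):
        r1, r2 = rng.random(n), rng.random(n)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = f(x)
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        g = pbest[np.argmin(pval)]
    return g, float(f(g))

alpha_opt, best = pso(loss, 0.0, np.pi / 2)  # search the first quadrant
```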