
Md. Shahjahan

Researcher at Khulna University of Engineering & Technology

Publications: 70
Citations: 802

Md. Shahjahan is an academic researcher from Khulna University of Engineering & Technology. The author has contributed to research in the topics of artificial neural networks and backpropagation, has an h-index of 11, and has co-authored 68 publications receiving 660 citations. Previous affiliations of Md. Shahjahan include the University of Fukui and Khulna University.

Papers
Journal Article

A new local search based hybrid genetic algorithm for feature selection

TL;DR: A new hybrid genetic algorithm (HGA) for feature selection (FS), called HGAFS, consistently selects smaller subsets of salient features that yield better classification accuracy.
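The idea of hybridizing a genetic algorithm with local search for feature selection can be sketched as follows. This is a minimal illustration, not the paper's method: the binary-chromosome encoding is standard, but the toy relevance-based fitness and the greedy bit-flip local search are assumptions; HGAFS evaluates subsets with an actual classifier.

```python
import random

def fitness(chrom, relevance, penalty=0.05):
    # Toy surrogate fitness (assumption): reward relevant features,
    # penalize subset size. The paper uses classification accuracy.
    return sum(r for b, r in zip(chrom, relevance) if b) - penalty * sum(chrom)

def local_search(chrom, relevance):
    # Greedy bit-flip improvement: the "local search" step hybridized into the GA.
    for i in range(len(chrom)):
        flipped = chrom[:]
        flipped[i] ^= 1
        if fitness(flipped, relevance) > fitness(chrom, relevance):
            chrom = flipped
    return chrom

def hgafs_sketch(relevance, pop=20, gens=30, seed=0):
    """Binary chromosomes mark selected features; GA evolves them,
    local search refines each child."""
    rng = random.Random(seed)
    n = len(relevance)
    population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda c: fitness(c, relevance), reverse=True)
        parents = population[: pop // 2]
        children = []
        for _ in range(pop - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)             # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:                # bit-flip mutation
                j = rng.randrange(n)
                child[j] ^= 1
            children.append(local_search(child, relevance))  # hybrid step
        population = parents + children
    return max(population, key=lambda c: fitness(c, relevance))
```

With a relevance vector such as `[0.9, 0.8, 0.01, 0.02, 0.7]`, the sketch converges to selecting only the three salient features, illustrating the "reduced-size subset" behaviour the abstract describes.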
Journal Article

A new hybrid ant colony optimization algorithm for feature selection

TL;DR: A new hybrid ant colony optimization (ACO) algorithm for feature selection (FS), called ACOFS, uses a neural network and generates reduced-size subsets of salient features while achieving significant classification accuracy.
Journal Article

A Paper Currency Recognition System Using Negatively Correlated Neural Network Ensemble

TL;DR: A currency recognition system using an ensemble neural network (ENN) recognizes highly noisy or old images of Taka banknotes, and reduces the chance of misclassification compared with a single network or an ensemble whose members are trained independently.
Book Chapter

An Automatic Speaker Recognition System

TL;DR: Speaker recognition is the process of identifying a speaker by analyzing the spectral shape of the voice signal, extracting and matching features through Mel-frequency cepstral coefficients (MFCCs).
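The extract-and-match pipeline described above can be sketched with plain NumPy. This is a heavily simplified single-frame MFCC with Euclidean-distance matching; the frame size, filterbank parameters, and nearest-template matching are illustrative assumptions, and a real system (including the paper's) would add framing, windowing, and a fuller recognition model.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc(frame, sr=8000, n_fft=256, n_filters=20, n_ceps=12):
    """Simplified MFCC of a single frame: power spectrum -> mel filterbank
    -> log energies -> DCT-II (cepstral coefficients)."""
    spectrum = np.abs(np.fft.rfft(frame, n_fft)) ** 2
    # Triangular mel filterbank between 0 Hz and the Nyquist frequency.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        for k in range(l, c):
            fbank[i - 1, k] = (k - l) / max(c - l, 1)   # rising slope
        for k in range(c, r):
            fbank[i - 1, k] = (r - k) / max(r - c, 1)   # falling slope
    log_energy = np.log(fbank @ spectrum + 1e-10)
    # DCT-II decorrelates filterbank energies into cepstral coefficients.
    n = np.arange(n_filters)
    dct = np.cos(np.pi * np.outer(np.arange(n_ceps), 2 * n + 1) / (2 * n_filters))
    return dct @ log_energy

def identify(feature, templates):
    """Match against enrolled speaker templates by Euclidean distance."""
    return min(templates, key=lambda name: np.linalg.norm(feature - templates[name]))
```

Enrolling one MFCC vector per speaker and identifying a query frame as the nearest template shows the basic matching idea, even though practical systems compare whole sequences of frames.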
Journal Article

Faster training using fusion of activation functions for feed forward neural networks.

TL;DR: The proposed 'Fusion of Activation Functions' (FAF), in which different conventional activation functions (AFs) are combined to compute the final activation, makes learning faster and works better than the individual AFs used independently in backpropagation (BP).
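The fusion idea — computing a neuron's final activation as a combination of several conventional AFs — can be sketched as a weighted sum. The particular AFs (sigmoid, tanh, linear) and the fixed weights below are assumptions for illustration; the paper's exact fusion scheme may differ, e.g. by learning the combination weights during training.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fused_activation(x, weights=(0.4, 0.3, 0.3)):
    """Fusion of Activation Functions (sketch): the final activation is a
    weighted combination of several conventional AFs. Weights are assumed
    fixed here; they could also be adapted by backpropagation."""
    w1, w2, w3 = weights
    return w1 * sigmoid(x) + w2 * np.tanh(x) + w3 * x  # sigmoid + tanh + linear
```

Because the fused function blends a saturating AF with a linear term, its gradient does not vanish as quickly as a lone sigmoid's, which is one intuition for why such a fusion could speed up BP training.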