Topic

Hybrid neural network

About: Hybrid neural network is a research topic. Over its lifetime, 1305 publications have been published on this topic, receiving 18223 citations.


Papers
Book Chapter DOI
01 Jan 2021
TL;DR: In this paper, a distributed modeling approach for a deep neural network (DNN) classifier, a hybrid neural network model, is presented for segmentation-free Telugu word recognition.
Abstract: This paper presents a distributed modeling approach for a deep neural network (DNN) classifier for segmentation-free Telugu word recognition. The DNN is a hybrid neural network model that combines convolutional neural network (CNN) and recurrent neural network (RNN) layers, with a connectionist temporal classification (CTC) layer producing the output. The presence of CNN layers requires advanced computer systems with graphics processing units (GPUs) for speedy training of a DNN model. The distributed modeling approach involves distributing the load across a TPU. As a comparative study, we also train the same model on CPU and GPU systems. Our method aims to handle large batch sizes and reduce modeling time, which is precious to researchers. The word error rate (WER) on the test set is 5%, which is promising.
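As a rough illustration of the CNN + RNN + CTC stack described above, the following is a minimal sketch in Keras. The image size, vocabulary size, and layer widths are placeholder assumptions, not values from the paper; for TPU training, the same model would be built inside a tf.distribute.TPUStrategy scope.

```python
# Hedged sketch of a CNN + RNN + CTC hybrid for segmentation-free word
# recognition. All sizes below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers

IMG_H, IMG_W, VOCAB = 32, 256, 64            # assumed input/label sizes

inputs = layers.Input(shape=(IMG_H, IMG_W, 1), name="image")
x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
x = layers.MaxPooling2D((2, 2))(x)
x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
x = layers.MaxPooling2D((2, 2))(x)
x = layers.Permute((2, 1, 3))(x)             # make image width the time axis
x = layers.Reshape((IMG_W // 4, (IMG_H // 4) * 64))(x)
x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)
logits = layers.Dense(VOCAB + 1, activation="softmax")(x)   # +1 for CTC blank
model = tf.keras.Model(inputs, logits)

# CTC loss via the built-in Keras backend helper.
def ctc_loss(labels, y_pred, input_len, label_len):
    return tf.keras.backend.ctc_batch_cost(labels, y_pred, input_len, label_len)

# For TPU distribution, the model construction above could be wrapped in
# tf.distribute.TPUStrategy().scope() to spread the batch across TPU cores.
```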
Proceedings Article
01 Jan 2010
TL;DR: Tests have shown that the structure described here reaches the same solution accuracy as a continuous Hopfield-like network in half the number of classical Hopfield iterations.
Abstract: In this paper, we describe a completely new architecture of artificial neural networks, based on the Hopfield structure, for solving the stereo matching problem. The hybrid neural network consists of a classical analogue Hopfield neural network and a maximum neural network. The role of the analogue Hopfield network is to find the attraction area of the global minimum, whereas the maximum network locates this minimum accurately. The presented network is characterized by an extremely high working rate with the same accuracy as a classical Hopfield-like network, which is important for applications and systems supporting visually impaired people. The network was tested experimentally using real stereo pictures as well as simulated stereo images, allowing errors to be calculated and a direct comparison to be made with the classic analogue Hopfield neural network. The tests have shown that the structure described here reaches the same solution accuracy as a continuous Hopfield-like network in half the number of classical Hopfield iterations.
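The two-stage idea (analogue Hopfield relaxation to locate the basin of the global minimum, then a winner-take-all "maximum" stage to pin it down) can be sketched as follows. The energy terms W and b, the step size, and the grouping into disparity candidates are illustrative assumptions, not the paper's formulation.

```python
# Hedged sketch: continuous Hopfield relaxation followed by a
# winner-take-all ("maximum network") stage per candidate group.
import numpy as np

def hybrid_hopfield(W, b, n_groups, n_candidates, n_iters=50, lr=0.1):
    """W, b encode a placeholder quadratic energy E(v) = -0.5 v'Wv - b'v."""
    n = n_groups * n_candidates
    v = np.full(n, 0.5)                          # continuous states in (0, 1)
    for _ in range(n_iters):                     # analogue Hopfield relaxation
        u = W @ v + b                            # net input (negative energy gradient)
        v = v + lr * (1.0 / (1.0 + np.exp(-u)) - v)
    # Maximum-network stage: hard one-of-K choice within each group.
    groups = v.reshape(n_groups, n_candidates)
    out = np.zeros_like(groups)
    out[np.arange(n_groups), np.argmax(groups, axis=1)] = 1.0
    return out.reshape(n)
```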
Book Chapter DOI
23 Aug 2012
TL;DR: A hybrid neural network modeling approach is presented and used to model a fed-batch bioreactor; the model comprises a partial first-principles component, which reflects the a priori knowledge, and a neural network component, which serves as a nonparametric approximator of difficult-to-model process parameters.
Abstract: In this paper, a hybrid neural network modeling approach is presented and used to model a fed-batch bioreactor. The hybrid model comprises two parts: a partial first-principles model, which reflects the a priori knowledge, and a neural network component, which serves as a nonparametric approximator of difficult-to-model process parameters. This form of hybrid neural network is useful for modeling processes where a partial model can be derived from simple physical considerations but also includes terms that are difficult to model from first principles. The hybrid model, once learned, can be used for process control and optimization. The concept of combining the proposed multilayer perceptron with first-principles knowledge is a powerful one and extends well beyond the bioreactor problem.
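The grey-box structure described above can be sketched as a first-principles mass balance whose hard-to-model specific growth rate is supplied by a small neural network. The balance equations, yield coefficient, and MLP shape below are generic assumptions, not the paper's model.

```python
# Hedged sketch of a hybrid (first-principles + neural network) fed-batch model.
import numpy as np

def mlp_mu(S, params):
    """Tiny MLP: substrate concentration S -> estimated specific growth rate."""
    W1, b1, W2, b2 = params
    h = np.tanh(W1 * S + b1)                 # hidden layer
    return float(np.dot(W2, h) + b2)         # linear output

def fedbatch_rhs(state, t, F_in, S_in, params):
    """First-principles part: biomass/substrate/volume balances (assumed form)."""
    X, S, V = state
    mu = mlp_mu(S, params)                   # neural-network component
    Y_xs = 0.5                               # assumed yield coefficient
    dX = mu * X - (F_in / V) * X
    dS = -(mu / Y_xs) * X + (F_in / V) * (S_in - S)
    dV = F_in
    return np.array([dX, dS, dV])
```

In practice the MLP parameters would be fitted so that the integrated hybrid model matches measured batch trajectories, while the balance structure stays fixed.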
01 Jan 2005
TL;DR: The Simple Hierarchical Approximation algorithm ('SHA') achieves comparable results in terms of accuracy without the added complexity introduced by the other types of hidden neurons.
Abstract: The approximation algorithm introduced by Asim Roy et al. [1] generates a hybrid neural network with RBF neurons and other types of hidden neurons for function approximation. The network is trained in stages, with RBF neurons at the early stages corresponding to general features in the space and those in later stages corresponding to more specific features. The other types of hidden neurons are added with a view to improving generalization and reducing the number of RBF neurons. The algorithm uses linear programming to design and train the hybrid network. We investigate simplifying the algorithm with a view to eliminating the need for the other types of hidden neurons and linear programming. The Simple Hierarchical Approximation algorithm ('SHA') achieves comparable results in terms of accuracy without the added complexity introduced by the other types of hidden neurons.
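A minimal sketch of staged (coarse-to-fine) RBF approximation in the spirit of the description above: each stage fits the residual of the previous one with narrower kernels. The stage widths, centre placement, and least-squares fit are illustrative simplifications; neither the original linear-programming formulation nor the SHA algorithm itself is reproduced here.

```python
# Hedged sketch of staged RBF function approximation (coarse-to-fine).
import numpy as np

def rbf_design(x, centres, width):
    """Gaussian RBF design matrix for 1-D inputs."""
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))

def staged_rbf_fit(x, y, widths=(1.0, 0.3, 0.1), n_centres=10):
    residual = y.copy()
    stages = []
    for width in widths:                     # early stages: wide (general) kernels
        centres = np.linspace(x.min(), x.max(), n_centres)
        Phi = rbf_design(x, centres, width)
        w, *_ = np.linalg.lstsq(Phi, residual, rcond=None)  # least squares, not LP
        residual = residual - Phi @ w        # later stages fit what remains
        stages.append((centres, width, w))
    return stages

def staged_rbf_predict(x, stages):
    return sum(rbf_design(x, c, s) @ w for c, s, w in stages)
```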
01 Jan 2012
TL;DR: In this paper, an improved artificial bee colony algorithm-based back-propagation neural network training method is proposed for a fast and improved convergence rate of the hybrid neural network learning method.
Abstract: The back-propagation algorithm is one of the most widely used and popular techniques to optimize feed-forward neural network training. Nature-inspired meta-heuristic algorithms also provide derivative-free solutions for optimizing complex problems. The artificial bee colony algorithm is a nature-inspired meta-heuristic algorithm that mimics the foraging (food-source searching) behaviour of bees in a colony, and it has been implemented in several applications for improved optimization outcomes. The method proposed in this paper is an improved artificial bee colony algorithm-based back-propagation neural network training method that gives fast and improved convergence of the hybrid neural network learning method. The result is analysed against the genetic algorithm-based back-propagation method, another hybridized procedure of its kind. The analysis is performed over standard data sets, demonstrating the efficiency of the proposed method in terms of convergence speed and rate.
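A simplified sketch of the general idea: an artificial bee colony (ABC) style search over the weight space of a small feed-forward network, using an employed-bee phase with greedy selection. Colony size, network shape, and loss are assumptions; the paper's improved ABC variant and its coupling with back-propagation are not reproduced here.

```python
# Hedged sketch: ABC-style derivative-free search for feed-forward net weights.
import numpy as np

rng = np.random.default_rng(0)

def forward(w, X, n_hidden=5):
    """Flattened weight vector -> predictions of a 1-hidden-layer net."""
    d = X.shape[1]
    W1 = w[: d * n_hidden].reshape(d, n_hidden)
    b1 = w[d * n_hidden : d * n_hidden + n_hidden]
    W2 = w[d * n_hidden + n_hidden : -1]
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w, X, y):
    return float(np.mean((forward(w, X) - y) ** 2))

def abc_train(X, y, n_food=20, n_iters=200):
    dim = X.shape[1] * 5 + 5 + 5 + 1                     # matches forward()'s layout
    foods = rng.normal(size=(n_food, dim))               # candidate weight vectors
    costs = np.array([mse(w, X, y) for w in foods])
    for _ in range(n_iters):
        for i in range(n_food):                          # employed-bee phase
            k = rng.integers(n_food)                     # random neighbour source
            phi = rng.uniform(-1, 1, dim)
            trial = foods[i] + phi * (foods[i] - foods[k])
            c = mse(trial, X, y)
            if c < costs[i]:                             # greedy selection
                foods[i], costs[i] = trial, c
    return foods[np.argmin(costs)]                       # best weight vector found
```

The returned weight vector could then be refined further with ordinary gradient-based back-propagation, which is the hybridization the abstract describes.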

Network Information
Related Topics (5)
Artificial neural network: 207K papers, 4.5M citations (89% related)
Feature extraction: 111.8K papers, 2.1M citations (88% related)
Fuzzy logic: 151.2K papers, 2.3M citations (85% related)
Convolutional neural network: 74.7K papers, 2M citations (84% related)
Deep learning: 79.8K papers, 2.1M citations (83% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    3
2022    8
2021    128
2020    119
2019    104
2018    63