
Concentrating information in time: analog neural networks with applications to speech recognition problems.

About
The article was published on 1987-12-01 and is currently open access. It has received 41 citations to date. The article focuses on the topics Time delay neural network and Neocognitron.


Citations
Book

Connectionist Speech Recognition: A Hybrid Approach

TL;DR: Connectionist Speech Recognition: A Hybrid Approach describes the theory and implementation of a method to incorporate neural network approaches into state-of-the-art continuous speech recognition systems based on Hidden Markov Models (HMMs) to improve their performance.
Journal ArticleDOI

Gradient calculations for dynamic recurrent neural networks: a survey

TL;DR: The author discusses advantages and disadvantages of temporally continuous neural networks in contrast to clocked ones and presents some "tricks of the trade" for training, using, and simulating continuous time and recurrent neural networks.
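The core computation surveyed above, the gradient of a recurrent network unrolled in time, can be sketched for a clocked, single-unit network. This is an illustrative backpropagation-through-time (BPTT) example, not code from the survey; the network h_t = tanh(w*h_{t-1} + u*x_t), the toy sequence, and all names are assumptions for demonstration.

```python
import math

# Forward pass of a one-unit clocked recurrent network:
# h_t = tanh(w * h_{t-1} + u * x_t), starting from h_0 = 0.
def forward(w, u, xs):
    hs = [0.0]
    for x in xs:
        hs.append(math.tanh(w * hs[-1] + u * x))
    return hs

# BPTT for the loss L = 0.5 * (h_T - target)^2: walk the unrolled
# time steps backwards, accumulating gradients for w and u.
def bptt_grad(w, u, xs, target):
    hs = forward(w, u, xs)
    dL_dh = hs[-1] - target               # dL/dh_T
    gw = gu = 0.0
    for t in range(len(xs), 0, -1):
        dz = dL_dh * (1 - hs[t] ** 2)     # through tanh
        gw += dz * hs[t - 1]
        gu += dz * xs[t - 1]
        dL_dh = dz * w                    # propagate to h_{t-1}
    return gw, gu

xs, target, w, u = [0.5, -0.3, 0.8], 0.2, 0.4, 0.7
gw, gu = bptt_grad(w, u, xs, target)

# Sanity check: central finite difference of dL/dw.
eps = 1e-6
loss = lambda w_: 0.5 * (forward(w_, u, xs)[-1] - target) ** 2
fd = (loss(w + eps) - loss(w - eps)) / (2 * eps)
```

The finite-difference check is one of the standard "tricks of the trade" the survey alludes to for verifying hand-derived recurrent gradients.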
Journal ArticleDOI

Review of neural networks for speech recognition

TL;DR: Further work is necessary for large-vocabulary continuous-speech problems, to develop training algorithms that progressively build internal word models, and to develop compact VLSI neural net hardware.
Journal ArticleDOI

Links between Markov models and multilayer perceptrons

TL;DR: It is shown theoretically and experimentally that the outputs of the MLP approximate the probability distribution over output classes conditioned on the input, i.e. the maximum a posteriori probabilities.
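The result summarized above can be illustrated with a toy example (a hedged sketch, not code from the paper): a classifier trained with cross-entropy loss converges toward the class posteriors P(class | input). Here a single input value is seen with class 1 in three of four cases, so the trained output should approach the posterior 0.75; the data and all names are illustrative assumptions.

```python
import math

# Toy data: one input value, labeled class 1 in 3 of 4 cases,
# so the true posterior P(class=1 | x) is 0.75.
data = [(1.0, 1), (1.0, 1), (1.0, 1), (1.0, 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = 0.0, 0.0, 0.5
for _ in range(5000):
    gw = gb = 0.0
    for x, y in data:
        p = sigmoid(w * x + b)
        gw += (p - y) * x    # gradient of cross-entropy w.r.t. w
        gb += (p - y)        # gradient of cross-entropy w.r.t. b
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

p = sigmoid(w * 1.0 + b)     # converges toward the posterior, ~0.75
```

The minimum of the cross-entropy loss is reached exactly when the output equals the empirical label frequency for each input, which is the mechanism behind the MAP-probability interpretation.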
Journal ArticleDOI

Original Contribution: The gamma model - A new neural model for temporal processing

TL;DR: The gamma model is a neural network architecture for processing temporal patterns in which only current signal values are presented to the network; the network adapts its own internal memory to store the past.
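The adaptive memory described above is built from a cascade of leaky integrators, the gamma memory. The recursion below follows the published gamma-memory form (tap 0 holds the current input; each later tap leaks toward the one before it), but the function names, parameter value, and test signal are illustrative assumptions, not code from the paper.

```python
# One time step of a gamma memory: taps[0] is the input tap, and
# x_k(t) = (1 - mu) * x_k(t-1) + mu * x_{k-1}(t-1) for k >= 1.
def gamma_memory_step(taps, u, mu=0.5):
    new = [u]
    for k in range(1, len(taps)):
        new.append((1 - mu) * taps[k] + mu * taps[k - 1])
    return new

# Feed an impulse followed by zeros: deeper taps respond later and
# more diffusely, storing a decaying trace of the past even though
# only the current value u is ever presented to the network.
taps = [0.0, 0.0, 0.0]
history = []
for u in [1.0, 0.0, 0.0, 0.0, 0.0]:
    taps = gamma_memory_step(taps, u, mu=0.5)
    history.append(list(taps))
```

With mu = 0.5, tap 1 peaks one step after the impulse and tap 2 peaks later and lower, which is the gamma-shaped family of impulse responses that gives the model its name.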