Proceedings ArticleDOI

Connectionist temporal classification: labelling unsegmented sequence data with recurrent neural networks

TL;DR: This paper presents a novel method for training RNNs to label unsegmented sequences directly, removing both the need for pre-segmented training data and the need to post-process network outputs into label sequences.
Abstract
Many real-world sequence learning tasks require the prediction of sequences of labels from noisy, unsegmented input data. In speech recognition, for example, an acoustic signal is transcribed into words or sub-word units. Recurrent neural networks (RNNs) are powerful sequence learners that would seem well suited to such tasks. However, because they require pre-segmented training data, and post-processing to transform their outputs into label sequences, their applicability has so far been limited. This paper presents a novel method for training RNNs to label unsegmented sequences directly, thereby solving both problems. An experiment on the TIMIT speech corpus demonstrates its advantages over both a baseline HMM and a hybrid HMM-RNN.
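
As a concrete illustration of the training setup the abstract describes, here is a minimal sketch using PyTorch's nn.CTCLoss, a later library implementation of this criterion; the network, feature dimensions, and data are illustrative assumptions, not the paper's original configuration.

# Minimal sketch: training an RNN with a CTC criterion.
# All shapes and names are illustrative assumptions.
import torch
import torch.nn as nn

T, N, C = 50, 4, 20          # input frames, batch size, classes incl. blank
S = 10                       # maximum target (label) sequence length

rnn = nn.LSTM(input_size=13, hidden_size=64, bidirectional=True)
proj = nn.Linear(128, C)     # map RNN features to per-frame class scores
ctc = nn.CTCLoss(blank=0)    # index 0 is reserved for the CTC "blank"

x = torch.randn(T, N, 13)                    # e.g. acoustic feature frames
h, _ = rnn(x)                                # (T, N, 128)
log_probs = proj(h).log_softmax(dim=-1)      # (T, N, C)

targets = torch.randint(1, C, (N, S))        # unsegmented label sequences
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.randint(1, S + 1, (N,), dtype=torch.long)

loss = ctc(log_probs, targets, input_lengths, target_lengths)
loss.backward()    # no frame-level alignment or pre-segmentation is supplied

The point that matches the abstract: the targets are plain label sequences, and the loss marginalizes over all possible alignments internally, so neither pre-segmented training data nor output post-processing is required.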


Citations
Proceedings ArticleDOI

End-To-End Named Entity And Semantic Concept Extraction From Speech

TL;DR: An end-to-end approach that directly extracts named entities from speech through a unique neural architecture; applied to semantic concept extraction via a slot-filling task (a spoken language understanding problem), the approach also shows an improvement over a pipeline approach.
Proceedings ArticleDOI

DeepSniffer: A DNN Model Extraction Framework Based on Learning Architectural Hints

TL;DR: DeepSniffer is a learning-based model extraction framework that obtains complete model architecture information without any prior knowledge of the victim model, and is robust to the architectural and system noise introduced by the complex memory hierarchy and diverse run-time system optimizations.
Journal ArticleDOI

Neural Networks for Modeling and Control of Particle Accelerators

TL;DR: The authors describe some of the challenges of particle accelerator control, highlight recent advances in neural network techniques, discuss promising avenues for incorporating neural networks into particle accelerator control systems, and describe a neural network-based control system being developed for resonance control of an RF electron gun at the Fermilab Accelerator Science and Technology (FAST) facility.
Proceedings ArticleDOI

Forward Attention in Sequence-to-Sequence Acoustic Modeling for Speech Synthesis

TL;DR: Experimental results show that the proposed forward attention method achieves faster convergence and higher stability than the baseline attention method, and can also help improve the naturalness of synthetic speech and control its speed effectively.
Proceedings ArticleDOI

Evaluating Sequence-to-Sequence Models for Handwritten Text Recognition

TL;DR: An attention-based sequence-to-sequence model that combines a convolutional neural network as a generic feature extractor with a recurrent neural network that encodes both the visual information and the temporal context between characters in the input image, and uses a separate recurrent network to decode the actual character sequence.
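
A skeleton of the architecture this summary describes, sketched in PyTorch with a simple dot-product attention; the layer sizes, pooling scheme, and zero "start token" convention are assumptions, not the paper's exact configuration.

# Skeleton: CNN feature extractor + recurrent encoder + attention decoder.
import torch
import torch.nn as nn

class AttnSeq2SeqHTR(nn.Module):
    def __init__(self, n_chars, feat=64, hid=128):
        super().__init__()
        self.cnn = nn.Sequential(                  # generic visual features
            nn.Conv2d(1, feat, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, None)))       # collapse image height
        self.encoder = nn.LSTM(feat, hid, bidirectional=True)
        self.decoder = nn.LSTMCell(n_chars + 2 * hid, hid)
        self.attn = nn.Linear(hid, 2 * hid)
        self.out = nn.Linear(hid, n_chars)

    def forward(self, img, max_len=30):           # img: (N, 1, H, W)
        f = self.cnn(img).squeeze(2).permute(2, 0, 1)   # (W, N, feat)
        enc, _ = self.encoder(f)                        # (W, N, 2*hid)
        N = img.size(0)
        h = c = enc.new_zeros(N, self.decoder.hidden_size)
        y = enc.new_zeros(N, self.out.out_features)     # zero "start" token
        outputs = []
        for _ in range(max_len):
            # dot-product attention over encoder positions
            scores = (enc * self.attn(h)).sum(-1).softmax(dim=0)  # (W, N)
            context = (scores.unsqueeze(-1) * enc).sum(0)         # (N, 2*hid)
            h, c = self.decoder(torch.cat([y, context], dim=-1), (h, c))
            y = self.out(h).softmax(dim=-1)
            outputs.append(y)
        return torch.stack(outputs)               # (max_len, N, n_chars)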
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units.
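
To make the "constant error carousel" concrete, here is a sketch of one LSTM step in plain PyTorch tensor operations, using the modern formulation with a forget gate; the parameter stacking and names are assumptions, not the paper's notation.

# One LSTM step; the cell state c is the "constant error carousel".
import torch

def lstm_step(x, h, c, W, U, b):
    # x: (batch, in), h/c: (batch, hid) previous hidden and cell state,
    # W: (in, 4*hid), U: (hid, 4*hid), b: (4*hid,) stacked gate parameters.
    gates = x @ W + h @ U + b
    i, f, g, o = gates.chunk(4, dim=-1)           # input, forget, cell, output
    i, f, o = i.sigmoid(), f.sigmoid(), o.sigmoid()
    c_next = f * c + i * g.tanh()   # additive update: error signals can pass
                                    # through c nearly unchanged across steps
    h_next = o * c_next.tanh()
    return h_next, c_next

The additive cell-state update is what lets gradients bridge long time lags without vanishing.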
Journal ArticleDOI

A tutorial on hidden Markov models and selected applications in speech recognition

TL;DR: In this paper, the authors provide an overview of the basic theory of hidden Markov models (HMMs) as originated by L.E. Baum and T. Petrie (1966) and give practical details on methods of implementation of the theory along with a description of selected applications of HMMs to distinct problems in speech recognition.
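
One of the tutorial's basic problems, computing the likelihood of an observation sequence, is solved by the forward algorithm; a NumPy sketch with assumed toy shapes:

# Forward algorithm: P(observations | HMM).
import numpy as np

def forward(pi, A, B, obs):
    # pi: (N,) initial state probs, A: (N, N) transition probs,
    # B: (N, M) emission probs, obs: sequence of observation indices.
    alpha = pi * B[:, obs[0]]              # initialization
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]      # induction over time
    return alpha.sum()                     # termination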
Book

Neural networks for pattern recognition

TL;DR: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.
Proceedings Article

Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data

TL;DR: This work presents iterative parameter estimation algorithms for conditional random fields and compares the performance of the resulting models to HMMs and MEMMs on synthetic and natural-language data.
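
To show what a linear-chain CRF computes, here is a NumPy sketch of the log-probability of a label sequence: its unnormalized score minus the log partition function from a forward pass. Names and shapes are illustrative; this is not the paper's estimation code.

# Linear-chain CRF: log p(labels | inputs) from emission/transition scores.
import numpy as np

def logsumexp(x, axis):
    # numerically stable log-sum-exp along the given axis
    m = x.max(axis=axis, keepdims=True)
    return (m + np.log(np.exp(x - m).sum(axis=axis, keepdims=True))).squeeze(axis)

def crf_log_prob(emissions, transitions, labels):
    # emissions: (T, K) per-position label scores, transitions: (K, K).
    score = emissions[0, labels[0]]
    for t in range(1, len(labels)):
        score += transitions[labels[t - 1], labels[t]] + emissions[t, labels[t]]
    # log partition function via the forward recursion in log space
    log_alpha = emissions[0]
    for t in range(1, emissions.shape[0]):
        log_alpha = emissions[t] + logsumexp(log_alpha[:, None] + transitions, axis=0)
    return score - logsumexp(log_alpha, axis=0)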