Proceedings ArticleDOI

Text recognition using deep BLSTM networks

TL;DR: A Deep Bidirectional Long Short Term Memory (LSTM) based Recurrent Neural Network architecture for text recognition that uses Connectionist Temporal Classification (CTC) training to learn the labels of an unsegmented sequence with unknown alignment.
Abstract
This paper presents a Deep Bidirectional Long Short Term Memory (LSTM) based Recurrent Neural Network architecture for text recognition. The architecture uses Connectionist Temporal Classification (CTC) training to learn the labels of an unsegmented sequence with unknown alignment. The work is motivated by the results of Deep Neural Networks on isolated numeral recognition and of Deep BLSTM based approaches on speech recognition. The Deep BLSTM architecture is chosen for its ability to access long-range context and learn sequence alignment without requiring segmented data. Because CTC and the forward-backward algorithm align the output labels, there are no Unicode re-ordering issues, and hence no need for a lexicon or post-processing schemes. The approach is script-independent and segmentation-free. The system has been implemented for the recognition of unsegmented words of printed Oriya text, where it achieves a 4.18% character error rate and a 12.11% word error rate.
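For illustration, a minimal PyTorch sketch of this kind of pipeline is shown below: a stacked bidirectional LSTM whose per-frame outputs are trained with CTC loss. The layer count, feature width, and alphabet size here are assumptions chosen for demonstration, not the paper's actual configuration.

import torch
import torch.nn as nn

class DeepBLSTM(nn.Module):
    def __init__(self, n_features=32, n_hidden=128, n_layers=3, n_classes=64):
        super().__init__()
        # Stacked bidirectional LSTM gives access to left and right context.
        self.blstm = nn.LSTM(n_features, n_hidden, num_layers=n_layers,
                             bidirectional=True)
        # Linear layer maps to character classes plus one CTC blank (index 0).
        self.proj = nn.Linear(2 * n_hidden, n_classes + 1)

    def forward(self, x):                      # x: (time, batch, features)
        h, _ = self.blstm(x)
        return self.proj(h).log_softmax(dim=-1)

model = DeepBLSTM()
ctc_loss = nn.CTCLoss(blank=0)                 # forward-backward alignment inside

x = torch.randn(100, 4, 32)                    # 100 frames, batch of 4 word images
targets = torch.randint(1, 65, (4, 10))        # label indices 1..64 (0 = blank)
log_probs = model(x)
loss = ctc_loss(log_probs, targets,
                torch.full((4,), 100, dtype=torch.long),
                torch.full((4,), 10, dtype=torch.long))
loss.backward()                                # CTC learns alignment; no segmentation needed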


Citations
Journal ArticleDOI

Tool wear prediction using convolutional bidirectional LSTM networks

TL;DR: Experimental results indicate that the HLLSTM can reduce the mean absolute error of the real tool wear value two-fold and accurately predict tool wear.
Journal ArticleDOI

Power entity recognition based on bidirectional long short-term memory and conditional random fields

TL;DR: The results indicated that the CRF model, with an accuracy of 83%, identifies power entities better than the BLSTM and can thus be applied to entity extraction for knowledge graph construction in the power field.
Journal ArticleDOI

A Graph Convolutional Stacked Bidirectional Unidirectional-LSTM Neural Network for Metro Ridership Prediction

TL;DR: Zhang et al. proposed a parallel-structured deep learning model for metro ridership prediction that consists of a graph convolutional network and a stacked bidirectional-unidirectional long short-term memory network (GCN-SBULSTM).
Journal ArticleDOI

Recognizing Bangla Handwritten Numeral Utilizing Deep Long Short Term Memory

TL;DR: The proposed Bangla handwritten numeral recognizer with deep LSTM (BHNR-DLSTM) first normalizes the numeral images and then uses two LSTM layers to classify individual numerals, showing satisfactory recognition accuracy and outperforming other prominent existing methods.
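A minimal sketch of that kind of pipeline (sizes assumed, not taken from the paper): a two-layer LSTM reads the normalized numeral image row by row and a linear layer classifies the final hidden state into ten digits.

import torch
import torch.nn as nn

class LSTMDigitClassifier(nn.Module):
    def __init__(self, row_width=28, n_hidden=64, n_digits=10):
        super().__init__()
        # Two stacked LSTM layers over the rows of the image.
        self.lstm = nn.LSTM(row_width, n_hidden, num_layers=2, batch_first=True)
        self.fc = nn.Linear(n_hidden, n_digits)

    def forward(self, images):                 # images: (batch, rows, row_width)
        _, (h_n, _) = self.lstm(images)
        return self.fc(h_n[-1])                # classify from the top layer's final state

logits = LSTMDigitClassifier()(torch.randn(8, 28, 28))   # 8 normalized 28x28 images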
Proceedings ArticleDOI

Towards Situational Awareness of Botnet Activity in the Internet of Things

TL;DR: The proposed model addresses the detection problem and returns high accuracy and low loss for four attack vectors used by the Mirai botnet malware, with only one attack vector proving difficult to detect and predict.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
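In standard modern notation (not the 1997 paper's original formulation; the forget gate $f_t$ was a later addition), the LSTM update is

\[
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i), \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f), \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o), \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c), \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, \\
h_t &= o_t \odot \tanh(c_t).
\end{aligned}
\]

With $f_t \equiv 1$ the cell state accumulates additively, which is the "constant error carousel" that keeps error flow constant across long time lags.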
Journal ArticleDOI

A fast learning algorithm for deep belief nets

TL;DR: A fast, greedy algorithm is derived that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory.
Proceedings ArticleDOI

Speech recognition with deep recurrent neural networks

TL;DR: This paper investigates deep recurrent neural networks, which combine the multiple levels of representation that have proved so effective in deep networks with the flexible use of long range context that empowers RNNs.
Proceedings ArticleDOI

Connectionist temporal classification: labelling unsegmented sequence data with recurrent neural networks

TL;DR: This paper presents a novel method for training RNNs to label unsegmented sequences directly, thereby removing the need for both pre-segmented training data and post-processing of outputs.
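As a minimal sketch of how CTC output is read out, best-path (greedy) decoding takes the per-frame argmax, collapses adjacent repeats, and drops blanks; blank index 0 is an assumption here, matching the training sketch above.

def ctc_greedy_decode(frame_scores, blank=0):
    # frame_scores: list of per-frame score lists over the label alphabet.
    best_path = [frame.index(max(frame)) for frame in frame_scores]
    decoded, prev = [], None
    for label in best_path:
        # Emit a label only when it differs from the previous path step
        # and is not the blank; blanks separate genuine repeats.
        if label != blank and label != prev:
            decoded.append(label)
        prev = label
    return decoded

# Example: the path [0, 5, 5, 0, 3, 3, 0] collapses to [5, 3].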