Ahmad Emami

Researcher at Johns Hopkins University

Publications - 12
Citations - 313

Ahmad Emami is an academic researcher from Johns Hopkins University. His research contributes to topics including language models and perplexity. He has an h-index of 8 and has co-authored 12 publications receiving 310 citations.

Papers
Journal ArticleDOI

A neural syntactic language model

TL;DR: The neural syntactic language model achieves the best published perplexity and WER results for the given data sets; comparisons with standard and neural-network N-gram models with arbitrarily long contexts show that syntactic information is in fact very helpful in estimating the word-string probability.
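
A minimal sketch of the idea, not the paper's model: each word is predicted from the two exposed syntactic head words of its prefix (here supplied by a toy, hand-written "parser") rather than from the two preceding words. The vocabulary, embeddings, and head sequence below are all illustrative assumptions.

```python
import numpy as np

vocab = {"<s>": 0, "the": 1, "dog": 2, "barked": 3, "loudly": 4}
V, D = len(vocab), 16
rng = np.random.default_rng(0)
E = rng.normal(scale=0.1, size=(V, D))      # head-word embeddings (untrained)
W = rng.normal(scale=0.1, size=(2 * D, V))  # predictor weights (untrained)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def p_next(h1, h2):
    """P(. | exposed heads h1, h2): concatenate head embeddings, then softmax."""
    x = np.concatenate([E[vocab[h1]], E[vocab[h2]]])
    return softmax(x @ W)

# Hypothetical parser output: the exposed heads before each position of
# "the dog barked loudly".
sent = ["the", "dog", "barked", "loudly"]
heads = [("<s>", "<s>"), ("<s>", "the"), ("<s>", "dog"), ("<s>", "barked")]

# Word-string probability = product of the head-conditioned word probabilities.
logp = sum(np.log(p_next(*h)[vocab[w]]) for h, w in zip(heads, sent))
print(f"log P(sentence) = {logp:.3f}")
```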
Proceedings ArticleDOI

Unfolded recurrent neural networks for speech recognition.

TL;DR: These models are feedforward networks in which the unfolded layers corresponding to the recurrent layer have time-shifted inputs and tied weight matrices; they can be implemented efficiently through matrix-matrix operations on GPU architectures, which makes them scalable to large tasks.
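
A minimal sketch of that construction (not the authors' code): the recurrence is unrolled into a fixed number of feedforward layers that share one weight matrix and receive time-shifted inputs, so a whole minibatch can be processed with dense matrix-matrix products. The unfolding depth and dimensions below are assumptions.

```python
import numpy as np

K = 4                # number of unfolded steps (assumed; a tunable choice)
D_in, D_h = 64, 128  # input and hidden dimensions (assumed)
rng = np.random.default_rng(0)

W_xh = rng.normal(scale=0.1, size=(D_in, D_h))  # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(D_h, D_h))   # tied across all unfolded layers

def unfolded_forward(x_window):
    """x_window: (batch, K, D_in) -- the K most recent, time-shifted inputs."""
    h = np.zeros((x_window.shape[0], D_h))
    for t in range(K):  # K unfolded feedforward layers, one per time shift
        h = np.tanh(x_window[:, t] @ W_xh + h @ W_hh)  # same W_hh every layer
    return h

batch = rng.normal(size=(32, K, D_in))
print(unfolded_forward(batch).shape)  # (32, 128)
```

Because every step is a plain matrix-matrix product over the batch, the unrolled network maps directly onto GPU GEMM kernels, which is the scalability point the summary makes.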
Proceedings ArticleDOI

Using a connectionist model in a syntactical based language model

TL;DR: This work investigates the performance of the Structured Language Model (SLM) when one of its components is modeled by a connectionist model, and shows that the resulting probability distribution is much less correlated with regular N-grams than that of the baseline SLM.
Proceedings ArticleDOI

Large Scale Hierarchical Neural Network Language Models.

TL;DR: Experimental results show how much the WER can be improved by increasing the scale of the NNLM in terms of model size and training data; however, training time can become very long.
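
A minimal sketch of the class factorization commonly used to make large-vocabulary NNLM output layers tractable; this is an assumed illustration, not the paper's implementation. The idea is to replace one |V|-way softmax with a |C|-way class softmax plus a small within-class softmax: P(w | h) = P(c(w) | h) * P(w | c(w), h).

```python
import numpy as np

V, C, D = 10_000, 100, 128  # vocab size, class count, hidden dim (assumed)
rng = np.random.default_rng(0)
word2class = rng.integers(0, C, size=V)       # assumed random class assignment

W_class = rng.normal(scale=0.01, size=(D, C))  # hidden -> class logits
W_word = rng.normal(scale=0.01, size=(D, V))   # hidden -> within-class logits

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def log_prob(hidden, w):
    """log P(w | history) for one hidden state, via the class factorization."""
    c = word2class[w]
    p_class = softmax(hidden @ W_class)[c]
    members = np.flatnonzero(word2class == c)  # words sharing w's class
    p_word = softmax(hidden @ W_word[:, members])[np.searchsorted(members, w)]
    return np.log(p_class) + np.log(p_word)

h = rng.normal(size=D)
print(log_prob(h, w=1234))
```

Each probability now costs O(C + V/C) rather than O(V), which is what lets the model scale before training time, rather than the output layer, becomes the bottleneck.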
Proceedings ArticleDOI

Random clusterings for language modeling

TL;DR: Experimental results show that the combined random class-based model improves considerably in perplexity (PPL) and word error rate (WER) over both the n-gram and baseline class-based models.
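
A minimal sketch of that combination under assumed details (toy corpus, add-one smoothing, uniform interpolation): several class-based bigram models are built, each on a different random clustering of the vocabulary, and their probabilities are averaged.

```python
import random
from collections import Counter

# Toy corpus; the paper's experiments use real LM training data.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))

def random_clustering(num_classes, seed):
    """Assign each vocabulary word to a uniformly random class."""
    rng = random.Random(seed)
    return {w: rng.randrange(num_classes) for w in vocab}

def class_bigram_model(cl, num_classes):
    """Class-based bigram: P(w | v) = P(c(w) | c(v)) * P(w | c(w))."""
    cls_big = Counter((cl[v], cl[w]) for v, w in zip(corpus, corpus[1:]))
    cls_uni = Counter(cl[w] for w in corpus)
    word_cnt = Counter(corpus)

    def prob(v, w):
        cv, cw = cl[v], cl[w]
        p_trans = (cls_big[(cv, cw)] + 1) / (cls_uni[cv] + num_classes)  # add-1
        p_emit = word_cnt[w] / cls_uni[cw]
        return p_trans * p_emit

    return prob

# Build several models on different random clusterings and average them.
NUM_CLASSES, NUM_MODELS = 3, 5
models = [class_bigram_model(random_clustering(NUM_CLASSES, s), NUM_CLASSES)
          for s in range(NUM_MODELS)]
p = sum(m("the", "cat") for m in models) / NUM_MODELS
print(f"combined P(cat | the) = {p:.4f}")
```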