Xunying Liu
Researcher at The Chinese University of Hong Kong
Publications - 207
Citations - 3652
Xunying Liu is an academic researcher at The Chinese University of Hong Kong. The author has contributed to research topics including computer science and word error rate, has an h-index of 27, and has co-authored 171 publications receiving 2531 citations. Previous affiliations of Xunying Liu include the University of Sheffield and the University of Cambridge.
Papers
Proceedings Article
The MGB challenge: Evaluating multi-genre broadcast media recognition
Peter Bell, Mark J. F. Gales, Thomas Hain, Jonathan Kilgour, Pierre Lanchantin, Xunying Liu, A. McParland, Steve Renals, Oscar Saz, Mirjam Wester, Philip C. Woodland +10 more
TL;DR: An evaluation focused on speech recognition, speaker diarization, and "lightly supervised" alignment of BBC TV recordings at ASRU 2015 is described, and the results obtained are summarized.
Proceedings Article
Recurrent neural network language model adaptation for multi-genre broadcast speech recognition
Xie Chen, Tian Tan, Xunying Liu, Pierre Lanchantin, Moquan Wan, Mark J. F. Gales, Philip C. Woodland +6 more
TL;DR: Experiments using a state-of-the-art LVCSR system showed that adaptation could yield relative perplexity reductions of 8% over the baseline RNNLM, together with small but consistent word error rate reductions.
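One common form of RNNLM adaptation, in this setting, is to feed an auxiliary genre or topic feature vector into the network alongside each word embedding. The minimal sketch below illustrates that idea only; the class name, dimensions, and the exact way the paper injects the adaptation features are assumptions, and the paper's actual scheme may differ.

```python
import torch
import torch.nn as nn

class AdaptedRNNLM(nn.Module):
    """Toy RNNLM with an auxiliary genre vector appended to each word
    embedding. All names and dimensions here are illustrative."""

    def __init__(self, vocab_size, embed_dim=256, genre_dim=32, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # The recurrent layer sees the word embedding plus the genre vector.
        self.rnn = nn.GRU(embed_dim + genre_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, words, genre_vec, hidden=None):
        # words: (batch, seq_len) word ids; genre_vec: (batch, genre_dim)
        emb = self.embed(words)
        # Broadcast the utterance-level genre vector over every time step.
        g = genre_vec.unsqueeze(1).expand(-1, emb.size(1), -1)
        out, hidden = self.rnn(torch.cat([emb, g], dim=-1), hidden)
        return self.out(out), hidden
```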
Proceedings Article
Efficient lattice rescoring using recurrent neural network language models
TL;DR: Two novel lattice rescoring methods for RNNLMs are investigated: one uses an n-gram-style clustering of history contexts, and the other exploits a distance measure between hidden history vectors.
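The n-gram-style clustering works by treating two histories as equivalent when they share their most recent n-1 words, so the RNN state is computed once per cluster and cached; sharing that cached state across different full histories is exactly the approximation the method makes. The sketch below shows the idea on a single path, assuming a `step(state, word) -> (next_word_log_probs, next_state)` interface; real lattice toolkits and state handling differ, and all names here are illustrative.

```python
import math

def score_path(path, step, ngram=4, bos=0, cache=None):
    """Score one path with an RNNLM, merging histories that share their
    last (n-1) words so each cluster's RNN state is computed only once."""
    if cache is None:
        cache = {}
    log_probs, state = step(None, bos)  # condition on the sentence start
    total = 0.0
    history = [bos]
    for w in path:
        total += log_probs[w]
        history.append(w)
        key = tuple(history[-(ngram - 1):])  # n-gram truncated history
        if key not in cache:
            cache[key] = step(state, w)  # advance once per history cluster
        log_probs, state = cache[key]
    return total

# Toy usage with a dummy model: uniform next-word distribution, no state.
V = 10
toy_step = lambda state, word: ([math.log(1.0 / V)] * V, None)
print(score_path([3, 1, 4], toy_step))  # -> 3 * log(1/10)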
Proceedings Article
Improved Neural Network Based Language Modelling and Adaptation
TL;DR: A novel NNLM adaptation method using a cascaded network is proposed; consistent WER reductions over conventional NNLMs were obtained on a state-of-the-art Arabic LVCSR task.
Proceedings Article
Recurrent neural network language model training with noise contrastive estimation for speech recognition
TL;DR: Noise contrastive estimation (NCE) is explored for RNNLM training; because NCE is insensitive to the output layer size, it yields a doubling of training speed on a GPU and a 56-times speed-up in test-time evaluation on a CPU.
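NCE trains the language model as a binary classifier that separates each observed word from k samples drawn from a noise distribution, so only 1+k words are scored per step instead of normalising over the whole vocabulary. The sketch below is a minimal version of that loss for an RNNLM output layer; the function signature, variable names, and noise distribution are assumptions, and the paper's exact recipe (for example, how the normalisation term is constrained) may differ.

```python
import math
import torch
import torch.nn.functional as F

def nce_loss(hidden, out_embed, out_bias, targets, noise_log_probs, k=100):
    """hidden: (batch, dim) RNN outputs; out_embed: (vocab, dim) output
    embeddings; out_bias: (vocab,); targets: (batch,) true next words;
    noise_log_probs: (vocab,) log q(w) of the noise distribution.
    Only the target word and k noise samples are scored per example,
    which is why the cost does not grow with the vocabulary size."""
    batch = targets.size(0)
    noise = torch.multinomial(noise_log_probs.exp(), batch * k,
                              replacement=True).view(batch, k)
    words = torch.cat([targets.unsqueeze(1), noise], dim=1)  # (batch, 1+k)
    # Unnormalised scores s(w|h) for just these 1 + k words per example.
    s = torch.bmm(out_embed[words], hidden.unsqueeze(2)).squeeze(2) \
        + out_bias[words]
    logit = s - (math.log(k) + noise_log_probs[words])
    # Logistic regression: data word labelled 1, noise samples labelled 0.
    return -(F.logsigmoid(logit[:, 0]).mean()
             + F.logsigmoid(-logit[:, 1:]).sum(dim=1).mean())
```

At test time, NCE-trained models are typically used with the unnormalised score s(w|h) directly, on the grounds that training keeps the softmax normalisation term roughly constant; skipping the full-vocabulary softmax is generally how the reported test-time CPU speed-up is obtained.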