
Yangyang Shi

Researcher at Facebook

Publications - 162
Citations - 1820

Yangyang Shi is an academic researcher from Facebook. The author has contributed to research in the topics of Computer science and Medicine. The author has an h-index of 14 and has co-authored 61 publications receiving 1157 citations. Previous affiliations of Yangyang Shi include Beijing University of Civil Engineering and Architecture, and Microsoft.

Papers
Proceedings ArticleDOI

Spoken language understanding using long short-term memory neural networks

TL;DR: This paper investigates long short-term memory (LSTM) neural networks, which contain input, output, and forgetting gates and are more sophisticated than simple RNNs, for the word-labeling task, and proposes a regression model on top of the LSTM's un-normalized scores to explicitly model output-label dependence.
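
As a rough illustration only, and not the authors' implementation, the minimal PyTorch sketch below shows the general shape of an LSTM word-labeling (slot-filling) model that emits un-normalized per-word label scores; the vocabulary size, label set, and layer dimensions are invented placeholders.

# Minimal sketch of an LSTM word-labeling (slot-filling) model in the spirit of
# the abstract above; layer sizes, vocabulary, and label count are placeholders.
import torch
import torch.nn as nn

class LSTMSlotTagger(nn.Module):
    def __init__(self, vocab_size=10000, num_labels=127, embed_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # LSTM cells use input, output, and forgetting gates, unlike a simple RNN.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Per-word un-normalized label scores (logits); the paper additionally
        # models output-label dependence on top of scores like these.
        self.scorer = nn.Linear(hidden_dim, num_labels)

    def forward(self, word_ids):
        # word_ids: (batch, seq_len) integer word indices
        states, _ = self.lstm(self.embed(word_ids))
        return self.scorer(states)  # (batch, seq_len, num_labels) logits

# Example usage: tag a batch of two 5-word utterances.
model = LSTMSlotTagger()
logits = model(torch.randint(0, 10000, (2, 5)))
labels = logits.argmax(dim=-1)  # one slot label per word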
Proceedings ArticleDOI

Recurrent neural networks for language understanding.

TL;DR: This paper modifies the architecture to perform language understanding, and advances the state of the art on the widely used ATIS dataset.
Journal ArticleDOI

Search for the chiral magnetic effect with isobar collisions at √s_NN = 200 GeV by the STAR Collaboration at the BNL Relativistic Heavy Ion Collider

Mustafa Muzameal Suleman Abdallah, +377 more
03 Jan 2022
TL;DR: In this paper, a blind analysis of a large data sample of approximately 3.8 billion isobar collisions of ⁹⁶₄₄Ru + ⁹⁶₄₄Ru and ⁹⁶₄₀Zr + ⁹⁶₄₀Zr at √s_NN = 200 GeV was performed.
Proceedings ArticleDOI

Contextual spoken language understanding using recurrent neural networks

TL;DR: The proposed method obtains new state-of-the-art results on ATIS and improved performance over baseline techniques such as conditional random fields (CRFs) on a large context-sensitive SLU dataset.
Posted Content

Emformer: Efficient Memory Transformer Based Acoustic Model For Low Latency Streaming Speech Recognition

TL;DR: This paper proposes Emformer, an efficient memory transformer for low-latency streaming speech recognition, in which the long-range history context is distilled into an augmented memory bank to reduce the computational complexity of self-attention.
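
As a loose illustration of the augmented-memory idea, and not the Emformer implementation, the sketch below caches one summary vector per processed segment so that each new segment attends only to itself plus a small, bounded memory bank; the dimensions, the mean-pooling summary, and the class name are assumptions made for this example.

# Illustrative simplification of attention with an augmented memory bank:
# each fixed-size segment attends to itself plus a bounded set of summary
# vectors of past segments, so attention cost stays bounded while streaming.
import torch
import torch.nn as nn

class AugmentedMemoryAttention(nn.Module):
    def __init__(self, dim=256, num_heads=4, max_memory=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.max_memory = max_memory
        self.memory = []  # list of (batch, 1, dim) summary vectors

    def forward(self, segment):
        # segment: (batch, seg_len, dim) frames of the current chunk.
        # Keys/values = current segment plus the (bounded) memory bank.
        bank = torch.cat(self.memory, dim=1) if self.memory else segment[:, :0]
        context = torch.cat([bank, segment], dim=1)
        out, _ = self.attn(segment, context, context)
        # Distill the segment into one summary vector and append it to memory.
        summary = segment.mean(dim=1, keepdim=True)
        self.memory = (self.memory + [summary.detach()])[-self.max_memory:]
        return out

# Streaming usage: process chunks one at a time with bounded attention cost.
layer = AugmentedMemoryAttention()
for chunk in torch.randn(10, 1, 20, 256):  # ten segments of 20 frames each
    y = layer(chunk)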