Kazuya Kawakami
Researcher at Google
Publications - 15
Citations - 4236
Kazuya Kawakami is an academic researcher at Google. The author has contributed to research in the topics of language modeling and context (language use), has an h-index of 9, and has co-authored 14 publications receiving 3,295 citations.
Papers
Proceedings ArticleDOI
Neural Architectures for Named Entity Recognition
TL;DR: Paper presented at the 2016 Conference of the North American Chapter of the Association for Computational Linguistics, held in San Diego (CA, USA), June 12-17, 2016.
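This paper is widely known for combining a bidirectional LSTM with a CRF output layer (plus character-level embeddings) for sequence tagging. The sketch below is a minimal, illustrative forward pass of the biLSTM scoring component only, written in numpy with random weights and hypothetical names; the CRF layer and character-level features from the paper are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gate pre-activations stacked as [input, forget, output, candidate]."""
    z = W @ x + U @ h + b
    H = h.shape[0]
    i, f, o = (1.0 / (1.0 + np.exp(-z[k * H:(k + 1) * H])) for k in range(3))
    g = np.tanh(z[3 * H:])
    c = f * c + i * g
    return o * np.tanh(c), c

def bilstm_tag_scores(embeddings, params):
    """Run forward and backward LSTMs over token embeddings and project
    each position's concatenated hidden states to per-token tag scores."""
    Wf, Uf, bf, Wb, Ub, bb, Wout = params
    H = Uf.shape[1]
    h, c, fwd = np.zeros(H), np.zeros(H), []
    for x in embeddings:                      # left-to-right pass
        h, c = lstm_step(x, h, c, Wf, Uf, bf)
        fwd.append(h)
    h, c, bwd = np.zeros(H), np.zeros(H), []
    for x in embeddings[::-1]:                # right-to-left pass
        h, c = lstm_step(x, h, c, Wb, Ub, bb)
        bwd.append(h)
    bwd = bwd[::-1]
    return [Wout @ np.concatenate([f_, b_]) for f_, b_ in zip(fwd, bwd)]

D, H, T = 8, 16, 5   # embedding dim, hidden size, number of entity tags (toy values)
params = (rng.normal(size=(4 * H, D)), rng.normal(size=(4 * H, H)), np.zeros(4 * H),
          rng.normal(size=(4 * H, D)), rng.normal(size=(4 * H, H)), np.zeros(4 * H),
          rng.normal(size=(T, 2 * H)))
sent = [rng.normal(size=D) for _ in range(6)]   # 6 dummy token embeddings
scores = bilstm_tag_scores(sent, params)
print(len(scores), scores[0].shape)             # one tag-score vector per token
```

In the full model, these per-token scores would feed a CRF layer that decodes the globally best tag sequence rather than picking each tag independently.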
Proceedings ArticleDOI
Learning Robust and Multilingual Speech Representations
TL;DR: In this paper, the authors learn representations from up to 8000 hours of diverse and noisy speech data and evaluate the representations by looking at their robustness to domain shifts and their ability to improve recognition performance in many languages.
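The paper describes self-supervised representation learning from large amounts of unlabeled speech. A common objective in this line of work is a contrastive (InfoNCE-style) loss that scores a predicted future representation against the true one and a set of negatives; the toy sketch below illustrates that idea only, with hypothetical names and fixed vectors, and is not the authors' exact objective:

```python
import numpy as np

def info_nce(z_pred, z_pos, z_negs, temp=0.1):
    """Contrastive (InfoNCE-style) loss: the predicted representation
    should score higher against the true target than against negatives."""
    score = lambda a, b: float(a @ b) / temp
    pos = np.exp(score(z_pred, z_pos))
    neg = sum(np.exp(score(z_pred, z)) for z in z_negs)
    return -np.log(pos / (pos + neg))

z_pos = np.ones(4)                           # true (future) representation
good = np.ones(4)                            # prediction aligned with the target
bad = -np.ones(4)                            # anti-correlated prediction
negs = [0.1 * np.ones(4) for _ in range(4)]  # weak negative samples
print(info_nce(good, z_pos, negs) < info_nce(bad, z_pos, negs))   # True
```

Minimizing such a loss pushes representations of the same underlying speech content together while pushing unrelated segments apart, without requiring any transcriptions.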
Posted Content
Learning Robust and Multilingual Speech Representations
TL;DR: This paper learns representations from up to 8000 hours of diverse and noisy speech data and evaluates them by looking at their robustness to domain shifts and their ability to improve recognition performance in many languages; it finds that the representations confer significant robustness advantages on the resulting recognition systems.
Posted Content
Learning to Create and Reuse Words in Open-Vocabulary Neural Language Modeling
TL;DR: The authors augment a hierarchical LSTM language model that generates sequences of word tokens character-by-character with a caching mechanism that learns to reuse previously generated words to capture the "bursty" distribution of such words.
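The caching idea in this summary can be illustrated with a toy mixture: the probability of a word interpolates between a base character-level model and a cache of previously generated words, so a rare word becomes much cheaper to produce after its first mention. The sketch below is a minimal illustration with hypothetical names, not the paper's trained model (`p_char` stands in for the character-level LSTM's word probability):

```python
from collections import Counter

def cache_mixture_prob(word, p_char, cache, lam=0.5):
    """Mix a base character-level word probability with a cache of
    previously generated words, capturing 'bursty' word reuse.
    p_char: probability of spelling out `word` character-by-character
    (assumed given; a trained char-level LSTM would supply it)."""
    total = sum(cache.values())
    p_cache = cache[word] / total if total else 0.0
    return lam * p_cache + (1.0 - lam) * p_char

cache = Counter()
# First mention of a rare word relies on the character model alone.
p1 = cache_mixture_prob("Kawakami", p_char=1e-6, cache=cache)
cache["Kawakami"] += 1   # the model stores the word it just generated
# The second mention is far more probable thanks to the cache.
p2 = cache_mixture_prob("Kawakami", p_char=1e-6, cache=cache)
print(p1 < p2)   # True
```

This mirrors the "bursty" distribution the summary mentions: once a document introduces a name, subsequent occurrences cluster, and the cache term captures that reuse directly.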
Posted Content
Learning to Represent Words in Context with Multilingual Supervision
Kazuya Kawakami, Chris Dyer +1 more
TL;DR: A neural network architecture based on bidirectional LSTMs computes representations of words in their sentential contexts, yielding context-sensitive word representations, and obtains state-of-the-art results on the evaluated tasks.