Antonios Anastasopoulos
Researcher at George Mason University
Publications - 128
Citations - 2549
Antonios Anastasopoulos is an academic researcher at George Mason University. His research focuses on computer science and machine translation. He has an h-index of 21, and has co-authored 99 publications receiving 1742 citations. His previous affiliations include Carnegie Mellon University and the University of Notre Dame.
Papers
Posted Content
DyNet: The Dynamic Neural Network Toolkit
Graham Neubig, Chris Dyer, Yoav Goldberg, Austin Matthews, Waleed Ammar, Antonios Anastasopoulos, Miguel Ballesteros, David Chiang, Daniel Clothiaux, Trevor Cohn, Kevin Duh, Manaal Faruqui, Cynthia Gan, Dan Garrette, Yangfeng Ji, Lingpeng Kong, Adhiguna Kuncoro, Gaurav Kumar, Chaitanya Malaviya, Paul Michel, Yusuke Oda, Matthew Richardson, Naomi Saphra, Swabha Swayamdipta, Pengcheng Yin +24 more
TL;DR: DyNet is a toolkit for implementing neural network models based on dynamic declaration of network structure. It has an optimized C++ backend and a lightweight graph representation, and is designed to let users implement their models in a way that is idiomatic in their preferred programming language.
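Dynamic declaration means the computation graph is rebuilt from scratch for every input, so variable-length structures (e.g. sentences of different lengths) are handled naturally. A toy scalar reverse-mode autodiff sketch of that idiom in plain Python — illustrative only, not DyNet's actual API:

```python
class Node:
    """One node in a computation graph that is built on the fly, per example."""
    def __init__(self, value, parents=(), grad_fn=None):
        self.value, self.parents, self.grad_fn = value, parents, grad_fn
        self.grad = 0.0

def add(a, b):
    return Node(a.value + b.value, (a, b), lambda g: (g, g))

def mul(a, b):
    return Node(a.value * b.value, (a, b), lambda g: (g * b.value, g * a.value))

def backward(out):
    """Reverse-mode sweep over a topological ordering of the graph."""
    order, seen = [], set()
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p in n.parents:
                visit(p)
            order.append(n)
    visit(out)
    out.grad = 1.0
    for n in reversed(order):
        if n.grad_fn is not None:
            for p, g in zip(n.parents, n.grad_fn(n.grad)):
                p.grad += g

w = Node(0.5)  # a shared parameter, reused at every timestep
for seq in ([1.0, 2.0, 3.0], [4.0, 5.0]):  # inputs of different lengths
    w.grad = 0.0
    h = Node(0.0)
    for x in seq:  # a fresh graph is declared for each sequence
        h = add(h, mul(w, Node(x)))
    backward(h)
    # d(sum_i w*x_i)/dw accumulates to sum_i x_i
```

The key point of the idiom is that control flow in the host language (here, the `for` loop over a variable-length sequence) directly determines the graph's shape, with no static graph compilation step.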
Proceedings Article
An attentional model for speech translation without transcription
TL;DR: On the more challenging speech-to-word alignment task, the model nearly matches GIZA++’s performance on gold transcriptions, but without recourse to transcriptions or to a lexicon.
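The attentional model in question learns soft alignments between source speech frames and target words. A generic dot-product attention sketch with numpy — shapes and names are illustrative, not the paper's architecture:

```python
import numpy as np

def attention(query, keys, values):
    """Soft alignment: weight each source frame by its similarity to the query."""
    scores = keys @ query                    # (T,) similarity per source frame
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ values               # weighted sum of source states
    return context, weights

T, d = 6, 4                                  # source frames, hidden size
rng = np.random.default_rng(0)
H = rng.normal(size=(T, d))                  # encoder states over speech frames
q = rng.normal(size=d)                       # decoder state for the current word
context, weights = attention(q, H, H)
```

The attention weights themselves form a soft source-to-target alignment matrix, which is why such a model can be evaluated on an alignment task against GIZA++.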
Proceedings Article
Tied Multitask Learning for Neural Speech Translation
TL;DR: The authors explore multitask models for neural translation of speech, augmenting them in order to reflect two intuitive notions: transitivity and invertibility, and show that the application of these notions on jointly trained models improves performance.
Proceedings Article
Investigating Meta-Learning Algorithms for Low-Resource Natural Language Understanding Tasks
TL;DR: This paper explores the model-agnostic meta-learning algorithm (MAML) and its variants for low-resource NLU tasks and empirically demonstrates that the learned representations can be adapted to new tasks efficiently and effectively.
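MAML's core loop is: adapt a shared initialization to each task with a few gradient steps (inner loop), then update the initialization so that adaptation works well across tasks (outer loop). A minimal first-order sketch (FOMAML-style) on toy linear-regression tasks with numpy — not the paper's code, and all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
d, alpha, beta = 5, 0.05, 0.1  # dims, inner and outer step sizes

def sample_task():
    """A toy 'task': linear regression with a task-specific weight vector."""
    w = rng.normal(size=d)
    def batch(n):
        X = rng.normal(size=(n, d))
        return X, X @ w
    return batch

def mse_grad(theta, X, y):
    err = X @ theta - y
    return (X.T @ err) * (2.0 / len(y)), float(np.mean(err ** 2))

theta = rng.normal(size=d)  # the meta-initialization being learned
for step in range(200):
    meta_grad = np.zeros(d)
    for _ in range(4):  # tasks per meta-batch
        batch = sample_task()
        Xs, ys = batch(20)              # support set: one inner SGD step
        g, _ = mse_grad(theta, Xs, ys)
        adapted = theta - alpha * g
        Xq, yq = batch(20)              # query set: evaluate adapted params
        gq, _ = mse_grad(adapted, Xq, yq)
        meta_grad += gq                 # first-order meta-gradient approximation
    theta -= beta * meta_grad / 4

# After meta-training, one inner step on a new task should reduce its loss.
batch = sample_task()
Xs, ys = batch(20)
g, pre = mse_grad(theta, Xs, ys)
_, post = mse_grad(theta - alpha * g, Xs, ys)
```

The first-order approximation drops MAML's second-derivative term (it uses the query gradient at the adapted parameters directly), which is cheaper and often works comparably in practice.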
Proceedings Article
Choosing Transfer Languages for Cross-Lingual Learning
Yu-Hsiang Lin, Chian-Yu Chen, Jean Lee, Zirui Li, Yuyan Zhang, Mengzhou Xia, Shruti Rijhwani, Junxian He, Zhisong Zhang, Xuezhe Ma, Antonios Anastasopoulos, Patrick Littell, Graham Neubig +12 more
TL;DR: This paper considers the task of automatically selecting optimal transfer languages as a ranking problem, and builds models that consider the aforementioned features to perform this prediction, and demonstrates that this model predicts good transfer languages much better than ad hoc baselines considering single features in isolation.
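Framing transfer-language selection as ranking amounts to fitting a scorer on language-pair features and sorting candidates by predicted transfer quality. A pointwise least-squares stand-in with numpy — the features and data below are synthetic, and the paper's actual ranking model differs:

```python
import numpy as np

rng = np.random.default_rng(1)
n_lang, n_feat = 30, 4
# Synthetic stand-ins for per-candidate features such as dataset size,
# word overlap, and typological/geographic distance to the target language.
F = rng.normal(size=(n_lang, n_feat))
true_w = np.array([1.0, 0.5, -2.0, -0.3])              # hidden "usefulness" weights
quality = F @ true_w + 0.01 * rng.normal(size=n_lang)  # observed transfer scores

w, *_ = np.linalg.lstsq(F, quality, rcond=None)  # fit the scorer
pred = F @ w
ranking = np.argsort(-pred)                      # best predicted transfer language first
```

A single-feature baseline (e.g. ranking by dataset size alone) corresponds to scoring with one column of `F`, which is the kind of ad hoc heuristic the learned multi-feature ranker is shown to beat.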