Rumen Dangovski
Researcher at Massachusetts Institute of Technology
Publications - 37
Citations - 178
Rumen Dangovski is an academic researcher from the Massachusetts Institute of Technology. The author has contributed to research in the topics Computer science and Commutator, has an h-index of 4, and has co-authored 25 publications receiving 55 citations.
Papers
Proceedings ArticleDOI
DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings
Yung-Sung Chuang, Rumen Dangovski, Hongyin Luo, Yan Zhang, Shiyu Chang, Marin Soljacic, Shang-Wen Li, Wen-tau Yih, Yoon Jun Kim, James Glass +9 more
TL;DR: This work proposes DiffCSE, an unsupervised contrastive learning framework for learning sentence embeddings that are sensitive to the difference between the original sentence and an edited sentence, and shows that DiffCSE is an instance of equivariant contrastive learning, which generalizes contrastive learning and learns representations that are insensitive to certain types of augmentations and sensitive to other "harmful" types of augmentations.
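The contrastive objective underlying frameworks like DiffCSE can be illustrated with a minimal InfoNCE loss over two augmented views of a batch of sentence embeddings. This is a sketch in numpy, not the authors' code; the function name `info_nce` and the temperature default are choices made here.

```python
import numpy as np

def info_nce(z1, z2, tau=0.05):
    # z1, z2: (n, d) embeddings of two augmented views of n sentences.
    # Each row of z1 is pulled toward the matching row of z2 (the positive)
    # and pushed away from all other rows (in-batch negatives).
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau  # (n, n) temperature-scaled cosine similarities
    # Cross-entropy with the diagonal (matching pair) as the target class.
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - np.diag(sim)))
```

When the two views of each sentence embed near-identically and different sentences are well separated, the loss approaches zero; DiffCSE adds to such a term a difference-prediction objective that keeps the encoder sensitive to token-level edits.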
Journal ArticleDOI
Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications
TL;DR: This work derives a phase-coded representation of the memory state, Rotational Unit of Memory (RUM), that unifies the concepts of unitary learning and associative memory and shows experimentally that RNNs based on RUMs can solve basic sequential tasks such as memory copying and memory recall much better than LSTMs/GRUs.
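The unitary (norm-preserving) rotation at the heart of a rotational memory unit can be sketched as a rotation in the plane spanned by two vectors. This is a minimal numpy illustration of such an operation, not the authors' implementation; the name `rotation_matrix` is chosen here, and the sketch assumes the two vectors are not parallel.

```python
import numpy as np

def rotation_matrix(a, b):
    # Orthogonal matrix rotating a onto the direction of b, acting only
    # in the plane spanned by a and b (identity on the complement).
    # Assumes a and b are nonzero and not parallel.
    u = a / np.linalg.norm(a)
    v = b - (u @ b) * u            # component of b orthogonal to a
    v = v / np.linalg.norm(v)
    cos_t = np.clip((a @ b) / (np.linalg.norm(a) * np.linalg.norm(b)), -1.0, 1.0)
    theta = np.arccos(cos_t)
    eye = np.eye(len(a))
    # Givens-style rotation: I + (cosθ-1)(uuᵀ+vvᵀ) + sinθ(vuᵀ-uvᵀ)
    return (eye
            + (np.cos(theta) - 1) * (np.outer(u, u) + np.outer(v, v))
            + np.sin(theta) * (np.outer(v, u) - np.outer(u, v)))
```

Because the matrix is orthogonal, applying it to a hidden state preserves the state's norm, which is the property that lets unitary-style RNNs avoid exploding or vanishing memory.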
Proceedings Article
Equivariant Self-Supervised Learning: Encouraging Equivariance in Representations
Rumen Dangovski, Li Jing, Charlotte Loh, Seungwook Han, Akash Krishna Srivastava, Brian Cheung, Pritti Agrawal, Marin Soljacic +7 more
Journal ArticleDOI
Weitzenböck derivations of free metabelian Lie algebras
TL;DR: In this paper, it was shown that the vector space of the constants (L_d′/L_d″)^δ in the commutator ideal of the free metabelian Lie algebra is a finitely generated K[X_d]^δ-module.
Posted Content
On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning.
TL;DR: This work devises an approach to solve the Wasserstein-Procrustes problem directly, which can be used to refine and improve popular UCL methods such as iterative closest point (ICP), multilingual unsupervised and supervised embeddings (MUSE), and supervised Procrustes methods.
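The Procrustes half of the Wasserstein-Procrustes problem has a closed-form solution via SVD: the orthogonal map best aligning one embedding space to another. This is a minimal numpy sketch of that classical step only (not the paper's joint method); the function name `procrustes` is chosen here.

```python
import numpy as np

def procrustes(X, Y):
    # Orthogonal Procrustes: argmin_W ||XW - Y||_F over orthogonal W.
    # Closed form: W = U Vᵀ, where U S Vᵀ is the SVD of XᵀY.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt
```

In unsupervised cross-lingual learning this alignment step is alternated with solving for the word-to-word correspondence (the Wasserstein/optimal-transport half), since neither is known in advance.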