Ruiyi Zhang
Researcher at Duke University
Publications - 83
Citations - 1165
Ruiyi Zhang is an academic researcher at Duke University. The author has contributed to research in computer science and reinforcement learning, has an h-index of 16, and has co-authored 56 publications receiving 813 citations.
Papers
Proceedings Article
Adversarial Text Generation via Feature-Mover's Distance
Liqun Chen, Shuyang Dai, Chenyang Tao, Haichao Zhang, Zhe Gan, Dinghan Shen, Yizhe Zhang, Guoyin Wang, Ruiyi Zhang, Lawrence Carin, et al.
TL;DR: This work proposes to improve text-generation GANs via a novel approach inspired by optimal transport, which yields a highly discriminative critic and an easy-to-optimize objective, overcoming the mode-collapse and brittle-training problems of existing methods.
Proceedings Article
Topic-Guided Variational Auto-Encoder for Text Generation
Wenlin Wang, Zhe Gan, Hongteng Xu, Ruiyi Zhang, Guoyin Wang, Dinghan Shen, Changyou Chen, Lawrence Carin, et al.
TL;DR: Experimental results show that TGVAE outperforms its competitors on both unconditional and conditional text generation, and can also generate semantically meaningful sentences on a variety of topics.
Proceedings Article
GenDICE: Generalized Offline Estimation of Stationary Values
TL;DR: This work proves the consistency of the method under general conditions, provides a detailed error analysis, and demonstrates strong empirical performance on benchmark tasks, including offline PageRank and off-policy policy evaluation.
Posted Content
A Unified Particle-Optimization Framework for Scalable Bayesian Sampling
TL;DR: In this article, a particle-optimization framework based on Wasserstein gradient flows is proposed, unifying SG-MCMC and SVGD and enabling new algorithms to be developed.
Posted Content
Topic-Guided Variational Autoencoders for Text Generation
Wenlin Wang, Zhe Gan, Hongteng Xu, Ruiyi Zhang, Guoyin Wang, Dinghan Shen, Changyou Chen, Lawrence Carin, et al.
TL;DR: This article proposes a topic-guided VAE model for text generation that specifies the prior as a Gaussian mixture model (GMM) parametrized by a neural topic module, which provides guidance for generating sentences under a given topic.
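The GMM-prior idea summarized above can be sketched in a few lines: topic proportions select a mixture component, and the latent code is drawn from that component's Gaussian. The snippet below is a minimal illustration, not the paper's code; all names, dimensions, and the fixed unit-variance components are assumptions for demonstration.

```python
import math
import random

random.seed(0)
N_TOPICS, LATENT_DIM = 4, 8

# Per-topic component means (learned by a neural topic module in the
# paper; fixed at random here for illustration). Unit variance assumed.
means = [[random.gauss(0.0, 1.0) for _ in range(LATENT_DIM)]
         for _ in range(N_TOPICS)]

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample_latent(topic_logits):
    """Pick a topic k ~ softmax(logits), then sample z from that
    topic's Gaussian component (mean + standard-normal noise)."""
    probs = softmax(topic_logits)
    k = random.choices(range(N_TOPICS), weights=probs)[0]
    z = [m + random.gauss(0.0, 1.0) for m in means[k]]
    return k, z

k, z = sample_latent([2.0, 0.1, 0.1, 0.1])
print(k, len(z))
```

In the actual model, a decoder would then condition on `z` to generate a sentence reflecting topic `k`; here the sketch stops at the prior sample.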