Ruslan Salakhutdinov
Researcher at Carnegie Mellon University
Publications - 457
Citations - 142495
Ruslan Salakhutdinov is an academic researcher from Carnegie Mellon University. The author has contributed to research in topics: Computer science & Artificial neural network. The author has an h-index of 107 and has co-authored 410 publications receiving 115921 citations. Previous affiliations of Ruslan Salakhutdinov include Carnegie Learning & University of Toronto.
Papers
Grounding Language Models to Images for Multimodal Inputs and Outputs
TL;DR: This paper proposed a method to ground pretrained text-only language models to the visual domain, enabling them to process arbitrarily interleaved image-and-text data and generate text interleaved with retrieved images.
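A minimal sketch of the general idea, assuming the text-only language model and the visual encoder are both kept frozen and joined by a small trainable linear mapping; the module name, dimensions, and number of visual tokens below are illustrative assumptions, not the paper's actual implementation:

import torch
import torch.nn as nn

class VisualPrefixMapper(nn.Module):
    # Hypothetical module: maps one frozen image feature vector to a short
    # sequence of "visual tokens" in the language model's embedding space.
    def __init__(self, visual_dim=1024, lm_embed_dim=4096, n_visual_tokens=4):
        super().__init__()
        self.n_visual_tokens = n_visual_tokens
        self.lm_embed_dim = lm_embed_dim
        # Only this linear mapping would be trained; the language model and
        # the visual encoder are assumed frozen.
        self.proj = nn.Linear(visual_dim, lm_embed_dim * n_visual_tokens)

    def forward(self, image_features):
        # image_features: (batch, visual_dim) from a frozen visual encoder.
        out = self.proj(image_features)
        # (batch, n_visual_tokens, lm_embed_dim): ready to be interleaved
        # with ordinary text token embeddings fed to the frozen language model.
        return out.view(-1, self.n_visual_tokens, self.lm_embed_dim)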
Journal Article
MultiViz: An Analysis Benchmark for Visualizing and Understanding Multimodal Models
Paul Pu Liang, Yiwei Lyu, Gunjan Chhablani, Nihal Jain, Zihao Deng, Xingbo Wang, Louis-Philippe Morency, Ruslan Salakhutdinov +7 more
TL;DR: The complementary stages in MultiViz together enable users to simulate model predictions, assign interpretable concepts to features, perform error analysis on model misclassifications, and use insights from error analysis to debug models.
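As one illustration of the error-analysis stage described above, a hedged sketch that groups a model's misclassified examples by an assigned concept label; the data fields and the predict callable are hypothetical, not part of MultiViz's actual API:

from collections import Counter

def error_breakdown(examples, predict):
    # examples: iterable of dicts with hypothetical 'input', 'label', and
    # 'concept' keys; predict: any callable returning a predicted label.
    errors = [ex for ex in examples if predict(ex["input"]) != ex["label"]]
    return Counter(ex["concept"] for ex in errors)  # error counts per concept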
Informedia @ TRECVID 2018: Ad-hoc Video Search, Video to Text Description, Activities in Extended Video
Jia Chen, Shizhe Chen, Qin Jin, Alexander G. Hauptmann, Po-Yao Huang, Junwei Liang, Vaibhav, Xiaojun Chang, Jiang Liu, Ting-Yao Hu, Wenhe Liu, Wei Ke, Wayner Barrios, Haroon Idrees, Dong Hyun Yoo, Yaser Sheikh, Ruslan Salakhutdinov, Kris M. Kitani, Dong Huang +18 more
Posted Content
Instabilities of Offline RL with Pre-Trained Neural Representation
TL;DR: In this paper, the authors study sample-efficient offline RL with pre-trained neural representations and show that substantial error amplification can occur even when such pre-trained representations are used.
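A rough sketch of the setting studied, assuming least-squares fitted Q-iteration with a linear Q-function on top of frozen, pre-trained features; the function and variable names are hypothetical and this is not the paper's experimental code:

import numpy as np

def fitted_q_iteration(phi_s, actions, rewards, phi_s_next, n_actions,
                       gamma=0.99, n_iters=50, reg=1e-3):
    # phi_s, phi_s_next: (N, d) frozen features of states and next states;
    # actions: (N,) integer actions; rewards: (N,) scalar rewards.
    d = phi_s.shape[1]
    W = np.zeros((n_actions, d))  # linear Q-function: Q(s, a) = W[a] @ phi(s)
    for _ in range(n_iters):
        q_next = phi_s_next @ W.T                       # (N, n_actions)
        targets = rewards + gamma * q_next.max(axis=1)  # Bellman backup
        for a in range(n_actions):
            mask = actions == a
            if not mask.any():
                continue
            X, y = phi_s[mask], targets[mask]
            # Ridge regression per action; with poor feature coverage the
            # regression error feeds back into the targets on the next
            # iteration, which is where error amplification can arise.
            W[a] = np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ y)
    return W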
Posted Content
A Note on Connecting Barlow Twins with Negative-Sample-Free Contrastive Learning
TL;DR: In this note, the Barlow Twins objective is connected to the Hilbert-Schmidt Independence Criterion (HSIC), showing that it can be interpreted as a contrastive learning approach that requires no negative samples.
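For reference, a minimal sketch of the Barlow Twins objective that the note relates to HSIC, assuming batch-normalized embeddings of two augmented views; the hyperparameter values are illustrative and the HSIC derivation itself is not reproduced here:

import torch

def barlow_twins_loss(z1, z2, lam=5e-3, eps=1e-6):
    # z1, z2: (batch, dim) embeddings of two augmented views of the same batch.
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + eps)
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + eps)
    n = z1.shape[0]
    c = (z1.T @ z2) / n  # empirical cross-correlation matrix, shape (dim, dim)
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()               # diagonal -> 1
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()  # decorrelate
    return on_diag + lam * off_diag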