Open Access · Proceedings Article

Deep Neural Networks for YouTube Recommendations

Paul Covington, Jay Adams, Emre Sargin
- pp. 191-198
TL;DR: This paper details a deep candidate generation model, describes a separate deep ranking model, and provides practical lessons and insights from designing, iterating on, and maintaining a massive recommendation system with enormous user-facing impact.
Abstract: 
YouTube represents one of the largest scale and most sophisticated industrial recommendation systems in existence. In this paper, we describe the system at a high level and focus on the dramatic performance improvements brought by deep learning. The paper is split according to the classic two-stage information retrieval dichotomy: first, we detail a deep candidate generation model and then describe a separate deep ranking model. We also provide practical lessons and insights derived from designing, iterating and maintaining a massive recommendation system with enormous user-facing impact.


Citations
Proceedings Article

Latent Cross: Making Use of Context in Recurrent Recommender Systems

TL;DR: This paper offers "Latent Cross," an easy-to-use technique for incorporating contextual data into an RNN by first embedding the context feature and then performing an element-wise product of the context embedding with the model's hidden states.
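
The summarized technique is simple enough to sketch directly: embed the categorical context feature and modulate the RNN hidden state with an element-wise product. The (1 + c) * h form and all shapes below are assumptions for illustration, not the paper's exact implementation.

```python
# Minimal Latent Cross sketch: context embedding gates the hidden state.
import numpy as np

rng = np.random.default_rng(0)
hidden_dim, n_contexts = 32, 24      # e.g. 24 hour-of-day buckets

context_table = rng.normal(scale=0.1, size=(n_contexts, hidden_dim))

def latent_cross(h, context_id):
    """Modulate hidden state h with the embedding of a context feature."""
    c = context_table[context_id]
    return (1.0 + c) * h             # element-wise product acts as a learned gate

h = rng.normal(size=(hidden_dim,))   # stand-in for an RNN hidden state
h_ctx = latent_cross(h, context_id=13)   # e.g. an event at 1pm
```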
Posted Content

Graph Neural Networks in Recommender Systems: A Survey

TL;DR: This article provides a taxonomy of GNN-based recommendation models according to the types of information used and the recommendation tasks addressed, and systematically analyzes the challenges of applying GNNs to different types of data.
Proceedings Article

Top-K Off-Policy Correction for a REINFORCE Recommender System

TL;DR: This work presents a general recipe for addressing biases in a production top-K recommender system at YouTube, built with a policy-gradient-based algorithm, i.e. REINFORCE, and proposes a novel top-K off-policy correction to account for the policy recommending multiple items at a time.
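
A toy sketch of the corrected per-example weight described above, assuming the standard importance ratio pi/beta and a top-K multiplier of K * (1 - pi)^(K - 1); the function name and the toy numbers are hypothetical.

```python
# Toy per-example weight for top-K off-policy corrected REINFORCE.
def topk_reinforce_weight(pi_a, beta_a, K):
    """Weight applied to the policy-gradient update for logged action a.

    pi_a:   probability of a under the target policy pi_theta
    beta_a: probability of a under the behavior (logging) policy
    K:      slate size (number of items recommended at a time)
    """
    importance = pi_a / beta_a                  # standard off-policy correction
    lambda_k = K * (1.0 - pi_a) ** (K - 1)      # top-K correction factor
    return importance * lambda_k

# e.g. an item the new policy prefers over the logging policy, 16-slot slate
print(topk_reinforce_weight(pi_a=0.02, beta_a=0.01, K=16))
```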
Proceedings Article

Knowledge-aware Graph Neural Networks with Label Smoothness Regularization for Recommender Systems

TL;DR: In this paper, the authors proposed a knowledge-aware graph neural network with label smoothness regularization (KGNN-LS) that computes personalized item embeddings by first applying a trainable function that identifies important knowledge-graph relationships for a given user and then transforming the knowledge graph into a user-specific weighted graph.
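
A loose sketch of the idea as summarized: score each knowledge-graph relation with a user-specific function, build a user-specific weighted adjacency, and propagate item embeddings over it. The inner-product scoring, shapes, and single propagation layer are assumptions, and the label-smoothness regularizer is omitted.

```python
# Toy user-specific knowledge-graph propagation in the spirit of KGNN-LS.
import numpy as np

rng = np.random.default_rng(0)
n_items, n_relations, dim = 50, 5, 16
item_emb = rng.normal(size=(n_items, dim))
rel_emb = rng.normal(size=(n_relations, dim))
user = rng.normal(size=(dim,))

# Toy knowledge graph: (head item, relation, tail item) triples in a ring.
edges = [(i, rng.integers(n_relations), (i + 1) % n_items) for i in range(n_items)]

# User-specific edge weight: here simply the inner product <user, relation>.
A = np.zeros((n_items, n_items))
for h, r, t in edges:
    A[h, t] = A[t, h] = user @ rel_emb[r]

# One propagation layer: degree-normalized neighborhood aggregation.
deg = np.abs(A).sum(axis=1, keepdims=True) + 1e-8
item_emb_next = np.tanh((A / deg) @ item_emb)
```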
Posted Content

Variational Autoencoders for Collaborative Filtering

TL;DR: In this paper, variational autoencoders (VAEs) are extended to collaborative filtering for implicit feedback, with a generative model using a multinomial likelihood and Bayesian inference for parameter estimation.
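
A toy forward pass illustrating the summarized approach: a VAE over a user's implicit-feedback vector with a multinomial likelihood on the decoder output. The random weights stand in for a trained model, and all names are illustrative.

```python
# Toy multinomial-likelihood VAE forward pass for implicit feedback.
import numpy as np

rng = np.random.default_rng(0)
n_items, latent = 1000, 32
x = (rng.random(n_items) < 0.01).astype(np.float64)     # binary click vector

W_mu, W_logvar = rng.normal(size=(2, n_items, latent)) * 0.01
W_dec = rng.normal(size=(latent, n_items)) * 0.01

mu, logvar = x @ W_mu, x @ W_logvar                      # linear encoder
z = mu + np.exp(0.5 * logvar) * rng.normal(size=latent)  # reparameterization

logits = z @ W_dec                                       # linear decoder
m = logits.max()
log_softmax = logits - m - np.log(np.exp(logits - m).sum())
log_lik = (x * log_softmax).sum()                        # multinomial log-likelihood
kl = -0.5 * (1 + logvar - mu**2 - np.exp(logvar)).sum()  # KL to standard normal
elbo = log_lik - kl
```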
References

Proceedings Article

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

TL;DR: Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin.
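
The core transform from the paper is easy to state in a few lines: normalize each feature over the mini-batch, then apply a learned scale (gamma) and shift (beta). This sketch covers training-time statistics only and omits the running averages used at inference.

```python
# Minimal batch normalization over a mini-batch.
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """x: (batch, features); gamma, beta: (features,)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * x_hat + beta               # learned scale and shift

x = np.random.default_rng(0).normal(loc=5.0, scale=3.0, size=(64, 8))
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
print(y.mean(axis=0).round(6), y.std(axis=0).round(3))   # ~0 and ~1
```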
Proceedings Article

Distributed Representations of Words and Phrases and their Compositionality

TL;DR: This paper presents a simple method for finding phrases in text, and shows that learning good vector representations for millions of phrases is possible and describes a simple alternative to the hierarchical softmax called negative sampling.
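
The negative-sampling objective mentioned in the summary can be sketched directly: maximize log sigmoid(v_c · v_w) for an observed (word, context) pair and log sigmoid(-v_n · v_w) for k sampled negatives. The embedding tables and the uniform negative sampler below are toy stand-ins (the paper samples negatives from a unigram distribution raised to the 3/4 power).

```python
# Toy skip-gram negative-sampling loss for one (word, context) pair.
import numpy as np

rng = np.random.default_rng(0)
vocab, dim, k = 5000, 100, 5
W_in, W_out = rng.normal(scale=0.1, size=(2, vocab, dim))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sampling_loss(word, context, negatives):
    v_w = W_in[word]
    pos = np.log(sigmoid(W_out[context] @ v_w))           # observed pair
    neg = np.log(sigmoid(-W_out[negatives] @ v_w)).sum()  # k sampled negatives
    return -(pos + neg)   # minimize the negative log-likelihood

loss = neg_sampling_loss(word=42, context=7, negatives=rng.integers(vocab, size=k))
```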