Open Access · Proceedings ArticleDOI

Deep Neural Networks for YouTube Recommendations

Paul Covington, +2 more
pp. 191-198
Abstract
YouTube represents one of the largest scale and most sophisticated industrial recommendation systems in existence. In this paper, we describe the system at a high level and focus on the dramatic performance improvements brought by deep learning. The paper is split according to the classic two-stage information retrieval dichotomy: first, we detail a deep candidate generation model and then describe a separate deep ranking model. We also provide practical lessons and insights derived from designing, iterating and maintaining a massive recommendation system with enormous user-facing impact.
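The two-stage structure described in the abstract — a cheap candidate generation model that narrows a huge corpus, followed by a richer ranking model over the survivors — can be sketched in a few lines. The scoring functions and data layout below are hypothetical stand-ins for the learned models, not the paper's actual networks.

```python
import heapq

def candidate_score(user, video):
    """Cheap similarity used for candidate generation (embedding dot product)."""
    return sum(u * v for u, v in zip(user["embedding"], video["embedding"]))

def ranking_score(user, video):
    """More expensive scoring with extra features, applied only to candidates.
    The freshness term is an illustrative extra feature, not the paper's."""
    return candidate_score(user, video) + 0.1 * video["freshness"]

def recommend(user, corpus, n_candidates=100, n_results=10):
    # Stage 1: candidate generation narrows a large corpus to a few hundred.
    candidates = heapq.nlargest(n_candidates, corpus,
                                key=lambda v: candidate_score(user, v))
    # Stage 2: a richer ranking model orders the surviving candidates.
    ranked = sorted(candidates, key=lambda v: ranking_score(user, v), reverse=True)
    return ranked[:n_results]
```

The point of the split is that the expensive model never sees the full corpus: only the `n_candidates` items surfaced by the cheap first stage are re-scored.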


Citations
Posted Content

Learning Audio Embeddings with User Listening Data for Content-based Music Recommendation.

TL;DR: This work first explores user listening history and demographics to construct a user embedding representing the user's music preference, then extracts audio embeddings as features for music genre classification tasks.
Journal ArticleDOI

MPClan: Protocol Suite for Privacy-Conscious Computations

TL;DR: In this paper, a semi-honest protocol is proposed to support higher resiliency in an honest-majority setting, with the efficiency of the online phase at centre stage. The online phase requires only half of the parties, except for a one-time verification towards the end, and the protocol provides security with fairness.
Posted Content

Learning Cross-Domain Representation with Multi-Graph Neural Network.

TL;DR: A novel model, Deep Multi-Graph Embedding (DMGE), is proposed to learn cross-domain representations in an unsupervised manner, and a multiple-gradient descent optimizer is presented for efficiently training the model.
Journal ArticleDOI

Representation Learning and Pairwise Ranking for Implicit Feedback in Recommendation Systems

TL;DR: A novel ranking framework for collaborative filtering is proposed, with the overall aim of learning user preferences over items by minimizing a pairwise ranking loss. The approach is demonstrated to be very competitive with the best state-of-the-art collaborative filtering techniques for implicit feedback.
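A pairwise ranking loss of the kind this TL;DR describes can be sketched as a BPR-style objective: the loss is small when an interacted (positive) item outscores an unobserved (negative) one. This is an illustrative formulation, not necessarily the paper's exact loss.

```python
import math

def pairwise_ranking_loss(score_pos, score_neg):
    """BPR-style pairwise loss: -log(sigmoid(s_pos - s_neg)).
    Approaches 0 as the positive item outscores the negative one,
    and grows when the ordering is wrong."""
    diff = score_pos - score_neg
    return -math.log(1.0 / (1.0 + math.exp(-diff)))
```

Minimizing this over sampled (user, positive item, negative item) triples pushes the model to order items correctly per user, rather than to predict absolute ratings — which is what makes it suitable for implicit feedback.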
Journal ArticleDOI

Facing Cold-Start: A Live TV Recommender System Based on Neural Networks

TL;DR: A viewing environment model called DeepTV is proposed that considers viewing behavior records and electronic program guides; it includes a feature generation process and a model construction process, and it significantly outperforms baseline algorithms.
References

Proceedings Article

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

TL;DR: Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin.
Proceedings Article

Distributed Representations of Words and Phrases and their Compositionality

TL;DR: This paper presents a simple method for finding phrases in text, shows that learning good vector representations for millions of phrases is possible, and describes a simple alternative to the hierarchical softmax called negative sampling.
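The negative-sampling objective mentioned here can be sketched as follows: instead of a softmax over the full vocabulary, the model scores the true (center, context) pair against a handful of randomly drawn "negative" words. Vectors are plain Python lists here, and the negatives are assumed to be sampled elsewhere from a noise distribution.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def negative_sampling_loss(center, context, negatives):
    """Skip-gram with negative sampling: push the true (center, context)
    pair to score high, and each sampled negative word to score low."""
    loss = -math.log(sigmoid(dot(center, context)))
    for neg in negatives:
        loss += -math.log(sigmoid(-dot(center, neg)))
    return loss
```

Each update touches only the context word and the k negatives rather than the whole vocabulary, which is what makes training on billions of tokens tractable.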