Open Access Proceedings Article (DOI)

Deep Neural Networks for YouTube Recommendations

Paul Covington, Jay Adams, Emre Sargin
pp. 191–198
TLDR
This paper details a deep candidate generation model, then describes a separate deep ranking model, and provides practical lessons and insights derived from designing, iterating on, and maintaining a massive recommendation system with enormous user-facing impact.
Abstract
YouTube represents one of the largest scale and most sophisticated industrial recommendation systems in existence. In this paper, we describe the system at a high level and focus on the dramatic performance improvements brought by deep learning. The paper is split according to the classic two-stage information retrieval dichotomy: first, we detail a deep candidate generation model and then describe a separate deep ranking model. We also provide practical lessons and insights derived from designing, iterating and maintaining a massive recommendation system with enormous user-facing impact.
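The two-stage split described above (a high-recall candidate generator followed by a more precise ranker) can be illustrated with a minimal sketch. Everything below — the dot-product retrieval, the toy ranking score, the dimensions — is an illustrative assumption, not the paper's actual models:

```python
# Minimal sketch of the retrieve-then-rank flow (all names, dimensions, and scoring
# functions are illustrative assumptions, not the paper's models).
import numpy as np

rng = np.random.default_rng(0)
NUM_VIDEOS, EMB_DIM = 10_000, 64
video_embeddings = rng.normal(size=(NUM_VIDEOS, EMB_DIM)).astype(np.float32)

def generate_candidates(user_embedding: np.ndarray, k: int = 100) -> np.ndarray:
    """Stage 1: candidate generation as nearest-neighbour search in embedding space."""
    scores = video_embeddings @ user_embedding        # dot-product similarity
    return np.argpartition(-scores, k)[:k]            # ids of the top-k candidates

def rank_candidates(user_features: np.ndarray, candidate_ids: np.ndarray) -> np.ndarray:
    """Stage 2: ranking; a toy linear score stands in for the deep ranking network."""
    cand = video_embeddings[candidate_ids]
    scores = cand @ user_features
    return candidate_ids[np.argsort(-scores)]

user = rng.normal(size=EMB_DIM).astype(np.float32)
print(rank_candidates(user, generate_candidates(user))[:10])
```

In the actual system, candidate generation is a deep network trained as extreme multiclass classification and served via approximate nearest-neighbour lookup, while ranking is a separate deep network over much richer per-video features; the sketch only mirrors the retrieve-then-rank structure.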


Citations
Posted Content

CABaRet: Leveraging Recommendation Systems for Mobile Edge Caching

TL;DR: In this paper, the authors propose an approach that enables cache-aware recommendations without requiring collaboration between the network and the content provider: they leverage information exposed publicly by the recommendation system to build a system that provides cache-friendly, high-quality recommendations.
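As a rough illustration of the cache-aware idea in this summary, the sketch below re-ranks items drawn from the provider's own related-video lists so that cached items come first; the `related` callable, cache set, and depth/width defaults are hypothetical stand-ins, not CABaRet's actual interface:

```python
# Hedged sketch of cache-aware re-ranking: draw candidates only from the provider's
# public "related videos" lists (to keep quality), then put cached items first.
# The `related` callable, cache set, and depth/width defaults are hypothetical.
from typing import Callable

def cache_aware_recommendations(
    seed_video: str,
    related: Callable[[str], list[str]],   # provider's public related-videos lookup
    cached: set[str],                       # contents of the edge cache
    depth: int = 2,
    width: int = 5,
    n_recs: int = 5,
) -> list[str]:
    # Breadth-first expansion of the provider's recommendation graph.
    frontier, seen = [seed_video], []
    for _ in range(depth):
        nxt = []
        for v in frontier:
            for r in related(v)[:width]:
                if r not in seen and r != seed_video:
                    seen.append(r)
                    nxt.append(r)
        frontier = nxt
    # Cached items first (cache-friendly), then the rest as a quality fallback.
    hits = [v for v in seen if v in cached]
    misses = [v for v in seen if v not in cached]
    return (hits + misses)[:n_recs]

# Toy usage: a static related-videos graph and a two-item cache.
graph = {"v0": ["v1", "v2"], "v1": ["v3"], "v2": ["v4"]}
print(cache_aware_recommendations("v0", lambda v: graph.get(v, []), cached={"v2", "v4"}))
```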
Proceedings Article (DOI)

ST-PIL: Spatial-Temporal Periodic Interest Learning for Next Point-of-Interest Recommendation

TL;DR: Zhang et al. propose to learn spatial-temporal periodic interest: they first capture temporal periodic interest at daily granularity, then apply intra-level attention to form a long-term interest representation.
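A rough sketch of the periodic-interest idea, under heavy assumptions (day-of-week buckets, a single-query softmax attention, random toy embeddings); it is not ST-PIL's actual architecture:

```python
# Toy sketch: bucket past check-in embeddings by a daily/weekly period, then
# attention-pool each bucket against the current context to get periodic interest
# vectors. Shapes, bucketing, and the single-query attention are assumptions.
import numpy as np

def attention_pool(keys: np.ndarray, query: np.ndarray) -> np.ndarray:
    """Softmax attention of one query over a set of key vectors (also used as values)."""
    if keys.shape[0] == 0:                       # empty bucket -> no interest signal
        return np.zeros_like(query)
    logits = keys @ query / np.sqrt(keys.shape[-1])
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    return weights @ keys                        # weighted sum = pooled interest vector

rng = np.random.default_rng(0)
dim = 32
history = rng.normal(size=(200, dim))            # embeddings of past check-ins
day_of_week = rng.integers(0, 7, size=200)       # periodic bucket of each check-in
context = rng.normal(size=dim)                   # embedding of the current context

periodic_interest = np.stack(
    [attention_pool(history[day_of_week == d], context) for d in range(7)]
)
print(periodic_interest.shape)                   # (7, 32): one interest vector per bucket
```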
Proceedings Article (DOI)

SPACE: locality-aware processing in heterogeneous memory for personalized recommendations

TL;DR: The authors propose SPACE, a heterogeneous memory architecture for personalized recommendation that combines compute-capable 3D-stacked DRAM with DIMMs and is efficient in both performance and energy.
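The locality intuition behind such designs can be sketched in a few lines: recommendation embedding tables see highly skewed access patterns, so the few hot rows can live in the fast stacked-DRAM tier while the long tail stays in DIMMs. Tier capacity and the hot/cold split below are illustrative assumptions, not the paper's placement policy:

```python
# Toy sketch of locality-aware placement: put the few frequently accessed ("hot")
# embedding rows in the fast 3D-stacked DRAM tier, the long tail in DIMMs.
# Tier capacity and the frequency-based split are illustrative assumptions.
from collections import Counter

def partition_rows(access_log: list[int], fast_tier_capacity: int) -> tuple[set[int], set[int]]:
    """Split embedding-table row ids into a hot set (fast tier) and a cold set (DIMMs)."""
    freq = Counter(access_log)
    hot = {row for row, _ in freq.most_common(fast_tier_capacity)}
    cold = set(freq) - hot
    return hot, cold

# Recommendation lookups are highly skewed: a handful of rows get most accesses.
log = [0, 0, 0, 1, 1, 2, 7, 0, 1, 0, 42, 2, 0, 1]
hot, cold = partition_rows(log, fast_tier_capacity=3)
print("fast tier:", hot, "DIMMs:", cold)
```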
Posted Content (DOI)

Item Recommendation with Variational Autoencoders and Heterogenous Priors

TL;DR: This work incorporates user-dependent priors in the latent VAE space, encoding users' preferences as functions of their review text, to improve recommendation quality in collaborative filtering with side information.
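A minimal sketch of what a user-dependent (heterogeneous) prior changes in a VAE objective: the KL term is computed against a per-user prior instead of N(0, I). The "prior mean from review text" below is a random stand-in for an actual review-text encoder:

```python
# Sketch of swapping the standard N(0, I) prior for a user-dependent prior in the
# VAE's KL term; the "prior mean from review text" is a random stand-in here.
import numpy as np

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians, summed over dims."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

dim = 8
mu_q, logvar_q = np.random.randn(dim), np.zeros(dim)        # encoder output for one user
mu_p = 0.1 * np.random.randn(dim)                           # user prior mean (stand-in for a review-text encoder)
kl = kl_diag_gaussians(mu_q, logvar_q, mu_p, np.zeros(dim)) # replaces the usual KL to N(0, I)
print(float(kl))
```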
Dissertation

Contextual recommendation of services: application to recommending cultural events in the smart city

TL;DR: Multi-armed bandit algorithms for context-aware recommender systems are currently the subject of numerous studies; this dissertation applies them to recommending cultural events in the smart city.
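As a hedged illustration of the bandit framing this dissertation builds on, below is a compact LinUCB-style sketch (one arm per cultural event, context features for the current situation); the feature setup and alpha value are assumptions, and the dissertation's own algorithms may differ:

```python
# Compact LinUCB-style contextual bandit: one arm per cultural event, pick the arm
# with the highest upper confidence bound for the current context. Feature setup
# and alpha are illustrative; the dissertation's algorithms may differ.
import numpy as np

class LinUCBArm:
    def __init__(self, dim: int, alpha: float = 1.0):
        self.A = np.eye(dim)      # ridge-regression design matrix
        self.b = np.zeros(dim)    # accumulated reward-weighted contexts
        self.alpha = alpha

    def ucb(self, x: np.ndarray) -> float:
        A_inv = np.linalg.inv(self.A)
        theta = A_inv @ self.b
        return float(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))

    def update(self, x: np.ndarray, reward: float) -> None:
        self.A += np.outer(x, x)
        self.b += reward * x

dim, events = 5, ["concert", "museum", "theatre"]
arms = {e: LinUCBArm(dim) for e in events}
context = np.random.randn(dim)                     # e.g. time, location, weather features
chosen = max(events, key=lambda e: arms[e].ucb(context))
arms[chosen].update(context, reward=1.0)           # user attended/clicked -> reward 1
print("recommended:", chosen)
```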
References

Proceedings Article

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

TL;DR: Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin.
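For reference, the transform itself is short; this is a plain NumPy rendering of the per-feature mini-batch normalization with learned scale and shift (inference-time running statistics are omitted):

```python
# Plain NumPy rendering of the batch-norm transform: per-feature normalization over
# the mini-batch, then a learned scale (gamma) and shift (beta).
import numpy as np

def batch_norm(x: np.ndarray, gamma: np.ndarray, beta: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """x: (batch, features). Returns gamma * x_hat + beta with per-feature batch statistics."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)        # zero mean, unit variance per feature
    return gamma * x_hat + beta

x = np.random.randn(32, 4) * 10 + 3                # a badly scaled mini-batch
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0).round(3), y.std(axis=0).round(3))   # ~0 and ~1 per feature
```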
Proceedings Article

Distributed Representations of Words and Phrases and their Compositionality

TL;DR: This paper presents a simple method for finding phrases in text, and shows that learning good vector representations for millions of phrases is possible and describes a simple alternative to the hierarchical softmax called negative sampling.
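The negative-sampling alternative to the hierarchical softmax mentioned here can be written out directly; the sketch below scores one true (word, context) pair against k noise contexts with a logistic loss (vector values and k are illustrative):

```python
# Sketch of the negative-sampling objective: score the true (word, context) pair
# against k noise contexts with a logistic loss instead of a full softmax.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neg_sampling_loss(word_vec, true_ctx_vec, neg_ctx_vecs):
    """-log sigma(w.c+) - sum log sigma(-w.c-): low when the true pair scores high and negatives low."""
    pos = np.log(sigmoid(word_vec @ true_ctx_vec))
    neg = np.sum(np.log(sigmoid(-neg_ctx_vecs @ word_vec)))
    return -(pos + neg)

rng = np.random.default_rng(0)
dim, k = 50, 5
word = rng.normal(size=dim)
true_ctx = rng.normal(size=dim)
negatives = rng.normal(size=(k, dim))              # k negative contexts drawn from a noise distribution
print(neg_sampling_loss(word, true_ctx, negatives))
```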