DeepFM: a factorization-machine based neural network for CTR prediction
Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li, Xiuqiang He
- pp. 1725-1731
TL;DR: This paper shows that it is possible to derive an end-to-end learning model that emphasizes both low- and high-order feature interactions, and combines the power of factorization machines for recommendation and deep learning for feature learning in a new neural network architecture.

Abstract: Learning sophisticated feature interactions behind user behaviors is critical in maximizing CTR for recommender systems. Despite great progress, existing methods seem to have a strong bias towards low- or high-order interactions, or require expertise feature engineering. In this paper, we show that it is possible to derive an end-to-end learning model that emphasizes both low- and high-order feature interactions. The proposed model, DeepFM, combines the power of factorization machines for recommendation and deep learning for feature learning in a new neural network architecture. Compared to the latest Wide & Deep model from Google, DeepFM has a shared input to its "wide" and "deep" parts, with no need of feature engineering besides raw features. Comprehensive experiments are conducted to demonstrate the effectiveness and efficiency of DeepFM over the existing models for CTR prediction, on both benchmark data and commercial data.
Citations
Posted Content
DKN: Deep Knowledge-Aware Network for News Recommendation
TL;DR: A deep knowledge-aware network (DKN) that incorporates knowledge graph representation into news recommendation and achieves substantial gains over state-of-the-art deep recommendation models is proposed.
Proceedings ArticleDOI
AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks
TL;DR: An effective and efficient method called the AutoInt to automatically learn the high-order feature interactions of input features and map both the numerical and categorical features into the same low-dimensional space is proposed.
Posted Content
Deep Learning Recommendation Model for Personalization and Recommendation Systems
Maxim Naumov, Dheevatsa Mudigere, Hao-Jun Michael Shi, Jianyu Huang, Narayanan Sundaraman, Jongsoo Park, Xiaodong Wang, Udit Gupta, Carole-Jean Wu, Alisson Gusatti Azzolini, Dmytro Dzhulgakov, Andrey Mallevich, Ilia Cherniavskii, Yinghai Lu, Raghuraman Krishnamoorthi, Ansha Yu, Volodymyr Kondratenko, Stephanie Pereira, Xianjie Chen, Wenlin Chen, Vijay Rao, Bill Jia, Liang Xiong, Misha Smelyanskiy +23 more
TL;DR: A state-of-the-art deep learning recommendation model (DLRM) is developed, with implementations provided in both the PyTorch and Caffe2 frameworks, along with a specialized parallelization scheme that uses model parallelism on the embedding tables to mitigate memory constraints while exploiting data parallelism to scale out compute from the fully connected layers.
Proceedings ArticleDOI
Multi-Task Feature Learning for Knowledge Graph Enhanced Recommendation
TL;DR: This paper considers knowledge graphs as the source of side information and proposes MKR, a Multi-task feature learning approach for Knowledge graph enhanced Recommendation, a deep end-to-end framework that utilizes knowledge graph embedding task to assist recommendation task.
Proceedings ArticleDOI
A Neural Influence Diffusion Model for Social Recommendation
TL;DR: A deep influence propagation model is proposed to simulate how users are influenced by the recursive social diffusion process for social recommendation; the model can be applied even when user (item) attributes or the social network structure are not available.
References
Proceedings ArticleDOI
Deep Residual Learning for Image Recognition
TL;DR: A residual learning framework is proposed to ease the training of networks that are substantially deeper than those used previously; it won 1st place on the ILSVRC 2015 classification task.
Journal Article
Dropout: a simple way to prevent neural networks from overfitting
TL;DR: It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
Proceedings ArticleDOI
Deep Neural Networks for YouTube Recommendations
TL;DR: This paper details a deep candidate generation model and then describes a separate deep ranking model and provides practical lessons and insights derived from designing, iterating and maintaining a massive recommendation system with enormous user-facing impact.
Proceedings ArticleDOI
Factorization Machines
TL;DR: Factorization Machines (FM) are introduced which are a new model class that combines the advantages of Support Vector Machines (SVM) with factorization models and can mimic these models just by specifying the input data (i.e. the feature vectors).
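A key property of FMs, and the reason DeepFM can reuse them as its "wide" part, is that the pairwise interaction term can be computed in linear time via the identity &#8721;_{i&lt;j} &#10216;v_i, v_j&#10217; x_i x_j = &#189; &#8721;_f [(&#8721;_i v_{i,f} x_i)&#178; &#8722; &#8721;_i v_{i,f}&#178; x_i&#178;]. A minimal sketch, assuming a dense feature vector and one latent vector per feature:

```python
def fm_second_order(x, V):
    """Pairwise interaction term of a factorization machine.

    x: feature vector (list of floats); V: list of k-dim latent vectors,
    one per feature. Uses the O(n*k) sum-of-squares reformulation
    instead of the naive O(n^2 * k) double loop over feature pairs.
    """
    n, k = len(x), len(V[0])
    total = 0.0
    for f in range(k):
        s = sum(V[i][f] * x[i] for i in range(n))          # (sum_i v_if x_i)
        s_sq = sum((V[i][f] * x[i]) ** 2 for i in range(n))  # sum_i (v_if x_i)^2
        total += s * s - s_sq
    return 0.5 * total
```

The factorized latent vectors let the model estimate interactions between feature pairs that never co-occur in training data, which SVMs with polynomial kernels cannot do under sparsity.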
Proceedings ArticleDOI
Restricted Boltzmann machines for collaborative filtering
TL;DR: This paper shows how a class of two-layer undirected graphical models, called Restricted Boltzmann Machines (RBMs), can be used to model tabular data, such as users' ratings of movies, and demonstrates that RBMs can be successfully applied to the Netflix data set.