DeepFM: a factorization-machine based neural network for CTR prediction
Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li, Xiuqiang He
pp. 1725–1731
TLDR
This paper shows that it is possible to derive an end-to-end learning model that emphasizes both low- and high-order feature interactions, and combines the power of factorization machines for recommendation and deep learning for feature learning in a new neural network architecture.
Abstract
Learning sophisticated feature interactions behind user behaviors is critical in maximizing CTR for recommender systems. Despite great progress, existing methods seem to have a strong bias towards low- or high-order interactions, or require expertise feature engineering. In this paper, we show that it is possible to derive an end-to-end learning model that emphasizes both low- and high-order feature interactions. The proposed model, DeepFM, combines the power of factorization machines for recommendation and deep learning for feature learning in a new neural network architecture. Compared to the latest Wide & Deep model from Google, DeepFM has a shared input to its "wide" and "deep" parts, with no need of feature engineering besides raw features. Comprehensive experiments are conducted to demonstrate the effectiveness and efficiency of DeepFM over the existing models for CTR prediction, on both benchmark data and commercial data.
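The architecture described above can be sketched in a few lines of numpy: one shared embedding lookup feeds both an FM term (low-order interactions) and a small MLP (high-order interactions), and their outputs are summed before a sigmoid. This is an illustrative toy, not the paper's implementation; all sizes, names, and initializations are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration (not the paper's experimental setup).
n_fields, vocab, k, hidden = 3, 50, 4, 8

# One shared embedding table feeds both the FM part and the deep part.
E = rng.normal(0, 0.1, (vocab, k))       # shared latent vectors / embeddings
w = rng.normal(0, 0.1, vocab)            # first-order FM weights
W1 = rng.normal(0, 0.1, (n_fields * k, hidden))
W2 = rng.normal(0, 0.1, hidden)

def deepfm_predict(feature_ids):
    """CTR estimate for one sample with one active feature id per field."""
    emb = E[feature_ids]                                  # (n_fields, k)
    # FM part: first-order term plus pairwise interactions of embeddings.
    y_fm = w[feature_ids].sum() + 0.5 * (
        (emb.sum(0) ** 2 - (emb ** 2).sum(0)).sum())
    # Deep part: the same embeddings, concatenated, through a small MLP.
    h = np.maximum(emb.reshape(-1) @ W1, 0.0)             # ReLU
    y_dnn = h @ W2
    return 1.0 / (1.0 + np.exp(-(y_fm + y_dnn)))          # sigmoid

p = deepfm_predict(np.array([3, 17, 42]))
```

Because the embeddings are shared, no hand-crafted cross features are needed: the FM part covers the low-order interactions and the MLP covers the high-order ones from the same raw-feature input.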
Citations
Proceedings ArticleDOI
ATBRG: Adaptive Target-Behavior Relational Graph Network for Effective Recommendation
TL;DR: A new framework named Adaptive Target-Behavior Relational Graph network (ATBRG) is proposed to effectively capture structural relations of target user-item pairs over a knowledge graph (KG), and empirical results show that ATBRG consistently and significantly outperforms state-of-the-art methods.
Posted Content
Understanding Capacity-Driven Scale-Out Neural Recommendation Inference
Michael Lui, Yavuz Yetim, Özgür Özkan, Zhuoran Zhao, Shin-Yeh Tsai, Carole-Jean Wu, Mark Hempstead
TL;DR: This work specifically explores latency-bounded inference systems, compared to the throughput-oriented training systems of other recent works, and finds that the latency and compute overheads of distributed inference are largely attributed to a model's static embedding table distribution and sparsity of inference request inputs.
Posted Content
Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures
TL;DR: The applicability of Direct Feedback Alignment to neural view synthesis, recommender systems, geometric learning, and natural language processing is studied, and it is shown that challenging tasks can be tackled in the absence of weight transport.
Posted Content
A Survey on Neural Recommendation: From Collaborative Filtering to Content and Context Enriched Recommendation.
TL;DR: A systematic review on neural recommender models is conducted, aiming to summarize the field from the perspective of recommendation modeling, and discusses some promising directions in this field, including benchmarking recommender systems, graph reasoning based recommendation models, and explainable and fair recommendations for social good.
Proceedings ArticleDOI
Learning to Embed Categorical Features without Embedding Tables for Recommendation
Wang-Cheng Kang, Derek Zhiyuan Cheng, Tiansheng Yao, Xinyang Yi, Ting Chen, Lichan Hong, Ed H. Chi
TL;DR: Deep Hash Embedding (DHE) replaces the traditional embedding table with a deep embedding network that computes embeddings on the fly, so it can handle high-cardinality features and unseen feature values.
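The table-free idea can be sketched in numpy: hash the feature id into a dense code, then map that code through a network instead of looking up a table row. This is a toy, one-layer stand-in for DHE's deeper encoder; the hash scheme and all sizes are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; DHE's actual hashing/encoding details differ.
n_hashes, m, k = 16, 1000, 8            # hash functions, hash range, emb dim
a = rng.integers(1, 2**31 - 1, n_hashes)
b = rng.integers(0, 2**31 - 1, n_hashes)
P = 2_147_483_647                       # a large prime modulus
W = rng.normal(0, 0.1, (n_hashes, k))   # "deep" network (one layer here)

def dhe_embed(feature_id):
    """Compute an embedding on the fly: id -> dense hash code -> network."""
    codes = (a * feature_id + b) % P % m   # n_hashes universal-style hashes
    x = codes / m                          # normalize to a dense real input
    return np.maximum(x @ W, 0.0)          # tiny MLP in place of a table row

# Works for ids never seen before -- no table row needs to exist.
v = dhe_embed(10**9 + 7)
```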
References
Proceedings ArticleDOI
Deep Residual Learning for Image Recognition
TL;DR: The authors propose a residual learning framework that eases the training of networks substantially deeper than those used previously, and which won 1st place on the ILSVRC 2015 classification task.
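The residual idea can be sketched as a toy block computing y = x + F(x), where the identity shortcut lets the layers learn only a residual F; weights and sizes below are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8
W1 = rng.normal(0, 0.1, (d, d))
W2 = rng.normal(0, 0.1, (d, d))

def residual_block(x):
    """y = x + F(x): the block learns a residual F, not a full mapping."""
    h = np.maximum(x @ W1, 0.0)        # ReLU
    return x + h @ W2                  # identity shortcut

x = rng.normal(0, 1, d)
y = residual_block(x)
```

When F is near zero the block is near the identity, which is what makes very deep stacks of such blocks easier to optimize than plain stacked layers.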
Journal Article
Dropout: a simple way to prevent neural networks from overfitting
TL;DR: It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
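A minimal sketch of the technique in its common "inverted" form: units are zeroed with probability p during training and the survivors rescaled by 1/(1-p), so the test-time layer is simply the identity. The sizes below are arbitrary illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, train=True):
    """Inverted dropout: zero units with prob p, rescale survivors."""
    if not train:
        return x                        # test time: the layer is the identity
    mask = rng.random(x.shape) >= p     # keep each unit with prob 1 - p
    return x * mask / (1.0 - p)

h = np.ones(10_000)
d = dropout(h, p=0.5)                   # each unit is either 0.0 or 2.0
```

The 1/(1-p) rescaling keeps the expected activation unchanged, which is why no weight scaling is needed at test time.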
Proceedings ArticleDOI
Deep Neural Networks for YouTube Recommendations
TL;DR: This paper details a deep candidate generation model and then describes a separate deep ranking model and provides practical lessons and insights derived from designing, iterating and maintaining a massive recommendation system with enormous user-facing impact.
Proceedings ArticleDOI
Factorization Machines
TL;DR: Factorization Machines (FM) are introduced as a new model class that combines the advantages of Support Vector Machines (SVM) with factorization models, and can mimic many existing factorization models just by specifying the input data (i.e. the feature vectors).
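The degree-2 FM model can be sketched in numpy, using the standard O(kn) reformulation of the pairwise term, 0.5 * Σ_f [(Σ_i v_if x_i)² − Σ_i v_if² x_i²]; the sizes and weights below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 6, 3                        # number of features, latent dimension
w0 = 0.1                           # global bias
w = rng.normal(0, 0.1, n)          # linear (first-order) weights
V = rng.normal(0, 0.1, (n, k))     # latent factor vector per feature

def fm_predict(x):
    """y = w0 + <w, x> + sum_{i<j} <v_i, v_j> x_i x_j  (degree-2 FM)."""
    xv = x @ V                                     # (k,) factor sums
    pairwise = 0.5 * (xv**2 - (x**2) @ (V**2)).sum()
    return w0 + w @ x + pairwise

x = np.array([1.0, 0, 0, 1.0, 0, 0.5])             # sparse-ish feature vector
y = fm_predict(x)

# Sanity check against the naive O(n^2) double sum:
naive = w0 + w @ x + sum(V[i] @ V[j] * x[i] * x[j]
                         for i in range(n) for j in range(i + 1, n))
```

Factorizing the interaction weights as <v_i, v_j> is what lets FMs estimate interactions between feature pairs that never co-occur in sparse data.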
Proceedings ArticleDOI
Restricted Boltzmann machines for collaborative filtering
TL;DR: This paper shows how a class of two-layer undirected graphical models, called Restricted Boltzmann Machines (RBMs), can be used to model tabular data, such as users' ratings of movies, and demonstrates that RBMs can be successfully applied to the Netflix data set.