Proceedings ArticleDOI
Gradient boosting factorization machines
Chen Cheng, Fen Xia, Tong Zhang, Irwin King, Michael R. Lyu
pp. 265–272
TL;DR: A novel Gradient Boosting Factorization Machine (GBFM) model is proposed to incorporate a feature selection algorithm with Factorization Machines into a unified framework, and the efficiency and effectiveness of the algorithm compared to other state-of-the-art methods are demonstrated.
Abstract:
Recommendation techniques have been well developed in the past decades. Most of them build models based only on the user-item rating matrix. However, in the real world, there is plenty of auxiliary information available to recommendation systems, and we can utilize this information as additional features to improve recommendation performance. We refer to recommendation with auxiliary information as context-aware recommendation. The Factorization Machine (FM) is one of the most successful context-aware recommendation models. FM models pairwise interactions between all features, such that each feature's latent vector is shared across all the factorized interaction parameters it is involved in. In practice, there are tens of context features, and not all pairwise feature interactions are useful. Thus, one important challenge for context-aware recommendation is how to effectively select "good" interaction features. In this paper, we focus on solving this problem and propose a greedy interaction feature selection algorithm based on gradient boosting. We then propose a novel Gradient Boosting Factorization Machine (GBFM) model to incorporate the feature selection algorithm with Factorization Machines into a unified framework. Experimental results on both synthetic and real datasets demonstrate the efficiency and effectiveness of our algorithm compared to other state-of-the-art methods.
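The second-order FM scoring described in the abstract (a global bias, linear terms, and pairwise interactions factorized through shared latent vectors) can be sketched as below. This is a minimal illustration of the standard FM equation, not the paper's GBFM model; `fm_score` and the toy parameter values are hypothetical names chosen for this example.

```python
def fm_score(x, w0, w, V):
    """Second-order FM: y = w0 + sum_i w_i*x_i + sum_{i<j} <v_i, v_j>*x_i*x_j.

    x  : dense feature vector (list of floats)
    w0 : global bias
    w  : per-feature linear weights
    V  : per-feature latent vectors; V[i] is shared across every pairwise
         interaction that feature i participates in (the FM factorization).
    """
    n = len(x)
    linear = sum(w[i] * x[i] for i in range(n))
    pairwise = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            # Interaction weight is the inner product of the two latent vectors.
            dot = sum(V[i][f] * V[j][f] for f in range(len(V[i])))
            pairwise += dot * x[i] * x[j]
    return w0 + linear + pairwise
```

The key point motivating GBFM is visible here: every pairwise weight is derived from shared latent vectors, so all interactions are modeled whether or not they are useful, which is what the paper's greedy selection addresses.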
Citations
Proceedings ArticleDOI
AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks
TL;DR: An effective and efficient method called AutoInt is proposed to automatically learn the high-order feature interactions of input features and map both numerical and categorical features into the same low-dimensional space.
Posted Content
Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks
TL;DR: In this paper, an attentional factorization machine (AFM) is proposed to learn the importance of each feature interaction from data via a neural attention network, which outperforms Wide&Deep and DeepCross with a much simpler structure and fewer model parameters.
Proceedings Article
A boosting algorithm for item recommendation with implicit feedback
TL;DR: A boosting algorithm named AdaBPR (Adaptive Boosting Personalized Ranking) is proposed for top-N item recommendation using users' implicit feedback, and its effectiveness is demonstrated on three datasets against strong baseline algorithms.
Proceedings ArticleDOI
AutoCross: Automatic Feature Crossing for Tabular Data in Real-World Applications
Luo Yuanfei, Mengshuo Wang, Hao Zhou, Quanming Yao, Wei-Wei Tu, Yuqiang Chen, Wenyuan Dai, Qiang Yang
TL;DR: In this paper, AutoCross, an automatic feature crossing tool provided by 4Paradigm to its customers (ranging from banks and hospitals to Internet corporations), enables efficient generation of high-order cross features, a capability not addressed by existing works.
References
Journal ArticleDOI
Greedy function approximation: A gradient boosting machine.
TL;DR: A general gradient descent boosting paradigm is developed for additive expansions based on any fitting criterion, and specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification.
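The stage-wise paradigm summarized above, which GBFM builds on, can be sketched for the least-squares case: each stage fits a simple base learner to the negative gradient of the loss, which for squared error is just the current residuals. This is a toy illustration under those assumptions; `fit_stump` and `gradient_boost` are hypothetical names, and the one-split "stump" on a 1-D feature stands in for any base learner.

```python
def fit_stump(xs, residuals):
    """Fit a one-split regression stump to the residuals (least squares)."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue  # skip splits that leave one side empty
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x, t=t, lm=lm, rm=rm: lm if x <= t else rm

def gradient_boost(xs, ys, n_stages=10, lr=0.5):
    """Additive expansion: start from the mean, then repeatedly fit stumps
    to the residuals (the negative gradient of squared loss) and shrink
    each stage's contribution by the learning rate."""
    f0 = sum(ys) / len(ys)
    stages = []
    for _ in range(n_stages):
        preds = [f0 + lr * sum(s(x) for s in stages) for x in xs]
        residuals = [y - p for y, p in zip(ys, preds)]
        stages.append(fit_stump(xs, residuals))
    return lambda x: f0 + lr * sum(s(x) for s in stages)
```

For the other losses the TL;DR mentions (least absolute deviation, Huber-M, multiclass logistic), only the negative-gradient computation changes; the stage-wise fitting loop stays the same.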
Journal ArticleDOI
Tensor Decompositions and Applications
Tamara G. Kolda, Brett W. Bader
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Proceedings ArticleDOI
Item-based collaborative filtering recommendation algorithms
TL;DR: This paper analyzes item-based collaborative filtering techniques and suggests that item-based algorithms provide dramatically better performance than user-based algorithms, while at the same time providing better quality than the best available user-based algorithms.
Journal ArticleDOI
Additive Logistic Regression: A Statistical View of Boosting
TL;DR: This work shows that this seemingly mysterious phenomenon of boosting can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood, and develops more direct approximations and shows that they exhibit nearly identical results to boosting.