Proceedings ArticleDOI

DeepRec: On-device Deep Learning for Privacy-Preserving Sequential Recommendation in Mobile Commerce

TL;DR
This paper proposes DeepRec, an on-device deep learning framework for mining interaction behaviors for sequential recommendation without sending any raw data or intermediate results out of the device, thereby maximally preserving user privacy.
Abstract
Sequential recommendation techniques are considered a promising way to provide a better user experience in mobile commerce by learning sequential interests from users' historical interaction behaviors. However, the recently increasing focus on privacy, exemplified by regulations such as the General Data Protection Regulation (GDPR), can significantly affect the deployment of state-of-the-art sequential recommendation techniques, because user behavior data are no longer allowed to be arbitrarily used without the user's explicit permission. To address this issue, this paper proposes DeepRec, an on-device deep learning framework for mining interaction behaviors for sequential recommendation without sending any raw data or intermediate results out of the device, thereby maximally preserving user privacy. DeepRec constructs a global model using data collected before GDPR and continuously fine-tunes a personal model on individual mobile devices using data collected after GDPR. DeepRec employs model pruning and embedding sparsity to reduce computation and network overhead, making the model training process practical on computation-constrained mobile devices. Evaluation results show that DeepRec can achieve recommendation accuracy comparable to existing centralized recommendation approaches, with small computation overhead and up to a 10x reduction in network overhead.
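The page does not include code; as a rough illustration of the workflow the abstract describes (a globally pre-trained sequential model fine-tuned on-device using only local interactions, with model pruning and sparse embedding updates standing in for the pruning and embedding-sparsity techniques), here is a minimal PyTorch sketch. All class names, layer choices, and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumption, not the authors' code): on-device fine-tuning of a
# globally pre-trained sequential recommender using only local interaction data.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeqRecommender(nn.Module):
    """GRU-based next-item model; stands in for the global model trained on pre-GDPR data."""
    def __init__(self, num_items, dim=64):
        super().__init__()
        # sparse=True yields sparse gradients, so only the embedding rows of items
        # actually seen on the device are updated -- a stand-in for "embedding sparsity".
        self.item_emb = nn.Embedding(num_items, dim, sparse=True)
        self.gru = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, num_items)

    def forward(self, item_seq):                  # item_seq: (batch, seq_len) of item ids
        h, _ = self.gru(self.item_emb(item_seq))  # (batch, seq_len, dim)
        return self.out(h[:, -1])                 # scores for the next item

def prune_by_magnitude(model, keep_ratio=0.5):
    """Crude magnitude pruning of dense weights to cut on-device compute (illustrative)."""
    for name, p in model.named_parameters():
        if p.dim() > 1 and "item_emb" not in name:
            k = int(p.numel() * keep_ratio)
            thresh = p.abs().flatten().kthvalue(max(1, p.numel() - k)).values
            p.data.mul_((p.abs() > thresh).float())

def finetune_on_device(model, local_batches, epochs=1, lr=1e-3):
    """Personal fine-tuning: raw interactions never leave the device."""
    dense = [p for n, p in model.named_parameters() if "item_emb" not in n]
    opt = torch.optim.SGD(dense + list(model.item_emb.parameters()), lr=lr)
    for _ in range(epochs):
        for seq, target in local_batches:  # seq: (batch, seq_len), target: (batch,) next-item ids
            loss = F.cross_entropy(model(seq), target)
            opt.zero_grad()
            loss.backward()
            opt.step()
```

Under this sketch, only the pruned dense weights and the few embedding rows touched locally would ever need to be synchronized, which is the kind of saving the abstract attributes to pruning and embedding sparsity.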


Citations
Proceedings ArticleDOI

On-Device Next-Item Recommendation with Self-Supervised Knowledge Distillation

TL;DR: A self-supervised knowledge distillation framework is developed that enables the compressed model (the student) to distill the essential information in the raw data and improves long-tail item recommendation through an embedding-recombination strategy with the original model (the teacher).
Journal ArticleDOI

Recommendation Systems: An Insight Into Current Development and Future Research Challenges

01 Jan 2022

TL;DR: A gentle introduction to recommendation systems is provided, describing the task they are designed to solve and the challenges faced in research. An extension to the standard taxonomy is presented to better reflect the latest research trends, including the diverse use of content and temporal information, along with the main evaluation metrics adopted by researchers and the most commonly used benchmarks.
Journal ArticleDOI

A Generic Federated Recommendation Framework via Fake Marks and Secret Sharing

TL;DR: This paper proposes a lossless and generic federated recommendation framework via fake marks and secret sharing (FMSS), which not only protects the two types of users' privacy without sacrificing recommendation performance, but can also be applied to most recommendation algorithms for rating prediction, item ranking, and sequential recommendation.
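As background for the secret-sharing building block mentioned there (a generic illustration only; the FMSS protocol and its fake-mark construction are not reproduced here), additive secret sharing splits a private value into random shares that individually reveal nothing but sum back to the original:

```python
# Generic additive secret sharing over a prime field (illustrative only).
import secrets

PRIME = 2**61 - 1  # field modulus (assumption for the sketch)

def share(value, n_parties):
    """Split `value` into n additive shares that sum to it modulo PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares modulo PRIME."""
    return sum(shares) % PRIME

# Example: a private rating split across two servers; either share alone is
# uniformly random, but their sum recovers the rating.
assert reconstruct(share(4, 2)) == 4
```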
Journal ArticleDOI

Efficient On-Device Session-Based Recommendation

TL;DR: Extensive experimental results on two benchmark datasets demonstrate that compared with existing methods, the proposed on-device recommender not only achieves an 8x inference speedup with a large compression ratio but also shows superior recommendation performance.
References
Proceedings Article

ImageNet Classification with Deep Convolutional Neural Networks

TL;DR: A deep convolutional neural network consisting of five convolutional layers, some followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax is trained to achieve state-of-the-art performance on ImageNet classification.
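For context, a simplified PyTorch sketch of that layer layout follows (channel sizes follow a common PyTorch reimplementation and are illustrative, not a reproduction of the original code):

```python
# Simplified AlexNet-style network: five conv layers (some followed by
# max-pooling) and three fully-connected layers ending in 1000 classes.
import torch.nn as nn

class AlexNetLike(nn.Module):
    def __init__(self, num_classes=1000):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2), nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(64, 192, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(192, 384, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(256, 256, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        self.classifier = nn.Sequential(
            nn.Dropout(), nn.Linear(256 * 6 * 6, 4096), nn.ReLU(),
            nn.Dropout(), nn.Linear(4096, 4096), nn.ReLU(),
            nn.Linear(4096, num_classes),  # logits for a 1000-way softmax
        )

    def forward(self, x):                 # x: (batch, 3, 224, 224)
        x = self.features(x)
        return self.classifier(x.flatten(1))
```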
Proceedings ArticleDOI

GloVe: Global Vectors for Word Representation

TL;DR: A new global log-bilinear regression model is proposed that combines the advantages of the two major model families in the literature, global matrix factorization and local context window methods, and produces a vector space with meaningful substructure.
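For context, the objective behind that summary is GloVe's weighted least-squares loss over the word-word co-occurrence matrix X:

```latex
% GloVe objective: weighted least squares over co-occurrence counts X_{ij}
J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^{\top} \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2,
\qquad
f(x) = \begin{cases} (x / x_{\max})^{\alpha} & x < x_{\max} \\ 1 & \text{otherwise} \end{cases}
```

Here $w_i$ and $\tilde{w}_j$ are word and context vectors, $b_i$ and $\tilde{b}_j$ are biases, and $f$ down-weights rare co-occurrences.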
Posted Content

Efficient Estimation of Word Representations in Vector Space

TL;DR: This paper proposes two novel model architectures for computing continuous vector representations of words from very large data sets; the quality of these representations is measured in a word similarity task, and the results are compared to the previously best-performing techniques based on different types of neural networks.
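For context, the skip-gram variant introduced in this line of work is commonly written as maximizing the average log-probability of context words within a window of size $c$ (full-softmax form shown; the paper's efficiency comes from cheaper approximations such as hierarchical softmax):

```latex
% Skip-gram objective over a training corpus w_1, ..., w_T with context window c
\frac{1}{T} \sum_{t=1}^{T} \sum_{\substack{-c \le j \le c \\ j \ne 0}} \log p(w_{t+j} \mid w_t),
\qquad
p(w_O \mid w_I) = \frac{\exp\!\left( {v'_{w_O}}^{\top} v_{w_I} \right)}{\sum_{w=1}^{V} \exp\!\left( {v'_{w}}^{\top} v_{w_I} \right)}
```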
Posted Content

Distilling the Knowledge in a Neural Network

TL;DR: This work shows that the acoustic model of a heavily used commercial system can be significantly improved by distilling the knowledge in an ensemble of models into a single model, and introduces a new type of ensemble composed of one or more full models and many specialist models that learn to distinguish fine-grained classes the full models confuse.
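For context, the distillation recipe from this work combines a hard-label loss with a KL term against the teacher's temperature-softened outputs; below is a minimal PyTorch sketch (temperature and weighting values are illustrative):

```python
# Minimal knowledge-distillation loss: blend hard-label cross-entropy with
# KL divergence to the teacher's temperature-softened predictions.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps soft-target gradients comparable across temperatures
    return alpha * soft + (1.0 - alpha) * hard
```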