
Interest-oriented Universal User Representation via Contrastive Learning

Authors :
Sun, Qinghui
Gu, Jie
Yang, Bei
Xu, XiaoXiao
Xu, Renjun
Gao, Shangde
Liu, Hong
Xu, Huan
Publication Year :
2021
Publisher :
arXiv, 2021.

Abstract

User representation is essential for providing high-quality commercial services in industry. Universal user representation has received much attention recently, as it frees us from the cumbersome work of training a specific model for each downstream application. In this paper, we attempt to improve universal user representation from two points of view. First, a contrastive self-supervised learning paradigm is presented to guide the representation model training. It provides a unified framework that allows for long-term or short-term interest representation learning in a data-driven manner. Second, a novel multi-interest extraction module is presented. The module introduces an interest dictionary to capture the principal interests of a given user, and then generates his/her interest-oriented representations via behavior aggregation. Experimental results demonstrate the effectiveness and applicability of the learned user representations.

Comment: 8 pages, under peer review
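The module described in the abstract can be pictured as a minimal, hypothetical PyTorch sketch: a learnable interest dictionary attends over a user's behavior embeddings and aggregates them into several interest-oriented representations, with a standard InfoNCE-style loss standing in for the contrastive objective. The class name MultiInterestExtractor, the tensor shapes, the softmax attention scheme, and the info_nce helper are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MultiInterestExtractor(nn.Module):
        """Hypothetical multi-interest extraction: an interest dictionary
        aggregates a user's behavior sequence into per-interest vectors."""
        def __init__(self, embed_dim: int = 64, num_interests: int = 8):
            super().__init__()
            # Interest dictionary: one learnable vector per candidate interest.
            self.interest_dict = nn.Parameter(torch.randn(num_interests, embed_dim))

        def forward(self, behavior_embs: torch.Tensor) -> torch.Tensor:
            # behavior_embs: (batch, seq_len, embed_dim) behavior/item embeddings.
            # Score each behavior against every interest in the dictionary.
            scores = torch.einsum("bld,kd->blk", behavior_embs, self.interest_dict)
            # Normalize over the behavior sequence for each interest.
            weights = F.softmax(scores, dim=1)
            # Aggregate behaviors per interest -> (batch, num_interests, embed_dim).
            return torch.einsum("blk,bld->bkd", weights, behavior_embs)

    def info_nce(anchor: torch.Tensor, positive: torch.Tensor, temperature: float = 0.1):
        # Generic InfoNCE-style contrastive loss: each anchor's positive is the
        # matching row; all other rows in the batch act as negatives.
        anchor = F.normalize(anchor, dim=-1)
        positive = F.normalize(positive, dim=-1)
        logits = anchor @ positive.t() / temperature
        labels = torch.arange(anchor.size(0), device=anchor.device)
        return F.cross_entropy(logits, labels)

    # Usage: 32 users, 50 behaviors each, 64-dim embeddings.
    module = MultiInterestExtractor()
    behaviors = torch.randn(32, 50, 64)
    interests = module(behaviors)          # shape: (32, 8, 64)
    # e.g. contrast two augmented views of the same users' pooled interests.
    loss = info_nce(interests.mean(dim=1), interests.mean(dim=1))
    print(interests.shape, loss.item())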

Details

Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....ace8e8f714df36c34d7a02109f995393
Full Text :
https://doi.org/10.48550/arxiv.2109.08865