
Disentangled Representation with Cross Experts Covariance Loss for Multi-Domain Recommendation

Authors:
Lin, Zhutian
Pan, Junwei
Yu, Haibin
Xiao, Xi
Wang, Ximei
Feng, Zhixiang
Wen, Shifeng
Huang, Shudong
Xiao, Lei
Jiang, Jie
Publication Year:
2024

Abstract

Multi-domain learning (MDL) has emerged as a prominent research area aimed at enhancing the quality of personalized services. The key challenge in MDL lies in striking a balance between learning commonalities across domains while preserving the distinct characteristics of each domain. However, this gives rise to a challenging dilemma. On one hand, a model needs to leverage domain-specific modules, such as experts or embeddings, to preserve the uniqueness of each domain. On the other hand, due to the long-tailed distributions observed in real-world domains, some tail domains may lack sufficient samples to fully learn their corresponding modules. Unfortunately, existing approaches have not adequately addressed this dilemma. To address this issue, we propose a novel model called Crocodile, which stands for Cross-experts Covariance Loss for Disentangled Learning. Crocodile adopts a multi-embedding paradigm to facilitate model learning and employs a Covariance Loss on these embeddings to disentangle them. This disentanglement enables the model to capture diverse user interests across domains effectively. Additionally, we introduce a novel gating mechanism to further enhance the capabilities of Crocodile. Through empirical analysis, we demonstrate that our proposed method successfully resolves these two challenges and outperforms all state-of-the-art methods on publicly available datasets. We firmly believe that the analytical perspectives and design concept of disentanglement presented in our work can pave the way for future research in the field of MDL.
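The abstract describes applying a Covariance Loss across multiple embeddings so that each embedding captures a distinct facet of user interest. The paper's exact formulation is not given here, so the following is a minimal sketch of one common way such a loss can be realized: penalizing the squared cross-covariance between every pair of embedding sets over a batch. The function name and the pairwise-averaging choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cross_embedding_covariance_loss(embeddings):
    """Illustrative disentanglement penalty (not the paper's exact loss).

    embeddings: list of K arrays, each of shape (batch, dim), one per
    embedding set (e.g., per expert). Returns the mean squared entry of
    the cross-covariance matrix, averaged over all embedding pairs.
    """
    n = embeddings[0].shape[0]
    pair_losses = []
    for i in range(len(embeddings)):
        for j in range(i + 1, len(embeddings)):
            # Center each embedding set over the batch dimension.
            a = embeddings[i] - embeddings[i].mean(axis=0)
            b = embeddings[j] - embeddings[j].mean(axis=0)
            # (dim, dim) cross-covariance between the two sets.
            cov = a.T @ b / (n - 1)
            # Penalize any nonzero covariance entry.
            pair_losses.append(np.mean(cov ** 2))
    return float(np.mean(pair_losses))
```

Driving this loss toward zero encourages the embedding sets to be pairwise decorrelated, which is one standard way to operationalize "disentangled" representations; the actual Crocodile loss may differ in normalization or in which pairs it penalizes.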

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2405.12706
Document Type:
Working Paper