1. Long tail service recommendation based on cross-view and contrastive learning.
- Author
- Yu, Dongjin, Yu, Ting, Wang, Dongjing, and Wang, Sixuan
- Subjects
- *IMPLICIT learning, *WEB services, *QUALITY of service, *SERENDIPITY
- Abstract
Choosing appropriate Web services to create new applications plays a significant role in service-based development. General service recommendation approaches tend to suggest popular, frequently used services. However, users are becoming increasingly interested in long tail services, which are novel but receive less attention. Unfortunately, few existing service recommendation methods explicitly take such long tail services into consideration, leading to poor diversity and serendipity in recommendation results. Compared with general service recommendation, the major challenges of recommending long tail services are: (1) sparser historical invocation records, and (2) lower-quality textual service descriptions. To address these challenges, this paper puts forward a method for Long tail Service Recommendation based on Cross-view and Contrastive learning (LSRCC). More specifically, we propose a text view that employs a recursive neural network, a multilayer perceptron and an attention mechanism to learn the semantic representations of applications and services more precisely. In addition, we present an invocation view that employs LightGCN (Light Graph Convolution Network) and local-level contrastive learning to capture the implicit connections among applications, services and their tags. Finally, we introduce global-level contrastive learning to further integrate the above two views for long tail service recommendation. Extensive experiments on real-world datasets demonstrate that LSRCC outperforms state-of-the-art baselines in terms of diversity and tail coverage, with satisfactory precision and recall, in long tail service recommendation.
• A long tail service recommendation method is proposed for application creation.
• Contrastive learning is applied globally and locally to capture implicit connections.
• Two views are used to mitigate data sparsity and low-quality descriptions.
• Extensive experiments on real datasets show its superiority over baselines. [ABSTRACT FROM AUTHOR]
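The global-level contrastive objective described in the abstract, which pulls a service's text-view and invocation-view embeddings together, is commonly realized as an InfoNCE loss between the two views. The sketch below is an illustrative reconstruction under that assumption, not the authors' code; `info_nce` and the toy embeddings are hypothetical:

```python
import numpy as np

def info_nce(view_a, view_b, temperature=0.2):
    """Cross-view InfoNCE: each item's view-A embedding should be closer
    to its own view-B embedding than to other items' (in-batch negatives)."""
    # L2-normalize so dot products become cosine similarities
    a = view_a / np.linalg.norm(view_a, axis=1, keepdims=True)
    b = view_b / np.linalg.norm(view_b, axis=1, keepdims=True)
    logits = a @ b.T / temperature                    # (n, n) similarity matrix
    # numerically stable log-softmax; diagonal entries are the positive pairs
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
text_emb = rng.normal(size=(8, 16))                   # text-view embeddings
inv_emb = text_emb + 0.05 * rng.normal(size=(8, 16))  # well-aligned invocation view
loss_aligned = info_nce(text_emb, inv_emb)
loss_random = info_nce(text_emb, rng.normal(size=(8, 16)))
```

When the two views agree on each item, the diagonal dominates the similarity matrix and the loss is small; for unrelated views it approaches log n, which is what drives the two views toward a shared representation.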
- Published
- 2024