TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism.

Authors :
Zhang, Xianrong
Li, Ran
Wang, Simin
Li, Xintong
Sun, Zhe
Source :
Computational Intelligence & Neuroscience; 8/29/2022, p1-17, 17p
Publication Year :
2022

Abstract

The click-through rate (CTR) prediction task estimates the probability that a user will click on a recommended item, which is extremely important in recommender systems. The recently proposed deep factorization machine (DeepFM) algorithm incorporates a factorization machine (FM) to learn not only low-order features but also higher-order feature interactions. However, DeepFM lacks diverse user representations and does not consider textual features. In view of this, we propose a text-attention FM (TAFM) based on the DeepFM algorithm. First, the attention mechanism in the TAFM algorithm addresses the diverse representations of users and items and mines the features that are most interesting to users. Second, the TAFM model fully learns text features through its text component, text attention component, and N-gram text feature extraction component, which together explore potential user preferences and the diversity among user interests. In addition, the convolutional autoencoder in the TAFM learns higher-level features, making the higher-order feature mining process more comprehensive. On the public dataset, the best-performing existing models are the deep cross network (DCN), DeepFM, and the product-based neural network (PNN), whose AUC scores hover between 0.698 and 0.699. The AUC score of our model is 0.730, at least 3% higher than that of the existing models, and its accuracy is at least 0.1 percentage points higher than that of the existing models. [ABSTRACT FROM AUTHOR]
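The abstract does not reproduce the TAFM equations, but the central idea it describes, reweighting FM pairwise feature interactions with an attention network, can be sketched as follows. This is a minimal, hypothetical PyTorch illustration in the style of attentional factorization machines, not the authors' published implementation; all names (AttentiveFM, num_fields, embed_dim, attn_dim) are assumptions introduced for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveFM(nn.Module):
    """Sketch: FM pairwise interactions reweighted by attention.

    Illustrative reconstruction only, not the authors' TAFM code.
    """

    def __init__(self, num_features: int, num_fields: int,
                 embed_dim: int = 16, attn_dim: int = 32):
        super().__init__()
        self.embedding = nn.Embedding(num_features, embed_dim)  # latent factors
        self.linear = nn.Embedding(num_features, 1)             # first-order weights
        self.bias = nn.Parameter(torch.zeros(1))
        # Attention network scoring each pairwise interaction vector.
        self.attn = nn.Sequential(
            nn.Linear(embed_dim, attn_dim),
            nn.ReLU(),
            nn.Linear(attn_dim, 1),
        )
        self.num_fields = num_fields

    def forward(self, feature_ids: torch.Tensor) -> torch.Tensor:
        # feature_ids: (batch, num_fields) ids, one active feature per field.
        emb = self.embedding(feature_ids)                  # (B, F, D)
        # Enumerate all distinct field pairs (i < j).
        i, j = torch.triu_indices(self.num_fields, self.num_fields, offset=1)
        pair = emb[:, i, :] * emb[:, j, :]                 # (B, P, D) element-wise products
        # Attention over pairs: which interactions matter most for this sample.
        scores = F.softmax(self.attn(pair), dim=1)         # (B, P, 1)
        second_order = (scores * pair).sum(dim=(1, 2))     # weighted sum -> (B,)
        first_order = self.linear(feature_ids).sum(dim=(1, 2))
        return torch.sigmoid(self.bias + first_order + second_order)

# Usage: score a batch of 4 samples with 5 categorical fields.
model = AttentiveFM(num_features=1000, num_fields=5)
ids = torch.randint(0, 1000, (4, 5))
print(model(ids))  # click probabilities in (0, 1)

In the full TAFM described by the abstract, such an attention component would sit alongside DeepFM's deep component, the text and N-gram text feature extractors, and the convolutional autoencoder.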

Details

Language :
English
ISSN :
1687-5265
Database :
Complementary Index
Journal :
Computational Intelligence & Neuroscience
Publication Type :
Academic Journal
Accession number :
158785029
Full Text :
https://doi.org/10.1155/2022/1775496