Tensor-Based Sequential Learning via Hankel Matrix Representation for Next Item Recommendations
- Source:
- IEEE Access. 11:6357-6371
- Publication Year:
- 2023
- Publisher:
- Institute of Electrical and Electronics Engineers (IEEE), 2023.
Abstract
- Self-attentive transformer models have recently been shown to solve the next-item recommendation task very efficiently. The learned attention weights capture sequential dynamics in user behavior and generalize well. Motivated by the special structure of the learned parameter space, we ask whether it is possible to mimic it with an alternative, more lightweight approach. We develop a new tensor factorization-based model that ingrains structural knowledge about sequential data within the learning process. We demonstrate how certain properties of a self-attention network can be reproduced with our approach based on a special Hankel matrix representation. The resulting model has a shallow linear architecture and compares competitively to its neural counterpart.
- 15 pages, 6 figures, submitted to IEEE Access
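The Hankel matrix representation mentioned in the abstract can be illustrated with a minimal sketch: a Hankel matrix stacks the sliding windows of a sequence so that every anti-diagonal is constant, which encodes the sequential structure directly in the matrix layout. The function name, window size, and toy item IDs below are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def hankel_windows(sequence, window):
    """Build a Hankel matrix whose rows are sliding windows of the sequence.

    Entry (i, j) equals sequence[i + j], so every anti-diagonal is constant --
    the structural property a Hankel representation of sequential data exploits.
    (Illustrative sketch; the paper's exact construction may differ.)
    """
    seq = np.asarray(sequence)
    n_rows = len(seq) - window + 1
    return np.stack([seq[i:i + window] for i in range(n_rows)])

# A toy item-ID sequence for one user
H = hankel_windows([3, 1, 4, 1, 5, 9], window=3)
# H is 4x3; H[i, j] == seq[i + j], e.g. rows [3,1,4], [1,4,1], [4,1,5], [1,5,9]
```

For larger sequences, `scipy.linalg.hankel` builds the same structure without the explicit Python loop.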
- Subjects:
- FOS: Computer and information sciences
Computer Science - Machine Learning
Artificial Intelligence (cs.AI)
General Computer Science
Computer Science - Artificial Intelligence
Statistics - Machine Learning
General Engineering
Machine Learning (stat.ML)
General Materials Science
Electrical and Electronic Engineering
Information Retrieval (cs.IR)
Machine Learning (cs.LG)
Computer Science - Information Retrieval
Details
- ISSN:
- 2169-3536
- Volume:
- 11
- Database:
- OpenAIRE
- Journal:
- IEEE Access
- Accession number:
- edsair.doi.dedup.....859970cea5b90bacfc958f5389131e47