BERT Goes Shopping: Comparing Distributional Models for Product Representations
- Source :
- Proceedings of The 4th Workshop on e-Commerce and NLP.
- Publication Year :
- 2021
- Publisher :
- Association for Computational Linguistics, 2021.
Abstract
- Word embeddings (e.g., word2vec) have been applied successfully to eCommerce products through *prod2vec*. Inspired by the recent performance improvements on several NLP tasks brought by contextualized embeddings, we propose to transfer BERT-like architectures to eCommerce: our model, *Prod2BERT*, is trained to generate representations of products through masked session modeling. Through extensive experiments over multiple shops, different tasks, and a range of design choices, we systematically compare the accuracy of *Prod2BERT* and *prod2vec* embeddings: while *Prod2BERT* is found to be superior in several scenarios, we highlight the importance of resources and hyperparameters in the best performing models. Finally, we provide guidelines to practitioners for training embeddings under a variety of computational and data constraints.
- Comment: Updated version. Published as a workshop paper at ECNLP 4 at ACL-IJCNLP 2021.
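- Note: the masked session modeling objective described in the abstract mirrors BERT's masked language modeling, with product IDs in place of word tokens. The following is a minimal toy sketch of that idea, not the authors' implementation; the model name (`Prod2BERTSketch`), vocabulary size, masking rate, and all identifiers below are illustrative assumptions.

```python
# Toy sketch of masked session modeling (hypothetical; not the authors' code).
# A shopping session is a sequence of product IDs; we hide ~15% of them and
# train a small Transformer encoder to predict the hidden products, analogous
# to BERT's masked language modeling over word tokens.
import torch
import torch.nn as nn

VOCAB = 1000        # number of distinct products (assumed toy value)
MASK_ID = VOCAB     # reserved ID standing in for a masked product
PAD_ID = VOCAB + 1  # reserved padding ID
MAX_LEN = 20        # maximum session length (assumed)

class Prod2BERTSketch(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.tok = nn.Embedding(VOCAB + 2, d_model, padding_idx=PAD_ID)
        self.pos = nn.Embedding(MAX_LEN, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, VOCAB)  # scores over real products

    def forward(self, ids):                      # ids: (batch, seq_len)
        positions = torch.arange(ids.size(1), device=ids.device)
        hidden = self.encoder(self.tok(ids) + self.pos(positions))
        return self.head(hidden)                 # (batch, seq_len, VOCAB)

def mask_sessions(sessions, p=0.15):
    """Mask ~p of the product IDs; unmasked positions get label -100."""
    ids = sessions.clone()
    masked = (torch.rand_like(ids, dtype=torch.float) < p) & (ids != PAD_ID)
    labels = torch.where(masked, ids, torch.full_like(ids, -100))
    ids[masked] = MASK_ID
    return ids, labels

# One illustrative training step on a random batch of fake "sessions".
model = Prod2BERTSketch()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
sessions = torch.randint(0, VOCAB, (8, MAX_LEN))
inputs, labels = mask_sessions(sessions)
logits = model(inputs)
loss = nn.functional.cross_entropy(
    logits.view(-1, VOCAB), labels.view(-1), ignore_index=-100)
opt.zero_grad()
loss.backward()
opt.step()
```

- The *prod2vec* baseline compared in the paper amounts to running word2vec over the same session sequences, treating each session as a "sentence" of product IDs; in practice this can be approximated with an off-the-shelf implementation such as gensim's Word2Vec.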
- Subjects :
- FOS: Computer and information sciences
Computer Science - Computation and Language
Computation and Language (cs.CL)
Computer Science - Information Retrieval
Information Retrieval (cs.IR)
Computer science
Machine learning
Artificial intelligence
Hyperparameter
Word2vec
Session (web analytics)
Details
- Database :
- OpenAIRE
- Journal :
- Proceedings of The 4th Workshop on e-Commerce and NLP
- Accession number :
- edsair.doi.dedup.....1833e1395e319f2a2f930b9425e15d2b