Learning to Co-Embed Queries and Documents.
- Source :
- Electronics (2079-9292); Nov2022, Vol. 11 Issue 22, p3694, 25p
- Publication Year :
- 2022
Abstract
- Learning to Rank (L2R) methods, which use machine learning techniques to solve ranking problems, have been widely studied in the field of information retrieval. Existing methods usually concatenate query and document features as training input, without an explicit understanding of the relevance between queries and documents, especially in pairwise-based ranking approaches. Thus, it is an interesting question whether we can devise an algorithm that effectively describes the relation between queries and documents to learn a better ranking model without incurring huge parameter costs. In this paper, we present a Gaussian Embedding model for Ranking (GERank), an architecture for co-embedding queries and documents, such that each query or document is represented by a Gaussian distribution with mean and variance. Our GERank optimizes an energy-based loss based on the pairwise ranking framework. Additionally, the KL-divergence is utilized to measure the relevance between queries and documents. Experimental results on two LETOR datasets and one TREC dataset demonstrate that our model obtains a remarkable improvement in ranking performance compared with state-of-the-art retrieval models. [ABSTRACT FROM AUTHOR]
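- The abstract's core scoring idea, measuring query-document relevance via KL-divergence between Gaussian embeddings, can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the embedding values are made up, diagonal covariances are assumed, and the function name `kl_diagonal_gaussians` is hypothetical.

```python
import numpy as np

def kl_diagonal_gaussians(mu0, var0, mu1, var1):
    """Closed-form KL(N(mu0, diag(var0)) || N(mu1, diag(var1)))
    for Gaussians with diagonal covariance."""
    k = mu0.shape[0]  # embedding dimensionality
    return 0.5 * (
        np.sum(var0 / var1)                 # trace term
        + np.sum((mu1 - mu0) ** 2 / var1)   # mean-difference term
        - k
        + np.sum(np.log(var1) - np.log(var0))  # log-determinant term
    )

# Hypothetical query and document embeddings (mean + variance each).
q_mu, q_var = np.array([0.2, -0.1, 0.4]), np.array([0.5, 0.3, 0.8])
d_mu, d_var = np.array([0.1, 0.0, 0.5]), np.array([0.6, 0.4, 0.7])

# A lower KL-divergence indicates higher relevance, so the negated
# divergence can serve as a ranking score.
score = -kl_diagonal_gaussians(q_mu, q_var, d_mu, d_var)
```

In a pairwise ranking setup such as the one the abstract describes, scores like this for a relevant and a non-relevant document would then feed an energy-based loss that pushes the relevant document's score above the non-relevant one's.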
- Subjects :
- GAUSSIAN distribution
- INFORMATION retrieval
- MACHINE learning
- PROBLEM solving
Details
- Language :
- English
- ISSN :
- 20799292
- Volume :
- 11
- Issue :
- 22
- Database :
- Complementary Index
- Journal :
- Electronics (2079-9292)
- Publication Type :
- Academic Journal
- Accession number :
- 160432144
- Full Text :
- https://doi.org/10.3390/electronics11223694