Rethinking Large-scale Pre-ranking System: Entire-chain Cross-domain Models
- Authors
Jinbo Song, Ruoran Huang, Xinyang Wang, Wei Huang, Qian Yu, Mingming Chen, Yafei Yao, Chaosheng Fan, Changping Peng, Zhangang Lin, Jinghe Hu, and Jingping Shao
- Subjects
Computer Science - Information Retrieval, Computer Science - Artificial Intelligence, Computer Science - Machine Learning
- Abstract
Industrial systems such as recommender systems and online advertising have been widely equipped with multi-stage architectures, divided into several cascaded modules including matching, pre-ranking, ranking, and re-ranking. As a critical bridge between matching and ranking, existing pre-ranking approaches mainly suffer from the sample selection bias (SSB) problem because they ignore entire-chain data dependence, resulting in sub-optimal performance. In this paper, we rethink the pre-ranking system from the perspective of the entire sample space and propose Entire-chain Cross-domain Models (ECM), which leverage samples from all cascaded stages to effectively alleviate the SSB problem. In addition, we design a fine-grained neural structure named ECMM to further improve pre-ranking accuracy. Specifically, we propose a cross-domain multi-tower neural network to predict the result of each stage, and introduce a sub-network routing strategy with $L_0$ regularization to reduce computational cost. Evaluations on real-world large-scale traffic logs demonstrate that our pre-ranking models outperform SOTA methods while keeping time consumption within an acceptable level, achieving a better trade-off between efficiency and effectiveness.
Comment: 5 pages, 2 figures
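The abstract's "$L_0$ regularization" for sub-network routing is typically realized with a differentiable relaxation, since the true $L_0$ norm is non-differentiable. Below is a minimal sketch (not the authors' code) of one common approach, the hard-concrete gate of Louizos et al.: each tower of a hypothetical multi-tower pre-ranker owns a learnable `log_alpha`; a sampled gate in $[0, 1]$ decides whether that tower runs, and an expected-$L_0$ penalty pushes unneeded gates toward zero. The tower names and `log_alpha` values are illustrative assumptions.

```python
import math
import random

# Standard hard-concrete constants (temperature and stretch interval).
BETA, GAMMA, ZETA = 2.0 / 3.0, -0.1, 1.1

def sample_gate(log_alpha, u):
    """Sample a hard-concrete gate z in [0, 1] from uniform noise u in (0, 1)."""
    s = 1.0 / (1.0 + math.exp(-(math.log(u) - math.log(1.0 - u) + log_alpha) / BETA))
    s_bar = s * (ZETA - GAMMA) + GAMMA   # stretch to (GAMMA, ZETA)
    return min(1.0, max(0.0, s_bar))     # clamp back into [0, 1]

def expected_l0(log_alpha):
    """P(gate != 0): the differentiable surrogate for the L0 penalty."""
    return 1.0 / (1.0 + math.exp(-(log_alpha - BETA * math.log(-GAMMA / ZETA))))

# Hypothetical towers, one per cascade stage, each with its own gate parameter.
towers = {"matching": 2.0, "pre_ranking": 0.0, "ranking": -4.0}

random.seed(0)
penalty = sum(expected_l0(a) for a in towers.values())
active = {name: sample_gate(a, random.random()) for name, a in towers.items()}
print(f"expected L0 penalty: {penalty:.3f}")
print({k: round(v, 3) for k, v in active.items()})
```

During training, `penalty` would be added to the task loss so that towers whose gates collapse to zero can be skipped at serving time, which is how such routing reduces computational cost.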
- Published
2023