1. Reducing Symbiosis Bias Through Better A/B Tests of Recommendation Algorithms
- Authors: Brennan, Jennifer; Cong, Yahu; Yu, Yiwei; Lin, Lina; Peng, Yajun; Meng, Changping; Han, Ningren; Pouget-Abadie, Jean; Holtz, David
- Subjects: Statistics - Methodology
- Abstract:
It is increasingly common in digital environments to use A/B tests to compare the performance of recommendation algorithms. However, such experiments often violate the stable unit treatment value assumption (SUTVA), particularly SUTVA's "no hidden treatments" assumption, because the algorithms being compared share training data. This results in a novel form of bias, which we term "symbiosis bias," whereby the performance of each algorithm is influenced by the training data generated by its competitor. In this paper, we investigate three experimental designs aimed at mitigating symbiosis bias: cluster-randomized, data-diverted, and user-corpus co-diverted experiments. We present a theoretical model of symbiosis bias and simulate the impact of each design in dynamic recommendation environments. Our results show that while each design reduces symbiosis bias to some extent, each also introduces new challenges, such as reduced training data in data-diverted experiments. We further validate the existence of symbiosis bias using data from a large-scale A/B test conducted on a global recommender system, demonstrating that symbiosis bias affects treatment effect estimates in the field. Our findings provide actionable insights for researchers and practitioners seeking to design experiments that accurately capture algorithmic performance without the bias in treatment effect estimates introduced by shared data.
- Comments: Work-in-progress
- Published: 2023
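
The shared-data mechanism behind symbiosis bias can be reproduced in a toy simulation. The sketch below is a minimal illustration, not the paper's model or code: the item count, click rates, and the greedy/epsilon-greedy learners are all hypothetical assumptions. It contrasts a shared training log, where both arms learn from the pooled interaction data and control free-rides on the treatment arm's exploration, with a data-diverted design, where each arm learns only from its own traffic.

```python
"""Toy illustration of symbiosis bias in an A/B test of two recommenders.

A minimal sketch under assumed dynamics, not the paper's model: item
counts, click rates, and the two policies below are hypothetical.
"""
import numpy as np

rng = np.random.default_rng(0)

N_ITEMS = 50
TRUE_CTR = rng.uniform(0.01, 0.20, N_ITEMS)  # hypothetical per-item click rates


def run_ab_test(shared_data: bool, n_users: int = 20_000) -> float:
    """Return the estimated treatment effect (difference in mean click rate).

    Control is greedy on estimated CTR; treatment is epsilon-greedy, so it
    generates more informative training data. With shared_data=True both
    arms train on the pooled interaction log, letting control free-ride on
    the treatment arm's exploration (the symbiosis). With shared_data=False
    the logs are data-diverted: each arm learns only from its own traffic.
    """
    # Per-arm training logs as sufficient statistics: clicks and impressions.
    stats = {arm: {"clicks": np.zeros(N_ITEMS), "imps": np.ones(N_ITEMS)}
             for arm in ("control", "treatment")}
    rewards = {"control": [], "treatment": []}

    for _ in range(n_users):
        arm = str(rng.choice(["control", "treatment"]))

        # Assemble the training data visible to this arm's recommender.
        if shared_data:
            clicks = stats["control"]["clicks"] + stats["treatment"]["clicks"]
            imps = stats["control"]["imps"] + stats["treatment"]["imps"]
        else:
            clicks, imps = stats[arm]["clicks"], stats[arm]["imps"]
        est_ctr = clicks / imps

        # Treatment explores 10% of the time; both arms are otherwise greedy.
        if arm == "treatment" and rng.random() < 0.10:
            item = int(rng.integers(N_ITEMS))
        else:
            item = int(np.argmax(est_ctr))

        click = float(rng.random() < TRUE_CTR[item])
        stats[arm]["clicks"][item] += click
        stats[arm]["imps"][item] += 1.0
        rewards[arm].append(click)

    return float(np.mean(rewards["treatment"]) - np.mean(rewards["control"]))


print("shared-data estimate:  ", run_ab_test(shared_data=True))
print("data-diverted estimate:", run_ab_test(shared_data=False))
```

Under these assumptions the shared-data estimate is typically pulled toward zero relative to the data-diverted estimate, since control inherits the items discovered by the treatment arm's exploration; that attenuation is the signature of the symbiosis bias described in the abstract.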