1. Tighter Risk Bounds for Mixtures of Experts
- Authors
Wissam Akretche, Frédéric LeBlanc, and Mario Marchand
- Subjects
Computer Science - Machine Learning, Computer Science - Cryptography and Security, Statistics - Machine Learning
- Abstract
In this work, we provide upper bounds on the risk of mixtures of experts by imposing local differential privacy (LDP) on their gating mechanism. These theoretical guarantees are tailored to mixtures of experts that utilize the one-out-of-$n$ gating mechanism, as opposed to the conventional $n$-out-of-$n$ mechanism. The bounds exhibit logarithmic dependence on the number of experts, and encapsulate the dependence on the gating mechanism in the LDP parameter, making them significantly tighter than existing bounds, under reasonable conditions. Experimental results support our theory, demonstrating that our approach enhances the generalization ability of mixtures of experts and validating the feasibility of imposing LDP on the gating mechanism.
- Published
- 2024
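
The abstract above does not spell out how LDP is enforced on the gate. Below is a minimal sketch of one way to do it, assuming an argmax (one-out-of-$n$) gate whose selected index is privatized with $k$-ary randomized response, a standard $\varepsilon$-LDP mechanism for categorical outputs: with probability $e^{\varepsilon}/(e^{\varepsilon}+n-1)$ the true expert index is reported, otherwise a different index is reported uniformly at random. The function names (`ldp_one_out_of_n_gate`, `moe_predict`) and the randomized-response choice are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def ldp_one_out_of_n_gate(gate_scores, epsilon, rng=None):
    """Pick a single expert (one-out-of-n gating) and privatize the choice
    with k-ary randomized response, a standard epsilon-LDP mechanism for
    categorical outputs. Illustrative sketch only, not the paper's construction."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(gate_scores)
    chosen = int(np.argmax(gate_scores))                # non-private gate decision
    p_keep = np.exp(epsilon) / (np.exp(epsilon) + n - 1)
    if rng.random() < p_keep:
        return chosen                                   # report the true expert
    others = [i for i in range(n) if i != chosen]       # otherwise a uniform other expert
    return int(rng.choice(others))

def moe_predict(x, experts, gate, epsilon):
    """Route the input through the privatized one-out-of-n gate and
    return the prediction of the single selected expert."""
    idx = ldp_one_out_of_n_gate(gate(x), epsilon)
    return experts[idx](x)

# Toy usage: three constant experts and a random linear gate (hypothetical setup).
rng = np.random.default_rng(0)
experts = [lambda x, c=c: c for c in (0.0, 1.0, 2.0)]
weights = rng.normal(size=(4, 3))
gate = lambda x: x @ weights
print(moe_predict(np.ones(4), experts, gate, epsilon=1.0))
```

In this sketch, a smaller $\varepsilon$ makes the routing noisier; the abstract indicates that the paper's risk bounds capture the gating mechanism's contribution through this LDP parameter while depending only logarithmically on the number of experts.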