1. Distributed Stochastic Bilevel Optimization: Improved Complexity and Heterogeneity Analysis
- Authors
Niu, Youcheng; Xu, Jinming; Sun, Ying; Huang, Yan; Chai, Li
- Subjects
Mathematics - Optimization and Control; Electrical Engineering and Systems Science - Systems and Control
- Abstract
This paper considers solving a class of nonconvex-strongly-convex distributed stochastic bilevel optimization (DSBO) problems with personalized inner-level objectives. Most existing algorithms require computational loops for hypergradient estimation, leading to computational inefficiency. Moreover, the impact of data heterogeneity on convergence in bilevel problems has not yet been explicitly characterized. To address these issues, we propose LoPA, a loopless personalized distributed algorithm that leverages a tracking mechanism for the iterative approximation of inner-level solutions and Hessian-inverse matrices without relying on extra computation loops. Our theoretical analysis explicitly characterizes the heterogeneity across nodes, denoted by $b$, and establishes a sublinear rate of $\mathcal{O}\big( \frac{1}{(1-\rho)^2 K} + \frac{b^{2/3}}{(1-\rho)^{2/3} K^{2/3}} + \frac{1}{\sqrt{K}} \big( \sigma_{\rm p} + \frac{1}{\sqrt{m}} \sigma_{\rm c} \big) \big)$ without requiring the boundedness of local hypergradients, where $\sigma_{\rm p}$ and $\sigma_{\rm c}$ denote the gradient sampling variances associated with the inner- and outer-level variables, respectively. We also develop a variant of LoPA based on gradient tracking to eliminate the impact of data heterogeneity, yielding an improved rate of $\mathcal{O}\big( \frac{1}{(1-\rho)^4 K} + \frac{1}{\sqrt{K}} \big( \sigma_{\rm p} + \frac{1}{\sqrt{m}} \sigma_{\rm c} \big) \big)$. The computational complexity of LoPA is $\mathcal{O}(\epsilon^{-2})$ to reach an $\epsilon$-stationary point, matching its communication complexity thanks to the loopless structure, which outperforms existing counterparts for DSBO. Numerical experiments validate the effectiveness of the proposed algorithm.
Comment: 50 pages, 22 figures
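To illustrate the "loopless" idea the abstract describes, the sketch below shows a generic single-node, single-loop stochastic bilevel update in which the inner solution and a Hessian-inverse-vector product are tracked across outer iterations instead of being recomputed in an inner loop. This is not the paper's LoPA algorithm (which is distributed and personalized); the toy quadratic problem, step sizes, and noise model are assumptions made only for a runnable example.

```python
# Minimal single-loop ("loopless") stochastic bilevel sketch -- NOT LoPA itself.
# Toy problem (assumed for illustration):
#   inner  g(x, y) = 0.5 * y'Ay - x'y   =>  y*(x) = A^{-1} x,  grad2_yy g = A,  grad2_xy g = -I
#   outer  f(x, y) = 0.5 * ||y - 1||^2 + 0.5 * ||x||^2
import numpy as np

rng = np.random.default_rng(0)
d = 5                                    # dimension of inner and outer variables
A = 2.0 * np.eye(d)                      # strongly convex inner-level Hessian
K, alpha, beta, gamma, sigma = 2000, 0.05, 0.1, 0.1, 0.01
noise = lambda: sigma * rng.standard_normal(d)   # stochastic gradient noise

x = np.zeros(d)                          # outer-level variable
y = np.zeros(d)                          # running estimate of the inner solution y*(x)
z = np.zeros(d)                          # running estimate of [grad2_yy g]^{-1} grad_y f

for k in range(K):
    # One stochastic inner-gradient step: y tracks y*(x) across outer iterations,
    # so no inner computation loop is needed per outer iteration.
    y -= beta * (A @ y - x + noise())
    # Track the Hessian-inverse-vector product without forming any matrix inverse.
    grad_y_f = (y - 1.0) + noise()
    z -= gamma * (A @ z - grad_y_f + noise())
    # Hypergradient estimate: grad F(x) = grad_x f - grad2_xy g @ z, with grad2_xy g = -I here.
    hypergrad = x + z + noise()
    x -= alpha * hypergrad

print("final hypergradient-estimate norm:", np.linalg.norm(hypergrad))
```

Each outer iteration costs only a constant number of stochastic gradient and Hessian-vector evaluations, which is why such a structure yields matching computational and communication complexities in the distributed setting the abstract refers to.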
- Published
2023