
On $\ell_p$-hyperparameter Learning via Bilevel Nonsmooth Optimization

Authors:
Okuno, Takayuki
Takeda, Akiko
Kawana, Akihiro
Watanabe, Motokazu
Source:
Journal of Machine Learning Research 22 (2021) 1-47
Publication Year:
2018

Abstract

We propose a bilevel optimization strategy for selecting the best hyperparameter value for the nonsmooth $\ell_p$ regularizer with $0<p\le 1$. The bilevel optimization problem of interest has a nonsmooth, possibly nonconvex, $\ell_p$-regularized problem as its lower-level problem. Despite the recent popularity of the nonconvex $\ell_p$-regularizer and the usefulness of bilevel optimization for hyperparameter selection, algorithms for such bilevel problems have not been studied because of the difficulty posed by the $\ell_p$-regularizer. Our contribution is the first algorithm equipped with a theoretical guarantee for finding the best hyperparameter of $\ell_p$-regularized supervised learning problems. Specifically, we propose a smoothing-type algorithm for the above-mentioned bilevel optimization problems and provide a theoretical convergence guarantee for the algorithm. Since no optimality conditions were previously known for such bilevel optimization problems, we derive new necessary optimality conditions, called the SB-KKT conditions, and show that a sequence generated by the proposed algorithm accumulates at a point satisfying the SB-KKT conditions under mild assumptions. The proposed algorithm is simple and scalable, as our numerical comparison with Bayesian optimization and grid search indicates.
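For orientation, a minimal sketch of the kind of bilevel problem the abstract refers to is given below; the symbols $f_{\mathrm{val}}$, $f_{\mathrm{tr}}$, $\lambda$, and $w$ are illustrative placeholders rather than the paper's notation:

$$
\min_{\lambda \ge 0}\; f_{\mathrm{val}}\bigl(w(\lambda)\bigr)
\quad \text{s.t.} \quad
w(\lambda) \in \operatorname*{arg\,min}_{w}\; f_{\mathrm{tr}}(w) + \lambda \sum_{i} |w_i|^{p},
\qquad 0 < p \le 1.
$$

Here the upper level selects the regularization hyperparameter $\lambda$ by minimizing a validation loss evaluated at a lower-level solution, while the lower level is the nonsmooth, possibly nonconvex $\ell_p$-regularized training problem; the smoothing-type algorithm and the SB-KKT conditions mentioned in the abstract are developed for problems of this form.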

Details

Database:
arXiv
Journal:
Journal of Machine Learning Research 22 (2021) 1-47
Publication Type:
Report
Accession number:
edsarx.1806.01520
Document Type:
Working Paper