1. Learning with optimal interpolation norms
- Author
- Andrew McDonald, Massimiliano Pontil, Charles A. Micchelli, and Patrick L. Combettes
- Subjects
- Convex analysis, Mathematical optimization, Applied Mathematics, Linear map, Iterated function, Theory of computation, Feature (machine learning), Minimization, Convex function, Interpolation, Mathematics
- Abstract
- We analyze a class of norms defined via an optimal interpolation problem involving the composition of norms and a linear operator. This construction, known as infimal postcomposition in convex analysis, is shown to encompass various norms that have been used as regularizers in machine learning, signal processing, and statistics. In particular, these include the latent group lasso, the overlapping group lasso, and certain norms used for learning tensors. We establish basic properties of this class of norms and derive the corresponding dual norms. The extension to more general classes of convex functions is also discussed. A stochastic block-coordinate version of the Douglas-Rachford algorithm is devised to solve minimization problems involving these regularizers. A prominent feature of the algorithm is that its iterates converge to a solution even in the case of nonsmooth losses and random block updates. Finally, we present numerical experiments on problems employing the latent group lasso penalty.
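The record itself contains no formulas or code. For orientation, here is a hedged sketch of the construction the abstract describes, written in generic convex-analysis notation (the symbols L, \lambda, and v_g below are placeholders, not necessarily the paper's own notation):

```latex
% Optimal interpolation norm: the infimal postcomposition of a norm \lambda
% on R^m by a linear operator L : R^m -> R^d (assumed surjective), i.e. the
% smallest value of \lambda over all preimages of x under L:
\[
  \|x\| \;=\; (L \triangleright \lambda)(x)
        \;=\; \inf\{\, \lambda(v) : v \in \mathbb{R}^m,\ Lv = x \,\}.
\]
% Its dual norm is the dual norm of \lambda precomposed with the adjoint:
\[
  \|u\|_{*} \;=\; \lambda_{*}(L^{*}u).
\]
% Example instance: the latent group lasso over groups g \subseteq \{1,\dots,d\},
% where each v_g is supported on group g and L sums the group copies:
\[
  \|w\|_{\mathrm{LGL}}
  \;=\; \inf\Bigl\{\, \sum_{g} \|v_g\|_2 \;:\;
        \operatorname{supp}(v_g) \subseteq g,\ \sum_{g} v_g = w \Bigr\}.
\]
```

On the algorithmic side, the abstract mentions a stochastic block-coordinate Douglas-Rachford method. The following is a minimal runnable sketch of the underlying idea in its plain (non-stochastic, full-update) form, applied to a latent-group-lasso regularized least-squares problem: in the lifted variable v the penalty is block-separable, so its proximity operator reduces to blockwise soft-thresholding. All names (`A`, `b`, `groups`, `tau`, `gamma`) and the specific formulation are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: plain Douglas-Rachford splitting for
#   min_v (1/2)||A L v - b||^2 + tau * sum_g ||v_g||_2,
# where L stacks one copy v_g per (possibly overlapping) group and x = L v.
import numpy as np

def make_lifting(groups, d):
    """Build L mapping lifted coordinates (one block per group) to R^d."""
    m = sum(len(g) for g in groups)
    L = np.zeros((d, m))
    col = 0
    for g in groups:
        for j in g:
            L[j, col] = 1.0
            col += 1
    return L

def prox_group_l2(v, slices, t):
    """Blockwise soft-thresholding: prox of t * sum_g ||v_g||_2."""
    out = v.copy()
    for sl in slices:
        nrm = np.linalg.norm(v[sl])
        out[sl] = 0.0 if nrm <= t else (1.0 - t / nrm) * v[sl]
    return out

def douglas_rachford_lgl(A, b, groups, tau, gamma=1.0, iters=500):
    d = A.shape[1]
    L = make_lifting(groups, d)
    M = A @ L                                  # data operator in lifted space
    m = M.shape[1]
    H = np.eye(m) + gamma * (M.T @ M)          # prox of the quadratic = linear solve
    slices, col = [], 0
    for g in groups:
        slices.append(slice(col, col + len(g)))
        col += len(g)
    z = np.zeros(m)
    for _ in range(iters):
        v = np.linalg.solve(H, z + gamma * (M.T @ b))   # prox of smooth term
        w = prox_group_l2(2.0 * v - z, slices, gamma * tau)
        z = z + w - v                                   # Douglas-Rachford update
    return L @ v                                        # recover x = L v

# Toy usage with overlapping groups.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 6))
x_true = np.array([1.0, 2.0, 0.0, 0.0, 0.0, 0.0])
b = A @ x_true
groups = [[0, 1], [1, 2, 3], [4, 5]]
x_hat = douglas_rachford_lgl(A, b, groups, tau=0.5)
```

The stochastic block-coordinate variant studied in the paper updates only a random subset of the coordinate blocks at each iteration; the sketch above performs the full update at every step.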
- Published
- 2018