
Coverage-Based Designs Improve Sample Mining and Hyperparameter Optimization.

Authors :
Muniraju, Gowtham
Kailkhura, Bhavya
Thiagarajan, Jayaraman J.
Bremer, Peer-Timo
Tepedelenlioglu, Cihan
Spanias, Andreas
Source :
IEEE Transactions on Neural Networks & Learning Systems; Mar 2021, Vol. 32, Issue 3, p1241-1253, 13p
Publication Year :
2021

Abstract

Sampling one or more effective solutions from large search spaces is a recurring idea in machine learning (ML), and sequential optimization has become a popular solution. Typical examples include data summarization, sample mining for predictive modeling, and hyperparameter optimization. Existing solutions attempt to adaptively trade off between global exploration and local exploitation, and the initial exploratory sample is critical to their success. While discrepancy-based samples have become the de facto approach for exploration, results from computer graphics suggest that coverage-based designs, e.g., Poisson disk sampling, can be a superior alternative. In order to successfully adapt coverage-based sample designs, originally developed for 2-D image analysis, to ML applications, we propose fundamental advances: constructing a parameterized family of designs with provably improved coverage characteristics and developing algorithms for effective sample synthesis. Using experiments in sample mining and hyperparameter optimization for supervised learning, we show that our approach consistently outperforms existing exploratory sampling methods in both blind exploration and sequential search with Bayesian optimization. [ABSTRACT FROM AUTHOR]
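For context, the coverage-based baseline the abstract refers to, Poisson disk sampling, enforces a minimum spacing between sample points so the design covers the search space without clustering. The sketch below is a minimal, illustrative dart-throwing sampler in the unit hypercube, not the parameterized family or synthesis algorithms proposed in the paper; the function name and parameters are assumptions chosen for illustration.

```python
import numpy as np

def poisson_disk_sample(n_points, dim, r, max_iter=100_000, seed=0):
    """Naive dart-throwing Poisson disk sampler in the unit hypercube.

    A random candidate is accepted only if it lies at least `r` away from
    every previously accepted point, which yields a guaranteed minimum
    spacing (the coverage property contrasted with discrepancy-based
    designs). Illustrative baseline only, not the paper's method.
    """
    rng = np.random.default_rng(seed)
    points = []
    for _ in range(max_iter):
        if len(points) >= n_points:
            break
        candidate = rng.random(dim)
        if all(np.linalg.norm(candidate - p) >= r for p in points):
            points.append(candidate)
    return np.array(points)

# Example: a 2-D exploratory design of up to 50 points with minimum spacing 0.1,
# which could serve as the initial stage of a sequential (e.g., Bayesian) search.
design = poisson_disk_sample(n_points=50, dim=2, r=0.1)
print(design.shape)
```

In a hyperparameter-optimization setting, such a design would be mapped from the unit hypercube to the actual parameter ranges before evaluating the model at each point.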

Details

Language :
English
ISSN :
2162-237X
Volume :
32
Issue :
3
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
149122068
Full Text :
https://doi.org/10.1109/TNNLS.2020.2982936