
Ensemble Multi-Quantiles: Adaptively Flexible Distribution Prediction for Uncertainty Quantification

Authors :
Yan, Xing
Su, Yonghua
Ma, Wenxuan
Source :
IEEE Transactions on Pattern Analysis and Machine Intelligence; November 2023, Vol. 45 Issue: 11 p13068-13082, 15p
Publication Year :
2023

Abstract

We propose a novel, succinct, and effective approach for distribution prediction to quantify uncertainty in machine learning. It incorporates adaptively flexible distribution prediction of $\mathbb{P}(\mathbf{y}|\mathbf{X}=x)$ in regression tasks. The quantiles of this conditional distribution, at probability levels spread over the interval (0,1), are boosted by additive models that we design with intuition and interpretability in mind. We seek an adaptive balance between structural integrity and flexibility for $\mathbb{P}(\mathbf{y}|\mathbf{X}=x)$: a Gaussian assumption lacks flexibility for real data, while highly flexible approaches (e.g., estimating the quantiles separately without a distribution structure) have inevitable drawbacks and may not generalize well. The proposed ensemble multi-quantiles approach, called EMQ, is fully data-driven and can gradually depart from Gaussianity to discover the optimal conditional distribution during boosting. On extensive regression tasks from UCI datasets, we show that EMQ achieves state-of-the-art performance compared to many recent uncertainty quantification methods. Visualization results further illustrate the necessity and merits of such an ensemble model.
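To make the idea of multi-quantile distribution prediction concrete, below is a minimal sketch that fits separate gradient-boosted quantile regressors at several probability levels using scikit-learn's quantile (pinball) loss. This is an illustrative baseline only, not the authors' EMQ model; the synthetic data and the grid of probability levels are assumptions chosen for demonstration.

```python
# Minimal sketch: predict several conditional quantiles of P(y | X = x)
# with gradient boosting and the pinball (quantile) loss.
# NOTE: illustrative baseline, not the EMQ method from the paper;
# the synthetic data and quantile grid are assumptions for demonstration.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
# Heteroscedastic noise so the conditional distribution varies with x.
y = np.sin(X[:, 0]) + rng.normal(scale=0.2 + 0.2 * np.abs(X[:, 0]))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Probability levels spread over (0, 1), as in multi-quantile prediction.
levels = [0.05, 0.25, 0.50, 0.75, 0.95]
models = {}
for q in levels:
    m = GradientBoostingRegressor(loss="quantile", alpha=q,
                                  n_estimators=200, max_depth=3,
                                  learning_rate=0.05)
    m.fit(X_train, y_train)
    models[q] = m

# The predicted quantile curves together approximate P(y | X = x).
preds = {q: m.predict(X_test) for q, m in models.items()}

def pinball(y_true, y_pred, q):
    """Average pinball (quantile) loss at level q; lower is better."""
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

for q in levels:
    print(f"q={q:.2f}  pinball={pinball(y_test, preds[q], q):.4f}")
```

Fitting each quantile independently, as above, can yield crossing quantile curves and no coherent distribution structure; the abstract cites exactly this kind of drawback as motivation for EMQ's structured, adaptively flexible ensemble.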

Details

Language :
English
ISSN :
01628828
Volume :
45
Issue :
11
Database :
Supplemental Index
Journal :
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publication Type :
Periodical
Accession number :
ejs64146939
Full Text :
https://doi.org/10.1109/TPAMI.2023.3288028