
Regression via Arbitrary Quantile Modeling

Authors :
Zhang, Faen
Fan, Xinyu
Xu, Hui
Zhou, Pengcheng
He, Yujian
Liu, Junlong
Publication Year :
2019
Publisher :
arXiv, 2019.

Abstract

In regression problems, L1 and L2 are the most commonly used loss functions; they yield predictions of the conditional median and conditional mean, respectively. However, such predictions are neither robust nor informative enough, since they capture only a single statistic of the conditional distribution rather than the whole distribution, especially on small datasets. To address this problem, we propose arbitrary quantile modeling to regularize the prediction, which achieves better performance than traditional loss functions. More specifically, we propose a new distribution regression method, Deep Distribution Regression (DDR), to estimate arbitrary quantiles of the response variable. DDR consists of two models: a Q model, which predicts the value corresponding to an arbitrary quantile, and an F model, which predicts the quantile corresponding to an arbitrary value. Furthermore, the duality between the Q and F models enables us to design a novel loss function for joint training and a dual inference mechanism. Our experiments demonstrate that our DDR-joint and DDR-disjoint methods outperform previous methods such as AdaBoost, random forest, LightGBM, and neural networks in both mean and quantile prediction.
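The abstract does not spell out DDR's training objective, but the standard way to make a model predict a given quantile is the pinball (quantile) loss, which penalizes under- and over-prediction asymmetrically. A minimal sketch of that idea, assuming the standard pinball formulation rather than the paper's exact joint loss, shows that minimizing it over a constant prediction recovers the empirical quantile:

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    # Pinball (quantile) loss for quantile level tau in (0, 1):
    # under-prediction is weighted by tau, over-prediction by (1 - tau).
    diff = y_true - y_pred
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

# Sanity check: the constant that minimizes the pinball loss at level tau
# is the empirical tau-quantile of the data.
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
tau = 0.9
candidates = np.linspace(-3.0, 3.0, 601)
losses = [pinball_loss(y, c, tau) for c in candidates]
best = candidates[np.argmin(losses)]
# best lands near np.quantile(y, 0.9) (about 1.28 for a standard normal)
```

A Q model in the DDR sense would take the quantile level tau as an additional input and minimize this loss averaged over randomly sampled tau values, so one network covers arbitrary quantiles rather than a fixed grid.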

Details

Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....cb94ecce8939dfd0930a219a813aa3ca
Full Text :
https://doi.org/10.48550/arxiv.1911.05441