
Bayesian Model Averaging Naive Bayes (BMA-NB): Averaging over an Exponential Number of Feature Models in Linear Time

Authors:
Ga Wu
Scott Sanner
Rodrigo Oliveira
Source:
Proceedings of the AAAI Conference on Artificial Intelligence, 29
Publication Year:
2015
Publisher:
Association for the Advancement of Artificial Intelligence (AAAI), 2015

Abstract

Naive Bayes (NB) is well-known to be a simple but effective classifier, especially when combined with feature selection. Unfortunately, feature selection methods are often greedy and thus cannot guarantee that an optimal feature set is selected. An alternative to feature selection is Bayesian model averaging (BMA), which computes a weighted average over multiple predictors; when the different predictor models correspond to different feature sets, BMA has the advantage over feature selection that its predictions tend to have lower variance on average compared to any single model. In this paper, we show for the first time that it is possible to exactly evaluate BMA over the exponentially sized powerset of NB feature models in linear time in the number of features; this yields an algorithm about as expensive to train as a single NB model with all features, yet one that provably converges to the globally optimal feature subset in the asymptotic limit of data. We evaluate this novel BMA-NB classifier on a range of datasets, showing that it never underperforms NB (as expected) and sometimes offers performance competitive with (or superior to) classifiers such as SVMs and logistic regression while taking a fraction of the time to train.
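The key idea behind evaluating BMA over an exponential number of feature subsets in linear time is that, under Naive Bayes, the sum over all 2^n subsets factorizes into a product of per-feature terms. The sketch below is my own illustration of that factorization, not the paper's code: it assumes each feature is independently included with a hypothetical prior weight `W`, and that an excluded feature contributes a factor of 1 to the class score. The brute-force sum over all subsets and the linear-time product agree exactly.

```python
import itertools

# Assumed per-feature inclusion prior (hypothetical; the paper defines its
# own prior over feature models).
W = 0.5

def bma_brute_force(prior_y, lik):
    """Exponential-time reference: sum the model-weighted NB class score
    over all 2^n feature subsets. lik[i] stands for p(x_i | y)."""
    n = len(lik)
    total = 0.0
    for subset in itertools.product([0, 1], repeat=n):
        p_model = 1.0   # prior probability of this feature subset
        score = prior_y
        for i, included in enumerate(subset):
            if included:
                p_model *= W
                score *= lik[i]      # included feature contributes p(x_i | y)
            else:
                p_model *= 1.0 - W   # excluded feature contributes factor 1
        total += p_model * score
    return total

def bma_linear(prior_y, lik):
    """Linear-time evaluation: the subset sum factorizes per feature, since
    each feature independently contributes (W * p(x_i | y) + (1 - W))."""
    score = prior_y
    for l in lik:
        score *= W * l + (1.0 - W)
    return score
```

For example, with ten features the brute-force version touches 1024 subsets while the factorized version does ten multiplications, and both return the same value up to floating-point error. Class prediction then just compares this averaged score across classes.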

Subjects

General Medicine
Details

ISSN:
2374-3468 and 2159-5399
Volume:
29
Database:
OpenAIRE
Journal:
Proceedings of the AAAI Conference on Artificial Intelligence
Accession number:
edsair.doi...........4e4a0e1794f9294fa59e12d9815d420d
Full Text:
https://doi.org/10.1609/aaai.v29i1.9634