One Dependence Augmented Naive Bayes.
- Source :
- Advanced Data Mining & Applications (9783540278948); 2005, p186-194, 9p
- Publication Year :
- 2005
Abstract
- In real-world data mining applications, an accurate ranking is as important as an accurate classification. Naive Bayes has been widely used in data mining as a simple and effective classification and ranking algorithm. Since its conditional independence assumption is rarely true, numerous algorithms have been proposed to improve naive Bayes, for example SBC [1] and TAN [2]. Indeed, experimental results show that SBC and TAN achieve a significant improvement in terms of classification accuracy. Unfortunately, our experiments also show that SBC and TAN perform even worse than naive Bayes in ranking, as measured by AUC [3,4] (the area under the Receiver Operating Characteristic curve). This raises the question of whether naive Bayes can be improved to deliver both accurate classification and accurate ranking. In this paper, responding to this question, we present a new learning algorithm called One Dependence Augmented Naive Bayes (ODANB). Our motivation is to develop a new algorithm that improves naive Bayes' performance not only on classification, measured by accuracy, but also on ranking, measured by AUC. We experimentally tested our algorithm on all 36 UCI datasets recommended by Weka [5] and compared it to naive Bayes, SBC and TAN. The experimental results show that our algorithm significantly outperforms all the other algorithms in yielding accurate rankings, while also slightly outperforming them in classification accuracy. [ABSTRACT FROM AUTHOR]
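The abstract's central contrast is between classification accuracy and ranking quality measured by AUC. The following minimal Python sketch is illustrative only (the dataset choice and the scikit-learn API are assumptions, not the paper's Weka-based setup); it shows how the two metrics are computed for a plain naive Bayes classifier under cross-validation:

```python
# Illustrative sketch: accuracy vs. AUC for a naive Bayes classifier.
# Dataset and library are stand-ins, not the paper's 36-dataset Weka setup.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)
nb = GaussianNB()

# Classification accuracy: fraction of correctly predicted class labels.
acc = cross_val_score(nb, X, y, cv=10, scoring="accuracy").mean()

# AUC: probability that a randomly chosen positive instance is ranked
# above a randomly chosen negative one by the model's class probabilities.
auc = cross_val_score(nb, X, y, cv=10, scoring="roc_auc").mean()

print(f"accuracy = {acc:.3f}, AUC = {auc:.3f}")
```

As the abstract notes for SBC and TAN, improving one of these metrics does not guarantee improving the other; closing that gap is the goal ODANB is designed to address.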
Details
- Language :
- English
- ISBNs :
- 9783540278948
- Database :
- Complementary Index
- Journal :
- Advanced Data Mining & Applications (9783540278948)
- Publication Type :
- Book
- Accession number :
- 32864173
- Full Text :
- https://doi.org/10.1007/11527503_22