
Multi independent latent component extension of naive Bayes classifier

Authors :
Alireza Hediehloo
Nima Shiri Harzevili
Sasan H. Alizadeh
Source :
Knowledge-Based Systems. 213:106646
Publication Year :
2021
Publisher :
Elsevier BV, 2021.

Abstract

The Naive Bayes (NB) classifier's ease of use, together with its remarkable performance, has led many researchers to extend its applicability to real-world domains by relaxing the assumption that features are conditionally independent given the class variable. However, in pursuing this objective, most generalizations sacrifice the model's simplicity, producing more complex classifiers that deviate substantially from the original. The Multi Independent Latent Component Naive Bayes Classifier (MILC-NB) leverages a set of latent variables to preserve the overall structure of the naive Bayes classifier while rectifying its major restriction. Each latent variable is responsible for keeping a subset of conditionally dependent features d-connected within a component, and the set of features is accordingly divided into non-overlapping partitions across components. We prove that the components are conditionally independent given the class variable, which allows us to devise novel mathematical methods that substantially reduce the complexity of classification and learning. Experiments on 34 datasets obtained from the OpenML repository indicate that MILC-NB outperforms state-of-the-art classifiers in terms of area under the ROC curve (AUC) and classification accuracy (ACC).
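The abstract's structural claim is that feature groups (components) are conditionally independent given the class, with a latent variable mediating the dependencies inside each group. The sketch below is one plausible reading of how such a factorized scoring rule could look; it is not the paper's algorithm. The function names (`log_joint`, `predict`), the parameter layout, and the assumption of discrete features with pre-estimated probability tables are all illustrative choices made here for concreteness.

```python
import numpy as np

# Minimal sketch of a latent-component naive Bayes scoring rule, under the
# following assumptions (not taken from the paper itself):
#   - features are discrete and already partitioned into K non-overlapping groups,
#   - each group k has one latent variable Z_k with a fixed number of states,
#   - within a group, features are conditionally independent given (Z_k, C),
#   - groups are conditionally independent given the class C.
# Parameters (class prior, latent priors, feature conditionals) are assumed to
# have been estimated elsewhere, e.g. by an EM-style procedure.

def log_joint(x, class_prior, groups, latent_prior, feat_cond):
    """Return an (n_classes,) array of log P(x, C=c).

    x            : 1-D int array of feature values
    class_prior  : (n_classes,) array, P(C=c)
    groups       : list of index arrays, one per latent component
    latent_prior : list of (n_classes, n_latent_k) arrays, P(Z_k=z | C=c)
    feat_cond    : list of dicts mapping feature index i to an
                   (n_classes, n_latent_k, n_values_i) array, P(X_i=v | Z_k=z, C=c)
    """
    log_p = np.log(class_prior)
    for k, idx in enumerate(groups):
        # log P(x_{S_k} | c) = log sum_z P(z | c) * prod_{i in S_k} P(x_i | z, c)
        log_comp = np.log(latent_prior[k])               # shape (n_classes, n_latent_k)
        for i in idx:
            log_comp = log_comp + np.log(feat_cond[k][i][:, :, x[i]])
        log_p = log_p + np.logaddexp.reduce(log_comp, axis=1)  # marginalize out Z_k
    return log_p

def predict(x, class_prior, groups, latent_prior, feat_cond):
    """Pick the class maximizing the joint log-probability."""
    return int(np.argmax(log_joint(x, class_prior, groups, latent_prior, feat_cond)))
```

Because each component's likelihood is computed independently and then combined by a simple sum of log-probabilities, the per-instance cost grows linearly in the number of features and latent states, which is consistent with the abstract's claim of reduced classification complexity.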

Details

ISSN :
0950-7051
Volume :
213
Database :
OpenAIRE
Journal :
Knowledge-Based Systems
Accession number :
edsair.doi...........c900cc54da9c717fab28540d6551def4