## Sisterhood of Classifiers: A Comparative Study of Naive Bayes and Noisy-or Networks

Citations: 1 (0 self)

### BibTeX

```bibtex
@MISC{Chen_sisterhoodof,
  author = {David Chen},
  title  = {Sisterhood of Classifiers: A Comparative Study of Naive Bayes and Noisy-or Networks},
  year   = {}
}
```


### Abstract

Classification is a task central to many machine learning problems. In this paper we examine two Bayesian network classifiers, the naive Bayes and the noisy-or models. They are of particular interest because of their simple structures. We compare them along two dimensions: expressive power and ability to learn. As it turns out, naive Bayes, noisy-or, and logistic regression classifiers all have equivalent expressiveness. We show mathematical derivations of how to transform a classifier in one model into the other two. These classifiers differ in their ability to learn, however. We conducted an experiment confirming the intuition that naive Bayes performs better than noisy-or when the data fits its independence assumptions, and vice versa. However, we still do not have a clear set of criteria for determining under exactly what conditions each classifier excels. Further study of the strengths and weaknesses of each classifier should provide deeper insight into how to improve the current models. One possible extension would be to combine the naive Bayes and noisy-or models so that the network more closely depicts the actual relationship between the attributes.
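
The abstract's equivalence claim between naive Bayes and logistic regression can be illustrated with the standard derivation for a Bernoulli naive Bayes over binary features: taking the log-odds of the class posterior yields a function that is linear in the features, i.e., exactly a logistic regression. The sketch below (illustrative only; the function names and parameterization are ours, not the paper's) maps naive Bayes parameters to logistic-regression weights and checks that the two posteriors coincide.

```python
import math

def nb_posterior(x, prior1, theta1, theta0):
    """P(y=1 | x) under a Bernoulli naive Bayes.

    prior1: P(y=1); theta1[i] = P(x_i=1 | y=1); theta0[i] = P(x_i=1 | y=0).
    """
    def joint_log(prior, theta):
        s = math.log(prior)
        for xi, t in zip(x, theta):
            s += math.log(t) if xi else math.log(1 - t)
        return s
    a = joint_log(prior1, theta1)       # log P(x, y=1)
    b = joint_log(1 - prior1, theta0)   # log P(x, y=0)
    return 1 / (1 + math.exp(b - a))    # sigmoid of the log-odds

def lr_posterior(x, w, b):
    """Logistic-regression form: sigmoid(w . x + b)."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1 / (1 + math.exp(-z))

def nb_to_lr(prior1, theta1, theta0):
    """Transform naive Bayes parameters into equivalent LR weights.

    From log-odds(x) = log(prior1/(1-prior1))
                       + sum_i [ x_i log(theta1_i/theta0_i)
                               + (1-x_i) log((1-theta1_i)/(1-theta0_i)) ].
    """
    w = [math.log(t1 / t0) - math.log((1 - t1) / (1 - t0))
         for t1, t0 in zip(theta1, theta0)]
    b = math.log(prior1 / (1 - prior1)) \
        + sum(math.log((1 - t1) / (1 - t0)) for t1, t0 in zip(theta1, theta0))
    return w, b
```

For any parameter setting, `nb_posterior` and `lr_posterior` with the transformed weights agree on every input, which is the expressiveness equivalence in one direction; the paper's noisy-or transformation follows a similar log-linear argument.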