## Lazy Learning of Bayesian Rules (2000)


### Download Links

- [www.csse.monash.edu]
- [sci2s.ugr.es]
- [www.cm.deakin.edu.au]
- [www3.cm.deakin.edu.au]
- DBLP

### Other Repositories/Bibliography

Venue: Machine Learning

Citations: 39 (8 self)

### BibTeX

This paper appeared in the journal Machine Learning, so the entry type should be `@ARTICLE` with a `journal` field rather than `@INPROCEEDINGS` with a `booktitle`:

```bibtex
@ARTICLE{Zheng00lazylearning,
  author    = {Zijian Zheng and Geoffrey I. Webb},
  title     = {Lazy Learning of Bayesian Rules},
  journal   = {Machine Learning},
  volume    = {41},
  year      = {2000},
  pages     = {53--84},
  publisher = {Kluwer Academic Publishers}
}
```


### Abstract

The naive Bayesian classifier provides a simple and effective approach to classifier learning, but its attribute independence assumption is often violated in the real world. A number of approaches have sought to alleviate this problem. A Bayesian tree learning algorithm builds a decision tree, and generates a local naive Bayesian classifier at each leaf. The tests leading to a leaf can alleviate attribute inter-dependencies for the local naive Bayesian classifier. However, Bayesian tree learning still suffers from the small disjunct problem of tree learning. While inferred Bayesian trees demonstrate low average prediction error rates, there is reason to believe that error rates will be higher for those leaves with few training examples. This paper proposes the application of lazy learning techniques to Bayesian tree induction and presents the resulting lazy Bayesian rule learning algorithm, called Lbr. This algorithm can be justified by a variant of Bayes' theorem which supports a weaker conditional attribute independence assumption than is required by naive Bayes. For each test example, it builds a most appropriate rule with a local naive Bayesian classifier as its consequent. It is demonstrated that the computational requirements of Lbr are reasonable in a wide cross-section of natural domains. Experiments with these domains show that, on average, this new algorithm obtains lower error rates significantly more often than the reverse in comparison to a naive Bayesian classifier, C4.5, a Bayesian tree learning algorithm, a constructive Bayesian classifier that eliminates attributes and constructs new attributes using Cartesian products of existing nominal attributes, and a lazy decision tree learning algorithm. It also outperforms, although the result is not statistically...
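The abstract outlines the Lbr procedure: for each test example, greedily move attributes whose test value it matches into a rule antecedent (filtering the local training set), and classify with a naive Bayesian classifier over the remaining attributes. A minimal Python sketch of this idea, assuming nominal attributes stored as dicts; note it substitutes a simple leave-one-out error-rate comparison for the statistical improvement test the paper actually uses, so it is an illustration of the lazy-rule structure rather than the published algorithm:

```python
import math

def nb_classify(train, test_x, attrs):
    """Naive Bayes over nominal attributes with Laplace smoothing.
    `train` is a list of (x, y) pairs, each x a dict of attribute values."""
    labels = [y for _, y in train]
    classes = sorted(set(labels))
    n = len(train)
    best, best_score = None, -math.inf
    for c in classes:
        nc = labels.count(c)
        score = math.log((nc + 1) / (n + len(classes)))  # smoothed prior
        for a in attrs:
            vals = {x[a] for x, _ in train}
            count = sum(1 for x, y in train if y == c and x[a] == test_x[a])
            score += math.log((count + 1) / (nc + len(vals)))
        if score > best_score:
            best, best_score = c, score
    return best

def loo_errors(train, attrs):
    """Leave-one-out error count of naive Bayes on `train`."""
    return sum(nb_classify(train[:i] + train[i + 1:], x, attrs) != y
               for i, (x, y) in enumerate(train))

def lbr_classify(train, test_x):
    """Sketch of lazy Bayesian rule classification for one test example:
    grow the rule antecedent one attribute at a time while the local
    leave-one-out error rate strictly improves (a simplification of the
    paper's significance-based acceptance criterion)."""
    attrs = sorted(test_x)
    local = list(train)
    best_rate = loo_errors(local, attrs) / len(local)
    improved = True
    while improved and len(attrs) > 1:
        improved = False
        for a in list(attrs):
            # candidate: add a == test_x[a] to the antecedent
            subset = [(x, y) for x, y in local if x[a] == test_x[a]]
            if len(subset) < 2:
                continue
            rest = [b for b in attrs if b != a]
            rate = loo_errors(subset, rest) / len(subset)
            if rate < best_rate:
                local, attrs, best_rate = subset, rest, rate
                improved = True
                break
    # consequent: local naive Bayes over the remaining attributes
    return nb_classify(local, test_x, attrs)
```

Because the rule is built per test example, nothing is learned at training time; all work is deferred to classification, which is what makes the approach lazy.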