## Discretization for naive-Bayes learning: managing discretization bias and variance (2003)

### Download Links

- [www.csse.monash.edu]
- [www.csse.monash.edu.au]
- DBLP

### Other Repositories/Bibliography

Citations: 17 (5 self)

### BibTeX

@MISC{Yang03discretizationfor,
  author = {Ying Yang and Geoffrey I. Webb},
  title = {Discretization for naive-Bayes learning: managing discretization bias and variance},
  year = {2003}
}

### Abstract

Quantitative attributes are usually discretized in naive-Bayes learning. We prove a theorem that explains why discretization can be effective for naive-Bayes learning. The use of different discretization techniques can be expected to affect the classification bias and variance of generated naive-Bayes classifiers, effects we name discretization bias and variance. We argue that by properly managing discretization bias and variance, we can effectively reduce naive-Bayes classification error. In particular, we propose proportional k-interval discretization and equal size discretization, two efficient heuristic discretization methods that are able to effectively manage discretization bias and variance by tuning discretized interval size and interval number. We empirically evaluate our new techniques against five key discretization methods for naive-Bayes classifiers. The experimental results support our theoretical arguments by showing that naive-Bayes classifiers trained on data discretized by our new methods are able to achieve lower classification error than those trained on data discretized by alternative discretization methods.
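The bias-variance trade-off described above can be illustrated with a minimal sketch of proportional discretization, assuming (as the abstract suggests) that both the interval size and the interval number are tied to the training data size — here both are set to roughly the square root of the number of training instances. The function names and the exact cut-point rule are illustrative, not taken from the paper:

```python
import math

def proportional_discretize(values):
    """Sketch of proportional k-interval discretization.

    Assumption: desired interval size and interval number are both
    set to about sqrt(n), so as training data grows, more intervals
    (lower discretization bias) each contain more instances (lower
    discretization variance).
    """
    n = len(values)
    t = max(1, int(math.sqrt(n)))   # desired number of intervals
    s = max(1, n // t)              # resulting interval size
    ordered = sorted(values)
    # Place a cut point after every group of ~s consecutive values.
    cuts = [ordered[i * s - 1] for i in range(1, t) if i * s - 1 < n - 1]
    return cuts

def assign_interval(x, cuts):
    """Map a quantitative value to the index of its interval."""
    for i, c in enumerate(cuts):
        if x <= c:
            return i
    return len(cuts)
```

For 16 training values this yields 4 intervals of 4 values each; with 100 values it would yield 10 intervals of 10, so both knobs grow together rather than fixing one and letting the other absorb all the data.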