## Theoretical Foundations of Linear and Order Statistics Combiners for Neural Pattern Classifiers (1996)

Venue: IEEE Transactions on Neural Networks

Citations: 30 (5 self)

### BibTeX

@ARTICLE{Tumer96theoreticalfoundations,
  author  = {Kagan Tumer and Joydeep Ghosh},
  title   = {Theoretical Foundations of Linear and Order Statistics Combiners for Neural Pattern Classifiers},
  journal = {IEEE Transactions on Neural Networks},
  year    = {1996}
}

### Abstract

Several researchers have experimentally shown that substantial improvements can be obtained in difficult pattern recognition problems by combining or integrating the outputs of multiple classifiers. This paper provides an analytical framework to quantify the improvement in classification results due to combining. The results apply both to linear combiners and to the order statistics combiners introduced in this paper. We show that combining networks in output space reduces the variance of the actual decision boundaries around the optimum boundary. For linear combiners, we show that in the absence of classifier bias, the added classification error is proportional to the boundary variance. For non-linear combiners, we show analytically that selecting the median, the maximum, and in general the ith order statistic improves classifier performance. The analysis presented here facilitates the understanding of the relationships among error rates, classifier boundary distributions...
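The abstract's central claim — that averaging or order-statistics combining of classifier outputs reduces the variance of the estimate around its true value — can be illustrated with a small simulation. The sketch below is not the paper's derivation; it is a hedged toy setup in which each "classifier" returns the true posterior plus zero-mean Gaussian noise (all constants such as `TRUE_POSTERIOR` and the noise scale are assumptions for illustration), and compares the empirical error variance of a single classifier, the mean combiner, and the median combiner.

```python
import random
import statistics

random.seed(0)
TRUE_POSTERIOR = 0.7    # assumed true class posterior at some fixed input x
N_CLASSIFIERS = 15      # hypothetical ensemble size
N_TRIALS = 2000         # Monte Carlo repetitions

def noisy_output():
    """One classifier's posterior estimate: truth plus zero-mean noise."""
    return TRUE_POSTERIOR + random.gauss(0.0, 0.1)

def combine_mean(outputs):
    """Linear combiner: simple average of the N classifier outputs."""
    return sum(outputs) / len(outputs)

def combine_order_stat(outputs, i):
    """Order-statistics combiner: the i-th smallest output (1-indexed).
    i = (N + 1) // 2 gives the median, i = N the maximum."""
    return sorted(outputs)[i - 1]

def error_variance(combiner):
    """Empirical variance of (combined output - true posterior)."""
    errs = []
    for _ in range(N_TRIALS):
        outs = [noisy_output() for _ in range(N_CLASSIFIERS)]
        errs.append(combiner(outs) - TRUE_POSTERIOR)
    return statistics.pvariance(errs)

single = error_variance(lambda outs: outs[0])  # no combining
mean = error_variance(combine_mean)            # linear combiner
median = error_variance(
    lambda o: combine_order_stat(o, (N_CLASSIFIERS + 1) // 2)
)

# For unbiased, independent classifiers, averaging shrinks the error
# variance by roughly 1/N; the median also reduces it, by a smaller factor.
print(f"single: {single:.5f}  mean: {mean:.5f}  median: {median:.5f}")
```

Note that this toy model only captures the unbiased, independent-noise case; the paper's analysis also covers biased classifiers and explains why the maximum (a biased order statistic) can still improve classification performance.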