## Decision templates for multiple classifier fusion: an experimental comparison (2001)


### Download Links

- [www.informatics.bangor.ac.uk]
- [www.ph.tn.tudelft.nl]
- [www.bangor.ac.uk]
- DBLP

### Other Repositories/Bibliography

Venue: Pattern Recognition

Citations: 102 (8 self)

### BibTeX

@ARTICLE{Kuncheva01decisiontemplates,
  author  = {Ludmila I. Kuncheva and James C. Bezdek and Robert P. W. Duin},
  title   = {Decision templates for multiple classifier fusion: an experimental comparison},
  journal = {Pattern Recognition},
  year    = {2001},
  volume  = {34},
  pages   = {299--314}
}



### Abstract

Multiple classifier fusion may yield more accurate classification than any of the constituent classifiers. Fusion is often based on fixed combination rules, such as the product and the average, which can be justified only under strict probabilistic conditions. We present a simple rule for adapting the class combiner to the application: c decision templates (one per class) are estimated from the same training set that is used for the set of classifiers. These templates are then matched to the decision profile of each new incoming object by some similarity measure. We compare 11 versions of our model with 14 other techniques for classifier fusion on the Satimage and Phoneme datasets from the ELENA database. Our results show that decision templates based on integral-type measures of similarity are superior to the other schemes on both data sets.
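The scheme summarized in the abstract can be sketched in a few lines. This is an illustrative sketch, not the authors' code: it assumes each classifier's soft outputs for an object are already collected into an L × c decision profile (L classifiers, c classes), and it uses negative squared Euclidean distance as one possible integral-type similarity measure; function names are invented for the example.

```python
import numpy as np

def fit_decision_templates(profiles, labels, c):
    """One decision template per class: the mean decision profile
    (an L x c matrix of classifier outputs) over the training
    objects carrying that crisp class label."""
    return np.stack([profiles[labels == i].mean(axis=0) for i in range(c)])

def dt_classify(dp, templates):
    """Label a new object by the template its decision profile DP(x)
    matches best; similarity here is negative squared Euclidean
    distance (an illustrative choice of measure)."""
    dists = ((templates - dp) ** 2).sum(axis=(1, 2))
    return int(np.argmin(dists))
```

Training the templates costs only one pass over the training set, which is why the combiner can reuse the same data that trained the base classifiers.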

### Citations

4828 | Neural Networks for Pattern Recognition - Bishop - 1995

Citation Context: ...ent names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); (∗ Research supported by ONR grant N00014-96-1-0642) • composite classifier system...

3921 | Pattern Classification and Scene Analysis - Duda, Hart - 1973

Citation Context: ...r way to look at the fusion problem: we can treat the classifier outputs simply as the input to a second-level classifier, and use classical pattern recognition techniques for the second-level design [33]. The use of traditional feature-based classifiers in this approach is difficult because the class distributions in the intermediate feature space are not well-behaved (there will be many points in th...

995 | On combining classifiers - Kittler, Hatef, et al. - 1998

Citation Context: ...sing to see how well the simple aggregation rules, with no second-level training, compete with the more sophisticated ones. This is probably the reason that simple aggregation continues to be popular [5, 40]. One problem with the simple aggregation rules is that, although they have good overall performance, it is not clear which one is good for a particular data set. The Product and Minimum, for example, gave e...

774 | Adaptive mixtures of local experts - Jacobs, Jordan, et al. - 1991

Citation Context: ...accuracy is an important research topic with different names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); ...

513 | Fuzzy Sets and Systems: Theory and Applications - Dubois, Prade - 1980

Citation Context: ...al set with L · c elements, various fuzzy measures of similarity can be used. Let A and B be fuzzy sets on U = {u1, . . . , un}. In this study we used the following four proper measures of similarity [46]: S1(A, B) = ‖A ∩ B‖ / ‖A ∪ B‖, (6) where ‖ζ‖ is the relative cardinality of the fuzzy set ζ on U, ‖ζ‖ = (1/n) Σ_{i=1}^{n} µζ(ui), (7) and A∇B is the symmetric difference defined by the Hamming distance...
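The first of these measures, S1, is the fuzzy Jaccard index: the relative cardinality of the intersection over that of the union, with pointwise min and max acting as fuzzy intersection and union. A minimal sketch with illustrative names, assuming membership vectors over the same universe U:

```python
def rel_card(mu):
    """Relative cardinality ||A|| = (1/n) * sum_i mu_A(u_i)."""
    return sum(mu) / len(mu)

def s1(mu_a, mu_b):
    """S1(A, B) = ||A intersect B|| / ||A union B||; min and max act
    as fuzzy intersection/union on pointwise membership values."""
    inter = [min(a, b) for a, b in zip(mu_a, mu_b)]
    union = [max(a, b) for a, b in zip(mu_a, mu_b)]
    return rel_card(inter) / rel_card(union)
```

Note that the 1/n factors in the two relative cardinalities cancel, so S1 reduces to a ratio of membership sums.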

368 | Methods of combining multiple classifiers and their applications to handwriting recognition - Xu, Krzyzak, et al. - 1992

Citation Context: ...Class-indifferent fusion. 1 Introduction. Combining classifiers to achieve higher accuracy is an important research topic with different names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([2...

140 | Methods of combining experts’ probability assessments - Jacobs - 1995

Citation Context: ...accuracy is an important research topic with different names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); ...


131 | A method of combining multiple experts for the recognition of unconstrained handwritten numerals - Huang, Suen - 1995

Citation Context: ...Table 3: Classifier fusion techniques. First-level output↓ / Training at fusion level: No, Yes. Crisp: Majority [35]; Behavior-Knowledge Space [36]; “Naive” Bayes [4]. Soft: Min, Max, OWA [37], Average, Product [38, 5]; Probabilistic product [39, 40]; Fuzzy integral [6, 7, 9]; Trained linear combinations [41, 42, 43]; LDC, QDC, Fisher, Logistic classifier; Neural networks [44, 13]; Dempster-S...

128 | Combination of multiple classifiers using local accuracy estimates. IEEE Trans. on Pattern Analysis and Machine Intelligence - Woods - 1997

Citation Context: ...Class-indifferent fusion. 1 Introduction. Combining classifiers to achieve higher accuracy is an important research topic with different names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([2...

123 | Information combination operators for data fusion: A comparative review with classification - Bloch - 1996

Citation Context: ...n Combining classifiers to achieve higher accuracy is an important research topic with different names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]...

121 | Algorithms for optimal linear combinations of neural networks - Hashem - 1997

Citation Context: ...Majority [35]; Min, Max, OWA [37], Average, Product [38, 5]; Behavior-Knowledge Space [36]; “Naive” Bayes [4]; Probabilistic product [39, 40]; Fuzzy integral [6, 7, 9]; Trained linear combinations [41, 42, 43]; Soft CI2: LDC, QDC, Fisher, Logistic classifier; Neural networks [44, 13]; Dempster-Shafer [32, 2, 4]; Decision Templates ... be performed right away. Others, like the fuzzy integral and the probabilistic ...

109 | Application of majority voting to pattern recognition: An analysis of its behavior and performance - Lam, Suen - 1997

Citation Context: ...decision profile for class 3 (class → 1 2 3 4): D1(x) = (0, 0, 1, 0); D2(x) = (0, 0, 1, 0); D3(x) = (0, 0, 1, 0). Table 3: Classifier fusion techniques. First-level output↓ / Training at fusion level: No, Yes. Crisp: Majority [35]; Min, Max, OWA [37], Average, Product [38, 5]; Behavior-Knowledge Space [36]; “Naive” Bayes [4]; Probabilistic product [39, 40]; Fuzzy integral [6, 7, 9]; Trained linear combinations [41, 42, 4...

105 | Combining the results of several neural network classifiers - Rogova - 1994

Citation Context: ...e” Bayes [4]; Probabilistic product [39, 40]; Fuzzy integral [6, 7, 9]; Trained linear combinations [41, 42, 43]; Soft CI2: LDC, QDC, Fisher, Logistic classifier; Neural networks [44, 13]; Dempster-Shafer [32, 2, 4]; Decision Templates ... be performed right away. Others, like the fuzzy integral and the probabilistic product, train a small number of parameters. Table 3 gives our grouping of classifier fusion methods...

105 | Fuzzy Models and Algorithms for Pattern Recognition - Bezdek, Keller, et al. - 1999

Citation Context: ...estimates of) the posterior probabilities for the classes, given x, i.e., µ_D^i(x) = P(i|x). Alternatively, µ_D^i(x) can be viewed as typicalness, belief, certainty, possibility, etc. Bezdek et al. [34] define three types of classifiers: 1. Crisp classifier: µ_D^i(x) ∈ {0, 1}, Σ_{i=1}^{c} µ_D^i(x) = 1, ∀x ∈ ℝⁿ; 2. Fuzzy classifier: µ_D^i(x) ∈ [0, 1], Σ_{i=1}^{c} µ_D^i(x) = 1, ∀x ∈ ℝⁿ; (Probabilistic int...

96 | Convergence results for the EM approach to mixture of experts architectures - Jordan, Xu - 1995

Citation Context: ...accuracy is an important research topic with different names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); ...

80 | Democracy in neural nets: Voting schemes for classification - Battiti, Colla - 1994

Citation Context: ...5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); • composite classifier system ([21]); • classifier ensembles ([16, 22]); • divide-and-conquer classifiers ...

75 | Boosting and other ensemble methods - Drucker, Cortes, et al. - 1994

Citation Context: ...ent names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); • composite classifier system...

61 | Combining estimators using non-constant weighting function - Tresp, Taniguchi - 1995

Citation Context: ...Majority [35]; Min, Max, OWA [37], Average, Product [38, 5]; Behavior-Knowledge Space [36]; “Naive” Bayes [4]; Probabilistic product [39, 40]; Fuzzy integral [6, 7, 9]; Trained linear combinations [41, 42, 43]; Soft CI2: LDC, QDC, Fisher, Logistic classifier; Neural networks [44, 13]; Dempster-Shafer [32, 2, 4]; Decision Templates ... be performed right away. Others, like the fuzzy integral and the probabilistic ...

57 | Consensus theoretic classification methods - Benediktsson, Swain - 1992

Citation Context: ...nation of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); • composite classifier system ([21]); • classifier ensembles ([16, 22...

55 | Combining multiple neural networks by fuzzy integral for robust classification - Cho, Kim - 1995

Citation Context: ...n Combining classifiers to achieve higher accuracy is an important research topic with different names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]...

53 | Optimal combination of pattern classifiers - Lam, Suen - 1995

Citation Context: ...Class-indifferent fusion. 1 Introduction. Combining classifiers to achieve higher accuracy is an important research topic with different names in the literature: • combination of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([2...

42 | Expected classification error of the Fisher linear classifier with pseudo-inverse covariance matrix - Raudys, Duin - 1998

Citation Context: ...ell-sampled two-class problems. For undersampled datasets LDC (and even more severely QDC) suffer from unstable covariance matrix estimates. Our Fisher implementation uses the pseudo-inverse approach [49] in those situations. 5 Experiments. We used two data sets from the ELENA database (anonymous ftp at ftp.dice.ucl.ac.be, directory pub/neural-nets/ELENA/databases). Results with the same data using cla...

32 | Evaluation of adaptive mixtures of competing experts - Nowlan, Hinton - 1991


31 | Multiple network fusion using fuzzy logic - Cho, Kim - 1995

Citation Context: ...general interpretation of classifier outputs as the support for the classes is the basis of fuzzy aggregation methods, examples of which are simple connectives between fuzzy sets, the fuzzy integral [6, 29, 7, 30, 9, 31], and Dempster-Shafer fusion [32, 2, 4]. There is another way to look at the fusion problem: we can treat the classifier outputs simply as the input to a second-level classifier, and use classical pat...

28 | Fusion of handwritten word classifiers - Gader, Mohamed, et al. - 1996

Citation Context: ...ments they outperformed the Average, which is viewed as the favorite in this group [5]. Fuzzy integral. In our experiments the fuzzy integral using a λ-fuzzy measure rates in the middle. Gader et al. [7] report the results from a handwritten word recognition problem, where the fuzzy integral dramatically outperforms various neural networks. The authors attribute this to the efficient way in which the...

28 | Local linear perceptrons for classification - Alpaydın, Jordan - 1996

Citation Context: ...for the vicinity of x is given the highest credit when assigning the class label to x. We can nominate exactly one classifier to make the decision, as in [26], or more than one “local expert”, as in [27, 11]. Classifier fusion assumes that all classifiers are trained over the whole feature space, and are thereby considered as competitive rather than complementary [18, 4]. Multiple classifier outputs are ...

25 | Logistic discrimination - Anderson - 1972

Citation Context: ...risp class label. In the CI2 category we also use some well-known classifiers: linear and quadratic discriminant classifiers (LDC and QDC, assuming normal densities [33]), the logistic classifier (LOG) [48], and Fisher’s discriminant (FSH) [33]. LDC and Fisher are identical for well-sampled two-class problems. For undersampled datasets LDC (and even more severely QDC) suffer from unstable covariance mat...

22 | Consensus diagnosis: A simulation study - Ng, Abramson - 1992

Citation Context: ...nation of multiple classifiers ([1, 2, 3, 4, 5]); • classifier fusion ([6, 7, 8, 9, 10]); • mixture of experts ([11, 12, 13, 14]); • committees of neural networks ([15, 16]); • consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); • composite classifier system ([21]); • classifier ensembles ([16, 22...

21 | A composite classifier system design: Concepts and methodology - Dasarathy, Sheela

Citation Context: ...• consensus aggregation ([17, 18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); • composite classifier system ([21]); • classifier ensembles ([16, 22]); • divide-and-conquer classifiers [23]; • pandemonium system of reflective agents [24]; • change-glasses approach to classifier selection [25], etc. The paradigms ...

20 | A multiplicative formula for aggregating probability assessments - Bordley

Citation Context: ..., 0.0032)^T; if hardened, minimum, maximum, and product will label x in class 3, whereas the average will put x in class 2. CC2: Probabilistic product [39, 40] is an aggregation formula (derived in [47]) which gives the Bayes decision if the classifiers use mutually independent subsets of features and yield the true posterior probability, d_{i,j}(x) = P(i|x_j), on their respective feature subspaces, µ ...


17 | Comparison between product and mean classifier combination rules - Tax, Duin, et al. - 1997

Citation Context: ...First-level output↓ / Training at fusion level: No, Yes. Crisp: Majority [35]; Min, Max, OWA [37], Average, Product [38, 5]; Behavior-Knowledge Space [36]; “Naive” Bayes [4]; Probabilistic product [39, 40]; Fuzzy integral [6, 7, 9]; Trained linear combinations [41, 42, 43]; Soft CI2: LDC, QDC, Fisher, Logistic classifier; Neural networks [44, 13]; Dempster-Shafer [32, 2, 4]; Decision Templates ... be performed...

16 | Knowledge Integrations in a Multiple Classifier System - Lu - 1996

Citation Context: ...he support for the classes is the basis of fuzzy aggregation methods, examples of which are simple connectives between fuzzy sets, the fuzzy integral [6, 29, 7, 30, 9, 31], and Dempster-Shafer fusion [32, 2, 4]. There is another way to look at the fusion problem: we can treat the classifier outputs simply as the input to a second-level classifier, and use classical pattern recognition techniques for the sec...


13 | The Pandemonium System of Reflective Agents - Smieja

Citation Context: ...• composite classifier system ([21]); • classifier ensembles ([16, 22]); • divide-and-conquer classifiers [23]; • pandemonium system of reflective agents [24]; • change-glasses approach to classifier selection [25], etc. The paradigms of these models differ on the: assumptions about classifier dependencies; type of classifier outputs; aggregation strategy ...

12 | The combination of multiple classifiers by a neural network approach - Huang, Liu, et al. - 1995

Citation Context: ...Knowledge Space [36]; “Naive” Bayes [4]; Probabilistic product [39, 40]; Fuzzy integral [6, 7, 9]; Trained linear combinations [41, 42, 43]; Soft CI2: LDC, QDC, Fisher, Logistic classifier; Neural networks [44, 13]; Dempster-Shafer [32, 2, 4]; Decision Templates ... be performed right away. Others, like the fuzzy integral and the probabilistic product, train a small number of parameters. Table 3 gives our grouping ...

12 | Advances in fuzzy integration for pattern recognition - Keller, Gader, et al. - 1994 |

11 | Method of collective recognition. Moscow: Energoizdat (in Russian) - Rastrigin, Erenstein - 1982

Citation Context: ...or classification, the classifier responsible for the vicinity of x is given the highest credit when assigning the class label to x. We can nominate exactly one classifier to make the decision, as in [26], or more than one “local expert”, as in [27, 11]. Classifier fusion assumes that all classifiers are trained over the whole feature space, and are thereby considered as competitive rather than comple...

11 | An application of OWA operators to the aggregation of multiple classification decisions - Kuncheva - 1997

Citation Context: ...(class → 1 2 3 4): D1(x) = (0, 0, 1, 0); D2(x) = (0, 0, 1, 0); D3(x) = (0, 0, 1, 0). Table 3: Classifier fusion techniques. First-level output↓ / Training at fusion level: No, Yes. Crisp: Majority [35]; Min, Max, OWA [37], Average, Product [38, 5]; Behavior-Knowledge Space [36]; “Naive” Bayes [4]; Probabilistic product [39, 40]; Fuzzy integral [6, 7, 9]; Trained linear combinations [41, 42, 43]; Soft CI2: LDC, QDC, Fisher...

9 | Multi-layer perceptron ensembles for increased performance and fault-tolerance in pattern recognition tasks - Filippi, Costa, et al. - 1994

Citation Context: ...18, 19]); • voting pool of classifiers ([20]); • dynamic classifier selection ([3]); • composite classifier system ([21]); • classifier ensembles ([16, 22]); • divide-and-conquer classifiers [23]; • pandemonium system of reflective agents [24]; • change-glasses approach to classifier selection [25], etc. The paradigms of these models differ on the: assu...

9 | Use of fuzzy-logic-inspired features to improve bacterial recognition through classifier fusion - Wang, Keller, et al. - 1998

Citation Context: ...general interpretation of classifier outputs as the support for the classes is the basis of fuzzy aggregation methods, examples of which are simple connectives between fuzzy sets, the fuzzy integral [6, 29, 7, 30, 9, 31], and Dempster-Shafer fusion [32, 2, 4]. There is another way to look at the fusion problem: we can treat the classifier outputs simply as the input to a second-level classifier, and use classical pat...

8 | A Divide-and-Conquer methodology for modular supervised neural network design - Chiang, Fu - 1994

Citation Context: ...• dynamic classifier selection ([3]); • composite classifier system ([21]); • classifier ensembles ([16, 22]); • divide-and-conquer classifiers [23]; • pandemonium system of reflective agents [24]; • change-glasses approach to classifier selection [25], etc. The paradigms of these models differ on the: assumptions about classifier dependencies; t...

8 | Change-glasses approach in pattern recognition - Kuncheva - 1993

Citation Context: ...e classifier system ([21]); • classifier ensembles ([16, 22]); • divide-and-conquer classifiers [23]; • pandemonium system of reflective agents [24]; • change-glasses approach to classifier selection [25], etc. The paradigms of these models differ on the: assumptions about classifier dependencies; type of classifier outputs; aggregation strategy (global or local); aggregation procedure (a function, a ...

8 | Aggregation of multiple classification decisions by fuzzy templates - Kuncheva, Kounchev, et al. - 1995

Citation Context: ...dt_i(k, s)(Z) = [Σ_{j=1}^{N} Ind(z_j, i) d_{k,s}(z_j)] / [Σ_{j=1}^{N} Ind(z_j, i)], k = 1, . . . , L, s = 1, . . . , c, (4) where Ind(z_j, i) is an indicator function with value 1 if z_j has crisp label i, and 0 otherwise [45]. To simplify the notation, DTi(Z) will be denoted by DTi. The decision template DTi for class i is the average of the decision profiles of the elements of the training set Z labeled in class i. When x...



7 | Combining classifiers for the recognition of handwritten digits - van, Duin, et al. - 1997

Citation Context: ...(class → 1 2 3 4): D1(x) = (0, 0, 1, 0); D2(x) = (0, 0, 1, 0); D3(x) = (0, 0, 1, 0). Table 3: Classifier fusion techniques. First-level output↓ / Training at fusion level: No, Yes. Crisp: Majority [35]; Min, Max, OWA [37], Average, Product [38, 5]; Behavior-Knowledge Space [36]; “Naive” Bayes [4]; Probabilistic product [39, 40]; Fuzzy integral [6, 7, 9]; Trained linear combinations [41, 42, 43]; Soft CI2: LDC, QDC, Fisher, Logistic classifier; Neural...
