Citations
6460 | Neural networks for pattern recognition - Bishop - 1995 |
3550 | Bagging predictors - Breiman - 1996 |
Citation Context: ...ng, error-correcting output codes, bagging, boosting, mixtures of experts, stacked generalization and cascading. The taxonomy in Jain et al. (2000) is repeated in Table 1 (page 7). Bagging (Breiman, 1996), a name derived from bootstrap aggregation, was the first effective method of ensemble learning and is one of the simplest methods of arcing. The meta-algorithm, which is a special case of model... |
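As a rough illustration of the bootstrap-aggregation idea described in the context above, the following Python sketch trains each base learner on a resampled copy of the training set and combines them by majority vote. The use of scikit-learn decision trees, the function names, and the assumption of integer class labels are choices made for this example, not part of Breiman's paper.

# Illustrative sketch of bagging (bootstrap aggregation); not Breiman's original code.
# Assumes numpy arrays X, y with integer class labels; any unstable base learner could be substituted.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_models=25, seed=0):
    """Train n_models base learners, each on a bootstrap sample of (X, y)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)              # sample n rows with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    """Combine the base learners by majority vote (classification)."""
    votes = np.stack([m.predict(X) for m in models]).astype(int)   # (n_models, n_samples)
    return np.array([np.bincount(col).argmax() for col in votes.T])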
3401 | A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting - Freund, Schapire - 1997 |
Citation Context: ...ation in their mistakes reaches a combined performance that is significantly higher than the best obtainable from the individual nets. In 1995, Yoav Freund and Robert E. Schapire introduced AdaBoost (Freund and Schapire, 1997) (covered in Section 5.1, page 8). Cho and Kim (1995) combined the results from multiple neural networks using fuzzy logic, which resulted in more accurate classification. Freund (1995) developed a m... |
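To make the AdaBoost reference concrete, here is a minimal sketch of the reweighting loop behind discrete AdaBoost for labels in {-1, +1}. The choice of scikit-learn decision stumps as the weak learner and the helper names are assumptions for illustration; this is the basic idea, not Freund and Schapire's formulation in full.

# Minimal sketch of discrete AdaBoost for labels in {-1, +1}; illustrative only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    n = len(X)
    w = np.full(n, 1.0 / n)                     # start with uniform example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err >= 0.5:                          # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)          # up-weight examples this stump got wrong
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(learners, alphas, X):
    scores = sum(a * m.predict(X) for a, m in zip(alphas, learners))
    return np.sign(scores)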
2165 | Experiments with a new boosting algorithm - Freund, Schapire - 1996 |
1384 | On combining classifiers - Kittler, Hatef, et al. - 1998 |
998 | Statistical pattern recognition: a review - Jain, Duin, et al. - 2000 |
Citation Context: ...[flattened rows of Table 1: Ensemble methods (Jain et al., 2000)] ...models are weighted according to their success and then the outputs are combined using voting (for classification) or averaging (for regression), thus... |
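The combination step mentioned in this context, weighting models by their success and then voting (classification) or averaging (regression), can be sketched as follows. The array shapes and the idea of deriving the weights from, say, validation accuracy are assumptions made for the example.

# Illustrative sketch of weighted combination of model outputs.
# predictions: (n_models, n_samples); weights: (n_models,), e.g. validation accuracies.
import numpy as np

def weighted_vote(predictions, weights):
    """Weighted majority vote over integer class labels."""
    classes = np.unique(predictions)
    # accumulate each model's weight on the class it predicts, per sample
    tallies = np.array([
        np.where(predictions == c, weights[:, None], 0.0).sum(axis=0) for c in classes
    ])
    return classes[np.argmax(tallies, axis=0)]

def weighted_average(predictions, weights):
    """Weighted average of real-valued outputs (regression)."""
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * predictions).sum(axis=0) / w.sum()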
884 | Boosting the margin: A new explanation for the effectiveness of voting methods. The Annals of Statistics - Schapire, Freund, et al. - 1998 |
873 | Hierarchical mixtures of experts and the EM algorithm - Jordan, Jacobs - 1994 |
850 | The strength of weak learnability - Schapire - 1990 |
Citation Context: ...to create a single output. Bagging is only effective when using unstable (i.e. a small change in the training set can cause a significant change in the model) non-linear models. Boosting (Schapire, 1990) is a meta-algorithm which can be viewed as a model averaging method. It is the most widely used ensemble method and one of the most powerful learning ideas introduced in the last twenty years. Origi... |
784 | A short introduction to boosting - Freund, Schapire - 1999 |
710 | Stacked generalization - Wolpert - 1992 |
Citation Context: ...It uses the same training set over and over again (thus it need not be large) and can also combine an arbitrary number of base learners. Stacked generalization (or stacking) (Wolpert, 1992) is a distinct way of combining multiple models that introduces the concept of a meta-learner. Although an attractive idea, it is less widely used than bagging and boosting. Unlike bagging and boost... |
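A compact sketch of stacked generalization, under the assumption that scikit-learn models are available: out-of-fold predictions from the base learners become the training features of a meta-learner. The particular choices of k-fold splitting, decision trees as base learners, and logistic regression as the meta-learner are illustrative, not prescribed by Wolpert (1992).

# Illustrative sketch of stacked generalization (stacking).
import numpy as np
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

def stacking_fit(X, y, base_factories, n_splits=5):
    # build out-of-fold predictions so the meta-learner never sees base-learner training data
    meta_features = np.zeros((len(X), len(base_factories)))
    for j, make_model in enumerate(base_factories):
        kf = KFold(n_splits=n_splits, shuffle=True, random_state=0)
        for train_idx, hold_idx in kf.split(X):
            m = make_model().fit(X[train_idx], y[train_idx])
            meta_features[hold_idx, j] = m.predict(X[hold_idx])
    # refit each base learner on all the data for use at prediction time
    base_models = [make_model().fit(X, y) for make_model in base_factories]
    meta_model = LogisticRegression().fit(meta_features, y)
    return base_models, meta_model

def stacking_predict(base_models, meta_model, X):
    meta_features = np.column_stack([m.predict(X) for m in base_models])
    return meta_model.predict(meta_features)

# Example usage with hypothetical data X, y:
# base = [lambda: DecisionTreeClassifier(max_depth=1), lambda: DecisionTreeClassifier(max_depth=3)]
# models, meta = stacking_fit(X, y, base)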
667 | Neural network ensembles - Hansen, Salamon - 1990 |
600 | An experimental comparison of three methods for constructing ensembles of decision trees - Dietterich |
570 | The random subspace method for constructing decision forests - Ho - 1998 |
Citation Context: ...ochastic discrimination (SD). The method basically takes poor solutions as an input and creates good solutions. Stochastic discrimination looks promising, and later led to the random subspace method (Ho, 1998). Hansen and Salamon (1990) showed the benefits of invoking ensembles of similar neural networks. Wolpert (1992) introduced stacked generalization, a scheme for minimizing the generalization error ra... |
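The random subspace method referenced here can be sketched in a few lines: each base learner is trained on a random subset of the features rather than of the training examples. The feature fraction, the scikit-learn trees, and the function names below are assumptions for the example rather than a reproduction of Ho's implementation.

# Illustrative sketch of the random subspace method.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def random_subspace_fit(X, y, n_models=25, subspace_frac=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    k = max(1, int(subspace_frac * n_features))
    models = []
    for _ in range(n_models):
        feats = rng.choice(n_features, size=k, replace=False)   # random feature subset
        models.append((feats, DecisionTreeClassifier().fit(X[:, feats], y)))
    return models

def random_subspace_predict(models, X):
    votes = np.stack([m.predict(X[:, feats]) for feats, m in models]).astype(int)
    return np.array([np.bincount(col).argmax() for col in votes.T])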
556 | Reducing multiclass to binary: A unifying approach for margin classifiers - Allwein, Schapire, et al. - 2000 |
508 | Boosting a Weak Learning Algorithm by Majority - Freund - 1995 |
471 | Methods of combining multiple classifiers and their applications to handwriting recognition - Xu, Krzyzak, et al. - 1992 |
468 | Neural network ensembles, cross validation, and active learning - Krogh, Vedelsby - 1995 |
370 | Decision Combination in Multiple Classifier Systems - Ho, Hull, et al. - 1994 |
347 | When networks disagree: Ensemble methods for hybrid neural networks - Perrone, Cooper - 1993 |
310 | Bayesian Model Averaging for Linear Regression Models - Raftery, Madigan, et al. |
283 | Popular ensemble methods: An empirical study - Opitz, Maclin - 1999 |
276 | Bayesian model averaging: A tutorial - Hoeting, Madigan, et al. - 1999 |
231 | Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy - Kuncheva, Whitaker |
171 | Combination of Multiple Classifiers using Local Accuracy Estimates - Woods, Bowyer, et al. - 1997 |
161 | Combining classifiers in text categorization - Larkey, Croft - 1996 |
127 | A theoretical study on six classifier fusion strategies - Kuncheva |
114 | Evolutionary Ensembles with Negative Correlation Learning - Liu, Yao, et al. |
108 | Combining multiple classifiers by averaging or by multiplying - Tax, Breukelen, et al. |
99 | Introduction to Machine Learning - Alpaydin - 2004 |
98 | Democracy in Neural Nets: Voting Schemes for Classification - Battiti, Colla - 1994 |
95 | Boosting algorithms: Regularization, prediction and model fitting (with discussion) - Bühlmann, Hothorn - 2007 |
91 | Combining classifiers: A theoretical framework - Kittler - 1998 |
88 | Analysis of decision boundaries in linearly combined neural classifiers - Tumer, Ghosh - 1996 |
82 | A mixture model for clustering ensembles - Topchy, Jain, et al. |
81 | Combining multiple weak clusterings - Topchy, Jain, et al. - 2003 |
79 | Ensemble learning via negative correlation - Liu, Yao - 1999 |
66 | Learning with Ensembles: How Overfitting can be Useful - Sollich, Krogh - 1996 |
65 | Sum versus Vote Fusion in Multiple Classifier Systems - Kittler, Alkoot |
65 | Optimal combinations of pattern classifiers - Lam, Suen - 1995 |
63 | Model selection and model averaging - Claeskens, Hjort - 2008 |
61 | Ensemble learning - Dietterich - 2002 |
61 | Relationships between combination methods and measures of diversity in combining classifiers - Shipp, Kuncheva |
56 | Ensembles of learning machines - Valentini, Masulli - 2002 |
55 | A theoretical and experimental analysis of linear combiners for multiple classifier systems - Fumera, Roli - 2005 |
55 | Is independence good for combining classifiers - Kuncheva, Whitaker, et al. |
55 | How boosting the margin can also boost classifier complexity - Reyzin, Schapire - 2006 |
52 | Switching between selection and fusion in combining classifiers: an experiment - Kuncheva - 2002 |
49 | Least squares model averaging - Hansen - 2007 |
48 | Learning ensembles from bites: A scalable and accurate approach - Chawla, Hall, et al. - 2004 |
47 | Multiple network fusion using fuzzy logic, Neural Networks - Cho, Kim - 1995 |
46 | Designing classifier fusion systems by genetic algorithms - Kuncheva, Jain - 2000 |
46 | Multi-label classification using ensembles of pruned sets - Read, Pfahringer, et al. - 2008 |
45 | Cooperative coevolution of artificial neural network ensembles for pattern classification - García-Pedrajas, Hervás-Martínez, et al. - 2005 |
45 | Bagging, boosting and the random subspace method for linear classifiers - Skurichina, Duin - 2002 |
43 | Classifier selection for majority voting - Ruta, Gabrys - 2005 |
42 | Stochastic Discrimination - Kleinberg - 1990 |
41 | Limits on the majority vote accuracy in classifier fusion - Kuncheva, Whitaker, et al. - 2003 |
37 | Evidence contrary to the statistical view of boosting - Mease, Wyner |
37 | Creating diversity in ensembles using artificial data - Melville, Mooney - 2005 |
36 | Bias-variance analysis of support vector machines for the development of SVM-based ensemble methods - Valentini, Dietterich - 2004 |
35 | Moderate diversity for better cluster ensembles - Hadjitodorov, Kuncheva, et al. - 2006 |
33 | On the equivalence of weak learnability and linear separability: New relaxations and efficient boosting algorithms - Shalev-Shwartz, Singer - 2008 |
26 | Evaluation of stability of k-means cluster ensembles with respect to random initialization - Kuncheva, Vetrov |
26 | Machine Learning: An Algorithmic Perspective - Marsland - 2009 |
25 | Leave one out error, stability, and generalization of voting combinations of classifiers - Evgeniou, Pontil, et al. - 2004 |
23 | Weighted cluster ensembles: Methods and analysis - Domeniconi, Al-Razgan |
23 | On the algorithmic implementation of stochastic discrimination - Kleinberg |
22 | Classifier ensembles with a random linear oracle - Kuncheva, Rodriguez - 2007 |
21 | A comparison of two model averaging techniques with an application to growth empirics - Magnus, Powell, et al. - 2010 |
20 | Diversity in multiple classifier systems - Kuncheva - 2005 |
Citation Context: ...e set of accurate and low-bias classifiers. In March 2005 the journal Information Fusion ran a special issue on ‘Diversity in multiple classifier systems’; Ludmila I. Kuncheva gave a guest editorial (Kuncheva, 2005). Melville and Mooney (2005) presented a new method for generating ensembles, DECORATE (Diverse Ensemble Creation by Oppositional Relabeling of Artific... |
17 | Evolving hybrid ensembles of learning machines for better generalisation. Neurocomputing - Chandra, Yao - 2006 |
17 | On robustness of on-line boosting - a competitive study - Leistner, Saffari, et al. - 2009 |
15 | Performance Analysis and Comparison of Linear Combiners for Classifier Fusion - Fumera, Roli - 2002 |
14 | Strategies for teaching layered networks classification tasks - Wittner, Denker - 1988 |
13 | Critic-driven ensemble classification - Miller, Yan - 1999 |
9 | Incremental construction of classifier and discriminant ensembles - Ulas, Semerci, et al. - 2009 |
8 | Predictive learning via rule ensembles. The Annals of Applied Statistics - Friedman, Popescu - 2008 |
6 | Is combining classifiers with stacking better than selecting the best one - Džeroski, Ženko - 2004 |
5 | Investigating the influence of the choice of the ensemble members in accuracy and diversity of selection-based and fusion-based methods for ensembles - Canuto, Abreu, et al. |
4 | Bagging for Gaussian process regression. Neurocomputing 72(7):1605–1610 - Chen, Ren - 2009 |
4 | Problem-based learning - Rothman, Page - 2002 |
4 | An investigation of the effects of correlation, autocorrelation, and sample size in classifier fusion - Leap - 2004 |
2 | Feature selection for ensembles. In: American Association for Artificial Intelligence - Opitz - 1999 |
2 | Stabilizing Weak Classifiers: Regularization and Combining Techniques in Discriminant Analysis - Skurichina - 2001 |