Results 1–10 of 270
Logistic Regression, AdaBoost and Bregman Distances
Learning Theory, 2000. Cited by 261 (44 self).
Abstract: We give a unified account of boosting and logistic regression in which each learning problem is cast in terms of optimization of Bregman distances. The striking similarity of the two problems in this framework allows us to design and analyze algorithms for both simultaneously, and to easily adapt ...
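The unified view the abstract describes can be illustrated numerically: AdaBoost minimizes the exponential loss and logistic regression the logistic loss, over the same linear combination f(x) = w·x. The following is a minimal sketch of that parallel using synthetic data and plain gradient descent — not the paper's Bregman-projection algorithms:

```python
import numpy as np

# Synthetic binary classification data (illustrative, not from the paper).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.sign(X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200))

def descend(loss_grad, steps=300, lr=0.05):
    """Plain gradient descent on a loss over linear models f(x) = w.x."""
    w = np.zeros(3)
    for _ in range(steps):
        w = w - lr * loss_grad(w)
    return w

# AdaBoost's objective: mean exponential loss exp(-y f(x)).
exp_grad = lambda w: -((y * np.exp(-y * (X @ w))) @ X) / len(y)
# Logistic regression's objective: mean log(1 + exp(-y f(x))).
log_grad = lambda w: -((y / (1 + np.exp(y * (X @ w)))) @ X) / len(y)

w_exp = descend(exp_grad)
w_log = descend(log_grad)
acc = lambda w: np.mean(np.sign(X @ w) == y)
print("exp-loss acc:", acc(w_exp), "log-loss acc:", acc(w_log))
```

The two gradient expressions differ only in the weight placed on each example (exp(-y f) versus the sigmoid 1/(1 + exp(y f))), which is the structural similarity the paper exploits.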
The Boosting Approach to Machine Learning: An Overview
2002. Cited by 430 (17 self).
Abstract: Boosting is a general method for improving the accuracy of any given learning algorithm. Focusing primarily on the AdaBoost algorithm, this chapter overviews some of the recent work on boosting including analyses of AdaBoost's training error and generalization error; boosting's connection to game theory and linear programming; the relationship between boosting and logistic regression; extensions of AdaBoost for multiclass classification problems; methods of incorporating human knowledge into boosting; and experimental and applied work using boosting.
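As a companion to the overview, here is a minimal AdaBoost sketch with threshold "stumps" on a toy one-dimensional problem; the data and the weak learner are invented for illustration and are not taken from the chapter:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 300)
y = np.where(np.abs(x) < 0.5, 1.0, -1.0)   # not learnable by a single stump

def best_stump(x, y, w):
    """Exhaustively pick the (threshold, sign) stump with least weighted error."""
    best = None
    for t in np.unique(x):
        for s in (1.0, -1.0):
            pred = np.where(x < t, s, -s)
            err = w @ (pred != y)
            if best is None or err < best[0]:
                best = (err, t, s)
    return best

def adaboost(x, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)                 # example weights
    ensemble = []                           # list of (alpha, threshold, sign)
    for _ in range(rounds):
        err, t, s = best_stump(x, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(x < t, s, -s)
        w *= np.exp(-alpha * y * pred)      # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, t, s))
    return ensemble

def predict(ensemble, x):
    f = sum(a * np.where(x < t, s, -s) for a, t, s in ensemble)
    return np.sign(f)

H = adaboost(x, y)
print("training accuracy:", np.mean(predict(H, x) == y))
```

No single stump can represent the interval target, but the weighted vote of a few rounds of stumps can, which is the "weak learners into a single accurate model" effect the abstract summarizes.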
Surrogate Maximization/Minimization Algorithms for AdaBoost and the Logistic Regression Model
In The 21st International Conference on Machine Learning, 2004. Cited by 9 (1 self).
Abstract: Surrogate maximization (or minimization) (SM) algorithms are a family of algorithms that can be regarded as a generalization of expectation-maximization (EM) algorithms. ...
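The surrogate idea can be made concrete with a small majorize-minimize example (a generic illustration, not the paper's SM algorithms): the logistic loss is minimized by repeatedly minimizing a quadratic surrogate built from the uniform curvature bound, since the logistic Hessian is dominated by X^T X / 4.

```python
import numpy as np

# Synthetic data (illustrative choices, not from the paper).
rng = np.random.default_rng(2)
X = rng.normal(size=(150, 2))
y = np.sign(X @ np.array([2.0, -1.0]) + 0.2 * rng.normal(size=150))

def logistic_loss(w):
    return np.mean(np.log1p(np.exp(-y * (X @ w))))

B = X.T @ X / (4 * len(y))          # fixed majorizing curvature matrix
B_inv = np.linalg.inv(B)
w = np.zeros(2)
losses = [logistic_loss(w)]
for _ in range(50):
    g = -((y / (1 + np.exp(y * (X @ w)))) @ X) / len(y)  # loss gradient
    w = w - B_inv @ g               # exact minimizer of the quadratic surrogate
    losses.append(logistic_loss(w))

print("loss:", losses[0], "->", losses[-1])
```

Each surrogate touches the loss at the current iterate and lies above it everywhere, so minimizing the surrogate decreases the true loss monotonically — the same guarantee EM gives for likelihoods, which is why SM can be viewed as its generalization.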
A Real generalization of discrete AdaBoost
2007. Cited by 9 (6 self).
Abstract: Scaling discrete AdaBoost to handle real-valued weak hypotheses has often been done under the auspices of convex optimization, but little is generally known from the original boosting model standpoint. We introduce a novel generalization of discrete AdaBoost which departs from this mainstream ...
Information geometry of U-Boost and Bregman divergence
Neural Computation, 2004. Cited by 34 (9 self).
Abstract: We aim at an extension of AdaBoost to U-Boost, in the paradigm to build a stronger classification machine from a set of weak learning machines. A geometric understanding of the Bregman divergence defined by a generic convex function U leads to the U-Boost method in the framework of information geometry ...
AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods
2013. Cited by 1 (0 self).
Abstract: Boosting methods are highly popular and effective supervised learning methods which combine weak learners into a single accurate model with good statistical performance. In this paper, we analyze two well-known boosting methods, AdaBoost and Incremental Forward Stagewise Regression (FSε), by establishing ...
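For reference, Incremental Forward Stagewise Regression as it is commonly described — nudge the coefficient of the predictor most correlated with the current residual by a small ε each step — can be sketched as follows; the data, ε, and step count are illustrative choices, not the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 5))
beta_true = np.array([3.0, 0.0, -2.0, 0.0, 1.0])
y = X @ beta_true + 0.1 * rng.normal(size=100)

def forward_stagewise(X, y, eps=0.01, steps=5000):
    """FS_eps: tiny coordinate steps toward the most correlated predictor."""
    beta = np.zeros(X.shape[1])
    r = y.copy()                          # current residual
    for _ in range(steps):
        c = X.T @ r                       # correlations with the residual
        j = np.argmax(np.abs(c))          # most correlated predictor
        beta[j] += eps * np.sign(c[j])    # tiny coordinate step
        r -= eps * np.sign(c[j]) * X[:, j]
    return beta

beta = forward_stagewise(X, y)
print(np.round(beta, 1))
```

Like AdaBoost's rounds, each FSε step is a greedy first-order move along a single coordinate, which is the shared structure the paper's convex-optimization analysis formalizes.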
3D Sound for Virtual Reality and Multimedia
2000. Cited by 282 (5 self).
Abstract: This paper gives HRTF magnitude data in numerical form for 43 frequencies between 0.2–12 kHz, the average of 12 studies representing 100 different subjects. However, no phase data is included in the tables; group delay simulation would need to be included in order to account for ITD. In 3D sound ...
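Since the tables omit phase, the interaural time difference (ITD) has to be restored through a modeled group delay, as the abstract notes. As a hedged illustration (not taken from the paper), the classic Woodworth spherical-head approximation ITD = (r/c)(θ + sin θ), with an assumed head radius and sample rate:

```python
import numpy as np

def woodworth_itd(azimuth_rad, head_radius=0.0875, c=343.0):
    """Woodworth spherical-head ITD (seconds) for a source at a given azimuth."""
    return (head_radius / c) * (azimuth_rad + np.sin(azimuth_rad))

fs = 44_100  # illustrative sample rate for converting delay to samples
for deg in (0, 30, 60, 90):
    itd = woodworth_itd(np.radians(deg))
    print(f"{deg:3d} deg: ITD = {itd * 1e6:6.1f} us = {itd * fs:5.1f} samples")
```

A delay of this size applied to one ear's HRTF-filtered signal is the kind of group-delay simulation the abstract says is needed to complement the magnitude-only tables.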