Results 1 - 10 of 1,253
Greedy Function Approximation: A Gradient Boosting Machine
Annals of Statistics, 2000
Abstract: Function approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space. A connection is made between stagewise additive expansions and steepest-descent minimization. A general gradient-descent "boosting" paradigm is developed for additi ...
Cited by 1000 (13 self)
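The abstract's idea of gradient descent in function space can be sketched concretely: at each round, fit a weak learner to the negative gradient of the loss (for squared loss, simply the residuals) and take a small step. This is an illustrative toy implementation with decision stumps on a single feature, not the paper's algorithm; all function names are ours.

```python
import numpy as np

def fit_stump(x, r):
    # Find the threshold split of feature x that best fits targets r
    # in the least-squares sense (a depth-1 regression tree).
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, left_val, right_val = best
    return t, left_val, right_val

def predict_stump(stump, x):
    t, left_val, right_val = stump
    return np.where(x <= t, left_val, right_val)

def gradient_boost(x, y, n_rounds=100, lr=0.1):
    # Start from the constant predictor; each round moves the model a
    # small step in function space along the negative loss gradient.
    pred = np.full_like(y, y.mean(), dtype=float)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred  # negative gradient of 1/2 (y - F)^2 w.r.t. F
        s = fit_stump(x, residual)
        pred += lr * predict_stump(s, x)
        stumps.append(s)
    return y.mean(), stumps, pred
```

The learning rate `lr` plays the role of the shrinkage factor: smaller steps need more rounds but typically generalize better.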
Algorithms for Nonnegative Matrix Factorization
In NIPS, 2001
Abstract: Nonnegative matrix factorization (NMF) has previously been shown to be a useful decomposition for multivariate data. Two different multiplicative algorithms for NMF are analyzed. They differ only slightly in the multiplicative factor used in the update rules. One algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence. The monotonic convergence of both algorithms can be proven using an auxiliary function analogous to that used for proving convergence of the Expectation-Maximization algorithm ...
Cited by 1246 (5 self)
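Of the two update rules the abstract mentions, the least-squares variant can be sketched as follows. This is a minimal illustration of multiplicative updates minimizing ||V - WH||_F^2, assuming random nonnegative initialization; the function name and the small epsilon added for numerical safety are our own choices, and the KL-divergence variant uses a slightly different multiplicative factor.

```python
import numpy as np

def nmf(V, k, n_iter=200, eps=1e-9):
    # Factor the nonnegative matrix V (m x n) as W (m x k) @ H (k x n).
    rng = np.random.default_rng(0)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Multiplicative updates: each factor is scaled elementwise, so
        # nonnegativity is preserved automatically at every step.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because each update multiplies by a nonnegative ratio, no projection step is needed, which is what makes these rules so simple compared with projected gradient descent.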
Choosing Multiple Parameters for Support Vector Machines
Machine Learning, 2002
Abstract: The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing para ...
Cited by 470 (17 self)
Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization
J. Optim. Theory Appl., 2001
Abstract: We study the convergence properties of a (block) coordinate descent method applied to minimize a nondifferentiable (nonconvex) function f(x1, ..., xN) with certain separability and regularity properties. Assuming that f is continuous on a compact level set, the subsequence convergence of the iterate ...
Cited by 298 (3 self)
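The method the abstract analyzes cycles through blocks of variables, exactly minimizing over one block while holding the others fixed. A minimal smooth, convex sketch (the example function and starting point are ours; the paper's contribution is handling the nondifferentiable case, where separability of the nonsmooth part across blocks keeps each block update well posed):

```python
def block_coordinate_descent(n_iter=100):
    # Cyclic coordinate descent on f(x, y) = (x - 1)^2 + (y + 2)^2 + x*y,
    # a strongly convex quadratic with closed-form block minimizers.
    x, y = 0.0, 0.0
    for _ in range(n_iter):
        x = 1.0 - y / 2.0   # argmin over x: set df/dx = 2(x - 1) + y = 0
        y = -2.0 - x / 2.0  # argmin over y: set df/dy = 2(y + 2) + x = 0
    return x, y
```

Here each cycle contracts the error by a constant factor, so the iterates converge to the unique stationary point (x, y) = (8/3, -10/3); for nonconvex or nonsmooth f, only subsequence convergence to stationary points can be guaranteed, which is precisely the setting of the paper.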
The Steepest Descent Minimization of Double-Well Stored Energies Does Not Yield Vectorial Microstructures
Submitted to Modélisation Mathématique et Analyse Numérique, 2001
Abstract: We prove that the Steepest Descent algorithm applied to the minimization of total stored energies with rank-one related rotationally symmetric energy wells does not produce relaxing vectorial microstructures with nontrivial Young measures.
Boosting Algorithms as Gradient Descent
2000
Abstract: Much recent attention, both experimental and theoretical, has been focused on classification algorithms which produce voted combinations of classifiers. Recent theoretical work has shown that the impressive generalization performance of algorithms like AdaBoost can be attributed to the classifier having large margins on the training data. We present an abstract algorithm for finding linear combinations of functions that minimize arbitrary cost functionals (i.e. functionals that do not necessarily depend on the margin). Many existing voting methods can be shown to be special cases of this abstract ...
Cited by 156 (1 self)
Geodesic Active Regions and Level Set Methods for Supervised Texture Segmentation
International Journal of Computer Vision, 2002
Abstract: This paper presents a novel variational framework to deal with frame partition problems in Computer Vision. This framework exploits boundary and region-based segmentation modules under a curve-based optimization objective function. The task of supervised texture segmentation is considered to demonst ... by unifying region and boundary-based information as an improved Geodesic Active Contour Model. The defined objective function is minimized using a gradient-descent method where a level set approach is used to implement the obtained PDE. According to this PDE, the curve propagation towards the final solution ...
Cited by 312 (9 self)
Feature Selection for SVMs
Advances in Neural Information Processing Systems 13, 2000
Abstract: We introduce a method of feature selection for Support Vector Machines. The method is based upon finding those features which minimize bounds on the leave-one-out error. This search can be efficiently performed via gradient descent. The resulting algorithms are shown to be superior to some standard ...
Cited by 282 (17 self)
Descent Heuristics for Unconstrained Minimization
Abstract: Semidefinite relaxations often provide excellent starting points for nonconvex problems with multiple local minimizers. This work aims to find a local minimizer within a certain neighborhood of the starting point and with a small objective value. Several approaches are motivated and compared with each other.
Key words: descent method, unconstrained minimization, local minimizer.
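The basic building block behind several of the descent methods listed above is steepest descent with a backtracking (Armijo) line search. A minimal sketch, with our own function names and constants (the sufficient-decrease parameter 1e-4 and the halving factor are conventional defaults, not taken from any of these papers):

```python
import numpy as np

def steepest_descent(f, grad, x0, n_iter=100):
    # Minimize f by stepping along the negative gradient, shrinking the
    # step size until the Armijo sufficient-decrease condition holds.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        g = grad(x)
        if np.linalg.norm(g) < 1e-10:
            break  # approximately stationary
        t = 1.0
        # Backtrack: require f(x - t g) <= f(x) - 1e-4 * t * ||g||^2.
        while f(x - t * g) > f(x) - 1e-4 * t * (g @ g):
            t *= 0.5
        x = x - t * g
    return x
```

On nonconvex problems this only finds a local minimizer near the starting point, which is exactly why the entry above pairs it with a good starting point from a semidefinite relaxation.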