Results 1 - 3 of 3
Statistical Learning Algorithms Based on Bregman Distances
, 1997
Abstract

Cited by 23 (1 self)
We present a class of statistical learning algorithms formulated in terms of minimizing Bregman distances, a family of generalized entropy measures associated with convex functions. The inductive learning scheme is akin to growing a decision tree, with the Bregman distance filling the role of the impurity function in tree-based classifiers. Our approach is based on two components. In the feature selection step, each linear constraint in a pool of candidate features is evaluated by the reduction in Bregman distance that would result from adding it to the model. In the constraint satisfaction step, all of the parameters are adjusted to minimize the Bregman distance subject to the chosen constraints. We introduce a new iterative estimation algorithm for carrying out both the feature selection and constraint satisfaction steps, and outline a proof of the convergence of these algorithms.
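The Bregman distance the abstract refers to has a simple closed form: D_F(x, y) = F(x) - F(y) - <grad F(y), x - y> for a convex function F. A minimal sketch (function names are illustrative, not from the paper) showing how different choices of F recover familiar distances:

```python
import math

def bregman(F, gradF, x, y):
    """Bregman distance D_F(x, y) = F(x) - F(y) - <gradF(y), x - y>."""
    return F(x) - F(y) - sum(g * (xi - yi) for g, xi, yi in zip(gradF(y), x, y))

# F(x) = ||x||^2 recovers the squared Euclidean distance.
sq_norm = lambda v: sum(vi * vi for vi in v)
grad_sq = lambda v: [2.0 * vi for vi in v]

# F(x) = sum_i x_i log x_i (negative entropy) recovers the KL divergence
# between probability vectors -- the "generalized entropy" case.
neg_ent = lambda v: sum(vi * math.log(vi) for vi in v)
grad_ne = lambda v: [math.log(vi) + 1.0 for vi in v]

x, y = [0.2, 0.8], [0.5, 0.5]
d_euclid = bregman(sq_norm, grad_sq, x, y)   # ||x - y||^2 = 0.18
d_kl = bregman(neg_ent, grad_ne, x, y)       # KL(x || y)
```

In the paper's scheme, the reduction in such a distance (rather than an impurity measure like Gini or entropy gain alone) scores each candidate feature.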
On the linear convergence of descent methods for convex essentially smooth minimization
 SIAM J. Control Optim
, 1992
Abstract

Cited by 20 (7 self)
Dedicated to those courageous people who, on June 4, 1989, sacrificed their lives in
Projections Onto Convex Sets (POCS) Based Optimization by Lifting
, 2013
Abstract
Two new optimization techniques based on the projections onto convex sets (POCS) framework for solving convex and some nonconvex optimization problems are presented. The dimension of the minimization problem is lifted by one and sets corresponding to the cost function are defined. If the cost function is a convex function in R^N, the corresponding set is a convex set in R^(N+1). The iterative optimization approach starts with an arbitrary initial estimate in R^(N+1), and an orthogonal projection is performed onto one of the sets in a sequential manner at each step. The method provides globally optimal solutions for total-variation, filtered-variation, l1, and entropic cost functions. It is also experimentally observed that cost functions based on lp, p < 1, can be handled by using the supporting hyperplane concept.
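The lifting idea can be illustrated on a toy case: for f(x) = |x| on R, the lifted set in R^2 is the epigraph {(x, y) : y >= |x|}, the intersection of two halfspaces, and the global minimizer is the point where the epigraph meets the hyperplane y = 0. Cycling orthogonal projections over these convex sets is plain POCS. A minimal sketch (this is an illustration under those assumptions, not the paper's full algorithm):

```python
def project_halfspace(z, a, b):
    """Orthogonal projection of z onto the halfspace {v : <a, v> <= b}."""
    viol = sum(ai * zi for ai, zi in zip(a, z)) - b
    if viol <= 0:
        return list(z)  # already inside; projection is the identity
    scale = viol / sum(ai * ai for ai in a)
    return [zi - scale * ai for zi, ai in zip(z, a)]

# Lift f(x) = |x| to R^2: epi(f) = {y >= x} intersect {y >= -x}, i.e. the
# halfspaces x - y <= 0 and -x - y <= 0. The epigraph meets the hyperplane
# y = 0 only at (0, 0), the global minimizer of f.
halfspaces = [((1.0, -1.0), 0.0), ((-1.0, -1.0), 0.0)]

z = [2.0, 0.5]                 # arbitrary initial estimate in R^2
for _ in range(200):
    for a, b in halfspaces:    # project onto each epigraph halfspace
        z = project_halfspace(z, a, b)
    z = [z[0], 0.0]            # project onto the hyperplane y = 0
# z converges to (0, 0), i.e. x* = 0 with cost f(x*) = 0
```

For nonsmooth costs like total variation the projection onto the lifted set is no longer this simple, which is where the sequential-projection machinery of the paper does the real work.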