Results 11 – 20 of 3,092
Graph implementations for nonsmooth convex programs
Recent Advances in Learning and Control, Lecture Notes in Control and Information Sciences, 2008
"... Summary. We describe graph implementations, a generic method for representing a convex function via its epigraph, described in a disciplined convex programming framework. This simple and natural idea allows a very wide variety of smooth and nonsmooth convex programs to be easily specified and effi ..."
Cited by 263 (10 self)
and efficiently solved, using interior-point methods for smooth or cone convex programs. Key words: Convex optimization, nonsmooth optimization, disciplined convex programming, optimization modeling languages, semidefinite program
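The epigraph idea the snippet describes can be made concrete with a toy example (ours, not the paper's CVX software): the nonsmooth problem of minimizing |x − 3| becomes a linear program once an epigraph variable t with |x − 3| ≤ t is introduced. A minimal sketch, assuming SciPy is available:

```python
import numpy as np
from scipy.optimize import linprog

# Epigraph reformulation of the nonsmooth problem  minimize |x - 3|:
#   minimize t  subject to  x - 3 <= t  and  -(x - 3) <= t.
# Variables are (x, t); the objective picks out t.
c = np.array([0.0, 1.0])
A_ub = np.array([[1.0, -1.0],    #  x - t <= 3
                 [-1.0, -1.0]])  # -x - t <= -3
b_ub = np.array([3.0, -3.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (0, None)])  # x free, t >= 0
print(res.x)  # optimum at x = 3, t = 0
```

The same device underlies disciplined convex programming: every nonsmooth atom carries a known conic representation of its epigraph, so the modeling layer can hand the solver a smooth cone program.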
Near-Optimal Adaptive Polygonization
in CGI’99 Proceedings, IEEE Computer Society, 1999
"... Consider a triangulation of the xy plane, and a general surface z = f(x, y). The points of the triangle, when lifted to the surface, form a linear spline approximation to the surface. We are interested in the error between the surface and the linear approximant. In fact, we are interested in buildin ..."
Cited by 4 (1 self)
in building triangulations in the plane such that the induced linear approximant is near-optimal with respect to a given error. Here we describe a new method, which iteratively adds points to a "Delaunay-like" triangulation of the plane. We locally approximate f by a quadratic surface and utilize
Near-Optimal Joint Object Matching via Convex Relaxation
, 2014
"... Joint matching over a collection of objects aims at aggregating information from a large collection of similar instances (e.g. images, graphs, shapes) to improve maps between pairs of them. Given multiple objects and matches computed between a few object pairs in isolation, the goal is to recover an ..."
greedy rounding strategy. Theoretically, MatchLift exhibits near-optimal error-correction ability, i.e. in the asymptotic regime it is guaranteed to work even when a dominant fraction 1 − Θ(log² n / √n)
Wavelet shrinkage: Asymptopia?
Journal of the Royal Statistical Society, Ser. B, 1995
"... Considerable effort has been directed recently to develop asymptotically minimax methods in problems of recovering infinite-dimensional objects (curves, densities, spectral densities, images) from noisy data. A rich and complex body of work has evolved, with nearly or exactly minimax estimators bein ..."
Cited by 295 (36 self)
, Sobolev classes, and Bounded Variation. This is a much broader near-optimality than anything previously proposed in the minimax literature. Finally, the theory underlying the method is interesting, as it exploits a correspondence between statistical questions and questions of optimal recovery
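The shrinkage rule behind this line of work is simple to state: soft-threshold the empirical wavelet coefficients at a level calibrated to the noise. A minimal sketch with invented data, using the universal threshold σ√(2 log n) (one of several thresholds in this literature):

```python
import numpy as np

def soft_threshold(coeffs, lam):
    """Shrink coefficients toward zero: sign(x) * max(|x| - lam, 0)."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - lam, 0.0)

# Hypothetical noisy wavelet coefficients: a few large (signal) plus noise.
rng = np.random.default_rng(0)
n = 1024
sigma = 1.0
coeffs = rng.normal(0.0, sigma, n)
coeffs[:5] += np.array([10.0, -8.0, 6.0, -5.0, 4.0])  # sparse signal

lam = sigma * np.sqrt(2.0 * np.log(n))  # universal threshold
shrunk = soft_threshold(coeffs, lam)

# Pure-noise coefficients are set exactly to zero with high probability,
# while the few large coefficients survive (slightly shrunk).
print(np.count_nonzero(shrunk))
```

The near-minimaxity results the snippet mentions say that this one rule adapts, up to log factors, across whole scales of smoothness classes at once.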
Near-Optimal Critical Sink Routing Tree Constructions
, 1995
"... We present critical-sink routing tree (CSRT) constructions which exploit available critical-path information to yield high-performance routing trees. Our CS-Steiner and "Global Slack Removal" algorithms together modify traditional Steiner tree constructions to optimize signal delay at id ..."
Cited by 55 (13 self)
Distributed Subgradient Methods for Multi-Agent Optimization
, 2007
"... We study a distributed computation model for optimizing a sum of convex objective functions corresponding to multiple agents. For solving this (not necessarily smooth) optimization problem, we consider a subgradient method that is distributed among the agents. The method involves every agent minimiz ..."
Cited by 240 (25 self)
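The scheme the abstract describes can be sketched on a toy problem of our own choosing (not the paper's setup): each agent holds a private scalar c_i, the network minimizes Σ_i |x − c_i| (whose minimizer is the median of the c_i), and every iteration combines a consensus averaging step with a step along the agent's own subgradient:

```python
import numpy as np

# Hypothetical private data: agent i wants to minimize f_i(x) = |x - c_i|.
# The sum is minimized at the median of c (here 3.0).
c = np.array([1.0, 2.0, 3.0, 7.0, 9.0])
n = len(c)

# Doubly stochastic mixing weights on a ring: self 1/2, each neighbor 1/4.
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = 0.5
    A[i, (i - 1) % n] = 0.25
    A[i, (i + 1) % n] = 0.25

x = np.zeros(n)                # each agent's local estimate
for k in range(1, 50001):
    g = np.sign(x - c)         # subgradient of |x_i - c_i| at x_i
    x = A @ x - (1.0 / k) * g  # average with neighbors, then local step

# All agents approach the common minimizer, the median of c.
print(x)
```

The key point the method formalizes is visible here: no agent ever sees another agent's objective, yet the mixing matrix spreads enough information for all iterates to agree on a minimizer of the sum.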
On the Construction of Optimal or Near-Optimal Rectilinear Steiner Arborescence
, 1994
"... Given a set of nodes N lying on the first quadrant of the Euclidean plane E², the Rectilinear Minimum Steiner Arborescence (RMSA) problem is to find a shortest-path tree of the minimum length rooted at the origin, containing all nodes in N, and composed solely of horizontal and vertical arcs or ..."
oriented only from left to right or from bottom to top [1]. In this paper, we propose an efficient algorithm for constructing optimal or near-optimal arborescences, and present a constructive method for finding a lower bound of the length of the optimal arborescence. Experimental results indicate
Dual methods for nonconvex spectrum optimization of multicarrier systems
IEEE Trans. Commun., 2006
"... The design and optimization of multicarrier communications systems often involve a maximization of the total throughput subject to system resource constraints. The optimization problem is numerically difficult to solve when the problem does not have a convexity structure. This paper makes progress ..."
Cited by 201 (7 self)
complexity iterative spectrum balancing algorithm based on these ideas, and show that the new algorithm achieves near-optimal performance in many practical situations.
The group Lasso for logistic regression
Journal of the Royal Statistical Society, Series B, 2008
"... Summary. The group lasso is an extension of the lasso to do variable selection on (predefined) groups of variables in linear regression models. The estimates have the attractive property of being invariant under groupwise orthogonal reparameterizations. We extend the group lasso to logistic regressi ..."
Cited by 276 (11 self)
regression models and present an efficient algorithm that is especially suitable for high-dimensional problems and can also be applied to generalized linear models to solve the corresponding convex optimization problem. The group lasso estimator for logistic regression is shown to be statistically
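The groupwise structure of the estimator shows up most clearly in the block soft-thresholding (proximal) operator associated with the group-lasso penalty; the sketch below is a generic illustration of that operator with invented data, not the paper's algorithm:

```python
import numpy as np

def group_soft_threshold(beta, groups, lam):
    """Proximal operator of the group-lasso penalty lam * sum_g ||beta_g||_2:
    each predefined group is shrunk toward zero as a block, and dropped
    entirely when its norm falls below lam -- groupwise variable selection."""
    out = beta.copy()
    for idx in groups:
        norm = np.linalg.norm(beta[idx])
        out[idx] = 0.0 if norm <= lam else (1.0 - lam / norm) * beta[idx]
    return out

# Hypothetical coefficient vector with two predefined groups of variables.
beta = np.array([3.0, 4.0, 0.3, -0.4])
groups = [np.array([0, 1]), np.array([2, 3])]

shrunk = group_soft_threshold(beta, groups, lam=1.0)
# Group 1 (norm 5) is shrunk but kept; group 2 (norm 0.5) is zeroed out.
print(shrunk)  # [2.4, 3.2, 0.0, 0.0]
```

Because whole groups enter or leave together, the estimate is invariant under orthogonal reparameterizations within each group, the property the abstract highlights.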
Stochastic Approximation Approach to Stochastic Programming
"... In this paper we consider optimization problems where the objective function is given in a form of the expectation. A basic difficulty of solving such stochastic optimization problems is that the involved multidimensional integrals (expectations) cannot be computed with high accuracy. The aim of th ..."
Cited by 267 (20 self)
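The basic stochastic approximation recursion the abstract builds on can be sketched on a toy quadratic (our example, not the paper's): with an unbiased stochastic gradient G(x, ξ) = x − ξ and steps γ_k = 1/k, both the iterates and their running average converge to the minimizer E[ξ]:

```python
import numpy as np

# Stochastic approximation:  x_{k+1} = x_k - gamma_k * G(x_k, xi_k),
# applied to  minimize f(x) = 0.5 * E[(x - xi)^2]  with  xi ~ N(2, 1).
# The exact expectation is unavailable; only noisy samples of the
# gradient are used, which is the difficulty the abstract describes.
rng = np.random.default_rng(1)
x = 0.0
avg = 0.0
for k in range(1, 20001):
    xi = rng.normal(2.0, 1.0)
    x -= (1.0 / k) * (x - xi)  # gamma_k = 1/k step on a sampled gradient
    avg += (x - avg) / k       # running average of the iterates

print(x, avg)  # both near the minimizer E[xi] = 2
```

Averaging the iterates, rather than trusting the last one, is the robustness device emphasized in this line of work: it tolerates much less delicate step-size choices.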