Results 11 – 20 of 228
Greedy in Approximation Algorithms
 PROC. OF ESA
, 2006
Abstract

Cited by 24 (1 self)
The objective of this paper is to characterize classes of problems for which a greedy algorithm finds solutions provably close to optimum. To that end, we introduce the notion of k-extendible systems, a natural generalization of matroids, and show that a greedy algorithm is a 1/k-factor approximation for these systems. Many seemingly unrelated problems fit in our framework, e.g.: b-matching, maximum profit scheduling and maximum asymmetric TSP. In the second half of the paper we focus on the maximum weight b-matching problem. The problem forms a 2-extendible system, so greedy gives us a 1/2-factor solution which runs in O(m log n) time. We improve this by providing two linear time approximation algorithms for the problem: a 1/2-factor algorithm that runs in O(bm) time, and a (2/3 − ε)-factor algorithm which runs in expected O(bm log(1/ε)) time.
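The greedy 1/2-factor bound for b-matching mentioned in the abstract can be illustrated with a minimal sketch: sort edges by weight and keep any edge whose endpoints still have residual capacity. The edge list, weights and capacity below are made-up inputs, not from the paper.

```python
# Minimal sketch of the greedy 1/2-approximation for maximum-weight
# b-matching (b-matching is a 2-extendible system). Inputs are
# illustrative, not from the paper.

def greedy_b_matching(edges, b):
    """edges: list of (weight, u, v) tuples; returns the chosen edges.

    Scan edges in decreasing weight order and keep an edge whenever
    both endpoints have matched degree < b."""
    degree = {}
    chosen = []
    for w, u, v in sorted(edges, reverse=True):
        if degree.get(u, 0) < b and degree.get(v, 0) < b:
            chosen.append((w, u, v))
            degree[u] = degree.get(u, 0) + 1
            degree[v] = degree.get(v, 0) + 1
    return chosen

# Example: a weighted 4-cycle with b = 1 (ordinary matching).
edges = [(5, "a", "b"), (4, "b", "c"), (3, "c", "d"), (2, "d", "a")]
print(greedy_b_matching(edges, b=1))  # [(5, 'a', 'b'), (3, 'c', 'd')]
```

On this instance greedy happens to find the optimum; in general the abstract's 1/k guarantee (here k = 2) is all that the sorted scan promises.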
Approximating Graphic TSP by Matchings
 52ND ANNUAL IEEE SYMPOSIUM ON FOUNDATIONS OF COMPUTER SCIENCE
, 2011
Abstract

Cited by 24 (1 self)
We present a framework for approximating the metric TSP based on a novel use of matchings. Traditionally, matchings have been used to add edges in order to make a given graph Eulerian, whereas our approach also allows for the removal of certain edges, leading to a decreased cost. For the TSP on graphic metrics (graph-TSP), the approach yields a 1.461-approximation algorithm with respect to the Held-Karp lower bound. For graph-TSP restricted to a class of graphs that contains degree-three-bounded and claw-free graphs, we show that the integrality gap of the Held-Karp relaxation matches the conjectured ratio 4/3. The framework allows for generalizations in a natural way and also leads to a 1.586-approximation algorithm for the traveling salesman path problem on graphic metrics where the start and end vertices are prespecified.
Rapid mathematical programming
, 2004
Abstract

Cited by 23 (6 self)
This book was typeset with TeX using LaTeX and many further formatting packages. The pictures were prepared using pstricks, xfig, gnuplot and gmt. All numerals in this text are recycled. For my parents. Preface: "Avoid reality at all costs" — fortune(6). As the inclined reader will find out soon enough, this thesis is not about deeply involved mathematics as a means in itself, but about how to apply mathematics to solve real-world problems. We will show how to shape, forge, and wield our tool of choice to rapidly answer questions of concern to people outside the world of mathematics. But there is more to it. Our tool of choice is software. This is not unusual, since it has become standard practice in science to use software as part of experiments and sometimes even for proofs. But in order to call an experiment scientific it must be reproducible. Is this the case?
When LP is the Cure for Your Matching Woes: Improved Bounds for Stochastic Matchings (Extended Abstract)
Abstract

Cited by 23 (5 self)
Consider a random graph model where each possible edge e is present independently with some probability pe. We are given these numbers pe, and want to build a large/heavy matching in the randomly generated graph. However, the only way we can find out whether an edge is present or not is to query it, and if the edge is indeed present in the graph, we are forced to add it to our matching. Further, each vertex i is allowed to be queried at most ti times. How should we adaptively query the edges to maximize the expected weight of the matching? We consider several matching problems in this general framework (some of which arise in kidney exchanges and online dating, and others arise in modeling online advertisements); we give LP-rounding based constant-factor approximation algorithms for these problems. Our main results are:
• We give a 5.75-approximation for weighted stochastic matching on general graphs, and a 5-approximation on bipartite graphs. This answers an open question from [Chen et al. ICALP 09].
• Combining our LP-rounding algorithm with the natural greedy algorithm, we give an improved 3.88-approximation for unweighted stochastic matching on general graphs and a 3.51-approximation on bipartite graphs.
• We introduce a generalization of the stochastic online matching problem [Feldman et al. FOCS 09] that also models preference-uncertainty and timeouts of buyers, and give a constant-factor approximation algorithm.
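The query-and-commit model the abstract describes can be sketched in a few lines. The query order below (decreasing weight × probability) is a simple illustrative policy, not the paper's LP-rounding algorithm, and the edge probabilities and timeouts are made-up inputs.

```python
import random

# Sketch of the stochastic matching query model: each edge exists
# independently with probability p; querying an edge that turns out
# to exist forces it into the matching; vertex i may be queried at
# most t_i times. The greedy query order is illustrative only.

def query_commit(edges, timeouts, rng):
    """edges: list of (weight, prob, u, v); timeouts: vertex -> t_i."""
    matched = set()
    queries = dict(timeouts)     # remaining query budget per vertex
    total = 0.0
    for w, p, u, v in sorted(edges, key=lambda e: e[0] * e[1], reverse=True):
        if u in matched or v in matched:
            continue
        if queries[u] <= 0 or queries[v] <= 0:
            continue
        queries[u] -= 1
        queries[v] -= 1
        if rng.random() < p:     # the queried edge is present ...
            matched.update((u, v))   # ... so we must take it
            total += w
    return total

rng = random.Random(0)
edges = [(3.0, 0.5, "a", "b"), (2.0, 0.9, "b", "c")]
print(query_commit(edges, {"a": 1, "b": 2, "c": 1}, rng))
```

With these numbers the policy queries (b, c) first, since 2.0 × 0.9 exceeds 3.0 × 0.5; the realized value then depends on the coin flips.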
Projectionfree Online Learning
Abstract

Cited by 21 (4 self)
The computational bottleneck in applying online learning to massive data sets is usually the projection step. We present efficient online learning algorithms that eschew projections in favor of much more efficient linear optimization steps using the Frank-Wolfe technique. We obtain a range of regret bounds for online convex optimization, with better bounds for specific cases such as stochastic online smooth convex optimization. Besides the computational advantage, other desirable features of our algorithms are that they are parameter-free in the stochastic case and produce sparse decisions. We apply our algorithms to computationally intensive applications of collaborative filtering, and show the theoretical improvements to be clearly visible on standard datasets.
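The core idea of replacing projections with linear optimization can be shown on the probability simplex, where the linear minimizer is just a vertex. This is a generic Frank-Wolfe step with the standard 2/(t+2) step size, not the paper's exact online algorithm; the quadratic objective is an illustrative stand-in for a loss.

```python
# Sketch of a projection-free (Frank-Wolfe) update over the
# probability simplex: the "linear oracle" returns the cheapest
# vertex, which replaces an expensive projection step.

def frank_wolfe_step(x, grad, t):
    """One Frank-Wolfe update: move x toward the simplex vertex
    minimizing the linearized objective <grad, s>."""
    n = len(x)
    i = min(range(n), key=lambda j: grad[j])  # linear oracle: best coordinate
    s = [0.0] * n
    s[i] = 1.0                                # vertex e_i of the simplex
    eta = 2.0 / (t + 2.0)                     # standard FW step size
    return [(1 - eta) * xj + eta * sj for xj, sj in zip(x, s)]

# Illustration: minimize f(x) = ||x - c||^2 over the simplex,
# where c = (0.2, 0.8) already lies on the simplex.
c = [0.2, 0.8]
x = [0.5, 0.5]
for t in range(200):
    grad = [2 * (xj - cj) for xj, cj in zip(x, c)]
    x = frank_wolfe_step(x, grad, t)
print(x)  # approaches (0.2, 0.8)
```

Note that every iterate is a convex combination of simplex vertices, so feasibility is maintained for free; this is also the source of the sparse decisions the abstract mentions.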
On the frontier of polynomial computations in tropical geometry
 Journal of Symbolic Computation
Abstract

Cited by 20 (0 self)
We study some basic algorithmic problems concerning the intersection of tropical hypersurfaces in general dimension: deciding whether this intersection is nonempty, whether it is a tropical variety, and whether it is connected, as well as counting the number of connected components. We characterize the borderline between tractable and hard computations by proving NP-hardness and #P-hardness results under various strong restrictions of the input data, as well as providing polynomial time algorithms for various other restrictions.
Performance Analysis of Dynamic OFDMA Systems with Inband Signaling
, 2005
Abstract

Cited by 20 (3 self)
Within the last decade the OFDM transmission scheme has become part of several standards for wireless systems. Today, OFDM is even a candidate for 4G wireless systems. It is well known that dynamic OFDMA systems potentially increase the spectral efficiency. They exploit
Weighted Popular Matchings
, 2006
Abstract

Cited by 19 (1 self)
Consider the problem of matching a set of individuals X to a set of items Y where each individual has a weight and personal preferences over the items. The objective is to construct a matching M that is stable in the sense that there is no matching M′ such that the weighted majority vote will choose M′ over M. More formally, we are given a bipartite graph (X, Y, E), a weight w(x) ∈ R+ for each individual x ∈ X, and a rank function r: E → {1, ..., |Y|} encoding the individual preferences. For every applicant x and items y1, y2 ∈ Y we say applicant x prefers y1 over y2 if r(x, y1) < r(x, y2), and x is indifferent between y1 and y2 if r(x, y1) = r(x, y2). The preference lists are said to be strictly ordered if applicants are never indifferent between two items; otherwise the preference lists are said to contain ties. Let M and M′ be two matchings. An applicant x prefers M over M′ if x prefers the item he/she gets in M over the item he/she gets in M′. A matching M is more popular than M′ if the applicants that prefer M over M′ outweigh those that prefer M′ over M. Finally, a matching M is weighted popular if there is no matching M′ more popular than M. In the weighted popular matching problem we must determine if a given instance admits a
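The weighted "more popular than" comparison defined in the abstract is easy to make concrete. The applicants, weights and ranks below are made-up inputs; ranks are lower-is-better, matching the r(x, y) convention above.

```python
# Sketch of the weighted popularity vote: each applicant who prefers
# M over M' contributes +w(x), each who prefers M' contributes -w(x);
# M is more popular than M' if the total is positive.

def more_popular(M, M2, weights, r):
    """M, M2: dict applicant -> item; weights: applicant -> w(x);
    r: dict (applicant, item) -> rank (lower is better)."""
    vote = 0.0
    for x, w in weights.items():
        rank_m  = r.get((x, M.get(x)),  float("inf"))  # unmatched = worst
        rank_m2 = r.get((x, M2.get(x)), float("inf"))
        if rank_m < rank_m2:
            vote += w
        elif rank_m2 < rank_m:
            vote -= w
    return vote > 0

weights = {"x1": 2.0, "x2": 1.0}
r = {("x1", "a"): 1, ("x1", "b"): 2, ("x2", "a"): 1, ("x2", "b"): 2}
M  = {"x1": "a", "x2": "b"}   # x1 gets its first choice
M2 = {"x1": "b", "x2": "a"}   # x2 gets its first choice
print(more_popular(M, M2, weights, r))  # True: x1's weight 2 outvotes x2's 1
```

With unit weights the two matchings would tie; the weight w(x1) = 2 is what tips the vote, which is exactly the distinction between popular and weighted popular matchings.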
A dichotomy for minimum cost graph homomorphisms
 European J. Combin
, 2007
Abstract

Cited by 19 (5 self)
For graphs G and H, a mapping f: V(G) → V(H) is a homomorphism of G to H if uv ∈ E(G) implies f(u)f(v) ∈ E(H). If, moreover, each vertex u ∈ V(G) is associated with costs ci(u), i ∈ V(H), then the cost of the homomorphism f is ∑_{u ∈ V(G)} c_{f(u)}(u). For each fixed graph H, we have the minimum cost homomorphism problem, written as MinHOM(H). The problem is to decide, for an input graph G with costs ci(u), u ∈ V(G), i ∈ V(H), whether there exists a homomorphism of G to H and, if one exists, to find one of minimum cost. Minimum cost homomorphism problems encompass (or are related to) many well-studied optimization problems. We prove a dichotomy of the minimum cost homomorphism problems for graphs H, with loops allowed. When each connected component of H is either a reflexive proper interval graph or an irreflexive proper interval bigraph, the problem MinHOM(H) is polynomial time solvable. In all other cases the problem MinHOM(H) is NP-hard. This solves an open problem from an earlier paper.
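The MinHOM(H) definition above can be checked directly on tiny instances by brute force over all |V(H)|^|V(G)| maps. This sketch is only a definition-checker, not one of the paper's polynomial algorithms; the two-vertex graphs and cost tables are made-up inputs.

```python
from itertools import product

# Brute-force MinHOM(H) solver for tiny instances, following the
# definition in the abstract: find a homomorphism f of G to H
# minimizing sum over u in V(G) of costs[u][f(u)].

def min_cost_hom(VG, EG, VH, EH, costs):
    """VG/VH: vertex lists; EG: edge list of G; EH: edge set of H
    (undirected, loops allowed); costs[u][i] is c_i(u)."""
    best = None
    for assignment in product(VH, repeat=len(VG)):
        f = dict(zip(VG, assignment))
        # f is a homomorphism iff every edge of G maps to an edge of H
        if all((f[u], f[v]) in EH or (f[v], f[u]) in EH for u, v in EG):
            cost = sum(costs[u][f[u]] for u in VG)
            if best is None or cost < best:
                best = cost
    return best  # None if no homomorphism of G to H exists

# G = a single edge {x, y}; H = edge {0, 1} with a loop at vertex 1.
VG, EG = ["x", "y"], [("x", "y")]
VH, EH = [0, 1], {(0, 1), (1, 1)}
costs = {"x": {0: 1, 1: 5}, "y": {0: 2, 1: 1}}
print(min_cost_hom(VG, EG, VH, EH, costs))  # 2: map x -> 0, y -> 1
```

The loop at vertex 1 of H matters: it also allows mapping both endpoints of the edge to vertex 1, which is exactly the "loops allowed" setting the dichotomy addresses.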