A New Model for Packet Scheduling in Multihop Wireless Networks
, 2000
Abstract

Cited by 123 (8 self)
The goal of packet scheduling disciplines is to achieve fair and maximum allocation of channel bandwidth. However, these two criteria can potentially be in conflict in a generic-topology multihop wireless network, where a single logical channel is shared among multiple contending flows and spatial reuse of the channel bandwidth is possible. In this paper, we propose a new model for packet scheduling that addresses this conflict. The main results of this paper are the following: (a) a two-tier service model that provides a minimum "fair" allocation of the channel bandwidth for each packet flow and additionally maximizes spatial reuse of bandwidth, (b) an ideal centralized packet scheduling algorithm that realizes the above service model, and (c) a practical distributed backoff-based channel contention mechanism that approximates the ideal service within the framework of the CSMA/CA protocol.
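The "minimum fair allocation" tier of the service model can be illustrated by classical max-min fair sharing. This sketch assumes a single fully shared channel with no spatial reuse, which is a simplification; it is not the paper's two-tier algorithm, and the capacity and demand figures are made up:

```python
# Illustrative max-min fair allocation of a shared channel's bandwidth among
# contending flows. NOT the paper's algorithm -- it only shows the "minimum
# fair allocation" notion that the two-tier service model builds on.

def max_min_fair(capacity, demands):
    """Return a max-min fair allocation for flows with the given demands."""
    alloc = [0.0] * len(demands)
    active = list(range(len(demands)))
    remaining = float(capacity)
    while active:
        share = remaining / len(active)
        # Flows demanding less than the equal share are fully satisfied.
        satisfied = [i for i in active if demands[i] <= share]
        if not satisfied:
            for i in active:
                alloc[i] = share
            return alloc
        for i in satisfied:
            alloc[i] = demands[i]
            remaining -= demands[i]
        active = [i for i in active if demands[i] > share]
    return alloc

print(max_min_fair(10, [2, 8, 8]))  # -> [2.0, 4.0, 4.0]
```

The small flow keeps its full demand; the leftover bandwidth is split equally among the two larger flows.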
Improved Approximation Algorithms for the Vertex Cover Problem in Graphs and Hypergraphs
, 1999
Abstract

Cited by 95 (6 self)
We obtain improved algorithms for finding small vertex covers in bounded-degree graphs and hypergraphs. We use semidefinite programming to relax the problems, and introduce new rounding techniques for these relaxations. On graphs with maximum degree at most Δ, the algorithm achieves a performance ratio of 2 − (1 − o(1)) · (2 ln ln Δ)/(ln Δ) for large Δ, which improves the previously known ratio of 2 − (log Δ + O(1))/Δ obtained by Halldórsson and Radhakrishnan. Using similar techniques, we also present improved approximations for the vertex cover problem in hypergraphs. For k-uniform hypergraphs with n vertices, we achieve a ratio of k − (1 − o(1)) · (k ln ln n)/(ln n) for large n, and for k-uniform hypergraphs with maximum degree at most Δ, the algorithm achieves a ratio of k − (1 − o(1)) · (k(k − 1) ln ln Δ)/(ln Δ) for large Δ. These results considerably improve the previous best ratios of k(1 − c/Δ^(1/(k−1))) for bounded-degree k-uniform hypergraphs, and k(1 − c/n^((k−1)/k)) for general k-uniform hypergraphs, both obtained by Krivelevich. Using similar techniques, we also obtain an approximation algorithm for the weighted independent set problem, matching a recent result of Halldórsson.
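For context, the baseline that these SDP-based ratios improve on is the textbook factor-2 guarantee. A minimal sketch of the classical maximal-matching 2-approximation for vertex cover (not the paper's SDP rounding; the example graph is made up):

```python
# Classical 2-approximation for vertex cover: greedily build a maximal
# matching and take both endpoints of every matched edge. Every edge is
# covered, and any cover must contain at least one endpoint per matched
# edge, so the result is at most twice optimal.

def vertex_cover_2approx(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))   # (u, v) joins the matching
    return cover

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
cover = vertex_cover_2approx(edges)
assert all(u in cover or v in cover for u, v in edges)  # every edge covered
print(sorted(cover))  # -> [0, 1, 2, 3]
```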
Finding frequent patterns in a large sparse graph
 SIAM Data Mining Conference
, 2004
Abstract

Cited by 75 (4 self)
This paper presents two algorithms, based on the horizontal and vertical pattern discovery paradigms, that find the connected subgraphs that have a sufficient number of edge-disjoint embeddings in a single large undirected labeled sparse graph. These algorithms use three different methods to determine the number of edge-disjoint embeddings of a subgraph, based on approximate and exact maximum independent set computations, and use it to prune infrequent subgraphs. Experimental evaluation on real datasets from various domains shows that both algorithms achieve good performance, scale well to sparse input graphs with more than 100,000 vertices, and significantly outperform a previously developed algorithm.
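The link between edge-disjoint embedding counts and maximum independent set can be sketched as follows: embeddings that share an edge conflict, so a set of pairwise edge-disjoint embeddings is an independent set in the conflict graph. This is a hedged illustration with made-up embeddings, using a simple greedy heuristic rather than the paper's exact or approximate MIS procedures:

```python
# Greedy lower bound on the number of edge-disjoint embeddings of a pattern:
# scan embeddings (each a set of host-graph edges) and keep one whenever it
# shares no edge with those already kept. Equivalent to greedy independent
# set on the embedding-overlap (conflict) graph.

def edge_disjoint_count(embeddings):
    chosen = 0
    used_edges = set()
    for emb in sorted(embeddings, key=len):  # heuristic: smaller first
        if used_edges.isdisjoint(emb):
            chosen += 1
            used_edges |= emb
    return chosen

embeddings = [
    frozenset({(1, 2), (2, 3)}),
    frozenset({(2, 3), (3, 4)}),   # conflicts with the first on edge (2, 3)
    frozenset({(5, 6), (6, 7)}),
]
print(edge_disjoint_count(embeddings))  # -> 2
```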
Derandomized graph products
 COMPUTATIONAL COMPLEXITY
, 1995
Abstract

Cited by 74 (12 self)
Berman and Schnitger [10] gave a randomized reduction from approximating MAX SNP problems [24] within constant factors arbitrarily close to 1 to approximating clique within a factor of n^ε (for some ε). This reduction was further studied by Blum [11], who gave it the name randomized graph products. We show that this reduction can be made deterministic (derandomized), using random walks on expander graphs [1]. The main technical contribution of this paper is in lower-bounding the probability that all steps of a random walk stay within a specified set of vertices of a graph. (Previous work was mainly concerned with upper-bounding this probability.) This lower bound also extends to the case in which different sets of vertices are specified for different time steps of the walk.
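The quantity being bounded, the probability that every step of a random walk stays inside a vertex set S, can be explored empirically. A toy simulation, not the paper's analysis: the graph here is a simple cycle chosen for brevity, whereas the paper's bounds concern expander graphs, and all parameters are made up:

```python
# Empirically estimate Pr[all steps of a t-step random walk stay in S]
# on the n-cycle, starting from a uniformly random vertex.

import random

def stay_probability(n, S, steps, trials=20000, seed=0):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        v = rng.randrange(n)                     # uniform start vertex
        ok = v in S
        for _ in range(steps):
            if not ok:
                break
            v = (v + rng.choice((-1, 1))) % n    # one step on the cycle
            ok = v in S
        hits += ok
    return hits / trials

S = set(range(8))                                # half the vertices
print(stay_probability(16, S, steps=3))
```

Since only half the vertices lie in S, the estimate is below 1/2 and shrinks as the number of steps grows; the paper proves lower bounds on this kind of probability for expanders.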
Approximations of Weighted Independent Set and Hereditary Subset Problems
 JOURNAL OF GRAPH ALGORITHMS AND APPLICATIONS
, 2000
Abstract

Cited by 53 (6 self)
The focus of this study is to clarify the approximability of weighted versions of the maximum independent set problem. In particular, we report improved performance ratios in bounded-degree graphs, inductive graphs, and general graphs, as well as for the unweighted problem in sparse graphs. Where possible, the techniques are applied to related hereditary subgraph and subset problems, obtaining ratios better than previously reported for, e.g., Weighted Set Packing, Longest Common Subsequence, and Independent Set in hypergraphs.
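As a baseline for the weighted problem, a classical greedy rule already gives a nontrivial guarantee: repeatedly take the vertex maximizing weight/(degree + 1) and delete its closed neighborhood, which yields total weight at least the sum over vertices of w(v)/(d(v) + 1). This is a hedged sketch of that folklore bound, not the paper's improved algorithms; the graph and weights are made up:

```python
# Greedy weighted independent set: pick the vertex maximizing
# weight / (degree + 1), remove it and its neighbors, repeat.

def greedy_weighted_is(adj, w):
    """adj: dict vertex -> set of neighbors; w: dict vertex -> weight."""
    adj = {v: set(ns) for v, ns in adj.items()}   # local mutable copy
    independent = []
    while adj:
        v = max(adj, key=lambda u: w[u] / (len(adj[u]) + 1))
        independent.append(v)
        removed = adj.pop(v) | {v}                # v and its neighborhood
        for u in list(adj):
            if u in removed:
                del adj[u]
        for u in adj:
            adj[u] -= removed                     # drop stale edges
    return independent

# A weighted path 1 - 2 - 3 - 4 with heavy endpoints.
adj = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
w = {1: 2.0, 2: 1.0, 3: 1.0, 4: 2.0}
print(sorted(greedy_weighted_is(adj, w)))  # -> [1, 4]
```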
On Approximation Properties of the Independent Set Problem for Degree 3 Graphs
 In Proc. of Workshop on Algorithms and Data Structures
, 1995
Abstract

Cited by 39 (0 self)
The main problem we consider in this paper is the Independent Set problem for bounded-degree graphs. It is shown that the problem remains MAX SNP-complete when the maximum degree is bounded by 3. Some related problems are also shown to be MAX SNP-complete at the lowest possible degree bounds. Next we study better polynomial-time approximation of the problem for degree-3 graphs, and improve the previously best ratio, 5/4, to arbitrarily close to 6/5. This result also provides improved polynomial-time approximation ratios, (B + 3)/5 + ε, for odd degree B.

1 Introduction

The area of efficient approximation algorithms for NP-hard optimization problems has recently seen dramatic progress with a sequence of breakthrough achievements. Even when restricted only to the area of constant-bound approximation, the following remarkable results have been obtained in the last few years. The subclass of NP optimization problems, called MAX SNP, consisting solely of constant-ratio approximable problems ...
A Packet Scheduling Approach to QoS Support in Multihop Wireless Networks
 Mob. Netw. Appl
, 2002
Abstract

Cited by 37 (1 self)
Providing packet-level... In this paper, we propose a new scheduling model that addresses this conflict. The main results of this paper are the following: (a) a two-tier service model that provides a minimum "fair" allocation of the channel bandwidth for each packet flow and additionally maximizes spatial reuse of bandwidth, (b) an ideal centralized packet scheduling algorithm that realizes the above service model, and (c) a practical distributed backoff-based channel contention mechanism that approximates the ideal service within the framework of the CSMA/CA protocol.
Complexity and Approximation of Fixing Numerical Attributes in Databases Under Integrity Constraints
 In International Workshop on Database Programming Languages
, 2005
Abstract

Cited by 30 (12 self)
Consistent query answering is the problem of computing the answers from a database that are consistent with respect to certain integrity constraints that the database as a whole may fail to satisfy. Those answers are characterized as those that are invariant under minimal forms of restoring the consistency of the database. In this context, we study the problem of repairing databases by fixing integer numerical values at the attribute level with respect to denial and aggregate constraints. We introduce a quantitative definition of database fix, and investigate the complexity of several problems, such as DFP, i.e. the existence of fixes within a given distance from the original instance, and CQA, i.e. deciding consistency of answers to aggregate conjunctive queries under different semantics. We provide sharp complexity bounds, identify relevant tractable cases, and introduce approximation algorithms for some of those that are intractable. More specifically, we obtain results such as undecidability of the existence of fixes for aggregate constraints; MAXSNP-hardness of DFP, but a good approximation algorithm for a relevant special case; and intractability but good approximation for CQA for aggregate queries under denials with one database atom (plus built-ins).
Minimum Color Sum of Bipartite Graphs
 Journal of Algorithms
, 1999
Abstract

Cited by 29 (11 self)
The problem of minimum color sum of a graph is to color the vertices of the graph such that the sum (average) of all assigned colors is minimum. Recently, in [BBH+96], it was shown that in general graphs this problem cannot be approximated within n^(1−ε), for any ε > 0, unless NP = ZPP. In the same paper, a 9/8-approximation algorithm was presented for bipartite graphs. The hardness question for this problem on bipartite graphs was left open. In this paper we show that the minimum color sum problem for bipartite graphs admits no polynomial approximation scheme, unless P = NP. The proof is by L-reducing the problem of finding the maximum independent set in a graph whose maximum degree is four to this problem. This result indicates clearly that the minimum color sum problem is much harder than the traditional coloring problem, which is trivially solvable in bipartite graphs. As for the approximation ratio, we make a further step towards finding the precise thr...
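The objective can be made concrete on the star K_{1,n}, a standard small example: both orientations of the proper 2-coloring use the same two colors, yet their color sums differ sharply. This sketch only illustrates the objective, not the paper's hardness proof:

```python
# Minimum color sum objective on the star K_{1,n}: center vertex 0,
# leaves 1..n. Both colorings below are proper, but assigning the cheap
# color 1 to the many leaves gives a much smaller sum.

def color_sum(coloring):
    return sum(coloring.values())

n = 10
leaves = range(1, n + 1)

a = {0: 1, **{v: 2 for v in leaves}}   # center 1, leaves 2: sum n*2 + 1
b = {0: 2, **{v: 1 for v in leaves}}   # center 2, leaves 1: sum n + 2

print(color_sum(a), color_sum(b))  # -> 21 12
```

This also hints at why minimizing the sum differs from ordinary coloring: the number of colors is identical in both cases, but the assignment matters.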
Filtering algorithms for the NValue constraint
 In Proceedings CPAIOR’05
, 2005
Abstract

Cited by 27 (10 self)
The constraint NValue counts the number of different values assigned to a vector of variables. Propagating generalized arc consistency on this constraint is NP-hard. We show that computing even the lower bound on the number of values is NP-hard. We therefore study different approximation heuristics for this problem. We introduce three new methods for computing a lower bound on the number of values. The first two are based on the maximum independent set problem and are incomparable to a previous approach based on intervals. The last method is a linear relaxation of the problem. This gives a tighter lower bound than all other methods, but at a greater asymptotic cost.

1 Introduction

The NValue constraint counts the number of distinct values used by a vector of variables. It is a generalization of the widely used AllDifferent constraint [12]. It was introduced in [4] to model a musical playlist configuration problem so ...
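The interval-based style of lower bound mentioned above can be sketched as follows: variables whose interval domains are pairwise disjoint must take pairwise distinct values, so a maximum set of pairwise-disjoint domains lower-bounds NValue, and for intervals that set can be found greedily by earliest right endpoint. This is a hedged illustration with made-up domains, not the paper's MIS- or LP-based methods (which are tighter in general):

```python
# Lower bound on NValue for interval domains: greedily select a maximum
# family of pairwise-disjoint intervals (classic earliest-deadline greedy).
# Each selected variable is forced to use a value no other selected
# variable can use, so the count is a valid lower bound.

def nvalue_lower_bound(domains):
    """domains: list of (lo, hi) inclusive interval domains."""
    count = 0
    last_hi = float("-inf")
    for lo, hi in sorted(domains, key=lambda d: d[1]):
        if lo > last_hi:          # disjoint from every chosen interval
            count += 1
            last_hi = hi
    return count

domains = [(1, 3), (2, 4), (5, 6), (6, 8), (9, 9)]
print(nvalue_lower_bound(domains))  # -> 3
```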