Results 1–10 of 10
Maximum Agreement and Compatible Supertrees
 IN PROCEEDINGS OF CPM
, 2004
Abstract

Cited by 18 (9 self)
Given a collection of trees with identical leaf sets, the MAST, resp. MCT, problem consists in finding a largest subset of the leaves such that all input trees restricted to this set are identical, resp. have a common refinement. For MAST, resp. MCT, on rooted trees, we give an [...]-time algorithm, where [...] is the smallest number of leaves whose removal leads to the existence of an agreement subtree, resp. a compatible tree. This improves on [13] for MAST and proves fixed-parameter tractability for MCT. We then extend these problems to the case of supertrees, where input trees can have non-identical leaf sets. For the obtained problems, SMAST and SMCT, we give an [...]-time algorithm for the special case of two input trees (where [...] is the time bound for solving MAST, resp. MCT, on two leaf trees). Finally, we show that SMAST and SMCT, parametrized as above, are [...]-hard and cannot be approximated in polynomial time within a constant factor unless P = NP, even when the input trees are rooted triples. We also extend the above results to the case of unrooted input trees.
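The restriction operation at the heart of MAST can be sketched directly (an illustrative toy, not the paper's algorithm; the nested-tuple tree encoding and function names are assumptions):

```python
def restrict(tree, leaves):
    """Restrict a rooted tree to a subset of its leaves.

    Trees are encoded as nested tuples; a leaf is a string.
    Internal nodes left with a single surviving child are contracted.
    """
    if isinstance(tree, str):                     # leaf node
        return tree if tree in leaves else None
    kept = [c for c in (restrict(ch, leaves) for ch in tree) if c is not None]
    if not kept:
        return None
    if len(kept) == 1:                            # contract unary node
        return kept[0]
    return tuple(sorted(kept, key=str))           # canonical child order

def agree(trees, leaves):
    """True if all input trees restricted to `leaves` are identical."""
    restricted = [restrict(t, leaves) for t in trees]
    return all(r == restricted[0] for r in restricted)

t1 = (("a", "b"), ("c", "d"))
t2 = (("a", "c"), ("b", "d"))
print(agree([t1, t2], {"a", "b", "d"}))  # False: the restrictions differ
print(agree([t1, t2], {"a", "d"}))       # True
```

MAST then asks for a largest leaf set on which `agree` holds; the parameterized algorithms above bound the search by the number of leaves that must be removed.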
Fast Fixed-Parameter Tractable Algorithms for Nontrivial Generalizations of Vertex Cover
, 2003
Abstract

Cited by 12 (0 self)
Our goal in this paper is the development of fast algorithms for recognizing general classes of graphs. We seek algorithms whose complexity can be expressed as a linear function of the graph size plus an exponential function of k, a natural parameter describing the class. In particular, we consider the class W_k(𝒢), where for each graph G in W_k(𝒢), the removal of a set of at most k vertices from G results in a graph in the base graph class 𝒢. (If 𝒢 is the class of edgeless graphs,...
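For the base case where the class consists of edgeless graphs, membership in W_k reduces to deciding whether the graph has a vertex cover of size at most k, and the classic 2^k bounded-search-tree test can be sketched as follows (a minimal illustration of the general technique, not the paper's algorithm):

```python
def within_k_of_edgeless(edges, k):
    """Decide whether deleting at most k vertices leaves an edgeless graph,
    i.e. whether the graph has a vertex cover of size <= k.

    Classic 2^k branching: any remaining edge forces one of its two
    endpoints into the deletion set.
    """
    if not edges:          # already edgeless
        return True
    if k == 0:             # edges remain but no deletions left
        return False
    u, v = edges[0]        # branch on an arbitrary uncovered edge
    return (within_k_of_edgeless([e for e in edges if u not in e], k - 1)
            or within_k_of_edgeless([e for e in edges if v not in e], k - 1))

triangle = [(1, 2), (2, 3), (1, 3)]
print(within_k_of_edgeless(triangle, 1))  # False: a triangle needs 2 deletions
print(within_k_of_edgeless(triangle, 2))  # True
```

The running time is O(2^k · m), matching the "linear in graph size plus exponential in k" shape the abstract describes; the paper's contribution is faster algorithms of this form for richer base classes.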
LTL over integer periodicity constraints
 Proceedings of the 7th International Conference on Foundations of Software Science and Computation Structures (FOSSACS), volume 2987 of LNCS
, 2004
Abstract

Cited by 10 (4 self)
Abstract. Periodicity constraints are used in many logical formalisms, in fragments of Presburger LTL, in calendar logics, and in logics for access control, to quote a few examples. In this paper, we introduce the logic PLTL^mod, an extension of Linear-Time Temporal Logic (LTL) with past-time operators whose atomic formulae are defined from a first-order constraint language dealing with periodicity. Although the underlying constraint language is a fragment of Presburger arithmetic shown to admit a PSPACE-complete satisfiability problem, we establish that the PLTL^mod model-checking and satisfiability problems remain in PSPACE, as for plain LTL (full Presburger LTL is known to be highly undecidable). This is particularly interesting for dealing with periodicity constraints, since the language of PLTL^mod is more concise than existing languages, and the temporalization of our first-order language of periodicity constraints has the same worst-case complexity as the underlying constraint language. Finally, we show examples of introducing quantification into the logical language that yield EXPSPACE-complete problems for PLTL^mod. As another application, we establish that the equivalence problem for extended single-string automata, known to express the equality of time granularities, is PSPACE-complete, by designing a reduction from QBF and by using our results for PLTL^mod. Keywords: Presburger LTL, periodicity constraints, computational complexity, Büchi automaton, QBF.
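The flavor of atomic periodicity constraint involved, e.g. x ≡ c (mod k), can be illustrated with a direct check (a toy encoding for intuition only, not the paper's constraint language):

```python
def holds(x, c, k):
    """Check the atomic periodicity constraint x ≡ c (mod k)."""
    return x % k == c % k

# e.g. a calendar-style use: "every Monday" over day numbers,
# under the assumption that day 0 is a Monday:
mondays = [d for d in range(15) if holds(d, 0, 7)]
print(mondays)  # [0, 7, 14]
```

PLTL^mod embeds such constraints as atomic formulae under temporal operators, which is what makes the PSPACE upper bound nontrivial.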
Improved Parameterized Complexity of the Maximum Agreement Subtree and . . .
 IEEE/ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS
, 2006
Abstract

Cited by 6 (4 self)
Given a set of evolutionary trees on the same set of taxa, the maximum agreement subtree problem (MAST), respectively maximum compatible tree problem (MCT), consists of finding a largest subset of taxa such that all input trees restricted to these taxa are isomorphic, respectively compatible. These problems
Confronting hardness using a hybrid approach
 in SODA, 2006
Abstract

Cited by 5 (0 self)
A hybrid algorithm is a collection of heuristics, paired with a polynomial-time selector S that runs on the input to decide which heuristic should be executed to solve the problem. Hybrid algorithms are interesting in scenarios where the selector must decide between heuristics that are "good" with respect to different complexity measures. In this paper, we focus on hybrid algorithms with a "hardness-defying" property: for a problem Π, there is a set of complexity measures {m_i} whereby Π is known or conjectured to be hard (or unsolvable) for each m_i, but for each heuristic h_i of the hybrid algorithm, one can give a complexity guarantee for h_i on the instances of Π that S selects for h_i that is strictly better than m_i. For example, we show that for NP-hard problems such as Max-Ek-Lin-p, Longest Path, and Minimum Bandwidth, a given instance can either be solved exactly in "subexponential" (2^o(n)) time, or be approximated in polynomial time with an approximation ratio exceeding that of the known or conjectured inapproximability of the problem, assuming P ≠ NP. We also prove some inherent limitations to the design of hybrid algorithms that arise under the assumption that NP
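The selector/heuristic structure can be sketched generically; the toy below uses Vertex Cover with an exact heuristic and the classic maximal-matching 2-approximation (the dispatch rule and cutoff are illustrative placeholders, not the paper's constructions):

```python
from itertools import combinations

def exact_vc(edges, vertices):
    """h1: exact minimum vertex cover by brute force (exponential time)."""
    for size in range(len(vertices) + 1):
        for cand in combinations(vertices, size):
            s = set(cand)
            if all(u in s or v in s for u, v in edges):
                return s
    return set(vertices)

def approx_vc(edges):
    """h2: maximal-matching 2-approximation (polynomial time)."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover |= {u, v}
    return cover

def hybrid_vc(edges, vertices, cutoff=20):
    """Selector S: run the exact heuristic on small instances,
    the approximation otherwise."""
    if len(vertices) <= cutoff:
        return exact_vc(edges, vertices)
    return approx_vc(edges)

edges = [(1, 2), (2, 3), (3, 4)]
print(hybrid_vc(edges, [1, 2, 3, 4]))  # a minimum cover of size 2
```

The point of the paper is stronger: its selectors guarantee that each heuristic beats a known (or conjectured) hardness bound on exactly the instances routed to it.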
Evolutionary computation: Challenges and duties
 FRONTIERS OF EVOLUTIONARY COMPUTATION
, 2004
Abstract

Cited by 2 (1 self)
Evolutionary Computation (EC) is now a few decades old. The impressive development of the field since its initial conception has made it one of the most vigorous research areas, specifically from an applied viewpoint. This should not hide the existence of some major gaps in our understanding of these techniques. In this essay we propose a number of challenging tasks that, in our opinion, should be attacked in order to fill some of these gaps. They mainly refer to the theoretical basis of the paradigm; we believe that an effective cross-fertilization among different areas of Theoretical Computer Science and Artificial Intelligence (such as Parameterized Complexity and Modal Logic) is mandatory for developing a new corpus of knowledge about EC.
Rotation Distance is Fixed-Parameter Tractable
, 2009
Abstract
Rotation distance between trees measures the number of simple operations it takes to transform one tree into another. There are no known polynomial-time algorithms for computing rotation distance. In the case of ordered rooted trees, we show that the rotation distance between two ordered trees is fixed-parameter tractable in the parameter k, the rotation distance. The proof relies on the kernelization of the initial trees to trees with size bounded by 7k.
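The "simple operation" in question, a single tree rotation at the root, can be sketched on ordered binary trees encoded as nested tuples (an illustrative encoding, not the paper's kernelization):

```python
def rotate_right(tree):
    """Right rotation at the root: ((A, B), C) -> (A, (B, C)).

    Trees are nested 2-tuples; leaves are strings.  Returns None when
    the rotation does not apply (the left child is a leaf).
    """
    left, c = tree
    if isinstance(left, str):
        return None
    a, b = left
    return (a, (b, c))

def rotate_left(tree):
    """Left rotation at the root: (A, (B, C)) -> ((A, B), C)."""
    a, right = tree
    if isinstance(right, str):
        return None
    b, c = right
    return ((a, b), c)

t = (("x", "y"), "z")
t2 = rotate_right(t)         # ("x", ("y", "z"))
assert rotate_left(t2) == t  # the two rotations are inverse
```

Rotation distance is the length of a shortest sequence of such moves (applied at any node, not just the root) transforming one tree into the other; the FPT result bounds the search by kernelizing both trees to size O(k).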
Solving Minimum Vertex Cover: A Fast Fixed-Parameter-Tractable Algorithm for Profit Cover
, 2002
Abstract
We introduce the problem Profit Cover, which is an adaptation of the graph problem Vertex Cover. Profit Cover finds application in the psychology of decision-making. A common assumption is that net value is a major determinant of human choice; Profit Cover incorporates the notion of net value in its definition. For a given graph G = (V, E) and an integer p > 0, the goal is to determine a profit cover PC ⊆ V such that the profit, |E'| − |PC|, is at least p, where E' ⊆ E with (u, v) ∈ E' if (1) (u, v) ∈ E and (2) u ∈ PC or v ∈ PC. We show how to use the optimization version of Profit Cover to solve Minimum Vertex Cover. In addition to the classical complexity of this profit problem, we consider the complexity of its natural parameterization on the profit p, p-Profit Cover. We show that Profit Cover is NP-complete and present a fixed-parameter-tractable algorithm for p-Profit Cover with a time complexity of O(p|V| + 1.236506^p). The algorithm is a combination of kernelizing the graph to a size linear in p, a bounded-search-tree algorithm, and re-kernelization. The same techniques also yield a fixed-parameter-tractable algorithm of the same time complexity for the more general problem p-Edge-Weighted Profit Cover, where each edge e ∈ E is associated with a weight, a positive integer w(e) > 0, and the profit is Σ_{e ∈ E'} w(e) − |PC|.
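The profit of a candidate cover, |E'| − |PC| with E' the edges covered by PC, follows directly from the definition above (a minimal sketch with an assumed edge-list encoding, not the paper's FPT algorithm):

```python
def profit(edges, pc):
    """Profit of a candidate cover PC: |E'| - |PC|, where E' is the
    set of edges with at least one endpoint in PC."""
    covered = [(u, v) for u, v in edges if u in pc or v in pc]
    return len(covered) - len(pc)

def weighted_profit(edges, weights, pc):
    """Edge-weighted variant: sum of w(e) over covered edges, minus |PC|."""
    return sum(weights[e] for e in edges
               if e[0] in pc or e[1] in pc) - len(pc)

# On a 4-cycle, picking two opposite vertices covers all 4 edges:
cycle = [(1, 2), (2, 3), (3, 4), (4, 1)]
print(profit(cycle, {2, 4}))  # 4 covered edges minus 2 vertices = 2
```

p-Profit Cover then asks whether some PC achieves profit at least p; the paper's algorithm answers this in O(p|V| + 1.236506^p) time via kernelization and bounded search.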
Detecting Traditional Packers, Decisively
Abstract
Abstract. Many important decidability results in malware analysis are based on Turing machine models of computation. We exhibit computational models that use more realistic assumptions about machine and attacker resources. While seminal results such as [1–5] remain true for Turing machines, we show that under more realistic assumptions important tasks are decidable instead of undecidable. Specifically, we show that detecting traditional malware unpacking behavior – in which a payload is decompressed or decrypted and subsequently executed – is decidable under our assumptions. We then examine the issue of dealing with complex but decidable problems, and look for lessons from the hardware verification community, which has been striving to meet the challenge of intractable problems for the past three decades.
Published In Confronting Hardness Using a Hybrid Approach
, 2005
Abstract
A hybrid algorithm is a collection of heuristics, paired with a polynomial-time selector S that runs on the input to decide which heuristic should be executed to solve the problem. Hybrid algorithms are interesting in scenarios where the selector must decide between heuristics that are "good" with respect to different complexity measures. In this paper, we focus on hybrid algorithms with a "hardness-defying" property: for a problem Π, there is a set of complexity measures {m_i} whereby Π is known or conjectured to be hard (or unsolvable) for each m_i, but for each heuristic h_i of the hybrid algorithm, one can give a complexity guarantee for h_i on the instances of Π that S selects for h_i that is strictly better than m_i. For example, we show that for NP-hard problems such as Max-Ek-Lin-p, Longest Path, and Minimum Bandwidth, a given instance can either be solved exactly in "subexponential" (2^o(n)) time, or be approximated in polynomial time with an approximation ratio exceeding that of the known or conjectured inapproximability of the problem, assuming P ≠ NP. We also prove some inherent limitations to the design of hybrid algorithms that arise under the assumption that NP