Results 1–10 of 31
Haplotyping as Perfect Phylogeny: Conceptual Framework and Efficient Solutions (Extended Abstract)
, 2002
"... The next highpriority phase of human genomics will involve the development of a full Haplotype Map of the human genome [12]. It will be used in largescale screens of populations to associate specific haplotypes with specific complex geneticinfluenced diseases. A prototype Haplotype Mapping strat ..."
Abstract

Cited by 125 (10 self)
The next high-priority phase of human genomics will involve the development of a full Haplotype Map of the human genome [12]. It will be used in large-scale screens of populations to associate specific haplotypes with specific complex genetically influenced diseases. A prototype Haplotype Mapping strategy is presently being finalized by an NIH working group. The biological key to that strategy is the surprising fact that genomic DNA can be partitioned into long blocks where genetic recombination has been rare, leading to strikingly fewer distinct haplotypes in the population than previously expected [12, 6, 21, 7]. In this paper
Efficient reconstruction of haplotype structure via perfect phylogeny
 Journal of Bioinformatics and Computational Biology
, 2003
"... Each person’s genome contains two copies of each chromosome, one inherited from the father and the other from the mother. A person’s genotype specifies the pair of bases at each site, but does not specify which base occurs on which chromosome. The sequence of each chromosome separately is called a h ..."
Abstract

Cited by 74 (12 self)
Each person’s genome contains two copies of each chromosome, one inherited from the father and the other from the mother. A person’s genotype specifies the pair of bases at each site, but does not specify which base occurs on which chromosome. The sequence of each chromosome separately is called a haplotype. The determination of the haplotypes within a population is essential for understanding genetic variation and the inheritance of complex diseases. The haplotype mapping project, a successor to the human genome project, seeks to determine the common haplotypes in the human population. Since experimental determination of a person’s genotype is less expensive than determining its component haplotypes, algorithms are required for computing haplotypes from genotypes. Two observations aid in this process: first, the human genome contains short blocks within which only a few different haplotypes occur; second, as suggested by Gusfield, it is reasonable to assume that the haplotypes observed within a block have evolved according to a perfect phylogeny, in which at most one mutation event has occurred at any site, and no recombination occurred at the given region. We present a simple and efficient polynomial-time algorithm for inferring haplotypes from the genotypes of a set of individuals assuming a perfect phylogeny. Using a reduction to 2-SAT we extend this algorithm to handle constraints that apply when we have genotypes from both parents and child. We also present a hardness result for the problem of removing the minimum number of individuals from a population to ensure that the genotypes of the remaining individuals are consistent with a perfect phylogeny. Our algorithms have been tested on real data and give biologically meaningful results. Our web server
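The two ingredients of this problem can be illustrated concretely. A binary haplotype matrix admits a (rooted, all-zeros ancestor) perfect phylogeny exactly when no pair of sites exhibits all three gametes (0,1), (1,0), (1,1). The sketch below checks that condition and resolves genotypes (0 = homozygous 0, 1 = homozygous 1, 2 = heterozygous) by exhaustive search; it is an exponential-time illustration of the problem statement, not the paper's polynomial-time algorithm, and all function names are made up for this example.

```python
from itertools import product

THREE_GAMETES = {(0, 1), (1, 0), (1, 1)}

def has_perfect_phylogeny(haplotypes):
    """Three-gamete test: a binary matrix has a directed perfect
    phylogeny iff no column pair contains (0,1), (1,0) and (1,1)."""
    m = len(haplotypes[0])
    for i in range(m):
        for j in range(i + 1, m):
            gametes = {(row[i], row[j]) for row in haplotypes}
            if THREE_GAMETES <= gametes:
                return False
    return True

def expansions(genotype):
    """All haplotype pairs consistent with one genotype
    (0 = homozygous 0, 1 = homozygous 1, 2 = heterozygous)."""
    het = [i for i, g in enumerate(genotype) if g == 2]
    for bits in product((0, 1), repeat=len(het)):
        h1, h2 = list(genotype), list(genotype)
        for pos, b in zip(het, bits):
            h1[pos], h2[pos] = b, 1 - b
        yield tuple(h1), tuple(h2)

def resolve(genotypes):
    """Brute-force search for a joint resolution admitting a perfect
    phylogeny (exponential; for illustration only)."""
    choices = [list(expansions(g)) for g in genotypes]
    for combo in product(*choices):
        haps = [h for pair in combo for h in pair]
        if has_perfect_phylogeny(haps):
            return haps
    return None
```

For example, `resolve([(1, 1), (2, 0)])` finds a consistent set of four haplotypes, while the three haplotypes (0,1), (1,0), (1,1) together fail the test.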
Distance realization problems with applications to Internet tomography
"... In recent years, a variety of graph optimization problems have arisen in which the graphs involved are much too large for the usual algorithms to be effective. In these cases, even though we are not able to examine the entire graph (which may be changing dynamically), we would still like to deduce v ..."
Abstract

Cited by 21 (2 self)
In recent years, a variety of graph optimization problems have arisen in which the graphs involved are much too large for the usual algorithms to be effective. In these cases, even though we are not able to examine the entire graph (which may be changing dynamically), we would still like to deduce various properties of it, such as the size of a connected component, the set of neighbors of a subset of vertices, etc. In this paper, we study a class of problems, called distance realization problems, which arise in the study of Internet data traffic models. Suppose we are given a set S of terminal nodes, taken from some (unknown) weighted graph. A basic problem is to reconstruct a weighted graph G including S with possibly additional vertices, that realizes the given distance matrix for S. We will first show that this problem is not only difficult but the solution is often unstable in the sense that even if all distances between nodes in S decrease, the solution can increase by a factor proport...
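The smallest interesting instance of distance realization is instructive: any three terminals whose pairwise distances satisfy the triangle inequality can be realized by a star with one additional (Steiner) node. The sketch below computes the three edge weights; it is a standard textbook construction, not the paper's contribution, and the function name is invented for this illustration.

```python
def star_realization(d_ab, d_ac, d_bc):
    """Realize a 3-terminal distance matrix by a star with one extra
    Steiner node s: returns the edge weights (a-s, b-s, c-s)."""
    a = (d_ab + d_ac - d_bc) / 2  # weight of edge a-s
    b = (d_ab + d_bc - d_ac) / 2  # weight of edge b-s
    c = (d_ac + d_bc - d_ab) / 2  # weight of edge c-s
    if min(a, b, c) < 0:
        raise ValueError("distances violate the triangle inequality")
    return a, b, c
```

For the distances d(a,b)=3, d(a,c)=4, d(b,c)=5 this gives edges 1, 2, 3, and each pairwise distance is recovered as the sum of the two corresponding edges.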
A decomposition theory for binary linear codes
, 2008
"... The decomposition theory of matroids initiated by Paul Seymour in the 1980’s has had an enormous impact on research in matroid theory. This theory, when applied to matrices over the binary field, yields a powerful decomposition theory for binary linear codes. In this paper, we give an overview of ..."
Abstract

Cited by 17 (3 self)
The decomposition theory of matroids initiated by Paul Seymour in the 1980s has had an enormous impact on research in matroid theory. This theory, when applied to matrices over the binary field, yields a powerful decomposition theory for binary linear codes. In this paper, we give an overview of this code decomposition theory, and discuss some of its implications in the context of the recently discovered formulation of maximum-likelihood (ML) decoding of a binary linear code over a discrete memoryless channel as a linear programming problem. We translate matroid-theoretic results of Grötschel and Truemper from the combinatorial optimization literature to give examples of nontrivial families of codes for which the ML decoding problem can be solved in time polynomial in the length of the code. One such family is that consisting of codes C for which the codeword polytope is identical to the Koetter-Vontobel fundamental polytope derived from the entire dual code C ⊥. However, we also show that such families of codes are not good in a coding-theoretic sense: either their dimension or their minimum distance must grow sublinearly with code length.
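The objective that the LP formulation relaxes is easy to state directly: over a binary symmetric channel with crossover probability below 1/2, ML decoding is minimum-Hamming-distance decoding over the codeword polytope's vertices, i.e. the codewords. The sketch below does this by brute force for a tiny code given by a generator matrix; it only illustrates the decoding objective, not the polynomial-time LP methods discussed in the paper, and the function names are invented here.

```python
from itertools import product

def codewords(G):
    """All 2^k codewords of the binary linear code with k x n
    generator matrix G (rows are lists of 0/1)."""
    k = len(G)
    for msg in product((0, 1), repeat=k):
        yield tuple(sum(m * g for m, g in zip(msg, col)) % 2
                    for col in zip(*G))

def ml_decode(received, G):
    """Brute-force ML decoding over a BSC (p < 1/2): return the
    codeword at minimum Hamming distance from the received word."""
    return min(codewords(G),
               key=lambda c: sum(r != b for r, b in zip(received, c)))
```

For the [4,2] code generated by rows 1011 and 0101, the received word 1001 decodes to the codeword 1011 at Hamming distance 1.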
What is a matroid?
, 2007
"... Matroids were introduced by Whitney in 1935 to try to capture abstractly the essence of dependence. Whitney’s definition embraces a surprising diversity of combinatorial structures. Moreover, matroids arise naturally in combinatorial optimization since they are precisely the structures for which th ..."
Abstract

Cited by 16 (0 self)
Matroids were introduced by Whitney in 1935 to try to capture abstractly the essence of dependence. Whitney’s definition embraces a surprising diversity of combinatorial structures. Moreover, matroids arise naturally in combinatorial optimization since they are precisely the structures for which the greedy algorithm works. This survey paper introduces matroid theory, presents some of the main theorems in the subject, and identifies some of the major problems of current research interest.
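The "greedy algorithm works" characterization can be made concrete: scanning elements in decreasing weight and keeping each one that preserves independence yields a maximum-weight basis of any matroid. The sketch below pairs a generic greedy routine with the independence oracle of the graphic matroid (edge sets are independent iff they form a forest), recovering maximum-weight spanning forests; the function names are invented for this illustration.

```python
def greedy_max_weight_basis(elements, weight, independent):
    """Generic matroid greedy: scan elements in decreasing weight,
    keep each one that preserves independence."""
    basis = []
    for e in sorted(elements, key=weight, reverse=True):
        if independent(basis + [e]):
            basis.append(e)
    return basis

def forest_oracle(edges):
    """Independence oracle of the graphic matroid: an edge set is
    independent iff it is a forest (checked with union-find)."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:          # edge would close a cycle
            return False
        parent[ru] = rv
    return True
```

On a weighted triangle this selects the two heaviest edges and rejects the third, exactly as Kruskal's algorithm would.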
Matchings, Matroids and Unimodular Matrices
, 1995
"... We focus on combinatorial problems arising from symmetric and skewsymmetric matrices. For much of the thesis we consider properties concerning the principal submatrices. In particular, we are interested in the property that every nonsingular principal submatrix is unimodular; matrices having this p ..."
Abstract

Cited by 13 (1 self)
We focus on combinatorial problems arising from symmetric and skew-symmetric matrices. For much of the thesis we consider properties concerning the principal submatrices. In particular, we are interested in the property that every nonsingular principal submatrix is unimodular; matrices having this property are called principally unimodular. Principal unimodularity is a generalization of total unimodularity, and we generalize key polyhedral and matroidal results on total unimodularity. Highlights include a generalization of Hoffman and Kruskal's result on integral polyhedra, a generalization of Tutte's results on regular matroids, and partial results toward a decomposition theorem. Quite separate from the study of principal unimodularity we consider a particular skew-symmetric matrix of indeterminates associated with a graph. This matrix, called the Tutte matrix, was introduced by Tutte to study matchings. By considering the rank of an arbitrary submatrix of the Tutte matrix we disco...
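The connection between the Tutte matrix and matchings can be demonstrated with the classical randomized test (due to Lovász, not to this thesis): substitute random field elements for the indeterminates; with high probability the resulting matrix is nonsingular iff the graph has a perfect matching. A minimal sketch, with invented function names and a fixed Mersenne prime as the field size:

```python
import random

P = 2_147_483_647  # prime modulus for the random substitution

def tutte_matrix(n, edges, rng=random):
    """Tutte matrix of a graph on vertices 0..n-1, with each
    indeterminate x_ij replaced by a random value mod P."""
    T = [[0] * n for _ in range(n)]
    for i, j in edges:
        x = rng.randrange(1, P)
        T[i][j], T[j][i] = x, (-x) % P  # skew-symmetric
    return T

def rank_mod_p(M, p=P):
    """Rank of a matrix over GF(p) via Gauss-Jordan elimination."""
    M = [row[:] for row in M]
    n, rank = len(M), 0
    for col in range(len(M[0])):
        pivot = next((r for r in range(rank, n) if M[r][col]), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        inv = pow(M[rank][col], p - 2, p)  # modular inverse
        for r in range(n):
            if r != rank and M[r][col]:
                f = (M[r][col] * inv) % p
                M[r] = [(a - f * b) % p for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

def has_perfect_matching(n, edges):
    """Randomized test: full rank with high probability iff a
    perfect matching exists (one-sided error)."""
    return rank_mod_p(tutte_matrix(n, edges)) == n
```

A single edge on two vertices passes the test; a path on three vertices fails deterministically, since any odd-order skew-symmetric matrix is singular.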
Exact algorithms and applications for Treelike Weighted Set Cover
 JOURNAL OF DISCRETE ALGORITHMS
, 2006
"... We introduce an NPcomplete special case of the Weighted Set Cover problem and show its fixedparameter tractability with respect to the maximum subset size, a parameter that appears to be small in relevant applications. More precisely, in this practically relevant variant we require that the given ..."
Abstract

Cited by 10 (5 self)
We introduce an NP-complete special case of the Weighted Set Cover problem and show its fixed-parameter tractability with respect to the maximum subset size, a parameter that appears to be small in relevant applications. More precisely, in this practically relevant variant we require that the given collection C of subsets of some base set S should be “tree-like.” That is, the subsets in C can be organized in a tree T such that every subset corresponds one-to-one to a tree node and, for each element s of S, the nodes corresponding to the subsets containing s induce a subtree of T. This is equivalent to the problem of finding a minimum edge cover in an edge-weighted acyclic hypergraph. Our main result is an algorithm running in O(3^k · mn) time where k denotes the maximum subset size, n := |S|, and m := |C|. The algorithm also implies a fixed-parameter tractability result for the NP-complete Multicut in Trees problem, complementing previous approximation results. Our results find applications in computational biology in phylogenomics and for saving memory in tree decomposition based graph algorithms.
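The underlying optimization problem is easy to state by exhaustive search, which makes the instance structure concrete even though it ignores the tree-likeness that the paper's O(3^k · mn) algorithm exploits. A brute-force sketch, with invented names, purely for illustration:

```python
from itertools import combinations

def min_weight_set_cover(base, subsets, weight):
    """Exhaustive weighted set cover: try every sub-collection and
    keep the cheapest one covering the whole base set.
    Exponential time -- illustration only."""
    best, best_w = None, float("inf")
    for r in range(1, len(subsets) + 1):
        for combo in combinations(range(len(subsets)), r):
            if set().union(*(subsets[i] for i in combo)) >= base:
                w = sum(weight[i] for i in combo)
                if w < best_w:
                    best, best_w = combo, w
    return best, best_w
```

For the tree-like collection {1,2}, {2,3}, {3,4} over base set {1,2,3,4} with unit weights, the optimum picks the first and last subsets at total weight 2.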
INTEGER PROGRAMMING MODELS FOR GROUND HOLDING IN AIR TRAFFIC FLOW MANAGEMENT
"... In this dissertation, integer programming models are applied to combinatorial problems in air traffic flow management. For the two problems studied, models are developed and analyzed both theoretically and computationally. This dissertation makes contributions to integer programming while providing ..."
Abstract

Cited by 9 (0 self)
In this dissertation, integer programming models are applied to combinatorial problems in air traffic flow management. For the two problems studied, models are developed and analyzed both theoretically and computationally. This dissertation makes contributions to integer programming while providing efficient tools for solving air traffic flow management problems. Currently, a constrained arrival capacity situation at an airport in the United States is alleviated by holding inbound aircraft at their departure gates. The ground holding problem (GH) decides which aircraft to hold on the ground and for how long. This dissertation examines the GH from two perspectives. First, the hubbing operations of the airlines are considered by adding side constraints to GH. These constraints enforce the desire of the airlines to temporally group banks of flights. Five basic models and several variations of the ground holding problem with banking constraints (GHB) are presented. A particularly strong, facet-inducing model of the banking constraints is presented which allows one to
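A toy version of the ground holding decision clarifies what the integer programs optimize: delay each flight from its scheduled arrival period so that no period exceeds the airport's arrival capacity, minimizing total ground delay. The sketch below solves this by exhaustive search over a tiny horizon; the dissertation's models solve it as an integer program, and the function name and uniform delay cost are simplifying assumptions made here.

```python
from itertools import product

def ground_holding(flights, capacity, horizon):
    """Toy ground-holding problem: flights[i] is the scheduled arrival
    period of flight i; choose actual arrivals >= schedule so that no
    period exceeds `capacity`, minimizing total delay. Exhaustive
    search -- illustration only."""
    best, best_cost = None, float("inf")
    options = [range(sched, horizon) for sched in flights]
    for arrival in product(*options):
        if all(arrival.count(t) <= capacity for t in range(horizon)):
            cost = sum(a - s for a, s in zip(arrival, flights))
            if cost < best_cost:
                best, best_cost = arrival, cost
    return best, best_cost
```

With three flights all scheduled for period 0 and a capacity of two arrivals per period, the optimum holds exactly one flight for one period.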
Multicommodity Flows and Approximation Algorithms
, 1994
"... This thesis is about multicommodity flows and their use in designing approximation algorithms for problems involving cuts in graphs. In a groundbreaking work Leighton and Rao [34] showed an approximate maxflow mincut theorem for uniform multicommodity flow and used this to obtain an approximation ..."
Abstract

Cited by 4 (0 self)
This thesis is about multicommodity flows and their use in designing approximation algorithms for problems involving cuts in graphs. In a groundbreaking work Leighton and Rao [34] showed an approximate max-flow min-cut theorem for uniform multicommodity flow and used this to obtain an approximation algorithm for the flux of a graph. We consider the multicommodity flow problem in which the object is to maximize the sum of the flows routed and prove the following approximate max-flow min-multicut theorem: min-multicut / O(log k) ≤ max-flow ≤ min-multicut, where k is the number of commodities. Our proof is based on a rounding technique from [34]. Further, we show that this theorem is tight. For a multicommodity flow instance with specified demands, the ratio of the maximum concurrent flow to the sparsest cut was shown to be bounded by O(log^2 k) [30, 57, 17, 47]. We use ideas from our proof of the approximate max-flow min-multicut theorem and a geometric scaling technique from [1] to provi...
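The combinatorial side of the theorem, the min-multicut, can be stated directly: the smallest set of edges whose removal disconnects every source-sink pair. The sketch below finds it exhaustively on tiny graphs; this is only to make the object concrete (the problem is NP-hard in general, which is what motivates the O(log k) approximation), and the function names are invented here.

```python
from itertools import combinations

def disconnected(n, edges, s, t):
    """True iff t is unreachable from s in the undirected graph."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return t not in seen

def min_multicut(n, edges, pairs):
    """Smallest edge set whose removal separates every (s, t) pair.
    Exhaustive search over edge subsets -- illustration only."""
    for r in range(len(edges) + 1):
        for cut in combinations(edges, r):
            rest = [e for e in edges if e not in cut]
            if all(disconnected(n, rest, s, t) for s, t in pairs):
                return list(cut)
```

On the path 0-1-2 with the single pair (0, 2), removing either edge suffices, so the minimum multicut has size 1.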