Results 1–10 of 11
Making DNA computers error resistant
, 1995
"... We describe methods for making volume decreasing algorithms more resistant to certain types of errors. Such error recovery techniques are crucial if DNA computers ever become practical. Our first approach relies on applying PCR at various stages of the computation. We analyze its performance and sho ..."
Abstract

Cited by 37 (4 self)
 Add to MetaCart
We describe methods for making volume-decreasing algorithms more resistant to certain types of errors. Such error recovery techniques are crucial if DNA computers are ever to become practical. Our first approach relies on applying PCR at various stages of the computation. We analyze its performance and show that it increases the survival probability of various strands to acceptable proportions. Our second approach relies on changing the method by which information is encoded on DNA strands. This encoding is likely to reduce false negative errors during the bead separation procedure.

1 Introduction
In the short history of DNA (deoxyribonucleic acid) based computing there have already been a number of exciting results. It all started with Adleman's [A] beautiful insight that showed that biological experiments could solve the Directed Hamiltonian Path problem (DHP). Then, Lipton [L] showed how to use DNA to solve more general problems, namely to find satisfying assignments for arbitrary (direct...
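The PCR-based error recovery idea can be illustrated with a toy simulation (a sketch under our own assumptions, not the paper's analysis; all parameter values here are hypothetical). A witness strand must survive a series of error-prone separation steps; interleaving amplification between steps multiplies the surviving copies, so the witness is lost only if every copy is lost.

```python
import random

def survives(c0, q, m, a=1):
    """One trial: does at least one witness copy survive m rounds of
    (amplify by factor a, then each copy survives with probability q)?
    c0 is the initial copy count. Large populations are treated as safe,
    since total extinction from there is astronomically unlikely."""
    copies = c0
    for _ in range(m):
        if copies == 0:
            return False
        if copies >= 10_000:
            return True
        # PCR amplification followed by binomial thinning (separation loss)
        copies = sum(random.random() < q for _ in range(copies * a))
    return copies > 0

def survival_rate(c0, q, m, a=1, trials=200):
    """Monte Carlo estimate of the witness survival probability."""
    return sum(survives(c0, q, m, a) for _ in range(trials)) / trials

random.seed(0)
# Harsh separation: each copy survives a step with probability only 0.5.
print(survival_rate(c0=100, q=0.5, m=20))        # no PCR: near 0
print(survival_rate(c0=100, q=0.5, m=20, a=4))   # 4x PCR per round: near 1
```

The contrast is the point: without amplification the expected copy count halves each step and the witness is almost surely gone after 20 steps, while a modest amplification factor keeps the population supercritical.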
A Survey of Continuous-Time Computation Theory
 Advances in Algorithms, Languages, and Complexity
, 1997
"... Motivated partly by the resurgence of neural computation research, and partly by advances in device technology, there has been a recent increase of interest in analog, continuoustime computation. However, while specialcase algorithms and devices are being developed, relatively little work exists o ..."
Abstract

Cited by 29 (6 self)
 Add to MetaCart
Motivated partly by the resurgence of neural computation research, and partly by advances in device technology, there has been a recent increase of interest in analog, continuous-time computation. However, while special-case algorithms and devices are being developed, relatively little work exists on the general theory of continuous-time models of computation. In this paper, we survey the existing models and results in this area, and point to some of the open research questions.

1 Introduction
After a long period of oblivion, interest in analog computation is again on the rise. The immediate cause for this new wave of activity is surely the success of the neural networks "revolution", which has provided hardware designers with several new numerically based, computationally interesting models that are structurally sufficiently simple to be implemented directly in silicon. (For designs and actual implementations of neural models in VLSI, see e.g. [30, 45].) However, the more fundamental...
Molecular Computing, Bounded Nondeterminism, and Efficient Recursion
 In Proceedings of the 24th International Colloquium on Automata, Languages, and Programming
, 1998
"... The maximum number of strands used is an important measure of a molecular algorithm's complexity. This measure is also called the volume used by the algorithm. Every problem that can be solved by an NP Turing machine with b(n) binary nondeterministic choices can be solved by molecular computation in ..."
Abstract

Cited by 14 (5 self)
 Add to MetaCart
The maximum number of strands used is an important measure of a molecular algorithm's complexity. This measure is also called the volume used by the algorithm. Every problem that can be solved by an NP Turing machine with b(n) binary nondeterministic choices can be solved by molecular computation in a polynomial number of steps, with four test tubes, in volume 2^b(n). We identify a large class of recursive algorithms that can be implemented using bounded nondeterminism. This yields improved molecular algorithms for important problems like 3-SAT, independent set, and 3-colorability.

1. A model of molecular computing
Molecular computation was first studied in [1, 20]. The models we define were inspired as well by the work of [3, 28]. A molecular sequence is a string over an alphabet Σ (we can use any alphabet we like, encoding characters of Σ by finite sequences of base pairs). A test tube is a multiset of molecular sequences. We describe the allowable operations below. Whe...
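A test-tube-as-multiset model of this kind can be sketched in software (a hypothetical sketch in the common Lipton-style abstraction; the paper's exact operation set may differ). The initial tube holds all 2^b candidate strands, one per sequence of b binary nondeterministic choices, which is exactly the 2^b(n) volume bound quoted above.

```python
# A test tube is a multiset of strings; its volume is the strand count.
from collections import Counter
from itertools import product

def combine(t1, t2):
    """Pour two tubes together (multiset union)."""
    return t1 + t2

def separate(tube, symbol, position):
    """Bead-separation style extraction: split a tube into the strands
    whose given position carries `symbol`, and the rest."""
    yes = Counter({s: c for s, c in tube.items() if s[position] == symbol})
    no = Counter({s: c for s, c in tube.items() if s[position] != symbol})
    return yes, no

# Brute-force search over b nondeterministic binary choices.
b = 3
tube = Counter("".join(bits) for bits in product("01", repeat=b))
assert sum(tube.values()) == 2 ** b      # volume is 2^b

# Example filter: keep only assignments with x0 = 1 and x2 = 0
# (two separations, as a SAT-style clause check might use).
t, _ = separate(tube, "1", 0)
t, _ = separate(t, "0", 2)
print(sorted(t))   # -> ['100', '110']
```

Each separation is one laboratory step, so a polynomial number of separations over an exponential-volume tube mirrors the "polynomial steps, exponential volume" trade-off of the theorem.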
Biological Computing
, 1996
"... Adleman's [Adl94] successful solution of a sevenvertex instance of the NPcomplete Hamiltonian Path problem by recombinant DNA technology initiated the field of biological computing. We propose a very different model of molecular computing based on the biochemistry of RNA editing and RNA translatio ..."
Abstract

Cited by 12 (2 self)
 Add to MetaCart
Adleman's [Adl94] successful solution of a seven-vertex instance of the NP-complete Hamiltonian Path problem by recombinant DNA technology initiated the field of biological computing. We propose a very different model of molecular computing based on the biochemistry of RNA editing and RNA translation. In our model, individual molecules become fully capable general-purpose computers.

1 Introduction
Our goal in this paper is to show how chemically plausible modifications of well-understood biological systems can be used to construct general-purpose processors out of individual molecules. To this end, we summarize basic biochemistry in Section 3, in order to make the technological proposals of Sections 4 and 5 understandable. Scientific understanding of the molecular basis for cell biology has grown enormously in recent years. The cell is now understood as a computational system: its program resides in DNA, and its state in the distribution of chemical compounds and electrical charges. ...
A Comparison of Resource-Bounded Molecular Computation Models
 In Proceedings of the 5th Israel Symposium on Theory of Computing and Systems
, 1997
"... The number of molecular strands used by a molecular algorithm is an important measure of the algorithm's complexity. This measure is also called the volume used by the algorithm. We prove that three important polynomialtime models of molecular computation with bounded volume are equivalent to model ..."
Abstract

Cited by 8 (3 self)
 Add to MetaCart
The number of molecular strands used by a molecular algorithm is an important measure of the algorithm's complexity. This measure is also called the volume used by the algorithm. We prove that three important polynomial-time models of molecular computation with bounded volume are equivalent to models of polynomial-time Turing machine computation with bounded nondeterminism. Without any assumption, we show that the Split operation does not increase the power of polynomial-time molecular computation. Assuming a plausible separation between Turing machine complexity classes, the Amplify operation does increase the power of polynomial-time molecular computation.

1. Introduction
Molecular computation was first studied in [1, 15], which identified the number of molecular strands used as an important resource. This measure is also
(Research performed at Yale University and at the University of Maryland. Supported in part by the National Science Foundation under grants CCR-8958528 and CCR-94154...)
Active Transport in Biological Computing (Preliminary Version)
, 1996
"... Early papers on biological computing focussed on combinatorial and algorithmic issues, and worked with intentionally oversimplified chemical models. In this paper, we reintroduce complexity to the chemical model by considering the effect problem size has on the initial concentrations of reactants, a ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
Early papers on biological computing focused on combinatorial and algorithmic issues, and worked with intentionally oversimplified chemical models. In this paper, we reintroduce complexity to the chemical model by considering the effect problem size has on the initial concentrations of reactants, and the effect this has in turn on the rate of production and quantity of final reaction products. We give a sobering preliminary analysis of Adleman's technique for solving Hamiltonian path. Even on the simplest problems, the annealing phase of Adleman's technique requires time Ω(n²) rather than the O(log n) complexity given by a computationally inspired but chemically naive analysis. On more difficult problems, not only does the rate of production of witnessing molecules drop exponentially in problem size, the final yield also drops exponentially. These issues are not objections to biological computing per se, but rather difficulties to be overcome in its development as a v...
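The intuition behind a quadratic annealing bound can be shown with a back-of-the-envelope calculation (our own illustrative assumptions, not the paper's derivation). Under simple second-order kinetics A + B → AB with equal initial concentrations c₀, the time to half-completion is 1/(k·c₀); if a fixed amount of DNA is divided among Θ(n²) distinct edge oligos, each species' concentration scales as 1/n², so the annealing time scales as n².

```python
# Second-order kinetics toy model: with [A]0 = [B]0 = c0 and rate
# constant k, the half-completion time of A + B -> AB is 1/(k*c0).

def half_time(c0, k=1.0):
    """Half-completion time of a bimolecular annealing reaction."""
    return 1.0 / (k * c0)

total = 1.0   # fixed total strand concentration, arbitrary units
for n in (4, 8, 16):
    c0 = total / n**2           # dilution across Theta(n^2) species
    print(n, half_time(c0))     # 16.0, 64.0, 256.0 -- quadratic in n
```

Quadrupling n multiplies the per-species dilution by 16 and the half-completion time by 16 as well, consistent in spirit with the Ω(n²) annealing-time claim above.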
Solving Intractable Problems with DNA Computing
 In Proceedings of the 13th Annual IEEE Conference on Computational Complexity
, 1998
"... We survey the theoretical use of DNA computing to solve intractable problems. We also discuss the relationship between problems in DNA computing and questions in complexity theory. 1. Introduction Adleman's pioneering experiment [1] opened the possibility that moderately large instances of NPcomp ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
We survey the theoretical use of DNA computing to solve intractable problems. We also discuss the relationship between problems in DNA computing and questions in complexity theory.

1. Introduction
Adleman's pioneering experiment [1] opened the possibility that moderately large instances of NP-complete problems might be solved via techniques from molecular biology. Since then numerous papers have explored more efficient molecular algorithms for particular problems in NP [27, 10, 3, 30, 8, 20, 21, 18], molecular solutions to PSPACE-complete problems [7, 37], and fault-tolerant molecular algorithms [12, 25]. Other papers have examined the relationships between molecular complexity classes and classical complexity classes [38, 19]. We will survey some of these advances in this paper. For previous surveys in DNA computing, see [24, 36, 34, 32].

2. Biological Background
DNA is the storage medium for genetic information. It is composed of units called nucleotides, distinguished by the che...
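Adleman's generate-and-filter strategy can be caricatured in software (a hypothetical toy graph and a deliberate simplification; the wet-lab steps become list filters): ligation of oligos generates many random paths in parallel, gel electrophoresis keeps paths of the right length, and per-vertex bead separation keeps paths that visit every vertex.

```python
import random

edges = {(0, 1), (1, 2), (2, 3), (0, 2), (3, 1)}   # toy directed graph
n = 4                                              # number of vertices
random.seed(1)

def random_path(length):
    """Random walk from the designated start vertex 0, mimicking
    random ligation of edge oligos."""
    path = [0]
    while len(path) < length:
        nxt = [v for u, v in edges if u == path[-1]]
        if not nxt:
            break
        path.append(random.choice(nxt))
    return tuple(path)

soup = [random_path(n) for _ in range(5000)]        # massive parallelism
right_len = [p for p in soup if len(p) == n]        # "gel" length filter
hamiltonian = {p for p in right_len if len(set(p)) == n}  # vertex filters
print(hamiltonian)   # the witness paths, e.g. 0-1-2-3 and 0-2-3-1
```

The molecular appeal is that the "soup" step happens for all candidates simultaneously; the software version makes explicit that the candidate space, and hence the required volume, grows exponentially with instance size.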
Recent Developments in DNA-Computing
 In Proceedings of the 1997 27th International Symposium on Multiple-Valued Logic
, 1997
"... In 1994 Adleman published the description of a lab experiment, where he computed an instance of the Hamiltonian path problem with DNA in test tubes. He initiated a flood of further research on computing with molecular means in theoretical computer science. A great number of models was introduced and ..."
Abstract
 Add to MetaCart
In 1994 Adleman published the description of a lab experiment in which he computed an instance of the Hamiltonian path problem with DNA in test tubes. He initiated a flood of further research on computing with molecular means in theoretical computer science. A great number of models have been introduced and examined with respect to their computational power (universality as well as time and space complexity), their efficiency, and their error resistance. The main results are presented in this survey.