Results 1–10 of 19
A survey on continuous time computations
 New Computational Paradigms
Cited by 11 (2 self)
We provide an overview of theories of continuous time computation. These theories allow us to understand both the hardness of questions related to continuous time dynamical systems and the computational power of continuous time analog models. We survey the existing models, summarizing results and pointing to relevant references in the literature.
Closed Timelike Curves Make Quantum and Classical Computing Equivalent
Cited by 9 (2 self)
While closed timelike curves (CTCs) are not known to exist, studying their consequences has led to nontrivial insights in general relativity, quantum information, and other areas. In this paper we show that if CTCs existed, then quantum computers would be no more powerful than classical computers: both would have the (extremely large) power of the complexity class PSPACE, consisting of all problems solvable by a conventional computer using a polynomial amount of memory. This solves an open problem proposed by one of us in 2005, and gives an essentially complete understanding of computational complexity in the presence of CTCs. Following the work of Deutsch, we treat a CTC as simply a region of spacetime where a “causal consistency” condition is imposed, meaning that Nature has to produce a (probabilistic or quantum) fixed point of some evolution operator. Our conclusion is then a consequence of the following theorem: given any quantum circuit (not necessarily unitary), a fixed point of the circuit can be (implicitly) computed in polynomial space. This theorem might have independent applications in quantum information.
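The causal-consistency condition can be illustrated in the classical case with a minimal sketch (not the paper's PSPACE algorithm, and the operator below is a made-up example): Deutsch's condition asks for a probability distribution p that the CTC's evolution operator M leaves unchanged, i.e. M p = p, which for a small stochastic matrix can be approximated by power iteration.

```python
# Illustrative sketch only: find a probabilistic fixed point M p = p of a
# stochastic evolution operator M, as required by causal consistency.

def fixed_point(M, iters=1000):
    """Approximate a distribution p with M p = p by power iteration.

    M is a column-stochastic matrix given as a list of rows.
    """
    n = len(M)
    p = [1.0 / n] * n                      # start from the uniform distribution
    for _ in range(iters):
        p = [sum(M[i][j] * p[j] for j in range(n)) for i in range(n)]
    return p

# A 2-state "grandfather paradox" style operator: a NOT gate mixed with a
# little noise (eps).  The only consistent distribution is uniform.
eps = 0.01
M = [[eps, 1 - eps],
     [1 - eps, eps]]
p = fixed_point(M)   # ~ [0.5, 0.5]
```

With eps = 0 (a pure NOT gate) no deterministic state is consistent, but the uniform distribution still is; this is exactly Deutsch's resolution of the grandfather paradox mentioned in the abstract.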
An algorithmic framework for compression and text indexing
Cited by 5 (0 self)
We present a unified algorithmic framework to obtain nearly optimal space bounds for text compression and compressed text indexing, apart from lower-order terms. For a text T of n symbols drawn from an alphabet Σ, our bounds are stated in terms of the hth-order empirical entropy of the text, H_h. In particular, we provide a tight analysis of the Burrows–Wheeler transform (bwt), establishing a bound of nH_h + M(T, Σ, h) bits, where M(T, Σ, h) denotes the asymptotic number of bits required to store the empirical statistical model for contexts of order h appearing in T. Using the same framework, we also obtain an implementation of the compressed suffix array (csa) which achieves nH_h + M(T, Σ, h) + O(n lg lg n / lg_{|Σ|} n) bits of space while still retaining competitive full-text indexing functionality. The novelty of the proposed framework lies in its use of the finite set model instead of the empirical probability model (as in previous work), giving us new insight into the design and analysis of our algorithms. For example, we show that our analysis gives improved bounds, since M(T, Σ, h) ≤ min{g′_h lg(n/g′_h + 1), H*_h n + lg n + g′′_h}, where g′_h = O(|Σ|^(h+1)) and g′′_h = O(|Σ|^(h+1) lg |Σ|^(h+1)) do not depend on the text length n, while H*_h ≥ H_h is the modified hth-order empirical entropy of T. Moreover, we show a strong relationship between a compressed full-text index and the succinct dictionary problem. We also examine the importance of lower-order terms, as these can dwarf any savings achieved by high-order entropy. We report further results and tradeoffs on high-order entropy-compressed text indexes in the paper.
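The quantity H_h underlying these bounds can be computed directly from its standard definition (a hypothetical helper, not code from the paper): n·H_h is the sum, over each length-h context w occurring in T, of the zeroth-order entropy of the symbols that follow w.

```python
# Hypothetical illustration of h-th order empirical entropy H_h (bits per
# symbol), computed from its definition rather than taken from the paper.
from collections import Counter, defaultdict
from math import log2

def empirical_entropy(text, h):
    """Return H_h(text): average bits per symbol given the h preceding symbols."""
    n = len(text)
    followers = defaultdict(Counter)       # length-h context -> next-symbol counts
    for i in range(h, n):
        followers[text[i - h:i]][text[i]] += 1
    total_bits = 0.0
    for counts in followers.values():
        m = sum(counts.values())
        total_bits += sum(-c * log2(c / m) for c in counts.values())
    return total_bits / n

# H_0 of a balanced binary string is 1 bit/symbol, but a single symbol of
# context fully predicts "abab...", so H_1 drops to 0.
print(empirical_entropy("abababab", 0))   # 1.0
print(empirical_entropy("abababab", 1))   # 0.0
```

This is why compressed indexes target nH_h rather than nH_0: higher-order context can shrink the entropy term dramatically, at the price of the model cost M(T, Σ, h) the abstract accounts for.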
How much can analog and hybrid systems be proved (super)Turing
 Applied Mathematics and Computation
, 2006
Cited by 5 (1 self)
The Church thesis and its variants say, roughly, that no reasonable model of computation has more power than Turing machines. In a contrapositive way, they say that any model with super-Turing power must have something unreasonable. Our aim is to discuss how far theoretical computer science can quantify this, by considering several classes of continuous time dynamical systems and by studying the extent to which they can be proved Turing or super-Turing.
Can Intelligence Explode?
, 2012
Cited by 4 (2 self)
The technological singularity refers to a hypothetical scenario in which technological advances virtually explode. The most popular scenario is the creation of superintelligent algorithms that recursively create ever higher intelligences. It took many decades for these ideas to spread from science fiction to popular science magazines and finally to attract the attention of serious philosophers. David Chalmers’ (JCS 2010) article is the first comprehensive philosophical analysis of the singularity in a respected philosophy journal. The motivation of my article is to augment Chalmers’ and to discuss some issues not addressed by him, in particular what it could mean for intelligence to explode. In the course of this, I will (have to) provide a more careful treatment of what intelligence actually is, separate speed from intelligence explosion, compare what superintelligent participants and classical human observers might experience and do, discuss immediate implications for the diversity and value of life, consider possible bounds on intelligence, and contemplate intelligences …
SIMPL system: on a public key variant of physical unclonable function
 Cryptology ePrint Archive
, 2009
Cited by 4 (1 self)
This paper theoretically discusses a novel security tool termed a SIMPL system, which can be regarded as a public key version of physical unclonable functions (PUFs). Like the latter, a SIMPL system S is physically unique and non-reproducible, and implements an individual function FS. In contrast to a PUF, however, a SIMPL system S possesses a publicly known numerical description D(S), which allows its digital simulation and prediction. At the same time, it is required that any digital simulation of a SIMPL system S must work at a detectably lower speed than its real-time behavior. In other words, the holder of a SIMPL system S can evaluate a publicly known, publicly computable function FS faster than anyone else. This feature, so we argue in this paper, allows a number of improved practicality and security features. Once implemented successfully, SIMPL systems would have specific advantages over PUFs, certificates of authenticity, physically obfuscated keys, and also over standard mathematical crypto-techniques.
Why philosophers should care about computational complexity
 In Computability: Gödel, Turing, Church, and beyond (eds
, 2012
Cited by 3 (0 self)
One might think that, once we know something is computable, how efficiently it can be computed is a practical question with little further philosophical importance. In this essay, I offer a detailed case that one would be wrong. In particular, I argue that computational complexity theory—the field that studies the resources (such as time, space, and randomness) needed to solve computational problems—leads to new perspectives on the nature of mathematical knowledge, the strong AI debate, computationalism, the problem of logical omniscience, Hume’s problem of induction, Goodman’s grue riddle, the foundations of quantum mechanics, economic rationality, closed timelike curves, and several other topics of philosophical interest. I end by discussing …
Aggregating Robots Compute: An Adaptive Heuristic for the Euclidean Steiner Tree Problem
Cited by 2 (0 self)
It is becoming state-of-the-art to form large-scale multi-agent systems or artificial swarms showing adaptive behavior by constructing high numbers of cooperating, embodied, mobile agents (robots). For the sake of space and cost efficiency, such robots are typically miniaturized and equipped with only a few sensors and actuators, resulting in rather simple devices. In order to overcome these constraints, bio-inspired concepts of self-organization and emergent properties are applied. Thus, accuracy is usually not a trait of such systems, but robustness and fault tolerance are. It turns out that they are applicable to even hard problems and reliably deliver approximated solutions. Based on these principles, we present a heuristic for the Euclidean Steiner tree problem, which is NP-hard. Basically, it is the problem of connecting objects in a plane efficiently. The proposed system is investigated from two different viewpoints: computationally and behaviorally. While the performance is, as expected, clearly suboptimal but still reasonably good, the system is adaptive and robust.
Quantum Versions of k-CSP Algorithms: a First Step Towards Quantum Algorithms for Interval-Related Constraint Satisfaction Problems
In data processing, we input the results x̃i of measuring easy-to-measure quantities xi and use these results to find estimates ỹ = f(x̃1, …, x̃n) for difficult-to-measure quantities y which are related to xi by a known relation y = f(x1, …, xn). Due to measurement inaccuracy, the measured values x̃i are, in general, different from the (unknown) actual values xi of the measured quantities, hence the result ỹ of data processing is different from the actual value of the quantity y. In many practical situations, we only know the bounds ∆i on the measurement errors ∆xi = x̃i − xi. In such situations, we only know that the actual value xi belongs to the interval [x̃i − ∆i, x̃i + ∆i], and we want to know the range of possible values of y. The corresponding problems of interval computations are NP-hard, so solving these problems may take an unrealistically long time. One way to speed up computations is to use quantum computing, and quantum versions of interval computation algorithms have indeed been developed. In many practical situations, we also know some constraints on the possible values of the directly measured quantities x1, …, xn. In such situations, we must combine interval techniques with constraint satisfaction techniques. It is therefore desirable to extend quantum interval algorithms to such combinations. As a first step towards this combination, in this paper, we consider quantum algorithms for discrete constraint satisfaction problems.
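The interval-computation setting described above can be sketched with classical interval arithmetic (a toy example with made-up numbers, not one of the paper's quantum algorithms): each measured x̃i with bound ∆i yields an interval, and interval operations propagate these through f to enclose the range of y.

```python
# Toy illustration of interval computations: enclose the range of
# y = f(x1, x2) = x1*x2 + x1 when each xi lies in a known interval.

def i_add(a, b):
    """Sum of two intervals (lo, hi)."""
    return (a[0] + b[0], a[1] + b[1])

def i_mul(a, b):
    """Product of two intervals: min/max over endpoint products."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

# Measurements x~ = (2.0, 3.0) with error bounds Delta = (0.1, 0.2).
x1 = (2.0 - 0.1, 2.0 + 0.1)    # [1.9, 2.1]
x2 = (3.0 - 0.2, 3.0 + 0.2)    # [2.8, 3.2]
y = i_add(i_mul(x1, x2), x1)   # ~ (7.22, 8.82)
```

Because this f is monotone in each variable, the enclosure here happens to be tight; in general, naive interval arithmetic overestimates the range, and computing the exact range is precisely the NP-hard problem the abstract refers to.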
Immunology as a Metaphor for Adaptive and Distributed Information Processing
How can one effectively exploit the adaptive and distributed information processing characteristics of the immune system for the purposes of computation and engineering problem domains? The field focused on this problem is artificial immune systems, and this work provides a novel hierarchical framework as to how such work may proceed from at least three similar, although distinct, directions.