Results 1–10 of 19
J. D. Hamkins, A. Lewis, Infinite time Turing machines
 Journal of Symbolic Logic
Cited by 83 (6 self)
Abstract. We extend in a natural way the operation of Turing machines to infinite ordinal time, and investigate the resulting supertask theory of computability and decidability on the reals. Every Π¹₁ set, for example, is decidable by such machines, and the semidecidable sets form a portion of the Δ¹₂ sets. Our oracle concept leads to a notion of relative computability for sets of reals and a rich degree structure, stratified by two natural jump operators. In these days of superfast computers whose speed seems to be increasing without bound, the more philosophical among us are perhaps pushed to wonder: what could we compute with an infinitely fast computer? By proposing a natural model for supertasks—computations with infinitely many steps—we provide in this paper a theoretical foundation on which to answer this question. Our model is simple: we simply extend the Turing machine concept into transfinite ordinal time. The resulting machines can perform infinitely many steps of computation, and go on to more computation after that. But mechanically they work just like Turing machines. In particular, they have the usual Turing machine hardware; there is still the same smooth infinite paper tape and the same mechanical head moving back and forth according to a finite algorithm, with finitely many states. What is new is the definition of the behavior of the machine at limit ordinal times. The resulting computability theory leads to a notion of computation on the reals, concepts of decidability and semidecidability for sets of reals as well as individual reals, two kinds of jump operator, and a notion of relative computability using oracles which gives a rich degree structure on both the collection of reals and the collection of sets of reals. But much remains unknown; we hope to stir interest in these ideas, which have been a joy for us to think about.
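A minimal sketch of the limit-stage behavior this abstract refers to: at a limit ordinal, each tape cell of an infinite time Turing machine takes the lim sup of its earlier values. The sketch below is illustrative only; the eventually-periodic encoding of a cell's history and the name `limsup_cell` are assumptions of this sketch, not anything from the paper.

```python
# Illustrative sketch: the limit rule sets each tape cell at a limit stage
# to the lim sup of its earlier values. For a 0/1-valued cell, the lim sup
# is 1 exactly when the value is 1 cofinally (i.e. infinitely) often.
# We model a cell's history as eventually periodic: a finite prefix
# followed by a repeating block.

def limsup_cell(prefix, repeating_block):
    """Limit-stage value of a cell whose history is prefix + repeating_block repeated forever."""
    # A 1 occurs infinitely often iff a 1 occurs in the repeating part.
    return 1 if 1 in repeating_block else 0

# A cell that flashes 0, 1, 0, 1, ... forever shows 1 at the limit stage:
assert limsup_cell([0], [0, 1]) == 1
# A cell that eventually stabilizes at 0 shows 0:
assert limsup_cell([1, 1, 0], [0]) == 0
```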
Real recursive functions and their hierarchy
, 2004
Cited by 20 (2 self)
...considered, first as a model of analog computation, and second to obtain analog characterizations of classical computational complexity classes (Unconventional Models of Computation, UMC 2002, Lecture Notes in Computer Science, Vol. 2509, Springer, Berlin, pp. 1–14). However, one of the operators introduced in the seminal paper by Moore (1996), the minimalization operator, has not been considered: (a) although differential recursion (the analog counterpart of classical recurrence) is, to some extent, directly implementable in the General Purpose Analog Computer of Claude Shannon, analog minimalization is far from physical realizability, and (b) analog minimalization was borrowed from classical recursion theory and does not fit well with the analytic realm of analog computation. In this paper, we show that a most natural operator captured from analysis, the operator of taking a limit, can be used properly to enhance the theory of recursion over the reals, providing good solutions to puzzling problems raised by the original model.
Some recent developments on Shannon’s general purpose analog computer
 Mathematical Logic Quarterly
Cited by 20 (7 self)
This paper revisits one of the first models of analog computation, the General Purpose Analog Computer (GPAC). In particular, we restrict our attention to the improved model presented in [11] and we show that it can be further refined. With this we prove the following: (i) the previous model can be simplified; (ii) it admits extensions having close connections with the class of smooth continuous time dynamical systems. As a consequence, we conclude that some of these extensions achieve Turing universality. Finally, it is shown that if we introduce a new notion of computability for the GPAC, based on ideas from computable analysis, then one can compute transcendentally transcendental functions such as the Gamma function or Riemann's Zeta function.
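For context on the claim above: the functions generable by Shannon's classical GPAC are differentially algebraic, and a "transcendentally transcendental" function is one satisfying no algebraic differential equation. By Hölder's theorem the Gamma function,

```latex
\Gamma(x) \;=\; \int_0^{\infty} t^{\,x-1} e^{-t}\,dt ,
```

is of exactly this kind, which is why computing it requires going beyond the classical GPAC notion of computability.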
The many forms of hypercomputation
 Applied Mathematics and Computation
, 2006
Cited by 17 (0 self)
This paper surveys a wide range of proposed hypermachines, examining the resources that they require and the capabilities that they possess.
BioSteps Beyond Turing
 BIOSYSTEMS
, 2004
Cited by 9 (0 self)
Are there 'biologically computing agents' capable of computing Turing-uncomputable functions? It is perhaps tempting to dismiss this question with a negative answer. Quite the opposite: for the first time in the literature on molecular computing, we contend that the answer is not theoretically negative. Our results will be formulated in the language of membrane computing (P systems). Some mathematical results presented here are interesting in themselves. In contrast with most speedup methods, which are based on nondeterminism, our results rest upon some universality results proved for deterministic P systems. These results will be used for building "accelerated P systems". In contrast with the case of Turing machines, acceleration is a part of the hardware (not a quality of the environment) and it is realised either by decreasing the size of "reactors" or by speeding up the communication channels.
Breaking the Turing Barrier
, 1998
Cited by 3 (0 self)
[Figure 1: Various types of Turing machines.]
In recent years, researchers have looked at natural processes in the physical and biological world as motivation for constructing new models of computation holding out the hope of breaking the "Turing barrier." But are there alternatives? The quantum phenomenon of interference has led to one such model, as has the process of folding of DNA strands in a living cell. In addition, refinements to the Turing view of computing have led to "super-Turing" models that allow one to compute in ways that transcend Turing's original scheme. Breaking Turing's barrier is doubly important: (a) theoretically, as unconventional models are explored with an eye toward underst ...
Accelerating Machines
, 2006
Cited by 1 (1 self)
This paper presents an overview of accelerating machines. We begin by exploring the history of the accelerating machine model and the potential power that it provides. We look at some of the problems that could be solved with an accelerating machine, and review some of the possible implementation methods that have been presented. Finally, we expose the limitations of accelerating machines and conclude by posing some problems for further research.
SuperTasks, Accelerating Turing Machines and Uncomputability
Cited by 1 (0 self)
Accelerating Turing machines are abstract devices that have the same computational structure as Turing machines, but can perform supertasks. I argue that performing supertasks alone does not buy more computational power, and that accelerating Turing machines do not solve the halting problem. To show this, I analyze the reasoning that leads to Thomson's paradox, point out that the paradox rests on a conflation of different perspectives of accelerating processes, and conclude that the same conflation underlies the claim that accelerating Turing machines can solve the halting problem.
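The acceleration behind Thomson's paradox can be made concrete: if a machine performs its n-th step in 2⁻ⁿ seconds, then all infinitely many steps fit inside a finite interval, since

```latex
\sum_{n=0}^{\infty} 2^{-n} \;=\; \frac{1}{1-\tfrac{1}{2}} \;=\; 2 .
```

The paradox then concerns what state the device is in at t = 2, once the supertask has completed, which is the conflation of perspectives the abstract above analyzes.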
Technical Report No. 2006526: Unconventional Computing Problems?
Abstract. An evolving computation is one whose characteristics vary during its execution. These variations have many different origins and can manifest themselves in several ways. Thus, for example, the parameters of a computation, such as the data it uses, may vary with time independently of the computational environment in which the computation is carried out. Alternatively, it may be that the data interact with one another during the computation, thus changing each other's value irreversibly. In this paper we describe a number of evolving computational paradigms, such as computations with time-varying variables, interacting variables, time-varying complexity, and so on. We show that evolving computations demonstrate the impossibility of achieving universality in computing, be it conventional or unconventional.
The modal argument for hypercomputing minds
 Theoretical Computer Science
, 2003