Results 1–10 of 16
Relativistic Computers and the Turing Barrier
, 2006
Abstract
Cited by 22 (10 self)
We examine the current status of the physical version of the Church-Turing Thesis (PhCT for short) in view of the latest developments in spacetime theory. This also amounts to investigating the status of hypercomputation in view of the latest results on spacetime. We agree with Deutsch et al. [17] that PhCT is not only a conjecture of mathematics but rather a conjecture of a combination of theoretical physics, mathematics and, in some sense, cosmology. Since the idea of computability is intimately connected with the nature of Time, the relevance of spacetime theory seems unquestionable. We will see that recent developments in spacetime theory show that temporal developments may exhibit features that traditionally seemed impossible or absurd. We will see that recent results point in the direction that the possibility of artificial systems computing non-Turing-computable functions may be consistent with spacetime theory. All this triggers new open questions and new research directions for spacetime theory, cosmology, and computability.
Hypercomputation and the Physical Church-Turing Thesis
, 2003
Abstract
Cited by 21 (0 self)
A version of the Church-Turing Thesis states that every effectively realizable physical system can be defined by Turing Machines ('Thesis P'); in this formulation the Thesis appears an empirical, more than a logico-mathematical, proposition. We review the main approaches to computation beyond Turing definability ('hypercomputation'): supertask, non-well-founded, analog, quantum, and retrocausal computation. These models depend on infinite computation, explicitly or implicitly, and appear physically implausible; moreover, even if infinite computation were realizable, the Halting Problem would not be affected. Therefore, Thesis P is not essentially different from the standard Church-Turing Thesis.
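The claim that the Halting Problem survives even infinite computation rests on Turing's diagonal argument, which can be sketched in a few lines. The names below (`make_diagonal`, `candidate_halts`) are illustrative, not drawn from any of the papers listed:

```python
def make_diagonal(halts):
    """Given any claimed total halting decider, build the program that refutes it."""
    def diagonal(prog):
        if halts(prog, prog):
            while True:       # loop forever exactly when the decider says we halt
                pass
        return "halted"       # halt exactly when the decider says we loop
    return diagonal

# A (necessarily wrong) candidate decider: it claims nothing ever halts.
def candidate_halts(prog, arg):
    return False

diagonal = make_diagonal(candidate_halts)

# The candidate predicts diagonal(diagonal) runs forever, yet it halts:
print(candidate_halts(diagonal, diagonal))  # False
print(diagonal(diagonal))                   # halted
```

Whatever decider is supplied, `diagonal` does the opposite of what the decider predicts about `diagonal` itself, so no total, correct `halts` can exist; adding oracle or infinite-time resources merely relativises the same argument.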
Incomputability in Nature
Abstract
Cited by 14 (8 self)
To what extent is incomputability relevant to the material Universe? We look at ways in which this question might be answered, and the extent to which the theory of computability, which grew out of the work of Gödel, Church, Kleene and Turing, can contribute to a clear resolution of the current confusion. It is hoped that the presentation will be accessible to the non-specialist reader.
How can Nature help us compute?
 SOFSEM 2006: Theory and Practice of Computer Science – 32nd Conference on Current Trends in Theory and Practice of Computer Science, Merin, Czech Republic, January 21–27
, 2006
Abstract
Cited by 11 (3 self)
Ever since Alan Turing gave us a machine model of algorithmic computation, there have been questions about how widely it is applicable (some asked by Turing himself). Although the computer on our desk can be viewed in isolation as a Universal Turing Machine, there are many examples in nature of what looks like computation, but for which there is no well-understood model. In many areas, we have to come to terms with emergence not being clearly algorithmic. The positive side of this is the growth of new computational paradigms based on metaphors for natural phenomena, and the devising of very informative computer simulations obtained by copying nature. This talk is concerned with general questions such as:
• Can natural computation, in its various forms, provide us with genuinely new ways of computing?
• To what extent can natural processes be captured computationally?
• Is there a universal model underlying these new paradigms?
Emergence as a Computability-Theoretic Phenomenon
, 2008
Abstract
Cited by 3 (1 self)
In dealing with emergent phenomena, a common task is to identify useful descriptions of them in terms of the underlying atomic processes, and to extract enough computational content from these descriptions to enable predictions to be made. Generally, the underlying atomic processes are quite well understood, and (with important exceptions) captured by mathematics from which it is relatively easy to extract algorithmic content. A widespread view is that the difficulty in describing transitions from algorithmic activity to the emergence associated with chaotic situations is a simple case of complexity outstripping computational resources and human ingenuity; or, on the other hand, that phenomena transcending the standard Turing model of computation, if they exist, must necessarily lie outside the domain of classical computability theory. In this talk we suggest that much of the current confusion arises from conceptual gaps and the lack of a suitably fundamental model within which to situate emergence. We examine the potential for placing emergent relations in a familiar context based on Turing's 1939 model for interactive computation over structures described in terms of reals. The explanatory power of this model is explored, formalising informal descriptions in terms of mathematical definability and invariance, and relating a range of basic scientific puzzles to results and intractable problems in computability theory.
From Descartes to Turing: The Computational Content of Supervenience
Abstract
Cited by 1 (1 self)
Mathematics can provide precise formulations of relatively vague concepts and problems from the real world, and bring out underlying structure common to diverse scientific areas. Sometimes very natural mathematical concepts lie neglected and not widely understood for many years, before their fundamental relevance is recognised and their explanatory power is fully exploited. The notion of definability in a structure is such a concept, and Turing’s [77] 1939 model of interactive computation provides a fruitful context in which to exercise the usefulness of definability as a powerful and widely applicable source of understanding. In this article we set out to relate this simple idea to one of the oldest and apparently least scientifically approachable of problems — that of realistically modelling how mental properties supervene on physical ones.
Extending and Interpreting Post’s Programme
, 2008
Abstract
Cited by 1 (1 self)
Computability theory concerns information with a causal – typically algorithmic – structure. As such, it provides a schematic analysis of many naturally occurring situations. Emil Post was the first to focus on the close relationship between information, coded as real numbers, and its algorithmic infrastructure. Having characterised the close connection between the quantifier type of a real and the Turing jump operation, he looked for more subtle ways in which information entails a particular causal context. Specifically, he wanted to find simple relations on reals which produced richness of local computability-theoretic structure. To this extent, he was not just interested in causal structure as an abstraction, but in the way in which this structure emerges in natural contexts. Post's programme was the genesis of a more far-reaching research project. In this article we will first review the history of Post's programme, and look at two interesting developments of Post's approach. The first of these developments concerns the extension of the core programme, initially restricted to the Turing structure of the computably enumerable sets of natural numbers, to the Ershov hierarchy of sets. The second looks at new types of information coming from the recent growth of research into randomness, and the unexpected new computability-theoretic infrastructure this has revealed. We will conclude by viewing Post's programme from a more general perspective. We will look at how algorithmic structure does not just emerge mathematically from information, but how that emergent structure can model the emergence of very basic aspects of the real world.
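The "close connection between the quantifier type of a real and the Turing jump operation" referred to above is usually stated via Post's theorem; the following is the standard textbook formulation, not quoted from the paper:

```latex
% The Turing jump of a set A: the halting problem relativised to A,
% where \Phi_e^A is the e-th oracle machine with oracle A.
A' \;=\; \{\, e : \Phi_e^{A}(e)\!\downarrow \,\}

% Post's theorem: quantifier complexity matches iterated jumps.
% A set B is \Sigma^0_{n+1}-definable in arithmetic if and only if
% it is computably enumerable relative to the n-th jump of \emptyset:
B \in \Sigma^0_{n+1} \;\iff\; B \text{ is c.e. in } \emptyset^{(n)}
```

In particular, each extra number quantifier in a real's defining formula corresponds to one more application of the jump, which is the sense in which the jump calibrates "quantifier type".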
The Incomputable Alan Turing
Abstract
Cited by 1 (1 self)
The last century saw dramatic challenges to the Laplacian predictability which had underpinned scientific research for around 300 years. Basic to this was Alan Turing’s 1936 discovery (along with Alonzo Church) of the existence of unsolvable problems. This paper focuses on incomputability as a powerful theme in Turing’s work and personal life, and examines its role in his evolving concept of machine intelligence. It also traces some of the ways in which important new developments are anticipated by Turing’s ideas in logic.
Computability Theory
Abstract
Computability is perhaps the most significant and distinctive notion modern logic has introduced; in the guise of decidability and effective calculability it has a venerable history within philosophy and mathematics. Now it is also the basic theoretical concept for computer science, artificial intelligence and cognitive …
On the calculating power of Laplace’s demon (Part I)
, 2006
Abstract
We discuss several ways of making precise the informal concept of physical determinism, drawing on ideas from mathematical logic and computability theory. We outline a programme of investigating these notions of determinism in detail for specific, precisely articulated physical theories. We make a start on our programme by proposing a general logical framework for describing physical theories, and analysing several possible formulations of a simple Newtonian theory from the point of view of determinism. Our emphasis throughout is on clarifying the precise physical and metaphysical assumptions that typically underlie a claim that some physical theory is 'deterministic'. A sequel paper is planned, in which we shall apply similar methods to the analysis of other physical theories. Along the way, we discuss some possible repercussions of this kind of investigation for both physics and logic.