Results 1 – 10 of 11
General relativistic hypercomputing and foundation of mathematics
Cited by 4 (0 self)
Abstract. Looking at very recent developments in spacetime theory, we can wonder whether these results exhibit features of hypercomputation that traditionally seemed impossible or absurd. Namely, we describe a physical device in relativistic spacetime which can compute a non-Turing-computable task, e.g. which can decide the halting problem of Turing machines or decide whether ZF set theory is consistent (more precisely, can decide the theorems of ZF). Starting from this, we will discuss the impact of recent breakthrough results of relativity theory, black hole physics and cosmology on well-established foundational issues of computability theory as well as of logic. We find that the unexpected, revolutionary results in the mentioned branches of science force us to reconsider the status of the physical Church Thesis and to consider it as being seriously challenged. We will outline the consequences of all this for the foundation of mathematics (e.g. for Hilbert's programme). Observational, empirical evidence will be quoted to show that the statements above do not require any assumption of some physical universe outside our own: in our specific physical universe there seem to exist regions of spacetime supporting potential non-Turing computations. Additionally, new "engineering" ideas will be outlined for solving the so-called blueshift problem of GR-computing. Connections with related talks at the Physics and Computation meeting, e.g. those of Jérôme Durand-Lose, Mark Hogarth and Martin Ziegler, will be indicated.
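The halting problem mentioned in this abstract is the canonical non-Turing-computable task. A minimal sketch of the classical diagonal argument for why no ordinary program can decide it — the `halts` function below is a hypothetical placeholder, not anything implementable:

```python
# Sketch of the diagonal argument behind the halting problem, the
# task the relativistic device described above is claimed to solve.
# `halts` is a hypothetical total decider -- no real one can exist.

def halts(program, arg):
    """Pretend decider for 'does program(arg) halt?'."""
    raise NotImplementedError("no Turing machine can compute this")

def diagonal(program):
    # Halt exactly when `program`, run on itself, does not halt.
    if halts(program, program):
        while True:      # loop forever
            pass
    return "halted"

# Feeding `diagonal` to itself yields the contradiction: diagonal(diagonal)
# halts if and only if it does not halt, so no total `halts` can exist.
```

Any hypercomputer deciding `halts` therefore lies strictly beyond the Turing model, which is exactly the claim the abstract makes for relativistic spacetimes.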
Emergence as a Computability-Theoretic Phenomenon
, 2008
Cited by 3 (1 self)
In dealing with emergent phenomena, a common task is to identify useful descriptions of them in terms of the underlying atomic processes, and to extract enough computational content from these descriptions to enable predictions to be made. Generally, the underlying atomic processes are quite well understood, and (with important exceptions) captured by mathematics from which it is relatively easy to extract algorithmic content. A widespread view is that the difficulty in describing transitions from algorithmic activity to the emergence associated with chaotic situations is simply a case of complexity outstripping computational resources and human ingenuity; or, on the other hand, that phenomena transcending the standard Turing model of computation, if they exist, must necessarily lie outside the domain of classical computability theory. In this talk we suggest that much of the current confusion arises from conceptual gaps and the lack of a suitably fundamental model within which to situate emergence. We examine the potential for placing emergent relations in a familiar context based on Turing's 1939 model for interactive computation over structures described in terms of reals. The explanatory power of this model is explored, formalising informal descriptions in terms of mathematical definability and invariance, and relating a range of basic scientific puzzles to results and intractable problems in computability theory.
Can general relativistic computers break the Turing barrier?
Cited by 3 (2 self)
Abstract. Can general relativistic computers break the Turing barrier? Are there final limits to human knowledge? Limitative results versus human creativity (paradigm shifts). Gödel's logical results in comparison/combination with Gödel's relativistic results. Can Hilbert's programme be carried through after all?

1 Aims, perspective

The Physical Church–Turing Thesis, PhCT, is the conjecture that whatever physical computing device (in the broader sense) or physical thought experiment may be designed by any future civilization, it will always be simulatable by a Turing machine. The PhCT was formulated and generally accepted in the 1930s. At that time a general consensus was reached declaring PhCT valid, and indeed in the succeeding decades the PhCT was an extremely useful and valuable maxim in elaborating the foundations of theoretical computer science, logic, the foundation of mathematics and related areas. But since PhCT is partly a physical conjecture, we emphasize that this consensus of the 1930s was based on the physical worldview of the 1930s. Moreover, many thinkers considered PhCT as being based on
Extending and Interpreting Post’s Programme
, 2008
Cited by 1 (1 self)
Computability theory concerns information with a causal – typically algorithmic – structure. As such, it provides a schematic analysis of many naturally occurring situations. Emil Post was the first to focus on the close relationship between information, coded as real numbers, and its algorithmic infrastructure. Having characterised the close connection between the quantifier type of a real and the Turing jump operation, he looked for more subtle ways in which information entails a particular causal context. Specifically, he wanted to find simple relations on reals which produced richness of local computability-theoretic structure. In this respect, he was not just interested in causal structure as an abstraction, but in the way in which this structure emerges in natural contexts. Post's programme was the genesis of a more far-reaching research project. In this article we will first review the history of Post's programme, and look at two interesting developments of Post's approach. The first of these developments concerns the extension of the core programme, initially restricted to the Turing structure of the computably enumerable sets of natural numbers, to the Ershov hierarchy of sets. The second looks at new types of information coming from the recent growth of research into randomness, and the unexpected new computability-theoretic infrastructure this has revealed. We will conclude by viewing Post's programme from a more general perspective, looking at how algorithmic structure does not just emerge mathematically from information, but how that emergent structure can model the emergence of very basic aspects of the real world.
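The "quantifier type" and "Turing jump" connection this abstract refers to can be made precise; the following standard definitions (supplied here for context, not quoted from the paper) state it:

```latex
% Turing jump of a set A of natural numbers:
A' \;=\; \{\, e \in \mathbb{N} \;:\; \Phi_e^{A}(e)\downarrow \,\}
% Post's theorem links quantifier complexity to iterated jumps:
% a set is \Sigma^0_{n+1} if and only if it is computably
% enumerable relative to \emptyset^{(n)}.
```

So each additional number quantifier in a real's defining formula corresponds exactly to one more application of the jump, which is the characterisation attributed to Post above.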
How can natural brains help us compute?
Abstract. A model of biologically inspired natural computing is reviewed. Recurrent neural networks are set up so as to take advantage of emergent spatiotemporal chaotic regimes. Seminal work explaining the emergence of complexity in initially homogeneous physical and biological systems can be attributed to Alan Turing himself. Dynamical complexity provides a variety of computational modes and rich input-output relations in a dynamical perturbation scheme. Our model is initially proposed as an 'operational' device most suitable for the processing of spatially distributed input patterns varying in continuous time. Formalizations leading to hypercomputation can be envisaged.
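Chaotic regimes in randomly coupled recurrent networks of the kind this abstract exploits are easy to exhibit numerically. A minimal sketch assuming NumPy — the network size, gain value and rate dynamics here are illustrative textbook choices, not the authors' model:

```python
import numpy as np

# Minimal randomly coupled rate network driven toward a rich dynamical
# regime; with coupling gain g > 1 such networks tend toward chaos.
rng = np.random.default_rng(0)
N = 100                                           # number of units
g = 1.5                                           # coupling gain
W = g * rng.standard_normal((N, N)) / np.sqrt(N)  # recurrent weights

def step(x, u=0.0, dt=0.1):
    """One Euler step of the rate dynamics dx/dt = -x + W tanh(x) + u."""
    return x + dt * (-x + W @ np.tanh(x) + u)

x = 0.1 * rng.standard_normal(N)   # small random initial state
for _ in range(1000):
    x = step(x)

# tanh keeps firing rates bounded even while the state wanders chaotically
rates = np.tanh(x)
```

The external input `u` is the hook for the "dynamical perturbation scheme" the abstract mentions: spatially distributed, continuously varying patterns would be injected there and read out from `rates`.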
The Extended Turing Model As Contextual Tool
Abstract. Computability concerns information with a causal – typically algorithmic – structure. As such, it provides a schematic analysis of many naturally occurring situations. We look at ways in which computability-theoretic structure emerges in natural contexts, and at how algorithmic structure does not just emerge mathematically from information, but how that emergent structure can model the emergence of very basic aspects of the real world. The adequacy of the classical Turing model of computation — as first presented in [18] — is in question in many contexts. There is widespread doubt concerning the reducibility to this model of a broad spectrum of real-world processes and natural phenomena, from basic quantum mechanics to aspects of evolutionary development, or human mental activity. In 1939 Turing [19] described an extended model providing mathematical form to the algorithmic content of structures which are presented in terms of real numbers. Most scientific laws with a computational content can be framed
Typologies of Computation and Computational Models
Abstract. We need a much better understanding of information processing and of computation as its primary form. Future progress on new computational devices capable of dealing with problems of big data, the internet of things, the semantic web, cognitive robotics and neuroinformatics depends on adequate models of computation. In this article we first present the current state of the art through a systematisation of existing models and mechanisms, and outline a basic structural framework of computation. We argue that, defining computation as information processing, and given that there is no information without (physical) representation, the dynamics of information on the fundamental level is physical / intrinsic / natural computation. As a special case, intrinsic computation is used for designed computation in computing machinery. Intrinsic natural computation occurs on a variety of levels of physical processes, including the levels of computation of living organisms (including highly intelligent animals) as well as designed computational devices. The present article offers a typology of current models of computation and indicates future paths for the advancement of the field, both by the development of new computational models and by learning from nature how to better compute using different mechanisms of intrinsic computation.
Computing Nature – A Network of Networks of Concurrent Information Processes
This is a draft of the article to be published in the Springer book series SAPERE. The final publication will be available at
Symposium on Natural/Unconventional Computing at AISB/IACAP (British Society
The articles in the volume Computing Nature present a selection of works from the