Results 1–10 of 18
The Broad Conception Of Computation
 American Behavioral Scientist
, 1997
Abstract

Cited by 12 (2 self)
A myth has arisen concerning Turing's paper of 1936, namely that Turing set forth a fundamental principle concerning the limits of what can be computed by machine – a myth that has passed into cognitive science and the philosophy of mind, to wide and pernicious effect. This supposed principle, sometimes incorrectly termed the 'Church–Turing thesis', is the claim that the class of functions that can be computed by machines is identical to the class of functions that can be computed by Turing machines. In point of fact Turing himself nowhere endorses, nor even states, this claim (nor does Church). I describe a number of notional machines, both analogue and digital, that can compute more than a universal Turing machine. These machines are exemplars of the class of nonclassical computing machines. Nothing known at present rules out the possibility that machines in this class will one day be built, nor that the brain itself is such a machine. These theoretical considerations undercut a numb...
How can Nature help us compute
 SOFSEM 2006: Theory and Practice of Computer Science – 32nd Conference on Current Trends in Theory and Practice of Computer Science, Merin, Czech Republic, January 21–27
, 2006
Abstract

Cited by 11 (3 self)
Abstract. Ever since Alan Turing gave us a machine model of algorithmic computation, there have been questions about how widely it is applicable (some asked by Turing himself). Although the computer on our desk can be viewed in isolation as a Universal Turing Machine, there are many examples in nature of what looks like computation, but for which there is no well-understood model. In many areas, we have to come to terms with emergence not being clearly algorithmic. The positive side of this is the growth of new computational paradigms based on metaphors for natural phenomena, and the devising of very informative computer simulations obtained by copying nature. This talk is concerned with general questions such as:
• Can natural computation, in its various forms, provide us with genuinely new ways of computing?
• To what extent can natural processes be captured computationally?
• Is there a universal model underlying these new paradigms?
Fast Quantum Algorithms for Handling Probabilistic and Interval Uncertainty
, 2003
Abstract

Cited by 7 (7 self)
In this paper, we show how the use of quantum computing can speed up some computations related to interval and probabilistic uncertainty. We end the paper with speculations on whether (and how) "hypothetic" physical devices can compute NP-hard problems faster than in exponential time.
On Effective Procedures
 Minds and Machines
, 2002
Abstract

Cited by 7 (2 self)
Abstract. Since the mid-twentieth century, the concept of the Turing machine has dominated thought about effective procedures. This paper presents an alternative to Turing's analysis; it unifies, refines, and extends my earlier work on this topic. I show that Turing machines cannot live up to their billing as paragons of effective procedure; at best, they may be said to provide us with mere procedure schemas. I argue that the concept of an effective procedure crucially depends upon distinguishing procedures as definite courses of action (types) from the particular courses of action (tokens) that actually instantiate them and the causal processes and/or interpretations that ultimately make them effective. On my analysis, effectiveness is not just a matter of logical form; 'content' matters. The analysis I provide has the advantage of applying to ordinary, everyday procedures such as recipes and methods, as well as the more refined procedures of mathematics and computer science. It also has the virtue of making better sense of the physical possibilities for hypercomputation than the received view and its extensions, e.g. Turing's o-machines and accelerating machines. Key words: causal process, effective procedure, hypercomputation, precisely described instruction, procedure schema, quotidian procedure, Turing machine
Positional Value and Linguistic Recursion
 Springer Netherlands
, 2007
Abstract

Cited by 2 (0 self)
Computation and Natural Language
The confluence of linguistic and mathematical thought in ancient India provides a unique view of how modern mathematics and computation rely on linguistic and cognitive skills. The linchpin of the analysis is the use of positional notation as a counting method for ancient and modern arithmetical procedures. Positional notation is a primary contribution from India to the development of modern mathematics, and in ancient India bridges mathematics to Indian linguistics. Pāṇini's grammar, while not thought of as mathematical, uses techniques essential to modern logic and the theory of computation, and is the most thoroughgoing historical example of algorithmic and formal methods until the nineteenth century. Taken together, modern logic and ancient algorithmics show how computation of all kinds is constructed from language pattern and use. To set the stage we start with the contemporary idea that all kinds of mathematics can be thought of as sets of formulas or sentences expressed in some formal language. Such sets are called theories, and are often thought of as being algorithmically generated by some precise rules of proof, such as the rules of predicate logic applied to domain-specific, or "non-logical," axioms with specially defined terms. So there are theories of arithmetic based on axioms for addition and multiplication; set theories based on axioms for set membership and formation; theories of the real numbers; various kinds of geometry, algebra, and so on. Today such proof systems can also be thought of as computations, which mainly means spelling out the details by which an
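The algorithmic character of positional notation that this abstract appeals to can be made concrete with a minimal Python sketch (the function name and Horner-style formulation are illustrative assumptions, not anything from the paper itself):

```python
def positional_value(digits, base=10):
    """Evaluate a digit sequence (most significant digit first) in a given
    base, as positional notation prescribes: each position multiplies the
    accumulated value by the base before adding the next digit."""
    value = 0
    for d in digits:
        value = value * base + d  # Horner-style accumulation
    return value

# The same procedure works uniformly in any base:
decimal = positional_value([4, 0, 7])          # 407 in base 10
binary = positional_value([1, 0, 1, 1], base=2)  # 1011 in base 2, i.e. 11
```

The uniformity of this one loop across bases is precisely what makes positional notation a general counting method rather than a collection of ad hoc rules.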
A natural axiomatization of Church’s thesis
, 2007
Abstract

Cited by 2 (0 self)
The Abstract State Machine Thesis asserts that every classical algorithm is behaviorally equivalent to an abstract state machine. This thesis has been shown to follow from three natural postulates about algorithmic computation. Here, we prove that augmenting those postulates with an additional requirement regarding basic operations implies Church's Thesis, namely, that the only numeric functions that can be calculated by effective means are the recursive ones (which are the same, extensionally, as the Turing-computable numeric functions). In particular, this gives a natural axiomatization of Church's Thesis, as Gödel and others suggested may be possible.
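As a concrete reminder of what "the recursive functions" in this abstract are, here is a minimal sketch of addition and multiplication defined by primitive recursion, transliterated into Python (an illustration of the standard textbook definitions, not part of the paper's formal apparatus):

```python
def add(m, n):
    # Primitive recursion on n: add(m, 0) = m; add(m, n + 1) = successor(add(m, n)).
    return m if n == 0 else add(m, n - 1) + 1

def mul(m, n):
    # Primitive recursion on n: mul(m, 0) = 0; mul(m, n + 1) = add(mul(m, n), m).
    return 0 if n == 0 else add(mul(m, n - 1), m)
```

Every function built up this way (plus minimization) is computable by a Turing machine, and vice versa; Church's Thesis is the further claim that this class exhausts what is effectively calculable at all.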
The Incomputable Alan Turing
 In the Proceedings of
Abstract

Cited by 1 (1 self)
The last century saw dramatic challenges to the Laplacian predictability which had underpinned scientific research for around 300 years. Basic to this was Alan Turing’s 1936 discovery (along with Alonzo Church) of the existence of unsolvable problems. This paper focuses on incomputability as a powerful theme in Turing’s work and personal life, and examines its role in his evolving concept of machine intelligence. It also traces some of the ways in which important new developments are anticipated by Turing’s ideas in logic. This paper is based on the talk given on 5th June 2004 at the conference at Manchester University marking the 50th anniversary of Alan Turing’s death. It is published by the British Computer Society on
Some Reflections on Alan Turing's Centenary
"... We review two of Alan Turing’s chief publications in mathematical ..."
Fast Quantum Algorithms for Handling Probabilistic and Interval Uncertainty
, 2003
Abstract
In many real-life situations, we are interested in the value of a physical quantity y that is difficult or impossible to measure directly. To estimate y, we find some easier-to-measure quantities x1,..., xn which are related to y by a known relation y = f(x1,..., xn). Measurements are never 100% accurate; hence, the measured values ˜xi are different from xi, and the resulting estimate ˜y = f(˜x1,..., ˜xn) is different from the desired value y = f(x1,..., xn). How different can it be? The traditional engineering approach to error estimation in data processing assumes that we know the probabilities of different measurement errors ∆xi = ˜xi − xi. In many practical situations, we only know an upper bound ∆i for this error; hence, after the measurement, the only information that we have about xi is that it belongs to the interval xi = [˜xi − ∆i, ˜xi + ∆i]. In this case, it is important to find the range y of all possible values of y = f(x1,..., xn) when xi ∈ xi. We start the paper with a brief overview of the computational complexity of the corresponding interval computation problems. Most of the related problems turn out to be, in general, at least NP-hard. In this paper, we show how the use of quantum computing can speed up some computations related to interval and probabilistic uncertainty. We end the paper with speculations on whether (and how) "hypothetic" physical devices can compute NP-hard problems faster than in exponential time. Most of the paper's results were first presented at NAFIPS'2003 [30].
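The interval-range problem described above can be sketched in a few lines of Python (the function names and the corner-enumeration shortcut are illustrative assumptions, not the paper's quantum algorithm):

```python
import itertools

def interval_range(f, boxes):
    """Estimate the range of f over the box boxes[0] x ... x boxes[n-1]
    by evaluating f at every corner.  This is exact for, e.g., multilinear
    f (such as a product of the variables), where extremes occur at
    corners; for general f the exact range problem is NP-hard, and this
    brute-force scan already costs 2**n evaluations."""
    values = [f(*corner) for corner in itertools.product(*boxes)]
    return min(values), max(values)

# Example: y = x1 * x2, with measured values 2.0 and 3.0 and error
# bounds 0.1, so x1 lies in [1.9, 2.1] and x2 lies in [2.9, 3.1].
lo, hi = interval_range(lambda a, b: a * b, [(1.9, 2.1), (2.9, 3.1)])
```

The exponential blow-up of the corner enumeration is one face of the NP-hardness the paper discusses, and is the kind of cost its quantum speedups are aimed at.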
HYPOCOMPUTATION?
Abstract
Abstract. Most research into hypercomputation focuses only on machines able to prove stronger results than the basic Turing Machine, hence the phrase hypercomputation. However, developing hypercomputational theories requires a deep understanding of computational theories: particularly the boundary between formal and informal theory present in all known computational theories. In this paper we argue for an investigation into computation from the informal side of this boundary, through the investigation of mathematical machines just below the boundary of computation. We will call this class of machines, utilising a weaker definition of 'computation', hypocomputational machines, and argue that by studying the assumptions used by these machines we should gain insights into the actions of machines such as the Turing Machine lying on the computational boundary. This in turn should provide a much firmer foundation for developing the theories required for hypercomputation, and may help in efforts to build a hypercomputer.