Results 1–10 of 88
The Symbol Grounding Problem
, 1990
"... There has been much discussion recently about the scope and limits of purely symbolic models of the mind and about the proper role of connectionism in cognitive modeling. This paper describes the "symbol grounding problem": How can the semantic interpretation of a formal symbol system be made intrin ..."
Abstract

Cited by 806 (14 self)
 Add to MetaCart
There has been much discussion recently about the scope and limits of purely symbolic models of the mind and about the proper role of connectionism in cognitive modeling. This paper describes the "symbol grounding problem": How can the semantic interpretation of a formal symbol system be made intrinsic to the system, rather than just parasitic on the meanings in our heads? How can the meanings of the meaningless symbol tokens, manipulated solely on the basis of their (arbitrary) shapes, be grounded in anything but other meaningless symbols? The problem is analogous to trying to learn Chinese from a Chinese/Chinese dictionary alone. A candidate solution is sketched: Symbolic representations must be grounded bottom-up in nonsymbolic representations of two kinds: (1) "iconic representations", which are analogs of the proximal sensory projections of distal objects and events, and (2) "categorical representations", which are learned and innate feature detectors that pick out the invariant features of object and event categories from their sensory projections. Elementary symbols are the names of these object and event categories, assigned on the basis of their (nonsymbolic) categorical representations. Higher-order (3) "symbolic representations", grounded in these elementary symbols, consist of symbol strings describing category membership relations (e.g., "An X is a Y that is Z"). Connectionism is one natural candidate for the mechanism that learns the invariant features underlying categorical representations, thereby connecting names to the proximal projections of the distal objects they stand for. In this way connectionism can be seen as a complementary component in a hybrid nonsymbolic/symbolic model of the mind, rather than a rival to purely symbolic modeling. Such ...
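The proposal lends itself to a toy sketch. The code below is illustrative only; the detectors, weights, thresholds, and the "zebra" composition are assumptions for exposition, not the paper's model. It shows elementary symbols assigned by nonsymbolic feature detectors, with a higher-order symbol defined as "a Y that is Z":

```python
# Toy sketch of the hybrid grounding scheme; all weights, thresholds,
# and category names are illustrative assumptions, not from the paper.

def make_detector(weights, threshold):
    """A 'categorical representation': an invariant-feature detector."""
    def detect(sensory_projection):
        score = sum(w * x for w, x in zip(weights, sensory_projection))
        return score > threshold
    return detect

# Elementary symbols: names assigned on the basis of detector outputs.
detectors = {
    "horse":   make_detector([1.0, 0.0, 0.5], 0.8),
    "striped": make_detector([0.0, 1.0, 0.2], 0.6),
}

def ground(sensory_projection):
    """Map a nonsymbolic sensory projection to grounded elementary symbols."""
    return {name for name, detect in detectors.items()
            if detect(sensory_projection)}

# Higher-order symbolic representation: "An X is a Y that is Z."
definitions = {"zebra": {"horse", "striped"}}

def classify(sensory_projection):
    grounded = ground(sensory_projection)
    return grounded | {name for name, parts in definitions.items()
                       if parts <= grounded}

print(classify([1.0, 1.0, 1.0]))  # {'horse', 'striped', 'zebra'}
```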
On the Length of Programs for Computing Finite Binary Sequences
 Journal of the ACM
, 1966
"... The use of Turing machines for calculating finite binary sequences is studied from the point of view of information theory and the theory of recursive functions. Various results are obtained concerning the number of instructions in programs. A modified form of Turing machine is studied from the same ..."
Abstract

Cited by 226 (7 self)
 Add to MetaCart
The use of Turing machines for calculating finite binary sequences is studied from the point of view of information theory and the theory of recursive functions. Various results are obtained concerning the number of instructions in programs. A modified form of Turing machine is studied from the same point of view. An application to the problem of defining a patternless sequence is proposed in terms of the concepts here developed. In this paper the Turing machine is regarded as a general-purpose computer and some practical questions are asked about programming it. Given an arbitrary finite binary sequence, what is the length of the shortest program for calculating it? What are the properties of those binary sequences of a given length which require the longest programs? Do most of the binary sequences of a given length require programs of about the same length? The questions posed above are answered in Part 1. In the course of answering them, the logical ...
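The counting argument behind the second and third questions can be made concrete. A minimal sketch (standard pigeonhole reasoning, not code from the paper):

```python
# Pigeonhole sketch: there are 2**n binary strings of length n but only
# 2**n - 1 binary programs shorter than n bits, so most strings cannot
# be computed by programs much shorter than themselves.

def programs_shorter_than(n_bits):
    """Count binary programs of length 0, 1, ..., n_bits - 1."""
    return sum(2**k for k in range(n_bits))  # equals 2**n_bits - 1

def incompressible_fraction(n, slack):
    """Lower bound on the fraction of n-bit strings with no program
    shorter than n - slack bits (each program computes at most one string)."""
    return 1 - programs_shorter_than(n - slack) / 2**n

for slack in (1, 8, 16):
    print(slack, incompressible_fraction(64, slack))
# slack=8: at least ~99.6% of 64-bit strings need programs of >= 56 bits
```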
A Basis for a Mathematical Theory of Computation
 Computer Programming and Formal Systems
, 1963
"... edited by P. Braffort and D. Hirshberg and published by NorthHolland. ..."
Abstract

Cited by 205 (6 self)
 Add to MetaCart
Edited by P. Braffort and D. Hirshberg and published by North-Holland.
A Survey of Computational Complexity Results in Systems and Control
, 2000
"... The purpose of this paper is twofold: (a) to provide a tutorial introduction to some key concepts from the theory of computational complexity, highlighting their relevance to systems and control theory, and (b) to survey the relatively recent research activity lying at the interface between these fi ..."
Abstract

Cited by 116 (21 self)
 Add to MetaCart
The purpose of this paper is twofold: (a) to provide a tutorial introduction to some key concepts from the theory of computational complexity, highlighting their relevance to systems and control theory, and (b) to survey the relatively recent research activity lying at the interface between these fields. We begin with a brief introduction to models of computation, the concepts of undecidability, polynomial-time algorithms, NP-completeness, and the implications of intractability results. We then survey a number of problems that arise in systems and control theory, some of them classical, some of them related to current research. We discuss them from the point of view of computational complexity and also point out many open problems. In particular, we consider problems related to stability or stabilizability of linear systems with parametric uncertainty, robust control, time-varying linear systems, nonlinear and hybrid systems, and stochastic optimal control.
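For orientation on the tractable end of this spectrum: stability of a single known linear system is decidable in polynomial time via an eigenvalue computation, as in the sketch below (the matrix is an illustrative assumption; the survey's point is that the same question under parametric uncertainty can become intractable):

```python
# Polynomial-time case: continuous-time stability of a fixed system
# x' = A x holds iff all eigenvalues of A lie in the open left half-plane.
# The example matrix is an illustrative assumption.

import numpy as np

def is_hurwitz(A):
    """True iff every eigenvalue of A has strictly negative real part."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

A = np.array([[0.0,  1.0],
              [-2.0, -3.0]])   # eigenvalues -1 and -2
print(is_hurwitz(A))           # True
```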
Quantum Algorithm for Hilbert’s Tenth Problem
 Int. J. Theor. Phys.
, 2003
"... We explore in the framework of Quantum Computation the notion of Computability, which holds a central position in Mathematics and Theoretical Computer Science. A quantum algorithm for Hilbert’s tenth problem, which is equivalent to the Turing halting problem and is known to be mathematically noncomp ..."
Abstract

Cited by 62 (10 self)
 Add to MetaCart
We explore in the framework of Quantum Computation the notion of Computability, which holds a central position in Mathematics and Theoretical Computer Science. A quantum algorithm for Hilbert’s tenth problem, which is equivalent to the Turing halting problem and is known to be mathematically noncomputable, is proposed where quantum continuous variables and quantum adiabatic evolution are employed. If this algorithm could be physically implemented, as much as it is valid in principle—that is, if a certain Hamiltonian and its ground state can be physically constructed according to the proposal—quantum computability would surpass classical computability as delimited by the Church-Turing thesis. It is thus argued that computability, and with it the limits of Mathematics, ought to be determined not solely by Mathematics itself but also by Physical Principles.
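The classical side of the claim is easy to exhibit: Hilbert's tenth problem is semi-decidable, since exhaustive search halts exactly when a solution exists. A minimal sketch (the polynomial and search order are illustrative assumptions; this is emphatically not the paper's quantum algorithm):

```python
# Classical semi-decision procedure: enumerate integer tuples in growing
# boxes; halts iff the Diophantine equation has a solution. No classical
# procedure can also certify that no solution exists.

from itertools import count, product

def find_integer_root(poly, nvars):
    """Return an integer root of poly if one exists; loop forever otherwise."""
    for bound in count(0):
        box = range(-bound, bound + 1)
        for xs in product(box, repeat=nvars):
            if poly(*xs) == 0:
                return xs

# Illustrative instance: x^2 + y^2 - 25 = 0 has integer roots.
print(find_integer_root(lambda x, y: x*x + y*y - 25, 2))  # e.g. (-5, 0)
```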
Information-theoretic Limitations of Formal Systems
 Journal of the ACM
, 1974
"... An attempt is made to apply informationtheoretic computational complexity to metamathematics. The paper studies the number of bits of instructions that must be a given to a computer for it to perform finite and infinite tasks, and also the amount of time that it takes the computer to perform these ..."
Abstract

Cited by 45 (7 self)
 Add to MetaCart
An attempt is made to apply information-theoretic computational complexity to metamathematics. The paper studies the number of bits of instructions that must be given to a computer for it to perform finite and infinite tasks, and also the amount of time that it takes the computer to perform these tasks. This is applied to measuring the difficulty of proving a given set of theorems, in terms of the number of bits of axioms that are assumed, and the size of the proofs needed to deduce the theorems from the axioms.
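One well-known concrete form of this limitation (Chaitin's incompleteness theorem, stated here from general knowledge rather than quoted from this paper) bounds what an axiom system can prove about program-size complexity K:

```latex
% For a sound formal system F there is a constant c_F, on the order of
% the number of bits needed to state F's axioms, such that
\[
  F \vdash \text{``}K(x) > m\text{''} \;\Longrightarrow\; m < c_F ,
\]
% so an n-bit axiom system cannot certify that any specific string has
% program-size complexity much beyond n.
```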
Computability and recursion
 Bull. Symbolic Logic
, 1996
"... We consider the informal concept of “computability” or “effective calculability” and two of the formalisms commonly used to define it, “(Turing) computability” and “(general) recursiveness.” We consider their origin, exact technical definition, concepts, history, general English meanings, how they b ..."
Abstract

Cited by 32 (0 self)
 Add to MetaCart
We consider the informal concept of “computability” or “effective calculability” and two of the formalisms commonly used to define it, “(Turing) computability” and “(general) recursiveness.” We consider their origin, exact technical definition, concepts, history, general English meanings, how they became fixed in their present roles, how they were first and are now used, their impact on nonspecialists, how their use will affect the future content of the subject of computability theory, and its connection to other related areas. After a careful historical and conceptual analysis of computability and recursion we make several recommendations in §7 about preserving the intensional differences between the concepts of “computability” and “recursion.” Specifically we recommend that: the term “recursive” should no longer carry the additional meaning of “computable” or “decidable;” functions defined using Turing machines, register machines, or their variants should be called “computable” rather than “recursive;” we should distinguish the intensional difference between Church’s Thesis and Turing’s Thesis, and use the latter particularly in dealing with mechanistic questions; the name of the subject should be “Computability Theory” or simply Computability rather than ...
Computing the noncomputable
 Contemporary Physics
"... We explore in the framework of Quantum Computation the notion of computability, which holds a central position in Mathematics and Theoretical Computer Science. A quantum algorithm that exploits the quantum adiabatic which is equivalent to the Turing halting problem and known to be mathematically non ..."
Abstract

Cited by 30 (7 self)
 Add to MetaCart
We explore in the framework of Quantum Computation the notion of computability, which holds a central position in Mathematics and Theoretical Computer Science. A quantum algorithm that exploits quantum adiabatic evolution is considered for Hilbert’s tenth problem, which is equivalent to the Turing halting problem and known to be mathematically noncomputable. Generalised quantum algorithms are also considered for some other mathematical noncomputables in the same and in different noncomputability classes. The key element of all these algorithms is the measurability of both the values of physical observables and of the quantum-mechanical probability distributions for these values. It is argued that computability, and thus the limits of Mathematics, ought to be determined not ...
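The halting-problem equivalence invoked here rests on the standard diagonal argument, sketched below (textbook material, not the paper's construction; `halts` is a hypothetical decider assumed only to derive the contradiction):

```python
# Diagonalization sketch: suppose a total decider halts(src, inp) existed.

def halts(program_source: str, input_data: str) -> bool:
    """Hypothetical halting decider; assumed to exist for contradiction."""
    raise NotImplementedError("no such total decider can exist")

def diagonal(program_source: str) -> str:
    # Do the opposite of whatever halts() predicts about self-application.
    if halts(program_source, program_source):
        while True:
            pass                  # loop forever if predicted to halt
    return "halted"               # halt if predicted to loop

# Running diagonal on its own source would make halts() wrong either way,
# so no decider exists; Hilbert's tenth problem inherits this barrier.
```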
Enumeration Reducibility, Nondeterministic Computations and Relative ...
 Recursion Theory Week, Oberwolfach 1989, Volume 1432 of Lecture Notes in Mathematics
, 1990
Complexity distortion theory
 in Proc. IEEE Int. Symp. Information Theory
, 1997
"... Abstract—Complexity distortion theory (CDT) is a mathematical framework providing a unifying perspective on media representation. The key component of this theory is the substitution of the decoder in Shannon’s classical communication model with a universal Turing machine. Using this model, the math ..."
Abstract

Cited by 23 (2 self)
 Add to MetaCart
Complexity distortion theory (CDT) is a mathematical framework providing a unifying perspective on media representation. The key component of this theory is the substitution of the decoder in Shannon’s classical communication model with a universal Turing machine. Using this model, the mathematical framework for examining the efficiency of coding schemes is the algorithmic or Kolmogorov complexity. CDT extends this framework to include distortion by defining the complexity distortion function. We show that despite their different natures, CDT and rate distortion theory (RDT) predict asymptotically the same results, under stationary and ergodic assumptions. This closes the circle of representation models, from the probabilistic models of information proposed by Shannon in information and rate distortion theories, to the deterministic algorithmic models proposed by Kolmogorov in Kolmogorov complexity theory and its extension to lossy source coding, CDT.
Index Terms—Kolmogorov complexity, Markov types, rate distortion function, universal coding.
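Although K(x) itself is uncomputable, any real compressor yields a computable upper bound, which gives a quick feel for the algorithmic-versus-probabilistic contrast drawn above. A small sketch (illustrative only; zlib stands in for a universal decoder, and the inputs are assumptions, not the paper's construction):

```python
# Compare an algorithmic-model proxy (compressed size, an upper bound on
# Kolmogorov complexity up to an additive constant) with a first-order
# probabilistic model (empirical byte entropy). Inputs are illustrative.

import math
import os
import zlib

def compressed_bits(data: bytes) -> int:
    return 8 * len(zlib.compress(data, 9))

def empirical_entropy_bits(data: bytes) -> float:
    n = len(data)
    return -n * sum((data.count(b) / n) * math.log2(data.count(b) / n)
                    for b in set(data))

regular = b"ab" * 5000        # strong structure: a tiny program suffices
noise = os.urandom(10000)     # no structure: nearly incompressible

for name, x in (("regular", regular), ("noise", noise)):
    print(name, compressed_bits(x), round(empirical_entropy_bits(x)))
# regular: compressed size falls far below the ~10000-bit entropy figure;
# noise: both sit near 8 bits per byte (~80000 bits).
```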