Results 1 – 10 of 34
Beyond the Turing Test
 J. Logic, Language & Information
"... Abstract. We define the main factor of intelligence as the ability to comprehend, formalising this ability with the help of new constructs based on descriptional complexity. The result is a comprehension test, or Ctest, exclusively defined in terms of universal descriptional machines (e.g universal ..."
Abstract

Cited by 34 (18 self)
Abstract. We define the main factor of intelligence as the ability to comprehend, formalising this ability with the help of new constructs based on descriptional complexity. The result is a comprehension test, or C-test, defined exclusively in terms of universal descriptional machines (e.g. universal Turing machines). Despite the absolute and non-anthropomorphic character of the test, it is equally applicable to both humans and machines. Moreover, it correlates with classical psychometric tests, thus establishing the first firm connection between information-theoretic notions and traditional IQ tests. The Turing Test is compared with the C-test and their joint combination is discussed. As a result, the idea of the Turing Test as a practical test of intelligence should be left behind and replaced by computational and factorial tests of different cognitive abilities, a much more useful approach for progress in artificial intelligence and for many other intriguing questions that lie beyond the Turing Test.
A Formal Definition of Intelligence Based on an Intensional Variant of Algorithmic Complexity
 In Proceedings of the International Symposium of Engineering of Intelligent Systems (EIS'98)
, 1998
"... Machine Due to the current technology of the computers we can use, we have chosen an extremely abridged emulation of the machine that will effectively run the programs, instead of more proper languages, like lcalculus (or LISP). We have adapted the "toy RISC" machine of [Hernndez & H ..."
Abstract

Cited by 31 (17 self)
Machine. Due to the current technology of the computers we can use, we have chosen an extremely abridged emulation of the machine that will effectively run the programs, instead of a more proper language such as the λ-calculus (or LISP). We have adapted the "toy RISC" machine of [Hernández & Hernández 1993], with two remarkable features inherited from its object-oriented coding in C++: it is easily tunable to our needs, and it is efficient. We have made it even more reduced, removing every operand from the instruction set, even for the loop operations. We have only three registers: AX (the accumulator), BX and CX. The operations Q_b we have used for our experiment are shown in Table 1: LOOPTOP decrements CX; if it is not equal to the first element, it jumps to the top of the program.
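To make the flavour of such an operand-less three-register machine concrete, here is a minimal Python sketch. It is not the paper's actual machine: the mnemonics INCAX and SWAPAB, the `run` signature, and the nonzero test in LOOPTOP (the abstract's own condition, "not equal to the first element", is ambiguous in the extracted text) are all illustrative assumptions.

```python
def run(program, cx=0, max_steps=10_000):
    """Interpret a list of operand-less mnemonics; return the final registers."""
    regs = {"AX": 0, "BX": 0, "CX": cx}
    pc = steps = 0
    while pc < len(program) and steps < max_steps:
        op = program[pc]
        steps += 1
        if op == "INCAX":                 # AX := AX + 1
            regs["AX"] += 1
        elif op == "SWAPAB":              # exchange AX and BX
            regs["AX"], regs["BX"] = regs["BX"], regs["AX"]
        elif op == "LOOPTOP":             # CX := CX - 1; jump to top while CX != 0
            regs["CX"] -= 1
            if regs["CX"] != 0:
                pc = 0
                continue
        pc += 1
    return regs

# Each pass increments AX once; LOOPTOP repeats the body CX times.
result = run(["INCAX", "LOOPTOP"], cx=3)  # -> {"AX": 3, "BX": 0, "CX": 0}
```

Note how every instruction is a bare mnemonic with no operand field, which is exactly the reduction the abstract describes.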
Information and Computation: Classical and Quantum Aspects
 REVIEWS OF MODERN PHYSICS
, 2001
"... Quantum theory has found a new field of applications in the realm of information and computation during the recent years. This paper reviews how quantum physics allows information coding in classically unexpected and subtle nonlocal ways, as well as information processing with an efficiency largely ..."
Abstract

Cited by 23 (2 self)
Quantum theory has found a new field of applications in the realm of information and computation in recent years. This paper reviews how quantum physics allows information coding in classically unexpected and subtly non-local ways, as well as information processing with an efficiency largely surpassing that of present and foreseeable classical computers. Some outstanding aspects of classical and quantum information theory are addressed here. Quantum teleportation, dense coding, and quantum cryptography are discussed as a few samples of the impact of quanta on the transmission of information. Quantum logic gates and quantum algorithms are also discussed as instances of the improvement in information processing by a quantum computer. Finally, we provide some examples of current experimental
Information Assurance through Kolmogorov Complexity
, 2001
"... The problem of Information Assurance is approached from the point of view of Kolmogorov Complexity and Minimum Message Length criteria. Several theoretical results are obtained, possible applications are discussed and a new metric for measuring complexity is introduced. Utilization of Kolmogorov Com ..."
Abstract

Cited by 21 (9 self)
The problem of Information Assurance is approached from the point of view of Kolmogorov Complexity and Minimum Message Length criteria. Several theoretical results are obtained, possible applications are discussed, and a new metric for measuring complexity is introduced. The use of Kolmogorov-Complexity-like metrics as conserved parameters to detect abnormal system behavior is explored. Data and process vulnerabilities are put forward as two different dimensions of vulnerability that can be discussed in terms of Kolmogorov Complexity. Finally, these results are used to conduct complexity-based vulnerability analysis. 1. Introduction. Information security (or the lack thereof) is too often dealt with after security has been lost. Back doors are opened, Trojan horses are placed, passwords are guessed, and firewalls are broken down; in general, security is lost as barriers to hostile attackers are breached and one is put in the undesirable position of detecting and patching holes. In ...
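Since Kolmogorov complexity is uncomputable, "complexity-like" metrics of the kind this abstract suggests are in practice often approximated by compressed size. The following sketch (an illustration of that general idea, not the paper's actual metric; the 1.5 ratio and the sample data are arbitrary choices) flags behavior whose compressed size jumps well above a baseline's:

```python
import random
import zlib

def complexity(data: bytes) -> int:
    # Compressed size as a computable stand-in for Kolmogorov complexity.
    return len(zlib.compress(data, 9))

def is_anomalous(baseline: bytes, observed: bytes, ratio: float = 1.5) -> bool:
    # Flag observations whose complexity rises well above the baseline's.
    return complexity(observed) > ratio * complexity(baseline)

normal = b"GET /index.html HTTP/1.0\n" * 40   # regular, highly compressible log
weird = random.Random(0).randbytes(1000)      # incompressible (random) burst

# is_anomalous(normal, weird) -> True; is_anomalous(normal, normal) -> False
```

The "conserved parameter" intuition is that legitimate, repetitive system activity compresses well, so a sudden rise in incompressibility is a cheap signal worth investigating.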
On the computational measurement of intelligence factors
 National Institute of Standards and Technology
, 2000
"... In this paper we develop a computational framework for the measurement of different factors or abilities which are usually found in intelligent behaviours. For this, we first develop a scale for measuring the complexity of an instance of a problem, depending on the descriptional complexity (Levin LT ..."
Abstract

Cited by 15 (8 self)
In this paper we develop a computational framework for the measurement of different factors or abilities which are usually found in intelligent behaviours. For this, we first develop a scale for measuring the complexity of an instance of a problem, depending on the descriptional complexity (Levin LT variant) of the ‘explanation’ of the answer to the problem. We centre on the establishment of both deductive and inductive abilities, and we show that their evaluation settings are special cases of the general framework. Some classical dependencies between them are shown, and a way to separate these dependencies is developed. Finally, some variants of the previous factors, and other possible factors to be taken into account, are investigated. In the end, the application of these measurements to the evaluation of AI progress is discussed.
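The "Levin LT variant" of descriptional complexity referred to above is usually written in the following standard form (a textbook formulation, not quoted from the paper), which penalises a program's length by the logarithm of its running time:

```latex
% Levin's time-bounded complexity Kt: the shortest program p that
% produces x on a universal machine U, with a log-time penalty.
Kt_U(x) = \min_{p} \bigl\{\, |p| + \log t_U(p) \;:\; U(p) = x \,\bigr\}
```

Unlike plain Kolmogorov complexity, this quantity is computable in the limit, which is what makes it usable as the basis of a practical measurement scale.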
What is a Random Sequence?
 The Mathematical Association of America, Monthly
, 2002
"... there laws of randomness? These old and deep philosophical questions still stir controversy today. Some scholars have suggested that our difficulty in dealing with notions of randomness could be gauged by the comparatively late development of probability theory, which had a ..."
Abstract

Cited by 4 (1 self)
Are there laws of randomness? These old and deep philosophical questions still stir controversy today. Some scholars have suggested that our difficulty in dealing with notions of randomness could be gauged by the comparatively late development of probability theory, which had a
The Missing Link - Implementation and Realization of . . .
, 1999
"... The notion of computation has attracted researchers from a wide range of areas, cognitive psychology being one of them. The analogy underlying the (metaphorical) usage of "computer" in cognitive psychology can be succinctly summarized by saying that the mind is to the brain as the program ..."
Abstract

Cited by 2 (0 self)
The notion of computation has attracted researchers from a wide range of areas, cognitive psychology being one of them. The analogy underlying the (metaphorical) usage of "computer" in cognitive psychology can be succinctly summarized by saying that the mind is to the brain as the program is to the hardware. Two main assumptions are buried in this analogy: 1) that the mind can somehow be understood computationally, and 2) that the same kind of relation (the implementation relation) that obtains between programs and computer hardware obtains between minds and brains too. While the first assumption has led to fertile research, the second remained mainly at the level of an assumption. Recently our
Life-Like Computing Beyond the Machine Metaphor
 In: R. Paton
, 1993
"... Introduction The question, what models, if any, can serve to represent the complexity of life, is a very important one. The application of biological ideas to novel software or hardware designs, or the seemingly opposite but in effect closely related task of using existing computers for the study o ..."
Abstract

Cited by 2 (0 self)
Introduction. The question of what models, if any, can serve to represent the complexity of life is a very important one. The application of biological ideas to novel software or hardware designs, or the seemingly opposite but in effect closely related task of using existing computers for the study of life-like phenomena, requires an at least partial clarification of what computers can do. The subject of study of this paper is a foundational question of this kind. Following a few earlier writings [Kampis 1991a, Kampis 1991b], we attempt here to give a short non-technical summary for the non-specialist of a set of general ideas about computer modelling, and to present an account of an operational modelling methodology for dealing with models of life and, in particular, with models of evolving systems. Evolvability is perhaps the most distinctive characteristic of living systems. Many biologists, like J. Maynard Smith [1975, 1986] or R. Dawkins [1985], consider this to be the key to l
Comparative Analysis of Hypercomputational Systems
, 2006
"... In the 1930s, Turing suggested his abstract model for a practical computer, hypothetically visualizing the digital programmable computer long before it was actually invented. His model formed the foundation for every computer made today. The past few years have seen a change in ideas where philosoph ..."
Abstract

Cited by 1 (1 self)
In the 1930s, Turing suggested his abstract model for a practical computer, hypothetically visualizing the digital programmable computer long before it was actually invented. His model formed the foundation for every computer made today. The past few years have seen a change in ideas, with philosophers and scientists suggesting models of hypothetical computing devices which can outperform the Turing machine, performing some calculations the latter is unable to. The Church-Turing Thesis, which the Turing machine model embodies, has raised discussions on whether it could be possible to solve undecidable problems which Turing's model cannot. Models which could solve these problems have gone further to claim abilities relating to quantum computing, relativity theory, and even the modelling of natural biological laws themselves. These so-called ‘hypermachines’ use hypercomputational abilities to make the impossible possible. Various models, belonging to the different disciplines of physics, mathematics and philosophy, have been suggested for these theories. My (primarily research-oriented) project is based on the study and review of these different hypercomputational models and attempts to compare them in terms of computational power. The project focuses on the ability to compare these models of different disciplines on similar grounds and
Computational Processes, Observers and Turing Incompleteness
"... We propose a formal definition of Wolfram’s notion of computational process based on iterated transducers together with a weak observer, a model of computation that captures some aspects of physicslike computation. These processes admit a natural classification into decidable, intermediate and comp ..."
Abstract

Cited by 1 (1 self)
We propose a formal definition of Wolfram’s notion of computational process based on iterated transducers together with a weak observer, a model of computation that captures some aspects of physics-like computation. These processes admit a natural classification into decidable, intermediate and complete, where intermediate processes correspond to recursively enumerable sets of intermediate degree in the classical setting. It is shown that a standard finite injury priority argument will not suffice to establish the existence of an intermediate computational process.