Results 1-10 of 91
Algorithmic information theory
 IBM JOURNAL OF RESEARCH AND DEVELOPMENT
, 1977
Abstract

Cited by 399 (20 self)
This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory. Typical concerns in this approach are, for example, the number of bits of information required to specify an algorithm, or the probability that a program whose bits are chosen by coin flipping produces a given output. During the past few years the definitions of algorithmic information theory have been reformulated. The basic features of the new formalism are presented here and certain results of R. M. Solovay are reported.
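The "number of bits required to specify" an object is uncomputable in general, but compressed size gives a computable upper bound on it. A minimal sketch of that idea, using zlib as the compressor (my choice for illustration; the paper itself defines complexity via universal machines, not any particular compressor):

```python
import random
import zlib

def info_upper_bound(s: bytes) -> int:
    # zlib-compressed size in bytes: a crude, computable upper bound
    # (up to an additive constant) on the information needed to specify s;
    # the true shortest-program length is uncomputable
    return len(zlib.compress(s, 9))

random.seed(0)
patterned = b"01" * 500                                     # highly regular
noisy = bytes(random.getrandbits(8) for _ in range(1000))   # coin-flip bytes

print(info_upper_bound(patterned))   # tiny: the pattern compresses away
print(info_upper_bound(noisy))       # roughly the raw length: incompressible
```

The patterned string, though the same length as the noisy one, needs only a few dozen bytes to describe, which is the intuition behind measuring information content by shortest description.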
Evolving morphologies of simulated 3d organisms based on differential gene expression
 PROCEEDINGS OF THE FOURTH EUROPEAN CONFERENCE ON ARTIFICIAL LIFE, ECAL97
, 1997
Abstract

Cited by 154 (1 self)
Most simulations of biological evolution depend on a rather restricted set of properties. In this paper a richer model, based on differential gene expression, is introduced to control developmental processes in an artificial evolutionary system. Differential gene expression is used to obtain different cell types and to modulate cell division and cell death. One of the advantages of using developmental processes in evolutionary systems is the reduction of the information needed in the genome to encode e.g. shapes or cell types, which results in better scaling behavior of the system. My results showed that the shaping of multicellular organisms in 3d is possible with the proposed system.
The thermodynamics of computation: a review
 In International Journal of Theoretical Physics
Abstract

Cited by 108 (2 self)
Computers may be thought of as engines for transforming free energy into waste heat and mathematical work. Existing electronic computers dissipate energy vastly in excess of the mean thermal energy kT, for purposes such as maintaining volatile storage devices in a bistable condition, synchronizing and standardizing signals, and maximizing switching speed. On the other hand, recent models due to Fredkin and Toffoli show that in principle a computer could compute at finite speed with zero energy dissipation and zero error. In these models, a simple assemblage of simple but idealized mechanical parts (e.g., hard spheres and flat plates) determines a ballistic trajectory isomorphic with the desired computation, a trajectory therefore not foreseen in detail by the builder of the computer. In a classical or semiclassical setting, ballistic models are unrealistic because they require the parts to be assembled with perfect precision and isolated from thermal noise, which would eventually randomize the trajectory and lead to errors. Possibly quantum effects could be exploited to prevent this undesired equipartition of the kinetic energy. Another family of models may be called
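The scale of "the mean thermal energy kT" that the abstract compares against can be worked out directly. This short calculation evaluates kT ln 2 (Landauer's bound on dissipation per erased bit) at room temperature; the choice of 300 K is mine, for illustration:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact SI value)
T = 300.0            # room temperature in K (illustrative choice)

# kT ln 2: the minimum heat that must be dissipated to erase one bit
landauer_joules = k_B * T * math.log(2)
print(f"kT ln 2 at {T:.0f} K = {landauer_joules:.2e} J")  # ~2.9e-21 J
```

Real logic gates of the era dissipated many orders of magnitude more than this per switching event, which is the gap the abstract is pointing at; the Fredkin-Toffoli models show the bound itself can in principle be evaded by avoiding erasure altogether.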
Computation at the onset of chaos
 The Santa Fe Institute, Westview
, 1988
Abstract

Cited by 99 (16 self)
Computation at levels beyond storage and transmission of information appears in physical systems at phase transitions. We investigate this phenomenon using minimal computational models of dynamical systems that undergo a transition to chaos as a function of a nonlinearity parameter. For period-doubling and band-merging cascades, we derive expressions for the entropy, the interdependence of machine complexity and entropy, and the latent complexity of the transition to chaos. At the transition, deterministic finite automaton models diverge in size. Although there is no regular or context-free Chomsky grammar in this case, we give finite descriptions at the higher computational level of context-free Lindenmayer systems. We construct a restricted indexed context-free grammar and its associated one-way nondeterministic nested stack automaton for the cascade limit language. This analysis of a family of dynamical systems suggests a complexity theoretic description of phase transitions based on the informational diversity and computational complexity of observed data that is independent of particular system control parameters. The approach gives a much more refined picture of the architecture of critical states than is available via
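The period-doubling cascade the abstract refers to is easy to observe numerically. The sketch below (my own illustration, not the paper's machinery) iterates the logistic map x -> r x (1 - x) past its transient and reports the period of the attractor at two parameter values inside different bands of the cascade:

```python
def attractor_period(r: float, transient: int = 2000,
                     probe: int = 256, tol: float = 1e-6) -> int:
    """Period of the logistic-map attractor at parameter r, found by
    iterating past the transient and looking for a near-repeat of the
    state; returns 0 if no period <= probe is detected (e.g. chaos)."""
    x = 0.5
    for _ in range(transient):
        x = r * x * (1 - x)
    x0, y = x, x
    for p in range(1, probe + 1):
        y = r * y * (1 - y)
        if abs(y - x0) < tol:
            return p
    return 0

print(attractor_period(3.2))   # inside the period-2 band
print(attractor_period(3.5))   # inside the period-4 band
```

As r increases toward the accumulation point of the cascade, the detected period doubles repeatedly, and any finite-state description of the orbit must grow, which is the divergence of finite automaton model size that the abstract describes.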
Making sense of randomness: Implicit encoding as a basis for judgment
 Psychological Review
, 1997
Abstract

Cited by 52 (1 self)
People attempting to generate random sequences usually produce more alternations than expected by chance. They also judge overalternating sequences as maximally random. In this article, the authors review findings, implications, and explanatory mechanisms concerning subjective randomness. The authors next present the general approach of the mathematical theory of complexity, which identifies the length of the shortest program for reproducing a sequence with its degree of randomness. They describe three experiments, based on mean group responses, indicating that the perceived randomness of a sequence is better predicted by various measures of its encoding difficulty than by its objective randomness. These results seem to imply that in accordance with the complexity view, judging the extent of a sequence's randomness is based on an attempt to mentally encode it. The experience of randomness may result when this attempt fails. Judging a situation as more or less random is often the key to important cognitions and behaviors. Perceiving a situation as nonchance calls for explanations, and it marks the onset of inductive inference (Lopes, 1982). Lawful environments encourage a coping orientation. One may try to control a situation
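The two quantities the abstract relates, a sequence's alternation rate and its encoding difficulty, can both be computed for short binary strings. A sketch, with compressed length standing in for encoding difficulty (my proxy; the paper's experiments used behavioral measures, not a compressor):

```python
import zlib

def alternation_rate(seq: str) -> float:
    # Fraction of adjacent symbol pairs that differ. A fair-coin
    # sequence has expected rate 0.5, yet over-alternating sequences
    # (rates above 0.5) are the ones people judge most random.
    flips = sum(a != b for a, b in zip(seq, seq[1:]))
    return flips / (len(seq) - 1)

def encoding_difficulty(seq: str) -> int:
    # Compressed length as a rough stand-in for how hard the sequence
    # is to encode, i.e. the length of a short description of it.
    return len(zlib.compress(seq.encode(), 9))

print(alternation_rate("0101010101"))   # 1.0: maximal alternation
print(alternation_rate("0011010110"))   # mixed runs and flips
```

The complexity view's prediction is that judged randomness should track `encoding_difficulty` rather than objective chance, so a perfectly alternating string, despite its extreme alternation rate, is trivially encodable and should eventually be recognized as non-random.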
Incompleteness Theorems for Random Reals
, 1987
Abstract

Cited by 43 (0 self)
We obtain some dramatic results using statistical mechanics/thermodynamics kinds of arguments concerning randomness, chaos, unpredictability, and uncertainty in mathematics. We construct an equation involving only whole numbers and addition, multiplication, and exponentiation, with the property that if one varies a parameter and asks whether the number of solutions is finite or infinite, the answer to this question is indistinguishable from the result of independent tosses of a fair coin. This yields a number of powerful Gödel incompleteness-type results concerning the limitations of the axiomatic method, in which entropy-information measures are used. © 1987 Academic Press, Inc.

1. Introduction
It is now half a century since Turing published his remarkable paper On Computable Numbers, with an Application to the Entscheidungsproblem (Turing [15]). In that paper Turing constructs a universal Turing machine that can simulate any other Turing machine. He also use...
Cell Interactions as a Control Tool of Developmental Processes for Evolutionary Robotics
 From Animals to Animats 4: Proceedings of the Fourth International Conference on Simulation of Adaptive Behavior
, 1996
Abstract

Cited by 32 (1 self)
This paper describes new genetic and developmental principles for an artificial evolutionary system (AES) and reports the first simulation results. Emphasis is placed on those developmental processes which reduce the length of the genome needed to code for a given problem. We exemplify the usefulness of developmental processes with cell growth, cell differentiation and the creation of neural control structures, which we used to control a real-world autonomous agent. The importance of including developmental processes rests largely on the fact that a neural network can be specified implicitly by using cell-to-cell communication.

1 Introduction
In the field of autonomous agents, different approaches have been studied. One of them, the evolutionary approach, aims to produce increasingly sophisticated autonomous agents with no need to care about the details of the robot's control structure. As others (Nolfi et al., 1994; Cangelosi et al., 1994; Daellert & Beer, 1994; Harvey et al., 1995), we are convinc...
The Application Of Algorithmic Probability to Problems in Artificial Intelligence
 in Uncertainty in Artificial Intelligence, Kanal, L.N. and Lemmer, J.F. (Eds), Elsevier Science Publishers B.V
, 1986
Abstract

Cited by 29 (2 self)
INTRODUCTION We will cover two topics. First, Algorithmic Probability: the motivation for defining it, how it overcomes difficulties in other formulations of probability, some of its characteristic properties and successful applications. Second, we will apply it to problems in A.I., where it promises to give near-optimum search procedures for two very broad classes of problems. A strong motivation for revising classical concepts of probability has come from the analysis of human problem solving. When working on a difficult problem, a person is in a maze in which he must make choices of possible courses of action. If the problem is a familiar one, the choices will all be easy. If it is not familiar, there can be much uncertainty in each choice, but choices must somehow be made. One basis for choice might be the probability of each choice leading to a quick solution, this probability being based on experience in this problem and in problems like it. A good reason for using proba...
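The core of algorithmic probability is that each description of length L contributes weight 2^(-L), so outcomes with short descriptions dominate. A minimal sketch of that weighting over hypothetical search choices (all names and bit strings below are illustrative, not from the paper):

```python
def algorithmic_prior(descriptions: dict) -> dict:
    """Given outcome -> list of bit-string descriptions producing it,
    weight each description by 2^(-length) and normalize into a
    probability over outcomes (a finite stand-in for the universal
    prior, which sums over all programs of a universal machine)."""
    raw = {out: sum(2.0 ** -len(p) for p in progs)
           for out, progs in descriptions.items()}
    total = sum(raw.values())
    return {out: w / total for out, w in raw.items()}

# Hypothetical choices in a search tree, with the descriptions found so far:
prior = algorithmic_prior({
    "choice_A": ["010", "1101"],    # short descriptions: heavily favored
    "choice_B": ["1110010"],        # only one long description
})
print(prior)
```

Used as a guide for search, such a prior directs effort toward the choices most simply describable from past experience, which is the "near optimum search procedure" role the abstract claims for it.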
A system for incremental learning based on algorithmic probability
 Proceedings of the Sixth Israeli Conference on Artificial Intelligence, Computer Vision and Pattern Recognition
, 1989
Abstract

Cited by 22 (7 self)
We have employed Algorithmic Probability Theory to construct a system for machine learning of great power and generality. The principal thrust of present research is the design of sequences of problems to train this system. Current programs for machine learning are limited in the kinds of concepts accessible to them, the kinds of problems they can learn to solve, and in the efficiency with which they learn — both in computation time needed and/or in amount of data needed for learning. Algorithmic Probability Theory provides a general model of the learning process that enables us to understand and surpass many of these limitations. Starting with a machine containing a small set of concepts, we use a carefully designed sequence of problems of increasing difficulty to bring the machine to a high level of problem solving skill. The use of training sequences of problems for machine knowledge acquisition promises to yield Expert Systems that will be easier to train and free of the brittleness that characterizes the narrow specialization of present day systems of this sort. It is also expected that the present research will give needed insight in the design of training sequences for human learning.