Results 1–10 of 63
Computation at the onset of chaos
The Santa Fe Institute, Westview, 1988
Abstract

Cited by 83 (14 self)
Computation at levels beyond storage and transmission of information appears in physical systems at phase transitions. We investigate this phenomenon using minimal computational models of dynamical systems that undergo a transition to chaos as a function of a nonlinearity parameter. For period-doubling and band-merging cascades, we derive expressions for the entropy, the interdependence of machine complexity and entropy, and the latent complexity of the transition to chaos. At the transition, deterministic finite automaton models diverge in size. Although there is no regular or context-free Chomsky grammar in this case, we give finite descriptions at the higher computational level of context-free Lindenmayer systems. We construct a restricted indexed context-free grammar and its associated one-way nondeterministic nested stack automaton for the cascade limit language. This analysis of a family of dynamical systems suggests a complexity-theoretic description of phase transitions based on the informational diversity and computational complexity of observed data that is independent of particular system control parameters. The approach gives a much more refined picture of the architecture of critical states than is available via ...
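The Lindenmayer-system description mentioned above can be made concrete in a rough sketch. The substitution 1 → 10, 0 → 11 used below is the standard period-doubling substitution (an illustrative choice, not the paper's exact construction, and all names are this sketch's own): iterating it generates the cascade limit sequence, whose subword counts grow far below 2^n, even though finite-automaton models of the sequence diverge in size.

```python
# Sketch: generate the period-doubling symbolic sequence with an L-system
# substitution and count its distinct subwords (blocks). The rule 1 -> 10,
# 0 -> 11 is the standard period-doubling substitution; it illustrates the
# kind of finite higher-level description referred to in the abstract.

RULES = {"1": "10", "0": "11"}

def l_system(axiom: str, steps: int) -> str:
    """Apply the substitution `steps` times to `axiom`."""
    word = axiom
    for _ in range(steps):
        word = "".join(RULES[symbol] for symbol in word)
    return word

seq = l_system("1", 10)  # 2**10 = 1024 symbols

# Number of distinct length-n subwords: sub-exponential growth means zero
# topological entropy, despite the blow-up of finite-automaton descriptions.
subword_counts = {
    n: len({seq[i:i + n] for i in range(len(seq) - n + 1)})
    for n in range(1, 9)
}
print(seq[:16])        # 1011101010111011
print(subword_counts)  # grows roughly linearly in n, far below 2**n
```

Note that the block "00" never occurs in this sequence, one of the structural constraints a finite description must capture.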
The calculi of emergence: Computation, dynamics, and induction
Physica D, 1994
Abstract

Cited by 77 (14 self)
Defining structure and detecting the emergence of complexity in nature are inherently subjective, though essential, scientific activities. Despite the difficulties, these problems can be analyzed in terms of how model-building observers infer from measurements the computational capabilities embedded in nonlinear processes. An observer’s notion of what is ordered, what is random, and what is complex in its environment depends directly on its computational resources: the amount of raw measurement data, of memory, and of time available for estimation and inference. The discovery of structure in an environment depends more critically and subtly, though, on how those resources are organized. The descriptive power of the observer’s chosen (or implicit) computational model class, for example, can be an overwhelming determinant in finding regularity in data. This paper presents an overview of an inductive framework, hierarchical machine reconstruction, in which the emergence of complexity is associated with the innovation of new computational model classes. Complexity metrics for detecting structure and quantifying emergence, along with an analysis of the constraints on the dynamics of innovation, are outlined. Illustrative examples are drawn from the onset of unpredictability in nonlinear systems, finitary nondeterministic processes, and ...
Computational mechanics: Pattern and prediction, structure and simplicity
Journal of Statistical Physics, 1999
Abstract

Cited by 43 (8 self)
Computational mechanics, an approach to structural complexity, defines a process’s causal states and gives a procedure for finding them. We show that the causal-state representation (an ε-machine) is the minimal one consistent with ...
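The causal-state idea can be illustrated with a toy reconstruction (a sketch under assumptions, not the authors' algorithm): group the histories of a process by their empirical next-symbol distributions, and merge histories that predict the same future. The "golden mean" process used below (no two consecutive 0s) and the merging tolerance are this sketch's own illustrative choices.

```python
import random

random.seed(0)

# Illustrative process: after a 0 the next symbol is always 1; after a 1
# the next symbol is 0 or 1 with equal probability. Its minimal
# causal-state model has exactly two states.
def golden_mean_sequence(n: int) -> str:
    out = ["1"]
    for _ in range(n - 1):
        out.append("1" if out[-1] == "0" else random.choice("01"))
    return "".join(out)

seq = golden_mean_sequence(100_000)

# Empirical P(next = 1 | history) for every length-2 history that occurs.
L = 2
stats = {}
for i in range(len(seq) - L):
    hist, nxt = seq[i:i + L], seq[i + L]
    ones, total = stats.get(hist, (0, 0))
    stats[hist] = (ones + (nxt == "1"), total + 1)

predictive = {h: ones / total for h, (ones, total) in stats.items()}

# Merge histories whose predictions agree within a hand-picked tolerance:
# the surviving groups approximate the causal states.
states = []
for hist, p in sorted(predictive.items()):
    for group in states:
        if abs(group["p"] - p) < 0.1:
            group["histories"].append(hist)
            break
    else:
        states.append({"p": p, "histories": [hist]})

print(len(states), "approximate causal states")
for s in states:
    print(sorted(s["histories"]), "-> P(1) ~", round(s["p"], 2))
```

The histories 01 and 11 collapse into one state (next symbol fair), while 10 forms its own (next symbol certainly 1), recovering the two-state structure.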
Predictability, Complexity, and Learning
2001
Abstract

Cited by 30 (2 self)
We define predictive information Ipred(T) as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation times T: Ipred(T) can remain finite, grow logarithmically, or grow as a fractional power law. If the time series allows us to learn a model with a finite number of parameters, then Ipred(T) grows logarithmically with a coefficient that counts the dimensionality of the model space. In contrast, power-law growth is associated, for example, with the learning of infinite-parameter (or nonparametric) models such as continuous functions with smoothness constraints. There are connections between the predictive information and measures of complexity that have been defined both in learning theory and in the analysis of physical systems through statistical mechanics and dynamical systems theory. Furthermore, in the same way that entropy provides the unique measure of available information consistent with some simple and plausible conditions, we argue that the divergent part of Ipred(T) provides the unique measure for the complexity of dynamics underlying a time series. Finally, we discuss how these ideas may be useful in problems in physics, statistics, and biology.
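For a process with finite memory, Ipred(T) should saturate rather than grow. A minimal numerical sketch (not the authors' code; the chain, sample size, and plug-in estimator are illustrative choices) for a binary symmetric Markov chain, where the saturation value is 1 − H2(p) bits:

```python
import math
import random
from collections import Counter

random.seed(1)

# Binary symmetric Markov chain: flip the previous symbol with probability p.
# For this finite-memory process Ipred(T) stays finite, saturating at the
# excess entropy 1 - H2(p) (about 0.531 bits for p = 0.1).
p = 0.1
n = 200_000
seq = [0]
for _ in range(n - 1):
    seq.append(seq[-1] ^ (random.random() < p))

def entropy(counts: Counter) -> float:
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def predictive_information(seq, T: int) -> float:
    """Plug-in estimate of I(past_T ; future_T) from sliding windows."""
    pasts, futures, joints = Counter(), Counter(), Counter()
    for i in range(T, len(seq) - T + 1):
        past = tuple(seq[i - T:i])
        future = tuple(seq[i:i + T])
        pasts[past] += 1
        futures[future] += 1
        joints[(past, future)] += 1
    return entropy(pasts) + entropy(futures) - entropy(joints)

for T in (1, 2, 3):
    print(T, round(predictive_information(seq, T), 3))
```

For a logarithmically or power-law divergent Ipred(T) one would instead need a process with long-range structure; the finite case above is the simplest of the three behaviors the abstract describes.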
Multifield Visualization Using Local Statistical Complexity
IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2007
Abstract

Cited by 26 (4 self)
Modern unsteady (multi)field visualizations require an effective reduction of the data to be displayed. From a huge amount of information, the most informative parts have to be extracted. Instead of the fuzzy, application-dependent notion of a feature, a new approach based on information-theoretic concepts is introduced in this paper to detect important regions. This is accomplished by extending the concept of local statistical complexity from finite-state cellular automata to discretized (multi)fields. Thus, informative parts of the data can be highlighted in an application-independent, purely mathematical sense. The new measure can be applied to unsteady (multi)fields on regular grids in any application domain. The ability to detect and visualize important parts is demonstrated using diffusion, flow, and weather simulations.
Dynamics, computation, and the “edge of chaos”: A reexamination
Complexity: Metaphors, Models, and Reality, 1994
Abstract

Cited by 25 (2 self)
In this paper we review previous work and present new work concerning the relationship between dynamical systems theory and computation. In particular, we review work by Langton [21] and Packard [29] on the relationship between dynamical behavior and computational capability in cellular automata (CAs). We present results from an experiment similar to the one described by Packard [29], which was cited as evidence for the hypothesis that rules capable of performing complex computations are most likely to be found at a phase transition between ordered and chaotic behavioral regimes for CAs (the “edge of chaos”). Our experiment produced very different results from the original experiment, and we suggest that the interpretation of the original results is not correct. We conclude by discussing general issues related to dynamics, computation, and the “edge of chaos” in cellular automata.
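The parameter behind the "edge of chaos" experiments, Langton's λ, is simply the fraction of neighborhood configurations mapped to a non-quiescent state; for elementary (two-state, radius-1) CAs it reduces to the density of 1s in the 8-entry rule table. A minimal sketch (function names are this sketch's own, not from the paper):

```python
def lambda_parameter(rule_number: int) -> float:
    """Langton's lambda for an elementary (2-state, radius-1) CA rule:
    the fraction of the 8 neighborhood configurations mapped to the
    non-quiescent state 1, taking state 0 as quiescent."""
    return sum((rule_number >> i) & 1 for i in range(8)) / 8

def step(cells: list, rule_number: int) -> list:
    """One synchronous update of an elementary CA with periodic boundaries."""
    n = len(cells)
    return [
        (rule_number >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

for rule in (0, 30, 110, 255):
    print(rule, lambda_parameter(rule))
```

Rule 110 (λ = 0.625) and rule 30 (λ = 0.5) illustrate why λ alone is contentious: the two rules sit at similar λ values yet behave very differently, which is part of the reexamination the abstract describes.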
Syntactic Measures of Complexity
1999
Abstract

Cited by 23 (2 self)
Contents (excerpt): Declaration; Notes of copyright and the ownership of intellectual property rights; The Author; Acknowledgements; 1 Introduction (1.1 Background; 1.2 The Style of Approach; 1.3 Motivation; 1.4 Style of Presentation; 1.5 Outline of the Thesis); 2 Models and Modelling (2.1 Some Types of Models; 2.2 Combinations of Models; 2.3 Parts of the Modelling Apparatus; 2.4 Models in Machine Learning; 2.5 The Philosophical Background to the Rest of this Thesis); 3 Problems and Properties (3.1 Examples of Common Usage: 3.1.1 A case of nails; 3.1.2 Writing a thesis; 3.1.3 Mathematics; 3.1.4 A gas; 3.1.5 An ant hill; 3.1.6 A car engine; 3.1.7 A cell as part of an organism ...)
On the relationship between complexity and entropy for Markov chains and regular languages
Complex Systems, 1991
Abstract

Cited by 22 (2 self)
Using the past-future mutual information as a measure of complexity, the relation between the complexity and the Shannon entropy is determined analytically for sequences generated by Markov chains and regular languages. It is emphasized that, given an entropy value, there are many possible complexity values, and vice versa; that is, the relationship between complexity and entropy is not one-to-one, but rather many-to-one or one-to-many. It is also emphasized that there are structures in the complexity-versus-entropy plots, and these structures depend on the details of a Markov chain or a regular-language grammar.
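The many-to-one relationship is easy to see in a toy computation (an illustrative sketch, not the paper's own calculation; function names are this sketch's): a symmetric two-state Markov chain and an i.i.d. biased coin can share the same Shannon entropy rate while their past-future mutual informations (complexities) differ.

```python
import math

def h2(p: float) -> float:
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def markov_measures(a: float, b: float):
    """Entropy rate and past-future mutual information (excess entropy)
    of the stationary 2-state chain with P(0->1) = a and P(1->0) = b."""
    pi1 = a / (a + b)                          # stationary P(state 1)
    h = (1 - pi1) * h2(a) + pi1 * h2(b)        # entropy rate, bits/symbol
    E = h2(pi1) - h                            # excess entropy = H(pi) - h
    return h, E

# Symmetric chain with flip probability 0.1 versus an i.i.d. biased coin
# with the same per-symbol entropy: identical entropy rate, different
# complexity (the i.i.d. case has zero past-future mutual information).
print(markov_measures(0.1, 0.1))   # h ~ 0.469, E ~ 0.531
print(markov_measures(0.1, 0.9))   # equal rows -> i.i.d., E ~ 0
```

Sweeping (a, b) over many chains and plotting E against h reproduces the kind of structured complexity-versus-entropy plot the abstract refers to.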
Is there chaos in the brain? II. Experimental evidence and related models
C. R. Biol., 2003
Abstract

Cited by 22 (0 self)
The search for chaotic patterns has occupied numerous investigators in neuroscience, as in many other fields of science. Their results and main conclusions are reviewed in the light of the most recent criteria that need to be satisfied since the first descriptions of the surrogate strategy. The methods used in each of these studies have almost invariably combined the analysis of experimental data with simulations using formal models, often based on modified Hodgkin and Huxley equations and/or on the Hindmarsh and Rose models of bursting neurons. Due to technical limitations, the results of these simulations have prevailed over experimental ones in studies of the nonlinear properties of large cortical networks and higher brain functions. Yet, although a convincing proof of chaos (as defined mathematically) has only been obtained at the level of axons and of single and coupled cells, convergent results can be interpreted as compatible with the notion that signals in the brain are distributed according to chaotic patterns at all levels of its various forms of hierarchy. This chronological account of the main landmarks of nonlinear neuroscience follows an earlier publication [Faure, Korn, C. R. Acad. Sci. Paris, Ser. III 324 (2001) 773–793] that focused on the basic concepts of nonlinear dynamics, on the methods of investigation that allow chaotic processes to be distinguished from stochastic ones, and on the rationale for envisioning their control using external perturbations. Here we present the data and main arguments that support the existence of chaos at all levels, from the simplest to the most complex forms of organization of the nervous system.