Results 1–10 of 64
The induction of dynamical recognizers
 Machine Learning
, 1991
"... A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained " on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the learning pro ..."
Abstract

Cited by 214 (16 self)
 Add to MetaCart
A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained" on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the learning process illustrates a new form of mechanical inference: induction by phase transition. A small weight adjustment causes a "bifurcation" in the limit behavior of the network. This phase transition corresponds to the onset of the network's capacity for generalizing to arbitrary-length strings. Second, a study of the automata resulting from the acquisition of previously published training sets indicates that while the architecture is not guaranteed to find a minimal finite automaton consistent with the given exemplars, which is an NP-hard problem, the architecture does appear capable of generating non-regular languages by exploiting fractal and chaotic dynamics. I end the paper with a hypothesis relating linguistic generative capacity to the behavioral regimes of nonlinear dynamical systems.
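The "higher order" architecture above updates its state through multiplicative (second-order) weights that couple the current state with the current input symbol. A minimal sketch of one such update, assuming a sigmoid activation and one-hot input encoding; the function names and the acceptance convention (thresholding the first state unit) are illustrative, not the paper's exact formulation:

```python
import numpy as np

def step(W, z, x):
    """One second-order update: z_k(t+1) = sigmoid(sum_ij W[k,i,j] * z[i] * x[j])."""
    return 1.0 / (1.0 + np.exp(-np.einsum("kij,i,j->k", W, z, x)))

def recognize(W, z0, string, symbols, threshold=0.5):
    """Run the network over a string of symbols; accept if the first
    state unit ends above the threshold (an illustrative convention)."""
    z = z0.copy()
    for ch in string:
        x = np.zeros(len(symbols))
        x[symbols.index(ch)] = 1.0  # one-hot encoding of the input symbol
        z = step(W, z, x)
    return bool(z[0] > threshold)
```

Because each input symbol selects a different effective weight matrix, training can push the iterated map through a bifurcation, which is the "induction by phase transition" the abstract describes.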
Revisiting the edge of chaos: Evolving cellular automata to perform computations
 Complex Systems
, 1993
"... We present results from an experiment similar to one performed by Packard [24], in which a genetic algorithm is used to evolve cellular automata (CA) to perform a particular computational task. Packard examined the frequency of evolved CA rules as a function of Langton’s λ parameter [17], and interp ..."
Abstract

Cited by 99 (10 self)
 Add to MetaCart
We present results from an experiment similar to one performed by Packard [24], in which a genetic algorithm is used to evolve cellular automata (CA) to perform a particular computational task. Packard examined the frequency of evolved CA rules as a function of Langton's λ parameter [17], and interpreted the results of his experiment as giving evidence for the following two hypotheses: (1) CA rules able to perform complex computations are most likely to be found near "critical" λ values, which have been claimed to correlate with a phase transition between ordered and chaotic behavioral regimes for CA; (2) when CA rules are evolved to perform a complex computation, evolution will tend to select rules with λ values close to the critical values. Our experiment produced very different results, and we suggest that the interpretation of the original results is not correct. We also review and discuss issues related to λ, dynamical-behavior classes, and computation in CA. The main constructive results of our study are identifying the emergence and competition of computational strategies and analyzing the central role of symmetries in an evolutionary system. In particular, we demonstrate how symmetry breaking can impede the evolution toward higher computational capability.
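Langton's λ, around which both hypotheses turn, is simply the fraction of rule-table entries that map to a non-quiescent state. A minimal sketch for elementary (binary, radius-1) CA under the standard Wolfram rule numbering; the function names are illustrative, and the papers in question actually study larger-radius binary rules:

```python
def eca_rule_table(rule_number):
    """Rule table of an elementary CA, keyed by the 8 possible
    (left, center, right) neighborhoods, using Wolfram numbering."""
    return {tuple((n >> i) & 1 for i in (2, 1, 0)): (rule_number >> n) & 1
            for n in range(8)}

def langton_lambda(rule_table, quiescent=0):
    """Langton's lambda: the fraction of neighborhood configurations
    whose output differs from the designated quiescent state."""
    outputs = list(rule_table.values())
    return sum(o != quiescent for o in outputs) / len(outputs)
```

For a binary CA this reduces to the density of 1s in the rule table, so rule 110 (binary 01101110) has λ = 5/8.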
The calculi of emergence: Computation, dynamics, and induction
 Physica D
, 1994
"... Defining structure and detecting the emergence of complexity in nature are inherently subjective, though essential, scientific activities. Despite the difficulties, these problems can be analyzed in terms of how modelbuilding observers infer from measurements the computational capabilities embedded ..."
Abstract

Cited by 77 (14 self)
 Add to MetaCart
Defining structure and detecting the emergence of complexity in nature are inherently subjective, though essential, scientific activities. Despite the difficulties, these problems can be analyzed in terms of how model-building observers infer from measurements the computational capabilities embedded in nonlinear processes. An observer's notion of what is ordered, what is random, and what is complex in its environment depends directly on its computational resources: the amount of raw measurement data, of memory, and of time available for estimation and inference. The discovery of structure in an environment depends more critically and subtly, though, on how those resources are organized. The descriptive power of the observer's chosen (or implicit) computational model class, for example, can be an overwhelming determinant in finding regularity in data. This paper presents an overview of an inductive framework, hierarchical-machine reconstruction, in which the emergence of complexity is associated with the innovation of new computational model classes. Complexity metrics for detecting structure and quantifying emergence, along with an analysis of the constraints on the dynamics of innovation, are outlined. Illustrative examples are drawn from the onset of unpredictability in nonlinear systems, finitary nondeterministic processes, and
Extraction of Rules from Discrete-time Recurrent Neural Networks
, 1996
"... The extraction of symbolic knowledge from trained neural networks and the direct encoding of (partial) knowledge into networks prior to training are important issues. They allow the exchange of information between symbolic and connectionist knowledge representations. The focas of this paper is on t ..."
Abstract

Cited by 61 (15 self)
 Add to MetaCart
The extraction of symbolic knowledge from trained neural networks and the direct encoding of (partial) knowledge into networks prior to training are important issues. They allow the exchange of information between symbolic and connectionist knowledge representations. The focus of this paper is on the quality of the rules that are extracted from recurrent neural networks. Discrete-time recurrent neural networks can be trained to correctly classify strings of a regular language. Rules defining the learned grammar can be extracted from networks in the form of deterministic finite-state automata (DFAs) by applying clustering algorithms in the output space of recurrent state neurons. Our algorithm can extract different finite-state automata that are consistent with a training set from the same network. We compare the generalization performances of these different models and the trained network, and we introduce a heuristic that permits us to choose among the consistent DFAs the model which best approximates the learned regular grammar.
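The clustering-based extraction described above can be sketched by quantizing the continuous state space and recording the transitions the network induces between quantized regions. Here a simple hypercube grid stands in for the clustering algorithm, and `extract_dfa` is an illustrative name, not the authors' procedure:

```python
import numpy as np

def extract_dfa(step, z0, alphabet, strings, bins=4):
    """Sketch of DFA extraction from a trained recurrent network:
    quantize each state vector onto a grid (a stand-in for clustering),
    then record the transitions observed while processing the strings.
    Returns (states, transitions), with transitions keyed by
    (state id, symbol index)."""
    def label(z):
        # Assign each state vector to a hypercube of the [0,1]^d grid.
        return tuple(np.minimum((z * bins).astype(int), bins - 1))
    states, trans = {label(z0): 0}, {}
    for s in strings:
        z = z0.copy()
        for ch in s:
            src = states.setdefault(label(z), len(states))
            z = step(z, ch)
            dst = states.setdefault(label(z), len(states))
            trans[(src, alphabet.index(ch))] = dst
    return states, trans
```

Different grid resolutions (or clusterings) can yield different consistent DFAs from the same network, which is exactly the multiplicity the abstract's selection heuristic addresses.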
Turbulent Pattern Bases for Cellular Automata
 Physica D
, 1993
"... Unpredictable patterns generated by cellular automata (CA) can be decomposed with respect to a turbulent, positive entropy rate pattern basis. The resulting filtered patterns uncover significant structural organization in a CA's dynamics and information processing capabilities. We illustrate the dec ..."
Abstract

Cited by 46 (14 self)
 Add to MetaCart
Unpredictable patterns generated by cellular automata (CA) can be decomposed with respect to a turbulent, positive entropy rate pattern basis. The resulting filtered patterns uncover significant structural organization in a CA's dynamics and information processing capabilities. We illustrate the decomposition technique by analyzing a binary, range-2 cellular automaton having two invariant chaotic domains of different complexities and entropies. Once identified, the domains are seen to organize the CA's state space and to dominate its evolution. Starting from the domains' structures, we show how to construct a finite-state transducer that performs nonlinear spatial filtering such that the resulting space-time patterns reveal the domains and the intervening walls and dislocations. To show the statistical consequences of domain detection, we compare the entropy and complexity densities of each domain with the globally averaged quantities. A more graphical comparison uses difference patter...
Computational mechanics: Pattern and prediction, structure and simplicity
 Journal of Statistical Physics
, 1999
"... Computational mechanics, an approach to structural complexity, defines a process’s causal states and gives a procedure for finding them. We show that the causalstate representation—an Emachine—is the minimal one consistent with ..."
Abstract

Cited by 43 (8 self)
 Add to MetaCart
Computational mechanics, an approach to structural complexity, defines a process's causal states and gives a procedure for finding them. We show that the causal-state representation (an ε-machine) is the minimal one consistent with
Computation in cellular automata: A selected review
 Nonstandard Computation
, 1996
"... Cellular automata (CAs) are decentralized spatially extended systems consisting of large numbers of simple identical components with local connectivity. Such systems have the potential to perform complex computations with a high degree of efficiency and robustness, as well as to model the behavior o ..."
Abstract

Cited by 36 (2 self)
 Add to MetaCart
Cellular automata (CAs) are decentralized spatially extended systems consisting of large numbers of simple identical components with local connectivity. Such systems have the potential to perform complex computations with a high degree of efficiency and robustness, as well as to model the behavior of complex systems in nature. For these reasons CAs and related architectures have
Analysis of Dynamical Recognizers
 NEURAL COMPUTATION
, 1996
"... Pollack (1991) demonstrated that secondorder recurrent neural networks can act as dynamical recognizers for formal languages when trained on positive and negative examples, and observed both phase transitions in learning and IFSlike fractal state sets. Followon work focused mainly on the extra ..."
Abstract

Cited by 33 (5 self)
 Add to MetaCart
Pollack (1991) demonstrated that second-order recurrent neural networks can act as dynamical recognizers for formal languages when trained on positive and negative examples, and observed both phase transitions in learning and IFS-like fractal state sets. Follow-on work focused mainly on the extraction and minimization of a finite-state automaton (FSA) from the trained network. However, such networks are capable of inducing languages which are not regular, and therefore not equivalent to any FSA. Indeed, it may be simpler for a small network to fit its training data by inducing such a non-regular language. But when is the network's language not regular? In this paper, using a low-dimensional network capable of learning all the Tomita data sets, we present an empirical method for testing whether the language induced by the network is regular or not. We also provide a detailed machine analysis of trained networks for both regular and non-regular languages.
Emergence of Netgrammar in Communicating Agents
 BioSystems
, 1996
"... Evolution of symbolic language and grammar is studied in a network model. Language is expressed by words, i.e. strings of symbols, which are generated by agents with their own symbolic grammar system. Agents communicate with each other by deriving and accepting words in terms of their own grammar. T ..."
Abstract

Cited by 30 (4 self)
 Add to MetaCart
Evolution of symbolic language and grammar is studied in a network model. Language is expressed by words, i.e. strings of symbols, which are generated by agents with their own symbolic grammar systems. Agents communicate with each other by deriving and accepting words in terms of their own grammars. They are ranked according to their communicative effectiveness: an agent which can derive less frequent and less acceptable words, and accept words in less computational time, will have higher scores. They can evolve by mutational processes, which change rewriting rules in their symbolic grammars. The complexity and diversity of words increase in the course of time. The emergence of modules and loop structure enhances the evolution. On the other hand, ensemble structure leads to a net-grammar, restricting individual grammars and their evolution. Key words: Net-grammar; Algorithmic evolution; Module-type evolution; Evolution of language; Symbolic grammar systems
Recurrent Networks: State Machines Or Iterated Function Systems?
 Proceedings of the 1993 Connectionist Models Summer School
, 1994
"... this paper, clustering of hidden unit activations, or recurrent network state space, provides incomplete information regarding the IP state of the network. IP states determine future behavior as well as encapsulate input history. The network's state transformations can exhibit sensitivity to initial ..."
Abstract

Cited by 25 (1 self)
 Add to MetaCart
this paper, clustering of hidden unit activations, or recurrent network state space, provides incomplete information regarding the IFS state of the network. IFS states determine future behavior as well as encapsulate input history. The network's state transformations can exhibit sensitivity to initial conditions and generate disparate futures for state clusters of all sizes. The second part of the paper presents IFS theory and shows how it can explain recurrent network state dynamics. By linking IFSs and recurrent networks, existing constraints on network dynamics independent of network models are now evident. By assuming a finite set of inputs, which is often the case in symbolic domains, one can describe recurrent network models as a finite collection of nonlinear state transformations. The interaction of several transforms produces complex state spaces with recursive structure. The limit behavior of the collection of transformations, and of recurrent networks in symbolic applications, is more complex than the union of the individual transformations. An input-driven recurrent network behaves like the random iteration algorithm. An infinite input sequence generates sequences of points dense in the state-space attractor when the transformations are contractive. While the demonstration in this paper used the SCN, other models produce similar IFS-like behaviors as long as the network's input selects transformations [19]. The IFS approach also explains the phenomena of state clustering in recurrent networks. In [20], Servan-Schreiber et al. report significant clustering in simple recurrent networks [21] both before and after training on the Reber grammar prediction task. A set of random transformations will normally reduce the volume of the recurrent network's state space, and plac...
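The random iteration algorithm referred to above can be sketched with two contractive maps on the unit interval; the Cantor-set maps below are an illustrative choice, not taken from the paper, and each map plays the role of the state transformation selected by one input symbol:

```python
import random

def random_iteration(maps, n, x0=0.0, seed=0):
    """Random iteration algorithm: repeatedly apply a randomly chosen
    map from the IFS; for contractive maps the iterates fall onto
    (and become dense in) the IFS attractor."""
    rng = random.Random(seed)
    x, pts = x0, []
    for _ in range(n):
        x = rng.choice(maps)(x)  # the "input symbol" picks the map
        pts.append(x)
    return pts

# Two contractions on [0, 1] whose attractor is the middle-third
# Cantor set: an illustrative stand-in for a symbol-driven network.
cantor_maps = [lambda x: x / 3.0, lambda x: x / 3.0 + 2.0 / 3.0]
```

Every iterate after the first lands outside the open middle third, which is the fractal clustering of states that the abstract attributes to contractive, input-selected transformations.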