Results 1–10 of 91
The induction of dynamical recognizers
 Machine Learning
, 1991
"... A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained " on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the le ..."
Abstract

Cited by 218 (14 self)
 Add to MetaCart
A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained" on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the learning process illustrates a new form of mechanical inference: Induction by phase transition. A small weight adjustment causes a "bifurcation" in the limit behavior of the network. This phase transition corresponds to the onset of the network’s capacity for generalizing to arbitrary-length strings. Second, a study of the automata resulting from the acquisition of previously published training sets indicates that while the architecture is not guaranteed to find a minimal finite automaton consistent with the given exemplars, which is an NP-Hard problem, the architecture does appear capable of generating non-regular languages by exploiting fractal and chaotic dynamics. I end the paper with a hypothesis relating linguistic generative capacity to the behavioral regimes of nonlinear dynamical systems.
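The second-order (higher order) recurrent update the abstract refers to can be sketched as follows; the weight tensor, state size, one-hot input encoding, and readout threshold are all illustrative assumptions, not Pollack's actual trained parameters:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def recognize(string, W, s0, threshold=0.5):
    """Run a second-order recurrent net over a binary string.

    W is an (n_states, n_states, n_symbols) weight tensor: each next-state
    unit is driven by products of current-state units and the one-hot
    input symbol (this multiplicative coupling is what makes the net
    "second order"). The first state unit is read out as accept/reject.
    """
    s = s0.copy()
    for ch in string:
        x = np.array([1.0, 0.0]) if ch == '0' else np.array([0.0, 1.0])
        # second-order update: s'_i = sigma(sum_jk W[i,j,k] * s_j * x_k)
        s = sigmoid(np.einsum('ijk,j,k->i', W, s, x))
    return bool(s[0] > threshold)

# Hypothetical example: random weights, 4 state units, alphabet {0, 1}
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4, 2))
s0 = np.zeros(4)
print(recognize("0110", W, s0))
```

Each input symbol selects a different linear map on the state, so the trained network acts as an iterated function system on the unit hypercube, which is where the fractal state sets mentioned above come from.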
Revisiting the edge of chaos: Evolving cellular automata to perform computations
 Complex Systems
, 1993
"... We present results from an experiment similar to one performed by Packard [24], in which a genetic algorithm is used to evolve cellular automata (CA) to perform a particular computational task. Packard examined the frequency of evolved CA rules as a function of Langton’s λ parameter [17], and interp ..."
Abstract

Cited by 124 (10 self)
 Add to MetaCart
We present results from an experiment similar to one performed by Packard [24], in which a genetic algorithm is used to evolve cellular automata (CA) to perform a particular computational task. Packard examined the frequency of evolved CA rules as a function of Langton’s λ parameter [17], and interpreted the results of his experiment as giving evidence for the following two hypotheses: (1) CA rules able to perform complex computations are most likely to be found near “critical” λ values, which have been claimed to correlate with a phase transition between ordered and chaotic behavioral regimes for CA; (2) When CA rules are evolved to perform a complex computation, evolution will tend to select rules with λ values close to the critical values. Our experiment produced very different results, and we suggest that the interpretation of the original results is not correct. We also review and discuss issues related to λ, dynamical-behavior classes, and computation in CA. The main constructive results of our study are identifying the emergence and competition of computational strategies and analyzing the central role of symmetries in an evolutionary system. In particular, we demonstrate how symmetry breaking can impede the evolution toward higher computational capability.
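For reference, Langton's λ is simply the fraction of rule-table entries that map a neighborhood to a non-quiescent state. A minimal sketch, assuming the standard Wolfram numbering for elementary (binary, radius-1) CA rules; the helper names are our own:

```python
def langton_lambda(rule_table, quiescent=0):
    """Fraction of rule-table entries mapping to a non-quiescent state.

    lambda = 0 means every neighborhood maps to the quiescent state
    (a totally ordered, "dead" rule); values near 1 - 1/k for a
    k-state CA sit in the most disordered regime.
    """
    return sum(1 for out in rule_table if out != quiescent) / len(rule_table)

def wolfram_rule_table(rule_number):
    """Rule table of an elementary (binary, radius-1) CA: entry v is the
    output for the neighborhood whose bits encode the integer v."""
    return [(rule_number >> v) & 1 for v in range(8)]

# Rule 110 = 0b01101110: five of the eight neighborhoods map to 1
print(langton_lambda(wolfram_rule_table(110)))  # -> 0.625
```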
The calculi of emergence: Computation, dynamics, and induction
 Physica D
, 1994
"... SFI Working Papers contain accounts of scientific work of the author(s) and do not necessarily represent the views of the Santa Fe Institute. We accept papers intended for publication in peerreviewed journals or proceedings volumes, but not papers that have already appeared in print. Except for pap ..."
Abstract

Cited by 99 (15 self)
 Add to MetaCart
(Show Context)
SFI Working Papers contain accounts of scientific work of the author(s) and do not necessarily represent the views of the Santa Fe Institute. We accept papers intended for publication in peerreviewed journals or proceedings volumes, but not papers that have already appeared in print. Except for papers by our external faculty, papers must be based on work done at SFI, inspired by an invited visit to or collaboration at SFI, or funded by an SFI grant. ©NOTICE: This working paper is included by permission of the contributing author(s) as a means to ensure timely distribution of the scholarly and technical work on a noncommercial basis. Copyright and all rights therein are maintained by the author(s). It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may be reposted only with the explicit permission of the copyright holder. www.santafe.edu
Extraction of Rules from Discrete-time Recurrent Neural Networks
, 1996
"... The extraction of symbolic knowledge from trained neural networks and the direct encoding of (partial) knowledge into networks prior to training are important issues. They allow the exchange of information between symbolic and connectionist knowledge representations. The focas of this paper is on t ..."
Abstract

Cited by 65 (15 self)
 Add to MetaCart
The extraction of symbolic knowledge from trained neural networks and the direct encoding of (partial) knowledge into networks prior to training are important issues. They allow the exchange of information between symbolic and connectionist knowledge representations. The focus of this paper is on the quality of the rules that are extracted from recurrent neural networks. Discrete-time recurrent neural networks can be trained to correctly classify strings of a regular language. Rules defining the learned grammar can be extracted from networks in the form of deterministic finite-state automata (DFAs) by applying clustering algorithms in the output space of recurrent state neurons. Our algorithm can extract different finite-state automata that are consistent with a training set from the same network. We compare the generalization performances of these different models and the trained network and we introduce a heuristic that permits us to choose among the consistent DFAs the model which best approximates the learned regular grammar.
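The clustering-based extraction described above can be sketched in a few lines. Everything here is illustrative: coordinate rounding stands in for the paper's clustering algorithm, and an arbitrary state-update function stands in for the trained recurrent network:

```python
def extract_dfa(step, s0, strings, resolution=2):
    """Quantize visited continuous recurrent states into discrete DFA states.

    step(s, symbol) -> next continuous state. States whose coordinates
    round to the same grid cell (a crude stand-in for clustering) are
    merged into one DFA state. Returns (transitions, start_state) where
    transitions maps (state_id, symbol) -> state_id.
    """
    def quantize(s):
        return tuple(round(x, resolution) for x in s)

    ids = {quantize(s0): 0}          # cluster -> DFA state id
    transitions = {}
    for string in strings:
        s = s0
        for symbol in string:
            q = ids.setdefault(quantize(s), len(ids))
            s = step(s, symbol)
            q2 = ids.setdefault(quantize(s), len(ids))
            transitions[(q, symbol)] = q2
    return transitions, 0

# Toy "network": a 1-D state tracking the parity of 1s seen so far
step = lambda s, a: ((s[0] + int(a)) % 2,)
trans, start = extract_dfa(step, (0,), ["0", "1", "11", "101"])
print(trans)  # two DFA states: even parity (0) and odd parity (1)
```

A coarser or finer `resolution` yields different consistent automata from the same network, which is exactly why the abstract's heuristic for choosing among consistent DFAs is needed.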
Turbulent Pattern Bases for Cellular Automata
 Physica D
, 1993
"... Unpredictable patterns generated by cellular automata (CA) can be decomposed with respect to a turbulent, positive entropy rate pattern basis. The resulting filtered patterns uncover significant structural organization in a CA's dynamics and information processing capabilities. We illustrate th ..."
Abstract

Cited by 50 (15 self)
 Add to MetaCart
(Show Context)
Unpredictable patterns generated by cellular automata (CA) can be decomposed with respect to a turbulent, positive entropy rate pattern basis. The resulting filtered patterns uncover significant structural organization in a CA's dynamics and information processing capabilities. We illustrate the decomposition technique by analyzing a binary, range-2 cellular automaton having two invariant chaotic domains of different complexities and entropies. Once identified, the domains are seen to organize the CA's state space and to dominate its evolution. Starting from the domains' structures, we show how to construct a finite-state transducer that performs nonlinear spatial filtering such that the resulting space-time patterns reveal the domains and the intervening walls and dislocations. To show the statistical consequences of domain detection, we compare the entropy and complexity densities of each domain with the globally averaged quantities. A more graphical comparison uses difference patter...
Computational mechanics: Pattern and prediction, structure and simplicity
 Journal of Statistical Physics
, 1999
"... Computational mechanics, an approach to structural complexity, defines a process’s causal states and gives a procedure for finding them. We show that the causalstate representation—an Emachine—is the minimal one consistent with ..."
Abstract

Cited by 50 (8 self)
 Add to MetaCart
(Show Context)
Computational mechanics, an approach to structural complexity, defines a process’s causal states and gives a procedure for finding them. We show that the causal-state representation—an ε-machine—is the minimal one consistent with
Emergence of Netgrammar in Communicating Agents
 BioSystems
, 1996
"... Evolution of symbolic language and grammar is studied in a network model. Language is expressed by words, i.e. strings of symbols, which are generated by agents with their own symbolic grammar system. Agents communicate with each other by deriving and accepting words in terms of their own grammar. T ..."
Abstract

Cited by 36 (4 self)
 Add to MetaCart
Evolution of symbolic language and grammar is studied in a network model. Language is expressed by words, i.e. strings of symbols, which are generated by agents with their own symbolic grammar system. Agents communicate with each other by deriving and accepting words in terms of their own grammar. They are ranked according to their communicative effectiveness: an agent which can derive less frequent and less acceptable words and accept words in less computational time will have higher scores. They can evolve by mutational processes, which change rewriting rules in their symbolic grammars. Complexity and diversity of words increase in the course of time. The emergence of modules and loop structure enhances the evolution. On the other hand, ensemble structure leads to a netgrammar, restricting individual grammars and their evolution. Key words: Netgrammar; Algorithmic evolution; Module-type evolution; Evolution of language; Symbolic grammar systems 1 Introduction Linguistic expressions...
Computation in cellular automata: A selected review
 Nonstandard Computation
, 1996
"... Cellular automata (CAs) are decentralized spatially extended systems consisting of large numbers of simple identical components with local connectivity. Such systems have the potential to perform complex computations with a high degree of efficiency and robustness, as well as to model the behavior o ..."
Abstract

Cited by 36 (2 self)
 Add to MetaCart
Cellular automata (CAs) are decentralized spatially extended systems consisting of large numbers of simple identical components with local connectivity. Such systems have the potential to perform complex computations with a high degree of efficiency and robustness, as well as to model the behavior of complex systems in nature. For these reasons CAs and related architectures have
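The defining properties listed above (identical simple components, local connectivity, synchronous update) can be made concrete with a one-line update for an elementary binary CA on a ring; rule 110 and the ring size are arbitrary illustrative choices:

```python
def ca_step(cells, rule):
    """One synchronous update of a binary CA on a ring.

    Each cell looks only at itself and its two nearest neighbors
    (local connectivity), and every cell applies the same rule table
    (identical components). The neighborhood bits index into the
    rule number, read in standard Wolfram order.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Rule 110 on an 11-cell ring seeded with a single live cell
row = [0] * 11
row[5] = 1
for _ in range(3):
    row = ca_step(row, 110)
    print("".join(".#"[c] for c in row))
```

Despite each cell seeing only three bits, iterating this map produces the propagating structures that the computational-strategy analyses above rely on.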
Rule Extraction from Recurrent Neural Networks: a Taxonomy and Review
 Neural Computation
, 2005
"... this paper, the progress of this development is reviewed and analysed in detail. In order to structure the survey and to evaluate the techniques, a taxonomy, specifically designed for this purpose, has been developed. Moreover, important open research issues are identified, that, if addressed pr ..."
Abstract

Cited by 34 (5 self)
 Add to MetaCart
this paper, the progress of this development is reviewed and analysed in detail. In order to structure the survey and to evaluate the techniques, a taxonomy, specifically designed for this purpose, has been developed. Moreover, important open research issues are identified that, if addressed properly, could give the field a significant push forward.
Analysis of Dynamical Recognizers
 NEURAL COMPUTATION
, 1996
"... Pollack (1991) demonstrated that secondorder recurrent neural networks can act as dynamical recognizers for formal languages when trained on positive and negative examples, and observed both phase transitions in learning and IFSlike fractal state sets. Followon work focused mainly on the extra ..."
Abstract

Cited by 33 (5 self)
 Add to MetaCart
Pollack (1991) demonstrated that second-order recurrent neural networks can act as dynamical recognizers for formal languages when trained on positive and negative examples, and observed both phase transitions in learning and IFS-like fractal state sets. Follow-on work focused mainly on the extraction and minimization of a finite state automaton (FSA) from the trained network. However, such networks are capable of inducing languages which are not regular, and therefore not equivalent to any FSA. Indeed, it may be simpler for a small network to fit its training data by inducing such a non-regular language. But when is the network's language not regular? In this paper, using a low-dimensional network capable of learning all the Tomita data sets, we present an empirical method for testing whether the language induced by the network is regular or not. We also provide a detailed ε-machine analysis of trained networks for both regular and non-regular languages.