Results 11 - 20 of 225
Coevolving High-Level Representations
 TO APPEAR IN: ARTIFICIAL LIFE III
Abstract

Cited by 97 (13 self)
Several evolutionary simulations allow for a dynamic resizing of the genotype. This is an important alternative to constraining the genotype’s maximum size and complexity. In this paper, we add an additional dynamic to simulated evolution with the description of a genetic algorithm that coevolves its representation language with the genotypes. We introduce two mutation operators that permit the acquisition of modules from the genotypes during evolution. These modules form an increasingly high-level representation language specific to the developmental environment. Experimental results illustrating interesting properties of the acquired modules and the evolved languages are provided.
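The module-acquisition idea can be sketched as follows (a minimal illustration assuming linear, list-shaped genotypes; the operator and naming scheme here are hypothetical, not the paper's two operators):

```python
import random

def acquire_module(genotype, modules, min_len=2, max_len=4):
    """Module-acquisition mutation (a sketch): excise a random substring
    of the genotype, store it as a named module in the evolving language,
    and replace it in place with a fresh reference symbol."""
    if len(genotype) < min_len:
        return genotype
    length = random.randint(min_len, min(max_len, len(genotype)))
    start = random.randrange(len(genotype) - length + 1)
    name = f"M{len(modules)}"          # fresh, hypothetical module symbol
    modules[name] = genotype[start:start + length]
    return genotype[:start] + [name] + genotype[start + length:]

def expand(genotype, modules):
    """Rewrite module symbols back into their stored contents (one level
    per pass; loop to a fixed point if modules nest)."""
    out = []
    for g in genotype:
        out.extend(modules.get(g, [g]))
    return out
```

Compressing repeated substructure into named modules shortens the genotype while `expand` recovers the original phenotype description, which is the sense in which the representation language grows more high-level over time.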
Language as a Dynamical System
, 1995
Abstract

Cited by 96 (3 self)
Introduction Despite considerable diversity among theories about how humans process language, there are a number of fundamental assumptions which are shared by most such theories. This consensus extends to the very basic question about what counts as a cognitive process. So although many cognitive scientists are fond of referring to the brain as a 'mental organ' (e.g., Chomsky, 1975), implying a similarity to other organs such as the liver or kidneys, it is also assumed that the brain is an organ with special properties which set it apart. Brains 'carry out computation' (it is argued)
Extracting Comprehensible Models from Trained Neural Networks
, 1996
Abstract

Cited by 84 (3 self)
To Mom, Dad, and Susan, for their support and encouragement.
Constructing Deterministic Finite-State Automata in Recurrent Neural Networks
 Journal of the ACM
, 1996
Abstract

Cited by 77 (16 self)
Recurrent neural networks that are trained to behave like deterministic finite-state automata (DFAs) can show deteriorating performance when tested on long strings. This deteriorating performance can be attributed to the instability of the internal representation of the learned DFA states. The use of a sigmoidal discriminant function together with the recurrent structure contribute to this instability. We prove that a simple algorithm can construct second-order recurrent neural networks with a sparse interconnection topology and sigmoidal discriminant function such that the internal DFA state representations are stable, i.e., the constructed network correctly classifies strings of arbitrary length. The algorithm is based on encoding strengths of weights directly into the neural network. We derive a relationship between the weight strength and the number of DFA states for robust string classification. For a DFA with n states and m input alphabet symbols, the constructive algorithm genera...
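The weight-encoding scheme the abstract describes can be sketched roughly as follows (the strength H = 8, the -H/2 bias, and the argmax readout are illustrative assumptions, not the paper's derived bounds): one state neuron per DFA state, one input neuron per symbol, and a second-order weight that is strongly positive exactly when that symbol maps that state to the target state.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def build_network(delta, n_states, n_symbols, H=8.0):
    """Encode a DFA transition function delta[(state, symbol)] -> state
    into second-order weights W[i][j][k]: +H when symbol k sends state j
    to state i, -H otherwise."""
    return [[[H if delta[(j, k)] == i else -H
              for k in range(n_symbols)]
             for j in range(n_states)]
            for i in range(n_states)]

def run(W, n_states, n_symbols, string, start=0, H=8.0):
    """Second-order update S_i <- sigmoid(sum_jk W[i][j][k] S_j I_k - H/2);
    the bias keeps undriven state neurons near zero."""
    S = [1.0 if i == start else 0.0 for i in range(n_states)]
    for sym in string:
        I = [1.0 if k == sym else 0.0 for k in range(n_symbols)]
        S = [sigmoid(sum(W[i][j][k] * S[j] * I[k]
                         for j in range(n_states)
                         for k in range(n_symbols)) - H / 2)
             for i in range(n_states)]
    return max(range(n_states), key=lambda i: S[i])  # most active state neuron
```

With a large enough H the high/low activations stay saturated near 1 and 0 from step to step, which is the stability property the paper proves; the parity DFA over {0, 1} is a simple check.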
A Recurrent Neural Network That Learns to Count
 CONNECTION SCIENCE
, 1999
Extraction of Rules from Discrete-time Recurrent Neural Networks
, 1996
Abstract

Cited by 69 (15 self)
The extraction of symbolic knowledge from trained neural networks and the direct encoding of (partial) knowledge into networks prior to training are important issues. They allow the exchange of information between symbolic and connectionist knowledge representations. The focus of this paper is on the quality of the rules that are extracted from recurrent neural networks. Discrete-time recurrent neural networks can be trained to correctly classify strings of a regular language. Rules defining the learned grammar can be extracted from networks in the form of deterministic finite-state automata (DFAs) by applying clustering algorithms in the output space of recurrent state neurons. Our algorithm can extract different finite-state automata that are consistent with a training set from the same network. We compare the generalization performances of these different models and the trained network and we introduce a heuristic that permits us to choose among the consistent DFAs the model which best approximates the learned regular grammar.
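As a rough illustration of the extraction idea, quantizing the recurrent state space and recording transitions between the resulting clusters, consider this sketch (the fixed grid quantizer stands in for a clustering algorithm, and `rnn_step` is any trained recurrent update; both are assumptions of the sketch, not the paper's exact procedure):

```python
def extract_dfa(rnn_step, init_state, strings, q=2):
    """Run the network over a set of strings, bucket each hidden-state
    vector onto a grid of q bins per dimension, and record the observed
    transitions (quantized_state, symbol) -> quantized_state."""
    def bucket(h):
        # map each coordinate in [0, 1) to one of q bins
        return tuple(min(int(x * q), q - 1) for x in h)
    transitions = {}
    for s in strings:
        h = init_state
        for sym in s:
            h2 = rnn_step(h, sym)
            transitions[(bucket(h), sym)] = bucket(h2)
            h = h2
    return transitions
```

Different choices of q (or of clustering algorithm) can yield different DFAs consistent with the same training set, which is exactly the selection problem the paper's heuristic addresses.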
Learning semantic combinatoriality from the interaction between linguistic and behavioral processes
 ADAPTIVE BEHAVIOR
, 2005
Self-Organization of Distributedly Represented Multiple Behavior Schemata in a Mirror System: ...
, 2004
Abstract

Cited by 66 (11 self)
The current paper reviews a connectionist model, the recurrent neural network with parametric biases (RNNPB), in which multiple behavior schemata can be learned by the network in a distributed manner. The parametric biases in the network play an essential role in both generating and recognizing behavior patterns. They act as a mirror system by means of self-organizing adequate memory structures. Three different robot experiments are reviewed: robot and user interactions; learning and generating different types of dynamic patterns; and linguistic-behavior binding. The hallmark of this study is explaining how self-organizing internal structures can contribute to generalization in learning, and diversity in behavior generation, in the proposed distributed representation scheme.
Learning recursive distributed representations for holistic computation
 Connection Science
, 1991
Abstract

Cited by 63 (0 self)
Learning recursive distributed representations for holistic computation
Dynamical Recognizers: Real-time Language Recognition by Analog Computers
 Theoretical Computer Science
, 1996
Abstract

Cited by 63 (4 self)
We consider a model of analog computation which can recognize various languages in real time. We encode an input word as a point in R^d by composing iterated maps, and then apply inequalities to the resulting point to test for membership in the language. Each class of maps and inequalities, such as quadratic functions with rational coefficients, is capable of recognizing a particular class of languages; for instance, linear and quadratic maps can have both stack-like and queue-like memories. We use methods equivalent to the Vapnik-Chervonenkis dimension to separate some of our classes from each other, e.g., linear maps are less powerful than quadratic or piecewise-linear ones, polynomials are less powerful than elementary (trigonometric and exponential) maps, and deterministic polynomials of each degree are less powerful than their nondeterministic counterparts. Comparing these dynamical classes with various discrete language classes helps illuminate how iterated maps can...
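The encode-then-test scheme can be illustrated with a toy recognizer (a hypothetical instance using affine maps on R, not one of the paper's benchmark classes): compose one map per input symbol, then test the final point with inequalities. This example accepts exactly the strings with equal counts of 'a' and 'b'.

```python
def dynamical_recognizer(word, maps, init, accept):
    """Compose one iterated map per symbol of the word, then decide
    membership by testing the final point with inequalities."""
    x = init
    for sym in word:
        x = maps[sym](x)
    return accept(x)

# Affine counter maps: 'a' increments, 'b' decrements the point in R.
maps = {'a': lambda x: x + 1.0, 'b': lambda x: x - 1.0}

# Accept iff the final point satisfies both x <= 0 and x >= 0.
accept = lambda x: x <= 0.0 and x >= 0.0
```

Richer map classes (quadratic, piecewise-linear) change which languages the final-point inequalities can separate, which is the hierarchy the abstract describes.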