Results 1–10 of 66
Processing Capacity Defined by Relational Complexity: Implications for Comparative, Developmental, and Cognitive Psychology
, 1989
Abstract

Cited by 102 (9 self)
It is argued that working memory limitations are best defined in terms of the complexity of relations that can be processed in parallel. Relational complexity is related to processing loads in problem solving, and discriminates between higher animal species, as well as between children of different ages. Complexity is defined by the number of dimensions, or sources of variation, that are related. A unary relation has one argument and one source of variation, because its argument can be instantiated in only one way at a time. A binary relation has two arguments, and two sources of variation, because two argument instantiations are possible at once. Similarly, a ternary relation is three dimensional, a quaternary relation is four dimensional, and so on. Dimensionality is related to number of chunks, because both attributes on dimensions and chunks are independent units of information of arbitrary size. Empirical studies of working memory limitations indicate a soft limit which corresponds to processing one quaternary relation in parallel. More complex concepts are processed by segmentation or conceptual chunking. Segmentation entails breaking tasks into components which do not exceed processing capacity, and which are processed serially. Conceptual chunking entails "collapsing" representations to reduce their dimensionality and consequently their processing load, but at the cost of making some relational information inaccessible. Parallel distributed processing implementations of relational representations show that relations with more arguments entail a higher computational cost, which corresponds to empirical observations of higher processing loads in humans. Empirical evidence is presented that relational complexity discriminates between higher species...
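The arity-as-dimensionality idea can be made concrete in a toy sketch (the function names and examples here are illustrative, not from the paper): complexity is the number of independently varying arguments, and conceptual chunking trades dimensionality for access to the collapsed structure.

```python
def complexity(relation_args):
    """Dimensionality of a relation instance = its number of arguments,
    i.e. the number of independently varying slots."""
    return len(relation_args)

# unary:      LARGE(dog)              -> 1 dimension
# binary:     LARGER(horse, dog)      -> 2 dimensions
# quaternary: PROPORTION(2, 4, 3, 6)  -> 4 dimensions (the soft limit)
assert complexity(("dog",)) == 1
assert complexity(("horse", "dog")) == 2
assert complexity((2, 4, 3, 6)) == 4

def chunk(relation_args, keep):
    """Conceptual chunking: collapse all arguments outside `keep` into a
    single opaque chunk, lowering dimensionality at the cost of making
    the collapsed relational information inaccessible."""
    kept = tuple(a for i, a in enumerate(relation_args) if i in keep)
    rest = tuple(a for i, a in enumerate(relation_args) if i not in keep)
    return kept + ((rest,) if rest else ())

# collapsing three of four arguments yields a binary-complexity item
assert complexity(chunk((2, 4, 3, 6), keep={0})) == 2
```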
Approximating the Semantics of Logic Programs by Recurrent Neural Networks
Abstract

Cited by 55 (9 self)
In [18] we have shown how to construct a 3-layered recurrent neural network that computes the fixed point of the meaning function TP of a given propositional logic program P, which corresponds to the computation of the semantics of P. In this article we consider the first-order case. We define a notion of approximation for interpretations and prove that there exists a 3-layered feed-forward neural network that approximates the calculation of TP for a given first-order acyclic logic program P with an injective level mapping arbitrarily well. Extending the feed-forward network by recurrent connections we obtain a recurrent neural network whose iteration approximates the fixed point of TP. This result is proven by taking advantage of the fact that for acyclic logic programs the function TP is a contraction mapping on a complete metric space defined by the interpretations of the program. Mapping this space to the metric space ℝ with Euclidean distance, a real-valued function fP can be defined which corresponds to TP and is continuous as well as a contraction. Consequently it can be approximated by an appropriately chosen class of feed-forward neural networks.
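As a concrete illustration of the propositional case this work builds on, here is a minimal sketch of the immediate-consequence operator TP and its fixed-point iteration; the tiny example program is an assumption for illustration, not taken from the article.

```python
def T_P(program, interpretation):
    """One application of TP: derive every clause head whose body atoms
    are all true in the current interpretation."""
    return {head for head, body in program if set(body) <= interpretation}

def fixed_point(program):
    """Iterate TP from the empty interpretation until it stabilizes."""
    I = set()
    while True:
        J = T_P(program, I)
        if J == I:
            return I
        I = J

# Illustrative program:  a.   b :- a.   c :- a, b.   d :- e.
program = [("a", []), ("b", ["a"]), ("c", ["a", "b"]), ("d", ["e"])]
assert fixed_point(program) == {"a", "b", "c"}   # d is never derivable
```

For acyclic programs this iteration converges, which is exactly the contraction-mapping behavior the network construction exploits.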
From Words to Understanding
 Computing with Large Random Patterns
Abstract

Cited by 48 (14 self)
As was discussed in section 22, language is central to a correct understanding of the mind. Compositional analytic models perform well in the domain and subject area they are developed for, but any extension is difficult and the models have incomplete psychological veracity. Here we explore how to compute representations of meaning based on a lower level of abstraction and how to use the models for tasks that require some form of language understanding.
Binary Spatter-Coding of Ordered K-Tuples
 In
, 1996
Abstract

Cited by 23 (4 self)
Information with structure is traditionally organized into records with fields. For example, a medical record consisting of name, sex, age, and weight might look like (Joe, male, 66, 77). What 77 stands for is determined by its location in the record, so this is an example of local representation. The brain's wiring, and its robustness under local damage, speak for the importance of distributed representations. The Holographic Reduced Representation (HRR) of Plate is a prime example based on real or complex vectors. This paper describes how spatter coding leads to binary HRRs, how the fields of a record are encoded into a long binary word without fields, and how they are extracted from such a word.
1 Introduction
Nested compositional structure is fundamental to high-level mental functions, such as language and analogy. Accordingly, modeling these functions with neural nets requires that the structures be represented in a form suitable for neural nets.
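A minimal sketch of the spatter-code idiom as commonly described (binding by elementwise XOR, bundling by bitwise majority); the specific record, vector length, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # long random binary words

def randvec():
    return rng.integers(0, 2, N, dtype=np.uint8)

def bind(x, y):
    """Binding = elementwise XOR; it is its own inverse."""
    return x ^ y

def bundle(*vs):
    """Bundling = bitwise majority vote (use an odd number of inputs)."""
    return (np.sum(vs, axis=0) * 2 > len(vs)).astype(np.uint8)

roles   = {k: randvec() for k in ("name", "sex", "age")}
fillers = {k: randvec() for k in ("Joe", "male", "66")}

# A record becomes one field-less binary word: the bundle of its
# role-filler bindings.
record = bundle(bind(roles["name"], fillers["Joe"]),
                bind(roles["sex"],  fillers["male"]),
                bind(roles["age"],  fillers["66"]))

def probe(record, role):
    """Extract a field: XOR cancels the role, leaving a noisy filler,
    which a clean-up memory resolves by nearest Hamming distance."""
    noisy = bind(record, role)
    return min(fillers, key=lambda k: np.count_nonzero(noisy ^ fillers[k]))

assert probe(record, roles["name"]) == "Joe"
assert probe(record, roles["age"]) == "66"
```

Because XOR binding preserves randomness and majority bundling preserves similarity to each component, the extracted word is close enough to the stored filler for clean-up to succeed with high probability at this word length.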
Principles for an Integrated Connectionist/Symbolic Theory of Higher Cognition
, 1992
Abstract

Cited by 21 (4 self)
The main claim of this paper is that connectionism offers cognitive science a number of excellent opportunities for turning methodological, theoretical, and metatheoretical schisms into powerful integrations: opportunities for forging constructive synergy out of the destructive interference which plagues the field. The paper begins with an analysis of the rifts in the field and what it would take to overcome them. We argue that while connectionism has often contributed to the deepening of these schisms, it is nonetheless possible to turn this trend around, and for connectionism to play a central role in a unification of cognitive science. Essential to this process is the development of strong theoretical principles founded (in part) on connectionist computation; a main goal of this paper is to demonstrate that such principles are indeed within the reach of a connectionist-grounded theory of cognition. The enterprise rests on a willingness to entertain, analyze, and extend characterizations of cognitive problems, and hypothesized solutions, which are deliberately overly simple and general, in order to discover the insights they can offer through the mathematical analyses which this simplicity and generality make possible. ...
Multiplicative Binding, Representation Operators and Analogy
Abstract

Cited by 19 (5 self)
This paper introduces a novel implementation of the bind() operator that is simple, can be efficiently implemented, and highlights the relationship between retrieval queries and analogical mapping. A frame of role/filler bindings can easily be represented using bind() and bundle(). However, typical binding systems are unable to adequately represent multiple frames and arbitrary nested compositional structures. A novel family of representational operators (called braid()) is introduced to address these problems. Other binding systems make the strong assumption that the roles and fillers are disjoint in order to avoid ambiguities inherent in their representational idioms. The braid() operator can be used to avoid this assumption. The new representational idiom suggests how the cognitive processes of bottom-up and top-down object recognition might be implemented. These processes depend on analogical mapping to integrate disjoint representations and drive perceptual search.
Analogical Inference by Systematic Substitution
Analogical inference depends on systematic substitution of the components of compositional structures (Gentner, 1983; Halford et al., 1994; Holyoak & Thagard, 1989).
Connectionist sentence processing in perspective
 Cognitive Science
, 1999
Abstract

Cited by 15 (2 self)
The emphasis in the connectionist sentence-processing literature on distributed representation and emergence of grammar from such systems can easily obscure the often close relations between connectionist and symbolist systems. This paper argues that the Simple Recurrent Network (SRN) models proposed by Jordan (1989) and Elman (1990) are more directly related to stochastic Part-of-Speech (POS) taggers than to parsers or grammars as such, while autoassociative memory models of the kind pioneered by Longuet-Higgins, Willshaw, Pollack and others may be useful for grammar induction from a network-based conceptual structure as well as for structure-building. These observations suggest some interesting new directions for specifically connectionist sentence processing research, including more efficient representations for finite state machines, and acquisition devices based on a distinctively connectionist basis for grounded symbolist conceptual structure.
Dynamical Automata
, 1998
Abstract

Cited by 10 (5 self)
The recent work on automata whose variables and parameters are real numbers (e.g., Blum, Shub, and Smale, 1989; Koiran, 1993; Bournez and Cosnard, 1996; Siegelmann, 1996; Moore, 1996) has focused largely on questions about computational complexity and tractability. It is also revealing to examine the metric relations that such systems induce on automata via the natural metrics on their parameter spaces. This brings the theory of computational classification closer to theories of learning and statistical modeling which depend on measuring distances between models. With this in mind, I develop a generalized method of identifying pushdown automata in one class of real-valued automata. I show how the real-valued automata can be implemented in neural networks. I then explore the metric organization of these automata in a basic example, showing how it fleshes out the skeletal structure of the Chomsky Hierarchy and indicates new approaches to problems in language learning and language typology...
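One standard way to realize a stack inside a real-valued dynamical system, in the spirit of the automata discussed here (this generic construction is an illustration, not necessarily the paper's): encode the stack contents as a binary fraction in [0, 1), so that push and pop become affine maps on the unit interval.

```python
from fractions import Fraction  # exact arithmetic keeps the stack lossless

def push(x, bit):
    """Prepend `bit` as the most significant digit: x' = (bit + x) / 2."""
    return Fraction(bit + x, 2)

def pop(x):
    """Read the most significant digit and apply the inverse affine map."""
    bit = 1 if x >= Fraction(1, 2) else 0
    return bit, 2 * x - bit

x = Fraction(0)
for b in (1, 0, 1):          # push bits bottom-to-top: 1, 0, 1
    x = push(x, b)

out = []
for _ in range(3):
    b, x = pop(x)
    out.append(b)

assert out == [1, 0, 1]      # LIFO order recovered
assert x == 0                # empty stack maps back to 0
```

A pushdown automaton's control then selects which affine map to apply at each step, which is what lets its parameters sit in a metric space of real-valued models.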
Analogy Retrieval and Processing With Distributed Vector Representations
, 1998
Abstract

Cited by 10 (2 self)
Holographic Reduced Representations (HRRs) are a method for encoding nested relational structures in fixed-width vector representations. HRRs encode relational structures as vector representations in such a way that the superficial similarity of the vectors reflects both superficial and structural similarity of the relational structures. HRRs also support a number of operations that could be very useful in psychological models of human analogy processing: fast estimation of superficial and structural similarity via a vector dot-product; finding corresponding objects in two structures; and chunking of vector representations. Although similarity assessment and discovery of corresponding objects both theoretically take exponential time to perform fully and accurately, with HRRs one can obtain approximate solutions in constant time. The accuracy of these operations with HRRs mirrors patterns of human performance on analog retrieval and processing tasks.
Keywords: neural networks, distributed representations, binding, analogy, analog retrieval, structure, chunking, systematicity.
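A hedged sketch of the basic HRR operations the abstract relies on, following Plate's circular-convolution binding and dot-product clean-up; the role/filler names and vector length are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 512

def randvec():
    """Random vector with expected unit length (elements ~ N(0, 1/N))."""
    return rng.normal(0, 1 / np.sqrt(N), N)

def cconv(a, b):
    """Binding = circular convolution, computed via the FFT."""
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=N)

def involution(a):
    """Approximate inverse for unbinding: a*[i] = a[(-i) mod N]."""
    return np.concatenate(([a[0]], a[:0:-1]))

agent, obj, eat = randvec(), randvec(), randvec()
mary, fish = randvec(), randvec()

# eat(agent=mary, object=fish) as a single fixed-width vector:
# a superposition of role-filler bindings.
trace = cconv(agent, mary) + cconv(obj, fish)

# Probe with the agent role: unbinding yields a noisy 'mary', and a
# constant-time dot-product against item memory cleans it up.
noisy = cconv(trace, involution(agent))
scores = {name: float(noisy @ v)
          for name, v in {"mary": mary, "fish": fish, "eat": eat}.items()}
assert max(scores, key=scores.get) == "mary"
```

The same dot-product machinery gives the fast similarity estimates mentioned above: structurally similar traces share bound components and so have higher inner products.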
A Computational Memory and Processing Model for Prosody
, 1998
Abstract

Cited by 9 (0 self)
This paper links prosody to the information in the text and how it is processed by the speaker. It describes the operation and output of Loq, a text-to-speech implementation that includes a model of limited attention and working memory. Attentional limitations are key. Varying the attentional parameter in the simulations varies in turn what counts as given and new in a text, and therefore, the intonational contours with which it is uttered. Currently, the system produces prosody in three different styles: childlike, adult expressive, and knowledgeable. This prosody also exhibits differences within each style – no two simulations are alike. The limited resource approach captures some of the stylistic and individual variety found in natural prosody.