Results 1-10 of 53
Processing Capacity Defined by Relational Complexity: Implications for Comparative, Developmental, and Cognitive Psychology
, 1989
Abstract

Cited by 95 (9 self)
It is argued that working memory limitations are best defined in terms of the complexity of relations that can be processed in parallel. Relational complexity is related to processing loads in problem solving, and discriminates between higher animal species, as well as between children of different ages. Complexity is defined by the number of dimensions, or sources of variation, that are related. A unary relation has one argument and one source of variation, because its argument can be instantiated in only one way at a time. A binary relation has two arguments, and two sources of variation, because two argument instantiations are possible at once. Similarly, a ternary relation is three dimensional, a quaternary relation is four dimensional, and so on. Dimensionality is related to number of chunks, because both attributes on dimensions and chunks are independent units of information of arbitrary size. Empirical studies of working memory limitations indicate a soft limit which corresponds to processing one quaternary relation in parallel. More complex concepts are processed by segmentation or conceptual chunking. Segmentation entails breaking tasks into components which do not exceed processing capacity, and which are processed serially. Conceptual chunking entails "collapsing" representations to reduce their dimensionality and consequently their processing load, but at the cost of making some relational information inaccessible. Parallel distributed processing implementations of relational representations show that relations with more arguments entail a higher computational cost, which corresponds to empirical observations of higher processing loads in humans. Empirical evidence is presented that relational complexity discriminates between higher species...
Approximating the Semantics of Logic Programs by Recurrent Neural Networks
Abstract

Cited by 54 (9 self)
In [18] we have shown how to construct a 3-layered recurrent neural network that computes the fixed point of the meaning function TP of a given propositional logic program P, which corresponds to the computation of the semantics of P. In this article we consider the first-order case. We define a notion of approximation for interpretations and prove that there exists a 3-layered feed-forward neural network that approximates the calculation of TP for a given first-order acyclic logic program P with an injective level mapping arbitrarily well. Extending the feed-forward network by recurrent connections, we obtain a recurrent neural network whose iteration approximates the fixed point of TP. This result is proven by taking advantage of the fact that for acyclic logic programs the function TP is a contraction mapping on a complete metric space defined by the interpretations of the program. Mapping this space to the metric space IR with Euclidean distance, a real-valued function fP can be defined which corresponds to TP and is continuous as well as a contraction. Consequently it can be approximated by an appropriately chosen class of feed-forward neural networks.
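For the propositional case mentioned above, TP is the standard immediate-consequence operator, and its fixed-point iteration is easy to sketch. A minimal Python illustration (the program encoding as (head, body) pairs and the names below are my own, not the paper's):

```python
def tp(program, interpretation):
    """One application of T_P: derive every head whose body is satisfied."""
    return {head for head, body in program if body <= interpretation}

def fixed_point(program):
    """Iterate T_P from the empty interpretation until it stabilises."""
    current = set()
    while True:
        nxt = tp(program, current)
        if nxt == current:
            return current
        current = nxt

# the program: p.   q :- p.   r :- p, q.
program = [("p", set()), ("q", {"p"}), ("r", {"p", "q"})]
print(sorted(fixed_point(program)))  # -> ['p', 'q', 'r']
```

For acyclic programs this iteration converges, which is what the contraction-mapping argument in the abstract exploits.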
From Words to Understanding
 Computing with Large Random Patterns
Abstract

Cited by 48 (14 self)
As was discussed in section 22, language is central to a correct understanding of the mind. Compositional analytic models perform well in the domain and subject area they are developed for, but any extension is difficult and the models have incomplete psychological veracity. Here we explore how to compute representations of meaning based on a lower level of abstraction and how to use the models for tasks that require some form of language understanding.
Binary Spatter-Coding of Ordered K-Tuples
 In
, 1996
Abstract

Cited by 22 (3 self)
Information with structure is traditionally organized into records with fields. For example, a medical record consisting of name, sex, age, and weight might look like (Joe, male, 66, 77). What 77 stands for is determined by its location in the record, so this is an example of local representation. The brain's wiring, and its robustness under local damage, speak for the importance of distributed representations. The Holographic Reduced Representation (HRR) of Plate is a prime example, based on real or complex vectors. This paper describes how spatter coding leads to binary HRRs, how the fields of a record are encoded into a long binary word without fields, and how they are extracted from such a word.

1 Introduction

Nested compositional structure is fundamental to high-level mental functions, such as language and analogy. Accordingly, modeling these functions with neural nets requires that the structures be represented in a form suitable for neural nets.
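The spatter code's field-free encoding can be sketched in a few lines: binding is bitwise XOR (its own inverse), a record is the bitwise majority of its role-filler bindings, and a field is extracted by XORing with its role and cleaning up against an item memory. The dimensionality, vector names, and cleanup scheme below are illustrative choices, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10000  # high dimension makes random binary vectors nearly uncorrelated

def rand_vec():
    return rng.integers(0, 2, D, dtype=np.uint8)

def bind(a, b):
    # XOR binding is its own inverse: bind(bind(a, b), a) recovers b exactly
    return a ^ b

def bundle(vectors):
    # bitwise majority vote; ties (even counts) are broken randomly
    s = np.sum(vectors, axis=0)
    out = (2 * s > len(vectors)).astype(np.uint8)
    ties = 2 * s == len(vectors)
    out[ties] = rng.integers(0, 2, int(ties.sum()), dtype=np.uint8)
    return out

def hamming(a, b):
    return int(np.sum(a != b))

# the record (Joe, male, 66, 77) as role-filler bindings, no fields needed
roles = {r: rand_vec() for r in ["name", "sex", "age", "weight"]}
fillers = {f: rand_vec() for f in ["Joe", "male", "66", "77"]}
pairs = [("name", "Joe"), ("sex", "male"), ("age", "66"), ("weight", "77")]
record = bundle([bind(roles[r], fillers[f]) for r, f in pairs])

# extract a field: XOR with the role, then clean up by nearest neighbour
noisy = bind(record, roles["age"])
best = min(fillers, key=lambda f: hamming(noisy, fillers[f]))
print(best)  # -> 66
```

The extracted vector agrees with the stored filler on well over half its bits while matching unrelated vectors on only about half, so nearest-neighbour cleanup recovers the field reliably at this dimensionality.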
Multiplicative Binding, Representation Operators and Analogy
Abstract

Cited by 19 (5 self)
This paper introduces a novel implementation of the bind() operator that is simple, can be efficiently implemented, and highlights the relationship between retrieval queries and analogical mapping. A frame of role/filler bindings can easily be represented using bind() and bundle(). However, typical binding systems are unable to adequately represent multiple frames and arbitrary nested compositional structures. A novel family of representational operators (called braid()) is introduced to address these problems. Other binding systems make the strong assumption that roles and fillers are disjoint in order to avoid ambiguities inherent in their representational idioms. The braid() operator can be used to avoid this assumption. The new representational idiom suggests how the cognitive processes of bottom-up and top-down object recognition might be implemented. These processes depend on analogical mapping to integrate disjoint representations and drive perceptual search.

Analogical Inference by Systematic Substitution

Analogical inference depends on systematic substitution of the components of compositional structures (Gentner, 1983; Halford et al., 1994; Holyoak & Thagard, 1989)
Connectionist sentence processing in perspective
 Cognitive Science
, 1999
Abstract

Cited by 15 (2 self)
The emphasis in the connectionist sentence-processing literature on distributed representation and the emergence of grammar from such systems can easily obscure the often close relations between connectionist and symbolist systems. This paper argues that the Simple Recurrent Network (SRN) models proposed by Jordan (1989) and Elman (1990) are more directly related to stochastic Part-of-Speech (POS) taggers than to parsers or grammars as such, while auto-associative memory models of the kind pioneered by Longuet-Higgins, Willshaw, Pollack and others may be useful for grammar induction from a network-based conceptual structure as well as for structure-building. These observations suggest some interesting new directions for specifically connectionist sentence-processing research, including more efficient representations for finite-state machines, and acquisition devices based on a distinctively connectionist basis for grounded symbolist conceptual structure.
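The SRN architecture at issue is small enough to sketch: the hidden layer receives the current input together with a copy of its own previous state (the context units), and the output is a distribution over next tokens, much as a stochastic tagger emits label probabilities. The sizes and random weights below are arbitrary illustrative choices, and no training loop is shown:

```python
import numpy as np

rng = np.random.default_rng(1)
V, H = 4, 8  # vocabulary size and hidden-layer width (illustrative)

W_xh = rng.normal(0.0, 0.5, (H, V))  # input -> hidden
W_hh = rng.normal(0.0, 0.5, (H, H))  # context (previous hidden) -> hidden
W_hy = rng.normal(0.0, 0.5, (V, H))  # hidden -> output

def srn_step(x, h_prev):
    """One Elman-style step: mix the input with the copied-back context."""
    h = np.tanh(W_xh @ x + W_hh @ h_prev)
    logits = W_hy @ h
    y = np.exp(logits - logits.max())
    return h, y / y.sum()  # softmax: a next-token distribution

h = np.zeros(H)
for token in [0, 2, 1]:  # a toy token sequence of one-hot inputs
    h, y = srn_step(np.eye(V)[token], h)
```

Because the context copy carries only a bounded summary of the history, the network behaves like a finite-state predictor over token categories, which is what motivates the POS-tagger comparison.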
Analogy Retrieval and Processing With Distributed Vector Representations
, 1998
Abstract

Cited by 10 (2 self)
Holographic Reduced Representations (HRRs) are a method for encoding nested relational structures in fixed-width vector representations. HRRs encode relational structures as vector representations in such a way that the superficial similarity of the vectors reflects both the superficial and the structural similarity of the relational structures. HRRs also support a number of operations that could be very useful in psychological models of human analogy processing: fast estimation of superficial and structural similarity via a vector dot-product; finding corresponding objects in two structures; and chunking of vector representations. Although similarity assessment and discovery of corresponding objects both theoretically take exponential time to perform fully and accurately, with HRRs one can obtain approximate solutions in constant time. The accuracy of these operations with HRRs mirrors patterns of human performance on analog retrieval and processing tasks.

Keywords: neural networks, distributed representations, binding, analogy, analog retrieval, structure, chunking, systematicity
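In Plate's scheme the binding operator is circular convolution and approximate unbinding is circular correlation, both computable via the FFT; the dot product then serves as the fast similarity estimate. A minimal sketch under those assumptions (the relation and role names are mine):

```python
import numpy as np

rng = np.random.default_rng(2)
D = 2048  # vector width; elements drawn i.i.d. N(0, 1/D) as in Plate's HRRs

def rand_vec():
    return rng.normal(0.0, 1.0 / np.sqrt(D), D)

def cconv(a, b):
    # circular convolution: the HRR binding operator
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def ccorr(a, b):
    # circular correlation: approximate inverse of binding with a
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

def sim(a, b):
    # normalized dot product: the fast similarity estimate
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# encode bite(dog, cat) as a superposition of role-filler bindings
agent, patient, bite_id = rand_vec(), rand_vec(), rand_vec()
dog, cat = rand_vec(), rand_vec()
s = bite_id + cconv(agent, dog) + cconv(patient, cat)

noisy = ccorr(agent, s)  # probe: which filler plays the agent role?
assert sim(noisy, dog) > sim(noisy, cat)  # cleanup picks out "dog"
```

The unbound result is a noisy version of the stored filler, so a cleanup memory of candidate vectors, compared by dot product, completes the retrieval.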
Dynamical Automata
, 1998
Abstract

Cited by 10 (5 self)
The recent work on automata whose variables and parameters are real numbers (e.g., Blum, Shub, and Smale, 1989; Koiran, 1993; Bournez and Cosnard, 1996; Siegelmann, 1996; Moore, 1996) has focused largely on questions of computational complexity and tractability. It is also revealing to examine the metric relations that such systems induce on automata via the natural metrics on their parameter spaces. This brings the theory of computational classification closer to theories of learning and statistical modeling, which depend on measuring distances between models. With this in mind, I develop a generalized method of identifying pushdown automata in one class of real-valued automata. I show how the real-valued automata can be implemented in neural networks. I then explore the metric organization of these automata in a basic example, showing how it fleshes out the skeletal structure of the Chomsky Hierarchy and indicates new approaches to problems in language learning and language typolog...
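One standard way a real-valued system realizes a pushdown automaton is to encode the stack fractally in a single number in [0, 1): pushing a bit contracts the state into the left or right half-interval, and popping inverts the map. This sketch illustrates that construction (the function names are mine, and this is a generic encoding rather than the paper's specific class of automata):

```python
def push(x, bit):
    """Push a bit: contract the state into a half-interval of [0, 1)."""
    return (x + bit) / 2.0

def pop(x):
    """Pop the top bit by inverting the push map."""
    bit = int(2.0 * x)      # top of stack: which half-interval holds x?
    return 2.0 * x - bit, bit

x = 0.0                      # empty stack
for b in [1, 0, 1]:          # push 1, then 0, then 1
    x = push(x, b)           # x becomes 0.5, then 0.25, then 0.625

x, top = pop(x)
print(top, x)  # -> 1 0.25 (last-in, first-out)
```

Because push and pop are affine maps, such a stack can be realized by linear units with gating in a neural network, and distances between the resulting parameter settings give exactly the kind of metric on automata the abstract discusses.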
A Computational Memory And Processing Model For Prosody
 In Proceedings of the Intl. Conf. on Spoken Language Processing
, 1998
Abstract

Cited by 9 (0 self)
This paper links prosody to the information in the text and how it is processed by the speaker. It describes the operation and output of Loq, a text-to-speech implementation that includes a model of limited attention and working memory. Attentional limitations are key: varying the attentional parameter in the simulations varies in turn what counts as given and new in a text, and therefore the intonational contours with which it is uttered. Currently, the system produces prosody in three different styles: childlike, adult expressive, and knowledgeable. This prosody also exhibits differences within each style: no two simulations are alike. The limited-resource approach captures some of the stylistic and individual variety found in natural prosody.

1. INTRODUCTION

Ask any lay person to imitate computer speech and you will be treated to an utterance delivered in melodic and rhythmic monotone, possibly accompanied by choppy articulation and a voice quality that is nasal and strained. ...
Connectionist Variable Binding
 Expert Systems
, 2000
Abstract

Cited by 9 (0 self)
Variable binding has long been a challenge to connectionists. Attempts to perform variable binding using localist and distributed connectionist representations are discussed, and problems inherent in each type of representation are outlined.

Keywords: neural networks, localist representations, distributed representations, systematicity, variable binding, unification, inference.

1. Introduction

In recent years research on intelligent systems has split into two paradigms, the classical symbolic artificial intelligence (AI) paradigm and the connectionist AI paradigm. Researchers working in symbolic AI maintain that the correct level at which to model intelligent systems (including the human mind) is that of the symbol. The symbol is an entity in a computational system that can have arbitrary designations and is used to refer to an entity in the outside world, and the main assumptions on which this paradigm rests were coherently outlined by Newell and Simon (Newell & Simon, 1980) und...