Convergence-Zone Episodic Memory: Analysis and Simulations
 Neural Networks
, 1997
Abstract

Cited by 24 (1 self)
Human episodic memory provides a seemingly unlimited storage for everyday experiences, and a retrieval system that allows us to access the experiences with partial activation of their components. The system is believed to consist of a fast, temporary storage in the hippocampus, and a slow, long-term storage within the neocortex. This paper presents a neural network model of the hippocampal episodic memory inspired by Damasio's idea of Convergence Zones. The model consists of a layer of perceptual feature maps and a binding layer. A perceptual feature pattern is coarse coded in the binding layer, and stored on the weights between layers. A partial activation of the stored features activates the binding pattern, which in turn reactivates the entire stored pattern. For many configurations of the model, a theoretical lower bound for the memory capacity can be derived, and it can be an order of magnitude or more above the total number of units in the model, and several orders of magnitude higher than the number of binding-layer units. Computational simulations further indicate that the average capacity is an order of magnitude larger than the theoretical lower bound, and making the connectivity between layers sparser causes an even further increase in capacity. Simulations also show that if more descriptive binding patterns are used, the errors tend to be more plausible (patterns are confused with other similar patterns), with a slight cost in capacity. The convergence-zone episodic memory therefore accounts for the immediate storage and associative retrieval capability and large capacity of the hippocampal memory, and shows why the memory encoding areas can be much smaller than the perceptual maps, consist of rather coarse computational units, and be only sparsely connected t...
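The store-then-retrieve flow the abstract describes (partial feature cue, binding-layer activation, full pattern reactivation) can be sketched in a few lines. The sizes, the top-K thresholds, and the clipped-Hebbian weight rule below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURE = 256   # perceptual feature units (assumed size)
N_BINDING = 32    # binding-layer units (much smaller, as in the model)
K_FEAT = 8        # active feature units per stored pattern (assumed)
K_BIND = 4        # coarse code: active binding units per pattern (assumed)

W = np.zeros((N_FEATURE, N_BINDING))  # bidirectional binary weights

def store(features):
    """Bind a sparse feature pattern to a random sparse binding code."""
    binding = np.zeros(N_BINDING)
    binding[rng.choice(N_BINDING, K_BIND, replace=False)] = 1.0
    W[np.outer(features, binding) > 0] = 1.0  # clipped Hebbian storage

def retrieve(cue):
    """Partial feature cue -> binding pattern -> reconstructed full pattern."""
    b = cue @ W                                      # evidence per binding unit
    binding = (b >= np.sort(b)[-K_BIND]) & (b > 0)   # keep the top coarse-code units
    f = W @ binding.astype(float)                    # project back to feature maps
    return (f >= binding.sum()) & (f > 0)            # features supported by the full code
```

At low load this recovers a stored pattern exactly from half of its features; the capacity questions analyzed in the paper arise once many patterns share binding units on the clipped weights.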
Computational Complexity Of Neural Networks: A Survey
, 1994
Abstract

Cited by 23 (6 self)
We survey some of the central results in the complexity theory of discrete neural networks, with pointers to the literature. Our main emphasis is on the computational power of various acyclic and cyclic network models, but we also briefly discuss the complexity aspects of synthesizing networks from examples of their behavior. CR Classification: F.1.1 [Computation by Abstract Devices]: Models of Computation: neural networks, circuits; F.1.3 [Computation by Abstract Devices]: Complexity Classes: complexity hierarchies. Key words: Neural networks, computational complexity, threshold circuits, associative memory. 1. Introduction The once again very active field of computation by "neural" networks has opened up a wealth of fascinating research topics in the computational complexity analysis of the models considered. While much of the general appeal of the field stems not so much from new computational possibilities as from the possibility of "learning", or synthesizing networks...
Gibbs States Of The Hopfield Model In The Regime Of Perfect Memory
, 1994
Abstract

Cited by 22 (11 self)
We study the thermodynamic properties of the Hopfield model of an autoassociative memory. If N denotes the number of neurons and M(N) the number of stored patterns, we prove the following results: If M(N)/N → 0 as N → ∞, then there exists an infinite number of infinite-volume Gibbs measures for all temperatures T < 1, concentrated on spin configurations that have overlap with exactly one specific pattern. Moreover, the measures induced on the overlap parameters are Dirac measures concentrated on a single point. If M(N)/N → α as N → ∞ for α small enough, we show that for temperatures T smaller than some T(α) < 1, the induced measures can have support only on a disjoint union of balls around the previous points, but we cannot construct the infinite-volume measures through convergent sequences of measures. Subject Classification Numbers: 60K35, 82B44, 82C32. Work partially supported by the Commission of the European Communities under contract No. SC1-CT91-0695. Email: bovier@iaa...
Associative neural network model for the generation of temporal patterns: Theory and application to central pattern generators
 Biophys J
, 1988
Abstract

Cited by 19 (2 self)
ABSTRACT Cyclic patterns of motor neuron activity are involved in the production of many rhythmic movements, such as walking, swimming, and scratching. These movements are controlled by neural circuits referred to as central pattern generators (CPGs). Some of these circuits function in the absence of both internal pacemakers and external feedback. We describe an associative neural network model whose dynamic behavior is similar to that of CPGs. The theory predicts the strength of all possible connections between pairs of neurons on the basis of the outputs of the CPG. It also allows the mean operating levels of the neurons to be deduced from the measured synaptic strengths between the pairs of neurons. We apply our theory to the CPG controlling escape swimming in the mollusk Tritonia diomedea. The basic rhythmic behavior is shown to be consistent with a simplified model that approximates neurons as threshold units and slow synaptic responses as elementary time delays. The model we describe may have relevance to other fixed action behaviors, as well as to the learning, recall, and recognition of temporally ordered information.
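Reading connection strengths off the desired cyclic output, with threshold units and delays, is close in spirit to the classical asymmetric Hebbian rule for sequence generation. The sketch below uses that standard rule with toy sizes; it is not the paper's exact prescription.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 50, 4                              # neurons and cycle length (toy sizes)
seq = rng.choice([-1, 1], size=(T, N))    # desired cyclic firing patterns

# asymmetric Hebbian rule: each pattern drives its successor in the cycle,
# so the coupling matrix is determined by the desired output alone
W = sum(np.outer(seq[(t + 1) % T], seq[t]) for t in range(T)) / N

def step(s):
    """Threshold units, synchronous update (slow synapses as unit delays)."""
    return np.where(W @ s >= 0, 1, -1)

s = seq[0].copy()
trajectory = []
for _ in range(T):      # run one full cycle
    s = step(s)
    trajectory.append(s)
```

Started on one pattern, the network steps through the cycle and returns to its starting state, reproducing the rhythmic behavior with couplings derived purely from the output sequence.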
Analysis of Synfire Chains
 Network
, 1995
Abstract

Cited by 18 (2 self)
The biological implications of synfire chain neural networks are explored by studying two idealized models. In the first, a network model is proposed with binary firing neurons and parallel updating. This model can be solved exactly in the thermodynamic limit using mean field theory. An explicit equation for the capacity is obtained. In the second model, the synchrony of the pulse of activity along a synfire chain is investigated in the context of simple integrate-and-fire neurons. It is found that under natural assumptions a near-synchronous wave of activity can stably propagate along a synfire chain. The relevance of this result to real systems is discussed. 1 Introduction Synfire chains have been proposed by Abeles [1, 2] as a model of cortical function. Interest in them has grown because they provide an explanation for otherwise mysterious measurements of precise spike timing. A synfire chain consists of small pools of neurons linked together in a feedforward chain so that ...
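A toy version of the first (binary-neuron, parallel-update) model is easy to simulate: pools linked feedforward, with a wave of activity launched in the first pool. Pool sizes, the connection probability, and the firing threshold below are assumed values.

```python
import numpy as np

rng = np.random.default_rng(2)

N_POOLS, POOL = 10, 100       # chain length and neurons per pool (assumed)
P_CONN, THRESH = 0.5, 30      # connection probability, firing threshold (assumed)

# random binary feedforward links between consecutive pools
links = [(rng.random((POOL, POOL)) < P_CONN).astype(int)
         for _ in range(N_POOLS - 1)]

active = np.ones(POOL, dtype=int)    # ignite the whole first pool
counts = [int(active.sum())]
for Wl in links:
    drive = active @ Wl              # summed input to each downstream neuron
    active = (drive >= THRESH).astype(int)   # binary neurons, parallel update
    counts.append(int(active.sum()))
```

With the threshold well below the mean drive (P_CONN times the active count), the wave propagates stably down the chain; raising the threshold above it makes the wave die out, which is the kind of stability question the mean-field analysis addresses.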
Complexity Issues in Discrete Hopfield Networks
, 1994
Abstract

Cited by 18 (4 self)
We survey some aspects of the computational complexity theory of discrete-time and discrete-state Hopfield networks. The emphasis is on topics that are not adequately covered by the existing survey literature, most significantly: 1. the known upper and lower bounds for the convergence times of Hopfield nets (here we consider mainly worst-case results); 2. the power of Hopfield nets as general computing devices (as opposed to their applications to associative memory and optimization); 3. the complexity of the synthesis ("learning") and analysis problems related to Hopfield nets as associative memories. Draft chapter for the forthcoming book The Computational and Learning Complexity of Neural Networks: Advanced Topics (ed. Ian Parberry).
Learning symmetry groups with hidden units: Beyond the perceptron
 Physica
, 1986
Abstract

Cited by 17 (5 self)
Learning to recognize mirror, rotational and translational symmetries is a difficult problem for massively parallel network models. These symmetries cannot be learned by first-order perceptrons or Hopfield networks, which have no means for incorporating additional adaptive units that are hidden from the input and output layers. We demonstrate that the Boltzmann learning algorithm is capable of finding sets of weights which turn hidden units into useful higher-order feature detectors capable of solving symmetry problems.
Large Deviation Principles For The Hopfield Model And The Kac-Hopfield Model
, 1994
Abstract

Cited by 13 (8 self)
We study the Kac version of the Hopfield model and prove a Lebowitz-Penrose theorem for the distributions of the overlap parameters. At the same time, we prove a large deviation principle for the standard Hopfield model with infinitely many patterns. Subject Classification Numbers: 60K35, 82B44, 82C32. Work partially supported by the Commission of the European Communities under contract No. SC1-CT91-0695. Email: bovier@iaasberlin.d400.de, gayrard@cpt.univ-mrs.fr, picco@cpt.univ-mrs.fr. 1. Introduction In 1977 Figotin and Pastur [11,12] introduced a class of simplified and exactly solvable models for mean-field spin glasses, in which the random interaction J_ij between two spins was of the form J_ij = Σ_{μ=1}^{p} ξ_i^μ ξ_j^μ, with ξ_i^μ, i ∈ ℕ, μ ∈ {1, …, p}, a family of independent, identically distributed random variables taking, in the simplest case, the values +1 and −1 with equal probability. While these at first did not receive much...
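The Figotin-Pastur/Hopfield coupling rule quoted above is directly executable. A zero-temperature toy run (sizes assumed, couplings normalized by N as is standard) shows the overlap order parameter locking onto a single pattern, the retrieval regime the Gibbs-measure results concern.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 200, 5                              # neurons and patterns (toy sizes)
xi = rng.choice([-1, 1], size=(P, N))      # i.i.d. +/-1 patterns, as in the abstract

J = (xi.T @ xi).astype(float) / N          # J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
np.fill_diagonal(J, 0.0)                   # no self-coupling

def overlap(s, mu):
    """Overlap order parameter m_mu = (1/N) * sum_i xi_i^mu s_i."""
    return float(xi[mu] @ s) / N

s = xi[0].copy()
s[rng.choice(N, N // 10, replace=False)] *= -1   # corrupt 10% of pattern 0
for _ in range(10):
    s = np.where(J @ s >= 0, 1, -1)              # zero-temperature dynamics
```

At this small load (P/N = 0.025) the dynamics restores the corrupted pattern, so the final state has overlap near 1 with pattern 0 and small overlap with the others, matching the "exactly one specific pattern" picture.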
High Capacity, Small World Associative Memory Models
Abstract

Cited by 11 (8 self)
Models of associative memory usually have full connectivity or, if diluted, random symmetric connectivity. In contrast, biological neural systems have predominantly local, non-symmetric connectivity. Here we investigate sparse networks of threshold units, trained with the perceptron learning rule. The units are given positions and are arranged in a ring. The connectivity graph varies between local and random via a small-world regime, with short path lengths between any two neurons. The connectivity may be symmetric or non-symmetric. The results show that it is the small-world networks with non-symmetric weights and non-symmetric connectivity that perform best as associative memories. It is also shown that in highly dilute networks small-world architectures will produce efficiently wired associative memories which still exhibit good pattern completion abilities.
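A minimal version of the training setup: threshold units with sparse, non-symmetric connectivity, trained with the perceptron rule until every stored pattern is a fixed point of the dynamics. The random dilution below stands in for the ring/small-world graphs actually studied, and all sizes and the epoch budget are assumed.

```python
import numpy as np

rng = np.random.default_rng(4)
N, P, K = 100, 10, 20                      # units, patterns, in-degree (assumed)
xi = rng.choice([-1, 1], size=(P, N))      # +/-1 patterns to store

# sparse, non-symmetric connectivity: each unit listens to K random others
mask = np.zeros((N, N), dtype=bool)
for i in range(N):
    others = np.setdiff1d(np.arange(N), [i])
    mask[i, rng.choice(others, K, replace=False)] = True

W = np.zeros((N, N))

# perceptron rule: whenever unit i gets its own bit of a stored pattern
# wrong, nudge its incoming weights toward that pattern
for _ in range(200):                       # epoch budget (assumed)
    errors = 0
    for mu in range(P):
        wrong = np.sign(W @ xi[mu]) != xi[mu]
        for i in np.flatnonzero(wrong):
            W[i, mask[i]] += xi[mu, i] * xi[mu, mask[i]]
            errors += 1
    if errors == 0:                        # all patterns are fixed points
        break
```

Since each unit's load (P patterns on K inputs) is well below perceptron capacity, training converges and every stored pattern becomes stable under the synchronous threshold dynamics; the paper's comparison is then over which connectivity graph does this most efficiently.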
Learning In Neural Network Memories
 Network
, 1990
Abstract

Cited by 10 (0 self)
Various algorithms for constructing a synaptic coupling matrix which can associatively map input patterns onto nearby stored memory patterns are reviewed. Issues discussed include performance, capacity, speed, efficiency and biological plausibility. Research supported by Department of Energy Contract DE-AC02-76ER03230. I. Introduction The term 'learning' is applied to a wide range of activities associated with the construction of neural networks, ranging from single-layer binary classifiers [1] to multilayered systems performing relatively sophisticated tasks [2]. Any reviewer hoping to cover this field in a reasonable amount of time and space must do so with a severely restricted viewpoint. Here, I will concentrate on a fairly simple task, associative memory, accomplished by a single-layered iterative network of binary elements [3, 4, 5]. This area is considered because there are now available a large number of precise analytic results and a wealth of ideas and approaches have ap...