Results 1–10 of 15
Iterative Retrieval of Sparsely Coded Associative Memory Patterns
 Neural Networks
, 1995
Cited by 24 (14 self)
Abstract
We investigate the pattern completion performance of neural autoassociative memories composed of binary threshold neurons for sparsely coded binary memory patterns. Focussing on iterative retrieval, effective threshold control strategies are introduced. These are investigated by means of computer simulation experiments and analytical treatment. To evaluate the system's performance we consider the completion capacity C and the mean retrieval errors. The asymptotic completion capacity for the recall of sparsely coded binary patterns in one-step retrieval is known to be ln 2 / 4 ≈ 17.32% for binary Hebbian learning, and 1/(8 ln 2) ≈ 18% for additive Hebbian learning [Palm, 1988]. These values are accomplished with vanishing error probability and yet are higher than those obtained in other known neural memory models. Recent investigations on binary Hebbian learning have proved that iterative retrieval as a more refined retrieval method does not improve the asymptotic completion capacity...
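The Willshaw-style setting the abstract analyzes can be made concrete with a small sketch (illustrative only, not the authors' code): binary patterns are stored by clipped (binary) Hebbian learning, and retrieval iterates a threshold rule. The particular threshold used here, the number of active cue units with self-connections kept on the diagonal, is one simple member of the family of threshold-control strategies the paper studies, chosen as an assumption for clarity.

```python
def store(patterns, n):
    """Clipped (binary) Hebbian learning: W[i][j] = 1 iff units i and j
    co-fire in at least one stored pattern. Self-connections are kept so
    the simple cue-count threshold below works."""
    W = [[0] * n for _ in range(n)]
    for p in patterns:
        active = [i for i, x in enumerate(p) if x]
        for i in active:
            for j in active:
                W[i][j] = 1
    return W

def retrieve(W, cue, steps=5):
    """Iterative retrieval: at each step, a unit fires iff its synaptic
    input reaches the number of currently active units (an illustrative
    threshold-control choice)."""
    state = list(cue)
    for _ in range(steps):
        theta = sum(state)  # threshold = number of active units
        if theta == 0:
            break
        sums = [sum(w * s for w, s in zip(row, state)) for row in W]
        new = [1 if h >= theta else 0 for h in sums]
        if new == state:  # fixed point reached
            break
        state = new
    return state
```

With two sparse patterns stored, a partial cue containing a subset of one pattern's active units completes to the full pattern in a single iteration and then remains stable.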
Cell Assemblies, Associative Memory and Temporal Structure in Brain Signals
Cited by 19 (7 self)
Abstract
In this work we discuss Hebb's old ideas about cell assemblies in the light of recent results concerning temporal structure and correlations in neural signals. We want to give a conceptual, necessarily rough, picture of how ideas about `binding by synchronisation', `synfire chains', `local and global assemblies', `short and long term memory' and `behaviour' might be integrated into a coherent model of brain functioning based on neuronal assemblies.
Keywords: cell assemblies, synchronization, gamma-oscillations, synfire chains, memory, behaviour
1 ASSEMBLIES AND ASSOCIATIVE MEMORIES
1.1 Cell Assemblies
Cell assemblies were introduced by Donald Hebb with the intention of providing a functional and at the same time structural model for cortical processes and neuronal representations of external events (Hebb, 1949). According to Hebb's ideas, stimuli, objects, things, but also more abstract entities like concepts, contextual relations, ideas, and so on are thought of as being represented...
Associative Data Storage and Retrieval in Neural Networks
, 1995
Cited by 9 (6 self)
Abstract
Associative storage and retrieval of binary random patterns in various neural net models with one-step threshold-detection retrieval and local learning rules are the subject of this paper. For different heteroassociation and autoassociation memory tasks, specified by the properties of the pattern sets to be stored and upper bounds on the retrieval errors, we compare the performance of various models of finite as well as asymptotically infinite size. In infinite models, we consider the case of asymptotically sparse patterns, where the mean activity in a pattern vanishes, and study two asymptotic fidelity requirements: constant error probabilities and vanishing error probabilities. A signal-to-noise ratio analysis is carried out for one retrieval step, where the calculations are comparatively straightforward. As performance measures we propose and evaluate information capacities in bits/synapse which also take into account the important property of fault tolerance. For autoasso...
Bayesian Retrieval in Associative Memories with Storage Errors
 IEEE Trans. Neural Networks
, 1998
Cited by 8 (5 self)
Abstract
It is well known that for finite-sized networks, one-step retrieval in the autoassociative Willshaw net is a suboptimal way to extract the information stored in the synapses. Iterative retrieval strategies are much better, but have hitherto only had heuristic justification. We show how they emerge naturally from considerations of probabilistic inference under conditions of noisy and partial input and a corrupted weight matrix. We start from the conditional probability distribution over possible patterns for retrieval. This contains all possible information that is available to an observer of the network and the initial input. Since this distribution is over exponentially many patterns, we use it to develop two approximate, but tractable, iterative retrieval methods. One performs maximum likelihood inference to find the single most likely pattern, using the (negative log of the) conditional probability as a Lyapunov function for retrieval. In physics terms, if storage errors are present, then the modified iterative update equations contain an additional antiferromagnetic interaction term and site-dependent threshold values. The second method makes a mean field assumption to optimize a tractable estimate of the full conditional probability distribution. This leads to iterative mean field equations which can be interpreted in terms of a network of neurons with sigmoidal responses but with the same interactions and thresholds as in the maximum likelihood update equations. In the absence of storage errors, both models become very similar to the Willshaw model, where standard retrieval is iterated using a particular form of linear threshold strategy.
Identification Criteria and Lower Bounds for Perceptron-like Learning Rules
 Neural Computation
, 1998
Cited by 7 (0 self)
Abstract
Perceptron-like learning rules are known to require exponentially many correction steps in order to identify Boolean threshold functions exactly. We introduce criteria that are weaker than exact identification and investigate whether learning becomes significantly faster if exact identification is replaced by one of these criteria: PAC identification, order identification, and sign identification. PAC identification is based on the learning paradigm introduced by Valiant and known to be easier than exact identification. Order identification uses the fact that each threshold function induces an ordering relation on the input variables which can be represented by weights of linear size. Sign identification is based on a property of threshold functions known as unateness and requires only weights of constant size. We show that Perceptron-like learning rules cannot satisfy these criteria when the number of correction steps is to be bounded by a polynomial. We also present an exponential lower...
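A "Perceptron-like learning rule" of the kind the abstract refers to is the classic perceptron correction step: on a misclassified example, add or subtract the example from the weight vector. The sketch below uses integer weights, a learned bias, and ±1 labels; these conventions are illustrative assumptions, not the paper's exact setup.

```python
def perceptron_train(examples, n, max_steps=1000):
    """Classic perceptron rule on Boolean inputs.
    examples: list of (x, label) with x a 0/1 tuple and label +1 or -1.
    Returns (weights, bias, number of correction steps taken)."""
    w = [0] * n
    b = 0
    steps = 0
    for _ in range(max_steps):
        mistakes = 0
        for x, label in examples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != label:
                # Correction step: move the separator toward the example.
                w = [wi + label * xi for wi, xi in zip(w, x)]
                b += label
                steps += 1
                mistakes += 1
        if mistakes == 0:  # consistent on all examples
            break
    return w, b, steps
```

On a linearly separable target such as Boolean AND this converges after a handful of corrections; the paper's point is that *exact* identification of a threshold function can force exponentially many such steps.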
Bidirectional Retrieval from Associative Memory
 In Advances in Neural Information Processing Systems 10 (NIPS)
, 1998
Cited by 4 (3 self)
Abstract
Similarity-based fault-tolerant retrieval in neural associative memories (NAM) has not led to widespread applications. A drawback of the efficient Willshaw model for sparse patterns [Ste61, WBLH69] is that the high asymptotic information capacity is of little practical use because of high cross-talk noise arising in retrieval at finite sizes. Here a new bidirectional iterative retrieval method for the Willshaw model is presented, called crosswise bidirectional (CB) retrieval, providing enhanced performance. We discuss its asymptotic capacity limit, analyze the first step, and compare it in experiments with the Willshaw model. Applying the very efficient CB memory model either in information retrieval systems or as a functional model for reciprocal cortico-cortical pathways requires more than robustness against random noise in the input: our experiments also show the segmentation ability of CB retrieval with addresses containing the superposition of patterns, provided even at hig...
On the Complexity of Consistency Problems for Neurons with Binary Weights
, 1994
Cited by 3 (1 self)
Abstract
We inquire into the complexity of training a neuron with binary weights when the training examples are Boolean and required to have bounded coincidence and heaviness. Coincidence of an example set is defined as the maximum inner product of two elements; heaviness of an example set is the maximum Hamming weight of an element. We use both as parameters to define classes of restricted consistency problems and ask for which values they are NP-complete or solvable in polynomial time. The consistency problem is shown to be NP-complete when the example sets are allowed to have coincidence at least 1 and heaviness at least 4. On the other hand, we give linear-time algorithms for solving consistency problems with coincidence 0 or heaviness at most 3. Moreover, these results remain valid when the threshold of the neuron is bounded by a constant of value at least 2, whereas consistency can be decided in linear time for neurons with threshold at most 1. We also study maximum consistency problems and...
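The consistency problem itself is easy to state in code. A brute-force check over all binary weight vectors, exponential in the input dimension, which is exactly why the complexity question matters, might look like the following sketch; the example format and the fixed-threshold convention are assumptions for illustration.

```python
from itertools import product

def consistent_binary_neuron(examples, n, theta):
    """Does some weight vector w in {0,1}^n exist such that the neuron
    'fire iff sum(w*x) >= theta' classifies every example correctly?
    examples: list of (x, label) with x a 0/1 tuple, label a bool."""
    for w in product((0, 1), repeat=n):
        if all((sum(wi * xi for wi, xi in zip(w, x)) >= theta) == label
               for x, label in examples):
            return True
    return False
```

The paper's restricted classes bound the coincidence (maximum inner product between two examples) and heaviness (maximum Hamming weight of an example), which is what separates the polynomial-time cases from the NP-complete ones.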
Outline of a Linear Neural Network
 NEUROCOMPUTING
, 1996
Cited by 1 (1 self)
Abstract
By utilizing a new definition of product, we develop a neural net model. The memorization and generalization capabilities are investigated in an information-theoretic fashion. To show the memorization capabilities, we use it as a decoder, and prove that the net reduces the error probability to zero within the error-correcting capacity of the code used. To show the generalization capabilities, we use it to infer a code from patterns received over a noisy channel. When the data are affected by independent random errors, this strategy is shown to require a small number of patterns to obtain, with high probability, a good identification of the code from the noisy data. We also address its use as an associative memory.
Network capacity analysis for latent attractor computation
 Network: Computation in Neural Systems
Cited by 1 (1 self)
Abstract
Attractor networks have been one of the most successful paradigms in neural computation, and have been used as models of computation in the nervous system. Recently, we proposed a paradigm called 'latent attractors' where attractors embedded in a recurrent network via Hebbian learning are used to channel network response to external input rather than becoming manifest themselves. This allows the network to generate context-sensitive internal codes in complex situations. Latent attractors are particularly helpful in explaining computations within the hippocampus, a brain region of fundamental significance for memory and spatial learning. Latent attractor networks are a special case of associative memory networks. The model studied here consists of a two-layer recurrent network with attractors stored in the recurrent connections using a clipped Hebbian learning rule. The firing in both layers is competitive: K winners take all. The number of neurons allowed to fire, K, is smaller than the size of the
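The competitive "K winners take all" firing rule mentioned in the abstract is straightforward to sketch: only the K units with the largest net input fire. Breaking ties by index order is an illustrative convention, not something the paper specifies.

```python
def k_winners_take_all(inputs, k):
    """Return a 0/1 firing vector in which exactly the k units with the
    largest net input are active (ties broken by lower index)."""
    order = sorted(range(len(inputs)), key=lambda i: (-inputs[i], i))
    winners = set(order[:k])
    return [1 if i in winners else 0 for i in range(len(inputs))]
```

In a two-layer latent-attractor model of the kind described, a rule like this would be applied to each layer's net input on every update step, so that the stored attractors bias which K units win without ever fully expressing themselves.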
Dynamics of Spatio-Temporal Patterns in Associative Networks of Spiking Neurons
 In ICANN99: Ninth International Conference on Artificial Neural Networks. Institute of Electrical Engineers
, 2000
Abstract
This paper studies dynamical properties of spatio-temporal pattern sequences ("synfire chains") in associative networks of spiking neurons. Employing postsynaptic potentials with a finite rise time, the replay speed of stored sequences can be controlled by unspecific background signals. In addition, the speed also depends on the number of coactivated sequences, but balanced inhibition can prevent this dependency. An implicit equation relating the speed to the network parameters is derived and solved numerically. Simulations reveal instabilities for low and high control signals, which are traced back to four different destabilizing mechanisms.
Key words: Synfire chains; Spiking neurons; Spatio-temporal associative memory
1 Introduction
The synfire-chain model was introduced by M. Abeles in order to explain precise spike patterns observable in the cortex on timescales of up to hundreds of milliseconds [1]. The model assumes that highly specific spatio-temporal firing patterns...