Results 1–10 of 19
Iterative Retrieval of Sparsely Coded Associative Memory Patterns
 Neural Networks
, 1995
"... We investigate the pattern completion performance of neural autoassociative memories composed of binary threshold neurons for sparsely coded binary memory patterns. Focussing on iterative retrieval, effective threshold control strategies are introduced. These are investigated by means of computer s ..."
Abstract

Cited by 32 (15 self)
We investigate the pattern completion performance of neural autoassociative memories composed of binary threshold neurons for sparsely coded binary memory patterns. Focussing on iterative retrieval, effective threshold control strategies are introduced. These are investigated by means of computer simulation experiments and analytical treatment. To evaluate the system's performance we consider the completion capacity C and the mean retrieval errors. The asymptotic completion capacity for the recall of sparsely coded binary patterns in one-step retrieval is known to be ln 2 / 4 ≈ 17.32% for binary Hebbian learning, and 1/(8 ln 2) ≈ 18% for additive Hebbian learning [Palm, 1988]. These values are accomplished with vanishing error probability and yet are higher than those obtained in other known neural memory models. Recent investigations on binary Hebbian learning have proved that iterative retrieval as a more refined retrieval method does not improve the asymptotic completion capacit...
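A minimal sketch of the binary Hebbian (Willshaw) storage and one-step threshold retrieval described above; the parameter values and the simple "threshold = cue activity" control rule are illustrative assumptions, not the authors' exact strategies:

```python
import numpy as np

rng = np.random.default_rng(0)
n, M, k = 256, 40, 8            # units, stored patterns, active bits per pattern

# Sparsely coded binary memory patterns: exactly k ones out of n.
patterns = np.zeros((M, n), dtype=int)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1

# Binary Hebbian (Willshaw) learning: clipped sum of outer products.
W = np.clip(patterns.T @ patterns, 0, 1)

def one_step_retrieval(cue):
    # Threshold control: fire exactly those units whose dendritic sum
    # reaches the number of active bits in the cue.
    theta = cue.sum()
    return (W @ cue >= theta).astype(int)

# Cue the memory with half of the first pattern's active bits.
cue = patterns[0].copy()
cue[np.flatnonzero(cue)[k // 2:]] = 0
out = one_step_retrieval(cue)
errors = int((out != patterns[0]).sum())   # retrieval errors (spurious or missing units)
```

With this threshold choice every unit of the cued pattern necessarily fires; retrieval errors can only come from cross-talk between stored patterns.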
Cell Assemblies, Associative Memory and Temporal Structure in Brain Signals
"... : In this work we discuss Hebb's old ideas about cell assemblies in the light of recent results concerning temporal structure and correlations in neural signals. We want to give a conceptual, necessarily only rough picture, how ideas about `binding by synchronisation', `synfire chains&apos ..."
Abstract

Cited by 21 (7 self)
In this work we discuss Hebb's old ideas about cell assemblies in the light of recent results concerning temporal structure and correlations in neural signals. We want to give a conceptual, necessarily only rough, picture of how ideas about 'binding by synchronisation', 'synfire chains', 'local and global assemblies', 'short and long term memory' and 'behaviour' might be integrated into a coherent model of brain functioning based on neuronal assemblies.
Keywords: cell assemblies, synchronization, gamma oscillations, synfire chains, memory, behaviour
1 ASSEMBLIES AND ASSOCIATIVE MEMORIES
1.1 Cell Assemblies
Cell assemblies were introduced by Donald Hebb with the intention of providing a functional and at the same time structural model for cortical processes and neuronal representations of external events (Hebb, 1949). According to Hebb's ideas, stimuli, objects, things, but also more abstract entities like concepts, contextual relations, ideas, and so on are thought of as being repre...
Bayesian Retrieval in Associative Memories with Storage Errors
 IEEE Trans. Neural Networks
, 1998
"... It is well known that for finitesized networks, onestep retrieval in the autoassociative Willshaw net is a suboptimal way to extract the information stored in the synapses. Iterative retrieval strategies are much better, but have hitherto only had heuristic justification. We show how they emerge ..."
Abstract

Cited by 15 (8 self)
It is well known that for finite-sized networks, one-step retrieval in the autoassociative Willshaw net is a suboptimal way to extract the information stored in the synapses. Iterative retrieval strategies are much better, but have hitherto had only heuristic justification. We show how they emerge naturally from considerations of probabilistic inference under conditions of noisy and partial input and a corrupted weight matrix. We start from the conditional probability distribution over possible patterns for retrieval. This contains all the information that is available to an observer of the network and the initial input. Since this distribution is over exponentially many patterns, we use it to develop two approximate, but tractable, iterative retrieval methods. One performs maximum likelihood inference to find the single most likely pattern, using the (negative log of the) conditional probability as a Lyapunov function for retrieval. In physics terms, if storage errors are present, then the modified iterative update equations contain an additional antiferromagnetic interaction term and site-dependent threshold values. The second method makes a mean-field assumption to optimize a tractable estimate of the full conditional probability distribution. This leads to iterative mean-field equations which can be interpreted in terms of a network of neurons with sigmoidal responses but with the same interactions and thresholds as in the maximum likelihood update equations. In the absence of storage errors, both models become very similar to the Willshaw model, where standard retrieval is iterated using a particular form of linear threshold strategy.
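For contrast with one-step retrieval, a generic iterative scheme can be sketched by simply repeating the linear threshold update until a fixed point is reached; this heuristic loop is a stand-in for, and not, the paper's Bayesian-derived update equations:

```python
import numpy as np

rng = np.random.default_rng(1)
n, M, k = 128, 20, 6

# Store sparse binary patterns in a clipped-Hebbian (Willshaw) matrix.
patterns = np.zeros((M, n), dtype=int)
for p in patterns:
    p[rng.choice(n, k, replace=False)] = 1
W = np.clip(patterns.T @ patterns, 0, 1)

def iterative_retrieval(x, max_steps=10):
    # Repeat the one-step threshold rule until the state stops changing.
    for _ in range(max_steps):
        theta = max(int(x.sum()), 1)   # heuristic threshold: current activity level
        x_new = (W @ x >= theta).astype(int)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

cue = patterns[0].copy()
cue[np.flatnonzero(cue)[k // 2:]] = 0   # noisy, partial input
result = iterative_retrieval(cue)
```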
Associative Data Storage and Retrieval in Neural Networks
, 1995
"... Associative storage and retrieval of binary random patterns in various neural net models with onestep thresholddetection retrieval and local learning rules are the subject of this paper. For different heteroassociation and autoassociation memory tasks, specified by the properties of the pattern s ..."
Abstract

Cited by 14 (6 self)
Associative storage and retrieval of binary random patterns in various neural net models with one-step threshold-detection retrieval and local learning rules are the subject of this paper. For different heteroassociation and autoassociation memory tasks, specified by the properties of the pattern sets to be stored and upper bounds on the retrieval errors, we compare the performance of various models of finite as well as asymptotically infinite size. In infinite models, we consider the case of asymptotically sparse patterns, where the mean activity in a pattern vanishes, and study two asymptotic fidelity requirements: constant error probabilities and vanishing error probabilities. A signal-to-noise ratio analysis is carried out for one retrieval step, where the calculations are comparatively straightforward. As performance measures we propose and evaluate information capacities in bits/synapse which also take into account the important property of fault tolerance. For autoasso...
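The bits-per-synapse performance measure can be illustrated with a crude back-of-the-envelope estimate; the pattern-information approximation M · log2 C(n, k) and all numbers below are assumptions for illustration, not the paper's capacity definitions:

```python
import math

# Crude bits-per-synapse estimate for storing M sparse binary patterns
# (k ones out of n) in an n x n binary synaptic matrix.  Pattern-set
# information is approximated by M * log2(C(n, k)).
n, k, M = 4096, 12, 5000
bits_stored = M * math.log2(math.comb(n, k))
bits_per_synapse = bits_stored / (n * n)
print(round(bits_per_synapse, 3))
```

A real capacity analysis must additionally discount the information lost to retrieval errors, which is what makes fault tolerance part of the measure.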
Identification Criteria and Lower Bounds for Perceptron-like Learning Rules
 Neural Computation
, 1998
"... Perceptronlike learning rules are known to require exponentially many correction steps in order to identify Boolean threshold functions exactly. We introduce criteria that are weaker than exact identification and investigate whether learning becomes significantly faster if exact identification is r ..."
Abstract

Cited by 7 (0 self)
Perceptron-like learning rules are known to require exponentially many correction steps in order to identify Boolean threshold functions exactly. We introduce criteria that are weaker than exact identification and investigate whether learning becomes significantly faster if exact identification is replaced by one of these criteria: PAC identification, order identification, and sign identification. PAC identification is based on the learning paradigm introduced by Valiant and is known to be easier than exact identification. Order identification uses the fact that each threshold function induces an ordering relation on the input variables which can be represented by weights of linear size. Sign identification is based on a property of threshold functions known as unateness and requires only weights of constant size. We show that Perceptron-like learning rules cannot satisfy these criteria when the number of correction steps is to be bounded by a polynomial. We also present an exponential lo...
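The classic perceptron correction rule on a small Boolean threshold function illustrates the "correction step" being counted; the target weights and the epoch cap are arbitrary choices for this sketch:

```python
import numpy as np

# Target: a Boolean threshold function  w* . x >= theta*  on {0,1}^n.
n = 5
w_true = np.array([3, 2, 2, 1, 1])
theta_true = 4
X = np.array([[(i >> j) & 1 for j in range(n)] for i in range(2 ** n)])
y = (X @ w_true >= theta_true).astype(int)

# Perceptron rule with a bias unit encoding the threshold; each mistake
# triggers one additive correction step.
w = np.zeros(n + 1)
Xb = np.hstack([X, -np.ones((2 ** n, 1))])
steps = 0
for _ in range(1000):                    # epochs over all 2^n examples
    mistakes = 0
    for xi, yi in zip(Xb, y):
        pred = int(xi @ w >= 0)
        if pred != yi:
            w += (yi - pred) * xi        # correction step
            steps += 1
            mistakes += 1
    if mistakes == 0:                    # exact identification reached
        break

print(steps)
```

On this tiny separable instance convergence is fast; the paper's point is that for worst-case threshold functions the number of such correction steps grows exponentially in n.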
Bidirectional Retrieval from Associative Memory
 In Advances in Neural Information Processing Systems 10 (NIPS
, 1998
"... Similarity based fault tolerant retrieval in neural associative memories (NAM) has not lead to wiedespread applications. A drawback of the efficient Willshaw model for sparse patterns [Ste61, WBLH69], is that the high asymptotic information capacity is of little practical use because of high cross t ..."
Abstract

Cited by 4 (3 self)
Similarity-based fault-tolerant retrieval in neural associative memories (NAM) has not led to widespread applications. A drawback of the efficient Willshaw model for sparse patterns [Ste61, WBLH69] is that the high asymptotic information capacity is of little practical use because of the high cross-talk noise arising in retrieval at finite sizes. Here a new bidirectional iterative retrieval method for the Willshaw model is presented, called crosswise bidirectional (CB) retrieval, providing enhanced performance. We discuss its asymptotic capacity limit, analyze the first step, and compare it in experiments with the Willshaw model. Applying the very efficient CB memory model either in information retrieval systems or as a functional model for reciprocal cortico-cortical pathways requires more than robustness against random noise in the input: our experiments also show the segmentation ability of CB retrieval with addresses containing the superposition of patterns, provided even at hig...
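A generic bidirectional iteration between two layers of a binary heteroassociative (Willshaw-type) matrix can be sketched as below; this alternating forward/backward thresholding is a stand-in for, and not, the paper's crosswise bidirectional (CB) rule:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, M, k = 64, 64, 15, 4

# Paired sparse address/content patterns stored in one binary matrix.
X = np.zeros((M, n), dtype=int)
Y = np.zeros((M, m), dtype=int)
for x, y in zip(X, Y):
    x[rng.choice(n, k, replace=False)] = 1
    y[rng.choice(m, k, replace=False)] = 1
W = np.clip(X.T @ Y, 0, 1)

def bidirectional(x, steps=5):
    # Alternate forward and backward threshold retrieval between layers,
    # with the threshold set to the current activity on the sending side.
    for _ in range(steps):
        y = (W.T @ x >= max(int(x.sum()), 1)).astype(int)
        x = (W @ y >= max(int(y.sum()), 1)).astype(int)
    return x, y

x_out, y_out = bidirectional(X[0].copy())
```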
On the Complexity of Consistency Problems for Neurons with Binary Weights
, 1994
"... We inquire into the complexity of training a neuron with binary weights when the training examples are Boolean and required to have bounded coincidence and heaviness. Coincidence of an example set is defined as the maximum inner product of two elements, heaviness of an example set is the maximum Ham ..."
Abstract

Cited by 3 (1 self)
We inquire into the complexity of training a neuron with binary weights when the training examples are Boolean and required to have bounded coincidence and heaviness. Coincidence of an example set is defined as the maximum inner product of two elements; heaviness of an example set is the maximum Hamming weight of an element. We use both as parameters to define classes of restricted consistency problems and ask for which values they are NP-complete or solvable in polynomial time. The consistency problem is shown to be NP-complete when the example sets are allowed to have coincidence at least 1 and heaviness at least 4. On the other hand, we give linear-time algorithms for solving consistency problems with coincidence 0 or heaviness at most 3. Moreover, these results remain valid when the threshold of the neuron is bounded by a constant of value at least 2, whereas consistency can be decided in linear time for neurons with threshold at most 1. We also study maximum consistency problems an...
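The consistency problem itself is easy to state in code: find binary weights and an integer threshold agreeing with every labelled Boolean example. The brute-force search below (the example set and bounds are invented for illustration) is exponential in n, in line with the NP-completeness result for the general problem:

```python
from itertools import product

def consistent_binary_neuron(examples, n, max_theta):
    # Search all binary weight vectors w in {0,1}^n and integer thresholds
    # 0 <= theta <= max_theta for one with  (w . x >= theta) == y  on every
    # labelled example (x, y).  Exponential in n.
    for w in product((0, 1), repeat=n):
        for theta in range(max_theta + 1):
            if all((sum(wi * xi for wi, xi in zip(w, x)) >= theta) == y
                   for x, y in examples):
                return w, theta
    return None

examples = [((1, 0, 1, 0), True), ((0, 1, 0, 0), False), ((1, 1, 0, 0), True)]
print(consistent_binary_neuron(examples, n=4, max_theta=2))  # → ((1, 0, 0, 0), 1)
```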
Associating words to visually recognized objects
 Papers from the AAAI Workshop. Technical Report WS0403
, 2004
"... Using associative memories and sparse distributed representations we have developed a system that can learn to associate words with objects, properties like colors, and actions. This system is used in a robotics context to enable a robot to respond to spoken commands like ”bot show plum ” or ”bot p ..."
Abstract

Cited by 3 (2 self)
Using associative memories and sparse distributed representations we have developed a system that can learn to associate words with objects, properties like colors, and actions. This system is used in a robotics context to enable a robot to respond to spoken commands like "bot show plum" or "bot put apple to yellow cup". The scenario for this is a robot close to one or two tables on which there are certain kinds of fruit and/or other simple objects. We can demonstrate part of this scenario, where the task is to find certain fruits in a complex visual scene according to spoken or typed commands. This involves parsing and understanding simple sentences and relating the nouns to concrete objects sensed by the camera and recognized by a neural network from the visual input.
Network capacity analysis for latent attractor computation
 Network: Computation in Neural Systems
"... ..."
(Show Context)
Outline of a Linear Neural Network
 Neurocomputing
, 1996
"... By utilizing a new definition of product, we develop a neural net model. The memorization and generalization capabilities are investigated in an Information Theory fashion. To show the memorization capabilities, we use it as a decoder, and prove the net reduces the error probability to zero in the r ..."
Abstract

Cited by 1 (1 self)
By utilizing a new definition of product, we develop a neural net model. The memorization and generalization capabilities are investigated in an information-theoretic fashion. To show the memorization capabilities, we use the net as a decoder, and prove that it reduces the error probability to zero within the error-correcting capacity of the code used. To show the generalization capabilities, we use it to infer a code from patterns received over a noisy channel. When the data are affected by independent random errors, this strategy is shown to require only a small number of patterns to identify the code from the noisy data with high probability. We also address its use as an associative memory.