Results 1 – 10 of 156
Weakly pulse-coupled oscillators, FM interactions, synchronization, and oscillatory associative memory
IEEE Trans. Neural Networks, 1999
Cited by 31 (3 self)
Abstract—We study pulse-coupled neural networks that satisfy only two assumptions: each isolated neuron fires periodically, and the neurons are weakly connected. Each such network can be transformed by a piecewise continuous change of variables into a phase model, whose synchronization behavior and oscillatory associative properties are easier to analyze and understand. Using the phase model, we can predict whether a given pulse-coupled network has oscillatory associative memory, or what minimal adjustments should be made so that it can acquire memory. In the search for such minimal adjustments we obtain a large class of simple pulse-coupled neural networks that can memorize and reproduce synchronized temporal patterns the same way a Hopfield network does with static patterns. The learning occurs via modification of synaptic weights and/or synaptic transmission delays. Index Terms—Canonical models, Class 1 neural excitability, integrate-and-fire neurons, multiplexing, synfire chain, transmission delay.
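The reduction this abstract describes — replacing each weakly pulse-coupled unit by a single phase variable — can be illustrated with a minimal two-oscillator phase model. The sinusoidal coupling function and all parameter values below are illustrative assumptions, not the paper's canonical model; the paper's claim is only that weakly pulse-coupled networks reduce to *some* phase model of this general form.

```python
import numpy as np

def simulate_phase_pair(n_steps=20000, dt=0.001, omega=2.0 * np.pi, eps=0.5):
    """Euler integration of two weakly coupled phase oscillators:
    dtheta_i/dt = omega + eps * sin(theta_j - theta_i).
    H(x) = sin(x) is an illustrative choice of coupling function."""
    theta = np.array([0.0, 2.0])  # start 2 rad apart
    for _ in range(n_steps):
        dtheta = omega + eps * np.sin(theta[::-1] - theta)
        theta = theta + dt * dtheta
    return theta

theta = simulate_phase_pair()
# Wrap the phase difference into (-pi, pi]; weak coupling drives it toward 0,
# i.e. the pair synchronizes.
diff = (theta[1] - theta[0] + np.pi) % (2.0 * np.pi) - np.pi
```

Because the coupling enters only through phase differences, synchronization can be read off directly from the sign of the coupling function — the kind of analysis the abstract says becomes easy in the phase model.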
Dynamic Stochastic Synapses as Computational Units
1999
Cited by 22 (7 self)
In most neural network models, synapses are treated as static weights that change only on the slow time scales of learning. It is well known, however, that synapses are highly dynamic, and show use-dependent plasticity over a wide range of time scales. Moreover, synaptic transmission is an inherently stochastic process: a spike arriving at a presynaptic terminal triggers release of a vesicle of neurotransmitter from a release site with a probability that can be much less than one. We consider ...
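The two properties the abstract highlights — probabilistic vesicle release and use-dependent plasticity — can be sketched together in a few lines. The parameter names, values, and the exponential-recovery depression rule below are illustrative assumptions, not the model of the paper.

```python
import math
import random

def stochastic_synapse(spike_times, p_base=0.3, depletion=0.5,
                       tau_rec=50.0, seed=1):
    """Bernoulli vesicle release with simple use-dependent depression.

    Each presynaptic spike releases a vesicle with probability p; a
    release scales p down by `depletion`, and p recovers toward p_base
    with time constant tau_rec (ms). All names/values are illustrative.
    """
    rng = random.Random(seed)
    p, last_t, releases = p_base, None, []
    for t in spike_times:
        if last_t is not None:
            p = p_base + (p - p_base) * math.exp(-(t - last_t) / tau_rec)
        released = rng.random() < p
        releases.append(released)
        if released:
            p *= depletion  # depletion: a release lowers the next release probability
        last_t = t
    return releases

# A regular 100 Hz train: release is sparse and history-dependent,
# unlike a static weight that passes every spike identically.
train = [10.0 * i for i in range(100)]
releases = stochastic_synapse(train)
```

The point of treating such a synapse as a computational unit is that the sequence of releases, not just their average rate, carries information about the recent spike history.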
Self-Organization of Spiking Neurons Using Action Potential Timing
1998
Cited by 18 (0 self)
We propose a mechanism for unsupervised learning in networks of spiking neurons which is based on the timing of single firing events. Our results show that a topology preserving behaviour quite similar to that of Kohonen's self-organizing map can be achieved using temporal coding. In contrast to previous approaches, which use rate coding, the winner among competing neurons can be determined fast and locally. Our model is a further step towards a more realistic description of unsupervised learning in biological neural systems. Furthermore, it may provide a basis for fast implementations in pulsed VLSI. Keywords: Self-organizing map, spiking neurons, temporal coding, unsupervised learning. I. Introduction: In the area of modelling information processing in biological neural systems, there is an ongoing debate about which essentials have to be taken into account (see e.g. [1], [2], [3], [4]). Discrete models, such as threshold gates or McCulloch-Pitts neurons, are undoubtedly very simplis...
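The core idea — the best-matching unit fires first, so the winner is found by spike timing rather than by comparing rates — can be sketched as a minimal Kohonen-style step. The linear distance-to-latency mapping and the 1-D Gaussian neighbourhood are illustrative assumptions, not the paper's circuit.

```python
import numpy as np

def first_spike_winner(x, W):
    """Temporal-coding winner-take-all: a unit's firing latency is taken
    to grow with the distance between its weight vector and the input,
    so the best-matching unit fires first (illustrative mapping)."""
    latencies = np.linalg.norm(W - x, axis=1)  # smaller distance -> earlier spike
    return int(np.argmin(latencies))

def som_step(x, W, eta=0.2, sigma=1.0):
    """One Kohonen-style update: the earliest-firing unit and its
    neighbours (Gaussian neighbourhood on a 1-D chain) move toward x."""
    winner = first_spike_winner(x, W)
    idx = np.arange(len(W))
    h = np.exp(-((idx - winner) ** 2) / (2.0 * sigma ** 2))
    W += eta * h[:, None] * (x - W)  # in-place weight update
    return winner

rng = np.random.default_rng(0)
W = rng.random((5, 2))              # 5 units on a chain, 2-D inputs
x = np.array([0.9, 0.1])
w0 = first_spike_winner(x, W)
d_before = float(np.linalg.norm(W[w0] - x))
som_step(x, W)
d_after = float(np.linalg.norm(W[w0] - x))   # winner moved toward x
```

The locality the abstract emphasizes comes from the race itself: the first spike identifies the winner without any unit having to inspect the others' activations.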
On Computing Boolean Functions by a Spiking Neuron
Annals of Mathematics and Artificial Intelligence, 1998
Cited by 16 (2 self)
Computations by spiking neurons are performed using the timing of action potentials. We investigate the computational power of a simple model for such a spiking neuron in the Boolean domain by comparing it with traditional neuron models such as threshold gates (or McCulloch-Pitts neurons) and sigma-pi units (or polynomial threshold gates). In particular, we estimate the number of gates required to simulate a spiking neuron by a disjunction of threshold gates and we establish tight bounds for this threshold number. Furthermore, we analyze the degree of the polynomials that a sigma-pi unit must use for the simulation of a spiking neuron. We show that this degree cannot be bounded by any fixed value. Our results give evidence that the use of continuous time as a computational resource endows single-cell models with substantially larger computational capabilities. 1 Introduction: Biological neurons communicate by sending spikes among themselves. A spike is a discrete event in continuous ti...
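A toy version of the single-cell model being compared here shows why continuous time matters: with equal delays the unit collapses to an ordinary threshold gate, while unequal delays de-synchronize the pulses and change the Boolean function computed. The triangular pulse shape, time grid, and parameters below are illustrative assumptions.

```python
def spiking_neuron(x, w, d, theta, t_grid):
    """Simplified spiking neuron on Boolean input x: input i, if set,
    contributes a pulse of weight w[i] delayed by d[i]; the output is 1
    iff the summed potential reaches threshold theta at some time.
    The triangular pulse is an illustrative stand-in for a PSP."""
    def pulse(t):
        return max(0.0, 1.0 - abs(t - 1.0))  # rises to 1 at t = 1, width 2

    for t in t_grid:
        u = sum(wi * pulse(t - di) for xi, wi, di in zip(x, w, d) if xi)
        if u >= theta:
            return 1
    return 0

grid = [0.1 * k for k in range(40)]
# Equal delays: the pulses peak together, giving a plain AND gate.
and_11 = spiking_neuron([1, 1], [1.0, 1.0], [0.0, 0.0], 1.5, grid)
and_10 = spiking_neuron([1, 0], [1.0, 1.0], [0.0, 0.0], 1.5, grid)
# Shifting one delay so the pulses never overlap: the same weights and
# threshold now reject the input that the threshold gate accepted.
delayed = spiking_neuron([1, 1], [1.0, 1.0], [0.0, 2.0], 1.5, grid)
```

This timing-dependence is the feature that a fixed-degree sigma-pi unit or a single threshold gate cannot reproduce in general, which is the comparison the paper makes precise.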
On the relevance of time in neural computation and learning
 Theoretical Computer Science
On the Complexity of Learning for a Spiking Neuron
1997
Cited by 16 (7 self)
Wolfgang Maass and Michael Schmitt. Abstract: Spiking neurons are models for the computational units in biological neural systems where information is considered to be encoded mainly in the temporal patterns of their activity. They provide a way of analyzing neural computation that is not captured by the traditional neuron models such as sigmoidal and threshold gates (or "Perceptrons"). We introduce a simple model of a spiking neuron that, in addition to the weights that model the plasticity of synaptic strength, also has variable transmission delays between neurons as programmable parameters. For coding of input and output values two modes are taken into account: binary coding for the Boolean and analog coding for the real-valued domain. We investigate the complexity of learning for a single spiking neuron within the framework of PAC-learnability. With regard to sample complexity, we prove that the VC-dimension is Θ(n log n) and, hence, strictly larger than that of a thresho...
On computation with pulses
Information and Computation, 1999
Cited by 14 (0 self)
We explore the computational power of formal models for computation with pulses. Such models are motivated by realistic models for biological neurons, and by related new types of VLSI ("pulse stream VLSI"). In preceding work it was shown that the computational power of formal models for computation with pulses is quite high if the pulses arriving at a computational unit have an approximately linearly rising or linearly decreasing initial segment. This property is satisfied by common models for biological neurons. On the other hand several implementations of pulse stream VLSI employ pulses that are approximately piecewise constant (i.e. step functions). In this article we investigate the relevance of the shape of pulses in formal models for computation with pulses. It turns out that the computational power drops significantly if one replaces pulses with linearly rising or decreasing initial segments by piecewise constant pulses. We provide an exact characterization of the latter model in terms of a weak version of a random access machine (RAM). We also compare the language recognition capability of a recurrent version of this model with that of deterministic finite automata and Turing machines.