Results 1–10 of 190
Spatial and Temporal Pattern Analysis via Spiking Neurons
Network: Computation in Neural Systems, 1998
"... ..."
Computation with spiking neurons
The Handbook of Brain Theory and Neural Networks, 2001
"... ..."
Weakly pulse-coupled oscillators, FM interactions, synchronization, and oscillatory associative memory
IEEE Trans. Neural Networks, 1999
"... Abstract—We study pulsecoupled neural networks that satisfy only two assumptions: each isolated neuron fires periodically, and the neurons are weakly connected. Each such network can be transformed by a piecewise continuous change of variables into a phase model, whose synchronization behavior and ..."
Cited by 45 (4 self)
Abstract—We study pulse-coupled neural networks that satisfy only two assumptions: each isolated neuron fires periodically, and the neurons are weakly connected. Each such network can be transformed by a piecewise continuous change of variables into a phase model, whose synchronization behavior and oscillatory associative properties are easier to analyze and understand. Using the phase model, we can predict whether a given pulse-coupled network has oscillatory associative memory, or what minimal adjustments should be made so that it can acquire memory. In the search for such minimal adjustments we obtain a large class of simple pulse-coupled neural networks that can memorize and reproduce synchronized temporal patterns the same way a Hopfield network does with static patterns. The learning occurs via modification of synaptic weights and/or synaptic transmission delays. Index Terms—Canonical models, Class 1 neural excitability, integrate-and-fire neurons, multiplexing, synfire chain, transmission delay.
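The phase-model reduction described in this abstract can be illustrated with a toy simulation. The sketch below is not the paper's construction: the sinusoidal interaction function and all parameter values are illustrative stand-ins for the interaction functions the authors derive. It integrates identical, weakly coupled phase oscillators and tracks the Kuramoto order parameter, which approaches 1 as the network synchronizes.

```python
import numpy as np

def simulate_phase_model(n=10, coupling=0.5, steps=2000, dt=0.01, seed=0):
    """Euler-integrate a weakly coupled phase model (Kuramoto-type
    sinusoidal coupling, an illustrative stand-in for the paper's
    interaction functions). Returns the order parameter over time."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, n)   # random initial phases
    omega = 1.0                            # identical natural frequencies
    order = []
    for _ in range(steps):
        # pairwise phase differences theta_j - theta_i
        diff = theta[None, :] - theta[:, None]
        theta += dt * (omega + coupling * np.sin(diff).mean(axis=1))
        # order parameter r in [0, 1]: 1 means full synchrony
        r = np.abs(np.exp(1j * theta).mean())
        order.append(r)
    return order

r = simulate_phase_model()
print(f"order parameter: start {r[0]:.2f}, end {r[-1]:.2f}")
```

With identical natural frequencies and attractive coupling, the order parameter climbs toward 1 from almost any initial condition, which is the synchronization behavior the phase model makes easy to analyze.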
Dynamic Stochastic Synapses as Computational Units, 1999
"... In most neural network models, synapses are treated as static weights that change only on the slow time scales of learning. It is well known, however, that synapses are highly dynamic, and show usedependent plasticity over a wide range of time scales. Moreover, synaptic transmission is an inhere ..."
Cited by 35 (9 self)
In most neural network models, synapses are treated as static weights that change only on the slow time scales of learning. It is well known, however, that synapses are highly dynamic, and show use-dependent plasticity over a wide range of time scales. Moreover, synaptic transmission is an inherently stochastic process: a spike arriving at a presynaptic terminal triggers release of a vesicle of neurotransmitter from a release site with a probability that can be much less than one. We consider ...
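The use-dependent, probabilistic release described above can be sketched in a few lines. The model below is illustrative, not the paper's: each arriving spike releases a vesicle with the current probability, a release depresses that probability multiplicatively, and the probability recovers exponentially toward a baseline. The names `p0`, `depression`, and `tau_rec` are made up for this sketch.

```python
import math
import random

def stochastic_synapse(spike_times, p0=0.5, depression=0.5,
                       tau_rec=100.0, seed=1):
    """Toy use-dependent stochastic release at one synapse
    (illustrative model; parameters are not from the paper)."""
    rng = random.Random(seed)
    p, last_t = p0, None
    releases = []
    for t in spike_times:
        if last_t is not None:
            # exponential recovery toward baseline since the last spike
            p = p0 - (p0 - p) * math.exp(-(t - last_t) / tau_rec)
        released = rng.random() < p          # stochastic vesicle release
        releases.append(released)
        if released:
            p *= depression                  # use-dependent depression
        last_t = t
    return releases
```

With `p0=1.0` the first spike always releases; subsequent release probability then depends on how much recovery time has elapsed, which is the sense in which the synapse acts as a dynamic rather than static unit.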
On Computing Boolean Functions by a Spiking Neuron
Annals of Mathematics and Artificial Intelligence, 1998
"... Computations by spiking neurons are performed using the timing of action potentials. We investigate the computational power of a simple model for such a spiking neuron in the Boolean domain by comparing it with traditional neuron models such as threshold gates (or McCullochPitts neurons) and sigma ..."
Cited by 20 (2 self)
Computations by spiking neurons are performed using the timing of action potentials. We investigate the computational power of a simple model for such a spiking neuron in the Boolean domain by comparing it with traditional neuron models such as threshold gates (or McCulloch-Pitts neurons) and sigma-pi units (or polynomial threshold gates). In particular, we estimate the number of gates required to simulate a spiking neuron by a disjunction of threshold gates and we establish tight bounds for this threshold number. Furthermore, we analyze the degree of the polynomials that a sigma-pi unit must use for the simulation of a spiking neuron. We show that this degree cannot be bounded by any fixed value. Our results give evidence that the use of continuous time as a computational resource endows single-cell models with substantially larger computational capabilities.

1 Introduction

Biological neurons communicate by sending spikes among themselves. A spike is a discrete event in continuous ti...
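A minimal sketch of how spike timing can enter Boolean computation, loosely inspired by the setup above (the rectangular pulses, weights, and delays here are illustrative assumptions, not the paper's exact neuron model): each input bit that is 1 emits a pulse that arrives after its transmission delay, and the neuron outputs 1 iff the summed potential reaches threshold at some instant. Because the potential can only change at pulse onsets and offsets, checking those event times suffices.

```python
def spiking_neuron_boolean(x, weights, delays, pulse_len=1.0, threshold=1.5):
    """Toy timing-based Boolean unit (illustrative, not the paper's
    model): input bit x_i = 1 fires at time 0; its rectangular pulse
    of height weights[i] arrives after delays[i] and lasts pulse_len.
    Output is 1 iff the summed potential ever reaches threshold."""
    # the potential only changes at pulse onsets and offsets,
    # so those are the only time points we need to inspect
    events = sorted({d for b, d in zip(x, delays) if b} |
                    {d + pulse_len for b, d in zip(x, delays) if b})
    for t in events:
        p = sum(w for b, w, d in zip(x, weights, delays)
                if b and d <= t < d + pulse_len)
        if p >= threshold:
            return 1
    return 0
```

With the weights and delays in the test below, two pulses must overlap in time to reach threshold, so the output depends on relative arrival times and not merely on how many inputs are active; this is the timing sensitivity that distinguishes the spiking model from a single threshold gate.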
Self-Organization of Spiking Neurons Using Action Potential Timing, 1998
"... We propose a mechanism for unsupervised learning in networks of spiking neurons which is based on the timing of single firing events. Our results show that a topology preserving behaviour quite similar to that of Kohonen's selforganizing map can be achieved using temporal coding. In contrast t ..."
Cited by 18 (0 self)
We propose a mechanism for unsupervised learning in networks of spiking neurons which is based on the timing of single firing events. Our results show that a topology-preserving behaviour quite similar to that of Kohonen's self-organizing map can be achieved using temporal coding. In contrast to previous approaches, which use rate coding, the winner among competing neurons can be determined fast and locally. Our model is a further step towards a more realistic description of unsupervised learning in biological neural systems. Furthermore, it may provide a basis for fast implementations in pulsed VLSI. Keywords: Self-organizing map, spiking neurons, temporal coding, unsupervised learning.

I. Introduction

In the area of modelling information processing in biological neural systems, there is an ongoing debate about which essentials have to be taken into account (see e.g. [1], [2], [3], [4]). Discrete models, such as threshold gates or McCulloch-Pitts neurons, are undoubtedly very simplis...
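The "fast and local" winner determination via spike timing can be sketched as a time-to-first-spike competition. This is an illustrative toy, not the paper's dynamics: each unit integrates a drive proportional to its match with the input, so the best-matching unit crosses threshold first and is the winner, without any global comparison of activities.

```python
def first_spike_winner(weights, x, threshold=1.0, dt=0.01, t_max=10.0):
    """Winner-take-all by spike timing (illustrative sketch): each unit
    integrates a drive equal to the dot product of its weight vector
    with input x; the first unit to cross threshold wins."""
    drives = [sum(w_i * x_i for w_i, x_i in zip(w, x)) for w in weights]
    v = [0.0] * len(weights)   # membrane potentials
    t = 0.0
    while t < t_max:
        for i, drive in enumerate(drives):
            v[i] += dt * drive
            if v[i] >= threshold:
                return i, t    # index of first unit to fire, and its time
        t += dt
    return None, t             # no unit reached threshold in time
```

The winner here is simply the unit with the largest drive, but it is found by a purely local race to threshold, which is the mechanism the abstract contrasts with rate-coded competition.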
On the Complexity of Learning for a Spiking Neuron, 1997
"... ) Wolfgang Maass and Michael Schmitt Abstract Spiking neurons are models for the computational units in biological neural systems where information is considered to be encoded mainly in the temporal patterns of their activity. They provide a way of analyzing neural computation that is not captu ..."
Cited by 17 (7 self)
Wolfgang Maass and Michael Schmitt. Abstract—Spiking neurons are models for the computational units in biological neural systems where information is considered to be encoded mainly in the temporal patterns of their activity. They provide a way of analyzing neural computation that is not captured by the traditional neuron models such as sigmoidal and threshold gates (or "Perceptrons"). We introduce a simple model of a spiking neuron that, in addition to the weights that model the plasticity of synaptic strength, also has variable transmission delays between neurons as programmable parameters. For coding of input and output values two modes are taken into account: binary coding for the Boolean and analog coding for the real-valued domain. We investigate the complexity of learning for a single spiking neuron within the framework of PAC-learnability. With regard to sample complexity, we prove that the VC-dimension is Θ(n log n) and, hence, strictly larger than that of a thresho...